SATE 2010 Analysis
Aurélien Delaitre, NIST, aurelien.delaitre@nist.gov
October 1, 2010
The SAMATE Project: http://samate.nist.gov/
Outline
What tools find
What people find
– CVEs
– Manual analysis
Building on SATE 2009
[Diagram: SATE 2009 feeding into SATE 2010]
Improving categories
SATE 2009: True, Insignificant
SATE 2010: Security, Quality, Insignificant
Improving the guidelines
45 lines → 314 lines
Considering weakness types
Better uniformity in evaluations
Decision process
Outcomes: Security, Quality, Insignificant, False, Unknown
Criteria: Path... Type... Context... Bug...
Sampling
Warnings sampled from each severity class, 1 through 4
Weakness categories
Quality and security related
Non-false overlap
CVEs
Key elements of the path for matching:
– Blocks of code
– Sink or upflow path elements
But not exhaustive
Example

    /* Dialect Index */
    dialect = tvb_get_letohs(tvb, offset);
    if (si->sip && si->sip->extra_info_type == SMB_EI_DIALECTS) {
        dialects = si->sip->extra_info;
        if (dialect < dialects->num) {
            dialect_name = dialects->name[dialect];
        }
    }
    if (!dialect_name) {
        dialect_name = "unknown";
    }
Manual analysis
Dovecot for C
Pebble for Java
– Used a slightly later version
Dovecot
No remotely exploitable vulnerability found
Threat modeling
Fuzzing
Code review
Pebble
Several vulnerabilities found
Threat modeling
Penetration testing
Code review
Tools ∩ humans
No human findings for Dovecot
No matches for Chrome and Wireshark
Interpretation
Sets: all weaknesses, CVEs, tool findings
CVEs ∩ tool findings = ∅
Interpretation
Sets: all weaknesses, CVEs, CVE descriptions, tool findings
CVE descriptions ∩ tool findings = ∅
Questions