
Slide 1: The Software Assurance Metrics and Tool Evaluation (SAMATE) Project
Paul E. Black, Computer Scientist, NIST
paul.black@nist.gov, +1 301-975-4794
OWASP AppSec DC, October 2005
The OWASP Foundation, http://www.owasp.org/
This is a work of the U.S. Government and is not subject to copyright protection in the United States.

Slide 2: Outline
- Overview of the NIST SAMATE project
- Purposes of tool and technique evaluation
- Software and effectiveness metrics
- Report of workshop on Defining the State of the Art in Software Security Tools
- Final comments

Slide 3: The NIST SAMATE Project (http://samate.nist.gov/)
- Surveys
  - Tools
  - Researchers and companies
- Workshops and conference sessions
  - Taxonomy of software assurance (SwA) functions and techniques
  - Order of importance (cost/benefit, criticalities, ...)
  - Gaps and research agendas
- Studies to develop metrics
- Enable tool evaluations
  - Write detailed specifications
  - Develop test plans and reference material
  - Collect tool evaluations, case studies, and comparisons

Slide 4: Taxonomy of SwA Tool Functions and Techniques
- Concept or business need
  - Use cases
  - Changes to current systems
- Requirements and design
  - Consistency
  - Completeness
  - Compliance
- Implementation
  - The usual ...
- Assessment and acceptance
  - External
    - Automated vulnerability scanners
    - Penetration test assistants
    - Other standard testing techniques: usage, spec-based, statistical, worst-case/criticality, etc.
  - Insider
    - Automated code scanners
      - Syntactic, e.g., "grep" (a minimal sketch follows this slide)
      - Semantic
    - Code review assistants
      - Source code
      - Virtual machine code (e.g., Java bytecode or .NET intermediate code)
      - Binary (debugger, decompiler)
- Operation
  - Operator training
  - Auditing
  - Penetration test
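To make the syntactic end of that spectrum concrete, here is a minimal sketch of a grep-style scanner. The deny-list of C calls and the command-line interface are assumptions for illustration, not part of the SAMATE taxonomy.

```python
import re
import sys

# Illustrative deny-list of C library calls often flagged by syntactic
# scanners.  The list itself is an assumption for this sketch.
RISKY_CALLS = re.compile(r"\b(gets|strcpy|strcat|sprintf|system)\s*\(")

def scan(path):
    """Return (line number, line) pairs that mention a risky call.

    Purely syntactic: no parsing and no data flow, so it yields false
    positives (e.g., matches inside comments) and false negatives
    (e.g., calls reached through function pointers).
    """
    findings = []
    with open(path, encoding="utf-8", errors="replace") as src:
        for lineno, line in enumerate(src, start=1):
            if RISKY_CALLS.search(line):
                findings.append((lineno, line.rstrip()))
    return findings

if __name__ == "__main__":
    for path in sys.argv[1:]:
        for lineno, line in scan(path):
            print(f"{path}:{lineno}: {line}")
```

Because it matches text rather than parsed code, a scanner like this flags calls inside comments and misses calls made through aliases; reducing that noise is what distinguishes the semantic scanners in the same branch of the taxonomy.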

Slide 5: Planned Workshops
- Enumerate SwA functions and techniques
  - Approach (code vs. spec, static vs. dynamic)
  - Software type (distributed, real time, secure)
  - Type of fault detected
- Recruit focus groups
- Which are the most "important"?
  - Highest cost/benefit ratio?
  - Finds the highest-priority vulnerabilities?
  - Most widely used?
- Critique the reference dataset
- Identify gaps in functions and recommend research
- Plan and initiate studies for metrics

Slide 6: Outline (next: Purposes of tool and technique evaluation)

Slide 7: Purposes of Tool or Technique Evaluations
Precisely document what a tool does (and doesn't) do ... in order to:
- Provide feedback to tool developers
  - Simple changes to make
  - Directions for future releases
- Inform users
  - Match the tool or technique to a particular situation
  - Understand the significance of tool results
  - Know how techniques work together

Slide 8: Developing a Specification
After a technique or tool function is selected by the working group:
- NIST and a focus group develop a clear, testable specification
- The specification is posted for public comment
- Comments are incorporated
- A measurement methodology is developed (a sketch of such a harness follows this slide):
  - Test cases
  - Procedures
  - Reference implementations and data
  - Scripts and auxiliary programs
  - Interpretation criteria
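As a rough illustration of how test cases, scripts, and interpretation criteria could fit together, here is a hypothetical harness that runs a tool over reference files and scores its findings against known flaw locations. The manifest format, the "scanner" command, and the file:line output convention are all invented for this sketch.

```python
import json
import re
import subprocess

# Hypothetical manifest: each entry names a reference source file and the
# line numbers of the known, seeded flaws.  The format is invented here.
MANIFEST = json.loads("""
[
  {"file": "cases/strcpy_overflow.c", "flaw_lines": [12]},
  {"file": "cases/clean_copy.c",      "flaw_lines": []}
]
""")

def run_tool(path):
    """Run the tool under test and collect the line numbers it reports.

    Assumes the tool prints findings as 'file:line: message', one per
    line -- a common convention, but an assumption about this tool.
    """
    out = subprocess.run(["scanner", path], capture_output=True, text=True)
    return {int(m.group(1)) for m in re.finditer(r":(\d+):", out.stdout)}

def score(manifest):
    """Interpretation criteria: count hits, spurious reports, and misses."""
    tp = fp = fn = 0
    for case in manifest:
        expected = set(case["flaw_lines"])
        reported = run_tool(case["file"])
        tp += len(expected & reported)
        fp += len(reported - expected)
        fn += len(expected - reported)
    return tp, fp, fn

if __name__ == "__main__":
    tp, fp, fn = score(MANIFEST)
    print(f"true positives: {tp}, false positives: {fp}, misses: {fn}")
```

The interpretation criteria here are deliberately crude (a finding counts only if it lands exactly on a seeded line); a real methodology would also have to decide how to credit near misses and duplicate reports.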

Slide 9: SAMATE Project Timeline
[Timeline diagram: Workshop 1 (SwA classes), Workshop 2 (research gaps), and Workshop 3 (metrics studies); focus groups for class 1 and class 2; publication of the function taxonomy and tool survey; and, for each selected function, a strawman spec refined into draft Spec0 and Spec1, with a test plan and a tool testing matrix.]

Slide 10: Why Look at Checking First?
- Vital for software developed outside, i.e., when the process is not visible
- Applicable to legacy software
- Feedback for process improvement
- Process experiments are expensive
- Many are already working on process (SEI, PSP, etc.)

Slide 11: Outline (next: Software and effectiveness metrics)

Slide 12: But Is the Tool or Methodology Effective?
- Is this program secure (enough)?
- How secure does tool X make a program?
- How much more secure does technique X make a program after techniques Y and Z?
- Do they really find or prevent bugs and vulnerabilities? (one candidate way to quantify this follows this slide)
- Dollar for dollar, does methodology P or S give more reliability?
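None of these questions has an agreed answer. As one hedged illustration of what a quantitative answer could look like, the sketch below reduces a tool's output on a flaw-seeded reference dataset (like the harness above) to standard detection metrics; the counts are invented and the choice of metrics is mine, not SAMATE's.

```python
def effectiveness(tp, fp, fn):
    """Candidate effectiveness metrics from counts of true positives,
    false positives, and missed flaws against a reference dataset."""
    precision = tp / (tp + fp) if tp + fp else 0.0  # how trustworthy are reports?
    recall    = tp / (tp + fn) if tp + fn else 0.0  # how many real flaws are found?
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Made-up counts for illustration: 40 flaws found, 25 spurious reports,
# 10 flaws missed.
p, r, f1 = effectiveness(tp=40, fp=25, fn=10)
print(f"precision={p:.2f} recall={r:.2f} F1={f1:.2f}")
```

Counts like these say nothing about flaw severity or the cost of running the tool, which is why the "dollar for dollar" question above remains open.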

Slide 13: Toward Software Metrics
An analogy between measuring heat and measuring software (the heat-energy formula is written out after this slide):

  Step                         Temperature            Software
  Qualitative comparison       warmer, colder         buggy, secure
  Formally defined quantity    temperature            quality? confidence?
  Unit and scale               degree, Kelvin         ?
  Derived units                heat energy = s·m·t    software assurance = p·t?
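My reading of the derived-units row, stated as an assumption rather than the slide's own notation: "s·m·t" is the standard heat equation, with specific heat (the slide's s), mass, and temperature change:

```latex
% Heat absorbed by a body: specific heat c, mass m, temperature change \Delta T.
Q = c \, m \, \Delta T
```

The software column has no such derived quantity: we can compare programs qualitatively (buggier, more secure) but lack a defined quantity, unit, and scale from which assurance could be computed, which is exactly the gap the planned metrics studies target.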

Slide 14: Benefits of the SAMATE Project
- Defined metrics for evaluating SwA tools
- Users can make more informed tool choices
- Neutral test programs
- Tool creators can make better tools

Slide 15: Outline (next: Report of workshop on Defining the State of the Art in Software Security Tools)

Slide 16: Workshop on Defining the State of the Art in Software Security Tools
Workshop characteristics:
- NIST, Gaithersburg, 10-11 August 2005
- http://samate.nist.gov/softSecToolsSOA
- 45 attendees from government, universities, vendors and service providers, and research companies
- Proceedings, including discussion notes and submitted material, should be available from the URL above.
Goals:
- Understand the state of the art of software security assurance tools in detecting security flaws and vulnerabilities.
- Discuss metrics to evaluate the effectiveness of such tools.
- Collect flawed and "clean" software for a reference dataset.
- Publish a report on classes of software security vulnerabilities.

Slide 17: Outcomes of Workshop (I)
- Understand the state of the art of software security assurance tools in detecting security flaws and vulnerabilities.
  - A report is being written.
- Discuss metrics to evaluate tool effectiveness.
  - All agreed that software metrics and tool-effectiveness metrics are a good idea.
  - There was no consensus on how to approach the challenge.

Slide 18: Outcomes of Workshop (II)
- Collect flawed and "clean" software to serve as a reference.
  - Several collections emerged: MIT, Fortify, etc.
  - Attendees agreed that a shared reference dataset would help.
  - A NIST reference dataset is in development; a prototype is available at (URL forthcoming). One possible shape for a dataset entry is sketched after this slide.
- Report on classes of software security vulnerabilities.
  - Several existing flaw taxonomies were discussed: CLASP, PLOVER (CVE), etc.
  - Attendees agreed that a common taxonomy would help.
  - Discussions are continuing on the SAMATE email list.
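To make "reference dataset" concrete, here is one hypothetical shape for a test-case record, pairing a flawed program with its fixed counterpart so tools can be checked for both detections and false positives. The fields and identifiers are assumptions for this sketch, not the actual NIST dataset schema.

```python
from dataclasses import dataclass, field

@dataclass
class ReferenceCase:
    """Hypothetical metadata for one reference-dataset test case."""
    case_id: str            # stable identifier
    language: str           # e.g., "C", "Java"
    flaw_class: str         # taxonomy label, e.g., "buffer overflow"
    source_file: str        # path to the code under test
    flaw_lines: list[int] = field(default_factory=list)  # empty for "clean" variants
    description: str = ""

# A flawed case and its "clean" counterpart: the first should be reported,
# the second should not.
cases = [
    ReferenceCase("bo-0001", "C", "buffer overflow",
                  "cases/strcpy_overflow.c", [12],
                  "fixed-size buffer filled from argv"),
    ReferenceCase("bo-0001-fixed", "C", "buffer overflow",
                  "cases/strncpy_bounded.c", [],
                  "same program with bounds checking"),
]
```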

Slide 19: Outline (next: Final comments)

Slide 20: Society Has Three Options
- Learn how to make software that works
- Limit the size or authority of software
- Accept failing software

Slide 21: Contact to Participate
Paul E. Black, Project Leader
Software Diagnostics and Conformance Testing Division, Software Quality Group, Information Technology Laboratory, NIST
paul.black@nist.gov

