Slide 1: Software Verification and Validation
Dolores R. Wallace*
NASA Goddard Space Flight Center, Greenbelt, Maryland 20771
dwallac@pop300.gsfc.nasa.gov
Prepared for the American Society for Quality, Special Interest Group for Software, June 2000
* This work was performed when Ms. Wallace was employed by the Information Technology Laboratory, National Institute of Standards and Technology.
Slide 2: Tonight's Discussion
- Definition and role of V&V in building software quality
- Standards and guidance for V&V
- V&V tasks
- V&V examples, via case studies
- NIST fault and failure repository
Slide 3: Elements of Software Quality
Process (examples: CMM; ISO 9000 (ASQ)):
- Standards to define what must be performed to build software systems
- Assessment of organizations for conformance
People (example: ASQ's CSQE):
- Skilled people to get processes right
- People with knowledge beyond computer science and software engineering
- Licensing "software engineers" by state examinations
Product (example: software V&V*):
- Evaluation and measurement of the product
- Uncertainty reduced by reference and measurement methods
* Quality is built in via excellent development practices, but assurance is still needed.
Slide 4: Definition of V&V
Verification
- Confirmation by examination and provision of objective evidence that specified requirements have been fulfilled.
- Ensures that the products of each development phase meet the requirements levied by the previous phase and are internally complete, consistent, and correct enough to support the next development phase.
Validation
- Confirmation by examination and provision of objective evidence that the particular requirements for a specific intended use are fulfilled.
- Through the process of execution, ensures that the product conforms to the functional and performance specifications stipulated in the requirements.
Slide 5: V&V Objectives
- Assess software products and processes throughout the life cycle
- Facilitate early detection and correction of software errors
- Reduce the effort to remove faults, via early detection
- Demonstrate that software and system requirements are correct, complete, accurate, consistent, and testable
- Enhance management insight into process and product risk
- Support the software life cycle processes to ensure compliance with program performance, schedule, and cost requirements
- Enhance operational correctness and product maintainability
Slide 6: Organization of V&V
- Independent V&V
- Part of the "development" process
- A combination of the two
All use standards and guidelines to determine the best fit.
Slide 7: New: IEEE 1012-1998, Software V&V
- More comprehensive than IEEE 1012-1986
- Product and process examination
- Integrity levels, metrics, independence
- Compliance with a higher-level standard
- Separation of operation and maintenance
- New tasks: criticality analysis; hazard analysis; risk assessment; configuration management assessment
- Retention of tasks from the original 1012 standard:
  - V&V management
  - Evaluation of all artifacts, at all stages
  - Testing, beginning with requirements activities
Slide 8: Guidance: NIST SP 500-234 (http://hissa.nist.gov/VV234)
- Provides guidance for performing V&V
  - independence types
  - step-by-step activities
- Describes verification and test techniques
  - a brief overview identifies issues for each technique
  - questions and issues for V&V of reused software
- Explains some metrics for V&V
  - general metrics; metrics for design, code, and test
  - reliability models
Slide 9: Selected V&V Tasks
- Traceability analysis
- Evaluation of requirements, design, and code
  - Inspection, walkthrough, review
  - Analysis (e.g., control flow, database, algorithm, performance)
  - Formal verification
  - Simulation, modeling
- Change impact assessment
- Configuration management assessment
- Test
  - Requirements-based testing
  - Evaluation of test documentation
  - Simulation
  - Regression testing
- Measurement
Slide 10: Traceability Analysis
Trace system and software requirements through design, code, and test materials (a sketch of an automated check appears below). Benefits:
- Identification of the most important or riskiest paths
- Sets the stage for special analyses (e.g., timing, interface)
- Locations of interactions
- Completeness / omission
- Identification of re-test areas
- Impact of change
- Discovery of the root cause of faults and failures
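Not part of the original slides: a minimal sketch of such an automated traceability check in Python, assuming a hand-built matrix in which every requirement maps to the design, code, and test artifacts that trace to it. All requirement and artifact names here are hypothetical.

```python
# Minimal traceability-check sketch; all requirement and artifact IDs are invented.
trace_matrix = {
    "REQ-001": {"design": ["SDD-3.1"], "code": ["pump_ctrl.c"], "test": ["TC-12"]},
    "REQ-002": {"design": ["SDD-3.2"], "code": [], "test": []},
    "REQ-003": {"design": [], "code": ["alarm.c"], "test": ["TC-20"]},
}

def untraced(matrix):
    """Report each requirement that lacks coverage in some life-cycle stage."""
    gaps = {}
    for req, links in matrix.items():
        missing = [stage for stage, artifacts in links.items() if not artifacts]
        if missing:
            gaps[req] = missing
    return gaps

for req, stages in sorted(untraced(trace_matrix).items()):
    print(f"{req}: no trace to {', '.join(stages)}")
# Output:
#   REQ-002: no trace to code, test
#   REQ-003: no trace to design
```

In practice the matrix would be exported from a requirements-management tool; the point is that omissions and re-test areas become mechanically checkable.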
Slide 11: Change Impact Analysis
- Use traceability analysis to identify every place affected by a proposed change (see the sketch below)
- Identification and evaluation of all interactions affected by the changes
- Evaluation of how changes affect assumptions about COTS and other components
- Identification of regression tests
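Again as an invented illustration, not from the slides: once trace links exist, the impact set of a proposed change is simple graph reachability, and the regression tests to identify are the test artifacts inside that set.

```python
from collections import deque

# Trace links as a directed graph: artifact -> artifacts that depend on it.
# All names are invented for illustration.
dependents = {
    "REQ-001": ["SDD-3.1"],
    "SDD-3.1": ["pump_ctrl.c"],
    "pump_ctrl.c": ["TC-12", "TC-13"],
}

def impact_set(graph, changed):
    """Breadth-first search: everything reachable from the changed artifact."""
    seen, queue = set(), deque([changed])
    while queue:
        for succ in graph.get(queue.popleft(), []):
            if succ not in seen:
                seen.add(succ)
                queue.append(succ)
    return seen

affected = impact_set(dependents, "SDD-3.1")
print("affected:", sorted(affected))
print("re-run:", [a for a in sorted(affected) if a.startswith("TC-")])
```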
Slide 12: EVALUATION
- Verify and validate that the software item satisfies the requirements of its predecessor (e.g., design to requirements; code to design).
- Verify that the software item complies with standards, references, regulations, policies, physical laws, and business rules.
- Validate the design sequences of states and state changes using logic and data flows coupled with domain expertise, prototyping results, engineering principles, or another basis (a sketch follows this slide).
- Validate that the flow of data and control satisfies functionality and performance requirements.
- Validate data usage and format.
- Assess the appropriateness of methods and standards for that item.
- Verify specified configuration management procedures.
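One way to picture the state-sequence item, using a hypothetical controller that is not from the slides: encode the design's allowed transitions as a table and walk candidate event sequences against it.

```python
# Hypothetical pump-controller design: (state, event) -> next state.
ALLOWED = {
    ("idle", "start"): "running",
    ("running", "pause"): "held",
    ("held", "resume"): "running",
    ("running", "stop"): "idle",
    ("held", "stop"): "idle",
}

def validate_sequence(events, state="idle"):
    """Walk an event sequence, rejecting any transition the design forbids."""
    for event in events:
        nxt = ALLOWED.get((state, event))
        if nxt is None:
            return False, f"illegal transition: {event!r} while {state!r}"
        state = nxt
    return True, f"ok, ended in {state!r}"

print(validate_sequence(["start", "pause", "resume", "stop"]))  # valid
print(validate_sequence(["start", "resume"]))                   # rejected
```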
Slide 13: EVALUATION (cont'd)
- Verify internal consistency between the elements of the item and external consistency with its predecessor.
- Verify that all terms and concepts are documented consistently.
- Validate that the logic, computational, and interface precision (e.g., truncation and rounding) satisfy the requirements in the system environment (a sketch follows this slide).
- Validate that the modeled physical phenomena conform to system accuracy requirements and physical laws.
- Verify and validate that the software source code interfaces with hardware, user, operator, software, and other systems for correctness, consistency, completeness, accuracy, and testability.
- Verify functionality: algorithms, states, performance and scheduling, controls, configuration data, exception handling.
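A toy illustration of the precision item (the requirement value is invented): naive floating-point summation drifts relative to correctly rounded summation, and a validation check can compare that drift against a stated accuracy requirement.

```python
import math

REQUIRED_ABS_ERROR = 1e-9            # hypothetical accuracy requirement
samples = [0.1] * 1_000_000          # a million sensor readings of 0.1 units

naive_total = sum(samples)           # plain left-to-right float summation
accurate_total = math.fsum(samples)  # correctly rounded summation

drift = abs(naive_total - accurate_total)
print(f"drift = {drift:.3e}")
print("precision requirement met:", drift < REQUIRED_ABS_ERROR)  # False here
```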
Slide 14: Two Case Studies Affirm the Need for V&V
Case Study 1
- 342 failures of real systems in service
- Logic, calculation, and CM problems prevalent
- QA not implemented in maintenance
- Logic faults: many likely originated in requirements
Case Study 2
- 1 system under development
- A strong focus on requirements analysis found 54% of the faults
- Logic and calculation faults prevalent
Further information on Case Study 1:
http://hissa.nist.gov/effProject/handbook/failure
http://hissa.nist.gov/project/lessonsfailures.pdf
Slide 15: Fault Distribution by Class, Case Study 1 (chart not reproduced)
Slide 16: Case Study 2: System in Development (chart not reproduced)
Slide 17: Case Study 2: System in Development, continued (chart not reproduced)
Slide 18: Prevalent Faults
Case Study 1:
- Logic
- Computation
- Change impact
- Configuration management
- Requirements
- Performance
- Quality assurance
Case Study 2:
- Logic
- Specification
- Computation
- Performance improvements
- Output
- Initialization
- Data handling
Slide 19: EXAMPLE: Prevalent Faults
Calculation:
- increments, boundary values
- precision, rounding
- units, interactions with other functions
- scaling, sequencing, indexing
- previous algorithm incorrect in the new version
Logic:
- incorrectly specified conditions, switches
- interactions
- ranges
- previous logic affected by current changes
Detection approaches:
- specialized analyses; inspections and walkthroughs; simulation; traceability; change impact analysis
(A toy boundary-value example follows.)
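A toy example of the "boundary values" and "incorrectly specified conditions" faults above (threshold and names invented): an off-by-one comparison that misbehaves only exactly at the boundary, caught by test cases just below, at, and just above the limit.

```python
LIMIT = 100  # hypothetical requirement: "alarm at readings of 100 or above"

def should_alarm_buggy(reading):
    return reading > LIMIT   # fault: '>' where the requirement means '>='

def should_alarm_fixed(reading):
    return reading >= LIMIT

# Boundary-value cases: just below, at, and just above the limit.
for reading, expected in [(99, False), (100, True), (101, True)]:
    got = should_alarm_buggy(reading)
    mark = "FAULT FOUND" if got != expected else "ok"
    print(f"reading={reading}: buggy={got}, fixed={should_alarm_fixed(reading)}, "
          f"expected={expected} [{mark}]")
```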
Slide 20: EXAMPLES of V&V Usage
Fault -> V&V detection:
- Calculation: constants incorrectly coded -> focused inspection; unit test (sketch below)
- Change impact: change not verified -> V&V process check on QA; regression test
- CM: use of the wrong master program -> V&V check of CM procedures
- Interface: software does not properly interface with an external device -> interface analysis
- Logic: incomplete or incorrect control logic -> review; control flow analysis; test
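A sketch of the first pairing above (the constant, values, and function are all invented): a unit test pinned to the specification's value of a constant catches a transposed-digit coding of that constant.

```python
import unittest

G_SPEC = 9.80665    # specification value (standard gravity, m/s^2)
G_CODED = 9.86065   # fault: transposed digits, easy to miss when reading the code

def free_fall_distance(t_seconds):
    """Distance fallen from rest after t seconds: d = g * t^2 / 2."""
    return 0.5 * G_CODED * t_seconds ** 2

class TestConstants(unittest.TestCase):
    def test_distance_matches_spec(self):
        # The expected value is computed from the spec constant, not the code's,
        # so this test fails and thereby flags the miscoded constant.
        self.assertAlmostEqual(free_fall_distance(2.0),
                               0.5 * G_SPEC * 2.0 ** 2, places=3)

if __name__ == "__main__":
    unittest.main()
```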
Slide 21: Examples (cont'd)
Fault -> V&V detection:
- Requirements: exceptional conditions not specified -> requirements analysis/review; design evaluation
- Calculation: algorithm incorrect in the reference material -> requirements review against the reference material
- Calculation: failure of a register to reset -> design and code evaluation
- Interface: incorrect interaction of two components yields a false message -> interface analysis; integration test
- Logic: function switches into an inappropriate disconnect -> analysis, with a focus on logic
- Logic: failure under multiple conditions at extreme values -> modeling and simulation
Slide 22: Practices for Assurance
- Focused review and inspection against characteristic faults and the vendor's history.
- Training in the application domain and behavior.
- Traceability analysis.
- Mental execution of potentially troublesome locations (e.g., an algorithm, a loop, an interface).
- Code reading.
- Recording and use of fault information and symptoms.
- Checklists.
- Formal and informal proofs of correctness of algorithms (see the sketch below).
- Simulation.
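One way the "mental execution" and "informal proofs" items can be mechanized (the routine is invented for illustration): state the loop invariant as an assertion, so that code reading and execution both check the same claim.

```python
def max_reading(readings):
    """Largest reading; the asserted invariant doubles as an informal proof."""
    assert readings, "requires at least one reading"
    best = readings[0]
    for i in range(1, len(readings)):
        if readings[i] > best:
            best = readings[i]
        # Invariant after each iteration: best == max(readings[0..i]).
        assert best == max(readings[: i + 1])
    return best

print(max_reading([3.2, 7.9, 1.4, 7.8]))  # 7.9
```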
Slide 23: Practices for Testing
- Test cases aimed at manifesting prevalent symptoms observed by device operators.
- Test cases that cover all pairs of input values, and all three- or four-way combinations (a sketch of the all-pairs idea follows).
- Training in the application domain and specifically in the device's intended behavior.
- Stress testing.
- Change impact analysis and regression testing.
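A brief sketch of the all-pairs bullet (parameters and values invented): enumerate every pair of parameter values that a pairwise-adequate suite must exercise, then check a candidate suite against that set. Here four tests cover all the pairs that exhaustive testing would need eight tests to cover.

```python
from itertools import combinations, product

# Hypothetical device parameters and their values.
params = {
    "mode": ["auto", "manual"],
    "rate": ["low", "high"],
    "alarm": ["on", "off"],
}

# Every pair of parameter values a pairwise-adequate suite must exercise.
required = {
    ((p1, v1), (p2, v2))
    for p1, p2 in combinations(params, 2)
    for v1, v2 in product(params[p1], params[p2])
}

# A candidate 4-test suite (exhaustive testing would need 2*2*2 = 8 tests).
suite = [
    {"mode": "auto",   "rate": "low",  "alarm": "on"},
    {"mode": "auto",   "rate": "high", "alarm": "off"},
    {"mode": "manual", "rate": "low",  "alarm": "off"},
    {"mode": "manual", "rate": "high", "alarm": "on"},
]

covered = {
    ((p1, test[p1]), (p2, test[p2]))
    for test in suite
    for p1, p2 in combinations(params, 2)
}
print(f"pairs required: {len(required)}, covered: {len(covered & required)}")
print("pairwise adequate:", required <= covered)  # True for this suite
```

For larger parameter spaces, a covering-array tool would construct such a suite automatically rather than by hand.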
Slide 24: Testing (cont'd)
- Integration testing focused on interface values under varying conditions.
- SCM release of versions with evidence of change impact analysis, regression testing, and validation of changes.
- System testing under various environmental circumstances: conditions and input data that are incorrect or different from the expected environmental conditions.
- Recording of test results, all failures and their resolution, and the fault type.
Slide 25: NIST Project to Build Fault Profiles
- Industry data (under non-disclosure agreements)
- Profiles generated by language and other characteristics
- Custom profiles generated by users
- Public-domain tools for developers to collect and analyze their own project fault and failure data
- NIST public repository housing the tools, handbook, and current data at http://hissa.nist.gov/effProject
CONTACT: Michael.Koo@nist.gov
Slide 26: EFF Public Repository
(Diagram, not reproduced: the EFF public repository at http://hissa.nist.gov/effProject is the interface to all components: profiles, custom views, statistics, graphics, related information, and the handbook. The public reaches it over HTTP through a data-collection user interface; only NIST enters data.)
Slide 27: SUMMARY
- V&V is a rigorous engineering discipline
- V&V encompasses the entire life cycle
- V&V tasks repeat, with specifics tied to the stage of development
- V&V tasks and their intensity vary with the integrity level
- Other technical methods support the tasks
- V&V testing at all levels requires early planning