Testing: Verification and Validation
David Wettergreen
School of Computer Science, Carnegie Mellon University
Systems Engineering, Carnegie Mellon, © 2006

Definitions

- Error: a problem at its point of origin. Example: a coding problem found in code inspection.
- Defect: a problem beyond its point of origin. Example: a requirements problem found in design inspection, or a system failure during deployment.

Cost of Defects

According to NASA analysis, the cost of finding a defect during system test versus finding the same defect in design is:
- 100:1 in dollars
- 200:1 in time

Cost of Defects

Wow! The cost of finding a defect in a requirement is $100; in test, $10,000.
On average, design and code reviews reduce the cost of testing by 50-80%, including the cost of the reviews themselves.

Cost of Repair

Relative cost to repair a defect, by the phase in which it is found [Boehm81]:

Phase          Relative cost
Requirements   1
Design         5
Build          20
Test           50
Maintain       100

Quality and Profit

Which is better? Time-to-profit may be improved by more investment in build quality early in the process.
[Chart: revenue and support cost over time, comparing time-to-market against time-to-profit for two quality strategies] [Fujimura93]

Product Stability

- Measure defects, correlated with the delivered unit (system, product, component, etc.)
- Normalize defects to importance/criticality
[Chart: number of defects plotted over time]

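To make the normalization idea concrete, here is a minimal Python sketch (not from the slides); the criticality categories and weight values are illustrative assumptions that a real project would calibrate for itself.

```python
# Hypothetical criticality weights; a real project would calibrate these.
CRITICALITY_WEIGHT = {"critical": 10.0, "major": 3.0, "minor": 1.0}

def normalized_defect_count(defects):
    """Weight each defect by its criticality so a stability chart
    reflects importance, not just the raw defect count."""
    return sum(CRITICALITY_WEIGHT[d["criticality"]] for d in defects)

# Example: two minor defects and one critical defect.
defects = [
    {"id": 1, "criticality": "minor"},
    {"id": 2, "criticality": "minor"},
    {"id": 3, "criticality": "critical"},
]
print(normalized_defect_count(defects))  # 12.0
```
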
Definitions

From Webster's:
- Verify: 1) to confirm or substantiate in law by oath; 2) to establish the truth, accuracy, or reality of
- Validate: 2) to support or corroborate on a sound or authoritative basis

What's the Difference?

- Verification is determining whether the system is built right: correctly translating the design into the implementation.
- Validation is determining whether the right system is built: does the implementation meet the requirements?

Verification and Validation

- Verification is applied at each transition in the development process.
- Validation is applied to the results of each phase, either for acceptance or for process improvement.
- Inspection for verification; testing for validation.

Verification and Validation

[Diagram: User → Requirements → Architecture → Design → Product]
What is the practical difference between verification and validation?

Verification and Validation

[Diagram: User → Requirements → Architecture → Design → Product, with Verification applied at each forward transition, and Design Validation, Architectural Validation, Requirements Validation, and User Validation checking the product back against each earlier phase]

Real-Time Systems

Not only do we have the standard software and system concerns, but performance is crucial as well.
[Diagram: a real-time system viewed along temporal, structural, and functional dimensions]

Real-Time Systems

- In real-time systems (soft, firm, or hard), correctness of function depends upon the ability of the system to be timely.
- Correct functionality may also depend on reliability, robustness, availability, and security.
- If the system cannot meet any of these constraints, it may be defective.

Requirements Inspections

Inspecting requirements offers the biggest potential return on investment.
Attributes of a good requirement specification:
- Unambiguous
- Complete
- Verifiable
- Consistent
- Modifiable
- Traceable
- Usable

Requirements Inspections

Inspection objectives:
- Each requirement is consistent with and traceable to prior information.
- Each requirement is clear, concise, internally consistent, unambiguous, and testable.
Are we building the right system?

Design Inspection

An opportunity to catch problems early. Objectives:
- Does the design address all the requirements?
- Are all design elements traceable to specific requirements?
- Does the design conform to applicable standards?
Are we building the system correctly?

Test Procedure Inspections

Focus on verifying the validation process: does the test validate all the requirements using a formal procedure with predictable results and metrics?
Objectives:
- Do validation tests accurately reflect requirements?
- Have validation tests taken advantage of knowledge of the design?
- Is the system ready for validation testing?

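One way to check the first objective mechanically is a requirements-to-tests traceability check. The sketch below is illustrative only; the requirement IDs, test names, and mapping are invented.

```python
# Minimal traceability check: every requirement should be exercised by
# at least one validation test. IDs and mapping are illustrative.
requirements = {"REQ-1", "REQ-2", "REQ-3"}

validation_tests = {
    "test_login_latency": {"REQ-1"},
    "test_failover": {"REQ-2", "REQ-3"},
}

covered = set().union(*validation_tests.values())
uncovered = requirements - covered
if uncovered:
    print("Requirements with no validation test:", sorted(uncovered))
else:
    print("All requirements are covered by at least one validation test.")
```
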
Requirements Validation

Check:
- Validity: Does the system provide the functions that best support the customer's need?
- Consistency: Are there any requirements conflicts?
- Completeness: Are all required functions included?
- Realism: Can the requirements be implemented with available resources and technology?
- Verifiability: Can the requirements be checked?

Testing

- Testing is an aspect of verification and validation.
- Testing can verify correct implementation of a design.
- Testing can validate accomplishment of requirement specifications.
- Testing is often tightly coupled with implementation (integration and evaluation), but it is also important to production.

When to Test

To test, you need something to evaluate:
- Algorithms
- Prototypes
- Components/sub-systems
- Functional implementation
- Complete implementation
- Deployed system
Testing can begin as soon as there's something to test!

Testing Participants

[Diagram: System Engineering, Component Engineering, and Test Engineering and their roles across test requirements and evaluation, test architecture, test planning, test measurements, test conduct and analysis, test equipment, and test equipment requirements] [Kossiakoff03]

Testing Strategies

White box (or glass box) testing:
- Component-level testing where internal elements are exposed
- Test cases are developed with the developers' knowledge of critical design issues
- Functional testing for verification

Black box testing:
- Component-level testing where the structure of the test object is unknown
- Test cases are developed using the specifications only
- Operational testing for validation

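As an illustration of the two strategies applied to the same component, here is a small Python sketch (not from the lecture); the clamp function and the test cases are invented for the example.

```python
import unittest

def clamp(value, low, high):
    """Clamp value into the inclusive range [low, high]."""
    return max(low, min(value, high))

class WhiteBoxTests(unittest.TestCase):
    # Written with knowledge of the implementation: exercise the
    # max() branch and the min() branch explicitly.
    def test_below_low_takes_max_branch(self):
        self.assertEqual(clamp(-5, 0, 10), 0)

    def test_above_high_takes_min_branch(self):
        self.assertEqual(clamp(15, 0, 10), 10)

class BlackBoxTests(unittest.TestCase):
    # Written from the specification only: the result must always
    # lie within the requested range.
    def test_result_always_in_range(self):
        for v in (-100, 0, 5, 10, 100):
            self.assertTrue(0 <= clamp(v, 0, 10) <= 10)

if __name__ == "__main__":
    unittest.main()
```
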
Black Box Testing

Positive tests:
- Valid test data is derived from the specifications
- Both high- and low-probability data is used
- Tests reliability

Negative tests:
- Invalid test data is derived by violating the specifications
- Tests the robustness of the test object

Both kinds of tests, over both high- and low-probability events, are needed to develop statistical evidence for reliability and robustness.

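A minimal sketch of positive and negative tests against a simple specification; the parse_age function, its range, and the sample inputs are assumptions made for illustration.

```python
import unittest

def parse_age(text):
    """Assumed spec: accept a decimal string for an age from 0 to 130;
    otherwise raise ValueError."""
    age = int(text)  # raises ValueError for non-numeric input
    if not 0 <= age <= 130:
        raise ValueError(f"age out of range: {age}")
    return age

class PositiveTests(unittest.TestCase):
    # Valid data derived from the specification (tests reliability).
    def test_typical_and_rare_valid_values(self):
        self.assertEqual(parse_age("42"), 42)    # high-probability input
        self.assertEqual(parse_age("130"), 130)  # low-probability but valid

class NegativeTests(unittest.TestCase):
    # Invalid data that violates the specification (tests robustness).
    def test_rejects_invalid_input(self):
        for bad in ("-1", "131", "forty-two", ""):
            with self.assertRaises(ValueError):
                parse_age(bad)

if __name__ == "__main__":
    unittest.main()
```
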
Test Envelopes

Given a behavior with two parameters, we can establish the test envelope. Useful for identifying boundary conditions.
[Diagram: X-Y plot bounded by Min X, Max X, Min Y, Max Y, showing normal, boundary, and abnormal regions]

Test Envelopes

Boundary conditions define positive and negative tests. Test cases should include:
- High-probability zones in the normal region
- High-probability zones in the abnormal region
- Low-probability zones in the abnormal region, if the outcome is catastrophic

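A sketch of how boundary test points might be generated for a two-parameter envelope; the X/Y limits and the tolerance EPS are assumed values, not from the lecture.

```python
import itertools

# Hypothetical envelope for a behavior with two parameters.
X_MIN, X_MAX = 0.0, 100.0
Y_MIN, Y_MAX = -10.0, 10.0
EPS = 0.001

def envelope_test_points():
    """Generate points on or just inside the envelope (positive tests)
    and corner points just outside it (negative tests)."""
    inside_x = (X_MIN, X_MIN + EPS, X_MAX - EPS, X_MAX)
    inside_y = (Y_MIN, Y_MIN + EPS, Y_MAX - EPS, Y_MAX)
    outside_x = (X_MIN - EPS, X_MAX + EPS)
    outside_y = (Y_MIN - EPS, Y_MAX + EPS)
    positive = [(x, y, True) for x, y in itertools.product(inside_x, inside_y)]
    negative = [(x, y, False) for x, y in itertools.product(outside_x, outside_y)]
    return positive + negative

# Each point would be fed to the system under test; the flag says whether
# the system should treat the input as normal or handle it as abnormal.
points = envelope_test_points()
print(len(points), "boundary test points generated")
```
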
Hierarchical Testing

Top-down testing:
- Begins early in development
- High-level components are developed first
- Low-level components are "stubbed"
- Allows verification of the overall structure of the system (testing the architectural pattern and infrastructure)

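A toy sketch of the top-down idea: the high-level controller is real while the low-level component is stubbed. The thermostat example and all names are invented for illustration.

```python
# Top-down sketch: the controller exists, the sensor driver does not yet.

class SensorDriverStub:
    """Stand-in for the not-yet-built low-level component."""
    def read_temperature(self):
        return 20.0  # canned value, good enough to exercise the controller

class ThermostatController:
    """High-level component under structural test."""
    def __init__(self, sensor):
        self.sensor = sensor

    def heater_on(self, setpoint):
        return self.sensor.read_temperature() < setpoint

# Verifies the overall structure before the real driver is available.
controller = ThermostatController(SensorDriverStub())
assert controller.heater_on(setpoint=22.0) is True
assert controller.heater_on(setpoint=18.0) is False
print("top-down structural test passed")
```
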
Hierarchical Testing

Bottom-up testing:
- The lowest-level components are developed first
- Dedicated "test harnesses" are developed to operationally test the low-level components
- A good approach for flat, distributed, functionally partitioned systems (pipeline architectures)

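A small bottom-up sketch: a dedicated harness drives a finished low-level routine before any higher layers exist. The CRC-8 routine and its checks are an invented example, not from the slides.

```python
# Bottom-up sketch: a harness operationally tests a low-level component.

def crc8(data: bytes, poly: int = 0x07) -> int:
    """Low-level checksum routine (the component under test)."""
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = ((crc << 1) ^ poly) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
    return crc

def harness():
    """Dedicated test harness: feed known inputs, check invariants."""
    assert crc8(b"") == 0
    assert crc8(b"hello") == crc8(b"hello")   # deterministic
    assert crc8(b"hello") != crc8(b"hellp")   # detects a one-byte change
    print("bottom-up harness: all checks passed")

if __name__ == "__main__":
    harness()
```
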
Testing Strategies

Regression testing:
- Testing done after the system has been modified
- Assures that the things the system used to do, and should still do, continue to function
- Assures that any new functionality behaves as specified

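A minimal sketch of a regression suite around a modified function; the discount function and its codes are invented for illustration.

```python
import unittest

def discount(price, code=None):
    """Existing behavior: 10% off with 'SAVE10'.
    New behavior added in this release: 20% off with 'SAVE20'."""
    if code == "SAVE10":
        return round(price * 0.90, 2)
    if code == "SAVE20":
        return round(price * 0.80, 2)
    return price

class RegressionTests(unittest.TestCase):
    # What the system used to do and should still do.
    def test_existing_discount_still_works(self):
        self.assertEqual(discount(100.0, "SAVE10"), 90.0)

    def test_no_code_still_full_price(self):
        self.assertEqual(discount(100.0), 100.0)

class NewFunctionalityTests(unittest.TestCase):
    # The new behavior introduced by the modification.
    def test_new_discount_behaves_as_specified(self):
        self.assertEqual(discount(100.0, "SAVE20"), 80.0)

if __name__ == "__main__":
    unittest.main()
```
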
Testing Complications

When a test discrepancy occurs (the test "fails"), the fault could lie in:
- The test equipment (test harness)
- The test procedures
- The test execution
- The test analysis
- The system under test
- An impossible performance requirement
The first step in resolution is to diagnose the source of the test discrepancy.

Operational Testing

Validation technique: simulation
- Simulate the real world to provide inputs to the system
- Simulate the real world to evaluate the output from the system
- Simulate the system itself to evaluate its fitness
Simulation can be expensive.

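A toy sketch of the first use, simulating the real world to provide inputs to the system; the autopilot_correction controller, the noise levels, and the setpoint are all invented for illustration.

```python
import random

# Hypothetical controller under test: keeps altitude near a setpoint.
def autopilot_correction(altitude, setpoint=10000.0, gain=0.1):
    return gain * (setpoint - altitude)

def simulate(steps=1000, seed=42):
    """Simulate the real world (sensor noise plus disturbances) to
    provide inputs to the system and evaluate its output."""
    rng = random.Random(seed)
    altitude = 9500.0
    worst_error = 0.0
    for _ in range(steps):
        noise = rng.gauss(0.0, 5.0)                    # simulated sensor noise
        correction = autopilot_correction(altitude + noise)
        altitude += correction + rng.gauss(0.0, 1.0)   # simulated disturbance
        worst_error = max(worst_error, abs(10000.0 - altitude))
    return worst_error

print("worst altitude error over the run:", round(simulate(), 1))
```
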
Operational Testing

Simulation is a primary tool in real-time systems development.
[Diagram: avionics subsystems - radar, cockpit displays, flight controls (flaps, ailerons, rudder, elevator), inertial navigation, and propulsion systems]

Operational Testing

Avionics integration labs develop an "airplane on a bench".

Operational Testing

Full-motion simulators are developed to train aircrews and to test the usability and human factors of flight control systems.

Operational Testing

Radar, INS, and offensive, operability, and defensive avionics are tested in anechoic chambers.

Operational Testing

Flight control software and systems are installed on "flying test beds" to ensure they work.

Operational Testing

The whole system is put together and a flight test program is undertaken.

Operational Test Plan

Types of tests:
- Unit tests: test a component
- Integration tests: test a set of components
- System tests: test an entire system
- Acceptance tests: have users test the system

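A brief sketch contrasting the first two levels on the same code; the tokenizer and word-count functions are invented for illustration.

```python
import unittest

# Two small components, tested at different levels.
def tokenize(text):
    return text.lower().split()

def word_count(text):
    return len(tokenize(text))

class UnitTests(unittest.TestCase):
    # Unit test: a single component in isolation.
    def test_tokenize(self):
        self.assertEqual(tokenize("Hello World"), ["hello", "world"])

class IntegrationTests(unittest.TestCase):
    # Integration test: the components working together.
    def test_word_count_uses_tokenizer(self):
        self.assertEqual(word_count("one two three"), 3)

if __name__ == "__main__":
    unittest.main()
```
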
Operational Test Plan

The operational test plan should identify:
- Objectives
- Prerequisites
- Preparation, participants, and logistics
- Schedule
- Tests
- Expected outcomes and completion
For each specific test, detail the measurement/metric, objective, and procedure.

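One way to keep a plan entry structured along these lines is to capture it as data; the sketch below uses field values that are entirely invented for illustration.

```python
# Sketch of one operational test plan entry; all values are illustrative.
operational_test_plan = {
    "objectives": "Validate navigation accuracy requirements",
    "prerequisites": ["Unit tests passed", "Test range reserved"],
    "participants": ["Test engineer", "System engineer", "Safety officer"],
    "schedule": "2006-06-12 to 2006-06-16",
    "tests": [
        {
            "name": "NAV-ACC-01",
            "measurement": "Position error (m, CEP)",
            "objective": "Error under 10 m for 95% of fixes",
            "procedure": "Drive surveyed course; log position every second",
            "expected_outcome": "Requirement satisfied; results archived",
        },
    ],
}

for test in operational_test_plan["tests"]:
    print(test["name"], "-", test["objective"])
```
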
Verification and Validation Plans

- Test plans, or more rigorous V&V plans, are often left until late in the development process.
- Many development models do not consider V&V at all, or focus only on testing after the system is implemented.
- Verifying progress in development is a continuous, parallel process.
- Start thinking about V&V during requirements specification:
  How will these requirements be verified in design? Think traceability.
  How will the implementation be verified?
  What formulation of the requirements will clarify system validation?

Review

- Testing can verify correct implementation of a design and verify operational performance.
- Testing can validate accomplishment of requirement specifications.
- The variety of test strategies must be tailored to the specific application depending upon:
  likely failure modes, known complexities, and reliability and safety concerns.