
Testing: Verification and Validation
David Wettergreen, School of Computer Science, Carnegie Mellon University
Systems Engineering, Carnegie Mellon © 2006

Definitions
Error: a problem at its point of origin. Example: a coding problem found in code inspection.
Defect: a problem beyond its point of origin. Example: a requirements problem found in design inspection, or a system failure during deployment.

Cost of Defects
According to NASA analysis, the cost of finding a defect during system test versus finding it in design is 100:1 in dollars and 200:1 in time.

Cost of Defects (Wow!)
The cost of finding a defect in a requirement is $100; in test, $10,000.
On average, design and code reviews reduce the cost of testing by 50-80%, including the cost of the reviews.

Cost of Repair
[Chart: cost of repair rises steeply across the lifecycle phases Requirements, Design, Build, Test, Maintain. Boehm81]

Quality and Profit
Which is better? Time-to-profit may be improved by more investment in build quality early in the process.
[Chart: two scenarios comparing time-to-market and time-to-profit in terms of support cost and revenue. Fujimura93]

Product Stability
Measure defects, correlated with the delivered unit (system, product, component, etc.).
Normalize defects to importance/criticality.
[Chart: number of defects over time]

Definitions
From Webster's:
Verify: 1) to confirm or substantiate in law by oath; 2) to establish the truth, accuracy, or reality of.
Validate: 2) to support or corroborate on a sound or authoritative basis.

What's the difference?
Verification is determining whether the system is built right: correctly translating design into implementation.
Validation is determining whether the right system is built: does the implementation meet the requirements?

Verification and Validation
Verification is applied at each transition in the development process.
Validation is applied to the results of each phase, either for acceptance or for process improvement.
Inspection for verification; testing for validation.

Verification and Validation
What is the practical difference between verification and validation?
[Diagram, built up across several slides: a development chain User → Requirements → Architecture → Design → Product. Verification applies at each transition along the chain; validation links the Product back to earlier stages as Design Validation, Architectural Validation, Requirements Validation, and User Validation.]

Real-Time Systems
Not only do we have the standard software and system concerns, but performance is crucial as well.
[Diagram: a real-time system viewed along temporal, structural, and functional dimensions]

Real-Time Systems
In real-time systems (soft, firm, or hard), correctness of function depends upon the ability of the system to be timely.
Correct functionality may also depend on reliability, robustness, availability, and security.
If the system cannot meet any of these constraints, it may be defective.

Requirements Inspections
Biggest potential return on investment.
Attributes of a good requirement specification: unambiguous, complete, verifiable, consistent, modifiable, traceable, usable.

Requirements Inspections
Inspection objectives:
Each requirement is consistent with, and traceable to, prior information.
Each requirement is clear, concise, internally consistent, unambiguous, and testable.
Are we building the right system?

Design Inspection
Opportunity to catch problems early.
Objectives:
Does the design address all the requirements?
Are all design elements traceable to specific requirements?
Does the design conform to applicable standards?
Are we building the system correctly?

Test Procedure Inspections
Focus on verifying the validation process: does the test validate all the requirements using a formal procedure with predictable results and metrics?
Objectives:
Do validation tests accurately reflect requirements?
Have validation tests taken advantage of knowledge of the design?
Is the system ready for validation testing?

Requirements Validation
Check:
Validity - Does the system provide the functions that best support the customer's need?
Consistency - Are there any requirements conflicts?
Completeness - Are all required functions included?
Realism - Can the requirements be implemented with available resources and technology?
Verifiability - Can the requirements be checked?

Testing
Testing is an aspect of verification and validation.
Testing can verify correct implementation of a design.
Testing can validate accomplishment of requirement specifications.
Testing is often tightly coupled with implementation (integration and evaluation), but it is also important to production.

When to Test
To test, you need something to evaluate: algorithms, prototypes, components/sub-systems, a functional implementation, a complete implementation, a deployed system.
Testing can begin as soon as there's something to test!

Testing Participants
Roles: System Engineering, Component Engineering, Test Engineering, Test Equipment.
Responsibilities span test architecture, test planning, test measurements, test requirements and evaluation, test conduct and analysis, and test equipment requirements. [Kossiakoff03]

Testing Strategies
White Box (or Glass Box) Testing: component-level testing where internal elements are exposed. Test cases are developed with the developers' knowledge of critical design issues. Functional testing for verification.
Black Box Testing: component-level testing where the structure of the test object is unknown. Test cases are developed using specifications only. Operational testing for validation.
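To make the contrast concrete, here is a minimal sketch in Python. The saturate() component is hypothetical, invented for illustration: the white-box cases are chosen with knowledge of its internal branches, while the black-box case checks only the specified input/output behavior.

```python
import unittest

def saturate(value, low, high):
    """Clamp value into the range [low, high]."""
    if value < low:          # internal branch 1
        return low
    if value > high:         # internal branch 2
        return high
    return value             # internal branch 3

class WhiteBoxTests(unittest.TestCase):
    """Cases chosen with knowledge of the implementation: one per branch."""
    def test_below_range_branch(self):
        self.assertEqual(saturate(-5, 0, 10), 0)
    def test_above_range_branch(self):
        self.assertEqual(saturate(99, 0, 10), 10)
    def test_passthrough_branch(self):
        self.assertEqual(saturate(7, 0, 10), 7)

class BlackBoxTests(unittest.TestCase):
    """Cases derived only from the specification:
    'the result always lies within [low, high]'."""
    def test_result_always_in_range(self):
        for value in (-1000, -1, 0, 5, 10, 11, 1000):
            result = saturate(value, 0, 10)
            self.assertTrue(0 <= result <= 10)

if __name__ == "__main__":
    unittest.main()
```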

Black Box Testing
Positive tests: valid test data is derived from the specifications; both high- and low-probability data is used; tests reliability.
Negative tests: invalid test data is derived by violating the specifications; tests the robustness of the test object.
Both kinds of tests, covering both high- and low-probability events, are needed to develop statistical evidence for reliability and robustness.
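As an illustration, a hedged sketch using a hypothetical parse_heading() function: the positive tests feed valid headings drawn from the specification, and the negative tests feed data that deliberately violates it.

```python
import unittest

def parse_heading(text):
    """Parse a compass heading in degrees; reject anything outside 0-359."""
    heading = int(text)          # raises ValueError on non-numeric input
    if not 0 <= heading <= 359:
        raise ValueError(f"heading out of range: {heading}")
    return heading

class PositiveTests(unittest.TestCase):
    """Valid data derived from the specification (tests reliability)."""
    def test_typical_and_extreme_valid_inputs(self):
        for text, expected in [("0", 0), ("180", 180), ("359", 359)]:
            self.assertEqual(parse_heading(text), expected)

class NegativeTests(unittest.TestCase):
    """Invalid data violating the specification (tests robustness)."""
    def test_invalid_inputs_are_rejected(self):
        for text in ["-1", "360", "abc", ""]:
            with self.assertRaises(ValueError):
                parse_heading(text)

if __name__ == "__main__":
    unittest.main()
```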

Test Envelopes
Given a behavior with two parameters, we can establish the test envelope.
Useful for identifying boundary conditions.
[Diagram: an X-Y plot bounded by Min X/Max X and Min Y/Max Y, with normal, boundary, and abnormal regions]

Test Envelopes
Boundary conditions define positive and negative tests.
Test cases should include:
high-probability zones in the normal region;
high-probability zones in the abnormal region;
low-probability zones in the abnormal region, if the outcome is catastrophic.
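A minimal sketch of such a two-parameter envelope, with illustrative limits standing in for the Min/Max values on the slide: points just inside the limits serve as positive tests, points just outside as negative tests.

```python
# Illustrative envelope limits (assumed values, not from the slides).
MIN_X, MAX_X = 0.0, 100.0
MIN_Y, MAX_Y = -50.0, 50.0

def region(x, y):
    """Classify a test point relative to the envelope."""
    inside_x = MIN_X <= x <= MAX_X
    inside_y = MIN_Y <= y <= MAX_Y
    if inside_x and inside_y:
        on_edge = x in (MIN_X, MAX_X) or y in (MIN_Y, MAX_Y)
        return "boundary" if on_edge else "normal"
    return "abnormal"

# Boundary conditions define the positive/negative split: the four
# corners, plus points just inside and just outside each limit.
EPS = 1e-6
test_points = [
    (MIN_X, MIN_Y), (MIN_X, MAX_Y), (MAX_X, MIN_Y), (MAX_X, MAX_Y),  # corners
    (MIN_X + EPS, 0.0), (MAX_X - EPS, 0.0),   # just inside (positive tests)
    (MIN_X - EPS, 0.0), (MAX_X + EPS, 0.0),   # just outside (negative tests)
]
for point in test_points:
    print(point, "->", region(*point))
```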

Hierarchical Testing
Top-down testing:
Developed early during development.
High-level components are developed; low-level components are "stubbed".
Allows verification of the overall structure of the system (testing the architectural pattern and infrastructure).
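A hedged sketch of the idea, using a hypothetical Navigator component that depends on a not-yet-implemented sensor: the stub returns canned readings so the high-level structure can be verified before the real component exists.

```python
class SensorStub:
    """Stub standing in for the unimplemented low-level component:
    returns a canned reading so high-level logic can be exercised."""
    def read_position(self):
        return (10.0, 20.0)   # canned value, not a real measurement

class Navigator:
    """High-level component under test; its structure is real,
    only the dependency is stubbed."""
    def __init__(self, sensor):
        self.sensor = sensor

    def distance_to(self, target):
        x, y = self.sensor.read_position()
        tx, ty = target
        return ((tx - x) ** 2 + (ty - y) ** 2) ** 0.5

# Verify the overall structure before the sensor is implemented.
nav = Navigator(SensorStub())
assert abs(nav.distance_to((13.0, 24.0)) - 5.0) < 1e-9
```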

Hierarchical Testing
Bottom-up testing:
Lowest-level components are developed first.
Dedicated "test harnesses" are developed to operationally test the low-level components.
A good approach for flat, distributed, functionally partitioned systems (pipeline architecture).
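A small, hypothetical harness sketch: a finished low-level component (a simple 8-bit checksum, chosen only for illustration) is driven operationally with input/expected-output pairs, and failures are collected for reporting.

```python
def crc8(data: bytes) -> int:
    """Low-level component finished first: a simple 8-bit CRC."""
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = ((crc << 1) ^ 0x07) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
    return crc

def harness(component, cases):
    """Dedicated test harness: feed inputs, compare outputs, report failures."""
    failures = []
    for data, expected in cases:
        actual = component(data)
        if actual != expected:
            failures.append((data, expected, actual))
    return failures

# Expected values here are illustrative; a real harness would take them
# from the component's specification.
print(harness(crc8, [(b"", 0x00), (b"\x00", 0x00)]))
```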

Testing Strategies
Regression Testing: testing that is done after the system has been modified.
Assure that the things the system used to do, and still should do, continue to function.
Assure that any new functionality behaves as specified.
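A minimal regression-test sketch, assuming a hypothetical format_id() helper whose old behavior must survive a modification: one test pins the pre-existing behavior, the other covers the new functionality.

```python
import unittest

def format_id(n):
    """After modification: now zero-pads to 6 digits (was 4)."""
    return f"{n:06d}"

class RegressionTests(unittest.TestCase):
    def test_existing_behavior_still_holds(self):
        # Pinned before the change: IDs are numeric and unique per input.
        self.assertTrue(format_id(42).isdigit())
        self.assertNotEqual(format_id(1), format_id(2))

    def test_new_behavior_as_specified(self):
        # Covers the new functionality introduced by the change.
        self.assertEqual(format_id(42), "000042")

if __name__ == "__main__":
    unittest.main()
```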

Testing Complications
When a test discrepancy occurs (the test "fails"), the fault could lie in:
test equipment (the test harness);
test procedures;
test execution;
test analysis;
the system under test;
an impossible performance requirement.
The first step in resolution is to diagnose the source of the test discrepancy.

Operational Testing
Validation technique: simulation.
Simulate the real world to provide inputs to the system.
Simulate the real world to evaluate the output from the system.
Simulate the system itself to evaluate its fitness.
Simulation can be expensive.
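A hedged sketch of simulation-based validation, with a hypothetical thermostat controller as the system under test: a crude simulated world generates its inputs and evaluates its outputs over many steps.

```python
import random

def controller(temperature, setpoint=20.0):
    """System under test: turn the heater on below the setpoint."""
    return temperature < setpoint

def simulate(steps=1000, seed=1):
    """Simulated environment: noisy temperatures drive the controller,
    and a simple world model integrates the effect of its outputs."""
    rng = random.Random(seed)
    temperature = 15.0
    for _ in range(steps):
        heater_on = controller(temperature)
        # Crude world model: heater warms the room, ambient losses cool it.
        temperature += (0.5 if heater_on else -0.3) + rng.uniform(-0.1, 0.1)
    return temperature

# Fitness check: after a long run the room should settle near the setpoint.
final = simulate()
assert 15.0 < final < 25.0, final
```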

Operational Testing
Simulation is a primary tool in real-time systems development.
[Diagram, repeated on the following slides: avionics subsystems including Radar, Cockpit Displays, Flight Controls (Flaps, Ailerons, Rudder, Elevator), Inertial Navigation, and Propulsion Systems]

Operational Testing
Avionics integration labs develop an "airplane on a bench".

Operational Testing
Full-motion simulators are developed to train aircrews, test the usability of flight control systems, and evaluate human factors.

Operational Testing
Radar, INS, offensive, operability, and defensive avionics are tested in anechoic chambers.

Operational Testing
Flight control software and systems are installed on "flying test beds" to ensure they work.

Operational Testing
The whole system is put together and a flight test program is undertaken.

Operational Test Plan
Types of tests:
Unit tests - test a component.
Integration tests - test a set of components.
System tests - test an entire system.
Acceptance tests - have users test the system.

Operational Test Plan
The operational test plan should identify: objectives; prerequisites; preparation, participants, and logistics; schedule; tests; expected outcomes and completion.
For each specific test, detail: measurement/metric, objective, procedure.
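One way to record a specific test, sketched as a small data structure: the field names follow the slide (objective, measurement/metric, procedure, expected outcome, prerequisites), while the names and example values are purely illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class TestPlanEntry:
    name: str
    objective: str            # what the test is meant to demonstrate
    metric: str               # measurement/metric to be collected
    procedure: list[str]      # ordered steps to execute
    expected_outcome: str
    prerequisites: list[str] = field(default_factory=list)

# Hypothetical entry, invented for illustration.
entry = TestPlanEntry(
    name="INS drift check",
    objective="Validate navigation accuracy requirement REQ-NAV-012",
    metric="Position drift (meters) after 60 minutes",
    procedure=["Align INS", "Hold stationary for 60 min", "Record drift"],
    expected_outcome="Drift < 100 m",
    prerequisites=["INS calibrated", "Test bench powered"],
)
print(entry.name, "->", entry.expected_outcome)
```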

Verification and Validation Plans
Test plans, or more rigorous V&V plans, are often left until late in the development process.
Many development models do not consider V&V, or focus only on testing after the system is implemented.
Verifying progress in development is a continuous, parallel process.
Start thinking about V&V during requirements specification:
How will these requirements be verified in design? Think traceability.
How will the implementation be verified?
What formulation of the requirements will clarify system validation?

Review
Testing can verify correct implementation of a design and verify operational performance.
Testing can validate accomplishment of requirement specifications.
The variety of test strategies must be tailored to the specific application, depending upon: likely failure modes, known complexities, and reliability and safety concerns.