CPSC 871 John D. McGregor Module 6 Session 2 Validation and Verification

Definitions
- Integration is the assembling of pieces into a whole – subsystems into a system, or systems into a system of systems.
- Verification is determining that an element performs its functions without fault.
- Validation is determining that what the element does is what it should do.

Relationship
[Figure: individual elements undergo verification; verified elements are integrated into an integrated system, which then undergoes validation.]
Verification techniques are applied before an element is released. When a specific set of elements has been verified, they are integrated into a larger element. The functionality of the integrated system is validated before the system is made available for use.

The “V-model”
[Figure: the V-model. Development activities (Requirements, Analysis, Architectural Design, Detailed Design, Coding) on one side are matched by checking activities (review, Guided Inspection, Unit, Integration, and System testing) on the other.]

Parallel model
[Figure: development and testing as parallel tracks. Development activities: Requirements, Analysis, Architectural Design, Detailed Design, Implementation. Artifacts produced: use cases, analysis models, architecture description, design models. Checking activities applied to them: review, Guided Inspection, ATAM, Guided Inspection, and Unit/Integration/System testing.]

Testing artifacts
- Plans
  - The usual levels – unit, etc.
  - How testing is coordinated between core assets and products
- Test cases
  - Data sets
- Infrastructure
  - Must be able to execute the software in a controlled environment so that the outputs can be observed and compared to the expected results (a minimal sketch follows this list).
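A concrete (and intentionally tiny) illustration of the infrastructure bullet, assuming JUnit 5 and a hypothetical PaymentCalculator component: the harness executes the element in a controlled environment with data from a test case and compares the observed output to the expected output.

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

// Hypothetical element under test, included only so the sketch is self-contained.
class PaymentCalculator {
    double totalDue(double amount, int daysLate) {
        return daysLate > 10 ? amount * 1.05 : amount;   // 5% late fee after a 10-day grace period
    }
}

// Minimal infrastructure: execute the element in a controlled environment,
// observe its output, and compare it to the expected result.
class PaymentCalculatorTest {
    @Test
    void lateFeeIsAppliedAfterGracePeriod() {
        PaymentCalculator calc = new PaymentCalculator();   // controlled setup
        double observed = calc.totalDue(100.00, 12);        // input from the test case's data set
        assertEquals(105.00, observed, 0.001);              // expected output
    }
}
```

The same pattern scales up: integration and system levels differ mainly in how much of the environment must be controlled and how the expected results are obtained.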

IEEE Test Plan - 1
- Introduction
- Test Items
- Tested Features
- Features Not Tested (per cycle)
- Testing Strategy and Approach
  - Syntax
  - Description of Functionality
  - Arguments for tests
  - Expected Output
  - Specific Exclusions
  - Dependencies
  - Test Case Success/Failure Criteria

IEEE Test Plan - 2
- Pass/Fail Criteria for the Complete Test Cycle
- Entrance Criteria/Exit Criteria
- Test Suspension Criteria and Resumption Requirements
- Test Deliverables/Status Communications Vehicles
- Testing Tasks
- Hardware and Software Requirements
- Problem Determination and Correction Responsibilities
- Staffing and Training Needs/Assignments
- Test Schedules
- Risks and Contingencies
- Approvals

Coverage
- A measure that can be used to compare validation (and verification) techniques.
- An item is “covered” when it has been touched by at least one test case.
- An inspection technique that uses a scenario as a test case will touch several artifacts, including interfaces and implementation designs. The next scenario should then be selected to touch other artifacts.
- The more disjoint the sets of “touched artifacts” are, the better the coverage per set of scenarios (see the sketch after this list).
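As a rough sketch of the “touched artifacts” idea (the artifact names are hypothetical), the fragment below computes what two inspection scenarios cover and where they overlap; the smaller the overlap, the more coverage each additional scenario contributes.

```java
import java.util.HashSet;
import java.util.Set;

// Sketch: each scenario "touches" a set of artifacts (interfaces, design elements).
// Coverage grows fastest when those sets are nearly disjoint.
public class CoverageSketch {
    public static void main(String[] args) {
        Set<String> scenario1 = Set.of("IAccount", "AccountImpl", "ILedger");
        Set<String> scenario2 = Set.of("ILedger", "ReportWriter", "IAudit");

        Set<String> covered = new HashSet<>(scenario1);
        covered.addAll(scenario2);                        // union: everything touched so far

        Set<String> overlap = new HashSet<>(scenario1);
        overlap.retainAll(scenario2);                     // intersection: effort spent re-touching artifacts

        System.out.println("Artifacts covered: " + covered.size());    // prints 5
        System.out.println("Overlap between scenarios: " + overlap);   // prints [ILedger]
    }
}
```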

Coverage - 2
- The domain determines how much coverage is sufficient. Airworthy systems need much more complete coverage than a business system, where faults can be recovered from.
- But coverage is not the whole story…

Designing for testability - 1
- Testability: the ease with which a piece of software gives up its faults.
- Observability
  - Special interfaces
  - Meta-level access to state
- Controllability
  - Methods that can set state
  - Meta-level access to these state-changing methods
- Product line: a separate test interface that can be compiled into a test version but eliminated from the production version (see the sketch after this list).
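A minimal sketch of the separate-test-interface idea, using a hypothetical CruiseController component: the observation and control hooks live in a test-only interface that the test build implements and the production build can omit.

```java
// Test-only interface: meta-level access to state for observability and
// controllability. Compiled into the test version of the component and
// left out of (or stubbed in) the production version. All names are hypothetical.
interface CruiseControllerTestAccess {
    String peekState();           // observability: read internal state
    void forceState(String s);    // controllability: drive the component into a chosen state
}

class CruiseController implements CruiseControllerTestAccess {
    private String state = "IDLE";

    public void engage() { state = "ENGAGED"; }   // normal production API

    // Test-interface implementation; excluded from the production build
    // (for example, by a build flag or by a test-only subclass).
    @Override public String peekState() { return state; }
    @Override public void forceState(String s) { state = s; }
}
```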

Designing for testability - 2
- The strategy must be based on the language
  - Java – can use reflection (see the sketch after this list)
  - C++ – must have a more static design
- Separate the interfaces for observing (get methods) and controlling (set methods).
- Compose components based on purpose; only add those interfaces needed for the purpose.
- Couple the composition of tests with the composition of components.
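For the Java option, reflection provides the meta-level access to state without widening the production API. A sketch, reusing the hypothetical CruiseController from the previous slide's example (the field name "state" is also hypothetical):

```java
import java.lang.reflect.Field;

// Sketch: use reflection to observe and control a private field of the unit under test.
public class ReflectionAccess {
    public static void main(String[] args) throws Exception {
        CruiseController unit = new CruiseController();

        Field state = unit.getClass().getDeclaredField("state");
        state.setAccessible(true);                 // permitted on the classpath; named modules may need --add-opens

        state.set(unit, "FAULT");                  // controllability: force a state
        System.out.println(state.get(unit));       // observability: read it back -> FAULT
    }
}
```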

Using testability
You have just tested your 1,000-line program and found 27 defects. How do you feel? Have you finished?

Mapping from scenario to design

Architecture Tradeoff Analysis Method (ATAM)
- “Test” the architecture.
- Use scenarios from the use cases.
- Test for architectural qualities such as
  - extensibility
  - maintainability

ADeS architecture simulation

Operational profiles

Variation representation

System tests
- Test sample applications
  - Combinations of choices at variation points
  - Full-scale integration tests
  - Limited to what can be built at the moment
  - Involve product builders early
- Test specific applications
  - Tests a specific product prior to deployment
  - Rerun some of the selected product's test cases
  - Feed results back to the core asset builders

Combinatorial testing
- Specific technique: pair-wise testing – OATS (a pair-wise coverage sketch follows this list)
- Multi-way test coverage – more than pair-wise, less than all possible combinations
- Minimum test sets
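A small sketch of what pair-wise testing buys (the variation point names are hypothetical): for three two-valued variation points, four test configurations cover every pair of choices, where all combinations would require eight.

```java
import java.util.HashSet;
import java.util.Set;

// Sketch: verify that four configurations give pair-wise coverage of
// three two-valued variation points (VP1, VP2, VP3).
public class PairwiseCheck {
    static final String[][] TESTS = {
            {"VP1.1", "VP2.1", "VP3.1"},
            {"VP1.1", "VP2.2", "VP3.2"},
            {"VP1.2", "VP2.1", "VP3.2"},
            {"VP1.2", "VP2.2", "VP3.1"}};

    public static void main(String[] args) {
        // Record every (factor i value, factor j value) pair that the tests exercise.
        Set<String> covered = new HashSet<>();
        for (String[] t : TESTS)
            for (int i = 0; i < t.length; i++)
                for (int j = i + 1; j < t.length; j++)
                    covered.add(i + ":" + t[i] + "|" + j + ":" + t[j]);

        // 3 factor pairs x 4 value combinations each = 12 pairs to cover.
        System.out.println("Pairs covered: " + covered.size() + " of 12");   // prints 12 of 12
    }
}
```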

Orthogonal Array Testing System (OATS)
- One factor for each variation point
- One level for each variant within a factor
- “All combinations” is usually impossible, but pair-wise usually is manageable.
- Constraints identify test cases that are invalid (a constraint-checking sketch follows the table below).

Factor  Level   Constraint
VP1     VP1.1
        VP1.2
        Both
VP2     None
        VP2.1
        VP2.2
VP3     VP3.1   Requires VP1.2
        VP3.2   Requires VP4
        Both
VP4     VP4.1
        VP4.2
VP5     VP5.1
        VP5.2
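The constraint column can be applied mechanically. The sketch below (using the VP names from the table above) flags candidate rows that violate “VP3.1 requires VP1.2”; flagged rows are dropped or repaired before the array becomes a set of test products.

```java
import java.util.List;
import java.util.Map;

// Sketch: filter candidate configurations against one constraint from the table:
// selecting VP3.1 is valid only when VP1.2 has also been selected.
public class ConstraintFilter {
    static boolean valid(Map<String, String> row) {
        if ("VP3.1".equals(row.get("VP3")) && !"VP1.2".equals(row.get("VP1"))) {
            return false;   // violates "VP3.1 requires VP1.2"
        }
        return true;
    }

    public static void main(String[] args) {
        List<Map<String, String>> rows = List.of(
                Map.of("VP1", "VP1.2", "VP3", "VP3.1"),    // satisfies the constraint
                Map.of("VP1", "VP1.1", "VP3", "VP3.1"));   // violates it

        rows.forEach(r -> System.out.println(r + " -> valid: " + valid(r)));
    }
}
```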

Example test matrix
- Use standard pre-defined arrays.
- This one is larger than needed, but that will work.
- Each of the factors has values 0, 1, 2.
- Defined to include all pair-wise combinations.
[Figure: a standard orthogonal array; each column is a factor and each cell value is a level.]

Mapping
[Figure: the factor/level/constraint table from the OATS slide is mapped onto the standard array – each variation point becomes a column and each variant becomes a level value.]

Mapped array

VP1    VP2    VP3    VP4    VP5
VP1.1  None   VP3.1  VP4.1  VP5.1  (c)
VP1.2  VP2.1  VP3.2  VP4.2  VP5.2
Both   VP2.2  Both   2      2      (x)
VP1.1  None   VP3.2  2      VP5.2  (x)
VP1.2  VP2.1  Both   VP4.1  2      (x)
Both   VP2.2  VP3.1  VP4.2  VP5.1
VP1.1  VP2.1  VP3.1  2      2      (x, c)
VP1.2  VP2.2  VP3.2  VP4.1  VP5.1
Both   None   Both   VP4.2  VP5.2
VP1.1  VP2.2  Both   VP4.1  VP5.2
VP1.2  None   VP3.1  VP4.2  2      (x)
Both   VP2.1  VP3.2  2      VP5.1  (x)
VP1.1  VP2.1  Both   VP4.2  VP5.1
VP1.2  VP2.2  VP3.1  2      VP5.2  (x)
Both   None   VP3.2  VP4.1  2      (x)
VP1.1  VP2.2  VP3.2  VP4.2  2      (x)
VP1.2  None   Both   2      VP5.1  (x)
Both   VP2.1  VP3.1  VP4.1  VP5.2

Legend: c = constraint; x = any choice for the constant will work.
Every row is a system under test: 17 test products vs 72 possible combinations.
Columns 4 and 5 have more levels than are needed. Columns 6 and 7 are not needed at all.
Where a “2” appears in a column, the tester may repeat a value (one of the variants).

Validation
Validation takes on the customer's perspective as the basis for examining the product. Validation goes back to the CONOPS. The system threads should be consistent with the CONOPS and are a rich source of test cases.

Validation
Validation continues on the client side, with the customer performing acceptance tests. These are defined as part of the contract. Planning for system validation should closely reflect the context of the acceptance tests.