1 Validating Test Design
Determining whether a planned test design is valid using the System Reference Model
IV&V Workshop, 16 September 2009
John Schipper, SRMV Product Line

2 Overview
Test Design Validation Goal and Criteria
Relevant Definitions
Test Design Validation, SRM and 3 Questions
– To understand the use of the SRM, which includes the 3 questions, to validate a test design
– To understand what is produced during validation activities
Test Design Validation and the IV&V WBS
– IVV SLP 09-1; 1.3 Validate Test Design
Proposed Guidelines
Challenges and Insights

3 Validate Test Design Goal
The goal is to determine whether the implemented system will be properly verified: using the “validated test design” as a tool, the software products can be confirmed to meet the specification and the operational need.

4 Validation Scope
We do not impose any initial limitations or assumptions on the scope of test design validation. The scope and depth of any particular IV&V task (not just test design validation) is driven by an overall vision and plan for the IV&V effort that is based on risk assessment. The risk assessment is performed via the IV&V Facility's Portfolio Based Risk Assessment (PBRA).

5 Iterative Approach
We tend to use the term “test case” as if all test cases belong to a single test venue. In reality, there are typically various levels of testing on a program, such as:
– Unit testing (of software, i.e., module testing)
– Software integration testing
– System-level testing in low-fidelity environments
– System-level testing in a high-fidelity environment (perhaps with real hardware)
– Field testing or flight testing of parts or all of a system
There are also distinctions between developmental testing and formal qualification testing, and other gradations as well, such as software testing on a single processor vs. on redundant multi-processor sets or in distributed processing configurations.
To meet the objective of timely analysis, an iterative approach is required. Each iteration may include some or all of the lower-level WBS elements necessary to validate the different levels of tests (e.g., system, integration, unit).

6 Valid Test Design Criteria
A valid test design meets each of the following criteria:
– The scope of the test design covers the behaviors identified in the SRM under nominal and adverse conditions. It includes the scenarios captured in the SRM activity and sequence diagrams. Future SRMs would also include all the assertion validation (JUnit) test scenarios in the SRM.
– The scope of the test design covers all of the validated requirements.
– The test cases will exercise the software sufficiently, and within the operational context, to verify that the system behaviors, requirements, and interfaces are properly implemented.
– The test oracle(s) contain the correct inputs and expected outputs for the software behaviors, requirements, and interfaces they are designed to test (a sketch of one such oracle follows).
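To make the oracle criterion concrete, here is a minimal table-driven test oracle sketch in Java (JUnit being the framework the SRM scenarios already use). The behavior, inputs, and expected outputs below are invented for illustration and do not come from any project artifact.

```java
import java.util.Map;

// Hypothetical table-driven test oracle for one SRM behavior: it pairs
// each test input with the output the SRM says the software must produce.
// All inputs and outputs are invented for this sketch.
public class ThrusterValveOracle {

    private static final Map<Integer, String> EXPECTED = Map.of(
        0,   "VALVE_CLOSED",      // nominal: no thrust commanded
        50,  "VALVE_HALF_OPEN",   // nominal: mid-range thrust
        100, "VALVE_FULL_OPEN"    // nominal: full thrust
    );

    // Returns the expected output for a given input, or throws if the
    // oracle has no entry; such a gap is what the validation effort flags.
    public static String expectedOutput(int thrustPercent) {
        String out = EXPECTED.get(thrustPercent);
        if (out == null) {
            throw new IllegalArgumentException(
                "No expected output defined for input " + thrustPercent);
        }
        return out;
    }
}
```

A validator would then ask whether each planned test consults an oracle of this kind, and whether the table's entries match the behavior captured in the SRM.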

7 Some Relevant Definitions
Capability – (IVV) (1) Ability to perform an action or set of actions that result in accomplishing a goal; (2) a function, an activity, or a process that the system should be able to perform; (3) a goal to be accomplished by a behavior or set of behaviors.
Limitation – (IVV) (1) Any condition that impedes a capability; (2) a constraint that keeps a capability from being able to perform its function; (3) a constraint or condition that impedes a behavior.
Test Design – Documentation specifying the details of the test approach for a software feature or combination of software features and identifying the associated tests. [IEEE ]
Test Case – (A) A set of test inputs, execution conditions, and expected results developed for a particular objective, such as to exercise a particular program path or to verify compliance with a specific requirement. (B) Documentation specifying inputs, predicted results, and a set of execution conditions for a test item. [IEEE ]
System Reference Model (SRM) – An independently developed understanding of the system of interest.
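The IEEE test-case definition above is essentially a small data structure. A purely illustrative sketch follows; the class and field names are ours, not the standard's.

```java
import java.util.List;

// Sketch of the IEEE test-case definition: test inputs, execution
// conditions, and expected results, developed for a particular objective.
// Class and field names are illustrative assumptions only.
public class TestCaseRecord {
    final String objective;              // e.g., requirement or SRM behavior under test
    final List<String> inputs;           // test inputs
    final List<String> conditions;       // execution conditions (environment, system state)
    final List<String> expectedResults;  // predicted results, the basis for pass/fail

    TestCaseRecord(String objective, List<String> inputs,
                   List<String> conditions, List<String> expectedResults) {
        this.objective = objective;
        this.inputs = inputs;
        this.conditions = conditions;
        this.expectedResults = expectedResults;
    }
}
```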

8 Test Design Validation, SRM and 3 Questions

9 SRM Represents Three Validation Questions
Stakeholders want to know:
– Will the system software do what it is supposed to do?
– Will the system software do what it is not supposed to do?
– Will the system software respond as expected under adverse conditions?
The SRM strives to capture:
– What the system software is supposed to do
– What the system software is not supposed to do
– What the system software's expected response is under adverse conditions

10 Why Use a Model?
Experience shows that a project's formal system, functional, software, and interface requirements are not always a complete expression of system behavior.
Development of a model:
– captures insight/knowledge of the system
– maps target requirements to model elements
– drives out issues, which are communicated to the project
The SRM provides an additional formal expression of system behavior in the context of the 3 questions, and captures attributes such as pre-conditions, extensions, and post-conditions to be used in analysis.
The SRM establishes scope (via the PBRA) for focusing the evaluation.
The SRM is a concise description of the IV&V team's understanding of the problem:
– Analysis tool
– Communication tool

11 SRM Artifacts
The SRM includes sets of modeling artifacts:
– Use cases
– Activity Diagrams, Sequence Diagrams
– State charts
– Domain Models (Class Diagrams, Communication Diagrams)
– State chart Assertions
– JUnit Test Cases (a sketch of one follows)
– Correlation mapping to validated requirements
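Since “JUnit Test Cases” may be unfamiliar as an SRM artifact, here is a sketch of what one assertion-validation scenario could look like. The executable model class is a hand-written stub standing in for whatever the modeling tool would generate; every name in it is invented for this sketch.

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertFalse;

// Hypothetical JUnit scenario validating an SRM behavior and one of its
// assertions. SeparationSequence is a stub invented for illustration.
public class SeparationScenarioTest {

    // Minimal executable stand-in for a modeled behavior.
    static class SeparationSequence {
        private String state = "MATED";
        private boolean pyrosArmed = false;

        void armPyros() { pyrosArmed = true; }   // precondition step
        void commandSeparation() {               // main success scenario step
            if (pyrosArmed) {
                state = "SEPARATED";
                pyrosArmed = false;              // assertion: disarm after separation
            }
        }
        String currentState() { return state; }
        boolean pyrosArmed() { return pyrosArmed; }
    }

    @Test
    public void nominalSeparationReachesSafeState() {
        SeparationSequence model = new SeparationSequence();
        model.armPyros();
        model.commandSeparation();

        // Post-conditions the SRM assertion demands.
        assertEquals("SEPARATED", model.currentState());
        assertFalse(model.pyrosArmed());
    }
}
```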

12 SRM, Question 1 and Test Validation
Q1: Is there evidence that the system software will do what it is supposed to do?
a. Is the scope of the test plans adequate to test the validated requirements?
b. Will the planned testing adequately verify the behaviors identified in the SRM under nominal and adverse conditions?
c. Will the test cases exercise the software sufficiently and within the operational context to verify that the system behaviors, requirements, and interfaces are properly implemented?
d. Will the test oracles contain the correct inputs and expected outputs for the software behaviors, requirements, and interfaces they are designed to test?

13 Example Behavior
[Diagram: use-case template with Goal, Preconditions, Main Success Scenario (Q1 & Q2), Extensions (Q2 & Q3), Constraints, Post-conditions, and References, annotated with callouts:]
– What the system is supposed to do: complex behavior (activity diagram) is planned to be verified (tested).
– What the system is supposed to do: planned testing is adequate to verify the behavior.
– What the system is supposed to do: test cases exercise the software sufficiently and within the operational context.
Note: The slide represents one diagram, but multiple parts of the model may apply depending on the planned objective.

14 SRM, Question 2 and Test Validation
Q2: Is there evidence that the system software will not do what it is not supposed to do?
a. Will the program verify all requirements that address undesired behaviors, including the preventative/protective behaviors necessary to handle those undesired behaviors? (A sketch of such a test follows.)
b. Will the program confirm that undesired behaviors do not happen even though they may not be addressed by the requirements?
c. Was any independent testing performed to further understand what was or was not tested by the development Project?
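In test code, Question 2 typically becomes a negative assertion: show that the undesired behavior does not occur. A toy sketch follows; the interlock and all names are invented for illustration.

```java
import org.junit.Test;
import static org.junit.Assert.assertFalse;

// Hypothetical Q2-style test: verify an undesired behavior does NOT occur.
// The controller below is a stub invented for this sketch.
public class HatchInterlockTest {

    static class HatchController {
        private boolean cabinPressurized = true;
        private boolean hatchOpen = false;

        void requestHatchOpen() {
            // Protective behavior: refuse to open while pressurized.
            if (!cabinPressurized) {
                hatchOpen = true;
            }
        }
        boolean isHatchOpen() { return hatchOpen; }
    }

    @Test
    public void hatchMustNotOpenWhilePressurized() {
        HatchController ctrl = new HatchController();
        ctrl.requestHatchOpen();          // attempt the undesired action
        assertFalse(ctrl.isHatchOpen());  // undesired behavior is suppressed
    }
}
```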

15 Example Behavior
[Diagram: same use-case template, annotated with a callout:]
– System software will not do what it is not supposed to do: the test case verifies the preventative/protective behaviors necessary to handle undesired behaviors.

16 SRM, Question 3 and Test Validation
Q3: Is there evidence that the system software will respond as expected to adverse conditions?
a. Will the program verify all requirements that address adverse conditions and their associated responsive behaviors?
b. Does the program include scenarios that test adverse conditions that may not be covered by the requirements?
c. Was any independent testing performed to address adverse conditions?

17 Example Behavior
[Diagram: same use-case template, annotated with callouts:]
– Adverse conditions: test cases verify all requirements that address adverse conditions and their associated responsive behaviors.
– Adverse conditions: scenarios test adverse conditions that may not be covered by the requirements.

18 Some SRM Challenges
Extracting test information from activity diagrams and other UML diagrams is a formidable task. Sorry, there's no cookbook – it still requires engineering judgment.

19 Test Design Validation and the IV&V WBS
WBS 1.3 Validate Test Design
– The target of analysis is test plans and test cases at varying levels (build/software, system, integration).
– Appropriate levels should be consistent with the levels of testing for the development Project.
– Appropriate levels should be consistent with the requirements levels that were validated.
– WBS Validate Test Scope
  Ensure that the scope of the test design covers all of the validated requirements.
  Ensure that the scope of the test design covers the behaviors in the SRM under nominal and adverse conditions.
– WBS Validate Test Cases
  The test cases will exercise the software sufficiently and within the operational context to verify that the system behaviors, requirements, and interfaces are properly implemented.
– WBS Validate Test Oracle
  The test oracle(s) contain the correct inputs and expected outputs for the software behaviors, requirements, and interfaces they are designed to test.

20 Proposed Guidelines
Formatted as use-case guidelines:
– Stakeholders: (PL/LE) primary stakeholder and recipient of validation issues and the test design validation report; (SRMV Product Line) performs the validation; (Development Project) recipient of validation issues and the test design validation report.
– Main Success Steps: provide high-level guidance – the what, not the how.
– Extensions: basic variances currently recognized by the guidance. Note that tailoring may be required.
Location of guidelines:
– Upon completion of the review process with the SRMV Product Line, the guidelines and other material will be placed on the IV&V Knowledge Management Site (kms.ivv.nasa.gov).
The following slides are excerpts from the proposed guidelines.

21 Validate Test Scope
Evaluate the test design against both the SRM and the validated requirements. A valid test design scope meets each of the following criteria:
– The scope of the test design covers the behaviors identified in the SRM under nominal and adverse conditions.
– The scope of the test design covers all of the validated requirements.

22 Validate Test Scope Main Success Steps
1. Identify the planned verification within the provided artifacts, utilizing the behaviors and validated requirements from the SRM at the requested level. Based on these identified sources, assess the quality and content of the artifacts as needed for validating the scope of the verification.
2. For each validated requirement, determine from the artifacts what verification method(s) is intended and validate the planned method. Utilizing insight into the meaning and operational context of the requirement, as may be provided by the SRM, validate that the method is appropriate and correct. (A toy sketch of this bookkeeping follows.) WBS: Covers all of the validated requirements.
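Step 2 amounts to a coverage pass over the correlation map: every validated requirement should resolve to a planned verification method, and gaps feed extension 2b on a later slide. A toy sketch follows; the requirement IDs and methods are invented.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative scope check: flag validated requirements that have no
// planned verification method. All identifiers are invented.
public class TestScopeCheck {

    public static void main(String[] args) {
        List<String> validatedRequirements =
            List.of("REQ-001", "REQ-002", "REQ-003");

        // Planned method per requirement, as mined from project artifacts.
        Map<String, String> plannedMethod = new HashMap<>();
        plannedMethod.put("REQ-001", "Test");
        plannedMethod.put("REQ-002", "Analysis");
        // REQ-003 intentionally missing: this is extension 2b.

        for (String req : validatedRequirements) {
            String method = plannedMethod.get(req);
            if (method == null) {
                System.out.println(req + ": no verification method found (raise issue/risk)");
            } else {
                System.out.println(req + ": planned method = " + method);
            }
        }
    }
}
```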

23 Validate Test Scope Main Success Steps (cont.)
3. Validate that the development project's targeted requirements verification provides the needed coverage of the representative SRM behaviors. For each requirement traced to such a behavior, determine that the verification has appropriate conditions to fully verify the requirement as understood in the context of the SRM. Where test is the method, determine whether suitable scenarios are planned (nominal and off-nominal) and an appropriate test environment will be used. WBS: Covers the behaviors identified in the SRM under nominal and adverse conditions.
4. Validate that the development project's targeted verification provides the needed coverage of the representative SRM behaviors. For each validated SRM behavior not traced to validated requirements per step 3, determine that the verification has appropriate test(s) and conditions as understood from the context of the SRM. Validate that suitable scenarios are planned (nominal and off-nominal) and an appropriate test environment will be used. WBS: Covers the behaviors identified in the SRM under nominal and adverse conditions.

24 Validate Test Scope Extensions
1a. Provided material is inadequate for validating the scope for certain behaviors and/or validated requirements. Generate risks or deviations to the validation effort, or obtain the needed material from the project.
2a. An inappropriate or incorrect verification method is identified. This may indicate that the scope of the test design will not adequately show that the specification and the operational need are met using the test design as a tool. Generate issue(s) as appropriate.
2b. No verification method is indicated, hindering analysis. Generate risks against impacts to analysis and/or issues.

25 Validate Test Scope Extensions (cont.)
3a. The developer's planned verification does not adequately cover a requirement in all contexts in which it needs to be verified, or the model context is determined to be incorrect or incomplete. Inadequate coverage may indicate that the scope of the test design will not adequately show that the specification and operational needs are met. Based on analysis of the differences, either generate development project issues or reconcile the SRM. WBS: Covers the behaviors identified in the SRM under nominal and adverse conditions.
4a. The developer's planned verification does not adequately cover a validated behavior in all contexts in which it needs to be verified, or the model context is determined to be incorrect or incomplete. Inadequate coverage may indicate that the scope of the test design will not adequately show that the behavior and operational needs are met. Based on analysis of the differences, either generate development project issues or reconcile the SRM. WBS: Covers the behaviors identified in the SRM under nominal and adverse conditions.

26 Validate Test Scope Outcome
The assessment and validation effort on the planned scope of test (verification), based on the targeted SRM behaviors, has been completed. The analysis, risks, and project issues generated are ready for review by the stakeholders.

27 Validate Test Case/Scenario
Evaluate the test design against both the SRM and the validated requirements. The goal is to verify and validate that the software products meet the specification and the operational needs using the validated test design as a tool. A valid test design [as it pertains to test cases] meets the following criterion:
– The test cases will exercise the software sufficiently and within the operational context to verify that the system behaviors, requirements, and interfaces are properly implemented.

28 Validate Test Case Main Success Steps
1. The Validation Team uses traceability materials from Development Project artifacts (e.g., test/verification matrices from requirements specs) to establish an initial mapping from requirements to test cases.
2. Using the above mapping, plus IV&V's mapping between the developer requirements and the SRM elements produced by requirements validation, the Validation Team evaluates the validation target to assure that it effectively covers The Three Questions and meets the following quality criteria (a toy sketch of the mapping follows):
a. Test scenarios: The test cases will exercise the software within the operational context sufficiently to verify that the system behaviors, requirements, and interfaces are properly implemented.
b. Test fidelity: The test facility provides an environment sufficiently detailed and complete to exercise all behaviors of interest in a manner that accurately simulates their operational environment.
c. Test criteria: The test cases contain explicit pass/fail criteria sufficient to assure assessment of the function and/or performance of all behaviors of interest.
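Steps 1 and 2 are essentially a join of two traces: the project's requirement-to-test-case matrix and IV&V's requirement-to-SRM-element map from requirements validation. A toy sketch of that join follows, with invented identifiers, showing how behaviors without planned tests fall out (feeding the TIMs of step 5).

```java
import java.util.List;
import java.util.Map;

// Toy join of two traces: requirement -> test cases (from the project)
// and requirement -> SRM behaviors (from requirements validation).
// All identifiers are invented for illustration.
public class TraceJoin {

    public static void main(String[] args) {
        Map<String, List<String>> reqToTests = Map.of(
            "REQ-001", List.of("TC-10", "TC-11"),
            "REQ-002", List.of()   // expected coverage missing: TIM candidate (step 5b)
        );
        Map<String, List<String>> reqToBehaviors = Map.of(
            "REQ-001", List.of("SRM-Beh-A"),
            "REQ-002", List.of("SRM-Beh-B")
        );

        // For each requirement, report which SRM behaviors its tests touch.
        for (String req : reqToBehaviors.keySet()) {
            List<String> tests = reqToTests.getOrDefault(req, List.of());
            for (String beh : reqToBehaviors.get(req)) {
                if (tests.isEmpty()) {
                    System.out.println(beh + " reached via " + req + " has NO planned test");
                } else {
                    System.out.println(beh + " exercised by " + tests + " via " + req);
                }
            }
        }
    }
}
```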

29 Validate Test Case Main Success Steps (cont.)
3. From familiarity with the test venue gained through the previous step, and knowledge of the SRM, the Validation Team identifies any additional requirements/behaviors beyond those found in the Development Project traceability materials that could reasonably be expected to be covered by the validation target.
4. For the additional requirements/behaviors identified in Step 3, the Validation Team revisits the validation target, considering The Three Questions and applying quality criteria a through c for any additional test coverage found.
5. The Validation Team submits TIMs for:
a. Deficiencies identified with coverage of tested requirements/behaviors (criteria a through c)
b. Requirements for which test coverage was expected but not found
c. Requirements for which test coverage was found but which were not reflected in Development Project traceability materials
6. The Validation Team produces the Test Case Validation Report for this iteration.

30 Validate Test Case Extensions
2a. The Validation Team finds that the validation target lacks the information necessary to assess one or more of the quality criteria (e.g., testbed capabilities insufficiently defined, test cases lack pass/fail criteria).
2a.1. The Validation Team submits a TIM on the general deficiency of the artifact.
2a.2. The Validation Team continues, assessing those remaining criteria that are supported.

31 Validate Test Case Outcome
The assessment of targeted test cases based on SRM behavior has been completed for the portion of the SRM containing the behaviors of interest. The analysis, risks, and project issues generated are ready for review by the stakeholders.
Challenge: Test/verification is not a singular activity but occurs in stages. Can you draw conclusions as reflected in the Outcome, or is there a need for all relevant stages to have been evaluated?

32 Test Design Report Template
– The template is simple. There is no need to rehash the process; citations, references, and links (some already provided) are preferable.
– Allows summarization of work/content.
– Reiterate targeted behaviors and define the material assessed.
– Key discussions to report:
  The Facility's three questions
  Variations/extensions to published guidelines and/or processes
  Lessons Learned, Best Practices
– Results:
  Issues/Observations/Risks
  Conclusions/Goodness

33 Test Design Challenges
– Test Design Validation depends on an adequate SRM that answers the three questions at the appropriate level for validation. Will SRM elaboration end with requirements validation or continue on into test design validation?
– The second validation question is difficult from the developer's standpoint. It is impossible to fully verify the null cases, and most projects don't do much of that. The future inclusion of formalized assertions as part of the SRM should help IV&V get a better understanding of test design and Q2 through the use of validation scenarios.
– Verification will unfold in stages. Ultimate conclusions cannot be drawn until all stages have been factored in, yet the project needs timely input.

34 Thoughts on Challenges
– Test Design Validation, using the SRM, depends on an adequate SRM and correlation. How do you know when your model is adequate to perform the analysis without “peeking,” or doing the forbidden – “modeling to the target”? Iterate!
– This is how we avoid developing a product that is too high-level or is overkill (and therefore too expensive and takes too long) for the task. Model a little, validate a little.
– The SRM is a tool. The validated requirements are part of your toolset.

35 Thoughts (cont.)
How can I capture all behavior or domain knowledge in the model?
– You can't.
My correlation map (a required input for test design validation) is a runaway-train nightmare – what do I do?
– Use the model for additional context for non-requirements-based scenarios (robustness, stress, and boundary tests).
– Use the validated requirements for tracing and goodness/quality(?) checks on test design artifacts. They are in the same “language” as the test artifacts.

36 Take Away
– The SRM is another tool for analysis: scoping, context.
– Don't forget the 3 Qs! Validated requirements count too!
– Sorry, there's no cookbook – it still requires engineering judgment.
– Iterate! Otherwise you will be sad.

37 Q&A