Presentation transcript:

Advanced Technology Center Slide 1 Requirements-Based Testing Dr. Mats P. E. Heimdahl University of Minnesota Software Engineering Center Dr. Steven P. Miller Dr. Michael W. Whalen Advanced Computing Systems Rockwell Collins 400 Collins Road NE, MS Cedar Rapids, Iowa

Advanced Technology Center Slide 2 Outline of Presentation Motivation Validation Testing Conformance Testing What’s Next

Advanced Technology Center Slide 3 How We Develop Software SW High-Level Reqs. Development SW Design Description Dev. (SW Low-Level Reqs. & SW Arch.) SW Source Code Dev. SW Integration (Executable Code Production) SW Low-Level Testing SW Integration Testing HW/SW Integration Testing

Advanced Technology Center Slide 4 How we Will Develop Software (From V to a Y) SW High-Level Reqs. Development Software Model SW Integration (Executable Code Production) SW Integration Testing HW/SW Integration Testing Can we trust the code generator? How do we know our model is correct? Validation Testing Formal Verification Conformance Testing

Advanced Technology Center Slide 5 Outline of Presentation Motivation Validation Testing Conformance Testing What’s Next

Advanced Technology Center Slide 6 How we Will Develop Software (From V to a Y) SW High-Level Reqs. Development Software Model SW Integration (Executable Code Production) SW Integration Testing HW/SW Integration Testing How do we know our model is correct?

Advanced Technology Center Slide 7 Modeling Process SW High-Level Reqs. Development Software Model SW Integration (Executable Code Production) Desired Model Properties High-Level Requirements Low-Level Requirements

Advanced Technology Center Slide 8 Problem—Modeling Frenzy SW High-Level Reqs. Development Software Model SW Integration (Executable Code Production) Desired Model Properties Headfirst into modeling How do we know the model is “right”? How do we test the model?

Advanced Technology Center Slide 9 One Solution: Redefine Requirements System Reqs. Development Software Model SW Integration (Executable Code Production) SW Integration Testing HW/SW Integration Testing Software Development Processes (DO-178B) System Development Processes (ARP 4754) The model is the requirements Use Engineering Judgment when Testing

Advanced Technology Center Slide 10 One Solution: Redefine Requirements System Reqs. Development Software Model SW Integration (Executable Code Production) SW Integration Testing HW/SW Integration Testing Software Development Processes (DO-178B) System Development Processes (ARP 4754) The model is the requirements Use Engineering Judgment when Testing My Comment

Advanced Technology Center Slide 11 Testing Does not go Away System Reqs. Development Software Model SW Integration (Executable Code Production) SW Integration Testing HW/SW Integration Testing Extensive Testing (MC/DC)

Advanced Technology Center Slide 12 It Simply Moves System Reqs. Development Software Model SW Integration (Executable Code Production) SW Integration Testing HW/SW Integration Testing Extensive Testing (MC/DC)

Advanced Technology Center Slide 13 Do it Right! SW High-Level Reqs. Development Software Model SW Integration (Executable Code Production) Desired Model Properties Analysis (Model Checking, Theorem Proving) Specification Test – Is the Model Right?

Advanced Technology Center Slide 14 How Much to Test? State Coverage? Masking MC/DC? Transition Coverage? Decision Coverage? Def-Use Coverage? Something New?? MC/DC? Where Do the Tests Come From?

Advanced Technology Center Slide 15 Properties are Requirements… Requirements Based Testing SW High-Level Reqs. Development Software Model SW Integration (Executable Code Production) Desired Model Properties Cover the Properties!

Advanced Technology Center Slide 16 Properties are Requirements

Advanced Technology Center Slide 17 Requirements Based Testing Advantages  Objective Measurement of Model Validation Efforts – Requirements Coverage in Model-based Development – Help Identify Missing Requirements – Measure Coverage of the Model  Basis for Automated Generation of Requirements-based Tests – Even If Properties Are Not Used for Verification, They Can Be Used for Test Automation How Are Properties “Covered” with Requirements-based Tests?

Advanced Technology Center Slide 18 Property Coverage “If the onside FD cues are off, the onside FD cues shall be displayed when the AP is engaged” – G((!Onside_FD_On & !Is_AP_Engaged) -> X(Is_AP_Engaged -> Onside_FD_On))  Property Automata Coverage – Cover a Synchronous Observer Representing the Requirement (Property)  Structural Property Coverage – Demonstrate Structurally “Interesting” Ways in Which the Requirement (Property) Is Met
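To make the synchronous-observer idea concrete, here is a minimal sketch (not the authors' tooling) of an observer for the FD/AP requirement above, evaluated over a finite test trace. It assumes a trace is a list of steps, each a dict of the boolean signals Onside_FD_On and Is_AP_Engaged taken from the property; the helper names are illustrative.

# Minimal synchronous-observer sketch for the requirement
# "if the onside FD cues are off, the onside FD cues shall be
#  displayed when the AP is engaged".
# A trace is a list of steps; each step maps signal names to booleans.

def observer_ok(trace):
    """Return True if the trace satisfies
    G((!Onside_FD_On & !Is_AP_Engaged) -> X(Is_AP_Engaged -> Onside_FD_On))."""
    for i in range(len(trace) - 1):
        now, nxt = trace[i], trace[i + 1]
        antecedent = (not now["Onside_FD_On"]) and (not now["Is_AP_Engaged"])
        consequent = (not nxt["Is_AP_Engaged"]) or nxt["Onside_FD_On"]
        if antecedent and not consequent:
            return False          # requirement violated between steps i and i+1
    return True                   # no step violates the requirement

# Example test: the AP engages while the FD cues are off, and the cues come on.
test_trace = [
    {"Onside_FD_On": False, "Is_AP_Engaged": False},
    {"Onside_FD_On": True,  "Is_AP_Engaged": True},
]
assert observer_ok(test_trace)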

Advanced Technology Center Slide 19 Property Automata Coverage  Cover Accepting State Machine As Opposed to Structure of Property  Büchi Coverage – State Coverage, Transition Coverage, Lasso Coverage…
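As an illustration of what such coverage might be measured against, the sketch below computes state and transition coverage of a small deterministic property automaton as it reads symbolic test traces. The two-state automaton and its input symbols are hypothetical; a real LTL-to-automaton synthesis tool may produce a different machine, which (as the next slide notes) changes the tests needed for coverage.

# Generic sketch: given a deterministic property automaton (a transition
# table) and a set of test traces (sequences of input symbols), measure
# which states and transitions the traces exercise.

def automaton_coverage(delta, start, traces):
    """delta: dict mapping (state, symbol) -> next_state."""
    visited_states, taken_transitions = {start}, set()
    for trace in traces:
        state = start
        for symbol in trace:
            nxt = delta[(state, symbol)]
            taken_transitions.add((state, symbol, nxt))
            visited_states.add(nxt)
            state = nxt
    all_states = {s for (s, _) in delta} | set(delta.values())
    return (len(visited_states) / len(all_states),
            len(taken_transitions) / len(delta))

# Hypothetical two-state observer automaton over symbols 'a' (antecedent
# of the requirement holds) and 'n' (it does not):
delta = {("ok", "a"): "pending", ("ok", "n"): "ok",
         ("pending", "a"): "pending", ("pending", "n"): "ok"}
state_cov, trans_cov = automaton_coverage(delta, "ok", [["a", "n"], ["n"]])
print(f"state coverage {state_cov:.0%}, transition coverage {trans_cov:.0%}")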

Advanced Technology Center Slide 20 Alternative Machine  Different synthesis algorithms give different automata – Will affect the test cases required for coverage

Advanced Technology Center Slide 21 Structural Property Coverage  Define Structural Coverage Criteria for the Property Specification – Traditional Condition-based Criteria Such as MC/DC Are Prime Candidates  Property Coverage Is Different than Code Coverage – Coverage of Code and Models: Evaluate a decision with a specific combination of truth values in the decision – Coverage of Properties: Run an execution scenario that illustrates a specific way a requirement (temporal property) is satisfied
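For concreteness, a brute-force sketch of computing MC/DC independence pairs for a small decision is shown below. It is a sketch only, assuming the decision has few enough conditions to enumerate its truth table; the condition names follow the FD/AP property and are illustrative.

# For each condition of a boolean decision, find a pair of input vectors that
# differ only in that condition and flip the decision's outcome (an MC/DC
# independence pair), by enumerating the truth table.
from itertools import product

def mcdc_pairs(decision, n_conditions):
    pairs = {}
    vectors = list(product([False, True], repeat=n_conditions))
    for i in range(n_conditions):
        for v in vectors:
            w = list(v); w[i] = not w[i]; w = tuple(w)
            if decision(*v) != decision(*w):
                pairs[i] = (v, w)   # condition i independently affects the outcome
                break
    return pairs

# Decision structure of the property, with conditions
# (Onside_FD_On, Is_AP_Engaged, Is_AP_Engaged', Onside_FD_On'):
decision = lambda fd, ap, ap_next, fd_next: (not (not fd and not ap)) or ((not ap_next) or fd_next)
print(mcdc_pairs(decision, 4))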

Advanced Technology Center Slide 22 Example – G((!Onside_FD_On & !Is_AP_Engaged) -> X(Is_AP_Engaged -> Onside_FD_On))  Demonstrate That Somewhere Along Some Execution Trace Each MC/DC Case Is Met – Only the “positive” MC/DC cases; the negative cases should have no traces  In the Case of G(p)—Globally p Holds—We Need to Find a Test Where – in the prefix the requirement p is met – we reach a state of the trace where the requirement p holds because of the specific MC/DC case of interest – let us call this case a – then the requirement p keeps on holding through the remainder of the trace  p U (a U X(G p)) [Trace sketch: p p a p p p …]
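Typesetting the obligation already given on the slide in standard LTL notation (U is the until operator, X next, G globally):

% Test obligation for one positive MC/DC case a of the property G(p):
% the trace satisfies p, reaches a step where p holds specifically because
% case a holds, and p continues to hold for the remainder of the trace.
\[
  p \;\mathcal{U}\; \bigl( a \;\mathcal{U}\; X(G\, p) \bigr)
\]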

Advanced Technology Center Slide 23 Summary  Objective Measurement of Model Validation Efforts – Requirements Coverage in Model-based Development – Help Identify Missing Requirements  Basis for Automated Generation of Requirements-based Tests – Even If Properties Are Not Used for Verification, They Can Be Used for Test Automation and Test Measurement  Challenges – How Are Properties Specified? Combination of Observers and Temporal Properties – What Coverage Criteria Are Suitable? – How Is Automation Achieved? – How Do We Eliminate “Obviously” Bad Tests? Should We? – How Do We Generate “Realistic” Test-cases? – Rigorous Empirical Studies Badly Needed

Advanced Technology Center Slide 24 Outline of Presentation Motivation Validation Testing Conformance Testing What’s Next

Advanced Technology Center Slide 25 How we Will Develop Software (From V to a Y) SW High-Level Reqs. Development Software Model SW Integration (Executable Code Production) SW Integration Testing HW/SW Integration Testing Can we trust the code generator?

Advanced Technology Center Slide 26 “Correct” Code Generation—How?  Provably Correct Compilers – Very Hard (and Often Not Convincing)  Proof-Carrying Code  Generate Test Suites From the Model – Compare Model Behavior With the Generated Code – Unit Testing Is Not Eliminated, but Largely Automated [Figure labels: Specification/Model, Generate, Specification-Based Tests, Implementation, Output]
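A minimal sketch of this compare-model-with-generated-code loop (back-to-back conformance testing) follows. The functions simulate_model and run_generated_code are hypothetical stand-ins for the real model simulator and the compiled implementation, and the test suite is just named input sequences.

# Run the same specification-based test vectors through the model simulation
# and the generated code, and flag any step where the outputs disagree.

def conformance_check(test_suite, simulate_model, run_generated_code):
    mismatches = []
    for test_id, inputs in test_suite.items():
        expected = simulate_model(inputs)        # oracle: the model itself
        actual = run_generated_code(inputs)      # system under test
        for step, (e, a) in enumerate(zip(expected, actual)):
            if e != a:
                mismatches.append((test_id, step, e, a))
    return mismatches

# Example with trivial stand-ins for the two executables:
suite = {"test_1": [0, 1, 1, 0]}
report = conformance_check(suite,
                           lambda xs: [x * 2 for x in xs],
                           lambda xs: [x * 2 for x in xs])
print("conformance violations:", report)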

Advanced Technology Center Slide 27 Existing Capabilities  Several Commercial and Research Tools for Test-Case Generation – T-VEC: Theorem Proving and Constraint Solving Techniques – Reactis from Reactive Systems Inc.: Random, Heuristic, and Guided Search – University of Minnesota: Bounded Model Checking – NASA Langley: Bounded Model Checking/Decision Procedures/Constraint Solving  Tools Applicable to Relevant Notations – In Our Case Simulink

Advanced Technology Center Slide 28 An Initial Experiment  Used a Model of the Mode Logic of a Flight Guidance System As a Case Example  Fault Seeding – Representative Faults – Generated 100 Faulty Specifications  Generate Test Suites – Selection of Common (and Not So Common) Criteria  Fault Detection – Ran the Test Suites Against the Faulty Specifications – Recorded the Total Number of Faults Detected
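The fault-detection measurement described on this slide can be summarized in a short sketch. The interfaces are hypothetical (this is not the authors' actual harness): faulty_models stands for the seeded specifications and run for executing one test case against one model.

# For each coverage criterion's test suite, count how many seeded-fault models
# the suite distinguishes from the original model.

def faults_detected(test_suite, original_model, faulty_models, run):
    detected = 0
    for faulty in faulty_models:
        # A fault is detected if some test case produces a different output
        # on the faulty model than on the original model.
        if any(run(original_model, t) != run(faulty, t) for t in test_suite):
            detected += 1
    return detected

# Trivial stand-ins: "models" are functions from a test input to an output.
original = lambda t: t % 3
seeded = [lambda t: t % 3, lambda t: t % 2]          # only the second is faulty
suite = [0, 1, 2, 3]
run = lambda model, test: model(test)
print(faults_detected(suite, original, seeded, run))  # prints 1 (of 2 seeded faults)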

Advanced Technology Center Slide 29 Fault Finding Results Same Effort

Advanced Technology Center Slide 30 Model “Cheats” Test Generator FCS Architecture

Advanced Technology Center Slide 31 Effect of Test Set Size

Advanced Technology Center Slide 32 Summary  Automated Generation of Conformance Tests – Current Technology Largely Allows This Automation  Challenges – Development of Suitable Coverage Criteria – Effect of Test Set Size on Test Set Effectiveness – Effect of Model Structure on Coverage Criteria Effectiveness – Traceability of Tests to Constructs Tested – Empirical Studies of Great Importance

Advanced Technology Center Slide 33 Outline of Presentation Motivation Conformance Testing Validation Testing What’s Next

Advanced Technology Center Slide 34 New Challenges for Testing  Model Validation – Requirements-based Testing – How Do We Best Formalize the Requirements? – What Coverage Criteria Are Feasible? – Which Coverage Criteria Are Effective (If Any)? – How Do We Generate “Realistic” Tests? – Will This Be a Practical (Tractable) Solution?  Conformance Testing – What Coverage Criteria Are Effective? Detecting Faults From Manual Coding Detecting Faults From Code Generation – Relationship Between Model Structure and Criteria Effectiveness – Traceability From Tests to Model – Relationship Between Model Coverage and Code Coverage Optimizations in Code Generator Will Compromise Coverage

Advanced Technology Center Slide 35 Discussion

Advanced Technology Center Slide 36 Perfection is Not Necessary  Tools and Models Only Need To Be Better Than Manual Processes… (I Think Many Already Are) – How Do We Demonstrate This? Empirical Studies Are of Great Importance [Figure: missed faults with manual processes ≥ missed faults with tools]

Advanced Technology Center Slide 37 DO-178B Test Objectives 1. The executable code complies with the high-level requirements. 2. The executable code complies with the specification (low-level requirements). 3. Test coverage of high-level requirements is achieved. 4. Test coverage of the specification (low-level requirements) is achieved. 5. Test coverage of the executable code is achieved. [Slide labels grouping the objectives: Requirements-Based Testing; Conformance Testing]