
1 New Development Techniques: New Challenges for Verification and Validation. Mats Heimdahl, Critical Systems Research Group, Department of Computer Science and Engineering, University of Minnesota, 4-192 EE/CS, 200 Union Street SE, Minneapolis, MN 55455

2 Domain of Concern (http://www.cs.umn.edu/crisys)

3 How We Develop Software: Concept Formation → Requirements → Specification → Design → Implementation → Integration → System, with Unit Test, Integration Test, System Test, Object Code Test, and Analysis along the way.

4 Validation and Verification: Concept Formation → Requirements → Specification → Design → Implementation → Integration → System. Validation: are we building the right thing? Verification: are we building the thing right?

5 Model-Based Development: a central Specification Model supports Visualization, Prototyping, Testing, Code generation, and Analysis against Properties.

6 Model-Based Development Tools. Commercial products: Esterel Studio and SCADE Studio from Esterel Technologies; SpecTRM from Safeware Engineering; Rhapsody from I-Logix; Simulink and Stateflow from The MathWorks; Rose Real-Time from Rational; and many others.

7 Research Tools (many): RSML-e and Nimbus. RSML-e formal models (~20 running concurrently) interact with simulations of the environment.

8 How We Will Develop Software: Concept Formation → Requirements → System Specification/Model → Implementation → Integration, with Properties, Analysis, Specification Test, Integration Test, and System Test.

9 FGS/FMS Mode Logic in RSML-e and Nimbus: RSML-e formal models (~20 running concurrently) with simulations of the environment.

10 Sample RSML-e Specification

11 Capture Requirements as Shalls

12 Translated All the Shalls into SMV Properties

13 Early Validation of Requirements Using Model Checking (NuSMV): proved over 300 properties in less than an hour; found several errors in our models using model checking; substantially revised the shalls to correct the errors.
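As a rough illustration of what a model checker does with such properties, here is a minimal explicit-state sketch: it exhaustively explores the reachable states of a tiny mode-logic machine and checks an invariant in every state. The states, events, and the "shall" are invented for the sketch; this is not the FGS model or NuSMV.

```python
# Minimal explicit-state model checker: breadth-first exploration of
# all reachable states, checking an invariant (a "shall" rephrased as
# a property) in each one. The mode logic is a toy stand-in.
from collections import deque

# States are (lateral_mode, autopilot_engaged) pairs.
def successors(state):
    mode, ap = state
    nxt = []
    if not ap:
        nxt.append((mode, True))         # engage the autopilot
    else:
        nxt.append(("ROLL", False))      # disengaging reverts to ROLL
        for m in ("HDG", "NAV", "ROLL"):
            nxt.append((m, True))        # select a lateral mode
    return nxt

def check_invariant(initial, invariant):
    """Return a counterexample state, or None if the invariant holds."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        s = frontier.popleft()
        if not invariant(s):
            return s
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return None

# Shall: "A lateral mode other than ROLL shall be active only while
# the autopilot is engaged."
shall = lambda s: s[0] == "ROLL" or s[1]
print(check_invariant(("ROLL", False), shall))  # None: property holds
```

A real checker adds temporal operators and symbolic state representations, but the reachability core is the same idea.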

14 Early Validation of Requirements Using Theorem Proving (PVS): proved several hundred properties using PVS; more time-consuming than model checking; use it when model checking won't work.

15 Model-Based Development Examples

16 A Simplified Development Model: Requirements and Specification → Code → Unit Test → System Test, plotted against time.

17 Ongoing Research. Specification-model notations: RSML-e, SCR, SpecTRM, Statecharts, Esterel, SCADE, Simulink, UML, etc. Analysis: CMU, SRI, Stanford, UC Berkeley, VERIMAG, NASA, etc. Testing: Minnesota, Pennsylvania, George Mason, NRL, NASA Ames, etc. Code: proof-carrying code, provably correct compilers, testing for correctness.

18 Problems... Are the languages usable, in both syntax and semantics? Can the tools play nicely together? Can we trust the execution environment? Can we trust the analysis results? Have we tested enough? Can we really trust the generated code?

19 Benefits of Modeling: time savings, fewer "bugs".

20 Code Generation: time savings, fewer "bugs", coding effort greatly reduced.

21 Qualified Code Generation (theory): time savings; unit testing is eliminated for the generated code, with that testing effort moved to the specification level.

22 Code Generation Concerns: Concept Formation → Requirements → System Specification/Model → Implementation → Integration, with Properties. Can we trust the code generator? Is our model "right"? Can we trust the execution environment? Can we trust our analysis tools? Can we trust our properties?

23 "Correct" Code Generation. Provably correct compilers: very hard, and often not convincing. Proof-carrying code: total correctness required. Base all specification testing on the generated code: lose the benefits of working at the specification level. Generate test suites from the specification: compare specification behavior with the generated code to better trust your specification testing; unit testing is then not eliminated, but completely automated. (Specification/Model → generate → Specification-Based Tests → Implementation → compare Output.)
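The generate-and-compare loop on this slide can be sketched as back-to-back testing: run the same generated test vectors through the specification model and through the generated implementation, and flag any divergence. Both step functions below are hypothetical stand-ins for an executable model and for generator output.

```python
# Back-to-back testing sketch: the spec model is the oracle for the
# generated code. Any mismatch implicates the generator, the spec
# execution environment, or the target platform.
import itertools

def spec_step(mode, event):
    """Reference behavior taken from the specification model (toy)."""
    if event == "disengage":
        return "ROLL"
    if event in ("HDG", "NAV"):
        return event
    return mode

def impl_step(mode, event):
    """Generated code under test (here identical by construction)."""
    return spec_step(mode, event)

def back_to_back(tests):
    """Return the test cases on which spec and code disagree."""
    return [(mode, event) for mode, event in tests
            if spec_step(mode, event) != impl_step(mode, event)]

tests = list(itertools.product(["ROLL", "HDG", "NAV"],
                               ["HDG", "NAV", "disengage", "noop"]))
print(back_to_back(tests))  # []: no divergence found
```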

24 Specification Testing. Certify the execution environment: too costly and probably impossible. Specification-based testing: any discrepancy means the code generator is wrong, the execution environment is wrong, or the target platform is faulty. When have we tested enough? Specification coverage criteria; but what is adequate coverage? Criteria good for measurement are not good for generation: tests can technically cover the specification yet be useless. Do the tests reveal faults? It is a trade-off between the impossible and the inadequate.
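A minimal sketch of a specification coverage measure, assuming a toy mode-logic step function: it reports what fraction of (state, event) transitions a test suite exercises. Note that nothing here checks outputs, which is exactly the slide's warning that covering the model is not the same as revealing faults.

```python
# Specification-coverage sketch: fraction of the spec's (state, event)
# transition space exercised by a suite of input traces. The step
# function and event alphabet are invented for the illustration.
from itertools import product

MODES = ("ROLL", "HDG", "NAV")
EVENTS = ("HDG", "NAV", "disengage")

def step(mode, event):
    return "ROLL" if event == "disengage" else event

def transition_coverage(traces):
    """Fraction of (state, event) pairs taken while executing the traces."""
    taken = set()
    for trace in traces:
        mode = "ROLL"                    # each trace starts in ROLL
        for event in trace:
            taken.add((mode, event))
            mode = step(mode, event)
    return len(taken) / len(list(product(MODES, EVENTS)))

suite = [["HDG", "disengage"], ["NAV", "HDG", "disengage"]]
print(transition_coverage(suite))  # 4 of 9 pairs exercised
```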

25 Proof Techniques (theory): time savings; testing is reduced since properties are proved correct at the specification stage, where the proofs are performed.

26 Verification Trust: Concept Formation → Requirements → System Specification/Model → Implementation → Integration, with Properties. Is a proof still valid in the production environment? We need properties (requirements)! They are often lost in the modeling "frenzy". How do we trust our proofs?

27 Proof Techniques. Certify the analysis tools: too costly and probably impossible. Use redundant proof paths: technically feasible, but is the redundancy "trustworthy"? And at what cost? Automation is key: we must keep analysis cost under control. Generate test suites from the specification: low cost, since it is already done for the code generator. Trusted translators? RSML-e models are translated to a state-exploration tool, a model checker, and a theorem prover; there are many languages and many analysis techniques.

28 Proof Techniques (worst case): most analysis is neither easy nor cheap; it can be an added burden that cannot be leveraged later.

29 Regression Verification: 100s, if not 1000s, of properties over a large, evolving model, with analysis results iterated weekly, daily, or even hourly. The abstraction cost is amortized; what is the impact of a change on the abstraction? Approximate techniques are needed in day-to-day activities.
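One way to amortize cost across repeated verification runs is to cache verdicts keyed by a fingerprint of each property and the slice of the model it depends on, re-running the expensive analysis only when that slice changes. The names and the idea of a precomputed model slice are assumptions for this sketch; a real tool would compute the dependencies itself.

```python
# Regression-verification sketch: skip re-proving a property whose
# fingerprint (property text + relevant model slice) is unchanged.
import hashlib

cache = {}  # fingerprint -> verdict

def fingerprint(prop, model_slice):
    text = prop + "|" + "|".join(sorted(model_slice))
    return hashlib.sha256(text.encode()).hexdigest()

def verify(prop, model_slice, prove):
    """Return (verdict, reused); reused is True on a cache hit."""
    key = fingerprint(prop, model_slice)
    if key in cache:
        return cache[key], True
    verdict = prove(prop, model_slice)   # the expensive analysis
    cache[key] = verdict
    return verdict, False

# First run proves everything; an unchanged property is then reused.
prove = lambda p, s: "holds"
print(verify("shall_1", {"HDG_logic_v1"}, prove))  # ('holds', False)
print(verify("shall_1", {"HDG_logic_v1"}, prove))  # ('holds', True)
```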

30 Can We Achieve the Goal? Yes, with time savings from: abbreviated system testing augmented with generated tests; a redundant proof process (PVS, SMV, Prover, SAL, …); specification testing; test-case generation; a verifiable code generator; and automated unit testing (to MC/DC?) to check the code generator and the specification execution environment. Each step still carries open questions.
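MC/DC, the coverage criterion the slide mentions, requires showing that each condition in a decision independently affects the outcome. A small sketch, using an invented three-condition decision: for each condition we look for a pair of tests identical except in that one condition, with differing decision results.

```python
# MC/DC sketch: a condition is covered when the suite contains an
# "independence pair" for it: two tests differing only in that
# condition, with different decision outcomes.
from itertools import product

def decision(a, b, c):              # toy decision, e.g. "engage allowed"
    return a and (b or c)

def mcdc_satisfied(tests):
    """Return the set of condition indices that have an independence pair."""
    covered = set()
    for t1, t2 in product(tests, repeat=2):
        diff = [i for i in range(len(t1)) if t1[i] != t2[i]]
        if len(diff) == 1 and decision(*t1) != decision(*t2):
            covered.add(diff[0])
    return covered

suite = [(True, True, False), (False, True, False),
         (True, False, False), (True, False, True)]
print(mcdc_satisfied(suite))  # {0, 1, 2}: all three conditions covered
```

Note that four tests suffice here, while exhaustive testing of three conditions would take eight; that economy is why MC/DC is the avionics-level criterion.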

31 Perfection is Not Necessary: we only need to be better than we are now, missing no more faults than current practice does. How do we demonstrate this? Empirical studies are of great importance.

32 Education of Regulatory Agencies: regulatory agencies are very conservative, and rightly so; avionics software is very good. We need to understand regulatory and industry concerns to get our techniques into practice, and we need convincing evidence that our techniques work and are effective.

33 New Challenges for V&V. Validate models: the models must satisfy the "real" requirements; validate the properties used in analysis; model testing is crucial to success. Validate tools: we will rely heavily on tools for model validation; can we trust them? Creative use of testing is necessary. Verify and validate generated code: can we trust that the translation was correct? Test automation is crucial, including the output fed to analysis tools. Adapt to the various modeling notations: models will not come in one language, so we need translation between notations and tools.

34 Discussion

