
Compositional Analysis of System Architectures (using Lustre)
Mike Whalen, Program Director, University of Minnesota Software Engineering Center
Sponsored by NSF Research Grant CNS

Acknowledgements
Rockwell Collins (Darren Cofer, Andrew Gacek, Steven Miller, Lucas Wagner)
UPenn (Insup Lee, Oleg Sokolsky)
UMN (Mats P. E. Heimdahl)
CMU SEI (Peter Feiler)

Component Level Formal Analysis Efforts

Vision
System design & verification through pattern application and compositional reasoning
[Figure: an architecture model assembled from patterns (a fail-silent node built from replicated computing resources, voting over data from multiple sensors) with verified availability and integrity, connected to a compositional assume-guarantee proof of correctness over safety, behavioral, and performance properties; themes: abstraction, verification reuse, composition]

System verification
[Figure: a pattern & component specification library feeds a system modeling environment; architectural patterns are instantiated into the system model, models are annotated & verified, and the system implementation is auto-generated, with compositional reasoning & analysis spanning the specification and system development flow]
Instantiation: check structural constraints; embed assumptions & guarantees in the system model
Compositional verification: system properties are verified by model checking using component & pattern contracts
Reusable verification: proof of component and pattern requirements (guarantees) and specification of context (assumptions)

Hierarchical reasoning about systems
Avionics system requirement: under the single-fault assumption, GC output transient response is bounded in time and magnitude.
Relies upon:
◦ Accuracy of air data sensors
◦ Control commands from the FCS
  - Mode of the FGS
  - FGS control law behavior
  - Failover behavior between FGS systems
  - …
◦ Response of actuators
◦ Timing/lag/latency of communications
[Figure: the avionics system containing the FCS (Autopilot, FGS_L, FGS_R with system modes, control laws, and coordination) and air data sensors ADS_L, ADS_R]

Compositional Reasoning for Active Standby
We want to prove a transient response property:
◦ The autopilot will not cause a sharp change in the pitch of the aircraft,
◦ even when one FGS fails and the other assumes control.
Given assumptions about the environment:
◦ The sensed aircraft pitch from the air data system is within some absolute bound and does not change too quickly
◦ The discrepancy in sensed pitch between left- and right-side sensors is bounded
and guarantees provided by components:
◦ When an FGS is active, it will generate an acceptable pitch rate (a Lustre sketch of this guarantee follows below)
as well as facts provided by pattern application:
◦ Leader selection: at least one FGS will always be active (modulo one “failover” step)
transient_response_1 : assert true -> abs(CSA.CSA_Pitch_Delta) < CSA_MAX_PITCH_DELTA ;
transient_response_2 : assert true -> abs(CSA.CSA_Pitch_Delta - prev(CSA.CSA_Pitch_Delta, 0.0)) < CSA_MAX_PITCH_DELTA_STEP ;
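A minimal sketch of how the component guarantee above ("when an FGS is active, it will generate an acceptable pitch rate") could be written as a Lustre observer. The node, signal, and constant names (fgs_pitch_guarantee, active, pitch_delta, FGS_MAX_PITCH_DELTA) and the bound value are illustrative, not taken from the actual Rockwell Collins model:

-- Illustrative sketch, not the actual FGS contract
const FGS_MAX_PITCH_DELTA : real = 3.0;

node fgs_pitch_guarantee(active : bool; pitch_delta : real) returns (ok : bool);
var abs_delta : real;
let
  abs_delta = if pitch_delta > 0.0 then pitch_delta else -pitch_delta;
  -- whenever this FGS is the active side, its commanded pitch delta stays bounded
  ok = active => (abs_delta < FGS_MAX_PITCH_DELTA);
tel;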

Contracts between patterns and components
The avionics system requirement relies upon:
◦ Guarantees provided by patterns and components
◦ Structural properties of the model
◦ Resource allocation feasibility
◦ Probabilistic system-level failure characteristics
[Figure: the system requirement (under the single-fault assumption, GC output transient response is bounded in time and magnitude) is discharged through assumptions and guarantees of the leader-selection and PALS/replication patterns and the platform (synchronous communication, one node operational, timing constraints met, replicas not co-located, leader transition bounded), drawing on behavioral, structural, resource (RT scheduling & latency), and probabilistic (error model) evidence]
A principled mechanism for “passing the buck.”

Contracts
Derived from the Lustre and Property Specification Language (PSL) formalisms
◦ IEEE standard
◦ In wide use for hardware verification
Assume/guarantee style specification
◦ Assumptions: “Under these conditions…”
◦ Promises (guarantees): “…the system will do X”
Local definitions can be created to simplify properties
Contract:
fun abs(x: real) : real = if (x > 0) then x else -x ;
const ADS_MAX_PITCH_DELTA: real = 3.0 ;
const FCS_MAX_PITCH_SIDE_DELTA: real = 2.0 ;
…
property AD_L_Pitch_Step_Delta_Valid = true -> abs(AD_L.pitch.val - prev(AD_L.pitch.val, 0.0)) < ADS_MAX_PITCH_DELTA ;
…
active_assumption: assume some_fgs_active ;
transient_assumption : assume AD_L_Pitch_Step_Delta_Valid and AD_R_Pitch_Step_Delta_Valid and Pitch_lr_ok ;
transient_response_1 : assert true -> abs(CSA.CSA_Pitch_Delta) < CSA_MAX_PITCH_DELTA ;
transient_response_2 : assert true -> abs(CSA.CSA_Pitch_Delta - prev(CSA.CSA_Pitch_Delta, 0.0)) < CSA_MAX_PITCH_DELTA_STEP ;

Reasoning about contracts
Notionally: it is always the case that if the component assumption is true, then the component will ensure that the guarantee is true:
◦ G(A ⇒ P)
An assumption violation in the past may prevent a component from satisfying its current guarantee, so we instead require that the assumptions have been true at every step up to the current one (H is the “historically” operator):
◦ G(H(A) ⇒ P)
A Lustre encoding of this obligation is sketched below.
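The obligation G(H(A) ⇒ P) can be phrased as a synchronous observer that a model checker such as KIND must prove invariant. This is a minimal sketch, not the exact encoding used by the tools; assumption and guarantee stand in for a component's A and P:

node historically(a : bool) returns (h : bool);
let
  -- h holds iff a has held at every step up to and including the current one
  h = a and (true -> pre(h));
tel;

node contract_obligation(assumption, guarantee : bool) returns (ok : bool);
let
  -- the proof obligation: ok must be invariantly true
  ok = historically(assumption) => guarantee;
tel;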

Reasoning about Contracts
Given the set of component contracts Γ = { G(H(A_c) ⇒ P_c) | c ∈ C }, the architecture adds a set of obligations that tie the system assumption to the component assumptions.
This process can be repeated for any number of abstraction levels.

Composition Formulation
Suppose we have the component contracts Γ as above. Then if, for all q ∈ Q,
◦ Γ ⊢ G((Z(H(Θ_q)) ∧ Δ_q) ⇒ q)
then G(q) holds for all q ∈ Q. [Adapted from McMillan]
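As I read the rule, Z is the weak previous-step operator (true at the first step, otherwise the value of its argument one step earlier), Θ_q collects the assumptions relevant to obligation q, and Δ_q the supporting guarantees. A minimal Lustre sketch of one such obligation, with theta, delta, and q as hypothetical boolean streams:

node composition_obligation(theta, delta, q : bool) returns (ok : bool);
var h_theta, z_h_theta : bool;
let
  -- H(theta): theta has held at every step so far
  h_theta = theta and (true -> pre(h_theta));
  -- Z(H(theta)): true at the first step, thereafter H(theta) from the previous step
  z_h_theta = true -> pre(h_theta);
  -- the obligation itself; ok must be invariantly true
  ok = (z_h_theta and delta) => q;
tel;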

A concrete example
The order of data flow through system components is computed by the reasoning engine:
◦ {System inputs} → {FGS_L, FGS_R}
◦ {FGS_L, FGS_R} → {AP}
◦ {AP} → {System outputs}
Based on this flow, we establish four proof obligations (a sketch of the third appears below):
◦ System assumptions ⇒ FGS_L assumptions
◦ System assumptions ⇒ FGS_R assumptions
◦ System assumptions + FGS_L guarantees + FGS_R guarantees ⇒ AP assumptions
◦ System assumptions + {FGS_L, FGS_R, AP} guarantees ⇒ System guarantees
The system can handle circular flows, but the user has to choose where to break the cycle.
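A sketch of the third obligation, simplified by omitting the H/Z temporal closure from the previous slides. The observer inputs (sys_assume, fgs_l_guar, fgs_r_guar, ap_assume) are hypothetical names for the outputs of the corresponding assumption and guarantee observers; the actual tool generates these obligations automatically from the AADL annotations:

node ap_assumption_obligation(sys_assume, fgs_l_guar, fgs_r_guar, ap_assume : bool) returns (ok : bool);
let
  -- if the system assumptions and the upstream FGS guarantees hold,
  -- the autopilot's assumptions must follow
  ok = (sys_assume and fgs_l_guar and fgs_r_guar) => ap_assume;
tel;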

Tool Chain
◦ Enterprise Architect: SysML modeling, with SysML-to-AADL translation
◦ OSATE (Eclipse): AADL modeling
◦ EDICT: architectural patterns
◦ Lute: structural verification
◦ AGREE: compositional behavior verification, translating to Lustre for the KIND model checker

Research Challenges

Proving
The current back-end analysis is performed using an SMT-based k-induction model checking technique [Hagen and Tinelli, FMCAD 2008].
Very scalable if properties can be proven inductively.
Unfortunately, inductive proofs often fail because properties are too weak (a small example follows below).
There is lots of work on lemma/invariant discovery to strengthen properties:
◦ Bjesse and Claessen: SAT-based Verification without State Space Traversal
◦ Bradley: SAT-based Model Checking without Unrolling
◦ Tinelli: Instantiation-Based Invariant Discovery [NFM 2011]
But these strengthening methods are not targeted toward our problem.
The back end also only supports analysis of linear models.
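A minimal example of the "too weak to be inductive" problem, as a sketch with illustrative names; the property is true of every reachable state, yet plain k-induction cannot prove it without a strengthening invariant:

node even_counter(tick : bool) returns (ok : bool);
var x : int;
let
  x = 0 -> (if tick then pre(x) + 2 else pre(x));
  -- holds in every reachable state, but the inductive step admits the
  -- unreachable predecessor x = 5 (or 3, 1, ...), so induction fails at any depth
  ok = x <> 7;
  -- adding the auxiliary invariant (x mod 2 = 0) makes the property 1-inductive
tel;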

Scaling
What do you do when systems and subcomponents have hundreds of requirements?
◦ FGS mode logic: 280 requirements
◦ DWM: >600 requirements
Need to create automated slicing techniques for predicates rather than code.
◦ Perhaps this will be in the form of counterexample-guided refinement

Assigning blame
Counterexamples are often hard to understand for big models, and it is much worse (in my experience) for property-based models.
Given a counterexample, can you automatically assign blame to one or more subcomponents?
Given a “blamed” component, can you automatically open the black box to strengthen the component guarantee?
[Counterexample table: per-step values of the sensor inputs (AD_L/AD_R pitch value and validity), the FGS and autopilot outputs (GC pitch-delta commands, active modes, leader-selection outputs), the component assumptions (all true throughout), and the system-level guarantee, which eventually becomes false]

“Argument Engineering”
There are disparate kinds of evidence throughout the system:
◦ Probabilistic
◦ Resource
◦ Structural properties of the model
◦ Behavioral properties of the model
How do we tie these things together?
An evidence graph, similar to the proof graph in PVS, that shows evidential obligations that have not been discharged.

Dealing with Time
The current analysis is synchronous:
◦ It assumes all subcomponents run at the same rate
◦ It assumes a single-step delay between subcomponents (sketched below)
This is not how the world works!
◦ …unless you use Time-Triggered Architectures or PALS
Adding more realistic support for time is crucial to accurate analyses:
◦ Time intervals tend to diverge in hierarchical verification, e.g. for synchronization
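For what the "single-step delay between subcomponents" assumption amounts to, here is a minimal sketch; the node name, the real type, and the 0.0 initial value are placeholders:

node delayed_connection(produced : real) returns (consumed : real);
let
  -- the consuming component sees the value the producer computed one step earlier
  consumed = 0.0 -> pre(produced);
tel;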

Provocations
We do not yet have a clear idea of how to partition system analyses to perform effective compositional reasoning across domains.
The Collins/UMN META tools are a first step toward this goal.
We need research on combining analyses to make overall system analysis more effective.

Conclusions
Still early work…
◦ Many AADL constructs are left to be mapped
◦ Many timing issues need to be resolved
◦ Better support for proof engineering needs to be found
But:
◦ The tools can already do some interesting analysis
◦ The approach sits in a nice intersection between requirements engineering and formal methods
◦ There is lots of work yet on how best to specify requirements

Thank you!