Towards a Compositional SPIN
Corina Păsăreanu, QSS, NASA Ames Research Center
Dimitra Giannakopoulou, RIACS/USRA, NASA Ames Research Center

Project Overview

Main objective: an integrated environment that supports software development and verification/validation throughout the lifecycle; detect integration problems early, prior to coding.

Approach:
–Compositional ("divide and conquer") verification, for increased scalability, at design level
–Use design-level artifacts to improve/aid coding, testing and fault containment

[Figure: lifecycle from Requirements through Design, Coding, Testing and Deployment; models M1, M2 of components C1, C2 are checked compositionally before the implementations. The cost of detecting/fixing defects increases over the lifecycle; integration issues are handled early.]

Towards a Compositional SPIN

Learning-based compositional analysis for increased scalability [TACAS'03] – LTSA tool

Contributions:
–Generic tool architecture for learning-based assume-guarantee reasoning
 Handles multiple components
 Uses SPIN to answer model checking questions
 Other tools can be used (e.g. Java PathFinder)
–Heuristic for automated generation of interface specifications
–Application to a realistic resource arbiter for a spacecraft
 Significant memory gains over "traditional" non-compositional model checking

Outline

Introduction
Background: assume-guarantee reasoning and learning
Tool architecture
Implementation
–Using SPIN for answering learning questions
–Promela subset
Case study: MER Arbiter model
–Description, results, discussion
Conclusions and future work

Compositional Verification

Does the system made up of M1 and M2 satisfy property P?
–Checking P on the entire system: too many states!
–Use the natural decomposition of the system into its components to break up the verification task

Check components in isolation: does M1 satisfy P?
–Typically a component is designed to satisfy its requirements in specific contexts / environments

Assume-guarantee reasoning: introduce an assumption A representing M1's "context"

[Figure: M1 is checked against P under assumption A, with M2 as its environment.]

Assume-Guarantee Rules

Reason about triples: ⟨A⟩ M ⟨P⟩
–The formula is true if, whenever M is part of a system that satisfies A, the system also guarantees P

Simplest assume-guarantee rule:
1. ⟨A⟩ M1 ⟨P⟩
2. ⟨true⟩ M2 ⟨A⟩ ("discharges" the assumption)
3. ⟨true⟩ M1 || M2 ⟨P⟩

We can also handle symmetric rules.
How do we come up with the assumption? (usually a difficult manual process)
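As a sanity check, the rule's soundness can be demonstrated in a toy trace-set semantics (an assumption of this sketch, not the paper's FSM formalism): a component is the set of traces it allows over one common alphabet, composition is intersection, and ⟨A⟩ M ⟨P⟩ holds when every trace M shares with A is legal according to P.

```python
from itertools import combinations, product

def triple(A, M, P):
    """<A> M <P>: every trace of M permitted by A must be legal per P."""
    return (A & M) <= P

def premises_hold(A, M1, M2, P):
    # Premise 1: <A> M1 <P>; premise 2: <true> M2 <A> (all of M2 satisfies A)
    return triple(A, M1, P) and M2 <= A

# Exhaustively check soundness: whenever the premises hold for some
# assumption A, the conclusion <true> M1 || M2 <P> holds as well.
universe = ["a", "b", "ab"]
subsets = [frozenset(c) for n in range(len(universe) + 1)
           for c in combinations(universe, n)]
sound = all((m1 & m2) <= p
            for a, m1, m2, p in product(subsets, repeat=4)
            if premises_hold(a, m1, m2, p))
print(sound)  # True: the premises always imply the conclusion
```

In the real setting composition is not plain intersection (alphabets differ), but the containment argument has the same shape: traces(M1 || M2) ⊆ traces(M1) ∩ traces(A) ⊆ legal behaviors of P.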

Approaches

Infer assumptions automatically. Two solutions developed:
1. Algorithmic generation of the assumption (controller); knowledge of the environment is not required [ASE'02]
2. Incremental assumption computation based on counterexamples, learning and knowledge of the environment [TACAS'03, SAVCBS'03 (symmetric rules)]

Formalisms

Components modeled as finite state machines (FSMs)
–FSMs are assembled with the parallel composition operator "||", which synchronizes shared actions and interleaves the remaining actions

A property P is an FSM
–P describes all legal behaviors
–Perr: determinize and complete P; bad behaviors lead to "error"
–Component M satisfies P iff the error state is unreachable in (M || Perr)

Assume-guarantee reasoning
–Assumptions and guarantees are FSMs
–⟨A⟩ M ⟨P⟩ holds iff the error state is unreachable in (A || M || Perr)
–The alphabet of A, αA, contains all environment actions that appear in P
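These definitions can be made executable in a small Python sketch (the dict-based FSM encoding, state names, and the alternation property below are this example's own assumptions, not the tool's input format):

```python
from collections import deque

def compose(m1, m2):
    """Parallel composition '||': synchronize on shared actions,
    interleave the rest (reachable product construction)."""
    shared = m1["alpha"] & m2["alpha"]
    init = (m1["init"], m2["init"])
    seen, frontier, delta = {init}, deque([init]), {}
    while frontier:
        s1, s2 = frontier.popleft()
        for a in m1["alpha"] | m2["alpha"]:
            t1, t2 = m1["delta"].get((s1, a)), m2["delta"].get((s2, a))
            if a in shared:
                if t1 is None or t2 is None:
                    continue            # a shared action fires only jointly
                nxt = (t1, t2)
            elif a in m1["alpha"]:
                if t1 is None:
                    continue
                nxt = (t1, s2)          # local to m1: m2 stands still
            else:
                if t2 is None:
                    continue
                nxt = (s1, t2)          # local to m2: m1 stands still
            delta[((s1, s2), a)] = nxt
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return {"alpha": m1["alpha"] | m2["alpha"], "init": init,
            "delta": delta, "states": seen}

def satisfies(m, p_err):
    """M satisfies P iff the 'error' state is unreachable in M || Perr."""
    return all(q != "error" for (_, q) in compose(m, p_err)["states"])

# Perr for an alternation property: 'in' and 'out' must alternate,
# starting with 'in'; any other order leads to 'error'.
ORDER_ERR = {"alpha": {"in", "out"}, "init": 0,
             "delta": {(0, "in"): 1, (0, "out"): "error",
                       (1, "out"): 0, (1, "in"): "error"}}

GOOD = {"alpha": {"in", "out"}, "init": "s0",
        "delta": {("s0", "in"): "s1", ("s1", "out"): "s0"}}
BAD  = {"alpha": {"in", "out"}, "init": "s0",
        "delta": {("s0", "out"): "s1", ("s1", "in"): "s0"}}
```

Here satisfies(GOOD, ORDER_ERR) holds while satisfies(BAD, ORDER_ERR) does not; a triple ⟨A⟩ M ⟨P⟩ can be checked the same way, as satisfies(compose(A, M), Perr).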

Example

[Figure: FSMs Input (actions in, send, ack) and Output (actions out, send, ack), composed with ||, and the property Order over actions in and out, whose violations lead to err.]

Computing the Weakest Assumption

Given component M, property P, and the interface of M with its environment, generate the weakest environment assumption A such that: assuming A, M ⊨ P.

Weakest means that for all environments E: (E || M ⊨ P) iff E ⊨ A.

Learning for Assume-Guarantee Reasoning

Use an off-the-shelf learning algorithm to build an appropriate assumption for the rule:
1. ⟨A⟩ M1 ⟨P⟩
2. ⟨true⟩ M2 ⟨A⟩
3. ⟨true⟩ M1 || M2 ⟨P⟩

–The process is iterative: assumptions are generated by querying the system, and are gradually refined
–Queries are answered by model checking
–Refinement is based on counterexamples obtained by model checking
–Termination is guaranteed
–Extended with symmetric rules

Learning with L*

–L* algorithm by Angluin, improved by Rivest & Schapire
–Learns an unknown regular language U (over alphabet Σ) and produces a DFA A such that L(A) = U
–Uses a teacher to answer two types of questions:
 query: is string s in U? (answered true/false)
 conjecture: for a candidate DFA Ai, is L(Ai) = U? (if false, the teacher returns a string t to add to or remove from the conjecture)
–Outputs a DFA A such that L(A) = U
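A compact, runnable sketch of this loop (the target language, the brute-force equivalence bound, and the "add all counterexample suffixes as experiments" refinement, in the spirit of Rivest & Schapire's improvement, are this example's own choices):

```python
from itertools import product

class Teacher:
    """Answers L*'s two question types for a known target language:
    strings over {a, b} that contain the substring 'ab'."""
    alphabet = ("a", "b")

    def member(self, s):                       # query: is s in U?
        return "ab" in s

    def equivalent(self, accepts, max_len=6):  # conjecture: is L(Ai) = U?
        for n in range(max_len + 1):
            for w in map("".join, product(self.alphabet, repeat=n)):
                if accepts(w) != self.member(w):
                    return w                   # counterexample
        return None

def lstar(teacher):
    S, E = [""], [""]                          # access strings, experiments
    row = lambda s: tuple(teacher.member(s + e) for e in E)
    while True:
        # Close the table: every one-letter extension of S must have a
        # matching row in S; a new row becomes a new state.
        changed = True
        while changed:
            changed = False
            rows = {row(s) for s in S}
            for s in list(S):
                for a in teacher.alphabet:
                    if row(s + a) not in rows:
                        S.append(s + a)
                        rows.add(row(s + a))
                        changed = True
        rep = {row(s): s for s in S}           # state -> access string

        def accepts(w, rep=rep):               # run the hypothesis DFA
            cur = ""
            for a in w:
                cur = rep[row(cur + a)]
            return teacher.member(cur)

        cex = teacher.equivalent(accepts)
        if cex is None:
            return accepts, len(S)             # L(hypothesis) = U
        for i in range(len(cex) + 1):          # refine with all suffixes
            if cex[i:] not in E:
                E.append(cex[i:])

accepts, n_states = lstar(Teacher())
```

On this target the learner converges to the minimal 3-state DFA after one counterexample round.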

Learning Assumptions

Use L* to generate candidate assumptions; the assumption alphabet is αA = (αM1 ∪ αP) ∩ αM2

Model checking answers L*'s questions:
–query (string s): check ⟨s⟩ M1 ⟨P⟩ and answer true/false
–conjecture Ai: check ⟨Ai⟩ M1 ⟨P⟩
 if it fails, remove the counterexample and continue
 if it holds, check ⟨true⟩ M2 ⟨Ai⟩:
  if it holds, P holds in M1 || M2
  if it fails with a real error trace, P is violated; otherwise add the counterexample and continue

Rule:
1. ⟨A⟩ M1 ⟨P⟩
2. ⟨true⟩ M2 ⟨A⟩
3. ⟨true⟩ M1 || M2 ⟨P⟩
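The control flow of this loop can be sketched independently of the model checker; in the tool, the three checks are SPIN runs and the learner is L* (the callback interface and mock teacher below are this sketch's own abstraction):

```python
def ag_learn(learner, check_triple_m1, check_m2_satisfies, is_real_violation):
    """Iterate candidate assumptions until the rule's premises hold
    or a genuine property violation is found.

    check_triple_m1(A)     -> (holds, cex) for <A> M1 <P>
    check_m2_satisfies(A)  -> (holds, cex) for <true> M2 <A>
    is_real_violation(cex) -> does the trace really drive M1 to error?
    """
    while True:
        A = learner.conjecture()
        holds, cex = check_triple_m1(A)
        if not holds:
            learner.remove(cex)        # A allows a trace that breaks P in M1
            continue
        holds, cex = check_m2_satisfies(A)
        if holds:
            return ("P holds in M1 || M2", A)  # premises 1 and 2 discharged
        if is_real_violation(cex):
            return ("P violated", cex)
        learner.add(cex)               # spurious: M2 can do cex, A must allow it

# A mock learner and checkers exercising every branch of the loop:
class MockLearner:
    def __init__(self):
        self.candidates = iter(["A0", "A1", "A2"])
        self.feedback = []
    def conjecture(self):
        return next(self.candidates)
    def remove(self, t):
        self.feedback.append(("remove", t))
    def add(self, t):
        self.feedback.append(("add", t))

learner = MockLearner()
result, evidence = ag_learn(
    learner,
    check_triple_m1=lambda A: (False, "t0") if A == "A0" else (True, None),
    check_m2_satisfies=lambda A: (False, "t1") if A == "A1" else (True, None),
    is_real_violation=lambda cex: False,
)
```

Here the first candidate is weakened (remove "t0"), the second strengthened (add "t1"), and the third discharges both premises.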

Characteristics

–Terminates with the minimal automaton A for U
–Generates DFA candidates Ai with |A1| < |A2| < … < |A|
–Produces at most n candidates, where n = |A|
–# queries: O(kn² + n log m), where m is the size of the largest counterexample and k is the size of the alphabet
–For n components (M1 || M2 || … || Mn), apply the algorithm recursively:
 ⟨Ai⟩ M1 ⟨P⟩ as before
 ⟨true⟩ M2 || … || Mn ⟨Ai⟩ invokes the framework

 true  P  A i  Learning Interface Specifications Compute an assumption (as weak as possible) for a component M 1 to guarantee some property P, when environment is not available  A =  P L* query: string s true false  s  M 1  P  conjecture: A i  A i  M 1  P  true counterexample true false return A i counterexample not accurate with respect to !P || !M 1

Implementation in SPIN

–Stand-alone application that invokes SPIN for queries and conjectures
–Handles only a Promela subset
 Components are processes
 Communication through rendezvous channels
–Safety properties expressed as SPIN trace assertions
–Checking assume-guarantee triples: assumptions are encoded as environment processes that run in parallel with the components

Case Study

Model derived from the MER Resource Arbiter
–Local management of resource contention between resource consumers (e.g. science instruments, communication systems)
–Avoid simultaneous conflicts (e.g. driving and capturing a camera image are mutually incompatible)
–Enforce orderly activity in accordance with predefined priorities encoded in a lookup table

5 resources: Comm, Drive, PanCam, Arm, Rat
5 user threads: each non-deterministically decides to use any of the 5 resources
~3000 LOC
Checked several safety properties

Arbiter Architecture

[Figure: user threads U1–U5 send Request/Cancel messages to ARB, which replies with Grant, Deny or Rescind.]

Property P: mutual exclusion between resources; Comm and Drive can not be used at the same time.

MER Arbiter Property

Mutual exclusion between resources:
–Comm and Drive can not be used at the same time
–PanCam and Arm can not be used at the same time
–Rat and Arm can not be used at the same time

Incremental Property Checking

Compute A1 … A5 such that:
⟨A1⟩ U1 ⟨P⟩
⟨A2⟩ U2 ⟨A1⟩
⟨A3⟩ U3 ⟨A2⟩
⟨A4⟩ U4 ⟨A3⟩
⟨A5⟩ U5 ⟨A4⟩
⟨true⟩ ARB ⟨A5⟩

Result: P holds on U1 || … || U5 || ARB

Two techniques:
–Recursive invocation
–Interface specification

Comparison with non-compositional analysis

Analysis Results

Analysis   | Memory | State Space | t_model | t_compile | t_run | Assumption Size
Monolithic | 544 MB | 3.9e        | s       | 0.8s      | 33.7s | N/A
Recursive  | 2.6 MB |             | s       | 1.1s      | 0.03s |
Interface  | 2.6 MB |             | s       | 1.3s      | 0.02s | 12

Results do not reflect the cost of generating the assumptions.

Cost of Assumption Generation

Analysis       | queries | oracle 1 | oracle 2 | t_SPIN + t_Learn | Mem_Learn | t_LTSA | Mem_LTSA
Recursive      |         |          |          | s                | 1743 K    | 42.8s  | 20400 K
Interface (A1) |         |          |          | s                | 508 K     | 3.07s  | 4845 K

(the last two columns report the LTSA tool)

Discussion

–Significant memory savings
–Serious time overhead
–Experimented with SPIN's feature for separate compilation
 Encode assumptions as never claims
 Restrict the search of the verifier
–New encoding for queries
 Significant improvement
 Cost of interface generation reduced from s to s

Conclusion and Future Work

–Tool architecture for compositional verification
 Uses L* for automatic derivation of assumptions
 Automated derivation of interface specifications
–"Loose" integration with SPIN: model checking for queries and conjectures
–Application to the Arbiter case study
 Significant memory gains
 Serious time overhead; techniques to address this issue
–Future work
 Tighter integration with SPIN
 Parallel checks for queries
 Investigate buffered message-passing communication
 Liveness (learning for infinitary regular sets)
 Distinguish between read and write operations