Rigorous Software Development CSCI-GA 3033-011 Instructor: Thomas Wies Spring 2012 Lecture 12.



Axiomatic Semantics
An axiomatic semantics consists of:
– a language for stating assertions about programs;
– rules for establishing the truth of assertions.
Some typical kinds of assertions:
– This program terminates.
– If this program terminates, the variables x and y have the same value throughout the execution of the program.
– The array accesses are within the array bounds.
Some typical languages of assertions:
– first-order logic
– other logics (temporal, linear)
– special-purpose specification languages (Z, Larch, JML)

Assertions for IMP
The assertions we make about IMP programs are of the form:
{A} c {B}
with the meaning that:
– if A holds in state q and q →c q′,
– then B holds in q′.
A is the precondition and B is the postcondition.
For example,
{ y ≤ x } z := x; z := z + 1 { y < z }
is a valid assertion. Such assertions are called Hoare triples or Hoare assertions.

Semantics of Hoare Triples
Now we can define formally the meaning of a partial correctness assertion:
⊨ {A} c {B} iff ∀q ∈ Q. ∀q′ ∈ Q. q ⊨ A ∧ q →c q′ ⇒ q′ ⊨ B
and the meaning of a total correctness assertion:
⊨ [A] c [B] iff ∀q ∈ Q. q ⊨ A ⇒ ∃q′ ∈ Q. q →c q′ ∧ q′ ⊨ B
or, even better:
∀q ∈ Q. ∀q′ ∈ Q. q ⊨ A ∧ q →c q′ ⇒ q′ ⊨ B
∧ ∀q ∈ Q. q ⊨ A ⇒ ∃q′ ∈ Q. q →c q′ ∧ q′ ⊨ B

Inference Rules for Hoare Triples
We write ⊢ {A} c {B} when we can derive the triple using inference rules.
There is one inference rule for each command in the language, plus the rule of consequence:

  ⊢ A ⇒ A′    ⊢ {A′} c {B′}    ⊢ B′ ⇒ B
  -------------------------------------
             ⊢ {A} c {B}

Inference Rules for Hoare Logic
One rule for each syntactic construct:

  ----------------        ----------------------
  ⊢ {A} skip {A}          ⊢ {A[e/x]} x := e {A}

  ⊢ {A ∧ b} c1 {B}    ⊢ {A ∧ ¬b} c2 {B}
  -------------------------------------
     ⊢ {A} if b then c1 else c2 {B}

  ⊢ {A} c1 {B}    ⊢ {B} c2 {C}
  ----------------------------
       ⊢ {A} c1; c2 {C}

       ⊢ {I ∧ b} c {I}
  ----------------------------
  ⊢ {I} while b do c {I ∧ ¬b}
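The assignment rule has a direct semantic reading: the precondition A[e/x] holds in a state q exactly when A holds in the state obtained from q by overwriting x with the value of e. A minimal executable sketch of this reading (our own encoding, not part of the lecture: states are Python dicts, assertions and expressions are Python functions over states):

```python
def wp_assign(x, e, post):
    """Weakest precondition of `x := e` w.r.t. predicate `post`.

    `e` and `post` are Python functions over states (dicts)."""
    def pre(q):
        q2 = dict(q)
        q2[x] = e(q)          # execute the assignment on a copy of the state
        return post(q2)       # A[e/x] holds in q iff A holds afterwards
    return pre

# The IMP example triple { y <= x } z := x; z := z + 1 { y < z }:
post = lambda q: q["y"] < q["z"]
mid  = wp_assign("z", lambda q: q["z"] + 1, post)   # wp of z := z + 1
pre  = wp_assign("z", lambda q: q["x"], mid)        # wp of z := x

# pre is equivalent to y <= x; spot-check on two states:
print(pre({"x": 3, "y": 3, "z": 99}))   # True  (y <= x)
print(pre({"x": 2, "y": 5, "z": 0}))    # False (y > x)
```

The composition mirrors the sequencing rule: the midpoint assertion is the weakest precondition of the second assignment.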

Example: A Proof in Hoare Logic
We want to derive that
{n ≥ 0}
p := 0; x := 0;
while x < n do (x := x + 1; p := p + m)
{p = n * m}

Example: A Proof in Hoare Logic
⊢ {n ≥ 0} p:=0; x:=0; while x < n do (x:=x+1; p:=p+m) {p = n * m}
Only applicable rule (except for the rule of consequence):

  ⊢ {A} c1 {C}    ⊢ {C} c2 {B}
  ----------------------------
       ⊢ {A} c1; c2 {B}

This splits the goal into:
⊢ {n ≥ 0} p:=0; x:=0 {C}
⊢ {C} while x < n do (x:=x+1; p:=p+m) {p = n * m}

Example: A Proof in Hoare Logic
⊢ {C} while x < n do (x:=x+1; p:=p+m) {p = n * m}
What is C? Look at the next possible matching rule for c2. Only applicable rule (except for the rule of consequence):

       ⊢ {I ∧ b} c {I}
  ----------------------------
  ⊢ {I} while b do c {I ∧ ¬b}

We can match {I} with {C}, but we cannot match {I ∧ ¬b} and {p = n * m} directly. We need to apply the rule of consequence first!

Example: A Proof in Hoare Logic
Rule of consequence:

  ⊢ A ⇒ A′    ⊢ {A′} c {B′}    ⊢ B′ ⇒ B
  -------------------------------------
             ⊢ {A} c {B}

Here we instantiate I = A = A′ = C.

Example: A Proof in Hoare Logic
What is I? Let's keep it as a placeholder for now! Applying the while rule and the rule of consequence leaves:
⊢ {n ≥ 0} p:=0; x:=0 {I}
⊢ {I ∧ x < n} x:=x+1; p:=p+m {I}
⊢ I ∧ x ≥ n ⇒ p = n * m
Next applicable rule, for the loop body:

  ⊢ {A} c1 {C}    ⊢ {C} c2 {B}
  ----------------------------
       ⊢ {A} c1; c2 {B}

Example: A Proof in Hoare Logic
⊢ {I ∧ x<n} x := x+1 {C}    ⊢ {C} p:=p+m {I}
What is C? Look at the next possible matching rule for c2. Only applicable rule (except for the rule of consequence):
⊢ {A[e/x]} x := e {A}

Example: A Proof in Hoare Logic
Applying the assignment rule to p := p + m gives:
⊢ {I[p+m/p]} p:=p+m {I}
The remaining goal for the loop body is:
⊢ {I ∧ x<n} x:=x+1 {I[p+m/p]}

Example: A Proof in Hoare Logic
⊢ {I ∧ x<n} x:=x+1 {I[p+m/p]}
Only applicable rule (except for the rule of consequence):
⊢ {A[e/x]} x := e {A}
It yields ⊢ {I[x+1/x, p+m/p]} x:=x+1 {I[p+m/p]}. We need the rule of consequence to match {I ∧ x<n} and {I[x+1/x, p+m/p]}.

Example: A Proof in Hoare Logic
⊢ I ∧ x < n ⇒ I[x+1/x, p+m/p]
⊢ {I[x+1/x, p+m/p]} x:=x+1 {I[p+m/p]}
Let's just remember the open proof obligations!

Example: A Proof in Hoare Logic
Continue with the remaining part of the proof tree, as before:
⊢ {I[0/x]} x:=0 {I}
⊢ {n ≥ 0} p:=0 {I[0/x]}
⊢ {I[0/p, 0/x]} p:=0 {I[0/x]}
⊢ n ≥ 0 ⇒ I[0/p, 0/x]
The open proof obligations so far:
⊢ n ≥ 0 ⇒ I[0/p, 0/x]
⊢ I ∧ x < n ⇒ I[x+1/x, p+m/p]
⊢ I ∧ x ≥ n ⇒ p = n * m
Now we only need to solve the remaining constraints!

Example: A Proof in Hoare Logic
Find I such that all constraints are simultaneously valid:
⊢ n ≥ 0 ⇒ I[0/p, 0/x]
⊢ I ∧ x < n ⇒ I[x+1/x, p+m/p]
⊢ I ∧ x ≥ n ⇒ p = n * m
Choose I ≡ p = x * m ∧ x ≤ n. The constraints become:
⊢ n ≥ 0 ⇒ 0 = 0 * m ∧ 0 ≤ n
⊢ p = x * m ∧ x ≤ n ∧ x < n ⇒ p + m = (x+1) * m ∧ x + 1 ≤ n
⊢ p = x * m ∧ x ≤ n ∧ x ≥ n ⇒ p = n * m
All constraints are valid!

Using Hoare Rules
Hoare rules are mostly syntax directed.
There are three obstacles to automation of Hoare logic proofs:
– When to apply the rule of consequence?
– What invariant to use for while?
– How do you prove the implications involved in the rule of consequence?
The last one is how theorem proving gets into the picture.
– This turns out to be doable!
– The loop invariants turn out to be the hardest problem!
– Should the programmer give them?

Verification Conditions
Goal: given a Hoare triple {A} P {B}, derive a single assertion VC(A,P,B) such that
⊨ VC(A,P,B) iff ⊨ {A} P {B}
VC(A,P,B) is called the verification condition.
Verification condition generation factors out the hard work:
– finding loop invariants
– finding function specifications
Assume programs are annotated with such specifications:
– We will assume that the new form of the while construct includes an invariant: {I} while b do c
– The invariant formula I must hold every time before b is evaluated.

Verification Condition Generation
Idea for VC generation: propagate the postcondition backwards through the program:
– from {A} P {B}
– generate A ⇒ F(P, B)
This backwards propagation F(P, B) can be formalized in terms of weakest preconditions.

Weakest Preconditions
The weakest precondition WP(c,B) holds for any state q whose c-successor states all satisfy B:
q ⊨ WP(c,B) iff ∀q′ ∈ Q. q →c q′ ⇒ q′ ⊨ B
Compute WP(P,B) recursively according to the structure of the program P.

Loop-Free Guarded Commands
Introduce loop-free guarded commands as an intermediate representation of the verification condition:
c ::= assume b | assert b | havoc x | c1; c2 | c1 [] c2
where c1 [] c2 is nondeterministic choice.

From Programs to Guarded Commands
GC(skip) = assume true
GC(x := e) = assume tmp = x; havoc x; assume x = e[tmp/x]   (where tmp is fresh)
GC(c1; c2) = GC(c1); GC(c2)
GC(if b then c1 else c2) = (assume b; GC(c1)) [] (assume ¬b; GC(c2))
GC({I} while b do c) = ?

Guarded Commands for Loops
GC({I} while b do c) =
  assert I;
  havoc x1; ...; havoc xn;
  assume I;
  ( (assume b; GC(c); assert I; assume false)
    []
    assume ¬b )
where x1, ..., xn are the variables modified in c.
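The loop translation can be sketched executably. In this illustration (our own encoding, not from the slides), guarded commands are nested Python tuples, conditions are kept as plain strings, and the loop body is a placeholder where a real translator would inline GC(c):

```python
def seq(*cs):
    """Left-nested sequential composition of guarded commands."""
    out = cs[0]
    for c in cs[1:]:
        out = ("seq", out, c)
    return out

def gc_while(inv, b, body_gc, modified):
    """GC({inv} while b do c): check the invariant, jump to an arbitrary
    iteration by havocking the modified variables, then either take one
    more iteration (cutting the trace with `assume false`) or exit."""
    havocs = [("havoc", x) for x in modified]
    loop = seq(("assume", b), body_gc, ("assert", inv), ("assume", "false"))
    exit_ = ("assume", f"not ({b})")
    return seq(("assert", inv), *havocs, ("assume", inv),
               ("choice", loop, exit_))

# The loop of the running example, with a placeholder body:
gc = gc_while("p == x*m and x <= n", "x < n",
              ("assume", "loop_body"), ["x", "p"])
print(gc)
```

The `assume false` at the end of the loop branch is what makes the translation loop-free: any trace that takes one more iteration is cut off after re-establishing the invariant.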

Computing Weakest Preconditions
WP(assume b, B) = b ⇒ B
WP(assert b, B) = b ∧ B
WP(havoc x, B) = B[a/x]   (a fresh in B)
WP(c1; c2, B) = WP(c1, WP(c2, B))
WP(c1 [] c2, B) = WP(c1, B) ∧ WP(c2, B)

Putting Everything Together
Given a Hoare triple H ≡ {A} P {B}:
– compute cH = assume A; GC(P); assert B
– compute VCH = WP(cH, true)
– infer ⊢ VCH using a theorem prover.
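For a tiny straight-line instance, the WP equations can be run directly. A sketch (our own illustration: formulas are raw strings with `->` and `&` as connectives, and substitution is textual; real VC generators work on ASTs with careful fresh-variable management) computing the VC for {x >= 0} x := x + 1 {x >= 1}:

```python
import re
from itertools import count

fresh = count()

def subst(formula, x, e):
    # Textual substitution formula[e/x] on whole-word occurrences of x.
    return re.sub(rf"\b{x}\b", f"({e})", formula)

def wp(c, B):
    """Weakest precondition of a guarded command w.r.t. formula string B."""
    tag = c[0]
    if tag == "assume":
        return f"({c[1]}) -> ({B})"
    if tag == "assert":
        return f"({c[1]}) & ({B})"
    if tag == "havoc":
        a = f"_a{next(fresh)}"          # fresh logical variable
        return subst(B, c[1], a)
    if tag == "seq":
        return wp(c[1], wp(c[2], B))
    if tag == "choice":
        return f"({wp(c[1], B)}) & ({wp(c[2], B)})"
    raise ValueError(tag)

# x := x + 1 desugared as on the slides: assume tmp = x; havoc x; assume x = tmp + 1
prog = ("seq", ("assume", "tmp = x"),
               ("seq", ("havoc", "x"), ("assume", "x = tmp + 1")))
cH = ("seq", ("assume", "x >= 0"), ("seq", prog, ("assert", "x >= 1")))
vc = wp(cH, "true")
print(vc)
```

The resulting string nests the implications exactly as the WP equations dictate: the outermost antecedent is the precondition `x >= 0`, and the havocked `x` reappears as the fresh variable `_a0`.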

Example: VC Generation
{n ≥ 0}
p := 0; x := 0;
{p = x * m ∧ x ≤ n}
while x < n do (x := x + 1; p := p + m)
{p = n * m}

Example: VC Generation
Computing the guarded command: starting from
assume n ≥ 0; GC(p := 0; x := 0; {p = x * m ∧ x ≤ n} while x < n do (x := x + 1; p := p + m)); assert p = n * m
and desugaring the assignments and the loop in turn yields
assume n ≥ 0;
assume p0 = p; havoc p; assume p = 0;
assume x0 = x; havoc x; assume x = 0;
assert p = x * m ∧ x ≤ n;
havoc x; havoc p;
assume p = x * m ∧ x ≤ n;
( (assume x < n;
   assume x1 = x; havoc x; assume x = x1 + 1;
   assume p1 = p; havoc p; assume p = p1 + m;
   assert p = x * m ∧ x ≤ n;
   assume false)
  []
  assume x ≥ n );
assert p = n * m

Example: VC Generation
Computing the weakest precondition WP(cH, true) of this guarded command yields:
n ≥ 0 ∧ p0 = p ∧ pa3 = 0 ∧ x0 = x ∧ xa3 = 0 ⇒
  pa3 = xa3 * m ∧ xa3 ≤ n
  ∧ (pa2 = xa2 * m ∧ xa2 ≤ n ⇒
      ((xa2 < n ∧ x1 = xa2 ∧ xa1 = x1 + 1 ∧ p1 = pa2 ∧ pa1 = p1 + m) ⇒ pa1 = xa1 * m ∧ xa1 ≤ n)
      ∧ (xa2 ≥ n ⇒ pa2 = n * m))
Here pa1, pa2, pa3, xa1, xa2, xa3 are the fresh variables introduced for the havoc statements.

Example: VC Generation
The resulting VC is equivalent to the conjunction of the following implications:
n ≥ 0 ∧ p0 = p ∧ pa3 = 0 ∧ x0 = x ∧ xa3 = 0
  ⇒ pa3 = xa3 * m ∧ xa3 ≤ n
n ≥ 0 ∧ p0 = p ∧ pa3 = 0 ∧ x0 = x ∧ xa3 = 0 ∧ pa2 = xa2 * m ∧ xa2 ≤ n ∧ xa2 ≥ n
  ⇒ pa2 = n * m
n ≥ 0 ∧ p0 = p ∧ pa3 = 0 ∧ x0 = x ∧ xa3 = 0 ∧ pa2 = xa2 * m ∧ xa2 < n ∧ x1 = xa2 ∧ xa1 = x1 + 1 ∧ p1 = pa2 ∧ pa1 = p1 + m
  ⇒ pa1 = xa1 * m ∧ xa1 ≤ n

Example: VC Generation
Simplifying the constraints yields:
n ≥ 0 ⇒ 0 = 0 * m ∧ 0 ≤ n
xa2 ≤ n ∧ xa2 ≥ n ⇒ xa2 * m = n * m
xa2 < n ⇒ xa2 * m + m = (xa2 + 1) * m ∧ xa2 + 1 ≤ n
All of these implications are valid, which proves that the original Hoare triple was valid, too.

The Diamond Problem
assume A; (c [] d); (c [] d); assert B
WP yields
A ⇒ WP(c, WP(c, B) ∧ WP(d, B)) ∧ WP(d, WP(c, B) ∧ WP(d, B))
The number of paths through the program can be exponential in the size of the program.
The size of the weakest precondition can be exponential in the size of the program.
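The blowup can be made concrete by counting formula size. A back-of-the-envelope sketch (our own illustration): for a chain of n diamonds, each choice duplicates the entire postcondition, so naive WP grows like 2^n:

```python
def wp_size(n_diamonds, post_size=1):
    """Size of WP for n copies of (c [] d), counting one connective per
    choice: WP(c [] d, B) = WP(c, B) /\ WP(d, B) contains two copies of B."""
    size = post_size
    for _ in range(n_diamonds):
        size = 2 * size + 1   # two copies of the current formula, one conjunction
    return size

print([wp_size(n) for n in range(1, 6)])   # [3, 7, 15, 31, 63]
```

Doubling at every diamond is exactly the behavior the two-pair WP on the next slide is designed to avoid.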

Avoiding the Exponential Explosion
Defer the work of exploring all paths to the theorem prover:
WP(assume b, B, C) = (b ⇒ B, C)
WP(assert b, B, C) = (b ∧ B, C)
WP(havoc x, B, C) = (B[a/x], C)   (a fresh in B)
WP(c1; c2, B, C) = let (F2, C2) = WP(c2, B, C) in WP(c1, F2, C2)
WP(c1 [] c2, B, C) = let X = fresh propositional variable in
                     let (F1, C1) = WP(c1, X, true) and (F2, C2) = WP(c2, X, true) in
                     (F1 ∧ F2, C ∧ C1 ∧ C2 ∧ (X ⇔ B))
WP(P, B) = let (F, C) = WP(P, B, true) in C ⇒ F

Translating Method Calls to GCs
Given a method with specification
requires P;
assignable x1, ..., xn;
ensures Q;
T m(T1 p1, ..., Tk pk) { ... }
a method call y = x.m(y1, ..., yk); is desugared into the guarded command
assert P[x/this, y1/p1, ..., yk/pk];
havoc x1; ...; havoc xn; havoc y;
assume Q[x/this, y/\result]

Handling More Complex Program State
When is the following Hoare triple valid?
{A} x.f = 5 {x.f + y.f = 10}
A ought to imply y.f = 5 ∨ x = y.
The IMP Hoare rule for assignment would give us:
(x.f + y.f = 10)[5/x.f] ≡ 5 + y.f = 10 ≡ y.f = 5
(we lost one case)
How come the rule does not work?

Modeling the Heap
We cannot have side-effects in assertions.
– While generating the VC we must remove side-effects!
– But how to do that when lacking precise aliasing information?
Important technique: postpone alias analysis to the theorem prover.
Model the state of the heap as a symbolic mapping from addresses to values:
– if e denotes an address and h a heap state, then
– sel(h,e) denotes the contents of the memory cell at address e, and
– upd(h,e,v) denotes a new heap state obtained from h by writing v at address e.

Heap Models
We allow variables to range over heap states.
– So we can quantify over all possible heap states.
Model 1
– one heap for each object
– one index constant for each field; we postulate f1 ≠ f2
– r.f1 is sel(r,f1) and r.f1 = e is r := upd(r,f1,e)
Model 2 (Burstall-Bornat)
– one heap for each field
– the object address is the index
– r.f1 is sel(f1,r) and r.f1 = e is f1 := upd(f1,r,e)

Hoare Rule for Field Writes
To model writes correctly, we use heap expressions.
– A field write changes the heap of that field:
{ B[upd(f, e1, e2)/f] } e1.f = e2 {B}
Important technique: model the heap as a semantic object and defer reasoning about heap expressions to the theorem prover, with inference rules such as McCarthy's:
sel(upd(h, e1, e2), e3) = e2          if e1 = e3
sel(upd(h, e1, e2), e3) = sel(h, e3)  if e1 ≠ e3
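McCarthy's sel/upd axioms have an obvious executable reading over functional heaps. A sketch (our own encoding: a heap is an immutable dict from object addresses to values, following the Burstall-Bornat one-heap-per-field model) that replays the field-write example on concrete states:

```python
def sel(h, e):
    """Read the cell at address e."""
    return h[e]

def upd(h, e, v):
    """Functional update: returns a new heap, leaving h unchanged."""
    h2 = dict(h)
    h2[e] = v
    return h2

# {A} x.f = 5 {x.f + y.f = 10}: does the postcondition hold after the write?
def post_holds(f, x, y):
    f2 = upd(f, x, 5)               # effect of x.f = 5 on the f-heap
    return sel(f2, x) + sel(f2, y) == 10

# Case x = y: holds regardless of the old value of y.f.
print(post_holds({"o1": 0}, "o1", "o1"))          # True
# Case x != y: holds iff y.f = 5 beforehand.
print(post_holds({"o1": 0, "o2": 5}, "o1", "o2")) # True
print(post_holds({"o1": 0, "o2": 7}, "o1", "o2")) # False
```

The three runs are exactly the case split sel(upd(f, x, 5), y) forces: the aliasing case x = y and the two non-aliasing cases, recovering the precondition x = y ∨ y.f = 5 that the naive IMP rule lost.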

Example: Hoare Rule for Field Writes
Consider again: {A} x.f = 5 {x.f + y.f = 10}
We obtain:
A ≡ (x.f + y.f = 10)[upd(f, x, 5)/f]
  ≡ (sel(f, x) + sel(f, y) = 10)[upd(f, x, 5)/f]
  ≡ sel(upd(f, x, 5), x) + sel(upd(f, x, 5), y) = 10   (*)
  ≡ 5 + sel(upd(f, x, 5), y) = 10
  ≡ if x = y then 5 + 5 = 10 else 5 + sel(f, y) = 10
  ≡ x = y ∨ y.f = 5   (**)
Up to (*) is theorem generation. From (*) to (**) is theorem proving.

Modeling new Statements
Introduce
– a new predicate isAllocated(e, t) denoting that object e is allocated at allocation time t,
– and a new variable allocTime denoting the current allocation time.
Add background axioms:
allocTime = 0
∀x t. isAllocated(x, t) ⇒ isAllocated(x, t+1)
isAllocated(null, 0)
Translate new x.T() to
havoc x;
assume ¬isAllocated(x, allocTime);
assume Type(x) = T;
assume isAllocated(x, allocTime + 1);
allocTime := allocTime + 1;
followed by the translation of the call to the constructor x.T().

Invariant Generation

Tools such as ESC/Java enable automated program verification by
– automatically generating verification conditions and
– automatically checking validity of the generated VCs.
The user still needs to provide the invariants.
– This is often the hardest part.
Can we generate invariants automatically?

Programs as Systems of Constraints
1: assume y ≥ z;
2: while x < y do x := x + 1;
3: assert x ≥ z
[Control-flow graph: ℓ1 → ℓ2 with y ≥ z; ℓ2 → ℓ2 with x < y ∧ x′ = x + 1; ℓ2 → ℓ3 with x ≥ y; ℓ3 → ℓerr with x < z; ℓ3 → ℓexit with x ≥ z]
ρ1 = move(ℓ1, ℓ2) ∧ y ≥ z ∧ skip(x, y, z)
ρ2 = move(ℓ2, ℓ2) ∧ x < y ∧ x′ = x + 1 ∧ skip(y, z)
ρ3 = move(ℓ2, ℓ3) ∧ x ≥ y ∧ skip(x, y, z)
ρ4 = move(ℓ3, ℓerr) ∧ x < z ∧ skip(x, y, z)
ρ5 = move(ℓ3, ℓexit) ∧ x ≥ z ∧ skip(x, y, z)
move(ℓ, ℓ′) = (pc = ℓ ∧ pc′ = ℓ′)
skip(x1, ..., xn) = (x1′ = x1 ∧ ... ∧ xn′ = xn)

Program P = (V, init, R, error)
– V: finite set of program variables
– init: initiation condition, given by a formula over V
– R: a finite set of transition constraints
  – a transition constraint ρ ∈ R is given by a formula over V and their primed versions V′
  – we often think of R as the disjunction of its elements, R = ρ1 ∨ ... ∨ ρn
– error: error condition, given by a formula over V

Programs as Systems of Constraints
P = (V, init, R, error)
V = {pc, x, y, z}
init = (pc = ℓ1)
R = {ρ1, ρ2, ρ3, ρ4, ρ5} where
ρ1 = move(ℓ1, ℓ2) ∧ y ≥ z ∧ skip(x, y, z)
ρ2 = move(ℓ2, ℓ2) ∧ x < y ∧ x′ = x + 1 ∧ skip(y, z)
ρ3 = move(ℓ2, ℓ3) ∧ x ≥ y ∧ skip(x, y, z)
ρ4 = move(ℓ3, ℓerr) ∧ x < z ∧ skip(x, y, z)
ρ5 = move(ℓ3, ℓexit) ∧ x ≥ z ∧ skip(x, y, z)
error = (pc = ℓerr)

Programs as Transition Systems
– states Q are the valuations of the program variables V
– initial states Qinit are the states satisfying the initiation condition init:
  Qinit = {q ∈ Q | q ⊨ init}
– the transition relation → is the relation defined by the transition constraints in R:
  q1 → q2 iff (q1, q2) ⊨ R
– error states Qerr are the states satisfying the error condition error:
  Qerr = {q ∈ Q | q ⊨ error}

Partial Correctness of Programs
– a state q is reachable in P if it occurs in some computation of P:
  q0 → q1 → q2 → ... → q,  where q0 ∈ Qinit
– denote by Qreach the set of all reachable states of P
– a program P is safe if no error state is reachable in P:
  Qreach ∩ Qerr = ∅
– or, if Qreach is expressed as a formula reach over V:
  ⊨ reach ∧ error ⇒ false

Partial Correctness of Programs
[Figure: the state space Q, containing the initial states Qinit inside the reachable states Qreach, disjoint from the error states Qerr]

Example: Reachable States of a Program
1: assume y ≥ z;
2: while x < y do x := x + 1;
3: assert x ≥ z
Reachable states:
reach = pc = ℓ1
  ∨ pc = ℓ2 ∧ y ≥ z
  ∨ pc = ℓ3 ∧ y ≥ z ∧ x ≥ y
  ∨ pc = ℓexit ∧ y ≥ z ∧ x ≥ y
What is the connection with invariants? Can we compute reach?
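When the variables range over a small finite domain, reach can be computed by brute force. A sketch (our own explicit-state encoding of the example program; the slides compute reach symbolically, which is what scales to unbounded domains):

```python
from itertools import product

# State: (pc, x, y, z) with pc in {1, 2, 3, "exit", "err"}.
def step(q):
    pc, x, y, z = q
    if pc == 1 and y >= z:  yield (2, x, y, z)        # rho1
    if pc == 2 and x < y:   yield (2, x + 1, y, z)    # rho2
    if pc == 2 and x >= y:  yield (3, x, y, z)        # rho3
    if pc == 3 and x < z:   yield ("err", x, y, z)    # rho4
    if pc == 3 and x >= z:  yield ("exit", x, y, z)   # rho5

def reach(init_states):
    """Standard worklist fixpoint: saturate the set of seen states."""
    seen, frontier = set(init_states), list(init_states)
    while frontier:
        q = frontier.pop()
        for q2 in step(q):
            if q2 not in seen:
                seen.add(q2)
                frontier.append(q2)
    return seen

dom = range(-2, 3)
init = {(1, x, y, z) for x, y, z in product(dom, dom, dom)}
R = reach(init)
# No error state is reachable on this finite domain:
print(any(q[0] == "err" for q in R))   # False
```

Every reachable state at location 3 indeed satisfies y ≥ z ∧ x ≥ y, matching the disjunct of reach above.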

Invariants of Programs
– an invariant QI of a program P is a superset of its reachable states: Qreach ⊆ QI
– an invariant QI is safe if it does not contain any error states:
  QI ∩ Qerr = ∅
– or, if QI is expressed as a formula I over V:
  ⊨ I ∧ error ⇒ false
reach is the smallest invariant of P. In particular, if P is safe then so is reach.

Partial Correctness of Programs
[Figure: as before, with a safe invariant QI drawn as a superset of the reachable states Qreach, disjoint from the error states Qerr]

Strongest Postconditions
The strongest postcondition post(ρ, A) holds for any state q′ that is a ρ-successor state of some state satisfying A:
q′ ⊨ post(ρ, A) iff ∃q ∈ Q. q ⊨ A ∧ (q, q′) ⊨ ρ
or equivalently
post(ρ, A) = (∃V. A ∧ ρ)[V/V′]
Compute reach by applying post iteratively to init.

Example: Application of post
A = (pc = ℓ2 ∧ y ≥ z)
ρ = move(ℓ2, ℓ2) ∧ x < y ∧ x′ = x + 1 ∧ skip(y, z)
post(ρ, A)
= (∃V. A ∧ ρ)[V/V′]
= (∃pc x y z. pc = ℓ2 ∧ y ≥ z ∧ pc = ℓ2 ∧ pc′ = ℓ2 ∧ x < y ∧ x′ = x + 1 ∧ y′ = y ∧ z′ = z)[pc/pc′, x/x′, y/y′, z/z′]
= (y′ ≥ z′ ∧ pc′ = ℓ2 ∧ x′ ≤ y′)[pc/pc′, x/x′, y/y′, z/z′]
= y ≥ z ∧ pc = ℓ2 ∧ x ≤ y

Iterating post
post⁰(ρ, A) = A
postⁱ(ρ, A) = post(ρ, postⁱ⁻¹(ρ, A))   for i > 0

Finite Iteration of post May Suffice

Example Iteration
ρ1 = move(ℓ1, ℓ2) ∧ y ≥ z ∧ skip(x, y, z)
ρ2 = move(ℓ2, ℓ2) ∧ x < y ∧ x′ = x + 1 ∧ skip(y, z)
ρ3 = move(ℓ2, ℓ3) ∧ x ≥ y ∧ skip(x, y, z)
ρ4 = move(ℓ3, ℓerr) ∧ x < z ∧ skip(x, y, z)
ρ5 = move(ℓ3, ℓexit) ∧ x ≥ z ∧ skip(x, y, z)
post⁰(R, init) = init = (pc = ℓ1)
post¹(R, init) = post(ρ1, init) = (pc = ℓ2 ∧ y ≥ z)
post²(R, init) = post(ρ2, post¹(R, init)) ∨ post(ρ3, post¹(R, init))
  = pc = ℓ2 ∧ y ≥ z ∧ x ≤ y ∨ pc = ℓ3 ∧ y ≥ z ∧ x ≥ y
post³(R, init) = post(ρ2, post²(R, init)) ∨ post(ρ3, post²(R, init)) ∨ post(ρ4, post²(R, init)) ∨ post(ρ5, post²(R, init))

Example Iteration
post³(R, init) = post(ρ2, post²(R, init)) ∨ post(ρ3, post²(R, init)) ∨ post(ρ4, post²(R, init)) ∨ post(ρ5, post²(R, init))
  = pc = ℓ2 ∧ y ≥ z ∧ x ≤ y
  ∨ pc = ℓ3 ∧ y ≥ z ∧ x = y
  ∨ pc = ℓexit ∧ y ≥ z ∧ x ≥ y
  ∨ false

Example Iteration
post³(R, init) = pc = ℓ2 ∧ y ≥ z ∧ x ≤ y
  ∨ pc = ℓ3 ∧ y ≥ z ∧ x ≥ y
  ∨ pc = ℓexit ∧ y ≥ z ∧ x ≥ y
post⁴(R, init) = post³(R, init)
Fixed point:
reach = post⁰(R, init) ∨ post¹(R, init) ∨ post²(R, init) ∨ post³(R, init)
  = pc = ℓ1
  ∨ pc = ℓ2 ∧ y ≥ z
  ∨ pc = ℓ3 ∧ y ≥ z ∧ x ≥ y
  ∨ pc = ℓexit ∧ y ≥ z ∧ x ≥ y

Checking Safety
An inductive invariant I contains the initial states and is closed under successors:
⊨ init ⇒ I   and   ⊨ post(R, I) ⇒ I
A program is safe if there exists a safe inductive invariant.
reach is the strongest inductive invariant.
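On a finite domain, both conditions (⊨ init ⇒ I and ⊨ post(R, I) ⇒ I) can be checked by enumeration. A sketch (our own explicit-state encoding) that verifies a candidate invariant of the example program, pc = ℓ1 ∨ pc = ℓ2 ∧ y ≥ z ∨ pc = ℓ3 ∧ y ≥ z ∧ x ≥ y ∨ pc = ℓexit:

```python
from itertools import product

def step(q):
    pc, x, y, z = q
    if pc == 1 and y >= z:  yield (2, x, y, z)
    if pc == 2 and x < y:   yield (2, x + 1, y, z)
    if pc == 2 and x >= y:  yield (3, x, y, z)
    if pc == 3 and x < z:   yield ("err", x, y, z)
    if pc == 3 and x >= z:  yield ("exit", x, y, z)

def inv(q):
    """Candidate inductive invariant."""
    pc, x, y, z = q
    return (pc == 1 or (pc == 2 and y >= z)
            or (pc == 3 and y >= z and x >= y) or pc == "exit")

dom = range(-2, 3)
states = [(pc, x, y, z) for pc in [1, 2, 3, "exit", "err"]
          for x, y, z in product(dom, dom, dom)]
init_ok = all(inv(q) for q in states if q[0] == 1)                 # init => I
closed  = all(inv(q2) for q in states if inv(q) for q2 in step(q)) # post(R,I) => I
safe    = not any(q[0] == "err" for q in states if inv(q))         # I excludes errors
print(init_ok, closed, safe)   # True True True
```

All three checks pass: the candidate contains the initial states, is closed under the transition relation, and excludes the error location, so it is a safe inductive invariant.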

Inductive Invariants for Example Program
weakest inductive invariant: true
– set of all states
– contains error states
strongest inductive invariant: reach
pc = ℓ1 ∨ pc = ℓ2 ∧ y ≥ z ∨ pc = ℓ3 ∧ y ≥ z ∧ x ≥ y ∨ pc = ℓexit ∧ y ≥ z ∧ x ≥ y
slightly weaker inductive invariant:
pc = ℓ1 ∨ pc = ℓ2 ∧ y ≥ z ∨ pc = ℓ3 ∧ y ≥ z ∧ x ≥ y ∨ pc = ℓexit
Can we drop another conjunct in one of the disjuncts?

Inductive Invariants for Example Program
1: assume y ≥ z;
2: while x < y do x := x + 1;
3: assert x ≥ z
Safe inductive invariant:
pc = ℓ1 ∨ pc = ℓ2 ∧ y ≥ z ∨ pc = ℓ3 ∧ y ≥ z ∧ x ≥ y ∨ pc = ℓexit

Computing Inductive Invariants
We can compute the strongest inductive invariant by iterating post on init.
Can we ensure that this process terminates?
In general, no: checking safety of programs is undecidable.
But we can compute weaker inductive invariants:
– conservatively abstract the behavior of the program
– iterate an abstraction of post that is guaranteed to terminate.