1
The Fixpoint Brush in The Art of Invariant Generation
Sumit Gulwani, Microsoft Research, Redmond, USA (sumitg@microsoft.com)
Workshop on Invariant Generation, FLoC 2010
2
Art of Invariant Generation
1. Program Transformations
–Reduce the need for sophisticated invariant generation.
–E.g., control-flow refinement (PLDI '09), loop flattening/peeling (PLDI '10 / POPL '08), non-standard cut-points (PLDI '08), instrumentation of quantitative attributes (POPL '10).
2. Colorful Logic
–The language of invariants.
–E.g., arithmetic, uninterpreted fns., lists/arrays.
3. Fixpoint Brush
–Automatic generation of invariants in some shade of a logic (such as conjunctive / k-disjunctive / predicate abstraction).
–E.g., iterative, template/constraint based, proof rules.
3
Fixpoint Brush
Iterative and monotonic
–Forward
–Backward
Template/Constraint based
Proof rules
Iterative, but non-monotonic
–Probabilistic Inference
–Learning
4
Iterative Forward: Examples
Start with the precondition and propagate facts forward, using an existential elimination operator, until fixpoint.
Data-flow analysis
–Join at merge points
–Finite abstract domains
Abstract Interpretation
–Join at merge points: Join(φ1, φ2) = strongest φ such that φ1 ⇒ φ and φ2 ⇒ φ
–Widening for abstract domains of infinite height
Model Checking
–Analyze all paths precisely without join at merge points
–Finite abstractions (e.g., boolean abstraction) required: counterexample-guided abstraction refinement; BDD-based data structures for efficiency
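The following is a minimal Python sketch of this forward propagation scheme. The abstract-domain interface (bottom, transfer, join_all, widen, leq) and the cfg object are hypothetical names introduced for illustration; they are not from the talk.

```python
# Forward iterative analysis: propagate facts along the CFG until fixpoint,
# joining at merge points and widening at loop heads.
def forward_fixpoint(cfg, entry_fact, dom):
    facts = {p: dom.bottom() for p in cfg.points}
    facts[cfg.entry] = entry_fact
    changed = True
    while changed:                              # iterate until nothing grows
        changed = False
        for p in cfg.points:
            incoming = [dom.transfer(facts[q], stmt) for q, stmt in cfg.preds(p)]
            new = dom.join_all(incoming) if incoming else facts[p]
            if p in cfg.loop_heads:             # widen to force convergence
                new = dom.widen(facts[p], new)
            if not dom.leq(new, facts[p]):
                facts[p] = new
                changed = True
    return facts
```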
5
Difference Constraints
Abstract element:
–a conjunction of constraints x_i - x_j ≤ c_ij
–can be represented using a matrix M, where M[i][j] = c_ij
Decide(M):
1. M' := Saturate(M); saturation means closure under deductions of the form: x-y ≤ c1 and y-z ≤ c2 implies x-z ≤ c1+c2
2. Declare unsat iff ∃i: M'[i][i] < 0
Join(M1, M2):
1. M'1 := Saturate(M1); M'2 := Saturate(M2)
2. Let M3 be such that M3[i][j] = Max { M'1[i][j], M'2[i][j] }
3. Return M3
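Below is a small executable sketch of these operations in Python (my own rendering, not the talk's code), with a missing constraint represented as +infinity.

```python
INF = float('inf')

def saturate(M, vars):
    # Closure under x-y <= c1, y-z <= c2 ==> x-z <= c1+c2 (Floyd-Warshall style).
    S = {i: {j: M.get(i, {}).get(j, INF) for j in vars} for i in vars}
    for k in vars:
        for i in vars:
            for j in vars:
                S[i][j] = min(S[i][j], S[i][k] + S[k][j])
    return S

def is_unsat(M, vars):
    S = saturate(M, vars)
    return any(S[i][i] < 0 for i in vars)       # negative cycle => unsatisfiable

def join(M1, M2, vars):
    S1, S2 = saturate(M1, vars), saturate(M2, vars)
    return {i: {j: max(S1[i][j], S2[i][j]) for j in vars} for i in vars}

# Join of "x - y <= -1" (x < y) with "x - y <= 0 and y - x <= 0" (x = y):
V = ['x', 'y']
print(join({'x': {'y': -1}}, {'x': {'y': 0}, 'y': {'x': 0}}, V))  # keeps x - y <= 0
```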
6
Example: Abstract Interpretation using Difference Constraints
Program: y := 0; z := 2; while (y < 50) { y++; z++; } Assert(z = 52)
(Slide figure: invariants annotated at each program point.) Successive facts at the loop head: y=0 ∧ z=2; then 0 ≤ y ≤ 1 ∧ z=y+2; then 0 ≤ y ≤ 2 ∧ z=y+2; widening yields 0 ≤ y ∧ z=y+2, which refines to 0 ≤ y < 51 ∧ z=y+2 at the loop head (0 ≤ y < 50 ∧ z=y+2 inside the loop, 1 ≤ y < 51 ∧ z=y+2 after the body). At the loop exit the fact is y=50 ∧ z=y+2, which establishes the assertion z=52.
7
Combination: Example of the Join Algorithm
Join the facts z=a-1 ∧ y=F(a) and z=b-1 ∧ y=F(b) over the combined domain of linear arithmetic (la) and uninterpreted functions (uf):
–Join_la introduces the fresh variable ⟨a,b⟩ (standing for a on one side and b on the other): ⟨a,b⟩ = 1+z
–Join_uf: y = F(⟨a,b⟩)
–Eliminate_{uf+la} the fresh variables {⟨a,b⟩}: y = F(1+z)
So Join_{la+uf} of the two facts is y = F(1+z).
8
Iterative Forward: References
Uninterpreted Functions
–A Polynomial-Time Algorithm for Global Value Numbering; Gulwani, Necula; SAS '04
Linear Arithmetic + Uninterpreted Functions
–Combining Abstract Interpreters; Gulwani, Tiwari; PLDI '06
Theory of Arrays/Lists
–Quantified Abstract Domains
  Lifting Abstract Interpreters to Quantified Logical Domains; Gulwani, McCloskey, Tiwari; POPL '08
  Discovering Properties about Arrays in Simple Programs; Halbwachs, Péron; PLDI '08
–Shape Analysis
  Parametric Shape Analysis via 3-Valued Logic; Sagiv, Reps, Wilhelm; POPL '99, TOPLAS '02
9
Iterative Forward: References
Theory of Arrays/Lists
–Combination of Shape Analysis + Arithmetic
  A Combination Framework for Tracking Partition Sizes; Gulwani, Lev-Ami, Sagiv; POPL '09
Non-linear Arithmetic
–User-defined axioms + Expression Abstraction
  A Numerical Abstract Domain Based on Expression Abstraction and Max Operator with Application in Timing Analysis; Gulavani, Gulwani; CAV '08
–Polynomial Equalities
  An Abstract Interpretation Approach for Automatic Generation of Polynomial Invariants; Rodriguez-Carbonell, Kapur; SAS '04
10
Fixpoint Brush
Iterative and monotonic
–Forward
–Backward
Template/Constraint based
Proof rules
Iterative, but non-monotonic
–Probabilistic Inference
–Learning
11
Iterative Backward
Comparison with Iterative Forward
–Positives: can compute preconditions; goal-directed.
–Negatives: requires assertions or template assertions.
Transfer function for an assignment node is easier.
–Substitution takes the role of existential elimination.
Transfer function for a conditional node is challenging.
–Requires abductive reasoning / under-approximations: Abduct(φ, g) = weakest φ' such that φ' ∧ g ⇒ φ
Case-split reasoning as opposed to saturation-based reasoning, and hence typically not closed under conjunctions.
Optimally weak solutions for negative unknowns, as opposed to optimally strong solutions for positive unknowns.
12
Iterative Backward: References
Program Verification using Templates over Predicate Abstraction; Srivastava, Gulwani; PLDI '09
Assertion Checking Unified; Gulwani, Tiwari; VMCAI '07
Computing Procedure Summaries for Interprocedural Analysis; Gulwani, Tiwari; ESOP '07
13
Fixpoint Brush
Iterative and monotonic
–Forward
–Backward
Template/Constraint based
Proof rules
Iterative, but non-monotonic
–Probabilistic Inference
–Learning
14
Template-based Invariant Generation
Goal-directed invariant generation for verification of a Hoare triple (Pre, Program, Post): [Pre] while (c) S [Post].
VCGen yields the second-order verification constraint ∃I ∀X:
–Base case: Pre ⇒ I
–Precision: I ∧ ¬c ⇒ Post
–Inductive case: (I ∧ c)[S] ⇒ I
Key idea: reduce the second-order verification constraint to a first-order satisfiability constraint that can be solved using off-the-shelf SAT/SMT solvers.
–Choose a template for I (a specific color/shade in some logic).
–Convert ∀ into ∃.
15
Key Idea in Reducing ∀ to ∃ for Various Domains
The trick for converting ∀ to ∃ is known for the following domains:
Linear Arithmetic
–Farkas' Lemma
Linear Arithmetic + Uninterpreted Fns.
–Farkas' Lemma + Ackermann's Reduction
Non-linear Arithmetic
–Gröbner Basis
Predicate Abstraction
–Boolean indicator variables + Cover Algorithm (Abduction)
Quantified Predicate Abstraction
–Boolean indicator variables + more general Abduction
16
Template-based Invariant Generation: References
Linear Arithmetic
–Constraint-Based Linear-Relations Analysis; Sankaranarayanan, Sipma, Manna; SAS '04
–Program Analysis as Constraint Solving; Gulwani, Srivastava, Venkatesan; PLDI '08
Linear Arithmetic + Uninterpreted Fns.
–Invariant Synthesis for Combined Theories; Beyer, Henzinger, Majumdar, Rybalchenko; VMCAI '07
Non-linear Arithmetic
–Non-linear Loop Invariant Generation using Gröbner Bases; Sankaranarayanan, Sipma, Manna; POPL '04
Predicate Abstraction
–Constraint-Based Invariant Inference over Predicate Abstraction; Gulwani, Srivastava, Venkatesan; VMCAI '09
Quantified Predicate Abstraction
–Program Verification using Templates over Predicate Abstraction; Srivastava, Gulwani; PLDI '09
17
Template-based Invariant Generation
Linear Arithmetic
Linear Arithmetic + Uninterpreted Fns.
Non-linear Arithmetic
Predicate Abstraction
Quantified Predicate Abstraction
18
Farkas' Lemma
∀X (∧_k (e_k ≥ 0) ⇒ e ≥ 0)  iff  ∃λ0, λ_k ≥ 0 such that ∀X (e ≡ λ0 + Σ_k λ_k·e_k)
Example: let's find the Farkas witness λ0, λ1, λ2 for the implication x ≥ 2 ∧ y ≥ 3 ⇒ 2x+y ≥ 6:
∃λ0, λ1, λ2 ≥ 0 s.t. ∀x,y [2x+y-6 ≡ λ0 + λ1(x-2) + λ2(y-3)]
Equating the coefficients of x, of y, and the constant terms, we get 2=λ1 ∧ 1=λ2 ∧ -6=λ0-2λ1-3λ2, which implies λ0=1, λ1=2, λ2=1.
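As a sanity check, the coefficient-equating step can be handed to an SMT solver. The sketch below (assuming Z3's Python bindings are available) recovers the witness above.

```python
# Find non-negative multipliers l0, l1, l2 such that
#   2x + y - 6  ==  l0 + l1*(x - 2) + l2*(y - 3)   identically in x, y.
from z3 import Reals, Solver, sat

l0, l1, l2 = Reals('l0 l1 l2')
s = Solver()
s.add(l0 >= 0, l1 >= 0, l2 >= 0)
s.add(l1 == 2)                    # coefficient of x
s.add(l2 == 1)                    # coefficient of y
s.add(l0 - 2*l1 - 3*l2 == -6)     # constant term
if s.check() == sat:
    print(s.model())              # expected: l0 = 1, l1 = 2, l2 = 1
```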
19
Solving 2nd-order Constraints using Farkas' Lemma
The constraint: ∃I ∀X φ1(I,X)
Second-order to first-order
–Assume I has some form, e.g., Σ_j a_j·x_j ≥ 0
–∃I ∀X φ1(I,X) translates to ∃a_j ∀X φ2(a_j,X)
First-order to "only existentially quantified"
–Farkas' lemma helps translate ∀ to ∃: ∀X (∧_k (e_k ≥ 0) ⇒ e ≥ 0) iff ∃λ0, λ_k ≥ 0 ∀X (e ≡ λ0 + Σ_k λ_k·e_k)
–Eliminate X from the polynomial equality by equating coefficients.
–∃a_j ∀X φ2(a_j,X) translates to ∃a_j ∃λ_k φ3(a_j, λ_k)
"Only existentially quantified" to SAT
–Bit-vector modeling for integer variables
20
Example
[n=1 ∧ m=1] x := 0; y := 0; while (x < 100) { x := x+n; y := y+m; } [y ≥ 100]
VCGen, for unknown invariant I:
  n=m=1 ∧ x=y=0 ⇒ I
  I ∧ x ≥ 100 ⇒ y ≥ 100
  I ∧ x < 100 ⇒ I[x ← x+n, y ← y+m]
Invariant template → satisfying solution → loop invariant:
–a0 + a1·x + a2·y + a3·n + a4·m ≥ 0 (a single conjunct) → UNSAT → invalid triple or imprecise template
–conjunction of two such inequalities (coefficients a…, b…) → a2=b4=1, a1=b3=-1, rest 0 → loop invariant y ≥ x ∧ m ≥ n
–conjunction of three such inequalities (coefficients a…, b…, c…) → a2=b0=c4=1, a1=b3=c0=-1, rest 0 → loop invariant y ≥ x ∧ n ≤ 1 ∧ m ≥ 1
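A minimal z3py sketch that checks (rather than synthesizes) the two-conjunct invariant y ≥ x ∧ m ≥ n from the table, by asking Z3 whether any of the three verification conditions can be violated. The synthesis route would instead encode the unknown coefficients via Farkas' lemma, as described on the previous slide.

```python
from z3 import Ints, And, Implies, Not, Solver, unsat

x, y, n, m = Ints('x y n m')
I = And(y >= x, m >= n)
I_next = And(y + m >= x + n, m >= n)     # I[x <- x+n, y <- y+m]

vcs = [
    Implies(And(n == 1, m == 1, x == 0, y == 0), I),   # base case
    Implies(And(I, x < 100), I_next),                  # inductive case
    Implies(And(I, x >= 100), y >= 100),               # precision
]
for vc in vcs:
    s = Solver()
    s.add(Not(vc))
    assert s.check() == unsat            # no counterexample => the VC is valid
print("all three verification conditions hold for y >= x /\\ m >= n")
```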
21
Template-based Invariant Generation
Linear Arithmetic
Linear Arithmetic + Uninterpreted Fns.
Non-linear Arithmetic
Predicate Abstraction
Quantified Predicate Abstraction
22
Solving 2nd-order Constraints using Boolean Indicator Variables + Cover Algorithm
The constraint: ∃I ∀X φ1(I,X)
Second-order to first-order (Boolean indicator variables)
–Assume I has the form P1 ∨ P2, where P1, P2 ⊆ P.
  Let b_{i,j} denote the presence of p_j ∈ P in P_i.
  Then I can be written as (∧_j b_{1,j} ⇒ p_j) ∨ (∧_j b_{2,j} ⇒ p_j).
  ∃I ∀X φ1(I,X) translates to ∃b_{i,j} ∀X φ2(b_{i,j},X).
–Can generalize to k disjuncts.
First-order to "only existentially quantified"
–The Cover Algorithm helps translate ∀ to ∃.
–∃b_{i,j} ∀X φ2(b_{i,j},X) translates to the SAT formula ∃b_{i,j} φ3(b_{i,j}).
23
Example
[m > 0] x := 0; y := 0; while (x < m) { x := x+1; y := y+1; } [y=m]
VCGen, for unknown invariant I:
  m>0 ⇒ I[x ← 0, y ← 0]
  I ∧ x ≥ m ⇒ y=m
  I ∧ x<m ⇒ I[x ← x+1, y ← y+1]
Suppose P = { x ≤ y, x ≥ y, x<y, x ≤ m, x ≥ m, x<m, y ≤ m, y ≥ m, y<m } and k = 1.
Then I has the form P1, where P1 ⊆ P, and the constraints become:
  (1) m>0 ⇒ P1[x ← 0, y ← 0]
  (2) P1 ∧ x ≥ m ⇒ y=m
  (3) P1 ∧ x<m ⇒ P1[x ← x+1, y ← y+1]
24
Example (continued)
(1) m>0 ⇒ P1[x ← 0, y ← 0]
  Under x ← 0, y ← 0 the predicates become 0 ≤ 0, 0 ≥ 0, 0<0, 0 ≤ m, 0 ≥ m, 0<m, …
  Hence (1) ≡ "P1 doesn't contain x<y, x ≥ m, y ≥ m" ≡ ¬b_{x≥m} ∧ ¬b_{x<y} ∧ ¬b_{y≥m}
(2) P1 ∧ x ≥ m ⇒ y=m
  There are 3 maximally-weak choices for P1 (computed using the Predicate Cover Algorithm): (i) y ≤ m ∧ y ≥ m, (ii) x<m, (iii) x ≤ y ∧ y ≤ m
  Hence (2) ≡ "P1 contains at least one of the above combinations" ≡ (b_{x<m}) ∨ (b_{y≤m} ∧ b_{y≥m}) ∨ (b_{x≤y} ∧ b_{y≤m})
25
Example (continued)
(1) ¬b_{x≥m} ∧ ¬b_{x<y} ∧ ¬b_{y≥m}
(2) (b_{x<m}) ∨ (b_{y≤m} ∧ b_{y≥m}) ∨ (b_{x≤y} ∧ b_{y≤m})
(3) (b_{y≤m} ⇒ (b_{y<m} ∨ b_{y≤x})) ∧ ¬b_{x<m} ∧ ¬b_{y<m}   (obtained from solving local/small SMT queries)
A SAT solver finds the assignment: b_{y≤x}, b_{y≤m}, b_{x≤y} true; the rest false.
This yields the invariant I: (y=x ∧ y ≤ m) for
[m > 0] x := 0; y := 0; while (x < m) { x := x+1; y := y+1; } [y=m]
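The final SAT step can be reproduced directly. The sketch below hands constraints (1)-(3), exactly as shown on the slide (so (3) is only the fragment displayed there), to Z3 as a propositional problem; y ≤ x is the same predicate as x ≥ y.

```python
from z3 import Bools, And, Or, Not, Implies, Solver, sat

# One indicator variable per predicate in P.
b_xley, b_xgey, b_xlty = Bools('b_xley b_xgey b_xlty')   # x<=y, x>=y, x<y
b_xlem, b_xgem, b_xltm = Bools('b_xlem b_xgem b_xltm')   # x<=m, x>=m, x<m
b_ylem, b_ygem, b_yltm = Bools('b_ylem b_ygem b_yltm')   # y<=m, y>=m, y<m

s = Solver()
s.add(Not(b_xgem), Not(b_xlty), Not(b_ygem))                              # (1)
s.add(Or(b_xltm, And(b_ylem, b_ygem), And(b_xley, b_ylem)))               # (2)
s.add(Implies(b_ylem, Or(b_yltm, b_xgey)), Not(b_xltm), Not(b_yltm))      # (3)
assert s.check() == sat
print(s.model())   # b_xley, b_xgey, b_ylem are forced true, i.e. I = (x=y /\ y<=m)
```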
26
Bonus Material: Where Can We Go?
Going beyond invariant generation with constraint-based techniques…
27
Example: Bresenham's Line Drawing Algorithm
Postcondition: the best-fit line shouldn't deviate by more than half a pixel from the real line, i.e., |y - (Y/X)x| ≤ 1/2.
[0 < Y ≤ X]
v1 := 2Y-X; y := 0; x := 0;
while (x ≤ X)
  out[x] := y;
  if (v1 < 0) v1 := v1+2Y; else { v1 := v1+2(Y-X); y++; }
  x++;
return out;
[∀k (0 ≤ k ≤ X ⇒ |out[k] - (Y/X)k| ≤ ½)]
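For reference, here is a runnable Python transcription of this algorithm, with the postcondition checked exactly (via rationals) on a few sample inputs; the explicit x increment is taken from the transition-system slide that follows.

```python
from fractions import Fraction

def bresenham(X, Y):
    out = [0] * (X + 1)
    v1, y, x = 2 * Y - X, 0, 0
    while x <= X:
        out[x] = y
        if v1 < 0:
            v1 = v1 + 2 * Y
        else:
            v1 = v1 + 2 * (Y - X)
            y += 1
        x += 1
    return out

for X, Y in [(10, 3), (7, 7), (100, 1), (9, 5)]:        # assumes 0 < Y <= X
    out = bresenham(X, Y)
    assert all(abs(out[k] - Fraction(Y, X) * k) <= Fraction(1, 2) for k in range(X + 1))
print("postcondition |out[k] - (Y/X)k| <= 1/2 holds on the sampled inputs")
```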
28
Transition System Representation
[0 < Y ≤ X]
v1 := 2Y-X; y := 0; x := 0;
while (x ≤ X)
  v1 < 0:  out'=Update(out,x,y) ∧ v'1=v1+2Y ∧ y'=y ∧ x'=x+1
  v1 ≥ 0:  out'=Update(out,x,y) ∧ v'1=v1+2(Y-X) ∧ y'=y+1 ∧ x'=x+1
[∀k (0 ≤ k ≤ X ⇒ |out[k] - (Y/X)k| ≤ ½)]
Or, equivalently,
[Pre] s_entry; while (g_loop) { g_body1: s_body1; g_body2: s_body2; } [Post]
where
  g_loop: x ≤ X
  s_entry: v'1=2Y-X ∧ y'=0 ∧ x'=0
  g_body1: v1 < 0    s_body1: out'=Update(out,x,y) ∧ v'1=v1+2Y ∧ x'=x+1 ∧ y'=y
  g_body2: v1 ≥ 0    s_body2: out'=Update(out,x,y) ∧ v'1=v1+2(Y-X) ∧ x'=x+1 ∧ y'=y+1
29
Verification Constraint Generation & Solution
Verification constraint:
  Pre ∧ s_entry ⇒ I'
  I ∧ g_loop ∧ g_body1 ∧ s_body1 ⇒ I'
  I ∧ g_loop ∧ g_body2 ∧ s_body2 ⇒ I'
  I ∧ ¬g_loop ⇒ Post
Given Pre, Post, g_loop, g_body1, g_body2, s_body1, s_body2, we can find a solution for I using constraint-based techniques:
  I: 0 < Y ≤ X ∧ v1 = 2(x+1)Y - (2y+1)X ∧ 2(Y-X) ≤ v1 ≤ 2Y ∧ ∀k (0 ≤ k ≤ x ⇒ |out[k] - (Y/X)k| ≤ ½)
30
The Surprise!
Verification constraint:
  Pre ∧ s_entry ⇒ I'
  I ∧ g_loop ∧ g_body1 ∧ s_body1 ⇒ I'
  I ∧ g_loop ∧ g_body2 ∧ s_body2 ⇒ I'
  I ∧ ¬g_loop ⇒ Post
What if we treat each g and s as unknowns, like I?
We get a solution that has g_body1 = g_body2 = false.
–This doesn't correspond to a valid transition system.
–We can fix this by encoding g_body1 ∨ g_body2 = true.
We now get a solution that has g_loop = true.
–This corresponds to a non-terminating loop.
–We can fix this by encoding the existence of a ranking function.
We now discover each g and s along with I.
–We have gone from invariant synthesis to program synthesis.
31
Fixpoint Brush
Iterative and monotonic
–Forward
–Backward
Template/Constraint based
Proof rules
Iterative, but non-monotonic
–Probabilistic Inference
–Learning
32
Proof Rules
Problem-specific: requires an understanding of commonly used design patterns.
Can provide the most scalable solution, if applicable.
Case study: ranking-function computation for estimating loop bounds.
33
Ranking Function
Consider the loop: while (cond) X := F(X). Its transition system representation is s: cond ∧ X'=F(X).
Example: the transition system representation of the loop while (x < n) { x++; n--; } is x<n ∧ x'=x+1 ∧ n'=n-1.
An integer-valued function r is a ranking function for transition s if
  s ⇒ (r > 0)
  s ⇒ r[X'/X] ≤ r-1
We denote this by Rank(r, s).
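A minimal z3py sketch that checks Rank(r, s) for the example transition, with the candidate r = n - x (the candidate is an illustration of mine; the slide does not name one):

```python
from z3 import Ints, And, Not, Implies, Solver, unsat

x, n, x1, n1 = Ints('x n x1 n1')
s_rel = And(x < n, x1 == x + 1, n1 == n - 1)    # s: x<n /\ x'=x+1 /\ n'=n-1
r, r1 = n - x, n1 - x1                          # r and r[X'/X]

for obligation in [Implies(s_rel, r > 0), Implies(s_rel, r1 <= r - 1)]:
    chk = Solver()
    chk.add(Not(obligation))
    assert chk.check() == unsat                 # no counterexample => obligation valid
print("Rank(n - x, s) holds")
```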
34
Finding Ranking Functions
Iterative Forward
–Instrument a counter c and find an upper bound n on it; n - c is a ranking function. (Gulwani, Mehra, Chilimbi; POPL '09)
Constraint-based
–Assume a template a0 + Σ_i a_i·x_i for the ranking function r, and then solve for the a_i's in the constraint ∃a_i (s ⇒ (r>0 ∧ r[X'/X] ≤ r-1)) using Farkas' lemma.
–Goal-directed.
–A complete PTIME method for synthesis of linear ranking fns.: Podelski, Rybalchenko; VMCAI '04
Proof Rules
–"The Reachability Bound Problem"; Gulwani, Zuleger; PLDI '10
35
Arithmetic Iteration Patterns
If s ⇒ (e>0 ∧ e[X'/X] < e), then Rank(e, s).
Candidates for e: e1-e2, for any conditional e1 > e2 in s.
–Example: n>0 ∧ n' ≤ n ∧ A[n] ≠ A[n']; ranking function: n
If s ⇒ (e ≥ 1 ∧ e[X'/X] ≤ e/2), then Rank(log e, s).
Candidates for e: e1/e2, for any conditional e1 > e2 in s.
–Example: i'=i/2 ∧ i>1; ranking function: log(i)
–Example: i'=2×i ∧ i>0 ∧ n>i ∧ n'=n; ranking function: log(n/i)
36
Bit-vector Iteration Pattern
RSB: position of the rightmost significant 1-bit.
If s ⇒ (RSB(e') < RSB(e) ∧ e ≠ 0), then Rank(RSB(e), s).
Candidates for e: all bitvector variables x in s.
–Example: x'=x&(x-1) ∧ x ≠ 0; ranking function: x
–Example (ranking function: RSB(b)/2):
  Input: bitvector b
  while (BitScanForward(&id1, b))
    b := b | ((1 << id1) - 1);
    if (BitScanForward(&id2, ~b)) break;
    b := b & ~((1 << id2) - 1);
37
Composing Ranking Functions
Let Rank(r1, s1) and Rank(r2, s2).
Non-interference NI(s1, s2, r2):
–Non-enabling condition: s1 ∘ s2 = false
–Rank-preserving condition: s1 ⇒ r2[X'/X] ≤ r2
If NI(s1, s2, r2) and NI(s2, s1, r1), then Max(0, r1) + Max(0, r2) is a ranking fn for s1 ∨ s2.
If NI(s1, s2, r2), then (r1, r2) is a lexicographic ranking fn for s1 ∨ s2.
38
Relating Values Before/After a Loop
Useful for backward symbolic execution across loops.
If s ⇒ e' ≤ e + c, then e_after ≤ e_before + c × Rank(s).
Example:
  n := m;
  while (e) { …… n := n+3; …… }
Here n_after ≤ n_before + 3 × Rank(s).
39
Fixpoint Brush
Iterative and monotonic
–Forward
–Backward
Template/Constraint based
Proof rules
Iterative, but non-monotonic
–Probabilistic Inference
–Learning
40
Recurrence Solving Techniques
The undergraduate textbook on algorithms by Cormen, Leiserson, Rivest, Stein describes 3 fundamental methods for recurrence solving:
Iteration Method
–Expands/unfolds the recurrence relation
41
Iteration Method
Expand (unfold) the recurrence and express it as a summation of terms.
Consider the recurrence T(n) = n + 3T(n/4):
  T(n) = n + 3(n/4) + 9T(n/16)
       = n + 3(n/4) + 9(n/16) + 27T(n/64)
       ≤ n + 3(n/4) + 9(n/16) + 27(n/64) + … + 3^{log_4 n} θ(1)
       ≤ [n Σ_{i≥0} (3/4)^i] + θ(3^{log_4 n})
       = 4n + o(n) = O(n)
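A quick numerical sanity check of this unfolding (the base case T(n) = 1 for n < 4 is an assumption; the slide does not fix one):

```python
def T(n):
    return 1 if n < 4 else n + 3 * T(n // 4)

for n in (10, 100, 10_000, 1_000_000):
    print(n, T(n), round(T(n) / n, 2))   # the ratio stays bounded, consistent with O(n)
```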
42
Recurrence Solving Techniques vs. Our Fixpoint Brush
The undergraduate textbook on algorithms by Cormen, Leiserson, Rivest, Stein describes 3 fundamental methods for recurrence solving.
Example of a recurrence: T(n)=T(n-1)+2n ∧ T(0)=0
Iteration Method
–Expands/unfolds the recurrence relation
–Similar to the Iterative approach
Substitution Method
–Assumes a template for a closed form
43
Substitution Method
Involves guessing the form of the solution and then substituting it in the equations to find the constants.
Consider the recurrence T(n)=T(n-1)+2n ∧ T(0)=0. Let T(n) ≤ an² + bn + c for some a, b, c.
  T(n) ≤ a(n-1)² + b(n-1) + c + 2n = an² + (b-2a+2)n + (a+c-b)
  This implies (b-2a+2) ≤ b and (a+c-b) ≤ c.
  T(0) = c, which implies c = 0.
  Hence b ≥ a ≥ 1.
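A small check of the resulting bound: with a = b = 1 and c = 0 (allowed by b ≥ a ≥ 1), T(n) ≤ n² + n, which happens to be the exact closed form of this recurrence.

```python
def T(n):
    return 0 if n == 0 else T(n - 1) + 2 * n

assert all(T(n) <= n * n + n for n in range(200))
assert all(T(n) == n * n + n for n in range(200))   # the quadratic bound is tight here
print("bound verified for n = 0..199")
```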
44
Recurrence Solving Techniques vs. Our Fixpoint Brush
The undergraduate textbook on algorithms by Cormen, Leiserson, Rivest, Stein describes 3 fundamental methods for recurrence solving.
Example of a recurrence: T(n)=T(n-1)+2n ∧ T(0)=0
Iteration Method
–Expands/unfolds the recurrence relation
–Similar to the Iterative approach
Substitution Method
–Assumes a template for a closed form
–Similar to the Template/Constraint-based approach
Master's Method
–Provides a cook-book solution for T(n)=aT(n/b)+f(n)
45
Master's Method
Provides a cook-book method for solving recurrences of the form T(n) = aT(n/b) + f(n), where a ≥ 1 and b > 1.
–If f(n) = O(n^{log_b a - ε}) for some constant ε > 0, then T(n) = θ(n^{log_b a}).
–If f(n) = θ(n^{log_b a}), then T(n) = θ(n^{log_b a} lg n).
–If f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0, and if a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = θ(f(n)).
Requires memorization of three cases, but then the solution of many recurrences can be determined quite easily, often with pencil and paper.
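For illustration, a tiny Python helper (hypothetical, not from the talk) that applies the three cases when f(n) = n^k, where the comparison against n^{log_b a} reduces to comparing exponents and the regularity condition holds automatically:

```python
import math

def master(a, b, k):
    """Asymptotic solution of T(n) = a*T(n/b) + n**k, with a >= 1, b > 1."""
    crit = math.log(a, b)                    # the critical exponent log_b(a)
    if k < crit:
        return f"Theta(n^{crit:.3g})"
    if k == crit:                            # exact float equality is fine for these samples
        return f"Theta(n^{crit:.3g} * lg n)"
    return f"Theta(n^{k})"

print(master(4, 2, 1))   # T(n) = 4T(n/2) + n   -> Theta(n^2)
print(master(1, 2, 0))   # T(n) = T(n/2) + 1    -> Theta(n^0 * lg n), i.e. Theta(lg n)
print(master(3, 4, 1))   # T(n) = 3T(n/4) + n   -> Theta(n), as in the iteration example
```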
46
Recurrence Solving Techniques vs. Our Fixpoint Brush
The undergraduate textbook on algorithms by Cormen, Leiserson, Rivest, Stein describes 3 fundamental methods for recurrence solving.
Example of a recurrence: T(n)=T(n-1)+2n ∧ T(0)=0
Iteration Method
–Expands/unfolds the recurrence relation
–Similar to the Iterative approach
Substitution Method
–Assumes a template for a closed form
–Similar to the Template/Constraint-based approach
Master's Method
–Provides a cook-book solution for T(n)=aT(n/b)+f(n)
–Similar to the Proof-rules approach
47
Fixpoint Brush
Iterative and monotonic
–Forward
–Backward
Template/Constraint based
Proof rules
Iterative, but non-monotonic (distance to the fixed point decreases in each iteration)
–Probabilistic Inference
–Learning
48
Example: Proof of Validity
The slide shows the control-flow graph of: [x = 0] y := 50; while (x < 100) { if (x < 50) x := x+1; else { x := x+1; y := y+1; } } [y = 100], with program points π1–π8 annotated with the following invariants:
Prog. point | Invariant
entry       | x = 0
π1          | x = 0 ∧ y = 50
π2          | (x ≤ 50 ⇒ y = 50) ∧ (50 ≤ x ⇒ x = y) ∧ x ≤ 100
π3          | (x ≤ 50 ⇒ y = 50) ∧ (50 ≤ x ⇒ x = y) ∧ x < 100
π4          | x < 50 ∧ y = 50
π5          | x ≤ 50 ∧ y = 50
π6          | 50 ≤ x < 100 ∧ x = y
π7          | 50 < x ≤ 100 ∧ x = y
π8          | (x ≤ 50 ⇒ y = 50) ∧ (50 ≤ x ⇒ x = y) ∧ x ≤ 100
exit        | y = 100
49
Probabilistic Inference for Program Verification
Initialize the invariants at all program points to any element (from an abstract domain over which the proof exists).
Repeatedly pick a program point (at random) whose invariant is locally inconsistent, and update it to make it less inconsistent.
50
Consistency of an Invariant I at a Program Point π
I is consistent at π iff Post(π) ⇒ I ∧ I ⇒ Pre(π), where
–Post(π) is the strongest postcondition, at π, of the invariants at the predecessors of π
–Pre(π) is the weakest precondition, at π, of the invariants at the successors of π
Example: for a point π2 with invariant I, predecessor invariant P reached through statement s, and successor invariants Q and R guarded by a conditional c:
  Post(π2) = StrongestPost(P, s)
  Pre(π2) = (c ⇒ Q) ∧ (¬c ⇒ R)
51
Measuring the Inconsistency of an Invariant I at π
Local inconsistency of invariant I at program point π = IM(Post(π), I) + IM(I, Pre(π)),
where the inconsistency measure IM(φ1, φ2) is some approximation of the number of program states that violate φ1 ⇒ φ2.
52
Example of an Inconsistency Measure IM
Consider the abstract domain of Boolean formulas (with the usual implication as the partial order).
Let φ1 ≡ a1 ∨ … ∨ an in DNF and φ2 ≡ b1 ∧ … ∧ bm in CNF.
IM(φ1, φ2) = Σ_{i,j} δ(a_i, b_j), where δ(a_i, b_j) = 0 if a_i ⇒ b_j, and 1 otherwise.
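A minimal sketch of this measure in Python, using Z3 to decide each implication a_i ⇒ b_j (z3py assumed available):

```python
from z3 import Bools, And, Or, Not, Implies, Solver, unsat

def implies(f, g):
    s = Solver()
    s.add(Not(Implies(f, g)))
    return s.check() == unsat

def IM(dnf_disjuncts, cnf_conjuncts):
    # number of (disjunct, conjunct) pairs whose implication fails
    return sum(0 if implies(a, b) else 1
               for a in dnf_disjuncts for b in cnf_conjuncts)

p, q, r = Bools('p q r')
phi1 = [And(p, q), And(p, Not(r))]   # phi1 = (p & q) | (p & ~r), in DNF
phi2 = [p, Or(q, r)]                 # phi2 = p & (q | r), in CNF
print(IM(phi1, phi2))                # 1: only (p & ~r) => (q | r) fails
```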
53
Iterative but Non-monotonic: References
Probabilistic Inference
–Program Verification as Probabilistic Inference; Gulwani, Jojic; POPL '07
Learning
–Learning Regular Sets from Queries and Counterexamples; Angluin; Information and Computation '87
–Synthesis of Interface Specifications for Java Classes; Alur, Madhusudan, Nam; POPL '05
–Learning Meets Verification; Leucker; FMCO '06
May/Must analyses? Game-theoretic analyses?
54
Fixpoint Brush: Summary
Iterative and monotonic
–Forward: requires methods for Join and existential elimination
–Backward: requires a method for Abduct
Template/Constraint based
–Works well for small code fragments
–Requires a method to convert ∀ to ∃
Proof rules
–Scalable
–Requires an understanding of design patterns
Iterative, but non-monotonic
–Perhaps better for dealing with noisy systems.