Algebra of Concurrent Programming Tony Hoare Cambridge 2011

With ideas from Ian Wehrman, John Wickerson, Stephan van Staden, Peter O’Hearn, Bernhard Moeller, Georg Struth, Rasmus Petersen, …and others

Subject matter: designs. Variables (p, q, r) stand for computer programs, designs, specifications,… They all describe what happens inside/around a computer that executes a given program. The program itself is the most precise; the specification is the most abstract; designs come in between.

Examples Postcondition: – execution ends with array A sorted Conditional correctness: – if execution ends, it ends with A sorted Precondition: – execution starts with x even Program: x := x+1 – the final value of x is one greater than the initial

Examples Safety: – there are no buffer overflows Termination: – execution is finite (i.e., always ends) Liveness: – no infinite internal activity (livelock) Fairness: – a response is always given to each request Probability: – the ratio of a’s to b’s tends to 1 with time

Unification Same laws apply to programs, designs, specifications Same laws apply to many forms of correctness. Tools based on the laws serve many purposes. Distinctions can be drawn later – when the need for them is apparent

Refinement: p ⊑ q Everything described by p is also described by q, e.g., – spec p implies spec q – prog p satisfies spec q – prog p more determinate than prog q stepwise development of a spec is – spec ⊒ design ⊒ program stepwise analysis of a program is – program ⊑ design ⊑ spec

Various terminology for p ⊑ q. For p (below): lesser, stronger, lower bound, more precise, more deterministic, included in (sets), antecedent (=>). For q (above): greater, weaker, upper bound, more abstract, more non-deterministic, containing (sets), consequent (pred).

Law: ⊑ is a partial order. ⊑ is transitive: p ⊑ r if p ⊑ q and q ⊑ r – needed for stepwise development/analysis. ⊑ is antisymmetric: p = r if p ⊑ r and r ⊑ p – needed for abstraction. ⊑ is reflexive: p ⊑ p – for convenience.

Binary operator: p ; q – the sequential composition of p and q. Each execution of p;q consists of – all events x from an execution of p – and all events y from an execution of q, subject to an ordering constraint: either strong or weak, interruptible or inhibited.

alternative constraints on p;q strong sequence: – all x from p must precede all y from q weak sequence: – no y from q can precede any x from p interruptible: – other threads may interfere between x and y separated: – updates to private variables are protected. all our algebraic laws will apply to each alternative

Hoare triple: {p} q {r} defined as p;q ⊑ r – starting in the final state of an execution of p, q ends in the final state of some execution of r – p and r may be arbitrary designs. Example: {..(x+1 ≤ n)} x := x + 1 {..(x ≤ n)} where ..b (finally b) describes all executions that end in a state satisfying the single-state predicate b.
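
To make the triple concrete, here is a minimal executable sketch in Haskell, assuming the finite-language reading of designs that the "Regular language model" slide below spells out: a design is a finite set of strings, ; is lifted concatenation, and ⊑ is set inclusion. The names Lang, seqC, refines and hoare are illustrative, not from the slides.

```haskell
import qualified Data.Set as S

-- A design, read as a (finite) language: a set of possible executions.
type Lang = S.Set String

-- Refinement p ⊑ q: everything described by p is also described by q.
refines :: Lang -> Lang -> Bool
refines = S.isSubsetOf

-- Sequential composition p ; q: lifted concatenation of executions.
seqC :: Lang -> Lang -> Lang
seqC p q = S.fromList [ s ++ t | s <- S.toList p, t <- S.toList q ]

-- Hoare triple {p} q {r}, defined exactly as on the slide: p;q ⊑ r.
hoare :: Lang -> Lang -> Lang -> Bool
hoare p q r = seqC p q `refines` r

-- Example: precondition {"a"}, program {"b"}, and a spec that allows
-- "ab" among other behaviours.
demo :: Bool
demo = hoare (S.fromList ["a"]) (S.fromList ["b"]) (S.fromList ["ab", "ba"])
-- demo == True
```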

Monotonicity. Law (; is monotonic w.r.t. ⊑): – p;q ⊑ p’;q if p ⊑ p’ – p;q ⊑ p;q’ if q ⊑ q’ – compare: addition of numbers Rule (of consequence): – p’ ⊑ p & {p} q {r} & r ⊑ r’ implies {p’} q {r’} The Rule is interprovable with the first law.

associativity Law (; is associative) : – (p;q);q’ = p;(q;q’) Rule (sequential composition): – {p} q {s} & {s} q’ {r} implies {p} q;q’ {r} half the law interprovable from rule

Unit (skip): ɛ – a program that does nothing. Law (ɛ is the unit of ;): – p;ɛ = p = ɛ;p Rule (nullity): – {p} ɛ {p} A quarter of the law is interprovable from the Rule.

concurrent composition: p | q each execution of (p|q) consists of – all events x of an execution of p, – and all events y of an execution of q same laws apply to alternatives: – interleaving: x precedes or follows y – true concurrency: x neither precedes nor follows y. – separation: x and y independent Laws: | is associative, commutative and monotonic

Separation Logic. Law (locality of ; w.r.t. |): – (s|p) ; q ⊑ s | (p;q) (left locality) – p ; (q|s) ⊑ (p;q) | s (right locality) Rule (frame): – {p} q {r} implies {p|s} q {r|s} The Rule is interprovable with left locality.

Concurrency law. Law (; exchanges with |): – (p|q) ; (p’|q’) ⊑ (p;p’) | (q;q’) – like the exchange law of category theory Rule (| compositional): – {p} q {r} & {p’} q’ {r’} implies {p|p’} q|q’ {r|r’} The Rule is interprovable with the law.

[Diagram: a 2×2 grid with p, p’ on the top row and q, q’ on the bottom row. Read by columns it depicts (p|q) ; (p’|q’); read by rows it depicts (p;p’) | (q;q’); the column reading refines the row reading, i.e. (p|q) ; (p’|q’) ⊑ (p;p’) | (q;q’).]

Regular language model: p, q, r,… are languages – descriptions of executions of a finite state machine. p ⊑ q is inclusion of languages; p;q is (lifted) concatenation of strings – i.e., {st | s ∊ p & t ∊ q}; p|q is (lifted) interleaving of strings; ɛ = {“”} (only the empty string); “c” = {“c”} (only the string “c”).
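
A small Haskell sketch of this model (the function names are mine, not the slides'): languages as finite sets of strings, ⊑ as inclusion, ; as lifted concatenation, | as lifted interleaving, ɛ as the singleton containing the empty string, and an atom “c” as the singleton one-letter string.

```haskell
import qualified Data.Set as S

type Lang = S.Set String

refines :: Lang -> Lang -> Bool          -- p ⊑ q
refines = S.isSubsetOf

seqC :: Lang -> Lang -> Lang             -- p ; q  (lifted concatenation)
seqC p q = S.fromList [ s ++ t | s <- S.toList p, t <- S.toList q ]

-- all interleavings of two strings
interleavings :: String -> String -> [String]
interleavings [] ys = [ys]
interleavings xs [] = [xs]
interleavings (x:xs) (y:ys) =
  map (x:) (interleavings xs (y:ys)) ++ map (y:) (interleavings (x:xs) ys)

par :: Lang -> Lang -> Lang              -- p | q  (lifted interleaving)
par p q = S.fromList
  [ r | s <- S.toList p, t <- S.toList q, r <- interleavings s t ]

unitE :: Lang                            -- ɛ: only the empty string
unitE = S.singleton ""

atom :: Char -> Lang                     -- "c": only the string "c"
atom c = S.singleton [c]
```

For instance, par (atom 'a') (atom 'b') is the two-element set {"ab","ba"}, and seqC unitE p == p for every finite p, matching the unit law for ɛ.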

Left locality. Theorem: (s|p) ; q ⊑ s | (p;q). Proof: in the lhs, s interleaves with just p, and all of q comes at the end; in the rhs, s interleaves with all of p;q; so the lhs is a special case of the rhs. [Diagram: sample interleavings illustrating the two sides.]

Exchange. Theorem: (p|q) ; (p’|q’) ⊑ (p;p’) | (q;q’) – in the lhs, all of p and q comes before all of p’ and q’; – in the rhs, the end of p may interleave with q’, or the start of p’ with q; so the lhs is a special case of the rhs. [Diagram: sample interleavings illustrating the two sides.]
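
Both arguments can be spot-checked mechanically. The sketch below (repeating the earlier definitions so it stands alone) tests left locality and exchange on whatever small sample languages it is given; the helper names are assumptions of this sketch, not the slides' notation.

```haskell
import qualified Data.Set as S

type Lang = S.Set String

refines :: Lang -> Lang -> Bool
refines = S.isSubsetOf

seqC :: Lang -> Lang -> Lang
seqC p q = S.fromList [ s ++ t | s <- S.toList p, t <- S.toList q ]

interleavings :: String -> String -> [String]
interleavings [] ys = [ys]
interleavings xs [] = [xs]
interleavings (x:xs) (y:ys) =
  map (x:) (interleavings xs (y:ys)) ++ map (y:) (interleavings (x:xs) ys)

par :: Lang -> Lang -> Lang
par p q = S.fromList
  [ r | s <- S.toList p, t <- S.toList q, r <- interleavings s t ]

-- Left locality:  (s|p) ; q ⊑ s | (p;q)
leftLocality :: Lang -> Lang -> Lang -> Bool
leftLocality s p q = seqC (par s p) q `refines` par s (seqC p q)

-- Exchange:  (p|q) ; (p'|q') ⊑ (p;p') | (q;q')
exchange :: Lang -> Lang -> Lang -> Lang -> Bool
exchange p q p' q' =
  seqC (par p q) (par p' q') `refines` par (seqC p p') (seqC q q')

main :: IO ()
main = do
  let a = S.singleton "a"; b = S.singleton "b"
      c = S.singleton "c"; d = S.singleton "d"
  print (leftLocality a b c)      -- True
  print (exchange a b c d)        -- True
```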

Conclusion regular expressions satisfy all our laws for ⊑, ;, and | and for other operators introduced later

Part 2. More Program Control Structures Non-determinism, intersection Iteration, recursion, fixed points Subroutines, contracts, transactions Basic commands

Subject matter variables (p, q, r) stand for programs, designs, specifications,… they are all descriptions of what happens inside and around a computer that is executing a program. the differences between programs and specs are often defined from their syntax.

Specification syntax includes disjunction (or, ⊔ ) to express abstraction, or to keep options open – ‘it may be painted green or blue’ conjunction (and, ⊓ ) combines requirements – it must be cheaper than x and faster than y negation (not) for safety and security – it must not explode implication (contracts) – if the user observes the protocol, so will the system

Program syntax excludes disjunction – non-deterministic programs difficult to test conjunction – inefficient to find a computation satisfying both negation – incomputable implication – which side of contract?

programs include sequential composition (;) concurrent composition (|) interrupts iteration, recursion contracts (declarations) transactions assignments, inputs, outputs, jumps,… So include these in our specifications!

Bottom ⊥. An unimplementable specification – like the false predicate. A program that has no execution – the compiler stops it from running. Define ⊥ as the least solution of: _ ⊑ _ Theorem: ⊥ ⊑ r – ⊥ satisfies every spec, – but cannot be run (Dijkstra’s miracle)

Algebra of ⊥. Law (⊥ is the zero of ;): – ⊥ ; p = ⊥ = p ; ⊥ Theorem: {p} ⊥ {q} A quarter of the law is provable from the theorem.
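
In the finite-language sketch used earlier, ⊥ is simply the empty language, and both the zero law and ⊥ ⊑ r can be checked directly; a toy illustration with my own names:

```haskell
import qualified Data.Set as S

type Lang = S.Set String

seqC :: Lang -> Lang -> Lang
seqC p q = S.fromList [ s ++ t | s <- S.toList p, t <- S.toList q ]

bottom :: Lang
bottom = S.empty                  -- ⊥ : no executions at all

-- ⊥ ; p = ⊥ = p ; ⊥
zeroLaw :: Lang -> Bool
zeroLaw p = seqC bottom p == bottom && seqC p bottom == bottom

-- ⊥ ⊑ r for every r
belowEverything :: Lang -> Bool
belowEverything r = bottom `S.isSubsetOf` r
```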

Top ⊤ a vacuous specification, – satisfied by anything, – like the predicate true a program with an error – for which the programmer is responsible – e.g., subscript error, violation of contract… define ⊤ as greatest solution of: _ ⊑ _

Algebra of ⊤ Law: none Theorem: none – you can’t prove a program with this error – it might admit a virus! A debugging implementation may supply useful laws for ⊤

Non-determinism (or): p ⊔ q describes all executions that either satisfy p or satisfy q. The choice is not (yet) determined. It may be determined later – in development of the design – or in writing the program – or by the compiler – or even at run time

lub (join): ⊔ Define p⊔q as least solution of p ⊑ _ & q ⊑ _ Theorem – p ⊑ r & q ⊑ r iff p⊔q ⊑ r Theorem – ⊔ is associative, commutative, monotonic, idempotent and increasing – it has unit ⊥ and zero ⊤

glb (meet): ⊓ Define p⊓q as greatest solution of _ ⊑ p & _ ⊑ q

Distribution Law ( ; distributive through ⊔ ) – p ; (q⊔q’) = p;q ⊔ p;q’ – (q⊔q’) ; p = q;p ⊔ q’;p Rule (non-determinism) – {p} q {r} & {p} q’ {r} implies {p} q⊔q’ {r} – i.e., to prove something of q⊔q’ prove the same thing of both q and q’ quarter of law interprovable with rule
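
In the same sketch ⊔ is set union, so the distribution law and the non-determinism rule become executable checks (again with illustrative names; nonDetRule tests the implication on the arguments it is given):

```haskell
import qualified Data.Set as S

type Lang = S.Set String

refines :: Lang -> Lang -> Bool
refines = S.isSubsetOf

seqC :: Lang -> Lang -> Lang
seqC p q = S.fromList [ s ++ t | s <- S.toList p, t <- S.toList q ]

join :: Lang -> Lang -> Lang              -- p ⊔ q
join = S.union

-- p ; (q ⊔ q') = p;q ⊔ p;q'
distributes :: Lang -> Lang -> Lang -> Bool
distributes p q q' = seqC p (join q q') == join (seqC p q) (seqC p q')

-- {p} q {r}  as before
hoare :: Lang -> Lang -> Lang -> Bool
hoare p q r = seqC p q `refines` r

-- {p} q {r} & {p} q' {r}  implies  {p} q⊔q' {r}
nonDetRule :: Lang -> Lang -> Lang -> Lang -> Bool
nonDetRule p q q' r =
  not (hoare p q r && hoare p q' r) || hoare p (join q q') r
```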

Conditional: p if b else p’. Define p ⊰b⊱ p’ as (b.. ⊓ p) ⊔ (not(b).. ⊓ p’) – where b.. describes all executions that begin in a state satisfying b. Theorem: p ⊰b⊱ p’ is associative, idempotent, distributive, and – p ⊰b⊱ q = q ⊰not(b)⊱ p (skew symmetry) – (p ⊰b⊱ p’) ⊰c⊱ (q ⊰b⊱ q’) = (p ⊰c⊱ q) ⊰b⊱ (p’ ⊰c⊱ q’) (exchange)

Transaction. Defined as (p ⊓ ..b) ⊔ (q ⊓ ..c) – where ..b describes all executions that end satisfying the single-state predicate b. Implementation: – execute p first – test the condition b afterwards – terminate if b is true – backtrack on failure of b – and try alternative q with condition c.

Transaction (realistic). Let r describe the non-failing executions of a transaction t. – r is known when execution of t is complete. – any successful execution of t is committed – a single failed execution of t is undone, – and q is done instead. Define: (t if r else q) = t, if t ⊑ r; = (t ⊓ r) ⊔ q, otherwise.

Contracts Let q be the body of a subroutine Let s be its specification Let (q.. s) assert that q meets s Programmer error (⊤) if not so Caller of subroutine may assume that s describes all its calls Implementation may just execute q

Least upper bound. Let S be an arbitrary set of designs. Define ⊔S as the least solution of ∀s∊S. s ⊑ _ – (∀s∊S. s ⊑ r) ⇒ ⊔S ⊑ r (all r) Everything is an upper bound of { }, so ⊔{ } = ⊥ – a case where ⊔S ∉ S

similarly ⊓ S is greatest lower bound of S ⊓ { } = ⊤

Subroutine with contract: q..s. Define (q..s) as the glb of the solutions of q ⊑ _ & _ ⊑ s. Theorem: (q..s) = q, if q ⊑ s; = ⊤, otherwise.

Iteration (Kleene *). q* is the least solution of – (ɛ ⊔ (q ; _)) ⊑ _ q* =def ⊓ {s | (ɛ ⊔ q;s) ⊑ s} – ɛ ⊔ q;q* ⊑ q* – ɛ ⊔ q;q’ ⊑ q’ implies q* ⊑ q’ – q* = ⊔ {qⁿ | n ∊ Nat} (continuity) Rule (invariance): – {p} q* {p} if {p} q {p}
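
For finite languages the continuity clause q* = ⊔ {qⁿ | n ∊ Nat} suggests a naive but runnable approximation of q* up to a chosen bound; a sketch, where power and starUpTo are my names:

```haskell
import qualified Data.Set as S

type Lang = S.Set String

seqC :: Lang -> Lang -> Lang
seqC p q = S.fromList [ s ++ t | s <- S.toList p, t <- S.toList q ]

unitE :: Lang
unitE = S.singleton ""

power :: Int -> Lang -> Lang            -- qⁿ
power 0 _ = unitE
power n q = seqC q (power (n - 1) q)

-- finite approximation of q*: the union of q⁰ .. qᵏ
starUpTo :: Int -> Lang -> Lang
starUpTo k q = S.unions [ power n q | n <- [0 .. k] ]
-- e.g. starUpTo 2 (S.singleton "a") == S.fromList ["", "a", "aa"]
```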

Infinite replication !p is the greatest solution of _ ⊑ p|_ – as in the pi calculus all executions of !p are infinite – or possibly empty

Recursion Let F(_) be a monotonic function between programs. Theorem: all functions defined by monotonic operators are monotonic. μF is strongest solution of F(_) ⊑ _ νF is weakest solution of _ ⊑ F(_) Theorem (Knaster-Tarski): These solutions exist.

Basic statements/assertions: skip ɛ; bottom ⊥; top ⊤; assignment: x := e(x); assertion: assert b; assumption: assume b; finally: ..b; initially: b..

More: assign through pointer: [a] := e; output: c!e; input: c?x; points to: a |-> e – a |-> _ =def exists v. a |-> v; throw, catch; alloc, dispose

Laws (examples): assume b =def b.. ⊓ ɛ; assert b =def (b.. ⊓ ɛ) ⊔ not(b)..; x := e(x) ; x := f(x) = x := f(e(x)) – in a sequential language

More: (p |-> _) ; [p] := e ⊑ p |-> e – in separation logic; c!e | c?x = x := e – in CSP but not in CCS or Pi; throw x ; (catch x; p) = p

Part 3 Unifying Semantic Theories Six familiar semantic definition styles. Their derivation from the algebra and vice versa.

[Diagram: algebraic laws linked to operational rules and deduction rules.]

Hoare Triple a method for program verification {p} q {r} ≝ p;q ⊑ r – one way of achieving r is by first doing p and then doing q Theorem (sequential composition): – {p} q {s} & {s} q’ {r} implies {p} q;q’ {r} – proved by associativity

Plotkin reduction: a method for program execution. ⟨p, q⟩ -> r =def p ; q ⊒ r – if p describes the state before execution of q then r describes a possible final state, e.g. – ⟨…⟩ -> ..(x = 37) Theorem (sequential composition): ⟨p, q⟩ -> s & ⟨s, q’⟩ -> r implies ⟨p, q;q’⟩ -> r

Milner transition: a method of execution for processes. p –q-> r ≝ p ⊒ q;r – one of the ways of executing p is by first executing q and then executing r. – e.g., (x := x+3) –(x:=x+1)-> (x:=x+2) Theorem (sequential composition): – p –q-> s & s –q’-> r => p –(q;q’)-> r (big-step rule for ;)

Partial correctness: describes what may happen. p [q] r =def p ⊑ q;r – if p describes a state before execution of q, then execution of q may achieve r. Theorem (sequential composition): p [q] s & s [q’] r implies p [q;q’] r. Useful if r describes error states, and p describes initial states from which a test execution of q may end in error.

Summary: {p} q {r} =def p;q ⊑ r – Hoare triple; ⟨p, q⟩ -> r =def p;q ⊒ r – Plotkin reduction; p –q-> r =def p ⊒ q;r – Milner transition; p [q] r =def p ⊑ q;r – test generation
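
Once ⊑ and ; are fixed, each of the four triples is a one-line definition; a sketch in the finite-language reading (the Haskell names are mine, not the slides'):

```haskell
import qualified Data.Set as S

type Lang = S.Set String

refines :: Lang -> Lang -> Bool          -- ⊑
refines = S.isSubsetOf

seqC :: Lang -> Lang -> Lang             -- ;
seqC p q = S.fromList [ s ++ t | s <- S.toList p, t <- S.toList q ]

hoareTriple :: Lang -> Lang -> Lang -> Bool     -- {p} q {r}   ≙  p;q ⊑ r
hoareTriple p q r = seqC p q `refines` r

plotkinStep :: Lang -> Lang -> Lang -> Bool     -- ⟨p, q⟩ -> r ≙  p;q ⊒ r
plotkinStep p q r = r `refines` seqC p q

milnerTrans :: Lang -> Lang -> Lang -> Bool     -- p -q-> r    ≙  p ⊒ q;r
milnerTrans p q r = seqC q r `refines` p

testTriple :: Lang -> Lang -> Lang -> Bool      -- p [q] r     ≙  p ⊑ q;r
testTriple p q r = p `refines` seqC q r
```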

Sequential composition Law: ; is associative Theorem: sequence rule is valid for all four triples. the Law is provable from the conjunction of all of them

Skip ɛ. Law: p ; ɛ = p = ɛ ; p Theorems: {p} ɛ {p}; p [ɛ] p; p −ɛ→ p; ⟨p, ɛ⟩ -> p. The Law follows from the conjunction of all four theorems.

Left distribution of ; through ⊔. Law: p;(q ⊔ q’) = p;q ⊔ p;q’ Theorems: – {p} (q⊔q’) {r} if {p} q {r} and {p} q’ {r} – ⟨p, q⊔q’⟩ -> r if ⟨p, q⟩ -> r or ⟨p, q’⟩ -> r – p [q⊔q’] r if p [q] r or p [q’] r – p -(q⊔q’)-> r if p –q-> r and p –q’-> r (not used in CCS) The law is provable from either ‘and’ rule together with either ‘or’ rule.

Locality and frame. Left locality: (s|p) ; q ⊑ s | (p;q) Hoare frame: {p} q {r} ⇒ {s|p} q {s|r} Right locality: p ; (q|s) ⊑ (p;q) | s Milner frame: p -q-> r ⇒ (p|s) -q-> (r|s) Full locality requires both frame rules.

Separation logic. Exchange law: – (p|p’) ; (q|q’) ⊑ (p;q) | (p’;q’) Theorems: – {p} q {r} & {p’} q’ {r’} ⇒ {p|p’} q|q’ {r|r’} – p –q-> r & p’ –q’-> r’ => p|p’ –q|q’-> r|r’ The law is provable from either theorem. For the other two triples, the rules are equivalent to the converse exchange law.

Usual restrictions on triples: in {p} q {r}, p and r are of the form ..b, ..c; in p [q] r, p and r are of the form b.., c..; in ⟨p, q⟩ -> r, p and r are of the form ..b, ..c; in p –q-> r, p and r are programs; in p –q-> r (small step), q is atomic. (In all cases, q is a program.) All the laws are valid without these restrictions.

Weakest precondition (-;). (q -; r) =def the weakest solution of (_ ; q ⊑ r) – the same as Dijkstra’s wp(q, r) – for backward development of programs

Weakest precondition (-;). Law (-; adjoint to ;): – p ⊑ q -; r iff p;q ⊑ r (Galois) Theorem: – (q -; r) ; q ⊑ r – p ⊑ q -; (p ; q) The Law is provable from the theorems – cf. (r div q) × q ≤ r and r ≤ (r × q) div q
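
For finite languages the adjoint can be computed outright: q -; r collects every string that, whichever string of q is appended, still lands inside r. A sketch (residual and galois are my names; residual is only defined here for non-empty q, since for empty q the result would be the whole universe, which a finite set cannot represent):

```haskell
import qualified Data.Set as S
import Data.List (isSuffixOf)

type Lang = S.Set String

seqC :: Lang -> Lang -> Lang
seqC p q = S.fromList [ s ++ t | s <- S.toList p, t <- S.toList q ]

-- q -; r : the weakest p with p;q ⊑ r (finite, non-empty q only).
residual :: Lang -> Lang -> Lang
residual q r = foldr1 S.intersection
  [ S.fromList [ take (length w - length t) w
               | w <- S.toList r, t `isSuffixOf` w ]
  | t <- S.toList q ]
-- e.g. residual (S.fromList ["b"]) (S.fromList ["ab","bb"])
--        == S.fromList ["a","b"]

-- Galois check on given examples: p ⊑ q -; r  iff  p;q ⊑ r
galois :: Lang -> Lang -> Lang -> Bool
galois p q r =
  (p `S.isSubsetOf` residual q r) == (seqC p q `S.isSubsetOf` r)
```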

Theorems: q’ ⊑ q & r ⊑ r’ => q -; r ⊑ q’ -; r’; (q;q’) -; r ⊑ q -; (q’ -; r); q -; r ⊑ (q;s) -; (r;s)

Specification statement (;-). (p ;- r) =def the weakest solution of (p ; _ ⊑ r) – Back/Morgan’s specification statement – for stepwise refinement of designs – same as p⇝r in RGSep – same as (requires p; ensures r) in VCC

Law of consequence

Frame laws

Part 4 Denotational Models A model is a mathematical structure that satisfies the axioms of an algebra, and realistically describes a useful application, for example, program execution.

Models [Diagram: denotational models linked to algebraic laws.]

Some standard models: Boolean algebra: ({0,1}, ≤, ∧, ∨, (1 − _)); predicate algebra (Frege, Heyting): (ℙS, ⊢, ∩, ∪, (S − _), =>, ∃, ∀); regular expressions (Kleene): (ℙA*, ⊆, ∪, ;, ɛ, { }, |); binary relations (Tarski): (ℙ(S × S), ⊆, ∪, ∩, ;, Id, not(_), converse(_)). The algebra of designs is a superset of these.

Model: (EV, EX, PR) EV is an underlying set of events (x, y,..) that can occur in any execution of any program EX are executions (e, f,…), modelled as sets of events PR are designs (p, q, r,…), modelled as sets of executions.

Set concepts: ⊑ is ⊆ (set inclusion); ⊔ is ∪ (set union); ⊓ is ∩ (intersection of sets); ⊥ is { } (the empty set); ⊤ is EX (the universal set of executions).

With (|): p | q = {e ∪ f | e ∊ p & f ∊ q & e ∩ f = { }} – each execution of p|q is the disjoint union of an execution of p and an execution of q – p|q contains all such disjoint unions. | generalises many binary operators.
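
A Haskell sketch of this three-layer model (the types and record fields are mine): events carry a place and a time, executions are sets of events, designs are sets of executions, and | pairs off executions with disjoint event sets.

```haskell
import qualified Data.Set as S

data Event = Event { evName :: String, evPlace :: String, evTime :: Int }
  deriving (Eq, Ord, Show)

type Execution = S.Set Event          -- EX: a set of events
type Design    = S.Set Execution      -- PR: a set of executions

refinesD :: Design -> Design -> Bool  -- ⊑ is ⊆
refinesD = S.isSubsetOf

-- p | q : all disjoint unions of an execution of p with one of q
parD :: Design -> Design -> Design
parD p q = S.fromList
  [ S.union e f
  | e <- S.toList p
  , f <- S.toList q
  , S.null (S.intersection e f) ]
```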

Introducing time TIM is a set of times for events – partially ordered by ≤ Let when : EV -> TIM – map each event to its time of occurrence.

Definition of <. x < y =def not(when(y) ≤ when(x)) – x < y & y < x means that x and y occur ‘in true concurrency’. e < f =def every x ∊ e and y ∊ f satisfy x < y – no event of f occurs before an event of e – hence e < f implies e ∩ f = { }. If ≤ is a total order, – there is no concurrency, – executions are time-ordered strings.

Sequential composition (then): p ; q = {e ∪ f | e ∊ p & f ∊ q & e < f} Special case: if ≤ is a total order, – e < f means that e ∪ f is concatenation (e⋅f) of strings – ; is the composition of regular expressions.
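
Adding time makes e < f and ; directly computable; a sketch reusing the Event/Execution/Design types above (repeated so the block stands alone), with Int timestamps standing in for the partially ordered set TIM, so this particular instance is the totally ordered special case.

```haskell
import qualified Data.Set as S

data Event = Event { evName :: String, evPlace :: String, evTime :: Int }
  deriving (Eq, Ord, Show)

type Execution = S.Set Event
type Design    = S.Set Execution

-- x < y  ≙  not (when(y) ≤ when(x)); with total Int times this is just <
ltEv :: Event -> Event -> Bool
ltEv x y = not (evTime y <= evTime x)

-- e < f : no event of f occurs at or before any event of e
beforeEx :: Execution -> Execution -> Bool
beforeEx e f = and [ ltEv x y | x <- S.toList e, y <- S.toList f ]

-- p ; q = { e ∪ f | e ∊ p, f ∊ q, e < f }
seqD :: Design -> Design -> Design
seqD p q = S.fromList
  [ S.union e f | e <- S.toList p, f <- S.toList q, beforeEx e f ]
```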

Theorems These definitions of ; and | satisfy the locality and exchange laws. (s|p) ; q ⊑ s |(p;q) (p|q) ; (p’|q’) ⊑ (p;p’) | (q;q’) – Proof: the lhs describes fewer interleavings than the rhs. special case: regular expressions satisfy all our laws for ⊑, ⊔, ;, and |

Disjoint concurrency (||): p||q =def (p ; q) ⊓ (q ; p) – all events of p concurrent with all of q. – no interaction is possible between them. Theorems: (p||q) ; r ⊒ p || (q ; r); (p||q) ; (p’||q’) ⊒ (p;p’) || (q;q’) – Proof: the rhs has more disjointness constraints than the lhs. – the wrong way round! So make the programmer responsible for disjointness, using interfaces!

Interfaces Let q be the body of a subroutine Let s be its specification Let (q.. s) assert that q is correct Caller may assume s Implementer may execute q

Solution: p*q =def (p|q .. p||q) = p|q, if p|q ⊑ p||q; = ⊤, otherwise – the programmer is responsible for the absence of interaction between p and q. Theorem: ; and * satisfy locality and exchange. – Proof: in the cases where lhs ≠ rhs, rhs = ⊤.

Problem ; is almost useless in the presence of arbitrary interleaving (interference). It is hard to prove disjointness of p||q We need a more complex model – which constrains the places at which a program may make changes.

Separation PL is the set of places at which an event can occur each place is ‘owned’ by one thread, – no other thread can act there. Let where:EV -> PL map each event to its place of occurrence. where(e) = def {where(x) | x ∊ e }

Separation principle events at different places are concurrent events at the same place are totally ordered in time ∀x,y ∊ EV. where(x) = where(y) iff x≤y or y≤x

Picture [Diagram: events laid out in time and space.]

Theorem: p || q = {e ∪ f | e ∊ p & f ∊ q & where(e) ∩ where(f) = { }} – proved from the separation principle.
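
In the sketch, the theorem can be taken as the definition of ||: with a place recorded on each event, p || q keeps exactly the combinations whose place sets are disjoint (the names and the String encoding of places are my assumptions, not the slides').

```haskell
import qualified Data.Set as S

data Event = Event { evName :: String, evPlace :: String, evTime :: Int }
  deriving (Eq, Ord, Show)

type Execution = S.Set Event
type Design    = S.Set Execution

places :: Execution -> S.Set String        -- where(e)
places = S.map evPlace

-- p || q = { e ∪ f | e ∊ p, f ∊ q, where(e) ∩ where(f) = { } }
parSep :: Design -> Design -> Design
parSep p q = S.fromList
  [ S.union e f
  | e <- S.toList p
  , f <- S.toList q
  , S.null (S.intersection (places e) (places f)) ]
```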

Convexity Principle Each execution contains every event that occurs between any of its events. ∀e ∊ EX, y ∊ EV. ∀x, z ∊ e. when(x) ≤ when(y) ≤ when(z) => y ∊ e – no event from elsewhere can interfere between any two events of an execution

A convex execution of p;q [Diagram: the events of p and q laid out in time and space.]

A non-convex ‘execution’ of p;q [Diagram: the events of p and q laid out in time and space.]

Conclusion: in Praise of Algebra – Reusable, Modular, Incremental, Unifying, Discriminative, Computational, Comprehensible, Abstract, Beautiful!

Algebra likes pairs. Algebra chooses as primitives – operators with two operands: +, × – predicates with two places: =, ≤ – laws with two operators: & ∨, + × – algebras with two components: rings

Tuples. Tuples are defined in terms of pairs. – Hoare triples – Plotkin triples – Jones quintuples – seventeen-tuples …

Semantic links [Diagram: deductions, transitions, and denotations connected through the algebra.]

Increments [Diagram: centred on the algebra.]

Filling the gaps [Diagram: centred on the algebra.]