Slide 1: Foundational Certified Code in a Metalogical Framework
Karl Crary and Susmit Sarkar
Carnegie Mellon University
Slide 2: Motivation: Grid Computing
- Make use of idle computing cycles over the network [e.g. SETI]
- Computer owners download and execute code from developers
- A key issue: the developers are unknown, so consumers are concerned about safety
Slide 3: Certified code
- Package the code with a certificate [PCC, TAL]
- Certificate: a machine-verifiable proof of safety
- Typically, a proof that the code is well typed in a safe type system
- [Diagram: the developer sends Code + Certificate to the consumer, who asks "Is the code safe?" and, using this knowledge, concludes "Code is safe!"]
Slide 4: Type System? Is that safe?
- Old answer: fix a type system, trust peer review
- New answer: give developers the flexibility of using their own type systems
- Need to check that this is safe
- Known as Foundational Certified Code
- [Diagram: Developer and Consumer, now with Machine details, Type System, Code, and Certificate in play]
Slide 5: Roadmap
- Our system
- Metalogics
- Safety policy
- A safety proof
- Related and future work
Slide 6: Our System
- [Diagram: the developer sends Code, a Certificate, a Safety Condition, and a Safety Proof to the consumer, who holds the Safety Policy. Consumer: "Does the code satisfy the safety policy?" Developer: "The code satisfies my safety condition." Consumer: "Why is your safety condition any good?" Developer: "I can prove it to you!"]
Slide 7: Metalogic: meta theorems
- We use LF to express logics, e.g. operational semantics and producers' safety conditions (a toy illustration follows)
- We care about meta theorems: if some input derivation exists, then an output derivation exists, e.g. the Safety Theorem
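To give a flavor of what "expressing a logic in LF" means, here is a minimal, self-contained Twelf sketch. It is purely illustrative and not the paper's IA32/TALT encoding: a signature for natural numbers and an addition judgment, the kind of deductive system whose derivations the meta theorems quantify over.

```twelf
% A tiny LF signature in Twelf syntax (illustrative only, not the
% paper's encoding): natural numbers and an addition judgment.
nat : type.
z   : nat.
s   : nat -> nat.

% plus M N K encodes the judgment "M + N = K"; each constant below
% is an inference rule, and its closed terms are derivations.
plus   : nat -> nat -> nat -> type.
plus/z : plus z N N.
plus/s : plus (s M) N (s K)
          <- plus M N K.
```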
Slide 8: How to check meta theorems?
- Choice 1: reflect metalogical reasoning in the framework itself
- Choice 2: use a logic designed for metalogical reasoning, e.g. Twelf [Schürmann]
Slide 9: Programming in metalogics
- We write logic programs relating derivations
- Limited to ∀∃ reasoning (for every input derivation there exists an output derivation); the Twelf authors plan a stronger system
- Need to do induction on the structure of derivations
- The system can check that these logic programs are total (user annotations required); see the sketch below
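Continuing the illustrative Twelf sketch above (the names are ours), here is how a meta theorem is written as a logic program relating derivations and then checked total. The theorem is the toy ∀∃ fact that plus N z N holds for every N; the %mode, %worlds, and %total declarations are the user annotations under which Twelf verifies, by induction on N, that the program covers all cases and terminates.

```twelf
% Meta theorem (illustrative): for every N there is a derivation of
% plus N z N.  The type family relates the input N to the output
% derivation; its clauses form a logic program.
plus-z-right : {N:nat} plus N z N -> type.
%mode plus-z-right +N -D.

pzr/z : plus-z-right z plus/z.
pzr/s : plus-z-right (s N) (plus/s D)
         <- plus-z-right N D.

% Twelf checks coverage and termination, i.e. that the
% forall/exists statement really holds.
%worlds () (plus-z-right _ _).
%total N (plus-z-right N _).
```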
Slide 10: Roadmap
- Our system
- Metalogics
- Safety policy
- A safety proof
- Related and future work
Slide 11: Safety policy: Preliminaries
- Formalize the operational semantics of the IA32 architecture
- Formalize machine states: memory, register file, stack, instruction pointer
- Formalize the transitions from state to state
- Remove the transitions deemed unsafe (a sketch of the shape of such an encoding follows)
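The following hypothetical Twelf fragment (all names ours, and far simpler than the actual formalization) shows the shape such an encoding could take: abstract types for the machine components, a state built from them, and a step relation with one rule per instruction, where the unsafe transitions simply have no rule.

```twelf
% Hypothetical shape of the formal machine (names ours, heavily
% simplified): abstract types for the components of an IA32 state.
regfile : type.   % the register file
memory  : type.   % memory contents
word    : type.   % 32-bit words

% A state packages registers, memory, and the instruction pointer
% (EFLAGS, the stack, etc. are elided in this sketch).
state : type.
st    : regfile -> memory -> word -> state.

% The transition relation: one rule per instruction.  Transitions
% deemed unsafe simply have no rule, so the formal machine is stuck
% exactly where the safety policy needs it to be.
step : state -> state -> type.
```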
Slide 12: Example: transition for addition
- addl $5,(%eax): load 4 bytes from (%eax), load the immediate operand 5, add them, store the result back to (%eax), update EFLAGS, and advance EIP
- This can go wrong, e.g. if %eax points to protected memory
- Solution: the formal load and store relations do not apply in such cases (see the rule sketch below)
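Continuing the hypothetical sketch, here is roughly how such a guarded rule could look in Twelf (all names ours; the EFLAGS update is elided). Because the rule's premises include the load and store relations, a state in which %eax points into protected memory gives those premises no derivation, and the formal machine has no transition there.

```twelf
% Hypothetical rule for "addl $5,(%eax)" (names ours; EFLAGS elided).
eax    : regfile -> word.                           % project %eax from the register file
five   : word.                                      % the immediate operand 5
load4  : memory -> word -> word -> type.            % load4 M A V: the 4 bytes at address A in M are V
store4 : memory -> word -> word -> memory -> type.  % store4 M A V M': writing V at A turns M into M'
add32  : word -> word -> word -> type.              % 32-bit addition
next   : word -> word -> type.                      % next EIP EIP': advance past this instruction

% The rule applies only when load4 and store4 are derivable, i.e. the
% accessed addresses are legal; otherwise there is no transition at all.
step/addl-imm-mem
   : step (st R M EIP) (st R M' EIP')
   <- load4 M (eax R) V
   <- add32 V five V'
   <- store4 M (eax R) V' M'
   <- next EIP EIP'.
```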
Slide 13: Safety Policy
- Define the initial state obtained by loading program P
- The policy: we never reach a state in which the (formal) machine has no transition
- Another way of stating it: the formal machine is never stuck
- The halt state is treated specially (one way to write this down follows)
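One way to phrase this policy (our notation, not the paper's), writing init(P) for the initial state of the loaded program P and treating the halt state as the one permitted terminal state:

```latex
\[
\mathsf{safe}(P) \;\triangleq\;
  \forall S.\;\; \mathsf{init}(P) \longrightarrow^{*} S
  \;\;\Longrightarrow\;\;
  \mathsf{halted}(S) \;\vee\; \exists S'.\; S \longrightarrow S'
\]
```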
Slide 14: Why is this safe?
- If the real machine transitions according to the formal machine's transitions, it is performing only safe operations
- To perform an unsafe operation, the real machine would have to take a transition that is not in the formal machine
- This does not happen in a safe machine
Slide 15: Roadmap
- Our system
- Metalogics
- Safety policy
- A safety proof
- Related and future work
Slide 16: Example Safety Proof
- A particular safety proof: ours is for TALT [Crary]
- TALT is a type system for an assembly language: fairly low-level, but still abstract
- Our foundational safety proof is syntactic [Hamid et al.]
Slide 17: Safety
- Our conditions isolate a set of safe states
- Safe states cannot transition to stuck states
- [Diagram: a safe state M1 stepping to a state M2]
Slide 18: Key Lemmas
- Progress
- Preservation
- [Diagrams: Progress: a safe state M1 can take a step to some state M2. Preservation: if a safe state M1 steps to M2, then M2 is again safe.]
Slide 19: Putting it together: the Safety Theorem
- Transitions from a safe state cannot go to a stuck state
- [Diagram: a safe state M1 steps only to states M2 that are again safe]
Slide 20: Idea of proof
- Three parts of the proof:
  - Abstract type safety (previous work)
  - Simulation
  - Determinism
- [Diagram: a safe concrete state M related by the "implements" relation to a typed abstract machine M']
Slide 21: TALT safety proof [Crary]
- This has two top-level lemmas (spelled out below):
  - Progress: a well-typed abstract machine makes a transition
  - Preservation: if a well-typed abstract machine makes a transition, the resulting (abstract) machine is well typed
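In standard syntactic type-safety notation (ours, not verbatim from TALT), writing ⊢ M ok for "the abstract machine M is well typed" and ↦ for an abstract transition, the two lemmas read:

```latex
\begin{align*}
\text{Progress:}     &\quad \vdash M \;\mathsf{ok} \;\Longrightarrow\; \exists M'.\; M \longmapsto M'\\
\text{Preservation:} &\quad \vdash M \;\mathsf{ok} \;\wedge\; M \longmapsto M' \;\Longrightarrow\; \vdash M' \;\mathsf{ok}
\end{align*}
```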
Slide 22: Concrete Machine Lemmas
- Simulation
- Determinism
- [Diagrams: Simulation: abstract M1 steps to abstract M2, concrete M1' implements M1, and concrete M2' implements M2. Determinism: concrete M1 steps to concrete M2 and to concrete M2'.]
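Spelled out roughly in our own notation (the actual statements are more refined), with ↦ for abstract TALT steps, → for concrete formal-machine steps, and impl(C, A) for "concrete state C implements abstract machine A":

```latex
\begin{align*}
\text{Simulation:}  &\quad \mathsf{impl}(M_1', M_1) \;\wedge\; M_1 \longmapsto M_2
    \;\Longrightarrow\; \exists M_2'.\; M_1' \longrightarrow M_2' \;\wedge\; \mathsf{impl}(M_2', M_2)\\
\text{Determinism:} &\quad M_1' \longrightarrow M_2' \;\wedge\; M_1' \longrightarrow M_2''
    \;\Longrightarrow\; M_2' = M_2''
\end{align*}
```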
Slide 23: Progress
- [Diagram: the safe concrete state M1 is related by "implements" to an abstract, typed machine M1'; abstract progress gives an abstract step to M2', which is in turn related by "implements" to a concrete state M2, giving the concrete transition from M1 to M2]
Slide 24: Preservation
- [Diagram: the safe concrete state M1 steps to M2; M1 is related by "implements" to a typed abstract M1', which steps to a typed abstract M2' related by "implements" to M2, so M2 is again safe]
Slide 25: Implementation Statistics
- Safety policy: 2,081 lines of code
- Safety proof: 44,827 lines of code
- Time to check: 75 seconds
- Number of lemmas: 1,466
- Man-years: 1.5
Slide 26: Related work
- Foundational PCC [Appel et al.]
- FTAL [Hamid et al.]
- Temporal Logic PCC [Bernard and Lee]
Slide 27: Future Work
- Develop a compiler from Standard ML to TALT
- Expand the target language to include many more IA32 instructions
- Specify and prove other properties, e.g. running-time bounds
Slide 29: Indeterminism
- The data may be indeterminate, due e.g. to input
- Safety demands that any instance be safe
- We have an oracle that the semantics consults to determine what to do
- The oracle is quantified over in the safety theorem