Protocol Composition Logic (PCL)

Protocol Composition Logic (PCL)
CS 259, Anupam Datta

Protocol Analysis Techniques
Crypto protocol analysis
- Formal models (Dolev-Yao: perfect cryptography)
  - Model checking: FDR, Murphi
  - Protocol logics: BAN, PCL
  - Process calculi: Spi-calculus
  - Inductive proofs: Paulson
  - ...
- Computational models
  - Probabilistic interactive TMs
  - Probabilistic process calculi
  - Probabilistic I/O automata
  - Computational PCL
  - ...

PCL Highlights
- Mechanizable proofs of security for an unbounded number of protocol runs
- Similar guarantees as Paulson's method; better than finite-state model checking
- Codifies high-level reasoning principles used in protocol design (cf. Kaufman-Perlman-Speciner, Network Security)
- Modular proof methods
- Cryptographic soundness

Modular Analysis / Composition
802.11i key management (about 20 messages in 4 components), run between a laptop, an access point, and an authentication server:
- EAP-TLS: certificates to authorization (shared secret PMK)
- 4-Way Handshake: PMK to keys for data communication
- Group key handshake: keys for broadcast communication
- Data protection: AES-based, using the above keys
Goal: divide and conquer

Cryptographic Soundness
Cryptographers and logicians working in computer security don't talk to each other (disclaimer: examples not representative).
Goal: logical methods for a complexity-theoretic model of cryptography.

Protocol Composition Logic: PCL
- Intuition
- Formalism: protocol programming language; protocol logic (syntax, semantics, proof system)
- Example: signature-based challenge-response
- Composition
- Cryptographic soundness
[Datta, Derek, Durgin, Mitchell, Pavlovic] [Datta, Derek, Mitchell, Pavlovic] [Datta, Derek, Mitchell, Shmatikov, Turuani]

PCL - Intuition
[Diagram: honest principals and an attacker run the protocol; each principal (e.g., Alice) holds its own private data and keys, and sends and receives protocol messages.]

Example: Challenge-Response
  A -> B: m, A
  B -> A: n, sigB{m, n, A}
  A -> B: sigA{m, n, B}
Alice reasons: if Bob is honest, then:
- only Bob can generate his signature [protocol independent]
- if Bob generates a signature of the form sigB{m, n, A}, he sends it as part of msg2 of the protocol, and he must have received msg1 from Alice [protocol specific]
Alice deduces: Received(B, msg1) ∧ Sent(B, msg2)

Logic: Background
A logic has a syntax, a semantics, and a proof system.
- Syntax: formulas, e.g. p, p ∧ q, ¬(p ∧ q), p ⊃ q
- Semantics: truth in a model, e.g. M = {p = true, q = false}, M |= p ∨ q
- Proof system: axioms and proof rules give provability, e.g. the axiom p ⊃ (q ⊃ p) and the rule "from p and p ⊃ q, infer q"
- Soundness theorem: provability implies truth; the axioms and proof rules hold in all "relevant" models

Formalizing the Approach
- Protocol programming language: syntax for writing protocol code; precise description of protocol execution
- Protocol Composition Logic (PCL): syntax for stating security properties; trace-based semantics; proof system for formally proving security properties; soundness theorem
Related to: BAN (the first logic for security protocols), Floyd-Hoare logic, CSP/CCS, temporal logic, NPATRL

Protocol programming language
A protocol is described by specifying a program for each role, e.g.
  Server = [receive x; new n; send {x, n}]
Building blocks:
- Terms (think "messages"): names, nonces, keys, encryption, ...
- Actions (operations on terms): send, receive, pattern match, ...

Terms
  t ::= c            constant term
        x            variable
        N            name
        K            key
        t, t         tupling
        sigK{t}      signature
        encK{t}      encryption
Example: x, sigB{m, x, A} is a term.
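
This grammar maps naturally onto an algebraic datatype. Below is a hypothetical Python encoding using dataclasses, added here only to make the grammar concrete; the class names and representation are my own, not part of PCL.

# Hypothetical encoding of the PCL term grammar as Python dataclasses.
from dataclasses import dataclass
from typing import Tuple

class Term: pass

@dataclass(frozen=True)
class Const(Term): value: str                 # constant term c

@dataclass(frozen=True)
class Var(Term): name: str                    # variable x

@dataclass(frozen=True)
class Name(Term): principal: str              # name N

@dataclass(frozen=True)
class Key(Term): owner: str                   # key K

@dataclass(frozen=True)
class Pair(Term): parts: Tuple[Term, ...]     # tupling  t, t

@dataclass(frozen=True)
class Sig(Term): signer: str; body: Term      # signature sigK{t}

@dataclass(frozen=True)
class Enc(Term): key: str; body: Term         # encryption encK{t}

# The example from the slide:  x, sigB{m, x, A}
t = Pair((Var("x"), Sig("B", Pair((Const("m"), Var("x"), Name("A"))))))
print(t)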

Actions
  send t;          send a term t
  receive x;       receive a term into variable x
  match t/p(x);    match term t against pattern p(x)
A program is just a sequence of actions.
Notation: we often omit match actions, e.g.
  receive sigB{A, n}  =  receive x; match x/sigB{A, n}
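
As a concrete illustration (not from the lecture), here is a minimal Python sketch of the match action and of the receive-pattern shorthand. Terms are nested tuples and pattern variables are strings starting with "?"; both conventions are my own assumptions.

# Hypothetical sketch: "receive <pattern>" abbreviates "receive x; match x/<pattern>".
def desugar_receive(pattern, fresh_var="x"):
    return [("receive", fresh_var), ("match", fresh_var, pattern)]

# match t/p: succeed iff the concrete term t has the shape of pattern p,
# binding pattern variables (strings starting with "?") along the way.
def match(term, pattern, bindings=None):
    bindings = dict(bindings or {})
    if isinstance(pattern, str) and pattern.startswith("?"):       # pattern variable
        if pattern in bindings and bindings[pattern] != term:
            return None
        bindings[pattern] = term
        return bindings
    if isinstance(pattern, tuple) and isinstance(term, tuple) and len(pattern) == len(term):
        for t_i, p_i in zip(term, pattern):
            bindings = match(t_i, p_i, bindings)
            if bindings is None:
                return None
        return bindings
    return bindings if term == pattern else None

print(desugar_receive(("sig", "B", ("A", "?n"))))
# [('receive', 'x'), ('match', 'x', ('sig', 'B', ('A', '?n')))]

# Match a received signature sigB{m, n, A} against the pattern sigB{m, ?x, A}:
print(match(("sig", "B", ("m", "n", "A")), ("sig", "B", ("m", "?x", "A"))))   # {'?x': 'n'}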

Challenge-Response programs
  A -> B: m, A
  B -> A: n, sigB{m, n, A}
  A -> B: sigA{m, n, B}

InitCR(A, X) = [
  new m;
  send A, X, {m, A};
  receive X, A, {x, sigX{m, x, A}};
  send A, X, sigA{m, x, X};
]

RespCR(B) = [
  receive Y, B, {y, Y};
  new n;
  send B, Y, {n, sigB{y, n, Y}};
  receive Y, B, sigY{y, n, B};
]
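
For concreteness, the two roles can also be transliterated into plain Python data (a hypothetical encoding, not PCL's cord calculus): each role is a list of actions over terms written as nested tuples, where sig(X, ...) abbreviates sigX{...}.

# Hypothetical transliteration of the CR roles into lists of actions.
def sig(signer, *body):
    return ("sig", signer, tuple(body))

def InitCR(A, X):
    return [
        ("new", "m"),
        ("send", (A, X, ("m", A))),                        # msg1: A, X, {m, A}
        ("receive", (X, A, ("x", sig(X, "m", "x", A)))),   # msg2: X, A, {x, sigX{m, x, A}}
        ("send", (A, X, sig(A, "m", "x", X))),             # msg3: A, X, sigA{m, x, X}
    ]

def RespCR(B):
    return [
        ("receive", ("Y", B, ("y", "Y"))),                 # msg1
        ("new", "n"),
        ("send", (B, "Y", ("n", sig(B, "y", "n", "Y")))),  # msg2
        ("receive", ("Y", B, sig("Y", "y", "n", B))),      # msg3
    ]

for step in InitCR("A", "B"):
    print(step)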

Execution Model
- Initial configuration, IC: a set of principals and keys, and an assignment of ≥ 1 role to each principal
- Run: an interleaving of actions of honest principals and the attacker, starting from IC, with a current position in the run
  Example: A does [new x; send {x}B], C does [new z; send {z}B], and B may receive {x}B or {z}B
- Process calculus operational semantics
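
A hypothetical sketch of a run as one interleaving of role actions, in which a receive can only fire after a matching send has appeared (attacker actions, omitted here, would be interleaved into the same list). The scheduling strategy and message names are my own, chosen to mirror the example above.

# Hypothetical sketch: build one possible run by interleaving role actions.
def interleave(roles):
    queues = [list(r) for r in roles]
    run = []
    while any(queues):
        progressed = False
        for q in queues:
            if not q:
                continue
            _, action, msg = q[0]
            if action == "receive" and not any(a == "send" and m == msg for (_, a, m) in run):
                continue                      # blocked, waiting for msg to appear on the network
            run.append(q.pop(0))
            progressed = True
            break
        if not progressed:                    # every remaining role is blocked on a receive
            break
    return run

role_A = [("A", "new", "x"), ("A", "send", "{x}B")]
role_C = [("C", "new", "z"), ("C", "send", "{z}B")]
role_B = [("B", "receive", "{x}B")]           # B could equally well receive {z}B

for event in interleave([role_A, role_B, role_C]):
    print(event)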

Formulas true at a position in run
Action formulas:
  a ::= Send(P,t) | Receive(P,t) | New(P,t) | Decrypt(P,t) | Verify(P,t)
Formulas:
  φ ::= a | Has(P,t) | Fresh(P,t) | Honest(N) | Contains(t1, t2) | ¬φ | φ1 ∧ φ2 | ∃x.φ | a < a
Modal formulas:
  φ [ actions ]P ψ
Example (specifying secrecy):
  Has(X, secret) ⊃ (X = A ∨ X = B)

Semantics
- Protocol Q defines a set of roles (e.g., initiator, responder)
- A run R of Q is a sequence of actions by principals following roles, plus attacker actions
- Satisfaction Q, R |= φ [ actions ]P ψ: if some role of P in R does exactly these actions, starting from a state where φ is true, then ψ is true in the state after the actions complete
- Q |= φ [ actions ]P ψ: Q, R |= φ [ actions ]P ψ for all runs R of Q
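
A hypothetical, much-simplified Python reading of satisfaction for one modal formula (it ignores the precondition φ and where in the run the actions occur): if P's actions in the run are exactly the given list, then the postcondition must hold of that run. All names and messages are placeholders.

# Hypothetical, simplified check of  phi [ actions ]P psi  on a concrete run.
def p_actions(run, P):
    return [(a, t) for (p, a, t) in run if p == P]

def satisfies_modal(run, P, actions, psi):
    """If P's actions in the run are exactly `actions`, then psi(run) must hold."""
    if p_actions(run, P) != actions:
        return True                      # precondition on P's actions not met: holds vacuously
    return psi(run)

run = [
    ("A", "new", "m"), ("A", "send", "msg1"),
    ("B", "receive", "msg1"), ("B", "new", "n"), ("B", "send", "msg2"),
    ("A", "receive", "msg2"),
]

init_actions = [("new", "m"), ("send", "msg1"), ("receive", "msg2")]
psi = lambda r: ("B", "receive", "msg1") in r      # e.g. the postcondition Receive(B, msg1)
print(satisfies_modal(run, "A", init_actions, psi))   # True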

Challenge-Response Property
Specifying authentication for the initiator:
  CR |= true [ InitCR(A, B) ]A Honest(B) ⊃
    ( Send(A, {A,B,m}) < Receive(B, {A,B,m}) <
      Send(B, {B,A,{n, sigB{m, n, A}}}) < Receive(A, {B,A,{n, sigB{m, n, A}}}) )
Authentication as "matching conversations" [Bellare-Rogaway 1993]
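
A hypothetical way to read this property operationally (not the PCL semantics itself): over a given run, check that the four events exist and occur in the stated order. The event tuples and message names below are placeholders of my own.

# Hypothetical check of "matching conversations" over a run.
def index_of(run, event):
    return run.index(event) if event in run else None

def matching_conversation(run, msg1, msg2):
    events = [
        ("A", "send", msg1),
        ("B", "receive", msg1),
        ("B", "send", msg2),
        ("A", "receive", msg2),
    ]
    positions = [index_of(run, e) for e in events]
    return None not in positions and positions == sorted(positions)

honest_run = [
    ("A", "send", "msg1"), ("B", "receive", "msg1"),
    ("B", "send", "msg2"), ("A", "receive", "msg2"),
]
print(matching_conversation(honest_run, "msg1", "msg2"))   # True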

Proof System
Goal: formally prove security properties.
- Axioms: simple formulas, provable by hand
- Inference rules: proof steps
- Theorem: a formula obtained from axioms by application of inference rules

Sample axioms about actions
New data:
  true [ new x ]P Has(P,x)
  true [ new x ]P Has(Y,x) ⊃ Y = P
Actions:
  true [ send m ]P Send(P,m)
Knowledge:
  true [ receive m ]P Has(P,m)
Verify:
  true [ match x/sigX{m} ]P Verify(P,m)

Reasoning about knowledge
Pairing:
  Has(X, {m,n}) ⊃ Has(X, m) ∧ Has(X, n)
Encryption:
  Has(X, encK(m)) ∧ Has(X, K^-1) ⊃ Has(X, m)
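
Read operationally, these axioms describe a closure of a principal's knowledge. Below is a hypothetical Python sketch of that closure for just these two rules; the tuple encoding of terms and the ("inv", K) convention for the inverse key are my own assumptions.

# Hypothetical sketch: the two knowledge axioms read as closure rules on a set of terms.
def has_closure(terms):
    known = set(terms)
    changed = True
    while changed:
        changed = False
        for t in list(known):
            new = set()
            if isinstance(t, tuple) and t[0] == "pair":                    # Has {m,n} gives Has m and Has n
                new |= {t[1], t[2]}
            if isinstance(t, tuple) and t[0] == "enc" and ("inv", t[1]) in known:
                new.add(t[2])                                              # Has encK(m) and K^-1 gives Has m
            if not new <= known:
                known |= new
                changed = True
    return known

X = has_closure({("pair", "m", ("enc", "K", "secret")), ("inv", "K")})
print("secret" in X)   # True: unpair, then decrypt with K^-1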

Encryption and signature
Public key encryption:
  Honest(X) ∧ Decrypt(Y, encX{m}) ⊃ X = Y
Signature:
  Honest(X) ∧ Verify(Y, sigX{m}) ⊃ ∃m'. (Send(X, m') ∧ Contains(m', sigX{m}))

Sample inference rules
First-order logic rules, e.g.
  from φ and ψ, infer φ ∧ ψ
Generic rules, e.g.
  from θ [ actions ]P φ and θ [ actions ]P ψ, infer θ [ actions ]P φ ∧ ψ

Bidding conventions (motivation)
Blackwood responses to 4NT:
  5♣: 0 or 4 aces
  5♦: 1 ace
  5♥: 2 aces
  5♠: 3 aces
Reasoning: if my partner is following Blackwood, then if she bid 5♥, she must have 2 aces.

Honesty rule (rule scheme)

  ∀ roles R of Q. ∀ protocol steps A of R:
      Start(X) [ ]X φ        φ [ A ]X φ
      ---------------------------------
            Q |- Honest(X) ⊃ φ

This is a finitary rule:
- a typical protocol has 2-3 roles
- a typical role has 1-3 receives
- only need to consider A waiting to receive
Protocol invariants are proved by induction.
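
To illustrate why the rule is finitary, here is a hypothetical sketch that enumerates the segments of a role between the points where the principal waits to receive; the honesty rule only has to check that the invariant is preserved across each such segment. The splitting convention and the action encoding are simplifications of my own.

# Hypothetical illustration: an honest principal pauses only at the start of its
# role and when waiting to receive, so each role yields finitely many segments.
def basic_sequences(role):
    """Split a role into the segments executed atomically between receives."""
    segments, current = [], []
    for action in role:
        if action[0] == "receive" and current:
            segments.append(current)
            current = []
        current.append(action)
    if current:
        segments.append(current)
    return segments

RespCR = [
    ("receive", "msg1"),
    ("new", "n"),
    ("send", "msg2"),
    ("receive", "msg3"),
]

for seg in basic_sequences(RespCR):
    print(seg)
# [('receive', 'msg1'), ('new', 'n'), ('send', 'msg2')]
# [('receive', 'msg3')]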

Honesty rule (example use)
(Rule scheme as on the previous slide.)
Example use: if Y receives a message m from X, and
  Honest(X) ⊃ (Sent(X, m) ⊃ Received(X, m'))
then Y can conclude Honest(X) ⊃ Received(X, m').
The invariant itself is proved using the honesty rule.

Correctness of CR
(InitCR(A, X) and RespCR(B) as defined above.)
  CR |- true [ InitCR(A, B) ]A Honest(B) ⊃
    ( Send(A, {A,B,m}) < Receive(B, {A,B,m}) <
      Send(B, {B,A,{n, sigB{m, n, A}}}) < Receive(A, {B,A,{n, sigB{m, n, A}}}) )
Authentication as "matching conversations" [Bellare-Rogaway 1993]

Correctness of CR – step 1
1. A reasons about her own actions:
  CR |- true [ InitCR(A, B) ]A Verify(A, sigB{m, n, A})

Correctness of CR – step 2
2. Properties of signatures (recall the signature axiom):
  CR |- true [ InitCR(A, B) ]A Honest(B) ⊃ ∃m'. (Send(B, m') ∧ Contains(m', sigB{m, n, A}))

Correctness of CR – Honesty
Invariant proved with the honesty rule, by induction over protocol steps:
  CR |- Honest(X) ∧ Send(X, m') ∧ Contains(m', sigX{y, x, Y}) ∧ ¬New(X, y) ⊃
          m' = X, Y, {x, sigX{y, x, Y}} ∧ Receive(X, {Y, X, {y, Y}})

Correctness of CR – step 3
3. Use the honesty invariant:
  CR |- true [ InitCR(A, B) ]A Honest(B) ⊃ Receive(B, {A,B,m}), …

Correctness of CR – step 4
4. Use properties of nonces (nonces are "fresh" random numbers) for temporal ordering:
  CR |- true [ InitCR(A, B) ]A Honest(B) ⊃ Auth

Complete proof

We have a proof. So what?
Soundness theorem: if Q |- φ then Q |= φ.
If φ is a theorem, then φ is a valid formula: φ holds at any step in any run of protocol Q, with an unbounded number of participants and a Dolev-Yao intruder.

Using PCL: Summary
- Modeling the protocol: a program for each protocol role
- Modeling security properties: using PCL syntax; authentication and secrecy are easily expressed
- Proving security properties: using the PCL proof system; the soundness theorem guarantees that provable properties hold in all protocol runs
Example: C. He, M. Sundararajan, A. Datta, A. Derek, J. C. Mitchell, "A Modular Correctness Proof of IEEE 802.11i and TLS", ACM CCS 2005

Challenge-Response programs (1)
  A -> B: m, A
  B -> A: n, sigB{m, n, A}
  A -> B: sigA{m, n, B}

InitCR(A, X) = [
  new m;
  send A, X, {m, A};
  receive X, A, {x, sigX{m, x, A}};
  send A, X, sigA{m, x, X};
]

RespCR(B) = [
  receive Y, B, {y, Y};
  new n;
  send B, Y, {n, sigB{y, n, Y}};
  receive Y, B, sigY{y, n, B};
]

Challenge-Response Property (2)
Specifying authentication for the initiator:
  CR |= true [ InitCR(A, B) ]A Honest(B) ⊃
    ( Send(A, {A,B,m}) < Receive(B, {A,B,m}) <
      Send(B, {B,A,{n, sigB{m, n, A}}}) < Receive(A, {B,A,{n, sigB{m, n, A}}}) )

Challenge-Response Proof (3)

Weak Challenge-Response
  A -> B: m
  B -> A: n, sigB{m, n}
  A -> B: sigA{m, n}

InitWCR(A, X) = [
  new m;
  send A, X, {m};
  receive X, A, {x, sigX{m, x}};
  send A, X, sigA{m, x};
]

RespWCR(B) = [
  receive Y, B, {y};
  new n;
  send B, Y, {n, sigB{y, n}};
  receive Y, B, sigY{y, n};
]

Correctness of WCR – step 1
1. A reasons about its own actions:
  WCR |- [ InitWCR(A, B) ]A Verify(A, sigB{m, n})

Correctness of WCR – step 2
2. Properties of signatures:
  WCR |- [ InitWCR(A, B) ]A Honest(B) ⊃ ∃m'. (Send(B, m') ∧ Contains(m', sigB{m, n}))

Correctness of WCR – Honesty
Honesty invariant:
  WCR |- Honest(X) ∧ Send(X, m') ∧ Contains(m', sigX{y, x}) ∧ ¬New(X, y) ⊃
           ∃Z. (m' = X, Z, {x, sigX{y, x}} ∧ Receive(X, {Z, X, {y, Z}}))

Correctness of WCR – step 3
3. Use the honesty rule:
  WCR |- [ InitWCR(A, B) ]A Honest(B) ⊃ ∃Z. (Receive(B, {Z,B,m}), …)

Result
WCR does not have the strong authentication property for the initiator.
Counterexample: the intruder can forge the sender's and receiver's identity in the first two messages:
  A -> X(B): m
  X(C) -> B: m
  B -> X(C): n, sigB{m, n}
  X(B) -> A: n, sigB{m, n}
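
A hypothetical Python replay of this counterexample (the trace representation and names are placeholders of my own): A finishes the initiator role with a valid-looking sigB{m, n}, yet B never exchanged any message with A, so no matching conversation with A exists.

# Hypothetical replay of the WCR counterexample.  X(B) means the intruder X
# speaking as B.  Each event is (actor, action, destination, payload).
trace = [
    ("A",    "send", "X(B)", ("m",)),                              # A -> X(B): m
    ("X(C)", "send", "B",    ("m",)),                              # X(C) -> B: m
    ("B",    "send", "X(C)", ("n", ("sig", "B", ("m", "n")))),     # B -> X(C): n, sigB{m, n}
    ("X(B)", "send", "A",    ("n", ("sig", "B", ("m", "n")))),     # X(B) -> A: n, sigB{m, n}
]

# A accepts: it sent m and received n together with a valid sigB{m, n}.
received_by_A = trace[-1][3]
a_accepts = received_by_A == ("n", ("sig", "B", ("m", "n")))

# But authentication fails: B never sent any message addressed to A.
b_sent_to_A = any(p == "B" and dest == "A" for (p, _, dest, _) in trace)

print("A completes the initiator role:", a_accepts)      # True
print("B sent a message to A:", b_sent_to_A)             # False: no matching conversation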