1. Formal Methods for Security Protocols
Catuscia Palamidessi, Penn State University, USA
TU Dresden, Workshop on Proof Theory and Computation, 6 June 2002, Lecture 3

2. Security Protocols
Contents of the previous lectures:
- Brief introduction to security protocols: aims and properties (authentication, secrecy, integrity, anonymity, etc.)
- Brief introduction to cryptographic tools: symmetric and asymmetric cryptography, one-way functions, trapdoors
- Vulnerability of security protocols
Next: introduction to concurrency

3. Brief Introduction to Concurrency: The CSP Approach
Communicating Sequential Processes [Hoare 78]: a mathematical framework for the description and analysis of systems consisting of processes interacting via exchange of messages.
Automatic tools are available for proving properties of CSP specifications:
- the model checker FDR
- the theorem prover PVS

4. The CSP Formalism
A small mathematical language containing the main constructs for specifying concurrency, parallelism, communication, choice, hiding, etc.
The evolution of processes is based on a sequence of events, or actions:
- Visible actions: interactions with other processes, communication
- Invisible actions: internal computation steps

5. The CSP Language: Syntax
Inaction: Stop
  Termination, deadlock: the process cannot perform any action, either internal or external.
Input: in ? x : A -> P(x)
  Execute an input action on channel in, receive a message x of type A, then continue as P(x).
Output: out ! m -> P
  Execute an output action on channel out, send the message m, then continue as P.
Recursion: P(y1, …, yn) = Body(y1, …, yn)
  Process definition: P is a process name, y1, …, yn are the parameters, and Body(y1, …, yn) is a process expression.
Example: Copy = in ? x -> out ! x -> Copy
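As an illustration (not part of the original slides), the behavior of Copy can be mimicked in Python by a generator that alternates input and output events; the function name and the (channel, value) encoding of events are assumptions made for this sketch.

def copy_process(inputs):
    """Trace of Copy = in?x -> out!x -> Copy, fed with a finite list of inputs."""
    for x in inputs:
        yield ("in", x)    # input event on channel in
        yield ("out", x)   # output event on channel out: the value just read

print(list(copy_process(["m1", "m2"])))
# [('in', 'm1'), ('out', 'm1'), ('in', 'm2'), ('out', 'm2')]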

6. The CSP Syntax
External (a.k.a. guarded) choice: P [] Q
  Execute a choice between P and Q; do not choose a process which cannot proceed.
  Example: (a ? x -> P(x)) [] (b ? x -> Q(x))
    Execute one and only one input action. If only one is available, choose that one; if both are available, choose arbitrarily; if none is available, block. The unchosen branch is discarded (commitment).
Internal choice: P + Q
  Execute an arbitrary choice between P and Q. It is possible to choose a process which cannot proceed.

7. The CSP Syntax
Parallel operator with synchronization: P || Q
  P and Q proceed in parallel and are obliged to synchronize on all common actions.
  Example: (c ? x -> P(x)) || (c ! m -> Q)
    Synchronization: the two processes can proceed only if their actions correspond.
    Handshaking: sending and receiving are simultaneous. (This is clearly an abstraction; buffered communication can nevertheless be modeled by introducing an explicit buffer process.)
    Communication: m is transmitted to the first process, which continues as P(m).
    Broadcasting: c ! m remains available to other parallel processes.
Question: what happens with the process ((c ? x -> P(x)) [] (d ? y -> Q(y))) || (c ! m -> R)?

8. The CSP Syntax
Parallel operator with synchronization and interleaving: P ||A Q
  P and Q are obliged to synchronize only on the common actions in A; they interleave on all actions not in A.
  Example: (c ? x -> P(x)) ||{c} ((c ! m -> Q) [] (d ! n -> R))
    The two processes can either synchronize on the action on channel c, or the second process can perform an action on d. In the second case, though, the first process remains blocked until the second decides (if ever) to perform an output action on c.
Question: in which part of the second process could this action on c be performed?
Abbreviation: P ||| Q stands for P ||∅ Q (parallel composition with an empty synchronization set)

9. The CSP Syntax
Hiding: P \ A
  P \ A behaves as P except that all the actions in A are turned into invisible actions, so they can no longer be used to synchronize with other processes. One possible use of this mechanism is to prevent external processes from interfering with the communication channels in P (internalization of communication in P).
Renaming: P[y/x]
  P[y/x] behaves as P except that all occurrences of x are renamed to y. Typically this serves to create different instances of the same process scheme.
  Abbreviation: P[y1,y2/x1,x2] stands for P[y1/x1][y2/x2]
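At the level of traces, hiding and renaming have a simple reading; the small Python sketch below (the helper names are ours, not from the slides) shows the effect of P \ A and P[y/x] on a single trace.

def hide(trace, hidden):
    """Trace-level effect of P \\ A: events in A become invisible and disappear."""
    return [e for e in trace if e not in hidden]

def rename(trace, mapping):
    """Trace-level effect of renaming, e.g. P[y/x] with mapping {'x': 'y'}."""
    return [mapping.get(e, e) for e in trace]

print(hide(["c.m", "d.n", "c.k"], {"d.n"}))   # ['c.m', 'c.k']
print(rename(["receive.m", "send.m"],
             {"receive.m": "fake.m", "send.m": "take.m"}))   # ['fake.m', 'take.m']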

10. Modeling Security Protocols in CSP
Security protocols work through the interaction of a number of processes in parallel that send messages to each other. A formalism for concurrency is therefore an obvious notation for describing the participants and their roles in the protocol.
Example: the Yahalom protocol
  Message 1   a -> b : a.na
  Message 2   b -> s : b.{a.na.nb}ServerKey(b)
  Message 3   s -> a : {b.kab.na.nb}ServerKey(a).{a.kab}ServerKey(b)
  Message 4   a -> b : {a.kab}ServerKey(b).{nb}kab
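For concreteness, here is a minimal Python sketch (ours, with illustrative names such as enc and server_key) that builds the four Yahalom messages as symbolic terms; encryption is just a tagged tuple, not real cryptography.

def enc(key, *payload):
    """Symbolic encryption {payload}key, represented as a tagged tuple."""
    return ("enc", key, payload)

a, b, na, nb, kab = "A", "B", "Na", "Nb", "Kab"
server_key = {"A": "Ka", "B": "Kb"}            # long-term keys shared with the server

msg1 = (a, na)                                        # 1. a -> b : a.Na
msg2 = (b, enc(server_key[b], a, na, nb))             # 2. b -> s : b.{a.Na.Nb}ServerKey(b)
msg3 = (enc(server_key[a], b, kab, na, nb),           # 3. s -> a : {b.Kab.Na.Nb}ServerKey(a)
        enc(server_key[b], a, kab))                   #             .{a.Kab}ServerKey(b)
msg4 = (enc(server_key[b], a, kab), enc(kab, nb))     # 4. a -> b : {a.Kab}ServerKey(b).{Nb}Kab

print(msg3)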

11. Modeling Security Protocols in CSP
We assume that each process has two channels, receive and send, that it uses for all communications with the other nodes via the medium.
Let us assume that A (Alice) and B (Bob) use the protocol, with A as initiator and B as responder, and that J (Jeeves) is the secure server.

12. Modeling Security Protocols in CSP
A's view (initiator):
  Message 1   a sends to b: a.na
  Message 3   a gets from j: {b.kab.na.nb}ServerKey(a).{a.kab}ServerKey(b)
  Message 4   a sends to b: {a.kab}ServerKey(b).{nb}kab
In CSP this behavior can be modeled as follows:
  Initiator(a, na) =
    env ? b : Agent ->
    send.a.b.a.na ->
    [] kab ∈ Key, nb ∈ Nonce, m ∈ T
       ( receive.J.a.{b.kab.na.nb}ServerKey(a).m ->
         send.a.b.m.{nb}kab ->
         Session(a, b, kab, na, nb) )

13. Modeling Security Protocols in CSP
B's view (responder):
  Message 1   b gets from a: a.na
  Message 2   b sends to j: b.{a.na.nb}ServerKey(b)
  Message 4   b gets from a: {a.kab}ServerKey(b).{nb}kab
In CSP this behavior can be modeled as follows:
  Responder(b, nb) =
    [] a ∈ Agent, na ∈ Nonce, kab ∈ Key
       ( receive.a.b.a.na ->
         send.b.J.b.{a.na.nb}ServerKey(b) ->
         receive.a.b.{a.kab}ServerKey(b).{nb}kab ->
         Session(b, a, kab, na, nb) )

14. Modeling Security Protocols in CSP
J's view (server):
  Message 2   j gets from 'b': b.{a.na.nb}ServerKey(b)
  Message 3   j sends to a: {b.kab.na.nb}ServerKey(a).{a.kab}ServerKey(b)
In CSP this behavior can be modeled as follows:
  Server(J, kab) =
    [] a, b ∈ Agent, na, nb ∈ Nonce
       ( receive.b.J.b.{a.na.nb}ServerKey(b) ->
         send.J.a.{b.kab.na.nb}ServerKey(a).{a.kab}ServerKey(b) ->
         Server(J, ks) )
  Server(J) = ||| kab ∈ KeysServer  Server(J, kab)
Question: why several server processes in parallel?

15. Modeling an Intruder
We want to model an intruder that represents all potential intruder behaviors:
  Intruder(X) = learn ? m : messages -> Intruder(close(X ∪ {m}))
                [] say ! m : X ∩ messages -> Intruder(X)
close(X) represents all the possible information that the attacker can infer from X. Typically we assume the Dolev-Yao rules:
  k, m |- {m}k               (encrypt with a known key)
  {m}k, k^-1 |- m            (decrypt with the inverse key)
  x1.….xn |- xi              (project a component of a compound message)
  x1, …, xn |- x1.….xn       (compose messages)
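One possible reading of close(X) in code: the sketch below (our own, with an assumed term representation in which encryption is the tuple ('enc', key, payload) and tupling is ('pair', ...)) saturates a knowledge set under the decomposition half of the Dolev-Yao rules. The composition rules, tupling and encrypting with known keys, would generate infinitely many terms, so in practice they are applied on demand rather than in the closure.

def close_analysis(knowledge, inverse=lambda k: k):
    """Close a set of symbolic terms under projection and decryption.
    Keys are treated as symmetric by default (k^-1 = k)."""
    known = set(knowledge)
    changed = True
    while changed:
        changed = False
        for t in list(known):
            if isinstance(t, tuple) and t[0] == "pair":
                new = t[1:]                                   # x1,...,xn |- xi
            elif isinstance(t, tuple) and t[0] == "enc" and inverse(t[1]) in known:
                new = t[2]                                    # {m}k, k^-1 |- m
            else:
                continue
            for x in new:
                if x not in known:
                    known.add(x)
                    changed = True
    return known

# An intruder who overhears {A.Na.Nb}Kb and later learns Kb can extract the nonces:
print(close_analysis({("enc", "Kb", ("A", "Na", "Nb")), "Kb"}))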

16. Putting the Network Together
[Diagram: Alice, Bob and Jeeves are connected to the intruder Yves; their send events appear to Yves as take.x.y (e.g. take.Alice.y) and their receive events as fake.x.y (e.g. fake.x.Bob), matching Yves's learn and say channels.]
  Initiator(Alice, nA)[fake, take / receive, send]
  ||| Responder(Bob, nB)[fake, take / receive, send]
  ||| Server(Jeeves)[fake, take / receive, send]
  ||| Intruder(∅)[take.x.y, fake.x.y / learn, say]

17. Alternative with Direct Channels
[Diagram: as before, but Alice and Bob also share a direct channel comm.Alice.Bob, in addition to the take/fake channels controlled by Yves.]
  S = [fake, comm, take, comm / receive, send, receive, send]
  Initiator(Alice, nA)[S]
  ||| Responder(Bob, nB)[S]
  ||| Server(Jeeves)[S]
  ||| Intruder(∅)[S]

18. Expressing Security Properties in CSP
Security properties are the goals that a protocol is meant to satisfy, relative to specific kinds and levels of threat, i.e. the intruders and their capabilities.
We will consider the following security properties:
- Secrecy: messages, keys, etc. have not become known to others
- Authentication: guarantees about the parties involved in the protocol
- Non-repudiation: evidence of the involvement of the other party
- Anonymity: protecting the identity of agents with respect to particular events

19. Anonymity
We will model events as consisting of two components: the event itself, x, and the identity of the agent performing it, a, written a.x.
AnUsers: the set of users who want to remain anonymous.
Given x, define A = { a.x | a ∈ AnUsers }.
Definition: a protocol described as a CSP system P provides anonymity if an arbitrary permutation of the events in A, applied to all the traces of P, does not alter the set of all possible traces of P.

20. Anonymity
Traces of a process: the sequences of visible actions in all its possible runs.
Example: a -> b -> Stop ||| c -> d -> Stop
  Traces: a.b.c.d, a.c.b.d, a.c.d.b, c.a.b.d, c.a.d.b, c.d.a.b
Example: a -> b -> c -> Stop ||{b} d -> b -> e -> Stop
  Traces: a.d.b.c.e, d.a.b.c.e, a.d.b.e.c, d.a.b.e.c
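The maximal traces of the first example can be enumerated mechanically; the sketch below (the function name is ours) merges the two component traces in all order-preserving ways.

def interleavings(s, t):
    """All interleavings of traces s and t, preserving the order within each."""
    if not s:
        return [t]
    if not t:
        return [s]
    return ([s[:1] + rest for rest in interleavings(s[1:], t)] +
            [t[:1] + rest for rest in interleavings(s, t[1:])])

for trace in interleavings(["a", "b"], ["c", "d"]):
    print(".".join(trace))
# prints the six maximal traces: a.b.c.d, a.c.b.d, a.c.d.b, c.a.b.d, c.a.d.b, c.d.a.b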

21. Anonymity
Let AnUsers = {p1, p2} and A = {p1.m, p2.m}.
Example 1: p1.m -> p2.m -> Stop
Example 2: p1.m -> Stop ||| p2.m -> Stop
Example 3: p1.m -> Stop + p2.m -> Stop
Question: for each system, say whether or not it provides anonymity wrt A.

22. Anonymity
A more involved example:
  P = ( p1.m -> a -> Stop  []  p2.m -> a -> Stop )
      ||{p1.m, p2.m}
      ( p1.m -> b -> Stop  []  p2.m -> c -> Stop )
Question: does P provide anonymity wrt A = {p1.m, p2.m}?

23. Anonymity
Answer: no. P has the traces (p1.m).b.a, (p2.m).c.a, … but not (p2.m).b.a, (p1.m).c.a, …
The permutation { p1 -> p2, p2 -> p1 } therefore changes the set of traces.
However, if we assume that the observer has no visibility of the actions b and c, then the system does provide anonymity wrt A = {p1.m, p2.m}.
One elegant way to formalize the concept of visibility in CSP is to use the hiding operator: P \ {b, c} provides anonymity wrt A.
Note: hiding A itself would not be correct. Example: p1.m -> Stop

24. Anonymity
In general, given P, consider the sets:
  A = { a.x | a ∈ AnUsers }: the actions that we want to know only partially (we want to know x but not a)
  B: the actions that we want to observe
  C = Actions - (B ∪ A): the actions we want to hide
[Diagram: the set of actions partitioned into A, B and C.]
The system to consider for the anonymity analysis is P \ C.
Method: for every permutation σ : A -> A, check that σ(traces(P \ C)) = traces(P \ C).
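This check can be phrased directly over a finite set of traces; the following Python sketch (our own helper names; for real CSP specifications one would use FDR instead) tests whether every permutation of A leaves the trace set unchanged.

from itertools import permutations

def provides_anonymity(traces, A):
    """True iff every permutation of the events in A maps the trace set to itself."""
    trace_set = {tuple(t) for t in traces}
    for image in permutations(A):
        mapping = dict(zip(A, image))
        permuted = {tuple(mapping.get(e, e) for e in t) for t in trace_set}
        if permuted != trace_set:
            return False
    return True

# Example 2 of slide 21, p1.m -> Stop ||| p2.m -> Stop, with its prefix-closed traces:
traces = [(), ("p1.m",), ("p2.m",), ("p1.m", "p2.m"), ("p2.m", "p1.m")]
print(provides_anonymity(traces, ["p1.m", "p2.m"]))   # True: the system is symmetric in A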

25. The Dining Cryptographers
Three cryptographers share a meal.
The meal is paid for either by the organization (the master) or by one of the cryptographers. The decision on who pays is taken by the master.
Each cryptographer is informed by the master whether or not he is paying.
GOAL: the cryptographers would like to know whether the organization is paying or not, but without learning the identity of the cryptographer who is paying (if any).

26. The Dining Cryptographers
Solution: each cryptographer tosses a coin. Each coin is placed between two cryptographers; the result of each coin toss is visible to the two adjacent cryptographers, and only to them.
Each cryptographer examines the two adjacent coins:
- If he is not paying, he announces "agree" if the results are the same, and "disagree" otherwise.
- If he is paying, he says the opposite.
Claim: if the number of "disagree" announcements is even, then the master is paying; otherwise, one of the cryptographers is paying. In the latter case, the non-paying cryptographers will not be able to deduce who exactly is paying.
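The parity claim can be tried out with a short simulation; the Python sketch below (ours, not from the lecture) follows the rules above: cryptographer i sees coin i and coin i+1 (mod 3), and the payer inverts his announcement.

import random

def dining_round(payer=None):
    """One round: payer is 0, 1 or 2, or None when the master pays."""
    coins = [random.choice(["heads", "tails"]) for _ in range(3)]
    announcements = []
    for i in range(3):
        same = coins[i] == coins[(i + 1) % 3]        # the two coins cryptographer i can see
        agree = same if i != payer else not same     # the payer says the opposite
        announcements.append("agree" if agree else "disagree")
    return announcements

for payer in [None, 0, 1, 2]:
    ann = dining_round(payer)
    odd = ann.count("disagree") % 2 == 1
    print(payer, ann, "a cryptographer pays" if odd else "the master pays")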

27. Example: The Dining Cryptographers
[Diagram: Crypt(0), Crypt(1) and Crypt(2) sit in a ring with Coin(0), Coin(1) and Coin(2) between them; the Master tells each cryptographer pays.n or notpays.n, the cryptographers read the adjacent coins via look.i.j (e.g. look.2.0) and announce the result via out.n.]

28. The Dining Cryptographers
Specification in CSP: master and coins
  Master   = + n ( pays.n -> notpays.(n+1) -> notpays.(n+2) -> Stop )
             +   ( notpays.0 -> notpays.1 -> notpays.2 -> Stop )
  Coin(n)  = Heads(n) + Tails(n)
  Heads(n) = look.n.n.hd -> Stop ||| look.(n-1).n.hd -> Coin(n)
  Tails(n) = look.n.n.tl -> Stop ||| look.(n-1).n.tl -> Coin(n)
Note: the arithmetic operations are modulo 3.

29. The Dining Cryptographers
Specification in CSP: cryptographers
  Crypt(n)  = notpays.n -> Check(n)  []  pays.n -> Check'(n)
  Check(n)  = look.n.n?x -> look.n.(n+1)?y ->
              if (x = y) then out.n.agree -> Stop else out.n.disagree -> Stop
  Check'(n) = look.n.n?x -> look.n.(n+1)?y ->
              if (x = y) then out.n.disagree -> Stop else out.n.agree -> Stop

30. The Dining Cryptographers
Specification in CSP: the whole system
  Crypts = Crypt(0) ||| Crypt(1) ||| Crypt(2)
  Coins  = Coin(0) ||| Coin(1) ||| Coin(2)
  Meal   = Master ||{pays, notpays} ( Coins ||{look} Crypts )

31. The Dining Cryptographers
The anonymity property:
  A = { pays.0, pays.1, pays.2 }
  B = { out }
  C = Actions - (B ∪ A) = { look, notpays }
Theorem: for every permutation σ : A -> A, we have σ(traces(Meal \ C)) = traces(Meal \ C).
This theorem means that an external observer cannot infer which cryptographer has paid. It can be proved with the automatic tool FDR.

32. The Dining Cryptographers
One can argue that the previous result is not strong enough: a cryptographer has more information than an external observer. Let us then do the analysis for a cryptographer, say Crypt(0):
  A = { pays.1, pays.2 }
  B = { pays.0, notpays.0, look.0, out }
  C = Actions - (B ∪ A)
Theorem: for every permutation σ : A -> A, we have σ(traces(Meal \ C)) = traces(Meal \ C).
This means that if Crypt(1) or Crypt(2) pays, then Crypt(0) cannot infer which of them has paid. The same can be shown for the other two cryptographers, so Meal \ C provides the desired anonymity property.

33. The Dining Cryptographers
An example of a case in which the anonymity property does not hold: assume that Crypt(0) can access the result of the third coin, i.e., has visibility of the result of the action look.2.2.
  A = { pays.1, pays.2 }
  B = { pays.0, notpays.0, look.0, out } ∪ { look.2.2 }
  C = Actions - (B ∪ A)
Then for some permutation σ : A -> A we have σ(traces(Meal \ C)) ≠ traces(Meal \ C):
  pays.2  notpays.0  look.0.0.heads  look.0.1.heads  look.2.2.heads  out.2.disagree   (YES: a possible trace of Meal \ C)
  pays.1  notpays.0  look.0.0.heads  look.0.1.heads  look.2.2.heads  out.2.disagree   (NO: not a possible trace of Meal \ C)

