Analysis of an E-voting Protocol in the Applied Pi Calculus
May 7, 2012
Consider the following protocol:

  B → A : nb; text1
  A: begin(A,B,nb)
  A → B : pkA; A; {na; nb; text2}skA
  B: end(A,B,nb)
  B: begin(B,A,na)
  B → A : pkB; B; {na; text3}skB
  A: end(B,A,na)

◦ A public-key infrastructure is assumed: pkA, pkB are the public keys of A and B, respectively
◦ skA, skB are the signing keys of A and B, respectively
◦ text1, text2, text3 are publicly known messages

Write a model of this protocol in ProVerif.
Check whether the protocol correctly realizes mutual authentication between A and B; fix it if needed.
  free c.                  (* channel *)
  free A,B.                (* identifiers *)
  free text1,text2,text3.  (* publicly known messages *)

  fun sign/2.
  fun pk/1.
  private fun sk/1.
  reduc check(sign(x,sk(y)),pk(y)) = x.

  (* queries *)
  query evinj:end(x,y,z) ==> evinj:begin(x,y,z).
  query evinj:end2(x,y,z) ==> evinj:begin2(x,y,z).
  let initiatorB =
    new nb;
    out(c,(B,nb,text1));                       (* B → A : nb; text1 *)
    in(c,x);
    let ((pka,=A),mess1) = x in
    let (na,=nb,=text2) = check(mess1,pka) in  (* verify A's signature *)
    event end(A,B,nb);
    event begin2(B,A,na);
    out(c,(pk(B),B,sign((na,text3),sk(B)))).   (* B → A : pkB; B; {na; text3}skB *)
  let responderA =
    in(c,x);
    let (B,nb,=text1) = x in
    event begin(A,B,nb);
    new na;
    out(c,((pk(A),A),sign((na,nb,text2),sk(A))));  (* A → B : pkA; A; {na; nb; text2}skA *)
    in(c,(pkb,=B,mess1));
    let (=na,=text3) = check(mess1,pkb) in         (* verify B's signature *)
    event end2(B,A,na).

  process !responderA | !initiatorB

Mutual authentication? NO! The attacker can impersonate the users by cheating on the keys.
How to fix it: certificates (pkID, ID) should be signed by a TTP, and identities should be checked.
Today’s class is based on a paper by Steve Kremer and Mark Ryan:
◦ “Analysis of an Electronic Voting Protocol in the Applied Pi Calculus”
Analysis of an e-voting protocol by Fujioka, Okamoto and Ohta, known as FOO92
◦ partly carried out using ProVerif
◦ where ProVerif is not powerful enough, by hand proofs
Fairness
◦ No early results can be obtained (to avoid influencing the remaining voters)
Eligibility
◦ Only legitimate voters can vote, and only once
Privacy
◦ The association of a voter with her vote is not revealed to anyone
Individual verifiability
◦ A voter can verify that her vote was really counted
Universal verifiability
◦ A voter can verify that the published outcome really is the sum of all votes
Receipt-freeness
◦ A voter cannot prove that she voted in a certain way (to protect voters from coercion)
… in the presence of corrupt election authorities
Eligibility
◦ Only legitimate voters can vote, and only once
Individual verifiability
◦ A voter can verify that her vote was really counted
Privacy
◦ The association of a voter with her vote is not revealed to anyone
Receipt-freeness
◦ A voter cannot prove that she voted in a certain way
Most of these security goals look different from the ones studied so far, but they can be expressed in terms of
◦ Secrecy: ProVerif supports reasoning about secrecy for direct flows
◦ Testing equivalence: ProVerif supports testing equivalence, but its reasoning is notably incomplete
Kremer and Ryan express FOO92 in the ProVerif input language.
Then they prove that it satisfies:
◦ Fairness (with ProVerif)
◦ Eligibility (with ProVerif)
◦ Privacy (by hand)
Voter
Administrator
◦ Checks if the voter is legitimate:
  the voter has the right to vote, and the voter has not voted already
Collector
◦ Collects and publishes the votes
Constructor
◦ commit/2.
Destructor
◦ open/2.
Reduction rule
◦ open(commit(v,r),r) = v.
From our abstract point of view, bit commitment is exactly like symmetric encryption:
◦ v is a vote
◦ r is a random key to encrypt the vote
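In ProVerif’s untyped input language these would be declared as follows (a minimal sketch in the same style as the other slides):

  fun commit/2.                    (* constructor: commit the vote v under the key r *)
  reduc open(commit(v,r),r) = v.   (* destructor: opening with the right key recovers the vote *)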
Constructors
◦ enc/1.
◦ dec/1.
◦ sign/2.
Destructor
◦ checksign/2.
Reduction rule
◦ checksign(sign(m,enc(kp)),dec(kp)) = m.
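As untyped ProVerif declarations this reads (a sketch; following the slide, kp stands for a key-pair seed, enc(kp) for the signing key and dec(kp) for the verification key):

  fun enc/1.
  fun dec/1.
  fun sign/2.
  reduc checksign(sign(m,enc(kp)),dec(kp)) = m.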
Constructor
◦ blind/2.
Destructor
◦ unblind/2.
Reduction rules (where b is the blinding factor)
◦ unblind(blind(m,b),b) = m.
  (like symmetric encryption)
◦ unblind(sign(blind(m,b),sk),b) = sign(m,sk).
In order to legitimize a vote, an administrator signs the vote that has been blinded by a voter.
The voter can unblind it while preserving the administrator’s signature, before forwarding it to the vote collector.
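In the untyped ProVerif syntax, a destructor with two rewrite rules is declared as a single reduc whose rules are separated by a semicolon (a sketch; skey is an illustrative variable name for the signing key):

  fun blind/2.
  fun sign/2.
  reduc unblind(blind(m,b),b) = m;
        unblind(sign(blind(m,b),skey),b) = sign(m,skey).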
An observer who sees a message on a channel must not be able to tell the origin and the destination of the message.
That is exactly how channels are modeled in the spi-calculus (and in ProVerif).
Implementing anonymous channels in reality is problematic, but there are some solutions, like MIX-nets and onion routing.
Three consecutive phases:
◦ Legitimization phase: the administrator legitimizes the votes
◦ Voting phase: the voters send their votes to the collector
◦ Opening phase: the collector publishes the votes
The end of each phase is a global synchronization point:
◦ the next phase does not start before the previous phase has ended
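In ProVerif, such synchronization points are written with the phase construct; schematically (a sketch, with c a free channel and fresh names standing in for the real messages):

  free c.
  let p =
    new a; out(c,a);   (* implicit phase 0: legitimization *)
    phase 1;
    new b; out(c,b);   (* phase 1: voting *)
    phase 2;
    new d; out(c,d).   (* phase 2: opening *)
  process !p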
Voter V selects a vote v and computes the commitment x of v using a random key r
◦ x = commit(v,r)
V computes the message e using a blinding function and a random blinding factor b
◦ e = blind(x,b) = blind(commit(v,r),b)
V digitally signs e and sends the signature to the administrator A together with her identity
◦ V → A: V, sign(e,sv) = V, sign(blind(commit(v,r),b),sv)
A verifies that
◦ V has the right to vote
◦ V has not voted yet
◦ the signature is valid
If so, A sends her digital signature to V
◦ A → V: sign(blind(commit(v,r),b),sa)
V unblinds the message, obtaining y = sign(commit(v,r),sa)
V sends y, A’s signature on the commitment to V’s vote, to the collector C using an anonymous channel
◦ V → C: y = sign(x,sa) = sign(commit(v,r),sa)
C checks the correctness of the signature y and, if the test succeeds, enters (l,x,y) onto a list as the l-th item
Once the collector C has received all votes, the voters reveal their random keys r so that C can open the votes and publish them:
◦ C publishes the list of votes (li, xi, yi)
◦ V verifies that her commitment is in the list and sends l, r to C via an anonymous channel (V → C: l, r)
◦ C opens the l-th ballot using the random key r and publishes the vote v
Which security goal would be violated if blinding was omitted?
◦ Privacy: the first message, without blinding, would create an observable link between vote and voter (once the voter has revealed the random key r)
Which security goal would be violated if bit commitment was not collision-free, i.e., commit(v,r) = commit(v',r') for some (v,r) ≠ (v',r')?
◦ Fairness: the voter could change her vote after the voting phase (by publishing r' instead of r)
  let voter =
    new r;
    new b;
    let bcv = blind(commit(v,r),b) in
    out(net,(V,sign(bcv,sv)));            (* V → A: identity and signed blinded commitment *)
    in(net,lbcv);
    let (=bcv) = checksign(lbcv,pka) in   (* verify A's signature on the blinded commitment *)
    let lcv = unblind(lbcv,b) in
    phase 1;
    out(net,lcv);                         (* V → C: signed commitment (anonymous channel) *)
    in(net,(l,=lcv));
    phase 2;
    out(net,(l,r)).                       (* V → C: reveal the random key *)

The vote v is:
◦ a free variable of the system (votes are guessable)
◦ not a new-generated name (new names model unguessable data)
ProVerif’s phase construct specifies global synchronization points.
  let administrator =
    in(privCh,(V,pkv));                (* voter identity and public key, received on a private channel *)
    in(net,(=V,sbcv));
    let bcv = checksign(sbcv,pkv) in   (* verify the voter's signature *)
    out(net,sign(bcv,ska)).            (* return the administrator's signature *)

In Kremer and Ryan’s model, the administrator does not check for duplicate votes.
  let collector =
    phase 1;
    in(net,lcv);
    new l;
    out(net,(l,lcv));                        (* publish the list entry (l, signed commitment) *)
    phase 2;
    in(net,(=l,r));
    let v = open(checksign(lcv,pka),r) in    (* check A's signature and open the commitment *)
    out(net,v).

Remark: there is a small discrepancy
◦ Informal description: the collector checks the administrator’s signature in the voting phase
◦ ProVerif model: the collector checks the administrator’s signature in the opening phase
Fairness ensures that no early results can be obtained.
Kremer and Ryan verify fairness as a secrecy property:
◦ It should be impossible for an attacker to learn a vote before the opening phase (phase 2)
◦ Strong secrecy of the votes up to the end of the voting phase
ProVerif successfully proves that FOO92 guarantees that (in this model) an intruder cannot obtain the votes or learn any information about them before the voting phase ends.
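In ProVerif, strong secrecy of a name is checked with the noninterf declaration; a sketch in the untyped syntax, assuming the vote v is declared as a private free name:

  private free v.
  noninterf v.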
Eligibility verifies that
◦ only legitimate voters can vote…
  The attacker has a challengeVote (a global name).
  The attacker is illegitimate (he does not have a valid signing key).
  Kremer and Ryan modify the collector process: it publishes a fresh name attack if and only if it receives the challengeVote.
  Then attack becomes public if and only if the collector receives the challengeVote from the attacker.
  In order to verify that the challengeVote from the attacker never reaches the collector, it suffices to show that attack remains secret: the problem is reduced to secrecy.
◦ … and only once
  This cannot be verified in the model, because all voters share the same key.
  let collector =
    phase 1;
    in(net,lcv);
    new l;
    out(net,(l,lcv));
    phase 2;
    in(net,(=l,r));
    let v = open(checksign(lcv,pka),r) in
    new attack;
    if v = challengeVote then out(net,attack) else out(net,v).

ProVerif succeeds in verifying the standard secrecy of attack; therefore FOO92 guarantees eligibility.
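The corresponding secrecy query might be set up as follows (a sketch in the untyped syntax; here attack is declared as a private free name instead of being new-generated inside the collector, so that the standard attacker query applies):

  free challengeVote.
  private free attack.
  query attacker:attack.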
Privacy aims to guarantee that the association of a voter with her vote is not revealed to anyone.
◦ We need to suppose that at least two voters are honest (if there is only one honest voter, then privacy can never be guaranteed)
  Voter V1 – vote1
  Voter V2 – vote2
◦ Privacy: P[vote1/v1, vote2/v2] ≈ P[vote2/v1, vote1/v2]
ProVerif fails on this, because its reasoning about testing equivalence is incomplete (proof by hand).
The phase separator between the legitimization (1) and voting (2) phases is crucial for privacy.
Without the separator, the following attack on privacy would be possible:
◦ The attacker blocks all messages coming from voters other than V until he has seen on the network two messages that are signed by A
◦ The attacker knows that the second of these A-signed messages contains V’s committed vote (unblinded!)
◦ Once V publishes her random key r, the attacker can open V’s committed vote (knowing that this is V’s vote)
For their hand proof of privacy for FOO92, Kremer and Ryan make use of a powerful proof method for testing equivalence, called labeled bisimilarity
◦ Abadi, Fournet: “Mobile Values, New Names, and Secure Communication”
Abadi and Fournet’s article is based on the applied pi calculus.

ProVerif:
1. distinguishes between constructors and destructors
2. reduction rules
3. the ProVerif language is restricted to enable automatic analysis
4. ProVerif actually allows certain equations, too (keyword: equation), but internally translates these to reduction rules

Applied pi calculus:
1. no distinction between constructors and destructors
2. equations
3. more general than the ProVerif language
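For instance, the untyped ProVerif syntax for an equation, shown here on the classic Diffie-Hellman commutation law (an illustrative sketch, not taken from the slides):

  fun exp/2.
  fun g/0.
  equation exp(exp(g,x),y) = exp(exp(g,y),x).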
The trouble with testing equivalence is that, by definition, proving P ≃ Q requires quantifying over all possible contexts.
Labeled bisimilarity is a relation that is contained in testing equivalence and whose definition avoids this infinite quantification.
Abadi and Fournet enrich the syntax domain of processes with active substitutions {M/x}.
They enrich the operational semantics with:
◦ a labeled reduction rule that allows to reduce outputs without a matching input:
  new u; (Q | out c M; P)  --[new x. out c x]-->  new u; (Q | P | {M/x})
  where x is not a free variable in P or Q
◦ a rule that allows to reduce inputs without a matching output:
  new u; (Q | inp c x; P)  --[inp c M]-->  new u; (Q | {M/x}P)
  where (fv(M) ∪ fn(M)) ∩ u = ∅
A frame is a process that is built up from stop and active substitutions, using parallel composition and new-generation:
  ψ ::= stop | {M/x} | (ψ | ψ) | new n; ψ
We let ψ range over frames and σ over substitutions.
Every enriched process P can be mapped to a frame ψ(P) by replacing by stop all subprocesses that are not active substitutions, parallel compositions or new-generations.
Example:
  ψ(new c; new d; (inp c x; P | {M/x} | out d N; Q))
  = new c; new d; (stop | {M/x} | stop)
  = new c; new d; ({M/x})
(M = N)ψ iff ∃n,σ such that
◦ ψ = new n; σ
◦ σM = σN
◦ n ∩ (fn(M) ∪ fn(N)) = ∅
Example: assume fun f/1, fun g/1, and no equations
◦ ψ0 = new k; new s; {k/x, s/y}
◦ ψ1 = new k; new s; {f(k)/x, g(k)/y}
◦ ψ2 = new k; new s; {k/x, f(k)/y}
Then (f(x)=y)ψ2 holds, but neither (f(x)=y)ψ1 nor (f(x)=y)ψ0 does.
Static equivalence of frames:
◦ φ ≈s ψ iff dom(φ) = dom(ψ) and (∀M,N)((M=N)φ ⇔ (M=N)ψ)
Example: ψ0 ≉s ψ2, ψ1 ≉s ψ2, ψ0 ≈s ψ1
Static process equivalence: P ≈s Q iff ψ(P) ≈s ψ(Q)
Depending on the underlying equational theory, static process equivalence can be quite hard to show, but at least it does not depend on the dynamics of processes.
Labeled bisimilarity ≈l is an equivalence relation on processes.
In order to prove that P ≈l Q, one needs to prove the existence of a relation R with P R Q such that:
◦ P ≈s Q
◦ if P → P' (internal step), then (∃Q')(Q →* Q' and P' R Q')
◦ if P --[α]--> P' (labeled step), then (∃Q')(Q →* --[α]--> →* Q' and P' R Q')
Furthermore, one needs to prove these three statements with the roles of P and Q reversed.
Note that one has to apply these rules iteratively, because R occurs in the 2nd and 3rd rule.
Technically, ≈l is defined as the largest symmetric binary relation on processes that satisfies the three rules.
Theorem: If P ≈l Q, then P ≃ Q.
This theorem tells us that for proving testing equivalence P ≃ Q, it suffices to prove labeled bisimilarity P ≈l Q.
Proving labeled bisimilarity is often simpler than proving testing equivalence directly.
In order to show privacy for FOO92, Kremer and Ryan show the following labeled bisimilarity:
  P[vote1/v1, vote2/v2] ≈l P[vote2/v1, vote1/v2]
Consider the following protocol between two computers, a server S and a client C:

  S → C: hello; timestamp
  C: begin(C,S,timestamp)
  C → S: timestamp; hash(kCS)
  S: end(C,S,timestamp)

◦ kCS is a long-term key shared between S and C
◦ timestamp may be considered as a nonce

Write a model of this protocol in ProVerif.
Check whether the protocol correctly realizes the authentication of C to S; fix it if needed.
Check whether the protocol preserves the secrecy of kCS. What about weak secrecy?
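A possible starting point, modeling the protocol exactly as stated above, in the same untyped ProVerif style as the earlier slides (a sketch with illustrative names; the last declaration uses ProVerif’s weaksecret keyword, which checks resistance to offline guessing):

  free c.            (* public channel *)
  free hello, C, S.  (* public constant and identifiers *)
  private free kCS.  (* long-term key shared by S and C *)
  fun hash/1.

  query attacker:kCS.
  query evinj:end(x,y,z) ==> evinj:begin(x,y,z).
  weaksecret kCS.

  let server =
    new timestamp;
    out(c,(hello,timestamp));
    in(c,(=timestamp,h));
    if h = hash(kCS) then event end(C,S,timestamp).

  let client =
    in(c,(=hello,ts));
    event begin(C,S,ts);
    out(c,(ts,hash(kCS))).

  process !server | !client

Whether the injective correspondence holds for this model is exactly what the exercise asks you to check (and, if needed, fix).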