Anonymity and Probabilistic Model Checking CS 259 John Mitchell 2008

Course schedule
- Lectures: probabilistic model checking and other tools (with examples)
- Homework 2: posted last week, due Tues Feb 12; simple exercises using a probabilistic tool
- Projects
  - Presentation 2: Feb 19, 21 – describe your tool or approach and the properties you will check
  - Presentation 3: Mar 4 – 13 (two or three meetings) – final results: turn in slides and tool input

Dining Cryptographers
- A clever idea for making a message public in a perfectly untraceable manner
  - David Chaum. "The dining cryptographers problem: unconditional sender and recipient untraceability." Journal of Cryptology, 1988.
- Guarantees information-theoretic anonymity for message senders
  - This is an unusually strong form of security: it defeats an adversary with unlimited computational power
- Impractical: requires a huge amount of randomness
  - In a group of size N, N random bits are needed to send 1 bit

Three-Person DC Protocol
Three cryptographers are having dinner. Either NSA is paying for the dinner, or one of them is paying but wishes to remain anonymous.
1. Each diner flips a coin and shows it to his left neighbor. Every diner thus sees two coins: his own and his right neighbor's.
2. Each diner announces whether the two coins are the same. If he is the payer, he lies (says the opposite).
3. Odd number of "same" ⇒ NSA is paying; even number of "same" ⇒ one of them is paying.
But a non-payer cannot tell which of the other two is paying!

Non-Payer's View: Same Coins
[Figure: the non-payer's two visible coins are the same; in the two possible scenarios the other diners announce "same"/"different", with either one being the payer.]
Without knowing the coin toss between the other two, the non-payer cannot tell which of them is lying.

Non-Payer's View: Different Coins
[Figure: the non-payer's two visible coins differ; again two scenarios are possible, with either of the other diners as the payer.]
Without knowing the coin toss between the other two, the non-payer cannot tell which of them is lying.

Superposed Sending
- This idea generalizes to any group of size N
- For each bit of the message, every user generates 1 random bit and sends it to 1 neighbor
  - Every user learns 2 bits (his own and his neighbor's)
- Each user announces (own bit XOR neighbor's bit)
- The sender announces (own bit XOR neighbor's bit XOR message bit)
- XOR of all announcements = message bit
  - Every randomly generated bit occurs in this sum twice (and is canceled by the XOR); the message bit occurs once
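
A minimal simulation sketch of superposed sending (not from the slides; the ring topology and variable names are illustrative) showing that the XOR of all announcements recovers the message bit:

# Sketch: N-party superposed sending for one message bit.
# Each adjacent pair of users shares one random bit; every user announces the
# XOR of the two bits it sees, and the sender additionally XORs in the message bit.
import random

def dc_round(n_users: int, sender: int, message_bit: int) -> int:
    # One shared random bit per edge of the ring (user i shares with user i+1 mod n).
    shared = [random.randint(0, 1) for _ in range(n_users)]
    announcements = []
    for i in range(n_users):
        own, neighbor = shared[i], shared[(i - 1) % n_users]
        bit = own ^ neighbor
        if i == sender:
            bit ^= message_bit          # the sender flips its announcement by the message bit
        announcements.append(bit)
    # XOR of all announcements: every shared bit appears twice and cancels,
    # leaving only the message bit.
    result = 0
    for a in announcements:
        result ^= a
    return result

assert all(dc_round(5, sender=2, message_bit=b) == b for b in (0, 1))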

DC-Based Anonymity is Impractical
- Requires secure pairwise channels between group members
  - Otherwise, random bits cannot be shared
- Requires massive communication overhead and large amounts of randomness
- A DC-net (a group of dining cryptographers) is robust even if some members cooperate
  - Guarantees perfect anonymity for the other members
- A great protocol to analyze
  - Difficult to reason about each member's knowledge

Definitions of Anonymity
- "Anonymity is the state of being not identifiable within a set of subjects."
  - There is no such thing as absolute anonymity
- Unlinkability of action and identity
  - E.g., a sender and his email are no more related within the system than they are related in a priori knowledge
- Unobservability
  - Any item of interest (message, event, action) is indistinguishable from any other item of interest
- "Anonymity is bullshit" - Joan Feigenbaum

Anonymity and Knowledge
- Anonymity deals with hiding information
  - The user's identity is hidden
  - The relationship between users is hidden
  - A user cannot be identified within a set of suspects
- The natural way to express anonymity is to state what the adversary should not know
  - A good application for a logic of knowledge
  - Not supported by conventional formalisms for security (process calculi, I/O automata, …)
- To determine whether anonymity holds, we need some representation of knowledge

k-Anonymity
- Basic idea
  - Someone robbed the bank
  - Detectives know that it is one of k people
- Advantage
  - Does not involve probability
- Disadvantages
  - Does not involve probability
  - Depends on the absence of additional information

Data Anonymity
- Problem: de-identifying data does not necessarily make it anonymous; it can often be re-identified.
[Figure (source: Latanya Sweeney): "Medical Data" (ethnicity, visit date, diagnosis, procedure, medication, total bill) and public "Voter Lists" (name, address, date registered, party, date last voted) overlap on the quasi-identifiers ZIP, birth date, and sex.]

- Date of birth, gender, and 5-digit ZIP uniquely identify 87.1% of the U.S. population.
- One ZIP code has 112,167 people; there, 11% (not 0%) are uniquely identified, because an insufficient number of residents over 55 live there.
[Figure (source: Latanya Sweeney): map in which each cell is one ZIP code.]
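
A toy sketch of the linkage attack described above (all records, names, and values are made up for illustration): de-identified medical rows are re-identified by joining on the quasi-identifier (ZIP, birth date, sex) shared with a public voter list.

# Hypothetical toy data; the point is the join on quasi-identifiers.
medical = [  # de-identified: no name, but quasi-identifiers remain
    {"zip": "02139", "dob": "1961-07-31", "sex": "F", "diagnosis": "hypertension"},
]
voters = [   # public record with names
    {"name": "Jane Doe", "zip": "02139", "dob": "1961-07-31", "sex": "F"},
]

def link(medical, voters):
    key = lambda r: (r["zip"], r["dob"], r["sex"])
    by_key = {}
    for v in voters:
        by_key.setdefault(key(v), []).append(v)
    for m in medical:
        matches = by_key.get(key(m), [])
        if len(matches) == 1:           # unique match => re-identified
            yield matches[0]["name"], m["diagnosis"]

print(list(link(medical, voters)))      # [('Jane Doe', 'hypertension')]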

Anonymity via Random Routing
- Hide the message source by routing it randomly
  - Popular techniques: Crowds, Freenet, Onion Routing
- Routers don't know for sure whether the apparent source of a message is the true sender or just another router
- Only secure against local attackers!

Onion Routing [Reed, Syverson, Goldschlag '97]
[Figure: Alice's message travels through routers R1, R2, R3, R4 (among many others) to Bob.]
- The sender chooses a random sequence of routers
  - Some routers are honest, some hostile
  - The sender controls the length of the path
  - Similar to a MIX cascade
- Goal: hostile routers shouldn't learn that Alice is talking to Bob

The Onion
[Figure: Alice sends the onion along the path R1 -> R2 -> R3 -> R4 -> Bob.]
- R1 receives: {R2, k1}pk(R1), { ... }k1
- R2 receives: {R3, k2}pk(R2), { ... }k2
- R3 receives: {R4, k3}pk(R3), { ... }k3
- R4 receives: {B, k4}pk(R4), { ... }k4
- Bob receives: {M}pk(B)
- Routing info for each link is encrypted with that router's public key
- Each router learns only the identity of the next router
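
A structural sketch (no real cryptography; function and key names are hypothetical) of how the sender could build the nested onion it hands to R1. Here enc(k, m) just tags the payload with the key name so the layering is visible; a real implementation would use public-key and symmetric encryption.

def enc(key: str, payload) -> tuple:
    # Placeholder "encryption": tag the payload with the key it is encrypted under.
    return ("enc", key, payload)

def build_onion(routers, session_keys, recipient, message):
    # Innermost layer: the message encrypted for the recipient.
    onion = enc(f"pk({recipient})", message)
    # Wrap outward: each layer tells router i the next hop and its session key,
    # and carries the rest of the onion under that session key.
    hops = list(zip(routers, session_keys)) + [(recipient, None)]
    for i in range(len(routers) - 1, -1, -1):
        r, k = routers[i], session_keys[i]
        next_hop = hops[i + 1][0]
        onion = (enc(f"pk({r})", (next_hop, k)), enc(k, onion))
    return onion

onion = build_onion(["R1", "R2", "R3", "R4"], ["k1", "k2", "k3", "k4"], "Bob", "M")
# R1 can open only its outer layer, learning (R2, k1) and the blob to forward to R2.
print(onion[0])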

Crowds System [Reiter, Rubin '98]
[Figure: sender C0 forwards through crowd members C1…C4; at each hop the message is forwarded with probability p_f or delivered to the recipient with probability 1-p_f.]
- Routers form a random path when establishing a connection
  - In onion routing, the random path is chosen in advance by the sender
- After receiving a message, an honest router flips a biased coin
  - With probability p_f it randomly selects the next router and forwards the message
  - With probability 1-p_f it sends the message directly to the recipient
- Messages are encrypted with shared symmetric keys
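
A sketch of Crowds path construction (parameters and member count are illustrative): after the first hop, each honest member forwards to a random crowd member with probability p_f and delivers to the recipient with probability 1 - p_f, so path length is geometrically distributed.

import random

def build_path(n_members: int, sender: int, p_f: float = 0.8):
    path = [sender]
    current = random.randrange(n_members)   # the sender always forwards to a random member first
    path.append(current)
    while random.random() < p_f:            # forward again with probability p_f
        current = random.randrange(n_members)
        path.append(current)
    return path                             # the last member on the path delivers to the recipient

random.seed(0)
print(build_path(n_members=10, sender=0))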

Probabilistic Notions of Anonymity
- Beyond suspicion
  - The observed source of the message is no more likely to be the true sender than anybody else
- Probable innocence
  - The probability that the observed source of the message is the true sender is less than 50%
- Possible innocence
  - There is a non-trivial probability that the observed source of the message is not the true sender
- Probable innocence is guaranteed by Crowds if there are sufficiently many honest routers: N_good + N_bad ≥ p_f / (p_f - 0.5) · (N_bad + 1) (a worked example follows below)
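
A worked instance of the Crowds bound above, with illustrative values (p_f = 3/4 and 5 corrupt routers; not taken from the slides):

% Probable innocence in Crowds requires
%   N >= \frac{p_f}{p_f - 1/2}\,(N_{bad} + 1), where N = N_{good} + N_{bad}.
% Example with illustrative values p_f = 3/4 and N_{bad} = 5 corrupt routers:
\[
  N \;\ge\; \frac{3/4}{3/4 - 1/2}\,(5 + 1) \;=\; 3 \cdot 6 \;=\; 18,
\]
% so a crowd of at least 18 members suffices for probable innocence in this case.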

A Couple of Issues
- Is probable innocence enough?
  [Figure: one suspect has probability 49%; each of the others has about 1%.]
- Multiple-paths vulnerability
  - Can the attacker relate multiple paths from the same sender?
    - E.g., browsing the same website at the same time of day
  - Each new path gives the attacker a new observation
  - Paths can't be kept static, since members join and leave
- Maybe OK for "plausible deniability"

Probabilistic Model Checking
- Participants are finite-state machines
  - Same as in Murφ
- State transitions are probabilistic
  - Transitions in Murφ are nondeterministic
- Standard intruder model
  - Same as in Murφ: model cryptography with abstract data types
- Murφ question: is a bad state reachable?
- Probabilistic model checking question: what is the probability of reaching a bad state?

Discrete-Time Markov Chains
A discrete-time Markov chain is a tuple (S, s_0, T, L) where
- S is a finite set of states
- s_0 ∈ S is an initial state
- T: S × S → [0,1] is the transition relation, with Σ_{s'∈S} T(s,s') = 1 for every s ∈ S
- L is a labeling function

Markov Chain: Simple Example
[Figure: chain with initial state s_0 = A; A -> B (0.8), A -> C (0.2), B -> C (0.1), B -> D (0.9), C -> D (0.5), C -> E (0.5).]
- Probabilities of outgoing transitions sum up to 1.0 for every state
- Probability of reaching E from s_0 is 0.2·0.5 + 0.8·0.1·0.5 = 0.14
- The chain has infinite paths if the state graph has loops
  - Need to solve a system of linear equations to compute probabilities
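
A sketch of the linear-equation computation for this example (the transition matrix follows the chain as reconstructed above): the reachability probabilities satisfy x_s = Σ_s' T(s,s')·x_s' with x_E = 1 and x_D = 0.

import numpy as np

T = np.array([                  # transition matrix matching the example chain
    [0.0, 0.8, 0.2, 0.0, 0.0],  # A -> B (0.8), A -> C (0.2)
    [0.0, 0.0, 0.1, 0.9, 0.0],  # B -> C (0.1), B -> D (0.9)
    [0.0, 0.0, 0.0, 0.5, 0.5],  # C -> D (0.5), C -> E (0.5)
    [0.0, 0.0, 0.0, 1.0, 0.0],  # D absorbing
    [0.0, 0.0, 0.0, 0.0, 1.0],  # E absorbing
])

# Unknowns: x_A, x_B, x_C.  Solve (I - T_restricted) x = b, where b is the
# one-step probability of reaching E from each transient state.
idx = [0, 1, 2]                 # transient states A, B, C
T_sub = T[np.ix_(idx, idx)]
b = T[idx, 4]
x = np.linalg.solve(np.eye(3) - T_sub, b)
print(dict(zip("ABC", x)))      # {'A': 0.14, 'B': 0.05, 'C': 0.5}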

PRISM [Kwiatkowska et al., U. of Birmingham]
- Probabilistic model checker
- System specified as a Markov chain
  - Parties are finite-state machines with local variables
  - State transitions are associated with probabilities
    - Can also have nondeterminism (Markov decision processes)
  - All parameters must be finite
- Correctness condition specified as a PCTL formula
- Computes probabilities for each reachable state
  - Enumerates reachable states
  - Solves a system of linear equations to find the probabilities

PRISM Syntax
[Figure: the same example chain, with states A…E represented as state values 1…5.]
module Simple
  state: [1..5] init 1;
  [] state=1 -> 0.8: (state'=2) + 0.2: (state'=3);
  [] state=2 -> 0.1: (state'=3) + 0.9: (state'=4);
  [] state=3 -> 0.5: (state'=4) + 0.5: (state'=5);
endmodule
Reading the last command: IF state=3 THEN with prob. 50% assign 4 to state, with prob. 50% assign 5 to state.

Modeling Crowds with PRISM
- Model probabilistic path construction
- Each state of the model corresponds to a particular stage of path construction
  - 1 router chosen, 2 routers chosen, …
- Three probabilistic transitions
  - An honest router chooses the next router with probability p_f and terminates the path with probability 1-p_f
  - The next router is chosen probabilistically from N candidates
  - The chosen router is hostile with a certain probability
- Run the path construction protocol several times and look at the accumulated observations of the intruder

PRISM: Path Construction in Crowds
module crowds
  ...
  // N = total # of routers, C = # of corrupt routers
  // badC = C/N, goodC = 1-badC
  // Next router is corrupt with a certain probability
  [] (!good & !bad) -> goodC: (good'=true) & (revealAppSender'=true)
                     + badC:  (badObserve'=true);
  // Forward with probability PF, else deliver
  [] (good & !deliver) -> PF:    (pIndex'=pIndex+1) & (forward'=true)
                        + notPF: (deliver'=true);
  ...
endmodule

PRISM: Intruder Model
module crowds
  ...
  // Record the apparent sender and deliver
  [] (badObserve & appSender=0)  -> (observe0'=observe0+1) & (deliver'=true);
  ...
  [] (badObserve & appSender=15) -> (observe15'=observe15+1) & (deliver'=true);
  ...
endmodule
- For each observed path, bad routers record the apparent sender
- Bad routers collaborate, so treat them as a single attacker
- No cryptography, only probabilistic inference

PCTL Logic [Hansson, Jonsson '94]
- Probabilistic Computation Tree Logic
- Used for reasoning about probabilistic temporal properties of probabilistic finite state spaces
- Can express properties of the form "under any scheduling of processes, the probability that event E occurs is at least p"
  - By contrast, Murφ can express only properties of the form "does event E ever occur?"

PCTL Syntax
- State formulas: first-order propositions over a single state
  φ ::= True | a | φ ∧ φ | φ ∨ φ | ¬φ | P_{>p}[ψ]
  - a is a predicate over state variables (just like a Murφ invariant)
  - P_{>p}[ψ]: the path formula ψ holds with probability > p
- Path formulas: properties of chains of states
  ψ ::= X φ | φ U^{≤k} φ | φ U φ
  - In φ1 U φ2, the first state formula holds for every state in the chain until the second becomes true

PCTL: State Formulas
- A state formula is a first-order state predicate
  - Just like in non-probabilistic logic
[Figure: a chain of states labeled with values of x and y; the formula φ = (y>1) | (x=1) is True in some states and False in others.]

PCTL: Path Formulas
- A path formula is a temporal property of a chain of states
  - φ1 U φ2 = "φ1 is true until φ2 becomes and stays true"
[Figure: a chain of states labeled with values of x and y; the formula ψ = (y>0) U (x>y) holds for this chain.]

PCTL: Probabilistic State Formulas
- Specify that a certain predicate or path formula holds with probability no less than some bound
[Figure: the same chain; the formula φ = P_{>0.5}[(y>0) U (x=2)] is True in some states and False in others.]

Intruder Model Redux
module crowds
  ...
  // Record the apparent sender and deliver
  [] (badObserve & appSender=0)  -> (observe0'=observe0+1) & (deliver'=true);
  ...
  [] (badObserve & appSender=15) -> (observe15'=observe15+1) & (deliver'=true);
  ...
endmodule
Every time a hostile crowd member receives a message from some honest member, he records his observation (increasing the count for that honest member).

Negation of Probable Innocence
  launch -> [true U (observe0 > observe1) & done] > 0.5
  ...
  launch -> [true U (observe0 > observe9) & done] > 0.5
"The probability of reaching a state in which hostile crowd members have completed their observations and have observed the true sender (crowd member #0) more often than any of the other crowd members (#1 … #9) is greater than 0.5."

Analyzing Multiple Paths with PRISM
- Use PRISM to automatically compute interesting probabilities for chosen finite configurations
- K_i = how many times crowd member i was recorded as the apparent sender
- "Positive": P(K_0 > 1)
  - Observing the true sender more than once
- "False positive": P(K_{i≠0} > 1)
  - Observing a wrong crowd member more than once
- "Confidence": P(K_{i≠0} ≤ 1 | K_0 > 1)
  - Observing only the true sender more than once
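
A Monte Carlo sketch of the same multiple-paths observation (configuration parameters are illustrative, and this estimates rather than exactly computes the probability that PRISM would return): build several Crowds paths from the same sender and count how often corrupt routers record the true sender as the apparent sender more than once.

import random

def simulate(n_good=5, n_bad=1, p_f=0.8, n_paths=5, trials=20000):
    n = n_good + n_bad
    hits = 0                                 # runs in which the true sender (member 0) is seen > once
    for _ in range(trials):
        counts = [0] * n_good
        for _ in range(n_paths):
            prev = 0                         # the true sender starts the path
            nxt = random.randrange(n)
            while True:
                if nxt >= n_good:            # corrupt router: records its predecessor as apparent sender
                    counts[prev] += 1
                    break
                if random.random() >= p_f:   # honest router delivers; nothing is observed
                    break
                prev, nxt = nxt, random.randrange(n)
        if counts[0] > 1:
            hits += 1
    return hits / trials

print(simulate())   # estimate of P(K_0 > 1) for this toy configuration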

Size of State Space
All hostile routers are treated as a single router, selected with probability 1/6.

Sender Detection (Multiple Paths)
(In the analyzed configurations, 1/6 of routers are hostile.)
- All configurations satisfy probable innocence
- The probability of observing the true sender increases with the number of paths observed
- … but decreases as the crowd size grows
- Is this an attack?
- Reiter & Rubin: absolutely not
- But…
  - Can't avoid building new paths
  - Hard to prevent the attacker from correlating same-sender paths

Attacker's Confidence
(In the analyzed configurations, 1/6 of routers are hostile.)
- "Confidence" = probability of detecting only the true sender
- Confidence grows with crowd size
- Maybe this is not so strange
  - The true sender appears in every path; others appear only with small probability
  - Once the attacker sees somebody twice, he knows it's the true sender
- Is this an attack?
- Large crowds: lower probability of catching senders, but higher confidence that the caught user is the true sender
- But what about deniability?

Probabilistic Contract Signing
Slides borrowed from Vitaly Shmatikov

Probabilistic Fair Exchange
- Two parties exchange items of value
  - Signed commitments (contract signing)
  - Signed receipt for an email message (certified email)
  - Digital cash for digital goods (e-commerce)
- Important if the parties don't trust each other
  - Need assurance that if one does not get what it wants, the other doesn't get what it wants either
- Fairness is hard to achieve
  - Gradual release of verifiable commitments
  - Convertible, verifiable signature commitments
  - Probabilistic notions of fairness

Properties of Fair Exchange Protocols
- Fairness: at each step, the parties have approximately equal probabilities of obtaining what they want
- Optimism: if both parties are honest, then the exchange succeeds without involving a judge or trusted third party
- Timeliness: if something goes wrong, the honest party does not have to wait long to find out whether the exchange succeeded or not

Rabin's Beacon
- A "beacon" is a trusted party that publicly broadcasts a randomly chosen number between 1 and N every day
  - Michael Rabin. "Transaction protection by beacons." Journal of Computer and System Sciences, December 1983.
[Figure: beacon values broadcast on successive days Jan 27, Jan 28, Jan 29, Jan 30, Jan 31, Feb 1, …]

Contract
  CONTRACT(A, B, future date D, contract terms)
The exchange of commitments must be concluded by date D.

Rabin's Contract Signing Protocol
  CONTRACT(A, B, future date D, contract terms)
  A -> B: sig_A "I am committed if 1 is broadcast on day D"
  B -> A: sig_B "I am committed if 1 is broadcast on day D"
  ...
  A -> B: sig_A "I am committed if i is broadcast on day D"
  B -> A: sig_B "I am committed if i is broadcast on day D"
  ...
  A -> B: sig_A "I am committed if N is broadcast on day D"
  B -> A: sig_B "I am committed if N is broadcast on day D"
2N messages are exchanged if both parties are honest.

Probabilistic Fairness
- Suppose B stops after receiving A's i-th message
  - B has sig_A "committed if 1 is broadcast", sig_A "committed if 2 is broadcast", …, sig_A "committed if i is broadcast"
  - A has sig_B "committed if 1 is broadcast", …, sig_B "committed if i-1 is broadcast"
- … and the beacon broadcasts number b on day D
  - If b < i, then both A and B are committed
  - If b > i, then neither A nor B is committed
  - If b = i, then only A is committed
    - This happens only with probability 1/N
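
A small simulation sketch of the case analysis above (N and i are illustrative): if B stops after receiving A's i-th message, only the beacon value b = i, one value out of N, leaves A committed while B is not.

import random

def unfair_fraction(N=100, i=37, trials=200_000):
    unfair = 0
    for _ in range(trials):
        b = random.randint(1, N)          # beacon broadcasts b on day D
        a_committed = (b <= i)            # A has signed commitments for values 1..i
        b_committed = (b <= i - 1)        # B has signed commitments for values 1..i-1
        if a_committed != b_committed:
            unfair += 1
    return unfair / trials

print(unfair_fraction())                  # ~ 1/N = 0.01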

Properties of Rabin's Protocol
- Fair
  - The difference between A's probability of obtaining B's commitment and B's probability of obtaining A's commitment is at most 1/N
  - But the communication overhead is 2N messages
- Not optimistic
  - Needs input from the third party in every transaction
  - The same input serves all transactions on a given day and is sent out as a one-way broadcast, so maybe this is not so bad!
- Not timely
  - If one of the parties stops communicating, the other does not learn the outcome until day D

BGMR Probabilistic Contract Signing [Ben-Or, Goldreich, Micali, Rivest '85-90]
- Doesn't need beacon input in every transaction
- Uses sig_A "I am committed with probability p_A" instead of sig_A "I am committed if i is broadcast on day D"
- Each party decides how much to increase the probability at each step
  - A receives sig_B "I am committed with probability p_B" from B
  - A sets p_A = min(1, α·p_B), where α > 1 is a parameter chosen by A
  - A sends sig_A "I am committed with probability p_A" to B
  - … the algorithm for B is symmetric
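
A sketch of the probability escalation (the multiplicative step follows the reconstruction above; the initial value, step sizes, and which party opens are illustrative assumptions): it counts how many messages are exchanged before both probabilities reach 1, showing that a smaller step means more messages.

def bgmr_rounds(p0=0.01, alpha_a=1.2, alpha_b=1.2):
    p_a, p_b, messages = 0.0, p0, 1      # one party opens with a small initial probability
    turn_a = True
    while p_a < 1.0 or p_b < 1.0:
        if turn_a:
            p_a = min(1.0, p_b * alpha_a)   # A raises its commitment probability
        else:
            p_b = min(1.0, p_a * alpha_b)   # B raises its commitment probability
        messages += 1
        turn_a = not turn_a
    return messages

print(bgmr_rounds())   # a smaller alpha => more messages before both sides reach 1.0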

BGMR Message Flow
  CONTRACT(A, B, future date D, contract terms)
[Figure: A and B alternate messages of the form sig_A/sig_B "I am committed with probability p", with the probability values (e.g., 0.12, …, 0.23, …) increasing at every step until both parties reach 1.00.]

Conflict Resolution
  CONTRACT(A, B, future date D, contract terms)
[Figure: A has sent sig_A messages with probabilities pA_1, pA_2, …; B has sent sig_B messages with probability pB_1, …; a party invokes the judge with the latest signed message it holds from the other party.]
- The judge waits until date D, then tosses a coin: a random value uniform in [0, 1]
- If the coin value is at most pB_1 (the probability in the evidence presented), the contract is binding; otherwise it is canceled
- The verdict ("Binding" or "Canceled") is the same for both parties

Judge
- Waits until date D to decide
- Announces the verdict to both parties
- Tosses the coin once for each contract
- Remembers previous coin tosses
  - Constant memory: use pseudo-random functions with a secret input to produce repeatable coin tosses for each contract (see the sketch below)
- Does not remember previous verdicts
  - The same coin toss combined with different evidence (a signed message with a different probability value) may result in a different verdict
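
A sketch of a "repeatable coin toss" of the kind described above: derive the judge's random value deterministically from a secret key and the contract identifier, so repeated invocations for the same contract see the same toss. HMAC stands in for the pseudo-random function; the key and contract identifier are hypothetical.

import hmac, hashlib

SECRET_KEY = b"judge-secret-key"   # hypothetical secret input to the PRF

def coin_toss(contract_id: str) -> float:
    digest = hmac.new(SECRET_KEY, contract_id.encode(), hashlib.sha256).digest()
    return int.from_bytes(digest[:8], "big") / 2**64   # value in [0, 1), same for repeated calls

rho = coin_toss("CONTRACT(A,B,2008-03-01,...)")
# Verdict: binding iff rho is at most the probability in the evidence presented to the judge.
print(rho, coin_toss("CONTRACT(A,B,2008-03-01,...)") == rho)   # same contract, same toss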

Privilege and Fairness
- Privilege
  - A party is privileged if it has the evidence to cause the judge to declare the contract binding
  - Intuition: the contract binds either both parties or neither; what matters is the ability to make the contract binding
- Fairness
  - At any step where Prob(B is privileged) > v, Prob(A is not privileged | B is privileged) < ε
  - Intuition: at each step, the parties should have comparable probabilities of causing the judge to declare the contract binding (privilege must be symmetric)

Properties of BGMR Protocol
- Fair
  - Privilege is almost symmetric at each step: if Prob(B is privileged) > p_A^0, then Prob(A is not privileged | B is privileged) < 1 - 1/α
- Optimistic
  - Two honest parties don't need to invoke a judge
- Not timely
  - The judge waits until day D to toss the coin
  - What if the judge tosses the coin and announces the verdict as soon as he is invoked?

Formal Model
- The protocol should ensure fairness given any possible behavior by a dishonest participant
  - Contact the judge although communication hasn't stopped
  - Contact the judge more than once
  - Delay messages from the judge to the honest participant
- Need nondeterminism
  - To model the dishonest participant's choice of actions
- Need probability
  - To model the judge's coin tosses
- The model is a Markov decision process

Constructing the Model
- Discretize the probability space of coin tosses (this defines the state space)
  - The coin takes any of N values with equal probability
- Fix each party's "probability step"
  - The rate of increase in the probability value contained in a party's messages determines how many messages are exchanged
- A state is unfair if privilege is asymmetric
  - Difference in evidence, not difference in commitments
- Compute the probability of reaching an unfair state for different values of the parties' probability steps
  - Using PRISM

Attack Strategy
- Dishonest B's probability of driving the protocol to an unfair state is maximized by this strategy:
  1. Contact the judge as soon as the first message from A arrives
  2. The judge tries to send the verdict to A (the verdict is probably negative, since A's message contains a low probability value)
  3. B delays the judge's verdicts sent to A
  4. B contacts the judge again with each new message from A until a positive verdict is obtained
- This strategy only works in the timely protocol
  - In the original protocol, the coin is not tossed and the verdict is not announced until day D
- There is a conflict between optimism and timeliness

Analysis Results
[Chart: probability of reaching a state where B is privileged and A is not, plotted against the increase in B's probability value at each step (a lower increase means more messages must be exchanged).]
For a higher probability of winning, dishonest B must exchange more messages with honest A.

Attacker's Tradeoff
[Chart: expected number of messages before an unfair state is reached, plotted against the probability of reaching a state where B is privileged and A is not.]
- Linear tradeoff for dishonest B between the probability of winning and the ability to delay the judge's messages to A
- Without complete control of the communication network, B may settle for a lower probability of winning

Summary
- Probabilistic contract signing is a good testbed for probabilistic model checking techniques
  - Standard formal analysis techniques are not applicable
  - Combination of nondeterminism and probability
  - Good for quantifying tradeoffs
- Probabilistic contract signing is subtle
  - Unfairness as asymmetric privilege
  - Optimism cannot be combined with timeliness, at least not in the obvious way