1 Introduction to Practical Cryptography: Forward Key Security, Zero Knowledge, Oblivious Transfer, Multi-Party Computation
2 Overview
Intended as an overview of specific areas; solutions/instances of some require background not covered in this class.
3 Agenda Forward Key Security Zero Knowledge Oblivious Transfer Multi-party Computation
4 Forward-Secure Encryption Schemes
An attacker who learns the key at time t should not be able to decrypt anything from prior to time t.
–Encryption algorithm gains a time input: E_sk(m) becomes E'_sk(m, t)
–Current key is a function of the prior key: k_t = F(sk, t) = G(k_{t-1}, t)
–Given sk, can compute the entire sequence of k_t's
–Ciphertext: c = E_{k_t}(m)
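The one-way key update above can be sketched with a hash chain. This is a minimal illustration, not the scheme from the slides: the update function G is instantiated here with SHA-256, and the initial key sk is a placeholder.

```python
import hashlib

def update_key(k: bytes, t: int) -> bytes:
    """Derive the key for period t from the previous period's key.
    One-way: given k_t, an attacker cannot recover k_{t-1}."""
    return hashlib.sha256(k + t.to_bytes(4, "big")).digest()

# Hypothetical initial key sk; a real scheme's G may differ.
sk = b"\x00" * 32
keys = []
k = sk
for t in range(1, 6):
    k = update_key(k, t)   # old key is overwritten (erased)
    keys.append(k)

# All five period keys are distinct
assert len(set(keys)) == 5
```

Because SHA-256 is one-way, compromising the key of period 5 reveals nothing about periods 1 through 4, which is exactly the forward-security property described above.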
5 Digital Signatures
Alice has a secret key; everyone else has the corresponding public key.
Alice can sign a message with her secret key; given a signature and a message, everyone can verify correctness using Alice's public key.
Desirable property: non-repudiation. If Alice signed a contract, she can't deny it later.
6 Digital Signature Schemes
Three algorithms: Key-Gen, Sign, Verify
–Key-Gen. Inputs: security parameter ("key size") k. Output: keys (PK, SK)
–Sign. Inputs: message M, secret key SK. Output: signature S
–Verify. Inputs: M, signature S, public key PK. Output: valid/invalid
Strong security notion: even given signatures on messages of its choice, the adversary cannot forge signatures on new messages.
7 Problem with Signatures
Can't tell when a signature was really generated
–Not even if you include the current time in the document; that doesn't say when it was signed
Therefore, if the signing key is disclosed, all signatures become worthless (even those produced before the disclosure)
Inconvenience: your notarized document is no longer valid
Repudiation: Alice gets out of past contracts by anonymously leaking her SK
Key revocation doesn't help: it merely announces the leak
8 Fixing the Problem
Attempt 1: re-sign all past messages with a new key
–Expensive
–What if the signer doesn't cooperate?
Attempt 2: change keys frequently, erasing past SKs
–Disclosure of the current SK doesn't affect past SKs
–However, changing PK is expensive and requires certification
Attempt 3: employ a time-stamping authority (third party)
9 Forward Security
Idea: change SK but not PK [And97]
Divide the lifetime of PK into T time periods
Each SK_j is for a particular time period and is erased thereafter
If the current SK_j is compromised, signatures made with earlier keys SK_{j-1}, SK_{j-2}, ... remain secure
Note: can't be done without assuming secure erasure
10 Definitions: Key-Evolving Scheme [BM99]
The usual three algorithms Key-Gen, Sign, Verify:
–Key-Gen. Inputs: security parameter ("key size") k, total number T of time periods. Output: (PK, SK_1)
–Sign. Inputs: message M, current secret key SK_j. Output: signature S (time period j included)
–Verify. Inputs: M, time period j, signature S, public key PK. Output: valid/invalid
11 Definitions: Key-Evolving Scheme [BM99] (continued)
In addition to Key-Gen, Sign, and Verify, a new algorithm Update:
–Update. Input: current secret key SK_j. Output: new secret key SK_{j+1}
12 Definitions: Forward Security [BM99]
Given:
–signatures on messages and time periods of its choice, and
–the ability to "break in" in any time period b and get SK_b,
the adversary can't forge a new signature for a time period j < b.
13 Simple Schemes: Efficiency?
Long public and long private keys
–T pairs (p_1, s_1), (p_2, s_2), ..., (p_T, s_T)
–PK = (p_1, p_2, ..., p_T)
–SK = (s_1, s_2, ..., s_T)
–Update = erase s_i for period i
–Drawback: public and private keys linear in T = number of periods
14 Anderson: Long Secret Key Only
T pairs as before and an additional pair (p, s)
Sig(j) = SIG_s(j || p_j) for j = 1, ..., T ["certificate"]
Public key now = p (only)
Secret key = (s_j, Sig(j)) for j = 1, ..., T [still linear]
The public key p acts like a CA key; a signature includes the period, the certificate, the message, and a signature on the message under the period's secret key.
15 Long Signatures Only
Have (p, s)
In period j, generate (p_j, s_j) and let Cert(j) = sig_{s_{j-1}}(j || p_j)
A signature is the entire certificate chain plus the signature; it has the form (j, sig_{s_j}(m), p_1, Cert(1), ..., p_j, Cert(j))
Think of it as a tree of degree one and height T for T periods (signer memory still linear in T)
16 Binary Certification Tree [BM]
Now (s_0, p_0)
Each element certifies two children, left and right
We have a tree of height log T (for T periods)
At any point, only a log-length branch of certificates is used in the signature
Only leaves are used to sign
Keys whose children will not be used in future periods are erased
We get O(log T) key sizes and signature size
17 Concrete Scheme
Use a scheme based on a Fiat-Shamir variant
Have the public key be a point x raised to the power 2^T
Have the initial private key at period zero be x
Sign based on the FS paradigm
Update by squaring the current key
The verifier knows the period, so it knows the current private key is x raised to the appropriate power of two (and the identification proofs are adjusted accordingly).
18 Replace Keys via Pseudorandomness
Have future keys derived from a forward-secure pseudorandom generator
Forward-secure pseudorandom generators are easy: replace the seed with an iteration of the function
Possibilities in practice:
–Block cipher: use previous output as next input
–Stream cipher: use previous output as next state
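The iterate-and-erase idea above can be sketched as follows. This is a toy sketch, with the cipher or PRF replaced by SHA-256 under two domain-separation labels (an assumption, not the slides' construction): each step derives one output block and the next state, and the old state is discarded.

```python
import hashlib

def fs_prg_step(state: bytes):
    """One step of a forward-secure generator: from the current state,
    derive an output block and the next state; the old state is then erased."""
    out = hashlib.sha256(b"out" + state).digest()
    nxt = hashlib.sha256(b"next" + state).digest()
    return out, nxt

state = b"\x01" * 32            # hypothetical seed
outputs = []
for _ in range(4):
    block, state = fs_prg_step(state)   # old state is overwritten
    outputs.append(block)

assert len(set(outputs)) == 4   # distinct output blocks
```

An attacker who steals the current state can compute future outputs but, because each step is one-way, cannot recover earlier states or earlier outputs.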
19 Agenda Forward Key Security Zero Knowledge Oblivious Transfer Multi-party Computation
20 Zero Knowledge
(Diagram: prover claims "I know X"; verifier demands proof; some exchange follows that does not reveal X.)
An interactive method for Alice to prove to Bob that she has/knows x without revealing x to Bob.
Motivation: authentication. Alice wants to prove her identity to Bob via some secret but doesn't want Bob to learn anything about this secret.
–Login methods where the password is not stored on the server (perhaps a hash of the password is stored)
–Alice (user/client) proves to Bob (server) that she knows the password without giving Bob the password
21 Zero Knowledge
An interactive method for Alice to prove to Bob that she knows x without revealing x. The statement is "Alice knows x." A zero-knowledge proof must satisfy three properties:
–Completeness: if the statement is true, the honest verifier (one following the protocol properly) will be convinced of this fact by an honest prover.
–Soundness: if the statement is false, no cheating prover can convince the honest verifier that it is true, except with some small probability.
–Zero-knowledge: if the statement is true, no cheating verifier learns anything other than this fact.
Completeness and soundness are needed for any interactive proof.
22 Zero Knowledge
Example use: enforcing honest behavior while maintaining privacy
–Force a user to prove that its behavior is correct according to the protocol
–Used in secure multiparty computation
23 Zero Knowledge
Jean-Jacques Quisquater et al., "How to Explain Zero-Knowledge Protocols to Your Children"
Peggy (prover) has uncovered the secret word that opens a magic door in a cave. The cave is shaped like a circle, with the entrance on one side and the magic door blocking the opposite side. Victor (verifier) will pay Peggy for the secret, but not until he's sure she really knows it. Peggy will tell him the secret, but not until she receives the money.
Zero knowledge: a method by which Peggy proves to Victor that she knows the word without telling it to him.
Solution (label the left and right paths from the entrance A and B):
–Victor waits outside the cave while Peggy goes in. She takes either A or B at random (Victor does not know which).
–Victor enters the cave and shouts the path (A or B, chosen at random) by which she must return.
–Peggy returns along the path chosen by Victor, opening the door if it was not the path by which she entered.
–If Peggy does not know the word, there is a 50% chance per round that she can return on the correct path; repeating many times makes her chance of always succeeding negligible (assuming her chance of guessing the word is negligible).
–If Peggy returns correctly every time, Victor is convinced she knows the word.
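The cave protocol's completeness and soundness can be simulated directly. This sketch just models the probabilities; the function and variable names are illustrative, not part of the original story.

```python
import random

def cave_round(peggy_knows_word: bool) -> bool:
    """One round: Peggy enters via a random path; Victor challenges a random path.
    Peggy succeeds if she knows the word, or if she luckily entered on the
    challenged path and doesn't need the door at all."""
    entered = random.choice("AB")
    challenged = random.choice("AB")
    return peggy_knows_word or entered == challenged

def convinced(peggy_knows_word: bool, rounds: int = 40) -> bool:
    """Victor accepts only if Peggy succeeds in every round."""
    return all(cave_round(peggy_knows_word) for _ in range(rounds))

# Completeness: an honest prover always passes
assert convinced(True)
# Soundness: a cheating prover passes all 40 rounds with probability 2**-40
```

Each extra round halves a cheater's chance of surviving, which is why the story says to "repeat the above many times."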
24 Agenda Forward Key Security Zero Knowledge Oblivious Transfer Multi-party Computation
25 Oblivious Transfer
(Diagram: Alice sends m = 1; Bob ends up with either the bit or nothing.)
Alice transfers a secret bit m to Bob with probability 1/2 such that
–Bob knows whether or not he received m
–Alice doesn't know whether m was transferred to Bob
26 Oblivious Transfer: Example Method
Alice obliviously transfers m to Bob.
Alice:
–picks 2 random primes p, q and sets n = pq
–encrypts message m using n (such as with RSA); c = resulting ciphertext
–sends n and c to Bob
Bob:
–picks a ∈ Z*_n at random
–sends w = a^2 mod n to Alice
Alice:
–computes the square roots of w: S = {x, -x, y, -y}
–picks one of the four at random, s ∈ S, and sends it to Bob (the probability that s = a or -a is 1/2)
Bob:
–if s ≠ a, -a: Bob can factor n and obtain m (and will know he has m)
–we won't walk through why; see the Rabin cryptosystem
Bob obtains m with probability 1/2, and Alice doesn't learn the outcome.
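The square-root step above can be sketched with toy parameters. This is an illustration only: the primes are tiny (a real n is thousands of bits), both are chosen ≡ 3 (mod 4) so square roots are easy to compute, and the message encryption itself is omitted; the sketch shows only why receiving a root other than ±a lets Bob factor n.

```python
import random
from math import gcd

p, q = 1019, 1031          # toy primes, both ≡ 3 (mod 4); illustration only
n = p * q

def sqrt_mod(w, pr):
    """Square root of a quadratic residue w mod prime pr, for pr ≡ 3 (mod 4)."""
    return pow(w, (pr + 1) // 4, pr)

def all_roots(w):
    """Alice knows p and q, so she can compute all four square roots of w mod n."""
    rp, rq = sqrt_mod(w % p, p), sqrt_mod(w % q, q)
    roots = set()
    for sp in (rp, p - rp):
        for sq in (rq, q - rq):
            # CRT: combine the root mod p with the root mod q
            x = (sp * q * pow(q, -1, p) + sq * p * pow(p, -1, q)) % n
            roots.add(x)
    return roots

# Bob picks a random a in Z*_n and sends w = a^2 mod n
a = random.randrange(2, n)
while gcd(a, n) != 1:
    a = random.randrange(2, n)
w = a * a % n

# Alice picks one of the four roots at random and returns it
s = random.choice(sorted(all_roots(w)))

# If s ≠ ±a, then gcd(s - a, n) is a nontrivial factor of n
if s not in (a, n - a):
    assert gcd((s - a) % n, n) in (p, q)
```

Alice cannot tell which root is Bob's a, so she learns nothing about whether the transfer succeeded; Bob factors n (and decrypts c) exactly when s ≠ ±a, which happens with probability 1/2.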
27 Oblivious Transfer: Example Method
Works if Bob selects a at random (doesn't cheat)
It is not known whether Bob gains any advantage by cheating, i.e., by intentionally selecting a specific a
28 Oblivious Transfer: Application
Alice and Bob will each sign a contract only if the other also signs it
Idea:
–If their names are of equal length, they could sign a letter at a time, alternating
But someone must go last and could abort without completing the last letter
–Sign a small fragment at a time: a bit, a pixel
Then neither knows who will be last, and if one stops, both are at approximately the same point
–But one could send garbage in a fragment
Oblivious transfer solves the problem
30 Oblivious Transfer: Application
Alice and Bob each create 2 signatures, pick 2 random keys, and encrypt each signature (such as with a block cipher)
Alice:
–LA = "Alice, this is my signature of the left half of the contract"
–RA = "Alice, this is my signature of the right half of the contract"
–Keys KLA, KRA
–CLA = E_KLA(LA), CRA = E_KRA(RA)
Bob:
–LB = "Bob, this is my signature of the left half of the contract"
–RB = "Bob, this is my signature of the right half of the contract"
–Keys KLB, KRB
–CLB = E_KLB(LB), CRB = E_KRB(RB)
The contract is considered signed only when Alice and Bob each have both halves of the other's signature.
31 Oblivious Transfer: Application
Alice sends one of KLA, KRA to Bob using oblivious transfer
Bob sends one of KLB, KRB to Alice using oblivious transfer
Alice and Bob then exchange the bits of both keys, one bit at a time from each key, in order, until all bits are sent
If Alice sees a mistake in the key bits received, she aborts; likewise for Bob
Bob does not know whether Alice has KLB or KRB, so he cannot risk sending an incorrect value for either key; likewise for Alice
Bits must be exchanged simultaneously; otherwise, the last sender could flip the final bit
32 Agenda
Forward Key Security, Zero Knowledge, Oblivious Transfer, Multi-party Computation
Some slides are modified from a presentation by Juan Garay, Bell Labs
33 Secure MPC
A set of parties with private inputs wish to compute some joint function of their inputs.
The parties wish to preserve security properties such as privacy and correctness.
–Example: a secure election protocol
Security must be preserved in the face of adversarial behavior by some of the participants.
34 Secure MPC
Multi-party computation (MPC) [Goldreich-Micali-Wigderson 87]:
–n parties {P_1, P_2, ..., P_n}; each P_i holds a private input x_i
–One public function f(x_1, x_2, ..., x_n)
–All want to learn y = f(x_1, x_2, ..., x_n) (correctness)
–Nobody wants to disclose his private input (privacy)
2-party computation (2PC) [Yao 82, Yao 86]: n = 2
Studied for a long time; the focus has been security.
35 Secure MPC
Alice has a secret key s (such as a key to a system)
She is afraid she may lose s
She wants to give it to someone, but doesn't trust anyone with the entire secret
Share the secret among n people: s = s_1 ⊕ s_2 ⊕ s_3 ⊕ ... ⊕ s_n
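The n-out-of-n XOR sharing above is a few lines of code. A minimal sketch: the first n-1 shares are uniformly random, and the last share is chosen so that all n XOR back to the secret.

```python
import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split(secret: bytes, n: int):
    """n-out-of-n sharing: n-1 random shares, plus one that makes the XOR work out."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    shares.append(reduce(xor_bytes, shares, secret))
    return shares

def recover(shares):
    """XOR of all n shares gives back the secret."""
    return reduce(xor_bytes, shares)

s = b"master key"
shares = split(s, 6)
assert recover(shares) == s
```

Any n-1 shares are jointly uniform random bytes, so they reveal nothing about s; this is why the next slide asks for more flexible t-out-of-n schemes, where losing a share is not fatal.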
36 Secure MPC
Suppose we need more flexibility:
–the i-th person loses s_i or is malicious
With n people, we want:
–any subset of size t can recover s
–any subset of size < t cannot recover any information about s
37 Secure MPC
Use polynomials over a finite field F (such as Z_p for a prime p)
f(x) = a_0 + a_1 x + a_2 x^2 + ... + a_t x^t
Coefficients a_0, a_1, a_2, ..., a_t ∈ F
t+1 terms (degree is t)
38 Secure MPC
Polynomial properties:
–Interpolation: given t+1 distinct points (x_1, y_1), (x_2, y_2), ..., (x_{t+1}, y_{t+1}), one can find a_0, a_1, a_2, ..., a_t
–Secrecy: with only t (or fewer) distinct points, nothing can be determined about a_0
39 Secure MPC
Coefficients:
–pick a_1, a_2, ..., a_t at random
–set a_0 = s
Each person is associated with a distinct point:
–person i is assigned a distinct x_i
–set s_i = f(x_i); (x_i, s_i) is the point for person i
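The polynomial sharing above (Shamir's scheme) can be sketched as follows. A toy sketch under stated assumptions: the field is Z_p for the Mersenne prime 2^31 - 1 (real deployments use a much larger prime), and the threshold is written so that any t shares suffice, matching the earlier slide, which means the polynomial has degree t-1.

```python
import random

P = 2**31 - 1          # a prime field; illustration only

def make_shares(secret, t, n):
    """Degree-(t-1) polynomial with constant term a_0 = secret;
    share i is the point (i, f(i))."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 recovers a_0 = secret."""
    total = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

secret = 123456789
shares = make_shares(secret, t=3, n=5)
assert recover(shares[:3]) == secret                      # any 3 of 5 suffice
assert recover([shares[0], shares[2], shares[4]]) == secret
```

With only t-1 points, every possible value of a_0 is consistent with some degree-(t-1) polynomial through them, which is the secrecy property from the previous slide.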
40 Secure MPC
Polynomial construction requires honesty; problems arise if:
–the entity choosing the a_i's and x_i's is dishonest
–there is collusion, e.g., person 2 gives (x_2, s_2) to person 10
Verifiable secret sharing:
–each person can verify that the piece received is a proper piece
–allows for O(log n) colluders
–based on factoring [Chor, Goldwasser, Micali, Awerbuch]
41 Instances of 2PC
Authentication
–Parties: 1 server, 1 client
–Function: if (server.passwd == client.passwd), then return "succeed," else return "fail."
On-line bidding
–Parties: 1 seller, 1 buyer
–Function: if (seller.price <= buyer.price), then return (seller.price + buyer.price)/2, else return "no transaction."
–Intuition: on the NYSE, the trading price is between the ask (selling) price and the bid (buying) price.
42 Instances of MPC
Auctions
–Parties: 1 auctioneer, (n-1) bidders
–Function: many possibilities (e.g., Vickrey*)
Consider a secure auction (with secret bids):
–An adversary may wish to learn the bids of all parties; preventing this requires privacy
–An adversary may wish to win with a lower bid than the highest; preventing this requires correctness
*Sealed-bid: bidders submit written bids without knowing the bids of the others. The highest bidder wins but pays the second-highest bid. Intent: bidders bid their true value.
43 MPC Protocols
Consider 2PC (MPC is similar) with two parties, P0 and P1:
–Encode the function as a Boolean circuit of ANDs and XORs
–The bits on each wire are shared using XOR
Per-gate evaluation with shared inputs x = x_0 ⊕ x_1, y = y_0 ⊕ y_1:
–XOR: no interaction needed, since (x ⊕ y) = (x_0 ⊕ y_0) ⊕ (x_1 ⊕ y_1)
–AND: more complex (needs interaction)
At the end, each party reveals its shares.
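The local XOR-gate evaluation above can be sketched in a few lines. A minimal sketch of the sharing and the XOR gate only; the interactive AND gate (which would use, e.g., oblivious transfer) is deliberately omitted.

```python
import random

def share_bit(b: int):
    """Split a bit into two XOR shares, one for each party."""
    r = random.randint(0, 1)
    return r, b ^ r

# Inputs x and y, each shared between P0 and P1
x, y = 1, 0
x0, x1 = share_bit(x)
y0, y1 = share_bit(y)

# XOR gate: each party XORs its own shares locally; no messages are exchanged
z0, z1 = x0 ^ y0, x1 ^ y1
assert z0 ^ z1 == x ^ y
```

Each party's share on its own is a uniformly random bit, so nothing about x, y, or x ⊕ y leaks until both parties reveal their output shares at the end.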
44 MPC: Elections
m voters v_1, v_2, ..., v_m; the i-th voter inputs x_i; the result is a function f of the x_i's
Required properties:
–Only authorized voters can vote
–Each can vote only once
–Each vote is secret
–No vote can be duplicated by another voter
–The vote tally is correctly computed
–Anyone can check that the tally is correct
–Fault-tolerant protocol: works even with some number of "bad" parties
–A voter cannot be coerced into revealing how he/she voted (no vote-buying)
Meeting all requirements is tricky, especially the last.
In real life, system and logistical problems are the main issues, as opposed to protocols.
45 MPC: Digital Cash
Required properties:
–Prevent forgery
–Prevent or detect duplication
–Preserve the customer's anonymity
Practical:
–operationally feasible (e.g., no large single database of all issued digital cash)
46 Digital Cash
3 protocols:
–Withdrawal: user obtains a digital coin
–Payment: user buys goods from a vendor using the digital coin
–Deposit: vendor gives the digital coin to the bank to be credited to the vendor's account
Notation:
–U = user, B = bank, V = vendor
–D = digital coin of $100
–S_KB{x} = signature of B on x
47 Digital Cash: Withdrawal
U notifies B that it wants to withdraw D
B gives D to U: D = S_KB{"I am a $100 bill, #4527"}
U checks the signature:
–accepts D if it is valid
–else rejects D
The bank deducts D from U's account if U accepts D
48 Digital Cash: Payment
U pays V with D
V checks the signature:
–accepts D if it is valid
–else rejects D
49 Digital Cash: Deposit
V gives D to B
B checks the signature:
–accepts D if it is valid and credits V's account
–else rejects D
50 Digital Cash: Do the Properties Hold?
Prevent forgery
–OK: infeasible under the basic assumptions of signature schemes
Prevent or detect duplication
–Fails: very easy to duplicate coins and double spend
Preserve the customer's anonymity
–Fails: B knows U and where D was spent
51 Digital Cash: Fixing the Properties
Blind signature:
–U presents D to B
–B signs D without seeing its contents (i.e., can't associate it with U)
Analogy: U covers a check with carbon paper and seals both inside an envelope; the bank signs the outside of the envelope.
–But how does B know D is not fake?
52 Digital Cash: Fixing the Properties
RSA blind signature:
–Key pair (e, n), (d, n)
–U picks a random r mod n
–U computes D' = D · r^e mod n and presents D' to B
–B signs D': s' = (D')^d mod n (= D^d (r^e)^d = D^d · r mod n)
–B deducts D from U's account
–U computes the signature on D, s = D^d mod n, by dividing s' by r
Now anonymity: no link between U and D
But:
–B can be tricked into signing a fake D
–D can be duplicated and double spent
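The blinding algebra above can be checked end to end. A toy sketch only: the primes are tiny (a real bank key is 2048+ bits), the coin D is just a number mod n, and padding/hashing is omitted.

```python
import random
from math import gcd

# Toy RSA key for the bank; illustration only
p, q = 1009, 1013
n = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))

D = 424242                       # the coin, encoded as a number mod n

# User blinds D with a random invertible r
r = random.randrange(2, n)
while gcd(r, n) != 1:
    r = random.randrange(2, n)
blinded = D * pow(r, e, n) % n   # D' = D * r^e mod n

# Bank signs the blinded value without seeing D
s_blind = pow(blinded, d, n)     # s' = (D')^d = D^d * r mod n

# User unblinds: divide by r to get the bank's signature on D itself
s = s_blind * pow(r, -1, n) % n
assert pow(s, e, n) == D         # verifies as an ordinary RSA signature on D
```

The bank sees only D' and s', both of which look uniformly random to it, so it cannot later link the spent coin (D, s) back to this withdrawal; that is the source of the anonymity noted on the slide.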
53 Digital Cash: Fixing the Properties
One denomination, or one public key per denomination, is feasible; there are not many denominations
Probabilistic method:
–U makes up 100 D's, blinds them all, and gives them to B
–B requires U to unblind 99 of them (reveal the r's), checks them, and signs the remaining one
–U has a 1/100 chance of successfully cheating
–U spends the 1 remaining D, anonymously
54 Digital Cash: Fixing the Properties
Now we have anonymity and no (or only a small) chance of cheating. How to prevent double spending?
B keeps a database of all D's, recording each as it is returned spent from V
–not too practical: a large database, and V must wait for B to OK each D at the time of purchase
55 Digital Cash: Fixing the Properties
What if we just detect double spending?
Random identity string (RIS):
–different for every payment of D
–only U can create a valid RIS
–two different RISs on the same D allow B to retrieve U's name
If B receives two identical D's with different RIS values, U cheated
If B receives two different deposits of the same D with the same RIS value, V is trying to deposit twice
56 Digital Cash: Fixing the Properties
Let H be a hash function. U creates 100 D's:
–D_i = ("I'm a $100 bill, #4527i", y_{i,1}, y'_{i,1}, y_{i,2}, y'_{i,2}, ..., y_{i,k}, y'_{i,k})
–where y_{i,j} = H(x_{i,j}) and y'_{i,j} = H(x'_{i,j})
–x_{i,j} and x'_{i,j} are randomly chosen subject to x_{i,j} ⊕ x'_{i,j} = U's name, for all i, j
57 Digital Cash: Fixing the Properties
Withdrawal:
–U blinds each D_i, getting D'_i
–B has U unblind all but one D'_i and reveal the corresponding x_{i,j} and x'_{i,j}
–B checks, for each, that y_{i,j} = H(x_{i,j}), y'_{i,j} = H(x'_{i,j}), and x_{i,j} ⊕ x'_{i,j} = U's name
–B signs the remaining blinded D'_i and gives it to U
58 Digital Cash: Fixing the Properties
Payment:
–U gives D'_i to V
–V checks B's signature on D'_i
–V creates a challenge: a random bit string b_1, b_2, ..., b_k
–If b_j = 0, U reveals x_{i,j}; else U reveals x'_{i,j}
–V checks y_{i,j} = H(x_{i,j}) or y'_{i,j} = H(x'_{i,j}) and accepts if the checks pass, else rejects
The probability that a different payment produces the same RIS is 2^{-k}, because V creates the challenge at random
Only U can produce a valid RIS: H is computationally infeasible to invert
Two different RIS values on the same D'_i leak U's name: together they include both x_{i,j} and x'_{i,j} for some j (due to the two different challenges)
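The identity-revealing mechanism above can be sketched concretely. Assumptions in this toy sketch: H is SHA-256, the identity is padded to 8 bytes, and k = 4 pairs are used (a real coin would use a much larger k); the names `pairs`, `respond`, etc. are illustrative.

```python
import hashlib
import secrets

name = b"Alice\x00\x00\x00"         # padded identity, 8 bytes; illustration only
k = 4                                # number of (x, x') pairs; real schemes use many more

# Coin setup: x ^ x' = name for every pair; the coin carries only the hashes
pairs = []
for _ in range(k):
    x = secrets.token_bytes(8)
    x2 = bytes(a ^ b for a, b in zip(x, name))
    pairs.append((x, x2))
hashes = [(hashlib.sha256(x).digest(), hashlib.sha256(x2).digest())
          for x, x2 in pairs]

def respond(challenge):
    """RIS for one payment: reveal x (bit 0) or x' (bit 1) for each pair."""
    return [pairs[j][challenge[j]] for j in range(k)]

# Two payments of the same coin answer two different random challenges
c1 = [0, 1, 0, 1]
c2 = [1, 1, 0, 0]
ris1, ris2 = respond(c1), respond(c2)

# The bank finds a position where the challenges differ and XORs the answers
j = next(j for j in range(k) if c1[j] != c2[j])
recovered = bytes(a ^ b for a, b in zip(ris1[j], ris2[j]))
assert recovered == name             # double spending exposes U's identity
```

A single payment reveals only one element of each pair, which on its own is a random string; only a second payment with a differing challenge bit completes some pair and exposes the name.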
59 Digital Cash: Fixing the Properties
Deposit:
–V gives D'_i, s', and the RIS to B
–B verifies the signature and checks whether the coin was already returned
–If it is already in the database, B compares RIS values
–If they differ, U double spent
–If they are equal, V is trying to deposit twice
60 Other MPC Applications
Database query:
–Alice has a string q; Bob has a database of strings
–Alice wants to know whether there exists a string t_i in Bob's database that matches q (exactly or closely)
Privacy:
–Bob cannot learn Alice's q or the response
–Alice cannot learn the database contents beyond what can be inferred from the query result
61 Other MPC Applications
Profile matching:
–Alice has a database of known hackers' behaviors
–Bob has a hacker's behavior from a recent break-in
How can Bob check whether his hacker is in Alice's database while
–not disclosing the hacker's actual behavior to Alice (doing so might disclose the vulnerability in his system), and
–not obtaining the contents of Alice's database (it contains confidential information)?
62 Other MPC Applications
Companies want to cooperate in preventing intrusions into their networks
–They need to share data patterns, but this is sensitive information
Real data for IDS/security research:
–none exists, or it is outdated and now irrelevant
–MIT Lincoln Laboratory IDS Evaluation Data Set, 1998-2000
63 Other MPC Applications
In general, many database applications:
–simple queries
–determining intersections
–sharing patterns without revealing actual content
64 Backup
65 On-line Bidding: Definition of Security
Correctness: seller.output = buyer.output = f(seller.price, buyer.price)
Privacy: the transcript carries no additional information about seller.price and buyer.price
(Diagram: seller with input seller.price and buyer with input buyer.price exchange messages forming the transcript; each then produces its output.)
66 "Privacy" Is a Little Tricky...
On-line bidding function: if (seller.price <= buyer.price), then return (seller.price + buyer.price)/2, else return "no transaction."
–If seller.price ≤ buyer.price, then both parties can learn each other's private input.
–If seller.price > buyer.price, then both parties should learn nothing more than this fact.
Privacy: each party should learn only what can be inferred from the output (which can sometimes be a lot).
67 Fair Secure Multi-Party Computation (FMPC)
Parties P_1, P_2, ..., P_n (some corrupted), each holding a private input x_i, wish to compute y = f(x_1, x_2, ..., x_n) privately and correctly.
Security is about absolute information gain: "at the end of the protocol, each party learns y (and anything inferable from y)."
Fairness is about relative information gain: "at the end of the protocol, either all parties learn y, or no party learns anything."
Fairness is important in MPC and crucial in some applications (e.g., two-party contract signing).
68 Security vs. Fairness
The problem of secure MPC/2PC is well studied and well understood.
The problem of fair MPC/2PC is less developed.
Security and fairness are different concepts:
–a protocol can be fair without being secure
69 Security Does Not Imply Fairness
On-line bidding function: if (seller.price <= buyer.price), then return (seller.price + buyer.price)/2, else return "no transaction."
E.g., in an unfair on-line bidding protocol, the seller may learn the output (and thus buyer.price) before the buyer learns anything.
70 Cheating with Unfair Protocols
A cheating seller:
1. Initiates the protocol with price x (originally $999,999).
2. Runs until getting the output (the buyer hasn't gotten the output yet).
3. If (output == "no transaction"), aborts (e.g., announces "network failure"), sets x to x - 1, and repeats.
A cheating seller can:
–find out the buyer's price (destroying privacy), and
–achieve maximum profit (destroying correctness); the actual function computed is {return buyer.price}
The lack of fairness completely voids the security!
71 Fairness: Positive Results
n parties, t corrupted:
t < n/3: possible with point-to-point channels
–computational [GMW87]
–information-theoretic [BGW88, CCD88]
n/3 ≤ t < n/2: possible with a broadcast channel
–computational [GMW87]
–information-theoretic [RB89]
72 Unfortunately...
Fairness is impossible with a corrupted majority (t ≥ n/2).
Intuition (2 parties): the party sending the last message may abort early.
Consequently, many security definitions do not consider fairness, or consider only partial fairness [BG90, BL91, FGHHS02, GL02].
73 Fairness After the Impossibility Result
We still need (some form of) fairness, so "tweak" the model/definition:
"Gradual release" approach (tweak the definition) [Blum83, D95, BN00, ...]
–No trusted party needed; parties take turns releasing information "little by little."
–Still somewhat unfair, but we can quantify and control the amount of "unfairness."
"Optimistic" approach (tweak the model) [M97, ASW98, CC00, ...]
–Adds a trusted party as an arbiter in case of dispute.
–The arbiter needs to be (constantly) available.
74 The Gradual Release Approach
Reasonably studied:
–initial idea by [Blum 83]
–subsequent work: [..., Damgard 95, Boneh-Naor 00, Garay-Pomerance 03, Pinkas 03, ...]
Not quite well understood:
–ad hoc security notions
–limited general constructions (only 2PC)
–few practical constructions
75 Security and Fairness
A typical gradual release protocol (e.g., [BN00, GP03, P03]) consists of two phases:
1. Computation phase: "normal" computation.
2. Revealing phase: each P_i gradually reveals a "secret" s_i; then each P_i computes the result y from s_1, s_2, ..., s_n.
76 Observation on Existing MPC Protocols
Many (unfair) MPC protocols (e.g., [GMW87, CDN01, CLOS02]) share the same structure:
–Sharing phase: parties share data among themselves (simple sharing, or (n, t) threshold sharing)
–Evaluation phase: "gate-by-gate" evaluation (all intermediate data are shared or "blinded")
–Revealing phase: each party reveals its secret share (all parties learn the result from the shares)
Unfair! Honest parties reveal their secrets, while corrupted parties abort (and still learn the result).
77 F_CPFO: Commit-Prove-Fair-Open
Commit phase: every party P_i commits to a value x_i.
Prove phase: every party P_i proves a relation about x_i.
Open phase: open x_1, x_2, ..., x_n simultaneously.
Using F_CPFO, the revealing phase becomes fair, and so does the MPC protocol.
Simultaneous opening guarantees fairness: either all parties learn all the committed values, or nobody learns anything.
78 Time-lines: Towards Realizing F_CPFO
A time-line: an array of numbers (head, ..., tail), with accelerator points in between.
Time-line commitments:
–TL-Commit(x) = (head, tail · x)
–Perfectly binding
–Hiding (2^k steps to compute the tail from the head)
–Gradual opening: each accelerator cuts the number of remaining steps in half
79 A Time-line, Mathematically [BN00, GJ02, GP03]
N: a "safe Blum modulus," N = p · q, where p, q, (p-1)/2, (q-1)/2 are all primes.
g: a random element of Z_N*.
head = g, tail = g^(2^(2^k))
Accelerators: g^(2^(2^(k-1))), g^(2^(2^(k-1) + 2^(k-2))), ...
80 A Time-line, Mathematically (cont'd)
The accelerator points are G[i] = g^(2^(2^i)).
Anyone can move forward m positions by doing m squarings.
Knowing φ(N), one can compute G[i] efficiently for any i.
Not knowing the factorization of N:
–it is hard to move backward,
–it is inefficient to move forward (step by step), and
–a point "far away" is "unknown."
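The asymmetry above, slow squaring for everyone versus fast jumps for whoever knows φ(N), can be sketched with toy numbers. Assumptions: the primes are tiny and not actually safe primes, the chain length is small (k = 16), and the names `forward`/`jump` are illustrative; a real time-line uses a large safe Blum modulus.

```python
# Toy parameters; a real time-line uses a large safe Blum modulus
p, q = 1019, 1031                  # secret primes, known only to the owner
N = p * q
phi = (p - 1) * (q - 1)
g = 3
k = 16                             # the tail is 2**k squarings away from g

def forward(x, steps):
    """Anyone can move forward by repeated squaring, one step at a time."""
    for _ in range(steps):
        x = x * x % N
    return x

def jump(j):
    """Knowing phi(N), the owner jumps straight to position j: g^(2^j) mod N."""
    return pow(g, pow(2, j, phi), N)

tail = jump(2 ** k)                # the committed endpoint, "far away" from g
# Releasing the halfway accelerator cuts the remaining forced-opening work in half
half = jump(2 ** (k - 1))
assert forward(half, 2 ** (k - 1)) == tail
```

Without the factorization, reaching the tail from g costs 2^k sequential squarings; each revealed accelerator halves the remaining distance, which is exactly the gradual-opening behavior described on the earlier slide.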
81 Fair Exchange Using Time-lines
START: Alice has a, Bob has b.
COMMIT:
–Alice sends TL-Commit(a) to Bob
–Bob sends TL-Commit(b) to Alice
OPEN: they take turns gradually opening the commitments.
82 Fair Exchange Using Time-lines (cont'd)
ABORT: if Bob aborts and can force-open in t steps, Alice can do the same in 2t steps.
83 Realizing F_CPFO Using Time-lines
Setup: a "master" time-line T = (N; g; G[j], j = 1, ..., k) in the CRS.
Commit: each party P_i derives a time-line T_i = (N; g_i; G_i[j]) and TL-commits to x_i: (g_i; G_i[k] · x_i).
Prove: standard ZK proof.
Open: in round m, each party P_i reveals G_i[m] with a ZK proof; if any party aborts, enter panic mode.
Panic mode (depends on the current round m):
–If (k - m) is "large," abort (the adversary does not have enough time to force-open).
–If (k - m) is "small," force-open (the adversary has enough time to force-open as well).
84 Putting Things Together...
Plugging F_CPFO into existing MPC protocols yields fair MPC protocols.