FOC-2 Cryptography with Low Complexity: 3
Benny Applebaum and Iftach Haitner Tel-Aviv University
Reminder: Local Functions
Function f_{G,Q} is defined by an (m,n,d) graph G and a single predicate Q: {0,1}^d → {0,1}. F_{m,n,Q} is the collection {f_{G,Q}} where G is a random (m,n,d) graph. [Figure: n input nodes x_1,…,x_n feed m output nodes y_1,…,y_m; e.g., y_i = Q(x_1, x_2, x_5).]
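As a concrete illustration, a random local function can be sampled and evaluated in a few lines of Python. This is only a sketch: the names `random_graph` and `evaluate` and the choice of the XOR-3 predicate are mine, not from the slides.

```python
import random

def random_graph(m, n, d, rng):
    """Sample an (m, n, d) graph: each of the m outputs reads d distinct inputs."""
    return [rng.sample(range(n), d) for _ in range(m)]

def evaluate(graph, Q, x):
    """Compute f_{G,Q}(x): y_i = Q(x restricted to the hyperedge S_i)."""
    return [Q(tuple(x[v] for v in S)) for S in graph]

xor3 = lambda bits: bits[0] ^ bits[1] ^ bits[2]

rng = random.Random(0)
n, m, d = 8, 12, 3
G = random_graph(m, n, d, rng)
x = [rng.randrange(2) for _ in range(n)]
y = evaluate(G, xor3, x)   # m output bits, each depending on only d = 3 input bits
```

Note that each output bit is a local function of the input, which is exactly what makes the collection F_{m,n,Q} "local".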
One-Wayness ⇒ Pseudorandomness [A11]
Conjecture: f is one-way for predicate Q and a random graph with m outputs.
Implication 1: f is 0.99-unpredictable for predicate Q and a random graph with m outputs.
Implication 2 (via inapproximability): f is ε-pseudorandom for sensitive Q and a random graph with ε·m^{1/3}/2 outputs.
Extension to expanders [AR16]: one-wayness over all expanders ⇒ pseudorandomness over all expanders. Supports the PRG conjecture.
One-Wayness ⇒ Pseudorandomness [A11]
Conjecture: f is one-way for predicate Q and a random graph with m = 1.1n outputs.
Implication 1: f is 0.99-unpredictable for predicate Q and a random graph with m outputs. Thm: linear-stretch local PRG.
Implication 2 (via inapproximability): f is ε-pseudorandom for sensitive Q and a random graph with ε·m^{1/3}/2 outputs.
One-Wayness ⇒ Pseudorandomness [A11]
Conjecture: f is one-way for predicate Q and a random graph with m = n^{1.1} outputs.
Implication 1: f is 0.99-unpredictable for predicate Q and a random graph with m outputs. Thm: linear-stretch local PRG.
Implication 2 (via inapproximability): f is ε-pseudorandom for sensitive Q and a random graph with ε·m^{1/3}/2 outputs. Thm: poly-stretch local PRG with 1/poly distinguishing advantage, or negl(n) advantage with locality log*(n).
Drawbacks: polynomial security loss; yields collections of local primitives; relies on hardness for “most functions”.
Open: poly-stretch, constant locality, negligible security. Requires highly unbalanced constant-degree expanders!
(½+ε)-predicting F_{Q,m} ⇒ Inverting F_{Q,mt}
[Figure: graph G with hyperedge (i, j, k) and output y_m = Q(x_i, x_j, x_k).]
(½+ε)-predicting F_{Q,m} ⇒ Inverting F_{Q,mt}
It seems that P only tells us something we already know – isn’t it useless? Observation: P works for many (random) graphs – so let’s apply it to a modified graph!
[Figure: predictor P outputs b = Q(x_i, x_j, x_k) w.p. ½+ε, where y_m = Q(x_i, x_j, x_k) on graph G.]
(½+ε)-predicting F_{Q,m} ⇒ Inverting F_{Q,mt}
Step 1: From a prediction to a single 2-LIN equation.
Idea: run the predictor on a modified (m,n,d) graph G′ in which the first node i of the last hyperedge is replaced by a random node r.
Assuming P is right: b = y_m iff x_i = x_r, so we learn a noisy 2-LIN equation x_r ⊕ x_i = b ⊕ y_m.
[Figure: P outputs b = Q(x_r, x_j, x_k) w.p. ½+ε, while y_m = Q(x_i, x_j, x_k) on graph G.]
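The prediction-to-2-LIN step can be sketched in code. This is a toy sketch: the helper name `basic_step` and the black-box `predictor` interface are my naming, and it assumes Q is sensitive in its first coordinate.

```python
import random

def basic_step(G, y, predictor, n, rng):
    """One call to the predictor yields one noisy 2-LIN equation x_i ^ x_r = beta.

    Assumes Q is sensitive in its first coordinate, so that the predictor being
    right means: guess == y_m iff x_r == x_i.
    """
    i = G[-1][0]                              # first node of the last hyperedge S_m
    r = rng.randrange(n)                      # random replacement node
    G_mod = G[:-1] + [[r] + list(G[-1][1:])]  # S'_m = (r, a_2, ..., a_d)
    b = predictor(G_mod, y[:-1])              # guess for Q(x_r, a_2, ..., a_d)
    return i, r, b ^ y[-1]                    # beta = b XOR y_m
```

With a perfect predictor and the XOR predicate, beta equals x_i ⊕ x_r exactly; with a (½+ε)-predictor, the equation holds with probability ½+ε.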
(½+ε)-predicting F_{Q,m} ⇒ Inverting F_{Q,mt}
Step 1: From a prediction to a single 2-LIN equation.
Step 2: Collect t ≈ n²/ε² random noisy equations. Given (G, y = f_{G,Q}(x)) of length tm, parse it as (G_1, y_1 = f_{G_1,Q}(x)), …, (G_t, y_t = f_{G_t,Q}(x)) and apply the basic procedure to each block.
[Example equations: x_1 ⊕ x_10 = 0, x_5 ⊕ x_11 = 1, x_1 ⊕ x_7 = 0.]
(½+ε)-predicting F_{Q,m} ⇒ Inverting F_{Q,mt}
Step 1: From a prediction to a single 2-LIN equation.
Step 2: Collect t ≈ n²/ε² random noisy equations.
Step 3: Clean the noise via majority vote and solve the linear equations.
Problem: the noise is adversarial – no amplification? E.g., the predictor may work only when the output depends on x_1. Also, the same x is used each time – how to analyze?
For a random input, (i, r) is uniform and β = x_i ⊕ x_r w.p. ½+ε – but there is correlation!
[Figure: (G, f_{G,Q}(x)) is fed to a 2-LIN constructor that uses predictor P and outputs (i, r, β).]
Basic Step. Input: an (m,n,d) graph G = (S_i)_{i∈[m]} and an m-bit string y.
Change the last hyperedge S_m = (i, a_2, …, a_d) to S′_m = (r, a_2, …, a_d), where r ∈ [n] is random, and let G′ = (S_1, …, S_{m-1}, S′_m).
Output: (i, r, β = Predictor(G′, y[1:m-1]) ⊕ y_m).
Let Success(G, x) be the event that the predictor succeeds on (G, y = f_{G,Q}(x)).
Call x good if Pr_G[Success(G, x)] > ½ + ε/2. By Markov, an ε/2 fraction of the x’s are good.
For a random G and a good x: (G′, i, r) is random, and β is correct w.p. at least ½ + ε/2. The event “β is correct” may depend on r but is independent of i.
Basic Step. Input: an (m,n,d) graph G = (S_i)_{i∈[m]} and an m-bit string y.
Change the last hyperedge S_m = (i, a_2, …, a_d) to S′_m = (r, a_2, …, a_d), where r ∈ [n] is random, and let G′ = (S_1, …, S_{m-1}, S′_m).
Output: (i, r, β = Predictor(G′, y[1:m-1]) ⊕ y_m).
Let Success(G, x) be the event that the predictor succeeds on (G, y = f_{G,Q}(x)).
Call x good if Pr_G[Success(G, x)] > ½ + ε/2. By Markov, an ε/2 fraction of the x’s are good.
Call r good for x if Pr_G[Success(G, x) | S_m[1] = r] > ½ + ε/2. By averaging, every good x has a good r.
Basic Step. Input: an (m,n,d) graph G = (S_i)_{i∈[m]} and an m-bit string y.
Change the last hyperedge S_m = (i, a_2, …, a_d) to S′_m = (r, a_2, …, a_d), where r ∈ [n] is random, and let G′ = (S_1, …, S_{m-1}, S′_m).
Output: (i, r, β = Predictor(G′, y[1:m-1]) ⊕ y_m).
For a random G and a good (x, r): (G′, i) is random, and β is correct w.p. at least ½ + ε/2. The event “β is correct” is independent of i.
Inversion. Input: a (tm,n,d) graph G and a tm-bit string y. Choose a random r.
Parse as (G_1, y_1 = f_{G_1,Q}(x)), …, (G_t, y_t = f_{G_t,Q}(x)) and apply the basic procedure to each block with the same r, getting t equations x[i_1] ⊕ x[r] = β_1, …, x[i_t] ⊕ x[r] = β_t.
Guess x[r]. For each equation x[i] ⊕ x[r] = β, record the vote β ⊕ x[r] for x[i]. Output the majority vote for each x[i].
Conditioned on a good (x, r): the i’s are uniform and independent, and each vote is correct w.p. at least ½ + ε/2, independently of the i’s.
Inversion. Input: a (tm,n,d) graph G and a tm-bit string y. Choose a random r.
Parse as (G_1, y_1 = f_{G_1,Q}(x)), …, (G_t, y_t = f_{G_t,Q}(x)) and apply the basic procedure to each block with the same r, getting t equations x[i_1] ⊕ x[r] = β_1, …, x[i_t] ⊕ x[r] = β_t.
Guess x[r]. For each equation x[i] ⊕ x[r] = β, record the vote β ⊕ x[r] for x[i]. Output the majority vote for each x[i].
Conditioned on a good (x, r): fix i and set t = O(n log² n / ε²). By Chernoff, except w.p. 1/n², x[i] receives ≥ log n / ε² votes. Conditioned on that, the majority is correct for x[i] except w.p. 1/n², so x[i] is recovered w.p. 1 − 1/n². By a union bound, x is fully recovered.
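A toy end-to-end version of the majority-vote inversion can be sketched as follows. This is my own sketch: the caller chooses the shared random node r and runs `invert` once per guess of x[r], keeping the candidate consistent with y, and the predictor is an idealized black box.

```python
def invert(blocks, predictor, n, r, guess_xr):
    """Majority-vote inversion from t blocks (G_j, y_j), all using one shared node r.

    blocks: list of (graph, output-bits) pairs; r: shared random node in [n];
    guess_xr: a guess for x[r].
    """
    votes = [[0, 0] for _ in range(n)]            # votes[i][v]: support for x[i] = v
    for G, y in blocks:
        i = G[-1][0]                              # first node of the last hyperedge
        G_mod = G[:-1] + [[r] + list(G[-1][1:])]  # replace it by r
        beta = predictor(G_mod, y[:-1]) ^ y[-1]   # noisy value of x[i] ^ x[r]
        votes[i][beta ^ guess_xr] += 1            # a vote for x[i] = beta ^ x[r]
    return [int(v[1] > v[0]) for v in votes]
```

With a perfect predictor and the XOR predicate every vote is correct, so the majority recovers x whenever each index receives at least one vote; with a (½+ε/2)-correct vote stream, Chernoff gives the bounds stated above.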
Optimizing via Randomization
New version: randomly permute the input nodes via a random permutation π: [n] → [n]. With more optimizations we get: Thm. (½+ε)-predicting F_{Q,m} ⇒ inverting F_{Q, m/ε²}.
[Figure: as before, (G, f_{G,Q}(x)) is fed to the 2-LIN constructor with predictor P, outputting (i, r, β). For a random input, (i, r) is uniform and β = x_i ⊕ x_r w.p. ½+ε – now with no correlation.]
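The input-permutation trick amounts to relabeling the input nodes of the graph; the sketch below (helper name `permute_inputs` is mine) shows that evaluating the relabeled graph on the correspondingly permuted input gives back the same output string.

```python
import random

def permute_inputs(G, n, rng):
    """Relabel input nodes with a uniformly random permutation pi: [n] -> [n].

    Returns the relabeled graph and pi; f_{G',Q} on the permuted input equals
    f_{G,Q} on the original input.
    """
    pi = list(range(n))
    rng.shuffle(pi)
    return [[pi[v] for v in S] for S in G], pi
```

Because the predictor now sees a freshly permuted graph on each call, its behavior cannot be tied to particular input labels, which is what removes the correlation noted above.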
One-Wayness ⇒ Pseudorandomness [A11]
Conjecture: f is one-way for predicate Q and a random graph with m outputs.
Implication 2 (via inapproximability): f is ε-pseudorandom for sensitive Q and a random graph with ε·m^{1/3}/2 outputs. Thm: poly-stretch local PRG with 1/poly distinguishing advantage, or negl(n) advantage with locality log*(n).
Amplifying the parameters
Thm. If F_{Q,m} is one-way for m > n^{1+ε} and Q is sensitive, then for every a > 0 there is a local PRG with stretch n^a and security 1/n^a.
OWF: m = n^{1+ε}, locality d
⇒ (Lemma) Unpredictable generator: m = n^{1+ε/3}, locality d, security ½ + n^{-ε/3}
⇒ (t-XOR) Unpredictable generator: m = n^{1+ε/3}, locality td, security ½ + n^{-t′}
⇒ (Yao) PRG: m = n^{1+ε/3}, locality td, security m·n^{-t′} = n^{-t′′}
⇒ (t-Composition) PRG: m = n^{(1+ε/3)^t}, locality (td)^t, security n^{-t′′}
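The t-XOR step of the pipeline can be made concrete: each amplified output XORs t base outputs, so locality grows from d to t·d while (heuristically, XOR-lemma style) the prediction advantage shrinks from ε toward ε^t. A sketch, with names of my choosing; note the concatenated hyperedges may repeat nodes, a simplification the real construction avoids:

```python
def t_xor(G, Q, d, t):
    """Turn an (m, n, d) local function into an (m/t, n, t*d) one: each new
    hyperedge concatenates t base hyperedges, and the new predicate XORs Q
    over the t consecutive d-bit chunks."""
    assert len(G) % t == 0
    G_amp = [sum(G[j * t:(j + 1) * t], []) for j in range(len(G) // t)]

    def Q_amp(bits):
        out = 0
        for k in range(t):
            out ^= Q(tuple(bits[k * d:(k + 1) * d]))
        return out

    return G_amp, Q_amp
```

The output length drops by a factor of t, which is why the conjectured stretch m > n^{1+ε} has room to pay for the amplification.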
One-Wayness ⇒ Pseudorandomness [A11]
Conjecture: f is one-way for predicate Q and a random graph with m outputs.
Implication 1 (via inapproximability): f is 0.99-unpredictable for predicate Q and a random graph with m outputs. Thm: linear-stretch local PRG.
Handling Insensitive Predicates
b = y_m does not mean that x_i = x_r: this is another type of noise, due to insensitivity. Moreover, the predictor’s noise can be correlated. We need a predictor with a good (constant) advantage (e.g., 2/3).
[Figure: P outputs b = Maj(x_r, x_j, x_k) w.p. ½+ε, while y_m = Maj(x_i, x_j, x_k) on graph G.]
Handling Insensitive Predicates
Assuming a predictor with advantage better than the sensitivity*:
Condition on the last entry being sensitive and on P predicting well on (G, x). This recovers only a constant fraction of the indices; use existing algorithms to fully recover x [BQ].
[Figure: P outputs b_1 = Maj(x_1, 1, 0), …, b_n = Maj(x_n, 1, 0), revealing x_1 ⊕ x_i, …, x_n ⊕ x_i.]
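The XOR-vs-Maj gap driving this section can be checked directly: the fraction of inputs on which flipping the first coordinate flips the predicate (its sensitivity in that coordinate) is 1 for XOR-3 but only ½ for Maj-3, since Maj ignores its first bit whenever the other two agree. A quick check (helper name is mine):

```python
from itertools import product

def first_coord_sensitivity(Q, d):
    """Fraction of d-bit inputs on which flipping bit 1 flips Q's output."""
    flips = 0
    for bits in product((0, 1), repeat=d):
        flipped = (1 - bits[0],) + bits[1:]
        flips += Q(bits) != Q(flipped)
    return flips / 2 ** d

xor3 = lambda b: b[0] ^ b[1] ^ b[2]
maj3 = lambda b: int(b[0] + b[1] + b[2] >= 2)

print(first_coord_sensitivity(xor3, 3))  # 1.0: b = y_m really does mean x_i = x_r
print(first_coord_sensitivity(maj3, 3))  # 0.5: insensitive when the other two bits agree
```

This is why, for Maj, b = y_m gives no information about x_i = x_r on half the inputs, and a predictor advantage beating the sensitivity is needed.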