Amplification and Derandomization Without Slowdown Dana Moshkovitz MIT Joint work with Ofer Grossman (MIT)
Randomized Algorithms
An amazingly successful endeavor: Parallel Matching, Minimum Spanning Tree, Approximate Max-Cut, Polynomial Identity Testing, …
They trade certainty for efficiency: sometimes the algorithm won't output the right answer.
Randomized Algorithms As A Pathway To Efficiency With Certainty
(Figure: a path from constant error probability to exp(-n) error probability, aiming at algorithms that are always correct and always efficient.)
This Work: Positive Answers, In Certain Cases, To Two Questions
Amplification: Can we decrease the error probability from 1/3 to exp(-k) without slowing down the algorithm by a factor of k?
Derandomization: Can we convert a randomized algorithm into a deterministic non-uniform algorithm without a factor-n slowdown? (Adleman: a randomized algorithm with error < exp(-n) gives a deterministic non-uniform algorithm.)
A new connection to sketching!
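Adleman's observation, cited above, is a union bound over inputs; spelled out for an algorithm A with per-input error below 2^{-n} (a standard argument, not specific to this work):

```latex
\Pr_r\big[\exists x \in \{0,1\}^n : A(x,r) \text{ errs}\big]
\;\le\; \sum_{x \in \{0,1\}^n} \Pr_r\big[A(x,r) \text{ errs}\big]
\;<\; 2^n \cdot 2^{-n} \;=\; 1,
```

so some fixed random string r* is correct on every input of length n simultaneously; hardwiring r* (one string per input length) yields a deterministic non-uniform algorithm.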
Non-Uniform Algorithms (aka Circuits)
A sequence of algorithms, one for every input size.
Rationale: the input size is known in advance.
A popular computational model (sorting networks, circuit lower bounds, …).
Often non-uniform algorithms imply uniform algorithms! (e.g., matrix multiplication, minimum spanning tree [PeRa], …)
One can amortize the cost of finding good randomness over many inputs.
(Figure: one algorithm per input size: input 1, input 2, input 3, …)
Common Wisdom vs. Our Ideas
Common wisdom: amplifying the error probability from 1/3 down to 2^{-n} multiplies the running time by n; [Adleman] then gives a deterministic non-uniform algorithm.
Our ideas: via efficient testing, amplification without slowdown; via sketching, deterministic non-uniform algorithms.
(Figure: a scale of decreasing error probability, from 1/3 to 2^{-n}.)
Clique Example
Given a graph G on n vertices that has a clique on ρn vertices, find a set of ρn vertices with edge density ≥ 1-ε.
Based on Goldreich-Goldwasser-Ron: a Las Vegas algorithm with time O(n² · 2^{O(1/ε³)}) and constant error probability.
Thm 1: A Las Vegas algorithm with time O(n² · 2^{O(1/ε³)}) and error probability exponentially small in n.
Thm 2: A non-uniform deterministic algorithm with similar run-time.
In the paper: similar results for Free Games (constraint problems on dense bipartite graphs), Max-Cut in dense graphs, and Reed-Muller list decoding.
Previous work: for clique, a constructive Frieze-Kannan regularity lemma gives a deterministic algorithm with a worse dependence on ε [and it doesn't apply to free games!].
Derandomization From Sketching
(Figure: a table of inputs × randomness, asking "Is the algorithm correct?"; inputs with the same sketch are grouped together.)
It's the verifier that doesn't distinguish inputs with the same sketch - not the algorithm!
Example: There is a clique C on ρ|V| vertices. The algorithm samples S ⊆ V. Does S contain (ρ-ε)|S| vertices in C?
– Most samples do.
– The verifier can tell if it knows the clique (|V| bits rather than |V|²).
Derandomization From Sketching
Pseudorandom generators vs. this method: shrinking the randomness vs. shrinking the inputs.
Def: An oblivious verifier gets the sketch and the randomness.
1) For every input x and randomness r, if the verifier accepts on sketch(x) and r, then the algorithm is correct on x, r.
2) For every sketch, Pr_r(verifier rejects) < δ, with δ < 2^{-sketch size}.
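The requirement δ < 2^{-sketch size} is exactly what a union bound over sketches needs; a sketch of the counting argument, writing s for the sketch length in bits:

```latex
\Pr_r\big[\exists\, \text{sketch } z : \text{verifier rejects } (z,r)\big]
\;\le\; \sum_{z \in \{0,1\}^{s}} \Pr_r\big[\text{verifier rejects } (z,r)\big]
\;<\; 2^{s} \cdot 2^{-s} \;=\; 1,
```

so some fixed r* is accepted on every sketch, and by property (1) the algorithm run with randomness r* is correct on every input; r* can be hardwired non-uniformly.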
Previous Work
The method of conditional probabilities derandomizes without slowdown, but only in special cases.
There are many existing connections between derandomization, circuit lower bounds, learning, and compression.
– All based on pseudorandom generators.
Countless instances of saving in the union bound by considering representatives. E.g.,
– [Gopalan, Meka, Reingold CCC'12]: an improved pseudorandom generator against DNFs, by noticing that one can sparsify DNFs.
Work on derandomizing sub-linear time algorithms.
– Deterministic Frieze-Kannan [Dellamonica et al. '15] doesn't apply to problems like free games.
– [Zimand CCC'07]: deterministic average-case algorithms, with slowdown.
Our contribution is in defining and designing oblivious verifiers.
Randomized Clique Algorithm [GGR]
Let C ⊆ V, |C| = ρn, be the unknown clique.
Sample S ⊆ V, |S| = 100/ε², and go over each of its subsets U; one of them is U = S ∩ C.
Which are the other vertices in C?
– Connected to all of U.
– Most of them are connected to most other vertices that are connected to all of U.
Basic Algorithm (constant error probability): take the ρn vertices connected to U with the largest fraction of neighbors among the vertices connected to U.
Lem: With prob 2/3 the average fraction is > 1-ε.
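The basic algorithm can be sketched in code. This is an illustrative, brute-force rendering (adjacency sets; the names rho and eps are ours), not the paper's implementation; it enumerates all subsets U of the sample S, so it is exponential in 1/ε² but polynomial in n.

```python
import itertools
import random

def basic_clique_algorithm(adj, n, rho, eps, seed=None):
    """One run of the GGR-style basic algorithm (constant error probability).

    adj: dict mapping each vertex to its set of neighbors.
    Returns the densest candidate set of rho*n vertices found over all
    subsets U of a small random sample S, together with its edge density.
    """
    rng = random.Random(seed)
    V = list(adj)
    S = rng.sample(V, min(len(V), int(100 / eps ** 2)))
    target = int(rho * n)
    best_set, best_density = None, -1.0
    for r in range(len(S) + 1):
        for U in itertools.combinations(S, r):
            # Vertices connected to all of U.
            cand = [v for v in V if all(u in adj[v] for u in U if u != v)]
            if len(cand) < target:
                continue
            cand_set = set(cand)
            # Rank candidates by their fraction of neighbors inside cand.
            frac = {v: len(adj[v] & cand_set) / max(1, len(cand_set))
                    for v in cand}
            top = sorted(cand, key=frac.get, reverse=True)[:target]
            top_set = set(top)
            edges = sum(len(adj[v] & top_set) for v in top) / 2
            density = (edges / (target * (target - 1) / 2)
                       if target > 1 else 1.0)
            if density > best_density:
                best_density, best_set = density, top_set
    return best_set, best_density
```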
The Sketch: Pick a random R ⊆ V, |R| = polylog n. The sketch is all edges touching R.
Intuition: the vertices in R connected to U "represent" the vertices connected to U,
– i.e., they let one estimate, for every U and v, what fraction of the vertices connected to U are v's neighbors.
The size of the sketch is ~n. For derandomization we need an algorithm with error probability exponentially small in ~n.
Will show how to amplify the basic randomized algorithm without a slowdown.
We'll argue: the sketch suffices to verify this more complicated algorithm.
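A minimal rendering of the sketch and the estimate it supports, assuming adjacency sets; the function names are illustrative:

```python
import random

def graph_sketch(adj, r_size, seed=0):
    """Build the sketch: all edges touching a random set R of r_size
    vertices. adj maps each vertex to its set of neighbors; the sketch
    maps each sampled vertex to its (frozen) neighbor set."""
    rng = random.Random(seed)
    R = rng.sample(list(adj), min(len(adj), r_size))
    return {r: frozenset(adj[r]) for r in R}

def estimate_neighbor_fraction(sketch, U, v):
    """Estimate, from the sketch alone, the fraction of vertices connected
    to all of U that are neighbors of v: the sampled vertices in R that
    are connected to all of U act as representatives."""
    reps = [r for r, nbrs in sketch.items()
            if r != v and all(u in nbrs for u in U)]
    if not reps:
        return None
    return sum(v in sketch[r] for r in reps) / len(reps)
```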
Amplification: The Biased Coin Problem
The bias of a coin is the probability that it falls on heads.
You know that 2/3 of the coins have bias 0.9.
How many coin tosses do you need to find a coin with bias 0.9 with probability 1-exp(-Ω(N))?
Simple Case: the bias of every coin is either at least 0.9 or at most 0.7.
Pick a random coin, and toss it k = 2^i times for i = 100, …, log N.
For each k, if fewer than a 0.8 fraction of the tosses are heads, restart with a new coin.
Analysis: The probability of restarting at phase k is exp(-Ω(k)), and a restart at phase k means we wasted O(k) tosses. So we have 1-exp(-Ω(N)) certainty after O(N) tosses.
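The simple-case procedure can be simulated directly; a sketch with illustrative constants (phases start at 2^7 here, smaller than the slide's):

```python
import random

def find_biased_coin(coin_biases, N, seed=0):
    """Doubling-and-restart procedure for the simple (gapped) case: every
    coin's bias is either >= 0.9 or <= 0.7, and most coins are the good
    kind. Toss the current coin in phases of k = 2**i tosses; if fewer
    than a 0.8 fraction of a phase's tosses come up heads, restart with a
    fresh random coin. Returns (index of accepted coin, total tosses)."""
    rng = random.Random(seed)
    total = 0
    while True:
        c = rng.randrange(len(coin_biases))
        p = coin_biases[c]
        i, passed = 7, True
        while 2 ** i <= N:
            k = 2 ** i
            heads = sum(rng.random() < p for _ in range(k))
            total += k
            if heads < 0.8 * k:
                passed = False  # phase failed: restart with a new coin
                break
            i += 1
        if passed:
            return c, total
```

Bad coins are almost always evicted in an early, cheap phase, so the wasted tosses form (in expectation) a geometric series dominated by the final O(N)-toss phase.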
General Case
Pick a random coin, and toss it k = 2^i times for i = log log N, …, log N.
For each k, if fewer than a 0.9 - i/(10 log N) fraction of the tosses are heads, restart with a new coin.
Analysis: The probability of restarting at phase k is exp(-Ω(k)/log² N), and a restart at phase k means we wasted O(k) tosses. So we have 1-exp(-Ω(N)) certainty after O(N log² N) tosses.
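One way to see the per-phase bound for a good (bias-0.9) coin: failing phase i requires the empirical head-fraction over k = 2^i tosses to fall at least i/(10 log N) below its mean, so by Hoeffding's inequality

```latex
\Pr\big[\text{restart at phase } k = 2^{i}\big]
\;\le\; \exp\!\Big(-2k\Big(\tfrac{i}{10\log N}\Big)^{2}\Big)
\;\le\; \exp\!\big(-\Omega(k/\log^{2} N)\big).
```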
Amplification From Biased Coins
– coin ↔ random choice of the algorithm
– bias of the coin ↔ quality of the random choice
– most coins are biased ↔ most random choices are good
– find a biased coin with prob 1-exp(-N) ↔ find a good random choice with prob 1-exp(-N)
– coin toss ↔ test of the random choice
A Testing Algorithm Gives Amplification in Nearly Linear Time.
Finding an Approximate Clique With Error Probability 2^{-Ω(n)}
Sample S ⊆ V, |S| = 100/ε². k ← 1000 log² n.
While k < n do:
For each U ⊆ S connected to at least ρn vertices:
Sample V' ⊆ V, |V'| = k.
Find the k' vertices in V' connected to U with the largest fraction of neighbors connected to U.
If for all U the average fraction is < 0.9 - i/(10 log n), restart.
Double k.
Search for an approximate clique among the sets defined by U ⊆ S (the ρn vertices connected to U with the largest fraction of neighbors among the vertices connected to U).
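The loop can be rendered as code; the following is a heavily simplified, illustrative sketch (it tracks only average neighbor-fractions, and the helper logic and constants are our assumptions, not the paper's bookkeeping):

```python
import itertools
import math
import random

def amplified_clique_search(adj, rho, eps, seed=0):
    """Sketch of the amplified GGR loop: candidate cores U (subsets of a
    small sample S with enough common neighbors) are tested on vertex
    samples V' of doubling size k, with the biased-coin threshold
    schedule 0.9 - i/(10 log n); if every U looks bad, the whole attempt
    restarts with a fresh S."""
    rng = random.Random(seed)
    V = list(adj)
    n = len(V)
    logn = max(1.0, math.log2(n))
    target = int(rho * n)

    def frac_nbrs(U, pool, v):
        # Fraction of pool-vertices connected to all of U that neighbor v.
        conn = [w for w in pool
                if w != v and all(u in adj[w] for u in U if u != w)]
        return sum(w in adj[v] for w in conn) / len(conn) if conn else 0.0

    while True:
        S = rng.sample(V, min(n, int(100 / eps ** 2)))
        cores = [U for r in range(len(S) + 1)
                 for U in itertools.combinations(S, r)
                 if sum(all(u in adj[v] for u in U if u != v) for v in V)
                 >= target]
        k, i, ok = int(1000 * logn ** 2), 1, True
        while k < n:
            Vp = rng.sample(V, min(n, k))
            best = max((sum(frac_nbrs(U, Vp, v) for v in Vp) / len(Vp)
                        for U in cores), default=0.0)
            if best < 0.9 - i / (10 * logn):
                ok = False  # every core looked bad on this sample: restart
                break
            k, i = 2 * k, i + 1
        if not ok:
            continue
        # Final search among the sets defined by the surviving cores U.
        best_set, best_density = None, -1.0
        for U in cores:
            cand = [v for v in V if all(u in adj[v] for u in U if u != v)]
            if len(cand) < target:
                continue
            top = sorted(cand, key=lambda v: frac_nbrs(U, cand, v),
                         reverse=True)[:target]
            tset = set(top)
            edges = sum(len(adj[v] & tset) for v in top) / 2
            density = (edges / (target * (target - 1) / 2)
                       if target > 1 else 1.0)
            if density > best_density:
                best_set, best_density = tset, density
        return best_set, best_density
```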
Oblivious Verifier For the Amplified Algorithm
Main challenge: the algorithm branches depending on the input (if… then…), and the verifier can't follow.
Solution: the verifier tries all possible branches.
– The oblivious verifier is allowed to be inefficient!
– It maintains the set of inputs consistent with each branch; a branch is feasible if this set is non-empty.
– The final verifier checks: do all feasible branches succeed?
– Uses the low error probability per branch.
Summary: Derandomization From Sketching
Requires:
– A randomized algorithm with low error (the biased-coin protocol can help with that).
– A sketch of the input, and an oblivious verifier that, given the sketch & randomness, decides whether the algorithm succeeds.
Yields non-uniform deterministic algorithms; these can sometimes be converted to uniform ones.
Applications in the paper: approximate clique, max-cut on dense graphs, free games (constraint satisfaction problems on dense graphs), Reed-Muller list decoding to unique decoding.
More??