RANDOMNESS AND PSEUDORANDOMNESS Omer Reingold, Microsoft Research and Weizmann

Randomness and Pseudorandomness
- When randomness is useful
- When randomness can be reduced or eliminated: derandomization
- Basic tool: pseudorandomness. An object is pseudorandom if it “looks random” (is indistinguishable from uniform), though it is not.
- Expander graphs

Randomness In Computation (1)
- Distributed computing (breaking symmetry)
- Cryptography: secrets, semantic security, …
- Sampling, simulations, …

Randomness In Computation (2)
- Communication complexity (e.g., testing equality)
- Routing (on the cube [Valiant]): drastically reduces congestion

Randomness In Computation (3)
- In algorithms: a useful design tool, but often one can derandomize (e.g., PRIMES is in P). Is that always the case?
- BPP = P means that every randomized algorithm can be derandomized with only a polynomial increase in time.
- RL = L means that every randomized algorithm can be derandomized with only a constant-factor increase in memory.

In Distributed Computing
Byzantine agreement (“attack now” vs. “don’t attack”), tolerating t failures:

              Deterministic   Randomized
Synchronous   t+1 rounds      O(1) rounds
Asynchronous  impossible      possible

Dining Philosophers: breaking symmetry.

Randomness Saves Communication
Is a remote copy identical to the original file?
- Deterministic: need to send the entire file!
- Randomness in the sky (shared randomness): O(1) bits (or log(1/error) bits)
- Private randomness: a logarithmic number of bits (derandomization)
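To make the shared-randomness protocol concrete, here is a minimal Python sketch of equality testing by fingerprinting: each party evaluates its file, viewed as a polynomial over a fixed prime field, at a shared random point, and only the short fingerprints are exchanged. The particular prime, the names, and the error bound in the comments are illustrative assumptions, not from the talk.

```python
import random

P = (1 << 61) - 1  # a fixed Mersenne prime used as the fingerprint modulus

def fingerprint(data: bytes, r: int) -> int:
    """Evaluate the polynomial whose coefficients are the bytes of `data`
    at the shared random point r, modulo the prime P (Horner's rule)."""
    acc = 0
    for b in data:
        acc = (acc * r + b) % P
    return acc

# Alice and Bob share the random point r; instead of the whole file,
# they exchange only O(log P)-bit fingerprints.
r = random.randrange(1, P)
original = b"the original file contents"
copy = b"the original file contents"
print(fingerprint(original, r) == fingerprint(copy, r))  # equal files always agree
# Distinct files collide only if r is a root of their (nonzero) difference
# polynomial: probability at most len(data) / P.
```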

In Cryptography
- Private keys: no randomness means no secrets and no identities.
- Encryption: two encryptions of the same message with the same key must be different.
- Randomized (interactive) proofs give rise to wonderful new notions: zero-knowledge, PCPs, …

Random Walks and Markov Chains
When in doubt, flip a coin:
- Explore a graph with minimal memory
- PageRank: the stationary distribution of a Markov chain (see the sketch below)
- Sampling vs. approximate counting; estimating the size of the web
- Simulations of physical systems, …
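To illustrate the PageRank bullet, a small power-iteration sketch for the stationary distribution of the PageRank chain (follow a random outlink with probability 0.85, otherwise jump to a uniformly random page). The toy link structure and the names are assumptions, and every page is assumed to have at least one outlink.

```python
def pagerank(links, damping=0.85, iters=50):
    """Power iteration for the stationary distribution of the PageRank
    Markov chain over pages 0..n-1; links[i] lists page i's outlinks."""
    n = len(links)
    rank = [1.0 / n] * n
    for _ in range(iters):
        new = [(1.0 - damping) / n] * n      # uniform random jump
        for page, outs in enumerate(links):
            share = damping * rank[page] / len(outs)
            for q in outs:                   # follow a random outlink
                new[q] += share
        rank = new
    return rank

links = [[1, 2], [2], [0], [0, 2]]           # a tiny 4-page web
print(pagerank(links))                       # pages 0 and 2 get the most weight
```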

Shake Your Input
- Communication network (the n-dimensional cube): every deterministic routing scheme will incur exponentially busy links (in the worst case). Valiant: to send a message from x to y, select a node z at random and route x → z → y. Now: O(1) expected load on every edge.
- Another example: randomized quicksort (see the sketch below).
- Smoothed analysis: small perturbations, big impact.
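A minimal sketch of the randomized-quicksort example: picking the pivot uniformly at random gives expected O(n log n) comparisons on every input, so no fixed input ordering is bad.

```python
import random

def randomized_quicksort(a):
    """Quicksort with a uniformly random pivot: the adversary cannot
    choose a bad input, because the algorithm 'shakes' itself."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return randomized_quicksort(less) + equal + randomized_quicksort(greater)

print(randomized_quicksort([5, 3, 8, 1, 9, 2, 7]))  # [1, 2, 3, 5, 7, 8, 9]
```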

In Private Data Analysis
Hide the presence/absence of any individual. Example query: how many people in the database have the BC1 gene?
Add random noise to the true answer, distributed as Lap(Δ/ε). [Figure: the Laplace density at scale Δ/ε; shifting it by the query's sensitivity changes each density value by only a bounded ratio.]
More questions? More privacy? Need more noise.
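A sketch of the Laplace mechanism described on this slide, assuming a counting query (one person changes the answer by at most 1, so the sensitivity is 1); the sampler and the function names are illustrative.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from Lap(scale) by inverse-transform sampling."""
    u = random.random() - 0.5               # uniform on [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release true_count + Lap(sensitivity/epsilon).  Shifting the true
    answer by the sensitivity changes each output's density by a factor
    of at most e^epsilon -- the bounded ratio hides any individual."""
    return true_count + laplace_noise(sensitivity / epsilon)

print(private_count(true_count=42, epsilon=0.5))  # a noisy answer near 42
```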

Randomness and Pseudorandomness
- When randomness is useful
- When randomness can be reduced or eliminated: derandomization
- Basic tool: pseudorandomness. An object is pseudorandom if it “looks random” (is indistinguishable from uniform), though it is not.
- Expander graphs

Cryptography: Good Pseudorandom Generators are Crucial
- With them, we have the one-time pad (and more): encryption computes ciphertext = plaintext ⊕ K, and decryption recovers the plaintext as ciphertext ⊕ K, where the derived key K is stretched by a pseudorandom generator from a short truly random key K0 (e.g., 110…).
- Without them, keys are bad and algorithms are worthless (theoretically & practically).
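A sketch of the one-time-pad idea with a short key stretched by a toy hash-based generator standing in for a real pseudorandom generator; this is purely illustrative, not a vetted cipher, and all names are assumptions.

```python
import hashlib
import secrets

def toy_prg(seed: bytes, n: int) -> bytes:
    """Stretch a short seed to n bytes by hashing seed || counter.
    A toy stand-in for a proper pseudorandom generator / stream cipher."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

plaintext = b"plaintext data"
k0 = secrets.token_bytes(16)             # short truly random key K0
K = toy_prg(k0, len(plaintext))          # derived key K, one byte per message byte
ciphertext = xor_bytes(plaintext, K)     # E: ciphertext = plaintext XOR K
assert xor_bytes(ciphertext, K) == plaintext  # D: plaintext = ciphertext XOR K
```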

Data Structures & Hash Functions
Linear probing: store each key (Alice, Mike, Bob, Mary, …) at slot F(key); on a collision, walk forward to the next free slot.
- If F is random, then insertion time and query time are O(1) (in expectation).
- But where do you store a random function?!? Derandomize!
- Heuristic: use SHA1, MD4, …
- Recently (2007): 5-wise independent functions are sufficient*
- Similar considerations all over: Bloom filters, cuckoo hashing, bit vectors, …
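A minimal linear-probing sketch in Python (fixed capacity, no resizing or deletion); the salted built-in hash below merely stands in for the hash family F, and the slide's 2007 point is that 5-wise independence already suffices for O(1) expected operations, so nothing close to a fully random F is needed.

```python
import random

class LinearProbingTable:
    """Minimal linear-probing hash table: hash to a slot, then walk right
    (with wraparound) past occupied slots.  Fixed capacity, no deletion."""

    def __init__(self, capacity: int = 64):
        self.keys = [None] * capacity
        self.vals = [None] * capacity
        self.salt = random.getrandbits(64)   # randomizes F per table

    def _slot(self, key) -> int:
        i = hash((self.salt, key)) % len(self.keys)
        while self.keys[i] is not None and self.keys[i] != key:
            i = (i + 1) % len(self.keys)     # probe the next slot
        return i

    def insert(self, key, value) -> None:
        i = self._slot(key)
        self.keys[i], self.vals[i] = key, value

    def get(self, key):
        i = self._slot(key)
        return self.vals[i] if self.keys[i] == key else None

t = LinearProbingTable()
for name in ["Alice", "Mike", "Bob", "Mary"]:
    t.insert(name, len(name))
print(t.get("Bob"))   # -> 3
```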

Weak Sources & Randomness Extractors
- Available random bits are biased and correlated.
- Von Neumann sources: b1 b2 … bi … are i.i.d. 0/1 variables with bi = 1 with some probability p < 1; then translate pairs, outputting 0 for 01 and 1 for 10 (discarding 00 and 11), to obtain unbiased bits.
- Randomness extractors produce randomness from general weak sources, and have many other applications.
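A sketch of the von Neumann extractor for the source just described; the simulation parameters are illustrative.

```python
import random

def von_neumann_extract(bits):
    """Read the biased i.i.d. stream in pairs: output 0 for 01, 1 for 10,
    and discard 00 and 11.  Both surviving outcomes have probability
    p*(1-p), so the output bits are unbiased whatever the bias p is."""
    return [b1 for b1, b2 in zip(bits[::2], bits[1::2]) if b1 != b2]

biased = [1 if random.random() < 0.9 else 0 for _ in range(100000)]
unbiased = von_neumann_extract(biased)
print(sum(unbiased) / len(unbiased))   # close to 0.5 despite the 0.9 bias
```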

Algorithms: Can Randomness Save Time or Memory?
- Conjecture: no* (*moderate overheads may still apply)
- Examples of derandomization: primality testing in polynomial time; graph connectivity in logarithmic memory.
- Holdouts: identity testing, approximation algorithms, …

(Bipartite) Expander Graphs
A bipartite graph with N vertices on each side and left-degree D such that |Γ(S)| ≥ A·|S| (A > 1) for every S with |S| ≤ K.
Important: every (not too large) set expands.

(Bipartite) Expander Graphs
|Γ(S)| ≥ A·|S| (A > 1) for every S with |S| ≤ K.
- Main goal: minimize D (i.e., constant D).
- Random degree-3 graphs are expanders! [Pin73]

(Bipartite) Expander Graphs
|Γ(S)| ≥ A·|S| (A > 1) for every S with |S| ≤ K. Also: maximize A.
- Trivial upper bound: A ≤ D; in fact even A ≲ D−1.
- Random graphs: A ≈ D−1.
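To make the definition of the last three slides concrete, here is a brute-force sketch that measures the expansion of a small random left-regular bipartite graph; it is exponential in N, so the parameters are deliberately tiny and purely illustrative.

```python
import itertools
import random

def expansion(neighbors, K):
    """Return A = min over nonempty S with |S| <= K of |Gamma(S)| / |S|.
    Brute force over all subsets -- feasible only for tiny graphs."""
    N = len(neighbors)
    best = float("inf")
    for size in range(1, K + 1):
        for S in itertools.combinations(range(N), size):
            gamma = set().union(*(neighbors[v] for v in S))
            best = min(best, len(gamma) / size)
    return best

# Random left-regular bipartite graph: N left vertices, N right, degree D.
N, D, K = 12, 3, 4
neighbors = [set(random.sample(range(N), D)) for _ in range(N)]
print(expansion(neighbors, K))   # typically > 1, echoing [Pin73] for degree 3
```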

Applications of Expanders
These “innocent” looking objects are intimately related to various fundamental problems:
- Network design (fault tolerance)
- Sorting networks
- Complexity and proof theory
- Derandomization
- Error-correcting codes
- Cryptography
- Ramsey theory
- And more...

Non-blocking Network with On-line Path Selection [ALM]
N inputs, N outputs; depth O(log N), size O(N log N), bounded degree. Allows connections between input nodes and output nodes via vertex-disjoint paths.

Non-blocking Network with On-line Path Selection [ALM]
Every request for connection (or disconnection) is satisfied in O(log N) bit steps, on line; many requests are handled in parallel.

The Network
[Figure: a layered network with N inputs and N outputs, built from “lossless” expanders.]

Slightly Unbalanced, “Lossless” Expanders
N left vertices, M = εN right vertices (0 < ε ≤ 1 an arbitrary constant), left-degree D; |Γ(S)| ≥ 0.9·D·|S| for every S with |S| ≤ K.
Here D is constant and K = Ω(M/D) = Ω(εN/D).
[CRVW 02]: such expanders, explicitly (with D = polylog(1/ε)).

Property 1: A Very Strong Unique-Neighbor Property
For every S with |S| ≤ K: |Γ(S)| ≥ 0.9·D·|S| implies S has ≥ 0.8·D·|S| unique neighbors (neighbors adjacent to exactly one vertex of S)! Counting the D·|S| edges leaving S: each non-unique neighbor absorbs at least two of them, so #unique ≥ 2·|Γ(S)| − D·|S| ≥ 0.8·D·|S|.

Using Unique Neighbors for Distributed Routing
Task: match S to its neighbors (|S| ≤ K).
Step I: match each vertex of S to one of its unique neighbors.
Continue recursively with the set S’ of unmatched vertices. (A sketch follows below.)
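A toy sketch of this recursive matching strategy, assuming the bipartite graph expands enough that unmatched vertices keep finding unique neighbors; the data layout and names are illustrative.

```python
from collections import Counter

def match_via_unique_neighbors(neighbors, S):
    """In each round, every unmatched vertex of S grabs a *unique*
    neighbor (a right vertex currently seen by exactly one unmatched
    vertex); on a lossless expander most of S succeeds per round,
    so O(log |S|) rounds suffice."""
    matching, used, S = {}, set(), set(S)
    while S:
        seen = Counter(u for v in S for u in neighbors[v] if u not in used)
        round_matches = {}
        for v in S:
            for u in neighbors[v]:
                if u not in used and seen[u] == 1:
                    round_matches[v] = u      # no conflicts: only v sees u
                    break
        if not round_matches:
            return None                       # not enough expansion to progress
        matching.update(round_matches)
        used.update(round_matches.values())
        S -= round_matches.keys()
    return matching
```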

Reminder: The Network Adding new paths: think of vertices used by previous paths as faulty.

Property 2: Incredibly Fault Tolerant
For every S with |S| ≤ K: |Γ(S)| ≥ 0.9·D·|S|. The graph remains a lossless expander even if an adversary removes 0.7·D of the edges at each vertex.

Simple Expander Codes [G63, Z71, ZP76, T81, SS96]
N variables, M = εN parity checks. A linear code with rate 1 − M/N = 1 − ε and minimum distance ≥ K. Relative distance ≥ K/N = Ω(ε/D) = ε/polylog(1/ε). For small ε this beats the Zyablov bound and is quite close to the Gilbert–Varshamov bound of ε/log(1/ε).

Simple Decoding Algorithm in Linear Time (& log n parallel phases) [SS96]
N variables, M = εN constraints; error set B, |B| ≤ K/2.
Algorithm: at each phase, flip every variable that “sees” a majority of 1’s (i.e., unsatisfied constraints).
Analysis: |Γ(B)| > 0.9·D·|B| while |Γ(B) ∩ Sat| < 0.2·D·|B|, which forces |Flip \ B| ≤ |B|/4 and |B \ Flip| ≤ |B|/4, hence |B_new| ≤ |B|/2.
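A sketch of the flip decoder, with each parity check given as the set of variable indices it touches; on a good enough lossless expander and with |B| ≤ K/2, the error set halves each phase as in the analysis above. The data layout and names are assumptions.

```python
def flip_decode(checks, word, max_phases=50):
    """Bit-flip decoding in the style of [SS96]: in each parallel phase,
    flip every variable for which a majority of its checks are unsatisfied."""
    word = list(word)
    checks_of = {}                            # variable -> incident checks
    for c, row in enumerate(checks):
        for v in row:
            checks_of.setdefault(v, []).append(c)
    for _ in range(max_phases):
        unsat = {c for c, row in enumerate(checks)
                 if sum(word[v] for v in row) % 2 == 1}
        flips = [v for v, cs in checks_of.items()
                 if 2 * sum(c in unsat for c in cs) > len(cs)]
        if not flips:
            break                             # all checks satisfied, or stuck
        for v in flips:
            word[v] ^= 1
    return word

# Toy example: the code {000, 111} with checks x0+x1, x1+x2, x0+x2.
print(flip_decode([[0, 1], [1, 2], [0, 2]], [0, 1, 0]))   # -> [0, 0, 0]
```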

Random Walk on Expanders [AKS 87]
x0 → x1 → x2 → … → xi …: the distribution of xi converges to uniform fast (for an arbitrary x0). For a random x0, the sequence x0, x1, x2, … has interesting “random-like” properties.
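A small simulation sketch of the convergence claim; the 3-regular circulant below is only a crude stand-in for an expander, and all parameters are illustrative.

```python
import random

def walk_distribution(neighbors, x0, steps, trials=20000):
    """Empirical distribution of x_steps over random walks started at x0."""
    counts = [0] * len(neighbors)
    for _ in range(trials):
        v = x0
        for _ in range(steps):
            v = random.choice(neighbors[v])
        counts[v] += 1
    return [c / trials for c in counts]

# An 8-vertex 3-regular circulant (edges to v-1, v+1, and v+4 mod 8).
N = 8
neighbors = [[(v - 1) % N, (v + 1) % N, (v + N // 2) % N] for v in range(N)]
dist = walk_distribution(neighbors, x0=0, steps=10)
print(max(abs(p - 1 / N) for p in dist))   # shrinks as `steps` grows
```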

Thanks

Expander Graphs
Sparse graphs that are highly connected: |Γ(S)| ≥ ¾·D·|S| for every S with |S| ≤ K = N/10.
Some useful properties:
- Random walks are rapidly mixing
- Most outgoing edges are “unique” (great for routing)
- Very fault tolerant

More Expander Applications: Non-Blocking Networks and More
Requests for connection (or disconnection) are satisfied, on line, in very few steps; many requests are handled in parallel.