Randomness Conductors: Expander Graphs, Randomness Extractors, Condensers, Universal Hash Functions

Presentation transcript:

Randomness Conductors: Expander Graphs, Randomness Extractors, Condensers, Universal Hash Functions

Randomness Conductors: Meta-Definition
A bipartite graph with N left vertices, M right vertices, and left-degree D. A probability distribution X on the left induces a distribution X′ on the right: pick x ∼ X, then move to a uniformly random neighbor x′ of x.
An R-conductor: for every (k, k′) ∈ R, if X has ≥ k bits of entropy then X′ has ≥ k′ bits of entropy.
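Written out as a formula (a restatement of the slide, with the relation R and the entropy measure left abstract, and using the function view f : [N] × [D] → [M] made explicit at the end of the talk):

\[
f \text{ is an } R\text{-conductor} \iff \forall (k,k') \in R:\; H(X) \ge k \;\Longrightarrow\; H\!\left(f(X, U_{[D]})\right) \ge k',
\]

where U_{[D]} is a uniformly random edge label (neighbor index) independent of X.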

Plan
Definitions & Applications:
The balanced case (M = N).
– Vertex expansion.
– 2nd eigenvalue expansion.
The unbalanced case (M ≪ N).
– Extractors, Dispersers, Condensers.
Conductors & Universal Hash Functions.
Constructions: Zigzag Product & Lossless Expanders.

(Bipartite) Expander Graphs
A bipartite graph with N vertices on each side and left-degree D.
|Γ(S)| ≥ A·|S| (A > 1) for every left set S with |S| ≤ K.
Important: every (not too large) set expands.
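To make the definition concrete, here is a small sketch (not from the slides): build a random left-D-regular bipartite graph and brute-force the worst ratio |Γ(S)|/|S| over all left sets of size at most K. Function names and parameter choices are illustrative.

```python
import itertools
import random

def random_bipartite(n, d, seed=0):
    """Left-regular bipartite graph: each of the n left vertices gets d random right neighbors."""
    rng = random.Random(seed)
    return [set(rng.randrange(n) for _ in range(d)) for _ in range(n)]

def worst_expansion(neighbor_sets, k_max):
    """Min over nonempty left sets S with |S| <= k_max of |Gamma(S)| / |S| (brute force; small n only)."""
    n = len(neighbor_sets)
    worst = float("inf")
    for size in range(1, k_max + 1):
        for S in itertools.combinations(range(n), size):
            gamma = set().union(*(neighbor_sets[v] for v in S))
            worst = min(worst, len(gamma) / size)
    return worst

if __name__ == "__main__":
    G = random_bipartite(n=20, d=5, seed=1)
    print("worst |Gamma(S)|/|S| over |S| <= 3:", worst_expansion(G, 3))
```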

(Bipartite) Expander Graphs
|Γ(S)| ≥ A·|S| (A > 1) for every S with |S| ≤ K.
Main goal: minimize D (i.e., constant D).
Degree-3 random graphs are expanders! [Pin73]

(Bipartite) Expander Graphs
|Γ(S)| ≥ A·|S| (A > 1) for every S with |S| ≤ K.
Also: maximize A.
Trivial upper bound: A ≤ D; in fact A ≤ D − 1.
Random graphs: A ≈ D − 1.

Applications of Expanders
These innocent objects are intimately related to various fundamental problems:
– Network design (fault tolerance),
– Sorting networks,
– Complexity and proof theory,
– Derandomization,
– Error-correcting codes,
– Cryptography,
– Ramsey theory,
and more...

Non-blocking Network with On-line Path Selection [ALM]
N inputs, N outputs; depth O(log N), size O(N log N), bounded degree.
Allows connections between input nodes and output nodes using vertex-disjoint paths.

Non-blocking Network with On-line Path Selection [ALM]
Every request for connection (or disconnection) is satisfied in O(log N) bit steps:
– On line.
– Handles many requests in parallel.

The Network
(Figure: the network on N inputs, built from lossless expanders.)

Slightly Unbalanced, Lossless Expanders
A bipartite graph with N left vertices, M = εN right vertices, and left-degree D, where 0 < ε ≤ 1 is an arbitrary constant.
|Γ(S)| ≥ 0.9·D·|S| for every S with |S| ≤ K.
D is constant & K = Ω(M/D) = Ω(εN/D).
[CRVW 02]: explicit constructions of such expanders (with D = polylog(1/ε)).

Property 1: A Very Strong Unique Neighbor Property
∀ S with |S| ≤ K: |Γ(S)| ≥ 0.9·D·|S| ⟹ S has ≥ 0.8·D·|S| unique neighbors!
(A unique neighbor of S is a right vertex with exactly one edge into S; a non-unique neighbor has at least two.)
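The counting behind this implication is short; spelled out (a standard argument, not on the slide):

\[
u + m = |\Gamma(S)| \ge 0.9\,D|S|, \qquad u + 2m \le D|S|,
\]

where u is the number of unique neighbors of S and m the number of non-unique ones (the second inequality counts the D·|S| edges leaving S). Subtracting gives m ≤ 0.1·D·|S|, hence u ≥ 0.9·D·|S| − m ≥ 0.8·D·|S|.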

Using Unique Neighbors for Distributed Routing
Task: match S to its neighbors (|S| ≤ K).
Step I: match each vertex of S to one of its unique neighbors.
Continue recursively with the set S′ of unmatched vertices.
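A sequential sketch of this recursion (illustrative only; it assumes `neighbor_sets[v]` gives the right-neighbors of left vertex v; in a lossless expander each round matches most of the remaining vertices, so only O(log |S|) rounds are needed):

```python
def match_via_unique_neighbors(S, neighbor_sets):
    """Greedily match left vertices in S to right vertices, one round per recursion level.

    In each round, every vertex of S that has a unique neighbor (a right vertex seen by
    exactly one member of S) is matched to one such neighbor; the rest recurse.
    """
    matching = {}
    S = set(S)
    while S:
        # Count how many members of the current S see each right vertex.
        seen_by = {}
        for v in S:
            for w in neighbor_sets[v]:
                seen_by[w] = seen_by.get(w, 0) + 1
        used = set(matching.values())
        unmatched = set()
        for v in S:
            unique = [w for w in neighbor_sets[v]
                      if seen_by[w] == 1 and w not in used]
            if unique:
                matching[v] = unique[0]
                used.add(unique[0])
            else:
                unmatched.add(v)
        if unmatched == S:   # no progress (cannot happen for a lossless expander with |S| <= K)
            break
        S = unmatched
    return matching
```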

Reminder: The Network
Adding new paths: think of vertices used by previous paths as faulty.

Property 2: Incredibly Fault Tolerant
∀ S with |S| ≤ K: |Γ(S)| ≥ 0.9·D·|S| ⟹ the graph remains a lossless expander even if an adversary removes up to 0.7·D edges from each vertex.

Simple Expander Codes [G63, Z71, ZP76, T81, SS96]
N variables, M = εN parity checks (one per right vertex).
Linear code; rate ≥ 1 − M/N = 1 − ε.
Minimum distance ≥ K; relative distance ≥ K/N = Ω(ε/D) = ε / polylog(1/ε).
For small ε this beats the Zyablov bound and is quite close to the Gilbert–Varshamov bound of Ω(ε / log(1/ε)).
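Where these parameters come from (a standard argument, filled in for completeness): the code is the set of assignments to the N variables satisfying all M = εN parity checks, so

\[
\dim \ge N - M = (1-\varepsilon)N \quad\Longrightarrow\quad \text{rate} \ge 1-\varepsilon.
\]

For the distance, suppose a nonzero codeword is supported on a set B of at most K variables; by the unique neighbor property some parity check touches B exactly once, so that check is violated, which is a contradiction. Hence every nonzero codeword has weight > K.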

Simple Decoding Algorithm in Linear Time (& log n parallel phases) [SS96]
Error set B, |B| ≤ K/2.
Algorithm: at each phase, flip every variable that sees a majority of unsatisfied constraints.
Analysis: |Γ(B)| > 0.9·D·|B| and |Γ(B) ∩ Sat| < 0.2·D·|B|, which gives |Flip \ B| ≤ |B|/4 and |B \ Flip| ≤ |B|/4, so |B_new| ≤ |B|/2.
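A minimal sketch of the flip decoder (assumed representation: `checks[c]` lists the variables in parity check c and `var_checks[v]` the checks containing variable v; this illustrates the phase structure rather than a tuned implementation):

```python
import math

def flip_decode(word, checks, var_checks, max_phases=None):
    """Iteratively flip every bit that sees more unsatisfied than satisfied checks.

    word       : list of 0/1 values, one per variable (a modified copy is returned)
    checks     : list of variable-index lists, one per parity check
    var_checks : for each variable, the indices of the checks containing it
    """
    word = list(word)
    n = len(word)
    if max_phases is None:
        max_phases = max(1, math.ceil(math.log2(n)) + 1)   # O(log n) phases suffice
    for _ in range(max_phases):
        unsat = [sum(word[v] for v in check) % 2 == 1 for check in checks]
        to_flip = [v for v in range(n)
                   if sum(unsat[c] for c in var_checks[v]) * 2 > len(var_checks[v])]
        if not to_flip:
            break
        for v in to_flip:    # flip all selected bits "in parallel"
            word[v] ^= 1
    return word
```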

Random Walk on Expanders [AKS 87]
x_0 → x_1 → x_2 → … → x_i
The distribution of x_i converges to uniform fast (for arbitrary x_0).
For a random x_0: the sequence x_0, x_1, x_2, … has interesting random-like properties.
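A small simulation sketch (illustrative, not from the talk): build a random regular multigraph from a few random permutations (an expander with high probability) and watch the total-variation distance of the walk's distribution from uniform shrink with each step.

```python
import numpy as np

def random_regular_transition(n, half_degree, seed=0):
    """Random-walk transition matrix of an undirected (2*half_degree)-regular multigraph
    built from half_degree random permutations (an expander with high probability)."""
    rng = np.random.default_rng(seed)
    d = 2 * half_degree
    P = np.zeros((n, n))
    for _ in range(half_degree):
        perm = rng.permutation(n)
        for v in range(n):
            P[v, perm[v]] += 1.0 / d   # undirected edge v -- perm[v]
            P[perm[v], v] += 1.0 / d
    return P

def tv_from_uniform(dist):
    """Total-variation distance between dist and the uniform distribution."""
    n = len(dist)
    return 0.5 * np.abs(dist - 1.0 / n).sum()

if __name__ == "__main__":
    n = 1024
    P = random_regular_transition(n, half_degree=4)   # an 8-regular graph
    dist = np.zeros(n)
    dist[0] = 1.0                                     # start from an arbitrary x_0
    for t in range(21):
        print(f"step {t:2d}: TV distance to uniform = {tv_from_uniform(dist):.4f}")
        dist = dist @ P
```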

Expanders Add Entropy
Definition we gave: |Support(X′)| ≥ A·|Support(X)|, where X′ is the distribution induced on the right side by a distribution X on the left.
Applications of the random walk rely on less naïve measures of entropy.
Almost all explicit constructions directly give 2nd eigenvalue expansion, which can be interpreted in terms of Rényi entropy.

2nd Eigenvalue Expansion
G: undirected, D-regular graph on N vertices.
P = (P_{i,j}): the transition probability matrix, P_{i,j} = (# edges between i and j in G) / D; P is a symmetric N × N matrix.
Goal: if π ∈ [0,1]^N is a (non-uniform) distribution on the vertices of G, then πP is closer to uniform.

2nd Eigenvalue Expansion
λ_0 ≥ λ_1 ≥ … ≥ λ_{N−1}: the eigenvalues of P.
– λ_0 = 1, with corresponding eigenvector v_0 = 1^N: P(Uniform) = Uniform.
– Second eigenvalue (in absolute value): λ = λ(G) = max{|λ_1|, |λ_{N−1}|}.
– G connected and non-bipartite ⟺ λ < 1.
– λ is a good measure of the expansion of G [Tan84, AM84, Alo86].
Qualitatively: G is an expander ⟺ λ(G) ≤ β for some constant β < 1.
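A sketch for computing λ(G) numerically with numpy (assumes an undirected D-regular graph given by an adjacency matrix that counts parallel edges; names are illustrative):

```python
import numpy as np

def second_eigenvalue(adjacency, degree):
    """lambda(G) = max(|lambda_1|, |lambda_{N-1}|) of P = adjacency / degree."""
    P = np.asarray(adjacency, dtype=float) / degree
    eigs = np.linalg.eigvalsh(P)          # real eigenvalues of the symmetric matrix P
    eigs = np.sort(np.abs(eigs))[::-1]    # sort by absolute value, descending
    return eigs[1]                        # skip the trivial eigenvalue lambda_0 = 1

if __name__ == "__main__":
    # 3-regular complete bipartite graph K_{3,3}: bipartite, so lambda(G) = 1 (not an expander).
    A = np.block([[np.zeros((3, 3)), np.ones((3, 3))],
                  [np.ones((3, 3)), np.zeros((3, 3))]])
    print(second_eigenvalue(A, degree=3))   # prints 1.0
```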

Randomness Conductors
Expanders, extractors, condensers & hash functions are all functions f : [N] × [D] → [M] that transform:
X of entropy k ⟼ X′ = f(X, Uniform) of entropy k′.
Many flavors:
– Measure of entropy.
– Balanced vs. unbalanced.
– Lossless vs. lossy.
– Lower vs. upper bound on k.
– Is X′ close to uniform?
– …
Randomness conductors: as in extractors, but allowing the entire spectrum.
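As one concrete member of this family, here is a sketch of a universal hash family viewed as such a function f : [N] × [D] → [M]; the multiply-add family modulo a prime is one standard choice, and all names and parameters below are illustrative.

```python
import random

# f : [N] x [D] -> [M], where the seed in [D] encodes a hash function from the
# universal family h_{a,b}(x) = ((a*x + b) mod p) mod M.

P = (1 << 61) - 1          # a Mersenne prime larger than N

def conduct(x, seed, m):
    """Apply the hash function selected by seed = (a, b) to x, landing in {0, ..., m-1}."""
    a, b = seed
    return ((a * x + b) % P) % m

def random_seed(rng):
    """A uniformly random element of [D]: here D = (P - 1) * P choices of (a, b)."""
    return rng.randrange(1, P), rng.randrange(P)

if __name__ == "__main__":
    rng = random.Random(0)
    m = 1 << 10
    # A weak source: X is uniform over an arbitrary set of 2**12 elements of [N].
    source = [rng.randrange(1 << 40) for _ in range(1 << 12)]
    outputs = {conduct(x, random_seed(rng), m) for x in source}
    print(f"{len(outputs)} distinct outputs out of {m} buckets")  # most buckets are hit when the output is close to uniform
```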