1
Randomness Conductors: Expander Graphs, Randomness Extractors, Condensers, Universal Hash Functions, ...
2
Randomness Conductors: Meta-Definition. A bipartite graph with N left vertices, M right vertices, and left-degree D is an R-conductor if for every (k, k') ∈ R: whenever the input distribution X has ≥ k bits of entropy, the induced output distribution X' has ≥ k' bits of entropy. (Figure: a left vertex x drawn from distribution X, mapped along an edge to a right vertex x' with induced distribution X'.)
3
Plan
Definitions & Applications:
– The balanced case (M = N): vertex expansion; 2nd-eigenvalue expansion.
– The unbalanced case (M ≪ N): extractors, dispersers, condensers; conductors; universal hash functions.
Constructions: zigzag product & lossless expanders.
4
(Bipartite) Expander Graphs. |Γ(S)| ≥ A·|S| (A > 1) for every S with |S| ≤ K. Important: every (not too large) set expands. (Figure: a bipartite graph, N vertices on each side, left-degree D.)
5
(Bipartite) Expander Graphs. |Γ(S)| ≥ A·|S| (A > 1) for every S with |S| ≤ K. Main goal: minimize D (i.e., constant D). Degree-3 random graphs are expanders! [Pin73] (Figure: a bipartite graph, N vertices on each side, left-degree D.)
6
(Bipartite) Expander Graphs. |Γ(S)| ≥ A·|S| (A > 1) for every S with |S| ≤ K. Also: maximize A. Trivial upper bound: A ≤ D; in fact A ≤ D − 1. Random graphs: A ≈ D − 1. (Figure: a bipartite graph, N vertices on each side, left-degree D.)
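To make the vertex-expansion definition concrete, here is a small Python sketch (not from the slides) that samples a random left-D-regular bipartite graph and measures |Γ(S)|/|S| over all small sets S; the function names and parameters are illustrative.

```python
import random
from itertools import combinations

def random_left_regular_graph(n, m, d, seed=0):
    """Each of the n left vertices gets d independently chosen right neighbors (out of m)."""
    rng = random.Random(seed)
    return {v: [rng.randrange(m) for _ in range(d)] for v in range(n)}

def expansion(graph, subset):
    """Return |Gamma(S)| / |S| for a set S of left vertices."""
    neighbors = set()
    for v in subset:
        neighbors.update(graph[v])
    return len(neighbors) / len(subset)

# Check the expansion of every set of size at most 3 in a small random degree-3 graph.
G = random_left_regular_graph(n=40, m=40, d=3)
worst = min(expansion(G, S) for k in (1, 2, 3) for S in combinations(range(40), k))
print(f"worst expansion over small sets: {worst:.2f}")  # typically larger than 1
```

For real parameters one only needs expansion for |S| ≤ K, and exhaustive checking is exponential, which is one reason explicit constructions matter.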
7
Applications of Expanders. These innocent objects are intimately related to various fundamental problems: network design (fault tolerance), sorting networks, complexity and proof theory, derandomization, error-correcting codes, cryptography, Ramsey theory, and more...
8
Non-blocking Network with On-line Path Selection [ALM]. N inputs, N outputs. Depth O(log N), size O(N log N), bounded degree. Allows connections between input nodes and output nodes along vertex-disjoint paths.
9
Non-blocking Network with On-line Path Selection [ALM]. N inputs, N outputs. Every request for connection (or disconnection) is satisfied in O(log N) bit-steps: on-line, and handling many requests in parallel.
10
The Network. (Figure: the network, built from lossless expanders, with N inputs.)
11
Slightly Unbalanced, Lossless Expanders. |Γ(S)| ≥ 0.9·D·|S| for every S with |S| ≤ K. N left vertices, M = εN right vertices, left-degree D, where 0 < ε ≤ 1 is an arbitrary constant. D is constant & K = Ω(M/D) = Ω(εN/D). [CRVW 02]: such expanders exist (with D = polylog(1/ε)).
12
Property 1: A Very Strong Unique-Neighbor Property. For every S with |S| ≤ K: |Γ(S)| ≥ 0.9·D·|S| ⟹ S has ≥ 0.8·D·|S| unique neighbors! (A unique neighbor of S is a right vertex with exactly one edge into S. Counting: of the D·|S| edges leaving S, at most 0.1·D·|S| neighbors can receive more than one edge, so at least 0.9·D·|S| − 0.1·D·|S| = 0.8·D·|S| neighbors receive exactly one.)
13
Using Unique Neighbors for Distributed Routing. Task: match S to its neighbors (|S| ≤ K). Step I: match each vertex of S to one of its unique neighbors. Continue recursively with the set S' of still-unmatched vertices.
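The routing step on this slide can be phrased as a small greedy procedure; the sketch below (not from the slides) assumes the graph is given as a dict from left vertices to neighbor lists, and all names are illustrative.

```python
from collections import Counter

def match_via_unique_neighbors(graph, S):
    """Match each vertex of S to a distinct right neighbor by repeatedly
    matching vertices to neighbors that no other unmatched vertex touches,
    then recursing on the still-unmatched vertices."""
    matching, unmatched, used_right = {}, set(S), set()
    while unmatched:
        # How many unmatched vertices touch each still-unused right vertex?
        counts = Counter(u for v in unmatched for u in set(graph[v]) if u not in used_right)
        progress = False
        for v in list(unmatched):
            unique = [u for u in graph[v] if counts[u] == 1 and u not in used_right]
            if unique:
                matching[v] = unique[0]
                used_right.add(unique[0])
                unmatched.remove(v)
                progress = True
        if not progress:
            # Lossless expansion guarantees unique neighbors for every |S| <= K,
            # so reaching this branch means that assumption was violated.
            raise ValueError("no unique neighbors left")
    return matching
```

Since a lossless expander gives every small set 0.8·D·|S| unique neighbors, at least a constant fraction of S is matched in each round, so O(log |S|) rounds suffice.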
14
Reminder: The Network. Adding new paths: think of vertices used by previous paths as faulty.
15
Property 2: Incredibly Fault Tolerant. For every S with |S| ≤ K: |Γ(S)| ≥ 0.9·D·|S|. The graph remains a lossless expander even if an adversary removes 0.7·D edges from each vertex.
16
Simple Expander Codes [G63, Z71, ZP76, T81, SS96]. N variables, M = εN parity checks; each check is the parity (XOR) of its neighboring variables (figure). Linear code; rate ≥ 1 − M/N = 1 − ε. Minimum distance ≥ K. Relative distance ≥ K/N = Ω(ε/D) = ε/polylog(1/ε). For small ε this beats the Zyablov bound and is quite close to the Gilbert-Varshamov bound of ε/log(1/ε).
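A sketch (not from the slides) of the code-membership check implied by this construction, assuming the bipartite graph is a dict mapping each variable to its parity checks; names are illustrative.

```python
def is_codeword(graph, m, word):
    """The expander code is the set of 0/1 words in which every parity check
    (right vertex) sees an even number of 1s among its neighboring variables."""
    parity = [0] * m
    for v, checks in graph.items():
        if word[v]:
            for c in checks:
                parity[c] ^= 1
    return not any(parity)

# Rate: n variables constrained by m = eps*n parity checks leaves
# at least n - m = (1 - eps)*n free bits, matching the rate bound above.
```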
17
Simple Decoding Algorithm in Linear Time (& O(log n) parallel phases) [SS96]. N variables, M = εN constraints. Error set B, |B| ≤ K/2. Algorithm: in each phase, flip every variable that sees a majority of 1s (i.e., unsatisfied constraints). Analysis: |Γ(B)| > 0.9·D·|B| ⟹ |Γ(B) ∩ Sat| < 0.2·D·|B| ⟹ |Flip \ B| ≤ |B|/4 and |B \ Flip| ≤ |B|/4 ⟹ |B_new| ≤ |B|/2.
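A minimal sketch (not from the slides) of the flipping phase described above, in the same dict representation as the membership check; all names are illustrative.

```python
def flip_decode(graph, m, word, max_phases=64):
    """Sipser-Spielman style bit flipping: in each phase, flip every variable
    for which a majority of its parity checks are currently unsatisfied."""
    word = list(word)
    for _ in range(max_phases):
        unsat = [0] * m                       # unsat[c] = parity seen by check c
        for v, checks in graph.items():
            if word[v]:
                for c in checks:
                    unsat[c] ^= 1
        flip = [v for v, checks in graph.items()
                if sum(unsat[c] for c in checks) > len(checks) / 2]
        if not flip:                          # all checks satisfied, or stuck
            break
        for v in flip:
            word[v] ^= 1
    return word
```

This naive version costs O(N·D) per phase; the linear-time bound on the slide comes from re-examining only variables adjacent to checks that changed, while the |B_new| ≤ |B|/2 analysis gives the O(log n) bound on the number of phases.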
18
Random Walk on Expanders [AKS 87]. x_0 → x_1 → x_2 → … → x_i: the distribution of x_i converges to uniform fast (for arbitrary x_0). For a random x_0: the sequence x_0, x_1, x_2, … has interesting random-like properties.
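To see the mixing claim empirically, a short sketch (not from the slides) that runs a walk on a regular graph given as adjacency lists; the complete graph is used only as a trivially good expander, and all names are illustrative.

```python
import random
from collections import Counter

def random_walk(adj, x0, steps, seed=0):
    """Return the sequence x0, x1, ..., x_steps of a simple random walk."""
    rng = random.Random(seed)
    walk = [x0]
    for _ in range(steps):
        walk.append(rng.choice(adj[walk[-1]]))
    return walk

# Toy example: the complete graph on 8 vertices.
adj = {v: [u for u in range(8) if u != v] for v in range(8)}
print(Counter(random_walk(adj, x0=0, steps=10_000)))  # roughly uniform counts
```

On a graph with small λ(G), the distribution of x_i is within λ(G)^i of uniform (in L2 norm), which is what the derandomization applications exploit.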
19
Expanders Add Entropy. The definition we gave: |Support(X')| ≥ A·|Support(X)|. Applications of the random walk rely on less naïve measures of entropy. Almost all explicit constructions directly give 2nd-eigenvalue expansion, which can be interpreted in terms of Rényi entropy. (Figure: input distribution X on the left of an N-by-M bipartite graph of left-degree D; induced distribution X' on the right.)
20
2nd Eigenvalue Expansion. G: a D-regular undirected graph on N vertices. P = (P_{i,j}): its transition-probability matrix (symmetric, N × N), with P_{i,j} = (# edges between i and j in G) / D. Goal: if π ∈ [0,1]^N is a (non-uniform) probability distribution on the vertices of G, then πP is closer to uniform.
21
2nd Eigenvalue Expansion. λ_0 ≥ λ_1 ≥ … ≥ λ_{N−1}: the eigenvalues of P.
– λ_0 = 1, with corresponding eigenvector v_0 = 1^N: P(Uniform) = Uniform.
– Second eigenvalue (in absolute value): λ = λ(G) = max{|λ_1|, |λ_{N−1}|}.
– G connected and non-bipartite ⟺ λ < 1.
– λ is a good measure of the expansion of G [Tan84, AM84, Alo86]. Qualitatively: G is an expander ⟺ λ(G) ≤ β for some constant β < 1.
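For a concrete check of these quantities, a small sketch (not from the slides) using numpy; the K4 example is illustrative.

```python
import numpy as np

def second_eigenvalue(adjacency, d):
    """lambda(G): the second-largest absolute eigenvalue of P = A/D."""
    P = np.asarray(adjacency, dtype=float) / d
    eigs = np.sort(np.abs(np.linalg.eigvalsh(P)))[::-1]  # eigvalsh: P is symmetric
    return eigs[1]

# Example: the complete graph K4, a 3-regular non-bipartite graph.
A = np.ones((4, 4)) - np.eye(4)
print(second_eigenvalue(A, 3))  # 1/3, so K4 is a very good (if tiny) expander
```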
22
Randomness Conductors. Expanders, extractors, condensers & hash functions are all functions f : [N] × [D] → [M] that transform: X of entropy k ↦ X' = f(X, Uniform) of entropy k'. Many flavors:
– Measure of entropy.
– Balanced vs. unbalanced (M vs. N).
– Lossless vs. lossy.
– Lower vs. upper bound on k.
– Is X' close to uniform?
– …
Randomness conductors: as in extractors; allows the entire spectrum.
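As one concrete instance of the f : [N] × [D] → [M] view, here is a sketch (not from the slides) of a standard 2-universal hash family h_{a,b}(x) = ((a·x + b) mod p) mod M, where the seed (a, b) plays the role of the uniform second input; the prime and parameters below are illustrative.

```python
import random

P = 2_147_483_647  # a prime larger than the input universe (illustrative choice)

def universal_hash(x, seed, m):
    """Carter-Wegman style 2-universal hash: ((a*x + b) mod P) mod m."""
    a, b = seed
    return ((a * x + b) % P) % m

# Choosing the seed uniformly at random corresponds to following a uniformly
# random edge label out of the input vertex x.
seed = (random.randrange(1, P), random.randrange(P))
print(universal_hash(x=123_456, seed=seed, m=1024))
```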