Expander Graphs: The Unbalanced Case
Omer Reingold, The Weizmann Institute
What's in This Talk?
Expander graphs – an array of definitions.
Focus on the most established notions, and on open problems about explicit constructions.
Mainly the unbalanced case, since this is what applications often require, and where constructions are very far from optimal.
Will flash one construction (no details) – unbalanced expanders based on Parvaresh-Vardy codes [Guruswami, Umans, Vadhan 06].
Bipartite Graphs
As a preparation for the unbalanced case we will talk of bipartite expanders. This also captures undirected expanders: a D-regular undirected graph G on N vertices corresponds to a symmetric bipartite graph of degree D with a copy of the N vertices on each side.
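Below is a minimal sketch (not from the talk) of this correspondence: given a D-regular undirected graph as an adjacency list, it lists the edges of the symmetric bipartite graph with a copy of the vertex set on each side.

```python
# A minimal sketch (not from the talk): an undirected D-regular graph G on N vertices
# viewed as a symmetric bipartite graph with N left and N right vertices.
# G is given as a dict mapping each vertex to its list of D neighbors.

def to_bipartite(adjacency):
    """Return the edges (u_left, v_right) of the symmetric bipartite version of G."""
    edges = []
    for u, neighbors in adjacency.items():
        for v in neighbors:
            edges.append((u, v))      # u on the left copy, v on the right copy
    return edges

# Example: the 2-regular cycle on 4 vertices.
cycle = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(to_bipartite(cycle))            # symmetric: (u, v) appears iff (v, u) does
```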
Vertex Expansion
Every (not too large) set expands: ∀S, |S| ≤ K: |Γ(S)| ≥ A·|S|, for some A > 1.
Goal: minimize D (i.e. constant D). Degree-3 random graphs are expanders [Pin73].
Also: maximize A. Trivial upper bound: A ≤ D, even A ≲ D−1. Random graphs: A ≈ D−1.
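As an illustration of the definition (not part of the talk), here is a brute-force check of the expansion parameter A achieved by sets of size at most K in a small bipartite graph; it enumerates all sets, so it is only meant for tiny examples.

```python
# Brute-force vertex-expansion check for a bipartite graph given as a dict from
# left vertices to their neighbor lists. Exponential in K: illustration only.

from itertools import combinations

def expansion(adjacency, K):
    """Return min over nonempty S with |S| <= K of |Gamma(S)| / |S|."""
    left = list(adjacency)
    best = float("inf")
    for size in range(1, K + 1):
        for S in combinations(left, size):
            gamma = set()
            for u in S:
                gamma.update(adjacency[u])
            best = min(best, len(gamma) / size)
    return best

# Toy example: a 2-regular bipartite graph with 4 left and 4 right vertices.
graph = {0: ["a", "b"], 1: ["b", "c"], 2: ["c", "d"], 3: ["d", "a"]}
print(expansion(graph, K=2))   # the factor A achieved by all sets of size <= 2
```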
2nd Eigenvalue Expansion
λ, the 2nd eigenvalue (in absolute value) of the (normalized) adjacency matrix, is bounded away from 1.
Can be interpreted in terms of Renyi (ℓ2) entropy.
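For concreteness (not from the talk), the spectral quantity can be computed directly: below, λ is the second largest absolute eigenvalue of the normalized adjacency matrix A/D of a small 4-regular circulant graph. The particular graph is only a mild expander; the point is just the computation.

```python
# Computing the spectral expansion lambda of a D-regular graph: the second largest
# eigenvalue, in absolute value, of the normalized adjacency matrix A/D.

import numpy as np

def second_eigenvalue(A, D):
    eigs = np.sort(np.abs(np.linalg.eigvalsh(A / D)))[::-1]
    return eigs[1]                 # eigs[0] is 1, attained by the uniform vector

N, D = 32, 4
A = np.zeros((N, N))
for i in range(N):
    for j in (1, 2):               # connect i to i +/- 1 and i +/- 2 (mod N)
        A[i, (i + j) % N] = A[(i + j) % N, i] = 1

print(second_eigenvalue(A, D))     # "bounded away from 1" = good spectral expansion
```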
Expanders Add Entropy
A probability distribution X on the left induces a distribution X′ on the right: sample x from X and move to a uniformly random neighbor x′.
Vertex expansion: |Support(X′)| ≥ A·|Support(X)|.
Some applications rely on "less naïve" measures of entropy: Col(X) = Pr[X(1)=X(2)] = ||X||₂², the collision probability of two independent samples from X.
2nd Eigenvalue Expansion
Col(X′) − 1/N ≤ λ²·(Col(X) − 1/N).
Renyi entropy (log 1/Col(X)) increases as long as λ < 1 and Col(X) is not too small.
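A small numerical check of this inequality (illustrative, using the undirected view of a symmetric bipartite graph): one step of the walk on a 4-regular graph cannot keep the collision probability more than a λ² factor above 1/N, since Col(X) − 1/N = ||X − u||₂² for the uniform distribution u.

```python
# Verifying Col(X') - 1/N <= lambda^2 * (Col(X) - 1/N) on a small 4-regular graph.

import numpy as np

N, D = 32, 4
A = np.zeros((N, N))
for i in range(N):
    for j in (1, 2):
        A[i, (i + j) % N] = A[(i + j) % N, i] = 1
M = A / D                                    # normalized adjacency (transition) matrix

lam = np.sort(np.abs(np.linalg.eigvalsh(M)))[::-1][1]

X = np.zeros(N)
X[:4] = 0.25                                 # a distribution concentrated on 4 vertices
X_next = M @ X                               # the induced distribution after one step

def col(p):                                  # collision probability ||p||_2^2
    return float(np.sum(p ** 2))

lhs = col(X_next) - 1 / N
rhs = lam ** 2 * (col(X) - 1 / N)
print(lhs <= rhs + 1e-12)                    # True: Renyi entropy went up
```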
2nd Eigenvalue Expansion
Interestingly, vertex expansion and 2nd-eigenvalue expansion are essentially equivalent for constant degree graphs [Tan84, AM84, Alo86]
Explicit Constructions
Applications need explicit constructions:
Weakly explicit: easy to build the entire graph (in time poly N).
Strongly explicit: given a vertex name x and an edge label i, easy to find the i-th neighbor of x (in time poly log N).
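To make "strongly explicit" concrete, here is a hedged sketch of a neighbor query for a Margulis / Gabber–Galil-type construction on Z_m × Z_m (degree 8). The exact affine maps vary slightly between references, so treat the constants as illustrative; the point is that the i-th neighbor is computed from the vertex name alone, in time polynomial in log N, without materializing the graph.

```python
# Strong explicitness: neighbor(x, y, i, m) answers a neighbor query on a graph with
# N = m^2 vertices while touching only O(log N)-bit numbers. The affine maps follow a
# Margulis / Gabber-Galil-style construction; the specific constants are illustrative.

def neighbor(x, y, i, m):
    """Return the i-th neighbor (0 <= i < 8) of vertex (x, y) in Z_m x Z_m."""
    maps = [
        lambda x, y: (x + 2 * y, y),
        lambda x, y: (x + 2 * y + 1, y),
        lambda x, y: (x, y + 2 * x),
        lambda x, y: (x, y + 2 * x + 1),
        lambda x, y: (x - 2 * y, y),          # the inverses of the four maps above
        lambda x, y: (x - 2 * y - 1, y),
        lambda x, y: (x, y - 2 * x),
        lambda x, y: (x, y - 2 * x - 1),
    ]
    nx, ny = maps[i](x, y)
    return nx % m, ny % m

print(neighbor(3, 7, 0, m=10**9))             # a huge graph, yet the query is instant
```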
Explicit constructions – 2nd Eigenvalue
Celebrated sequence of algebraic constructions [Mar73, GG80, JM85, LPS86, AGM87, Mar88, Mor94, ...] with optimal 2nd eigenvalue (Ramanujan graphs).
"Combinatorial" constructions: [Ajt87, RVW00, BL04].
Open: combinatorial constructions of strongly explicit Ramanujan (or almost-Ramanujan) graphs. Getting "close": [Ben-Aroya, Ta-Shma 08].
Explicit constructions – Vertex Expansion
Optimal 2nd eigenvalue expansion does not imply optimal vertex expansion: there exist Ramanujan graphs with vertex expansion only about D/2 [Kah95].
Lossless expander – expansion > (1−ε)·D.
Why should we care? Limitation of previous techniques; many applications.
Property 1: A Very Strong Unique Neighbor Property
∀S, |S| ≤ K: |Γ(S)| ≥ 0.9·D·|S| implies that S has ≥ 0.8·D·|S| unique neighbors! (A unique neighbor of S is a right vertex with exactly one neighbor in S; the rest are non-unique neighbors.)
We call graphs where every such S has even a single unique neighbor unique-neighbor expanders.
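The count behind the exclamation mark is a one-line calculation, assuming every left vertex has degree exactly D and writing U for the set of unique neighbors of S: each unique neighbor absorbs one edge leaving S and every other neighbor absorbs at least two.

```latex
D|S| = \#\{\text{edges leaving } S\}
     \ge |U| + 2\bigl(|\Gamma(S)| - |U|\bigr)
     \ge 2\cdot 0.9\,D|S| - |U|
\quad\Longrightarrow\quad |U| \ge 0.8\,D|S|.
```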
Property 2: Incredibly Fault Tolerant
A graph where ∀S, |S| ≤ K: |Γ(S)| ≥ 0.9·D·|S| remains a lossless expander even if an adversary removes up to 0.7·D edges from each vertex.
Explicit constructions – Vertex Expansion
Open: lossless expanders for the undirected case. Unique-neighbor expanders are known [AC02].
For the directed case (expansion only from the left side), lossless expanders are known [CRVW02], with expansion D − O(εD).
Open: expansion D − O(1) (even with non-constant degree).
Unbalanced Expanders
Many applications need unbalanced expanders: N vertices on the left, M vertices on the right, with M ≪ N (left degree D).
Array of Definitions
Many flavors: How unbalanced? Which measure of entropy? Lossless vs. lossy? Is X′ close to full entropy? Lower vs. upper bound on the entropy of X? …
Vertex Expansion Revisited
With M ≪ N, even previously trivial tasks require D = Ω(log N / log M); intuitively, there are only about M^D possible neighborhoods, so very small degree cannot even give distinct neighborhoods to all N left vertices. Farewell, constant degree.
Slightly-Unbalanced Constant-Degree Lossless Expanders
M = δ·N; ∀S, |S| ≤ K: |Γ(S)| ≥ (1−ε)·D·|S|.
CRVW02: for all constants 0 < ε, δ ≤ 1, D is constant and K = Ω(N).
In case someone asks: K = Ω(εM/D) and D = poly(1/ε, log(1/δ)) (fully explicit: D = quasipoly(1/ε, log(1/δ))).
Open: More Unbalanced
E.g., M = N^0.5 and sets of size at most K = N^0.2 expand.
While being greedy: unique-neighbor expanders; lossless expanders; minimal degree.
Super-Constant Degree
∀S, |S| ≤ K: |Γ(S)| ≥ (1−ε)·D·|S|.
State of the art [GUV06]: D = poly(log N), M = poly(K·D) (with some tradeoff).
Open: M = O(K·D) (known with D = quasipoly(log N)).
Open: D = O(log N).
Dispersers [Sipser 88]
∀S, |S| ≥ K: |Γ(S)| > (1−ε)·M.
Bounds: D ≥ (1/ε)·log(N/K); D·K/M ≥ log(1/ε) – must be lossy.
Explicit constructions are (comparably) good but still not optimal …
Increasing Entropy?
Can Renyi entropy increase? Getting Col(X′) < Col(X) (essentially) requires D > min{M^0.5, N/M}.
Extractors [NZ 93]
N left vertices, M ≪ N right vertices, left degree D. A (k, ε)-extractor: whenever min-entropy(X) ≥ k, the induced X′ is ε-close to uniform.
Min-entropy(X) ≥ k if ∀x, Pr[X = x] ≤ 2^−k.
X and Y are ε-close if max_T |Pr[X ∈ T] − Pr[Y ∈ T]| ≤ ε; this quantity equals ½·||X−Y||₁.
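The two definitions on this slide translate directly into code; the sketch below (illustrative, not from the talk) treats a distribution as a numpy array of point probabilities.

```python
# Min-entropy and statistical (variation) distance for finite distributions.

import numpy as np

def min_entropy(p):
    """-log2 of the largest point probability; it is >= k iff every Pr[x] <= 2^-k."""
    return -np.log2(np.max(p))

def statistical_distance(p, q):
    """max_T |Pr[p in T] - Pr[q in T]|, which equals half the l1 distance."""
    return 0.5 * float(np.sum(np.abs(p - q)))

# Example: a source on 8 points with min-entropy 2, compared to uniform.
X = np.array([0.25, 0.25, 0.125, 0.125, 0.0625, 0.0625, 0.0625, 0.0625])
U = np.full(8, 1 / 8)
print(min_entropy(X), statistical_distance(X, U))    # 2.0 and 0.25
```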
Equivalently: Extractors = Mixing
∀S, |S| = K, ∀T: |e(S,T)/(D·K) − |T|/M| < ε.
Vertex expansion – sets on the left have many neighbors. Mixing lemma – the neighborhood of S hits any T with roughly the right proportion.
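A small sketch of the equivalence (illustrative): for a fixed left set S, the worst deviation |e(S,T)/(D·K) − |T|/M| over all right sets T is exactly the statistical distance between the neighbor distribution of S and the uniform distribution on the M right vertices.

```python
# Mixing error of a left set S in a left-regular bipartite graph, computed as the
# statistical distance between S's neighbor distribution and uniform on [M].

import numpy as np

def mixing_error(adjacency, S, M):
    """max over T of |e(S,T)/(D*K) - |T|/M|."""
    counts = np.zeros(M)
    for u in S:
        for v in adjacency[u]:
            counts[v] += 1                    # e(S, {v}), multi-edges allowed
    neighbor_dist = counts / counts.sum()     # e(S, .) / (D*K)
    return 0.5 * float(np.sum(np.abs(neighbor_dist - 1 / M)))

# Toy example: 4 left vertices of degree 3, M = 3 right vertices.
graph = {0: [0, 1, 2], 1: [0, 1, 2], 2: [0, 0, 1], 3: [1, 2, 2]}
print(mixing_error(graph, S=[0, 2], M=3))
```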
2-Source Extractors
EXT receives a source of biased, correlated bits together with another independent weak source, and outputs almost uniform random bits.
Recently – lots of attention and results.
Randomness (seeded) extractors are a special case, where the 2nd source is truly random.
Explicit Constructions of Extractors
Extractors are highly motivated in applications. As a general rule of thumb: "Anything expanders can do, extractors can do better" …
Lots of progress, still very far from optimal.
Best in one direction [LRVW03, GUV06]: D = poly((log N)/ε), M = 2^((1−α)k) for any constant α > 0 – interpretation: extracting an arbitrary constant fraction of the entropy.
Selected open problem: M = 2^k with D = poly((log N)/ε) – interpretation: extracting all the entropy.
A Word About Techniques
Research on randomness extractors was invigorated with the discovery of a beautiful and surprising connection to pseudorandom generators [Tre99]. This further led to discoveries of connections between extractors and error-correcting codes [Tre99, RRV99, TZ01, TZS01, SU01]. In particular, [GUV06] relies on Parvaresh-Vardy list-decodable codes.
[GUV06] – Basic Construction
Left vertex: f ∈ F_q^n (a polynomial of degree ≤ n−1 over F_q). Edge label: y ∈ F_q. Right vertices: F_q^(m+1).
The y-th neighbor of f = (y, f(y), (f^h mod E)(y), (f^(h^2) mod E)(y), …, (f^(h^(m−1)) mod E)(y)),
where E(Y) is an irreducible polynomial of degree n over F_q and h is a parameter.
Thm: This is a (K, A) expander with K = h^m, A = q − h·n·m.
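A hedged sketch of the neighbor map, assuming q is prime so that F_q is just integer arithmetic mod q; polynomials are coefficient lists (lowest degree first). The specific parameters q, n, m, h and the irreducible E below are toy illustrative choices, not the ones dictated by the theorem's analysis.

```python
# The [GUV06] neighbor map: the y-th neighbor of a left vertex f (a polynomial of
# degree <= n-1 over F_q) is (y, f(y), (f^h mod E)(y), ..., (f^(h^(m-1)) mod E)(y)).

def poly_mod(a, e, q):
    """Reduce polynomial a modulo the monic polynomial e, coefficients mod q."""
    a = [c % q for c in a]
    while len(a) >= len(e):
        c = a[-1]
        if c == 0:
            a.pop()
            continue
        shift = len(a) - len(e)
        for i, ec in enumerate(e):
            a[shift + i] = (a[shift + i] - c * ec) % q
        a.pop()
    return a or [0]

def poly_mul(a, b, q):
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] = (out[i + j] + ai * bj) % q
    return out

def poly_pow_mod(f, exponent, e, q):
    """f^exponent mod e over F_q, by square-and-multiply."""
    result, base = [1], poly_mod(f, e, q)
    while exponent > 0:
        if exponent & 1:
            result = poly_mod(poly_mul(result, base, q), e, q)
        base = poly_mod(poly_mul(base, base, q), e, q)
        exponent >>= 1
    return result

def poly_eval(f, y, q):
    acc = 0
    for c in reversed(f):
        acc = (acc * y + c) % q
    return acc

def guv_neighbor(f, y, E, h, m, q):
    """The y-th neighbor of left vertex f: a point in F_q^(m+1)."""
    coords = [y % q]
    for i in range(m):                        # exponents h^0, h^1, ..., h^(m-1)
        fi = poly_pow_mod(f, h ** i, E, q)
        coords.append(poly_eval(fi, y, q))
    return tuple(coords)

# Toy parameters: q = 5, n = 3, E(Y) = Y^3 + Y + 1 (irreducible over F_5), h = 2, m = 2.
E = [1, 1, 0, 1]                              # coefficients, lowest degree first
f = [2, 0, 1]                                 # f(Y) = 2 + Y^2
print(guv_neighbor(f, y=3, E=E, h=2, m=2, q=5))
```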
Conclusions
Many interesting variants of expander graphs. Constructions in general – very far from optimal.
Any clean and useful algebraic characterization?