Linear sketching with parities
Grigory Yaroslavtsev with Sampath Kannan (Penn) and Elchanan Mossel (MIT)
Linear sketching with parities

Input $x \in \{0,1\}^n$. A parity is a linear function over $GF(2)$: $\oplus_{i \in S}\, x_i$, e.g. $x_4 \oplus x_2 \oplus x_{42}$.
Deterministic linear sketch: a set of $k$ parities $\oplus_{i \in S_1} x_i;\ \oplus_{i \in S_2} x_i;\ \dots;\ \oplus_{i \in S_k} x_i$.
Randomized linear sketch: a distribution over $k$ parities (random sets $S_1, S_2, \dots, S_k$).
Linear sketching over $GF(2)$

Given $f \colon \{0,1\}^n \to \{0,1\}$.
Question: can one recover $f(x)$ from a small ($k \ll n$) linear sketch over $GF(2)$?
We allow randomized computation (99% success probability):
the probability is over the choice of random sets,
the sets are known at recovery time,
and recovery is deterministic (we also consider randomized recovery).
Motivation: Distributed Computing

Distributed computation among $M$ machines: $x = (x_1, x_2, \dots, x_M)$ (more generally $x = \oplus_{i=1}^{M} x_i$).
The $M$ machines compute sketches locally: $L(x_1), \dots, L(x_M)$.
They send these to a coordinator, who computes $L(x) = L(x_1) \oplus \dots \oplus L(x_M)$ (coordinate-wise XOR).
The coordinator then computes $f(x)$ with $kM$ total communication.
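The coordinator protocol above is just XOR-linearity of parities: the XOR of local sketches equals the sketch of the XOR of the inputs. A minimal Python sketch of this (the helper names and the choice of random sets are my own, not from the talk):

```python
import random

def parity_sketch(x, sets):
    """Evaluate each parity xor_{i in S} x_i on input bits x (a list of 0/1)."""
    return [sum(x[i] for i in S) % 2 for S in sets]

def xor_combine(a, b):
    """Coordinate-wise XOR of two equal-length bit vectors."""
    return [u ^ v for u, v in zip(a, b)]

n, k, M = 16, 3, 4
random.seed(0)
sets = [random.sample(range(n), 5) for _ in range(k)]

# M machines hold shares x_1, ..., x_M; the global input is their XOR.
shares = [[random.randint(0, 1) for _ in range(n)] for _ in range(M)]
x = [0] * n
for s in shares:
    x = xor_combine(x, s)

# Each machine sketches locally; the coordinator XORs the k-bit messages.
local = [parity_sketch(s, sets) for s in shares]
combined = local[0]
for msg in local[1:]:
    combined = xor_combine(combined, msg)

assert combined == parity_sketch(x, sets)  # linearity over GF(2)
```

The total communication is $kM$ bits, as on the slide: each of the $M$ machines sends a $k$-bit sketch.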
Motivation: Streaming

$x$ is generated through a sequence of updates.
Updates $i_1, \dots, i_t$: update $i_t$ flips the bit at position $i_t$, e.g. the update sequence $(1, 3, 8, 3)$.
Maintaining $L(x)$ allows one to recover $f(x)$ using only $k$ bits of space.
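Because each parity is linear, a bit-flip update touches exactly the sketch coordinates whose set contains the flipped position, so $k$ bits of state suffice. A small illustrative implementation (the class name and the concrete sets are my own):

```python
class ParityStreamSketch:
    """Maintain k parities of x under bit-flip updates, using k bits of state."""

    def __init__(self, sets):
        self.sets = [set(S) for S in sets]
        self.bits = [0] * len(sets)   # sketch of the all-zeros input

    def update(self, i):
        # Flipping x_i flips every parity whose set contains i.
        for j, S in enumerate(self.sets):
            if i in S:
                self.bits[j] ^= 1

sk = ParityStreamSketch([{1, 3}, {3, 8}, {2}])
for i in (1, 3, 8, 3):            # the update stream from the slide
    sk.update(i)
# x now has bits 1 and 8 set (bit 3 was flipped twice), so the sketch is:
print(sk.bits)  # [1, 1, 0]
```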
Deterministic vs. Randomized

Fact: $f$ has a deterministic sketch if and only if $f = g(\oplus_{i \in S_1} x_i;\ \oplus_{i \in S_2} x_i;\ \dots;\ \oplus_{i \in S_k} x_i)$.
This is equivalent to "$f$ has Fourier dimension $k$".
Randomization can help. OR: $f(x) = x_1 \vee \dots \vee x_n$ has "Fourier dimension" $= n$.
Pick $k = \log 1/\delta$ random sets $S_1, \dots, S_k$; if there is an $i$ such that $\oplus_{j \in S_i} x_j = 1$, output 1, otherwise output 0.
Error probability: $\delta$.
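The randomized OR sketch can be simulated directly: for a nonzero $x$, each uniformly random parity equals 1 with probability exactly 1/2, so all $k = \log_2 1/\delta$ parities miss with probability $\delta$. A quick empirical check (the harness and parameters are my own, not from the talk):

```python
import random

def or_sketch_decide(x, sets):
    """Output 1 iff some parity xor_{i in S} x_i equals 1 (randomized OR sketch)."""
    return int(any(sum(x[i] for i in S) % 2 for S in sets))

random.seed(1)
n, k, trials = 32, 7, 2000          # k = log2(1/delta) with delta = 1/128
errors = 0
for _ in range(trials):
    x = [random.randint(0, 1) for _ in range(n)]
    if not any(x):
        x[0] = 1                     # force OR(x) = 1; the OR(x) = 0 case never errs
    sets = [[i for i in range(n) if random.random() < 0.5] for _ in range(k)]
    errors += (or_sketch_decide(x, sets) != 1)
print(errors / trials)               # empirically around 1/128
```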
Fourier Analysis

$f(x_1, \dots, x_n) \colon \{0,1\}^n \to \{0,1\}$. Notation switch: $0 \to 1$, $1 \to -1$, giving $f' \colon \{-1,1\}^n \to \{-1,1\}$.
Functions as vectors form a vector space: $f \colon \{-1,1\}^n \to \{-1,1\} \Rightarrow f \in \{-1,1\}^{2^n}$.
Inner product on functions = "correlation": $\langle f, g \rangle = 2^{-n} \sum_{x \in \{-1,1\}^n} f(x)\, g(x) = \mathbb{E}_{x \sim \{-1,1\}^n}[f(x)\, g(x)]$.
$\|f\|_2^2 = \langle f, f \rangle = \mathbb{E}_{x \sim \{-1,1\}^n}[f^2(x)] = 1$ (for Boolean functions only).
"Main Characters" are Parities

For $S \subseteq [n]$ let the character $\chi_S(x) = \prod_{i \in S} x_i$.
Fact: every function $f \colon \{-1,1\}^n \to \{-1,1\}$ is uniquely represented as a multilinear polynomial $f(x_1, \dots, x_n) = \sum_{S \subseteq [n]} \hat{f}(S)\, \chi_S(x)$.
$\hat{f}(S)$, a.k.a. the Fourier coefficient of $f$ on $S$: $\hat{f}(S) \equiv \langle f, \chi_S \rangle = \mathbb{E}_{x \sim \{-1,1\}^n}[f(x)\, \chi_S(x)]$.
$\sum_{S \subseteq [n]} \hat{f}(S)^2 = 1$ (Parseval).
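For small $n$ the Fourier coefficients can be computed by brute force straight from the definition $\hat{f}(S) = \mathbb{E}[f(x)\chi_S(x)]$. A toy sketch of my own (exponential in $n$, illustration only) that also checks Parseval on the 2-bit parity:

```python
from itertools import product

def fourier_coefficients(f, n):
    """All fhat(S) = E_x[f(x) * chi_S(x)] for f: {-1,1}^n -> {-1,1}, by enumeration."""
    points = list(product([-1, 1], repeat=n))
    coeffs = {}
    for S in range(1 << n):                  # subsets of [n] encoded as bitmasks
        total = 0
        for x in points:
            chi = 1
            for i in range(n):
                if S >> i & 1:
                    chi *= x[i]              # chi_S(x) = prod_{i in S} x_i
            total += f(x) * chi
        coeffs[S] = total / len(points)
    return coeffs

xor2 = lambda x: x[0] * x[1]                 # parity of two bits in +-1 form
c = fourier_coefficients(xor2, 2)
print(c)                                     # all weight on S = {0,1} (bitmask 3)
print(sum(v * v for v in c.values()))        # Parseval: the squares sum to 1
```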
Fourier Dimension

Fourier sets $S$ $\equiv$ vectors in $GF(2)^n$.
"$f$ has Fourier dimension $k$" = a $k$-dimensional subspace $A_k$ in the Fourier domain carries all the weight: $\sum_{S \in A_k} \hat{f}(S)^2 = 1$, so $f(x_1, \dots, x_n) = \sum_{S \subseteq [n]} \hat{f}(S)\, \chi_S(x) = \sum_{S \in A_k} \hat{f}(S)\, \chi_S(x)$.
Pick a basis $S_1, \dots, S_k$ of $A_k$. Sketch: $\chi_{S_1}(x), \dots, \chi_{S_k}(x)$.
For every $S \in A_k$ there exists $Z \subseteq [k]$ with $S = \oplus_{i \in Z} S_i$, and then $\chi_S(x) = \prod_{i \in Z} \chi_{S_i}(x)$.
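A toy illustration of why Fourier dimension $k$ yields a $k$-bit sketch: if $f = g(\chi_{S_1}, \chi_{S_2})$, then the two character bits determine $f$ everywhere. (The sets, the rule $g$, and the function $f$ below are hypothetical examples of my own.)

```python
S1, S2 = {0, 2}, {1}                   # hypothetical basis sets, n = 3

def chi(S, x):                         # character chi_S(x) = prod_{i in S} x_i
    p = 1
    for i in S:
        p *= x[i]
    return p

def g(a, b):                           # any Boolean rule of the two parities
    return a * b                       # here: their product (another parity)

def f(x):                              # f = g(chi_S1, chi_S2) => Fourier dim <= 2
    return g(chi(S1, x), chi(S2, x))

# The 2-bit sketch (chi_S1(x), chi_S2(x)) suffices to evaluate f:
for x in [(-1, 1, -1), (1, 1, 1), (-1, -1, 1)]:
    sketch = (chi(S1, x), chi(S2, x))
    assert f(x) == g(*sketch)
```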
Deterministic Sketching and Noise

Suppose the "noise" has bounded norm: $f$ = $k$-dim. + noise.
$\ell_0$-noise in the Fourier domain (via [Sanyal'15]): $f$ = $k$-dim. + "Fourier $\ell_0$-noise", where $\|\mathrm{noise}\|_0$ = # of non-zero Fourier coefficients of the noise (a.k.a. "Fourier sparsity").
Linear sketch size: $k + O(\|\mathrm{noise}\|_0^{1/2})$.
Our work: this can't be improved, even with randomness and even for uniform $x$, e.g. for the "addressing function".
How Randomization Handles Noise

$\ell_0$-noise in the original domain (hashing à la OR): $f$ = $k$-dim. + "$\ell_0$-noise". Linear sketch size: $k + O(\log \|\mathrm{noise}\|_0)$. Optimal (but only existentially, i.e. $\exists f \colon \dots$).
$\ell_1$-noise in the Fourier domain (via [Grolmusz'97]): $f$ = $k$-dim. + "Fourier $\ell_1$-noise". Linear sketch size: $k + O(\|\mathrm{noise}\|_1^2)$.
Example: $f$ = $k$-dim. + a small decision tree / DNF / etc.
Randomized Sketching: Hardness

$d$-dimensional affine extractors require dimension $d$: $f$ is an affine extractor for dimension $d$ if its restriction to any $d$-dimensional affine subspace takes each of the values 0/1 with probability $\geq 0.1$.
Example (inner product): $f(x) = \sum_{i=1}^{n/2} x_{2i-1} x_{2i}$.
Such $f$ is not $\epsilon$-concentrated on $d$-dimensional Fourier subspaces: for every $d$-dimensional Fourier subspace $A$, $\sum_{S \notin A} \hat{f}(S)^2 \geq 1 - \epsilon$.
Any $d$-dimensional linear sketch then makes error $\frac{1}{2}(1 - \sqrt{\epsilon})$.
The converse doesn't hold, i.e. concentration alone is not enough.
Randomized Sketching: Hardness

Not $\epsilon$-concentrated on $o(n)$-dimensional Fourier subspaces:
almost all symmetric functions, i.e. $f(x) = h\left(\sum_i x_i\right)$, as long as $f$ is not Fourier-close to a constant or to $\oplus_{i=1}^n x_i$.
E.g. Majority (not an extractor even for dimension $O(\sqrt{n})$), Tribes (balanced DNF), and recursive majority: $\mathrm{Maj}_3^{\circ k} = \mathrm{Maj}_3 \circ \mathrm{Maj}_3 \circ \dots \circ \mathrm{Maj}_3$.
Approximate Fourier Dimension

Not $\epsilon$-concentrated on $d$-dimensional Fourier subspaces: for every $d$-dimensional Fourier subspace $A$, $\sum_{S \notin A} \hat{f}(S)^2 \geq 1 - \epsilon$. Any $d$-dimensional linear sketch then makes error $\frac{1}{2}(1 - \sqrt{\epsilon})$.
Definition (Approximate Fourier Dimension): $\dim_\epsilon(f)$ = smallest $d$ such that $f$ is $\epsilon$-concentrated on some Fourier subspace of dimension $d$.
[Figure: the Fourier coefficients $\hat{f}(S_1), \hat{f}(S_2), \hat{f}(S_3), \hat{f}(S_2 + S_3), \hat{f}(S_1 + S_3), \hat{f}(S_1 + S_2 + S_3)$ spanned by a subspace $A$ with $\sum_{S \in A} \hat{f}(S)^2 \geq \epsilon$]
Sketching over Uniform Distribution + Approximate Fourier Dimension

Sketching error over the uniform distribution of $x$: a $\dim_\epsilon(f)$-dimensional sketch gives error $1 - \epsilon$.
Fix a $\dim_\epsilon(f)$-dimensional subspace $A$ with $\sum_{S \in A} \hat{f}(S)^2 \geq \epsilon$ and output $g(x) = \mathrm{sign}\left(\sum_{S \in A} \hat{f}(S)\, \chi_S(x)\right)$; then $\Pr_{x \sim U(\{-1,1\}^n)}[f(x) = g(x)] \geq \epsilon$, i.e. error $\leq 1 - \epsilon$.
We show a basic refinement achieving error $(1 - \epsilon)/2$: pick a threshold $\theta$ from a carefully chosen distribution and output $g_\theta(x) = \mathrm{sign}\left(\sum_{S \in A} \hat{f}(S)\, \chi_S(x) - \theta\right)$.
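The sign-of-projection bound can be checked numerically on a tiny example: project $\mathrm{Maj}_3$ onto the singleton characters and compare the agreement probability with the captured weight $\epsilon$ (a brute-force harness of my own):

```python
from itertools import product

def sign(v):
    return 1 if v >= 0 else -1

def chi(S, x):                                 # chi_S(x) = prod_{i in S} x_i
    p = 1
    for i in S:
        p *= x[i]
    return p

n = 3
points = list(product([-1, 1], repeat=n))
f = lambda x: sign(x[0] + x[1] + x[2])         # Maj_3 in +-1 form

# Subspace A spanned by the singleton characters {0}, {1}, {2}.
A = [{0}, {1}, {2}]
coeff = {i: sum(f(x) * chi(S, x) for x in points) / len(points)
         for i, S in enumerate(A)}
eps = sum(c * c for c in coeff.values())       # Fourier weight captured by A

g = lambda x: sign(sum(coeff[i] * chi(S, x) for i, S in enumerate(A)))
agree = sum(f(x) == g(x) for x in points) / len(points)
print(eps, agree)                              # agreement >= eps, error <= 1 - eps
```

Here $A$ captures weight $\epsilon = 3/4$ and the projection's sign happens to recover $\mathrm{Maj}_3$ exactly, so the agreement is even better than the $\geq \epsilon$ guarantee.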
Sketching over Uniform Distribution

$D^{1,U}_\delta(f)$ = bit-complexity of the best compression scheme allowing one to compute $f$ with error $\delta$ over uniform $x$.
Thm: If $\epsilon_1 > \epsilon_2 > 0$ and $\dim_{\epsilon_1}(f) = \dim_{\epsilon_2}(f) = d \geq 1$, then $D^{1,U}_\delta(f) \geq d$, where $\delta = (\epsilon_1 - \epsilon_2)/4$.
Corollary: If $\hat{f}(\emptyset) < c$ for some $c < 1$, then there exists $d$ such that $D^{1,U}_{\Theta(1/d)}(f) \geq d$.
Optimal up to the error value; tight for the Majority function.
Application: Random Streams

$x \in \{0,1\}^n$ is generated via a stream of updates; each update flips a random coordinate.
Goal: maintain $f(x)$ during the stream (with error $\delta$). Question: how much space is necessary?
Answer: $D^{1,U}_\delta$, and the best algorithm is a linear sketch. After the first $O(n \log n)$ updates the input $x$ is uniform.
Big open question: is the same true if $x$ is not uniform? It is true for VERY LONG ($\Omega(n)$) streams (via [LNW'14]). How about short ones?
1-way Communication Complexity of XOR-functions

Shared randomness. Alice holds $x \in \{0,1\}^n$, Bob holds $y \in \{0,1\}^n$; Alice sends a message $M(x)$ and Bob must output $f(x \oplus y)$.
Examples: $f(z) = \vee_{i=1}^{n} z_i$ $\Rightarrow$ (not) Equality; $f(z) = (\|z\|_0 > d)$ $\Rightarrow$ Hamming Distance $> d$.
$R^1_\delta(f)$ = minimum $|M|$ such that Bob's error probability is $\leq \delta$.
Communication Complexity of XOR-functions

Well-studied (often for 2-way communication): [Montanaro, Osborne], arXiv '09; [Shi, Zhang], QIC '09; [Tsang, Wong, Xie, Zhang], FOCS '13; [O'Donnell, Wright, Zhao, Sun, Tan], CCC '14; [Hatami, Hosseini, Lovett], FOCS '16.
Connections to the log-rank conjecture [Lovett'14]: even this special case for XOR-functions is still open.
Deterministic 1-way Communication Complexity of XOR-functions

Alice holds $x \in \{0,1\}^n$, Bob holds $y \in \{0,1\}^n$ and must output $f(x \oplus y)$.
$D^1(f)$ = minimum $|M|$ such that Bob is always correct.
[Montanaro-Osborne'09]: $D^1(f) = D^{\mathrm{lin}}(f)$, where $D^{\mathrm{lin}}(f)$ = deterministic linear sketch complexity of $f$.
So $D^1(f) = D^{\mathrm{lin}}(f)$ = "Fourier dimension of $f$".
1-way Communication Complexity of XOR-functions

Shared randomness. Alice holds $x \in \{0,1\}^n$, Bob holds $y \in \{0,1\}^n$ and must output $f(x \oplus y)$.
$R^1_\delta(f)$ = minimum $|M|$ such that Bob's error probability is $\leq \delta$.
$R^{\mathrm{lin}}_\delta(f)$ = randomized linear sketch complexity (error $\delta$); clearly $R^1_\delta(f) \leq R^{\mathrm{lin}}_\delta(f)$.
Question: $R^1_\delta(f) \approx R^{\mathrm{lin}}_\delta(f)$? (True for symmetric functions.)
Distributional 1-way Communication under Uniform Distribution

Alice holds $x \sim U(\{0,1\}^n)$, Bob holds $y \sim U(\{0,1\}^n)$ and must output $f(x \oplus y)$.
$D^{1,U}_\delta(f)$ = minimum $|M|$ such that Bob's error probability is $\leq \delta$ over the uniform distribution on $(x, y)$.
It is enough to consider deterministic messages only.
Motivation: streaming/distributed computing with random input. By Yao's principle, $R^1_\delta(f) = \sup_D D^{1,D}_\delta(f)$.
$D^{1,U}_\delta$ and Approximate Fourier Dimension

Thm: If $\epsilon_1 > \epsilon_2 > 0$ and $\dim_{\epsilon_1}(f) = \dim_{\epsilon_2}(f) = d \geq 1$, then $D^{1,U}_\delta(f) \geq d$, where $\delta = (\epsilon_1 - \epsilon_2)/4$.
[Figure: the communication matrix of $f(x \oplus y)$ with rows indexed by $x$ (00, 01, 10, 11) and columns by $y$; Alice's message $M(x)$ partitions the rows, and $f_x(y) \equiv f(x \oplus y)$ denotes row $x$]
$D^{1,U}_\delta$ and Approximate Fourier Dimension

If $|M| = d - 1$, the average "rectangle" size is $2^{n - d + 1}$.
A subspace $A$ distinguishes $x_1$ and $x_2$ if $\exists S \in A$ with $\chi_S(x_1) \neq \chi_S(x_2)$.
Fix a $d$-dimensional subspace $A_d$: typical $x_1$ and $x_2$ in a typical "rectangle" are distinguished by $A_d$.
Lem: If a $d$-dimensional subspace $A_d$ distinguishes $x_1$ and $x_2$, and
1) $f$ is $\epsilon_1$-concentrated on $A_d$, and
2) $f$ is not $\epsilon_2$-concentrated on any $(d-1)$-dimensional subspace,
then $\Pr_{z \sim U(\{-1,1\}^n)}[f_{x_1}(z) \neq f_{x_2}(z)] \geq \epsilon_1 - \epsilon_2$.
$D^{1,U}_\delta$ and Approximate Fourier Dimension

Thm: If $\epsilon_1 > \epsilon_2 > 0$ and $\dim_{\epsilon_1}(f) = \dim_{\epsilon_2}(f) = d \geq 1$, then $D^{1,U}_\delta(f) \geq d$, where $\delta = (\epsilon_1 - \epsilon_2)/4$.
Since $\Pr_{z \sim U(\{-1,1\}^n)}[f_{x_1}(z) \neq f_{x_2}(z)] \geq \epsilon_1 - \epsilon_2$, the error for a fixed $x$ is $\min\left(\Pr_{y \in R}[f_x(y) = 0],\ \Pr_{y \in R}[f_x(y) = 1]\right)$, and the average error over $(x, y) \in R$ is $\Omega(\epsilon_1 - \epsilon_2)$, where $R$ is a "typical rectangle".
Thanks! Questions?

Other stuff: sketching linear threshold functions, resolving a communication conjecture of [MO'09] (see the blog post).
Example: Majority

Claim: $D^{1,U}_{\Theta(1/\sqrt{n})}(\mathrm{Maj}_n) \geq n$.
Majority function: $\mathrm{Maj}_n(z_1, \dots, z_n) \equiv \left[\sum_{i=1}^{n} z_i \geq n/2\right]$.
$\widehat{\mathrm{Maj}}_n(S)$ depends only on $|S|$, and $\widehat{\mathrm{Maj}}_n(S) = 0$ if $|S|$ is even (Majority is an odd function).
$W_k[\mathrm{Maj}_n] = \sum_{S : |S| = k} \widehat{\mathrm{Maj}}_n(S)^2 = \Theta(k^{-3/2})$ for odd $k$.
The $(n-1)$-dimensional subspace with the most weight is $A_{n-1} = \mathrm{span}(\{1\}, \{2\}, \dots, \{n-1\})$, with $\sum_{S \in A_{n-1}} \widehat{\mathrm{Maj}}_n(S)^2 = 1 - c/\sqrt{n} \pm O(n^{-3/2})$ for a constant $c$.
Setting $\epsilon_1 = 1 - O(n^{-3/2})$ and $\epsilon_2 = 1 - c/\sqrt{n} + O(n^{-3/2})$ gives $D^{1,U}_{\Theta(1/\sqrt{n})}(\mathrm{Maj}_n) \geq n$.
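The claimed weight profile can be checked by brute force for small odd $n$: the Fourier weight of $\mathrm{Maj}_n$ outside $A_{n-1}$ shrinks roughly like $1/\sqrt{n}$ (an exponential-time check of my own, illustration only):

```python
from itertools import product

def maj(x):                          # Maj_n as a +-1-valued function, n odd
    return 1 if sum(x) > 0 else -1

def chi(S, x):                       # chi_S(x) for S given as a bitmask
    p = 1
    for i in range(len(x)):
        if S >> i & 1:
            p *= x[i]
    return p

outside = {}
for n in (3, 5, 7, 9):
    pts = list(product([-1, 1], repeat=n))
    fx = [maj(x) for x in pts]
    # Fourier weight inside A_{n-1} = span({1},...,{n-1}), i.e. on all
    # subsets avoiding the last coordinate:
    inside = sum(
        (sum(v * chi(S, x) for v, x in zip(fx, pts)) / len(pts)) ** 2
        for S in range(1 << (n - 1))
    )
    outside[n] = 1 - inside
print(outside)   # weight outside A_{n-1}; shrinks roughly like 1/sqrt(n)
```

For $n = 3$ the outside weight is exactly $1/2$ (the coefficients on $\{3\}$ and $\{1,2,3\}$ are each $\pm 1/2$), and it decreases as $n$ grows.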