Sensitivity of voting schemes and coin tossing protocols

1 Sensitivity of voting schemes and coin tossing protocols
Elchanan Mossel, U.C. Berkeley. Based on joint works with Ryan O'Donnell; with Gil Kalai and Olle Häggström; and with Ryan O'Donnell, Oded Regev, Jeff Steif and Benny Sudakov.

2 Definition of voting schemes
A population of size n is to choose between two options / candidates. A voting scheme is a function that associates to each configuration of votes which option to choose. Formally, a voting scheme is a function f : {0,1}^n → {0,1}. Two prime examples: majority vote and the electoral college.
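As a concrete illustration (not part of the original slides), here is a minimal Python sketch of the two example schemes as 0/1-valued functions of the votes; the electoral-college layout (equal-size states, majority of the state majorities) is an assumption made for simplicity, and the function names are mine.

```python
import numpy as np

def majority(votes):
    """Simple majority: 1 iff more than half the votes are 1 (n assumed odd)."""
    votes = np.asarray(votes)
    return int(votes.sum() * 2 > len(votes))

def electoral_college(votes, n_states):
    """Two-level majority: equal-size 'states', majority inside each state,
    then majority of the state winners."""
    states = np.asarray(votes).reshape(n_states, -1)      # assumes n divisible by n_states
    state_winners = (states.sum(axis=1) * 2 > states.shape[1]).astype(int)
    return majority(state_winners)

x = np.random.randint(0, 2, size=9)                       # 9 voters split into 3 states of 3
print(x, majority(x), electoral_college(x, 3))
```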

3 Properties of voting schemes
Some properties of voting schemes: We will always assume that the candidates are treated equally, i.e. the function f is anti-symmetric: f(~x) = ~f(x), where ~(z_1,…,z_n) = (1 - z_1,…,1 - z_n). We also always assume that stronger support for a candidate shouldn't hurt her, i.e. the function f is monotone: x ≥ y ⇒ f(x) ≥ f(y), where x ≥ y if x_i ≥ y_i for all i. Note that both majority and the electoral college are anti-symmetric and monotone.
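A brute-force check of these two properties for small n (a sketch; the helper names are mine): anti-symmetry is tested as f(~x) = 1 - f(x) over the whole cube, and monotonicity over all coordinatewise-comparable pairs.

```python
from itertools import product

def is_antisymmetric(f, n):
    """f(~x) = 1 - f(x), where ~ flips every coordinate."""
    return all(f(tuple(1 - b for b in x)) == 1 - f(x) for x in product((0, 1), repeat=n))

def is_monotone(f, n):
    """x >= y coordinatewise must imply f(x) >= f(y)."""
    cube = list(product((0, 1), repeat=n))
    return all(f(x) >= f(y)
               for x in cube for y in cube
               if all(xi >= yi for xi, yi in zip(x, y)))

maj3 = lambda x: int(sum(x) >= 2)
print(is_antisymmetric(maj3, 3), is_monotone(maj3, 3))   # True True
```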

4 Democracy and voting schemes
Two interpretations of democracy: "Weak democracy" – each voter has the same power: there exists a transitive group Γ ⊆ S_n such that for all σ ∈ Γ and all x it holds that f(x_{σ(1)},…,x_{σ(n)}) = f(x_1,…,x_n) (***). "Strong democracy" – each set of voters has the same power: (***) holds for all σ ∈ S_n. Easy: monotonicity + anti-symmetry + strong democracy ⇒ f = majority. But: the electoral college is a weak democracy (mathematically).

5 LLN and voting schemes
LLN: If μ is an i.i.d. measure on {0,1}^n with E_μ[X_i] = p > ½ and f = maj, then P[f(X_1,…,X_n) = 1] → 1 as n → ∞. Q1: What if f ≠ maj (e.g. the electoral college)? Q2: What if μ is not a product measure? Example: if μ(1,…,1) = p and μ(0,…,0) = 1 - p, then E_μ[f] = p for every (monotone, anti-symmetric) f! Conclusion: we need the outcome not to depend on few coordinates.
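A quick Monte Carlo illustration of the LLN statement (a sketch with arbitrarily chosen parameters): under i.i.d. votes with p slightly above ½, majority elects candidate 1 with probability tending to 1 as n grows, while a dictator only reaches p.

```python
import numpy as np

rng = np.random.default_rng(0)
p, trials = 0.55, 10000

for n in (11, 101, 1001):
    votes = rng.random((trials, n)) < p            # i.i.d. Bernoulli(p) votes
    maj_wins = (votes.sum(axis=1) * 2 > n).mean()  # P[majority elects candidate 1]
    dict_wins = votes[:, 0].mean()                 # P[dictator elects candidate 1] ~ p
    print(f"n={n:5d}  P[maj=1] ~ {maj_wins:.3f}   P[dictator=1] ~ {dict_wins:.3f}")
```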

6 The effect and weighted majorities
Define the effect of voter i as e_i = E[f | x_i = 1] - E[f | x_i = 0]. Suppose that E[x_i] = p > ½ and e_i < δ for all i. Q: For which functions do small effects imply E[f] ≈ 1? Def: We call f a weighted majority if f(x_1,…,x_n) = sign(Σ_{i=1}^n w_i (x_i - ½)) with w_i ∈ R, and f is anti-symmetric.
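A sketch (function names are mine) that computes the effect e_i = E[f | x_i = 1] - E[f | x_i = 0] exactly under the i.i.d. Bernoulli(p) measure by enumerating the cube, which is feasible for small n.

```python
from itertools import product

def effect(f, n, i, p):
    """e_i = E[f | x_i = 1] - E[f | x_i = 0] under the i.i.d. Bernoulli(p) measure."""
    cond = {0: 0.0, 1: 0.0}
    for x in product((0, 1), repeat=n):
        w = 1.0
        for j, b in enumerate(x):            # probability of the other n - 1 coordinates
            if j != i:
                w *= p if b else 1 - p
        cond[x[i]] += w * f(x)
    return cond[1] - cond[0]

maj5 = lambda x: int(sum(x) >= 3)
print([round(effect(maj5, 5, i, 0.5), 4) for i in range(5)])   # all equal to 3/8 at p = 1/2
```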

7 The effect and weighted majorities
Thm [Häggström-Kalai-M]: If f is a weighted majority, E[X_i] ≥ p > ½ and e_i ≤ δ for all i, then E[f] ≥ max{1 - δ·p(1-p)/(p - ½), p}. If f is not a weighted majority, then there exists a measure μ with E_μ[X_i] ≥ p > ½, E_μ[f] = 0 and e_i = 0 for all i. Cor: f is "good" and a weak democracy ⇒ f = maj.

8 Weighted majorities are good
Two proofs. Probabilistic: Let Y_i = p - X_i and g = 1 - f; then E[g·Y_i] = p(1-p)·e_i, so p(1-p)·δ·Σ_i w_i ≥ p(1-p)·Σ_i w_i e_i = E[(Σ_i w_i Y_i)·g] ≥ (p - ½)·(Σ_i w_i)·(1 - E[f]). Get E[f] ≥ 1 - δ·p(1-p)/(p - ½). Linear program: minimize E[f] under the constraints E[X_i] ≥ p and e_i ≤ δ. Reductions to f = maj and μ a symmetric measure. Gives a tight bound and the minimizers.

9 Non majorities are no good
Proof for weak democracies. Need to show: f ≠ maj ⇒ f is not good. f ≠ maj ⇒ ∃ x_0 s.t. f(x_0) = 0 and maj(x_0) = 1. Take μ to be the uniform measure on the orbit of x_0 (under the transitive group). Then E_μ[X_i] > ½ for all i, but E_μ[f] = 0 and e_i = 0 for all i.

10 Non majorities are no good
General proof via linear programming duality. Let H = ([n], E) be a hypergraph where S ∈ E iff f(x_S) = 0, where x_S(i) = 1 iff i ∈ S. Let τ*(H) be the fractional covering number: min {Σ_{S ∈ H} λ_S : λ_S ≥ 0 for all S, and Σ_{S : i ∈ S} λ_S ≥ 1 for all i}. By duality, τ*(H) is also max {Σ_i w_i : w_i ≥ 0 for all i, and Σ_{i ∈ S} w_i ≤ 1 for all S ∈ H}. τ*(H) ≥ 2 ⇒ f is a weighted majority. If τ*(H) = s < 2, then take λ to be a minimizer in the first definition; λ/s is the desired measure. Open problem: which f are good when μ is a general FKG measure (or satisfies variants of the FKG property)?
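A sketch of the covering LP using scipy.optimize.linprog (the function names and the two test functions are my choices, and scipy is assumed to be available): it computes τ*(H) for the hypergraph of zero sets of a small monotone f, so the ≥ 2 test above can be tried numerically. Majority of 3 gives τ* = 3, while depth-2 recursive majority of 3, which is not a weighted majority, comes out below 2.

```python
import numpy as np
from itertools import product
from scipy.optimize import linprog

def frac_cover(f, n):
    """tau*(H) for H = {S : f(1_S) = 0}: min sum_S lam_S s.t. sum_{S containing i} lam_S >= 1."""
    edges = [S for S in product((0, 1), repeat=n) if f(S) == 0]
    c = np.ones(len(edges))
    A = np.array([[-S[i] for S in edges] for i in range(n)], dtype=float)   # -A lam <= -1
    res = linprog(c, A_ub=A, b_ub=-np.ones(n), bounds=[(0, None)] * len(edges))
    return res.fun if res.success else float("inf")

maj3 = lambda x: int(sum(x) >= 2)
recmaj9 = lambda x: maj3([maj3(x[0:3]), maj3(x[3:6]), maj3(x[6:9])])
print(frac_cover(maj3, 3))     # 3.0, which is >= 2: maj3 is a weighted majority
print(frac_cover(recmaj9, 9))  # below 2: depth-2 recursive majority is not a weighted majority
```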

11 To the uniform measure
Partial answer: For the uniform measure all monotone f's are good (Friedgut, Kalai). But which monotone function is better? Assume x is chosen uniformly in {0,1}^n. Let y = N_ε(x) be obtained from x by flipping each coordinate of x independently with probability ε. Question: What is the probability that the population voted for whom they meant to vote for, i.e. what is S_f(ε) = P[f(x) = f(y)]? Which is the most sensitive / stable f? Is there an f which is both stable and sensible? Maj? The electoral college?

12 Stability without democracy
We now work with {-1,1}^n instead of {0,1}^n. Let N_ε(x,y) = P[N_ε(x) = y], ρ = 1 - 2ε and Z(f,ρ) = <f, N_ε f>. N_ε has the eigenvectors u_S(x) = Π_{i ∈ S} x_i, corresponding to the eigenvalues ρ^{|S|}. P[f(x) = f(N_ε(x))] = ½ + E[f(x) f(N_ε(x))]/2 = ½ + <f, N_ε f>/2. Write f(x) = Σ_S f_S u_S(x). Since <f,1> = 0, <f, N_ε f> = Σ_{S ≠ ∅} f_S^2 ρ^{|S|} ≤ ρ, and therefore S_f(ε) ≤ 1 - ε. Dictatorship, f(x) = x_i, is the only optimal function.
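A sketch that evaluates S_f(ε) exactly from the Fourier expansion for small n, using the identity on this slide, S_f(ε) = ½ + ½·Σ_S ρ^{|S|} f_S^2 with ρ = 1 - 2ε; it shows the dictator attaining 1 - ε and majority of 3 falling slightly below it. The helper names are mine.

```python
import numpy as np
from itertools import product

def fourier(f, n):
    """fhat(S) = E[f(x) * prod_{i in S} x_i], x uniform on {-1,1}^n."""
    cube = list(product((-1, 1), repeat=n))
    return {S: sum(f(x) * np.prod([x[i] for i in range(n) if S[i]]) for x in cube) / len(cube)
            for S in product((0, 1), repeat=n)}

def stability(f, n, eps):
    """S_f(eps) = P[f(x) = f(N_eps(x))] = 1/2 + 1/2 * sum_S rho^{|S|} fhat(S)^2."""
    rho = 1 - 2 * eps
    return 0.5 + 0.5 * sum(rho ** sum(S) * c ** 2 for S, c in fourier(f, n).items())

dictator = lambda x: x[0]
maj3 = lambda x: 1 if sum(x) > 0 else -1
eps = 0.1
print(stability(dictator, 3, eps))   # 0.9 = 1 - eps, the optimum
print(stability(maj3, 3, eps))       # slightly below 1 - eps
```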

13 Stability with democracy
How stable can we get with democracy? lim_{n → ∞} S_f(ε) = ½ + arcsin(1 - 2ε)/π for f = maj_n. When ε is small, the disagreement probability 1 - S_f(ε) ≈ 2ε^{1/2}/π. An n^{1/2} × n^{1/2} electoral college gives 1 - S_f(ε) = Θ(ε^{1/4}). Conjecture (???): If f = f_n satisfies max {e_i : 1 ≤ i ≤ n} = o(1), then lim_{n → ∞} S_f(ε) ≤ ½ + arcsin(1 - 2ε)/π. ⇒ the most stable "weak democracy" is maj. Much stronger than recent results of Bourgain. Would give interesting results elsewhere …
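A Monte Carlo sanity check of the arcsine expression for majority (a sketch; n, ε and the sample size are arbitrary choices, so the match is only approximate at finite n):

```python
import numpy as np

rng = np.random.default_rng(1)
n, eps, trials = 2001, 0.1, 10000

x = rng.integers(0, 2, size=(trials, n), dtype=np.int8)
y = np.where(rng.random((trials, n)) < eps, 1 - x, x)      # each bit flipped with prob eps

maj = lambda v: v.sum(axis=1) * 2 > n
empirical = (maj(x) == maj(y)).mean()
limit = 0.5 + np.arcsin(1 - 2 * eps) / np.pi
print(f"empirical S_maj(eps) ~ {empirical:.4f}   arcsine limit {limit:.4f}")
```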

14 Getting sensitive
How sensitive can a monotone function be? Interesting in learning, neural networks, hardness amplification … maj_n maximizes the edge-isoperimetric bound among all monotone functions, and I(maj_n)^2 ~ 2n/π. By Russo's formula, I(f) = Z'(f,1). But Z'(f,1) = Σ_S |S| f_S^2. Consider the following relaxation of the problem: minimize Σ_S a_S ρ^{|S|} under the constraints Σ_S a_S = 1, a_S ≥ 0, Σ_S |S| a_S ≲ (2n/π)^{1/2} =: λ. We get that Z(f,ρ) ≳ ρ^λ.
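A small consistency check (a sketch; helper names are mine) of the two expressions for total influence used here: the combinatorial count of pivotal coordinates and the Fourier formula Σ_S |S| f_S^2 agree, e.g. both give 3/2 for majority of 3.

```python
import numpy as np
from itertools import product

def influence_pivotal(f, n):
    """I(f) = sum_i P[flipping coordinate i changes f], x uniform on {-1,1}^n."""
    cube = list(product((-1, 1), repeat=n))
    total = 0.0
    for x in cube:
        for i in range(n):
            y = list(x); y[i] = -y[i]
            total += f(x) != f(tuple(y))
    return total / len(cube)

def influence_fourier(f, n):
    """I(f) = sum_S |S| * fhat(S)^2, with fhat computed by brute force."""
    cube = list(product((-1, 1), repeat=n))
    total = 0.0
    for S in product((0, 1), repeat=n):
        fhat = sum(f(x) * np.prod([x[i] for i in range(n) if S[i]]) for x in cube) / len(cube)
        total += sum(S) * fhat ** 2
    return total

maj3 = lambda x: 1 if sum(x) > 0 else -1
print(influence_pivotal(maj3, 3), influence_fourier(maj3, 3))   # both equal 3/2
```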

15 Getting sensitive
Kalai: Are there any functions that are so sensitive? Kalai: Is it enough to flip n^{1/2} of the votes in order to flip the outcome with probability Ω(1)? Thm [M-O'Donnell]: rec-maj-k satisfies <f, N_ε f> ≲ ρ^{λ(n,k)}, where λ(n,k) ~ n^{α(k)} and α(k) → ½ as k → ∞ (so it is enough to flip n^{1-α(k)} votes). rec-maj with increasing arities gives that it is enough to flip log^t(n)·n^{1/2} votes, where t = ½ log_2(π/2). Talagrand's random function gives that it is enough to flip c·n^{1/2} votes.
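A simulation sketch (arbitrary depth, noise level and sample size) contrasting the noise stability of recursive majority of 3 with flat majority on the same number of bits, to illustrate how much more sensitive the recursive scheme is.

```python
import numpy as np

rng = np.random.default_rng(2)
depth, eps, trials = 7, 0.05, 10000
n = 3 ** depth

def recmaj3(bits):
    """Recursive majority of 3 on 3^depth bits, evaluated level by level."""
    v = bits
    while v.shape[1] > 1:
        v = (v.reshape(v.shape[0], -1, 3).sum(axis=2) >= 2).astype(np.int8)
    return v[:, 0]

x = rng.integers(0, 2, size=(trials, n), dtype=np.int8)
y = np.where(rng.random((trials, n)) < eps, 1 - x, x)      # eps-noisy copy of x

flat_maj = lambda v: v.sum(axis=1) * 2 > n
print("rec-maj-3 agreement :", (recmaj3(x) == recmaj3(y)).mean())
print("flat maj agreement  :", (flat_maj(x) == flat_maj(y)).mean())
```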

16 Tossing coins from cosmic source
[Diagram: a source string x of n bits is broadcast to the parties Alice, Bob, Cindy, …, Kate, who receive copies y_1, …, y_k and each output the first bit.]

17 Broadcast with ε errors
[Diagram: the same broadcast, now with ε errors in each copy y_1, …, y_k; each party outputs the first bit of its copy, and the parties may disagree.]

18 Broadcast with ε errors
[Diagram: the same broadcast with ε errors; each party instead outputs the majority of its copy y_i.]

19 The parameters
n – the length of the uniform random "source" string x. k – the number of parties, who cannot communicate but wish to agree on a uniformly random bit. ε – each party gets an independently corrupted version y_i, each bit flipped independently with probability ε. f (or f_1, …, f_k) – balanced "protocol" functions. Our goal: for each n, k, ε, find the best protocol function f (or functions f_1, …, f_k) which maximizes the probability that all parties agree on the same bit.
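A simulation sketch of this setup (parameter values arbitrary): one uniform source string, k independently corrupted copies, and the empirical probability that all parties agree when everyone applies the first bit versus the majority of their copy.

```python
import numpy as np

rng = np.random.default_rng(3)
n, k, eps, trials = 101, 10, 0.1, 20000

x = rng.integers(0, 2, size=(trials, n), dtype=np.int8)        # the source strings
flips = rng.random((trials, k, n)) < eps                       # independent corruption per party
y = np.where(flips, 1 - x[:, None, :], x[:, None, :])          # shape (trials, k, n)

first_bit = y[:, :, 0]
majority = y.sum(axis=2) * 2 > n

all_agree = lambda bits: (bits.min(axis=1) == bits.max(axis=1)).mean()
print("all parties agree, first-bit protocol:", all_agree(first_bit))
print("all parties agree, majority protocol :", all_agree(majority))
```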

20 Coins and voting schemes
For k = 2 we want to maximize P[f_1(y_1) = f_2(y_2)], where y_1 and y_2 are related by applying ε noise twice. Optimal protocol: f_1 = f_2 = dictatorship. The same is true for k = 3 (M-O'Donnell).

21 Some variants
Conjecture (???): For all k, if n → ∞ and max_i e_i = o(1), the "optimal protocol is given by" maj_n. Gaussian version: x ~ N(0,1), y_i = x + N_i(0,σ²). Conjecture (???): For all k, the optimal protocol is given by f(x) = sign(x). The case k = 2 was recently proved by Wenbo Li (but not robustly) using isoperimetric shifting of the Gaussian measure. Variants are motivated by recent results of Bourgain and are needed in computational complexity.
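A sketch of the k = 2 Gaussian variant with the sign protocol. Here y_1 and y_2 are jointly Gaussian with correlation ρ = 1/(1 + σ²) (my own arithmetic, not from the slide), so Sheppard's formula predicts P[sign(y_1) = sign(y_2)] = ½ + arcsin(ρ)/π, which the simulation reproduces.

```python
import numpy as np

rng = np.random.default_rng(4)
sigma, trials = 1.0, 200000

x = rng.normal(size=trials)                     # the Gaussian "source"
y1 = x + sigma * rng.normal(size=trials)        # two independently corrupted observations
y2 = x + sigma * rng.normal(size=trials)

empirical = (np.sign(y1) == np.sign(y2)).mean()
rho = 1.0 / (1.0 + sigma ** 2)                  # Corr(y1, y2), assuming Var(x) = 1
predicted = 0.5 + np.arcsin(rho) / np.pi        # Sheppard's formula
print(f"empirical {empirical:.4f}   predicted {predicted:.4f}")
```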

22 Notation and further motivation
We write S(f_1, …, f_k; ε) = Pr[f_1(y_1) = ··· = f_k(y_k)], and S_k(f; ε) in the case f = f_1 = ··· = f_k. Further motivation: noise in "ever-lasting security" crypto protocols (Ding and Rabin); a variant of a decoding problem; the study of noise sensitivity via ||T_ρ(f)||_k^k, where T_ρ is the Bonami-Beckner operator.

23 protocols Recall that we want the parties’ bits, when agreed upon, to be uniformly random. To get this, we restricted to balanced functions. However this is neither necessary nor sufficient! In particular, for n = 5 and k = 3, there is a balanced function f such that, if all players use f, they are more likely to agree on 1 than on 0!. To get agreed-upon bits to be uniform, it suffices for functions be antisymmetric: Thm[M-O’Donnell]: In optimal f1 = … = fk = f and f is monotone (Pf uses convexity and symmetrization). We are thus in the same setting as in the voting case. 9/18/2018

24 More results [M-O’Donnell]
When k = 2 or 3, the first-bit function is best. For fixed n, when k → ∞, majority is best (exercise!). For fixed n and k, when ε → 0 or ε → ½, the first bit is best. The proof for ε → 0 uses an isoperimetric inequality for the edge boundary; the proof for ε → ½ uses Fourier analysis. For unbounded n, things get harder… in general we don't know the best function, but we can give bounds for S_k(f; ε). Main open problem for finite n (odd): Is the optimal protocol always a majority of a subset? Conjecture M: No. Conjecture O: Yes.

25 Unbounded n
Fixing ε and taking n = ∞, how does h(k,ε) := P[f_1 = … = f_k] decay as a function of k? First guess: h(k,ε) decays exponentially in k. But! Prop [M-O'Donnell]: h(k,ε) ≥ k^{-c(ε)}, where c(ε) > 0. Conj [M-O'Donnell]: h(k,ε) → 0 as k → ∞. Thm [M-O'Donnell-Regev-Steif-Sudakov]: h(k,ε) ≤ k^{-c'(ε)}.
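A Monte Carlo sketch of how the all-agree probability behaves as k grows when every party applies majority on a fixed finite n (parameters arbitrary); it is only meant to illustrate the slow, polynomial-looking decay discussed here, not the n = ∞ statement itself.

```python
import numpy as np

rng = np.random.default_rng(5)
n, eps, trials = 101, 0.1, 5000

def all_agree_majority(k):
    x = rng.integers(0, 2, size=(trials, n), dtype=np.int8)
    flips = rng.random((trials, k, n)) < eps
    y = np.where(flips, 1 - x[:, None, :], x[:, None, :])
    m = y.sum(axis=2) * 2 > n                 # each party's majority bit
    return (m.min(axis=1) == m.max(axis=1)).mean()

for k in (2, 4, 8, 16, 32):
    print(f"k={k:3d}   estimated h(k,eps) ~ {all_agree_majority(k):.3f}")
```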

26 Harmonic analysis of Boolean functions
To prove "hard" results one needs to do harmonic analysis of Boolean functions. It consists of many combinatorial and probabilistic tricks + "hypercontractivity". If p - 1 = ρ²(q - 1), then ||T_ρ f||_q ≤ ||f||_p if p > 1 (Bonami-Beckner), and ||T_ρ f||_q ≥ ||f||_p if p < 1 and f > 0 (Borell). Our application uses the second; in particular it implies that for all A and B: P[x ∈ A, N_ε(x) ∈ B] ≥ P(A)^{1/p} P(B)^{1/q}. Similar inequalities hold for Ornstein-Uhlenbeck processes and "whenever" there is a log-Sobolev inequality.
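A numerical sketch of the first (Bonami-Beckner) direction on a small cube: pick ρ and q, set p = 1 + ρ²(q - 1), apply T_ρ through the Fourier expansion, and compare the two norms for an arbitrary real-valued f. This checks only the forward inequality; the reverse one for positive f could be checked the same way with p, q < 1.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(6)
n, rho, q = 4, 0.6, 3.0
p = 1 + rho ** 2 * (q - 1)                  # the pairing p - 1 = rho^2 (q - 1)

cube = np.array(list(product((-1, 1), repeat=n)))          # all 2^n points
subsets = list(product((0, 1), repeat=n))                  # all S as 0/1 indicator tuples
chars = np.array([[np.prod(x[np.array(S, bool)]) for S in subsets] for x in cube])

f = rng.normal(size=len(cube))                             # an arbitrary real-valued f
fhat = chars.T @ f / len(cube)                             # Fourier coefficients of f
Tf = chars @ (fhat * np.array([rho ** sum(S) for S in subsets]))   # T_rho f

norm = lambda g, r: (np.abs(g) ** r).mean() ** (1 / r)     # L^r norm under the uniform measure
print(norm(Tf, q), "<=", norm(f, p))                       # Bonami-Beckner: holds for every f
```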

27 Coins on other trees
We can define the coin problem on general trees; so far we have only discussed the star. [Diagram: the star, a path, and a general tree, with the source x at one vertex and noisy copies y_1, …, y_5 at the others.] Some highlights from MORSS: On the line, the dictator is always optimal (a new result for Markov chains). For some trees, different f_i's are needed.

28 Wrap-up
We have seen a variety of "stability" problems for voting and coin tossing. Sometimes it is "easy" to show that the dictator is optimal. Sometimes majority is (almost) optimal, but this is typically hard to prove (why?). Recursive majority is really (the most) unstable.

29 Open problems
Does f monotone and anti-symmetric, μ an FKG measure, E_μ[X_i] = p > ½ and e_i < δ for all i imply E_μ[f] ≥ 1 - ε (with ε small when δ is small)? For μ the i.i.d. measure, is the (almost) most stable f with e_i = o(1) given by maj (for k = 2? for all k?)? Is the most stable f for the Gaussian coin problem f(x) = sign(x), and is the result robust? For the coin problem, is the optimal f always a majority of a subset?

