Avoiding manipulation in elections through computational complexity Vincent Conitzer Computer Science Department Carnegie Mellon University Guest lecture.


Introduction. [Bartholdi, Tovey, Trick 1989] [Bartholdi, Orlin 1991]

Voting
In multiagent systems, agents may have conflicting preferences. Based on the reported preferences, a preference aggregator often must choose one candidate from the possible outcomes:
–Deciding on a leader/coordinator/representative
–Joint plans
–Allocations of tasks/resources
Voting is the most general preference aggregation method:
–Applicable to any preference aggregation setting
–No side payments

Voting
[Diagram: Voter 1 reports “A > B > C”, Voter 2 reports “A > C > B”, Voter 3 reports “B > A > C”; the voting protocol outputs a winner (probably A here).]

Manipulation in voting
A voter is said to manipulate when it does not rank the candidates according to its true preferences. Example: not ranking your most preferred candidate first because that candidate has no chance anyway.
Why is manipulation bad?
–The protocol is designed to maximize (some measure of) social welfare with respect to the reported preferences; manipulation will cause a suboptimal outcome to be chosen
–Also: if the protocol actually relies on manipulation to choose the right outcome, then there exists another nonmanipulable protocol that will lead to the same outcome (revelation principle)

Manipulation in voting
[Diagram: several voters report “Gore” or “Bush”; one voter’s true preference is Nader > Gore > Bush.]
Voting truthfully (for Nader) might let Bush win, and certainly will not get Nader to win. So, better to rank Gore first: MANIPULATION!

Some well-known protocols
Plurality: the candidate with the most votes wins
Borda: a candidate gets m-1 points for each first place in a vote, m-2 points for each second place, m-3 for each third place, …
Maximin:
–From the complete rankings, for each pair of candidates, we can deduce how each voter would have voted with only these two candidates
–This defines (m choose 2) “pairwise elections”
–In Maximin, the winner is the candidate with the best score in her worst pairwise election
Single Transferable Vote (STV):
–Each round, the candidate with the fewest votes drops out
–When your candidate drops out, your vote transfers to your next most preferred (remaining) candidate
–Continue until one candidate remains
–Note: now our voter can safely vote for Nader, then let the vote transfer to Gore; STV is still manipulable in other cases
Seminal result (Gibbard-Satterthwaite): all nondictatorial voting protocols with >2 candidates are manipulable!
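The four rules above can be sketched in a few lines of Python (an illustrative sketch, not part of the lecture; rankings are tuples listed best-first, and ties are broken arbitrarily by whichever candidate `max` happens to return first, whereas a real protocol would specify tie-breaking):

```python
def plurality(votes):
    # votes: list of rankings (tuples of candidates, most preferred first)
    scores = {c: 0 for c in votes[0]}
    for v in votes:
        scores[v[0]] += 1
    return max(scores, key=scores.get)

def borda(votes):
    # m-1 points for first place, m-2 for second place, ...
    m = len(votes[0])
    scores = {c: 0 for c in votes[0]}
    for v in votes:
        for i, c in enumerate(v):
            scores[c] += m - 1 - i
    return max(scores, key=scores.get)

def maximin(votes):
    cands = list(votes[0])
    # pairwise[a][b] = number of voters preferring a to b
    pairwise = {a: {b: 0 for b in cands} for a in cands}
    for v in votes:
        for i, a in enumerate(v):
            for b in v[i + 1:]:
                pairwise[a][b] += 1
    # a candidate's score is its worst pairwise-election result
    score = {a: min(pairwise[a][b] for b in cands if b != a) for a in cands}
    return max(score, key=score.get)

def stv(votes):
    remaining = set(votes[0])
    while len(remaining) > 1:
        tallies = {c: 0 for c in remaining}
        for v in votes:
            # each vote counts for its most preferred remaining candidate
            top = next(c for c in v if c in remaining)
            tallies[top] += 1
        remaining.remove(min(tallies, key=tallies.get))  # fewest votes drops out
    return remaining.pop()
```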

Software agents may manipulate more
Human voters may not manipulate because they:
–Do not consider the option of manipulation
–Have insufficient understanding of the manipulability of the protocol
–Find manipulation algorithms too tedious to run by hand
For software agents, voting algorithms must be coded explicitly:
–Rational, strategic algorithms are preferred
–The (strategic) voting algorithm needs to be coded only once
–Software agents are good at running algorithms
Key idea: use computational complexity as a barrier to manipulation!

Computational manipulation problem
The simplest version of the manipulation problem (defined relative to a protocol):
CONSTRUCTIVE-MANIPULATION: We are given the (unweighted) votes of the other voters, and a candidate c. We are asked if we can cast our (single) vote to make c win.
E.g. for the Borda protocol:
–Voter 1 votes A > B > C; Voter 2, B > A > C; Voter 3, C > A > B
–Borda scores are now: A: 4, B: 3, C: 2
–Can we make B win? Answer: YES. Vote B > C > A (Borda scores: A: 4, B: 5, C: 3)
Utility-theoretically, this is the special case where the manipulator has utility 1 for c and utility 0 for every other candidate.
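For a single unweighted manipulator, this problem can always be solved by brute force over all m! possible votes, which is polynomial only when m is constant. A minimal sketch (assumed helper names; `borda_winner` breaks ties by candidate order):

```python
from itertools import permutations

def borda_winner(votes):
    # votes: list of rankings (best first); ties broken by candidate order
    m = len(votes[0])
    scores = {c: 0 for c in votes[0]}
    for v in votes:
        for i, c in enumerate(v):
            scores[c] += m - 1 - i
    return max(scores, key=scores.get)

def constructive_manipulation(protocol, other_votes, c):
    """Try every possible ranking for our single vote; return a successful
    one, or None. This is exponential in the number of candidates m."""
    for my_vote in permutations(other_votes[0]):
        if protocol(other_votes + [my_vote]) == c:
            return my_vote
    return None
```

On the slide's example profile, this search recovers the vote B > C > A that makes B win.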

Prior research Theorem. CONSTRUCTIVE-MANIPULATION is NP-complete for the second-order Copeland protocol. [Bartholdi, Tovey, Trick 1989] Theorem. CONSTRUCTIVE-MANIPULATION is NP-complete for the STV protocol. [Bartholdi, Orlin 1991] All the other protocols are easy to manipulate (in P)

Universal voting protocol tweaks to make manipulation hard [Conitzer, Sandholm IJCAI-2003]

“Tweaks” for protocols
Hardness of manipulation is one factor in choosing a voting protocol, but many existing protocols have other nice properties:
–Each attempts to maximize a certain notion of welfare
It would be nice to be able to tweak protocols: change the protocol slightly so that
–Hardness of manipulation is increased (significantly)
–(Most of) the original protocol’s properties still hold
It would also be nice to have a single, universal tweak for all (or many) protocols
A preround turns out to be such a tweak! And it introduces hardness far beyond previous results

Adding a preround to the protocol
A preround proceeds as follows:
–Pair the candidates
–Each candidate faces its opponent in a pairwise election
–The winners proceed to the original protocol

Preround example (with Borda)
STEP 1: Collect votes and match candidates (in either order):
–Voter 1: A>B>C>D>E>F; Voter 2: D>E>F>A>B>C; Voter 3: F>D>B>E>C>A
–Match A with B; match C with F; match D with E
STEP 2: Determine the winners of the preround:
–A vs B: A ranked higher by voters 1, 2
–C vs F: F ranked higher by voters 2, 3
–D vs E: D ranked higher by all
STEP 3: Infer the votes on the remaining candidates:
–Voter 1: A>D>F; Voter 2: D>F>A; Voter 3: F>D>A
STEP 4: Execute the original protocol (Borda):
–A gets 2 points, F gets 3 points, D gets 4 points and wins!
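The four steps above can be simulated directly (an illustrative sketch, not from the lecture; a pairwise tie in the preround is broken in favor of the second candidate of the pair, which a real protocol would need to specify):

```python
def preround_winner(votes, matching):
    """Preround tweak in front of Borda. votes: full rankings (best first);
    matching: list of candidate pairs for the preround."""
    survivors = []
    for a, b in matching:
        a_wins = sum(1 for v in votes if v.index(a) < v.index(b))
        # a advances only with a strict majority; ties go to b here
        survivors.append(a if 2 * a_wins > len(votes) else b)
    # infer each voter's ranking restricted to the survivors
    reduced = [[c for c in v if c in survivors] for v in votes]
    # run Borda on the reduced profile
    m = len(survivors)
    scores = {c: 0 for c in survivors}
    for v in reduced:
        for i, c in enumerate(v):
            scores[c] += m - 1 - i
    return max(scores, key=scores.get)
```

On the slide's example, the survivors are A, F, D and the Borda winner of the reduced profile is D.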

Matching first, or vote collection first?
–Match, then collect: the matching (“A vs C, B vs D”) is announced first, then the voters submit full rankings (e.g. “D > C > B > A”)
–Collect, then match (randomly): the votes are collected first, then the matching is drawn at random

Could also interleave…
The elicitor alternates between:
–(Randomly) announcing part of the matching (“A vs F”, “B vs E”, …)
–Eliciting part of each voter’s vote (“C > D”, “A > E”, …)

Main result: how hard is manipulation?
Manipulation hardness differs depending on the order/interleaving of preround matching and vote collection:
–Theorem. NP-hard if preround matching is done first
–Theorem. #P-hard if vote collection is done first
–Theorem. PSPACE-hard if the two are interleaved (for a complicated interleaving protocol)
In each case, the tweak introduces the hardness for any protocol satisfying certain sufficient conditions
–Plurality, Borda, Maximin, and STV all satisfy the conditions in all cases ⇒ they are hard to manipulate with the preround

NP-hard with preround matching first
Theorem. (Sufficient condition) Suppose that in a protocol we can construct a set of votes for the other voters such that:
–There is a set of candidates K that could possibly defeat our preferred candidate p in the final round (the original protocol)
–However, there is another set L of “nemeses” of K
–For each candidate c_k in K, there are some candidates in L such that if even one of them survives the preround, c_k will not defeat p in the final round
–The candidates in L are matched in pairs (c_{+l} and c_{-l}), and each pair is tied in their pairwise election
Then manipulation is NP-hard!
Proof idea: Suppose each pair of candidates in L faces each other in the preround. We have to choose between nemeses such that each candidate in K gets at least one nemesis ⇒ reduction from SATISFIABILITY.

Still not done…
Now we still have to show that protocols meet this complicated condition
Simple example: Plurality
–Assume that for each c_k in K, each of its nemeses would steal some of c_k’s votes in Plurality; that is, c_k is ranked below (only) its nemeses in these votes
–Then any of the nemeses going on to the final round would push c_k below p

Conclusions on tweaks
We have shown that it is possible to tweak protocols…
–A tweak preserves many of the protocol’s properties
…in order to drastically increase the computational hardness of manipulation
–If manipulation is computationally hard, it is less likely to occur
The tweak we introduced is a preround:
–One round of pairwise elimination
–This makes manipulation NP-hard, #P-hard, or even PSPACE-hard, depending on whether scheduling the preround is done before, after, or during vote collection
–These are the first results where manipulation is more than NP-hard

Hardness of manipulation with few candidates [Conitzer, Sandholm AAAI-2002] [Conitzer, Lang, Sandholm TARK-2003]

What if there are few candidates?
The previous results rely on the number of candidates (m) being unbounded
We designed a recursive algorithm for individually manipulating STV with O(1.62^m) calls (and usually far fewer)
–E.g. 20 candidates: 1.62^20 ≈ 15,500
Sometimes the candidate space is much larger:
–Voting over allocations of goods/tasks; the California gubernatorial recall election (135 candidates)
But what if it is not? A typical election for a representative will only have a few candidates

Manipulation complexity with few candidates
Ideally, we would like complexity results for a constant number of candidates
But then the manipulator can simply evaluate each possible vote (assuming the others’ votes are known)
Even for coalitions, there are only polynomially many effectively different votes
However, if we place weights on the votes, complexity may return:
–Constant #candidates, individual manipulation: easy (unweighted or weighted voters)
–Constant #candidates, coalitional manipulation: easy for unweighted voters; can be hard for weighted voters
–Unbounded #candidates: can be hard even for individual, unweighted manipulation

Why study weighted coalitional manipulation?
In large elections, an effective individual manipulation usually does not exist
Many real-world elections have weights:
–E.g. the electoral college
–Weights are more likely with heterogeneous software agents
Weighted coalitional manipulation may be more realistic than assuming an unbounded number of candidates
Theorem. Whenever weighted coalitional manipulation is hard under certainty, individual weighted manipulation is hard under uncertainty
Theorem. Whenever evaluating an election is hard with independent weighted voters, it is hard with correlated unweighted voters

Constructive manipulation now becomes:
We are given the weighted votes of the others (with the weights)
And we are given the weights of the members of our coalition
Can we make our preferred candidate c win?
E.g. another Borda example:
–Voter 1 (weight 4): A>B>C; Voter 2 (weight 7): B>A>C
–Manipulators: one with weight 4, one with weight 9
–Can we make C win? Yes! Solution: the weight-4 voter votes C>B>A, the other C>A>B
–Borda scores: A: 24, B: 22, C: 26
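The arithmetic in this example is easy to verify mechanically. A small sketch of weighted Borda scoring (illustrative code, not from the lecture):

```python
def weighted_borda_scores(weighted_votes):
    # weighted_votes: list of (weight, ranking) pairs, rankings best-first;
    # a weight-w voter contributes w * (m - 1 - position) points per candidate
    cands = weighted_votes[0][1]
    m = len(cands)
    scores = {c: 0 for c in cands}
    for w, v in weighted_votes:
        for i, c in enumerate(v):
            scores[c] += w * (m - 1 - i)
    return scores

# the slide's example profile
others = [(4, ('A', 'B', 'C')), (7, ('B', 'A', 'C'))]
manipulators = [(4, ('C', 'B', 'A')), (9, ('C', 'A', 'B'))]
scores = weighted_borda_scores(others + manipulators)
```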

Destructive manipulation
Exactly the same, except:
–Instead of a preferred candidate, we now have a hated candidate
–Our goal is to make sure that the hated candidate does not win (whoever else wins)
–Utility-theoretically: the hated candidate gives utility 0, everyone else utility 1

Some other well-known protocols
Veto: each voter vetoes one candidate; the candidate with the fewest vetoes wins
Copeland: the candidate with the most pairwise election victories wins
Cup: candidates are arranged in a tennis-style elimination tournament and advance based on pairwise elections
Plurality with runoff: the two candidates with the highest plurality scores advance to the “runoff”; the winner is the winner of their pairwise election

A simple example of hardness
We want: given the other voters’ votes, it is NP-complete to find votes for the manipulators that achieve their objective
Simple example: the veto protocol, constructive manipulation, 3 candidates
–Suppose that, from the given votes, p has received 2K-1 more vetoes than a, and 2K-1 more than b
–The manipulators’ combined weight is 4K (every individual weight is a multiple of 2)
–The only way for p to win is if the manipulators veto a with exactly 2K weight and b with exactly 2K weight
–But deciding whether the weights can be split this way is PARTITION ⇒ NP-complete!
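The decision at the heart of this reduction, whether the coalition's veto weight splits into two halves of exactly 2K, is a subset-sum question. A sketch (illustrative code, not from the lecture) using the standard pseudo-polynomial reachable-sums idea, which of course does not contradict NP-hardness since its running time depends on the magnitude of the weights:

```python
def can_split_vetoes(weights, K):
    """p can win iff some subset of the manipulators' weights sums to
    exactly 2K (vetoing a), with the remaining 2K vetoing b: PARTITION."""
    assert sum(weights) == 4 * K
    reachable = {0}          # sums achievable by some subset seen so far
    for w in weights:
        reachable |= {s + w for s in reachable}
    return 2 * K in reachable
```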

What does it mean for a protocol to be easy to manipulate?
Given the other voters’ votes, there is a polynomial-time algorithm to find votes for the manipulators that achieve their objective
If the protocol is easy to run, it is easy to check whether a given vector of votes for the manipulators is successful
Lemma. Suppose the protocol satisfies the following (for some number of candidates): if there is a successful (constructive, destructive) manipulation, then there is a successful manipulation in which all manipulators vote identically. Then the protocol is easy to manipulate
–Simply check all possible orderings of the candidates (a constant number)
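The lemma's algorithm is a search over only the m! identical-vote profiles, which is a constant amount of work when m is constant. A sketch (illustrative code with assumed helper names; it finds a manipulation only for protocols that actually satisfy the lemma's condition):

```python
from itertools import permutations

def weighted_borda_winner(weighted_votes):
    # weighted_votes: (weight, ranking) pairs; ties broken by candidate order
    cands = weighted_votes[0][1]
    m = len(cands)
    scores = {c: 0 for c in cands}
    for w, v in weighted_votes:
        for i, c in enumerate(v):
            scores[c] += w * (m - 1 - i)
    return max(scores, key=scores.get)

def identical_vote_manipulation(protocol, others, coalition_weights, target):
    """Check only the m! profiles in which every manipulator casts the
    same ranking; return a successful ranking, or None."""
    for ranking in permutations(others[0][1]):
        profile = others + [(w, ranking) for w in coalition_weights]
        if protocol(profile) == target:
            return ranking
    return None
```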

Example: Maximin with 3 candidates is easy to manipulate constructively
Recall: a candidate’s Maximin score is its worst score in any pairwise election
3 candidates: p, a, b. The manipulators want p to win. Suppose there exists a vote vector for the manipulators that makes p win. WLOG we can assume that all manipulators rank p first:
–So they either vote p > a > b or p > b > a
Case I: a’s worst pairwise election is against b, and b’s worst is against a
–Then one of them would have a Maximin score of at least half the total vote weight, and win (or be tied for first) ⇒ cannot happen
Case II: one of a and b has its worst pairwise election against p
–Say it is a; then we can have all the manipulators vote p > a > b
–This does not affect p’s or a’s score, and can only decrease b’s score

Why do we care about the exact number of candidates required for hardness?
If your election has 3 candidates, and the protocol becomes hard to manipulate only at 4 candidates, that hardness is of little use to you!
If you do not know beforehand how many candidates the election will have, the lower the threshold at which hardness sets in, the better your chances of getting the hardness
The minimal number of candidates for hardness can be used to compare and quantify the relative hardness of manipulation across protocols
–Hardness of manipulation is only one factor in deciding on a protocol, so it is good to know how much harder one protocol is to manipulate than another

Results for constructive manipulation

Results for destructive manipulation

Worst-case hardness…
All of these are worst-case measures
–Worst-case hardness may not prevent all (or even most) instances from being manipulable
–It would be nice if we had some sort of average-case hardness
In the works: an impossibility result
–It is impossible to make a constant fraction of instances hard to manipulate within a class of protocols

Thank you for your attention!

Uncertainty about others’ votes
So far we assumed that the manipulator(s) know the others’ votes
–Unrealistic ⇒ drop this assumption
Theorem. Whenever constructive weighted coalitional manipulation is hard under certainty, individual weighted manipulation is hard under uncertainty
–Holds even when the manipulator’s vote is worthless, i.e. we just wish to evaluate an election
–Holds even with very limited kinds of uncertainty: independence; every vote either completely known or completely unknown
–Proof sketch. When the manipulator’s vote is worthless, it is difficult to figure out whether a certain candidate has a chance of winning, because this requires a constructive vote by the unknown voters

Uncertainty about others’ votes…
Now let’s drop the assumption of independence between voters
–Usually votes are highly correlated
–Also, identical software agents will vote identically
Theorem. Whenever evaluating an election is hard with independent weighted voters, it is hard with correlated unweighted voters
–Holds even with very limited kinds of correlation: perfect correlation or independence
–Proof sketch. Just replace a vote of weight k by k unweighted, perfectly correlated voters
So, because evaluation with independent weighted voters is hard for Borda, Veto, STV, Plurality with runoff, Copeland, Maximin, and Randomized Cup, evaluation is hard for those protocols even with (correlated) unweighted voters

Randomization can be used to make manipulation hard
Consider the Cup protocol:
–Candidates play an elimination tournament based on pairwise elections
Given the schedule (the leaf labels), any type of manipulation is easy, even with an unbounded number of candidates:
–For each node in the tree, we can build (bottom-up) the set of candidates that would reach this node for some vote by the coalition
–Manipulating a subtree only requires commitment on the order of the candidates in that subtree
Idea: randomize (uniformly) over schedules after the votes are received
Theorem. Manipulating Randomized Cup is NP-complete
–The proof is complex and uses 7 candidates (manipulation is easy with 6 candidates!)
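The bottom-up construction for a fixed schedule can be sketched as follows (an illustrative sketch under stated assumptions, not the lecture's code: `fixed[x][y]` is the total weight of non-manipulators preferring x to y, W is the total coalition weight, a strict weighted majority is required to advance, and to push a candidate through the coalition ranks it first, so it receives all of W in every pairwise election it plays):

```python
def cup_possible_winners(node, fixed, W):
    """Return the set of candidates that can win the subtree at `node`
    for some vote by the coalition. `node` is either a candidate name
    (a leaf) or a pair (left_subtree, right_subtree)."""
    if not isinstance(node, tuple):
        return {node}
    left = cup_possible_winners(node[0], fixed, W)
    right = cup_possible_winners(node[1], fixed, W)

    def beats(x, y):
        # x wins the pairwise election when helped by the full coalition weight
        return fixed[x][y] + W > fixed[y][x]

    # x survives the node if it can reach one child and beat some candidate
    # that can reach the other child
    return ({x for x in left if any(beats(x, y) for y in right)} |
            {y for y in right if any(beats(y, x) for x in left)})
```

Randomizing over schedules after the votes are in defeats this schedule-by-schedule reasoning, which is what makes Randomized Cup hard.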

Hardness of manipulation and the revelation principle [Conitzer, Sandholm AMEC-2003]

Computational criticisms of the revelation principle The revelation principle says nothing about the computational implications of using direct, truthful mechanisms Does restricting oneself to such mechanisms lead to computational hassles? YES If the participating agents have computational limits, does restricting oneself to such mechanisms lead to a loss in the objective (e.g. social welfare)? YES –Even if the center is computationally unbounded!

Criticizing one-step mechanisms Theorem. There are settings where: –Executing the optimal single-step mechanism requires an exponential amount of communication and computation –There exists an entirely equivalent two-step mechanism that only requires a linear amount of communication and computation Holds both for dominant strategies and Bayes-Nash implementation

Criticizing truthful mechanisms
Theorem. There are settings where:
–Executing the optimal truthful mechanism (in terms of social welfare) is NP-complete
–There exists an insincere mechanism, where:
 The center only carries out polynomial computation
 Finding a beneficial insincere revelation is NP-complete for the agents
 If the agents manage to find the beneficial insincere revelation, the insincere mechanism is just as good as the optimal truthful one
 Otherwise, the insincere mechanism is strictly better (in terms of social welfare)
Theorem. The same holds under an oracle model (replace “NP-complete” with “requiring an exponential number of queries”)
Both theorems hold for both dominant-strategies and Bayes-Nash implementation

Is there a systematic approach?
The previous result is for a very specific setting. How do we take such computational issues into account in mechanism design more generally?
What is the correct tradeoff?
–Cautious: make sure that computationally unbounded agents would not make the mechanism worse than the best truthful mechanism (as in the previous result)
–Aggressive: take a risk and assume agents are probably somewhat bounded
What kind of mechanism design approaches can help?
–Classical: attempt to theoretically characterize mechanisms that take maximal advantage of computational issues
–Automated [Conitzer & Sandholm 02, 03; Jameson, Hackl & Kleinbauer 03; Hsu 03]: compute the mechanism on the fly for the setting at hand