Privacy as a tool for Robust Mechanism Design in Large Markets


1 Privacy as a tool for Robust Mechanism Design in Large Markets
(A Case Study) Based on joint works with: Rachel Cummings, Justin Hsu, Zhiyi Huang, Sampath Kannan, Michael Kearns, Mallesh Pai, Jamie Morgenstern, Ryan Rogers, Tim Roughgarden, Jon Ullman, and Steven Wu

2 Approximately Stable, School Optimal, and Student-Truthful Many-to-One Matchings (via Differential Privacy)
Aaron Roth. Joint work with: Sampath Kannan, Jamie Morgenstern, and Steven Wu

3 Many-to-one Stable Matchings

4 Many-to-one Stable Matchings
In a stable matching problem there are n students and m schools. Each student i has a total order ≻_i over the schools, and each school c has a total order ≻_c over the students. Students can be matched to at most 1 school; schools to at most s students.
Definition: A matching μ: [n] → [m] is stable if it satisfies:
Feasibility: For each school c, |μ⁻¹(c)| ≤ s.
No blocking pairs with filled seats: For each i ∈ [n] and c ∈ [m] such that μ(i) ≠ c, either μ(i) ≻_i c, or j ≻_c i for every j ∈ μ⁻¹(c).
No blocking pairs with empty seats: For every c such that |μ⁻¹(c)| < s, and every i ∈ [n] such that i ≻_c ∅, μ(i) ≻_i c.
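As a sanity check on the definition, the three conditions can be verified directly for a candidate matching. The sketch below is illustrative and not from the talk; all names are hypothetical, and preference orders are encoded as ranked lists, with anyone absent from a list treated as unacceptable (below the empty seat).

```python
def is_stable(mu, student_prefs, school_prefs, s):
    """mu: dict student -> school (or None if unmatched).
    student_prefs[i]: list of schools i finds acceptable, best first.
    school_prefs[c]: list of students c finds acceptable, best first.
    s: common school capacity."""
    enrolled = {}
    for i, c in mu.items():
        if c is not None:
            enrolled.setdefault(c, []).append(i)

    # Feasibility: |mu^{-1}(c)| <= s for every school c.
    if any(len(roster) > s for roster in enrolled.values()):
        return False

    def prefers(ranking, a, b):
        # True if a is ranked strictly above b (unranked = bottom).
        pos = {x: k for k, x in enumerate(ranking)}
        return pos.get(a, len(ranking)) < pos.get(b, len(ranking))

    for i, prefs in student_prefs.items():
        for c in prefs:  # schools i strictly prefers to mu(i) come first
            if c == mu.get(i):
                break  # everything past mu(i) is worse for i
            roster = enrolled.get(c, [])
            if len(roster) < s and i in school_prefs[c]:
                return False  # blocking pair with an empty seat
            if any(prefers(school_prefs[c], i, j) for j in roster):
                return False  # blocking pair with a filled seat
    return True
```

Iterating each student's list from the top and stopping at μ(i) visits exactly the schools that would form a blocking pair if they had room or preferred i to someone enrolled.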

5 Many-to-one Stable Matchings
Simple mechanisms compute the student-optimal/school-optimal matchings (student-/school-proposing deferred acceptance). But there are worst-case barriers:
Even in the 1-to-1 case, no mechanism is dominant-strategy truthful for both sides of the market [Dubins and Freedman 1981, Roth 1982].
In the many-to-one case, no school-optimal mechanism is dominant-strategy truthful for either side of the market [Roth 1984].
Can we circumvent these worst-case results with approximation and large-market assumptions?

6 "Traditional" Economic Approach
e.g. [Immorlica and Mahdian 05], [Kojima and Pathak 09], [Lee 11], [Azevedo and Budish 12], …
Make a strong distributional assumption about how preferences are generated: e.g. ([IM 05, KP 09]) students have preference lists of constant length k, drawn i.i.d. from a product distribution.
Show that as the "market grows large", when the exact school-optimal matching is computed, the fraction of people who have an incentive to deviate diminishes: e.g. as n → ∞ (with k fixed), with high probability a 1 − o(1) fraction of students have no incentive to misreport.

7 Here: A more robust "dual" approach.
Make no assumptions about student or school preferences. Ask for truthful reporting to be an asymptotically dominant strategy for every student. Make no "large market" assumptions except that schools have sufficiently many slots.
Instead: Perturb the process by which matchings are computed, and find "approximately stable", "approximately school-optimal" matchings.
Also: Ask for small finite-market bounds (not just limit results).

8 Approximately Stable Matchings
Recall: A matching μ: [n] → [m] is stable if it satisfies feasibility, no blocking pairs with filled seats, and no blocking pairs with empty seats (as defined above).
Definition: A matching μ: [n] → [m] is α-approximately stable (envy-free) if it satisfies feasibility and no blocking pairs with filled seats as before, and:
No blocking pairs with empty seats at under-enrolled schools: For every c such that |μ⁻¹(c)| < (1 − α)s, and every i ∈ [n] such that i ≻_c ∅, μ(i) ≻_i c.
Schools tolerate a small degree of under-enrollment.

9 Approximately School Optimal Matchings
Definition: Let μ* be the school-optimal stable matching. A matching μ is school dominant if for every school c and every pair of students i, j such that i ∈ μ⁻¹(c) \ μ*⁻¹(c) and j ∈ μ*⁻¹(c) \ μ⁻¹(c): i ≻_c j.
I.e., every student matched to c in a school-dominant matching must be at least as preferred as every student matched to c in the school-optimal matching. But there may be fewer of them.

10 Approximate Dominant Strategy Truthfulness
A utility function u_i: [m] → [0,1] is consistent with an ordering ≻_i if for every c, c′: c ≻_i c′ if and only if u_i(c) > u_i(c′).
Definition: A matching mechanism M is η-approximately dominant-strategy truthful if for every ≻ = (≻_1, …, ≻_n), every i ∈ [n] and deviation ≻′_i, and every utility function u_i consistent with ≻_i:
E_{c ∼ M(≻)_i}[u_i(c)] ≥ E_{c ∼ M(≻′_i, ≻_−i)_i}[u_i(c)] − η

11 Our Result
Theorem: There is a computationally efficient algorithm for computing α-approximately stable, school-dominant matchings that makes it an η-approximately dominant strategy for every student to report truthfully, whenever school capacity is sufficiently large:
s ≥ Ω((m · log n) / (ηα))
When students have constant-length preference lists, we only require:
s ≥ Ω((log n) / (ηα))
When s = ω(m · log n) (respectively ω(log n)), we can take α, η → 0.

12 Differential Privacy [DMNS06]: A Measure of Algorithmic Stability
Let t ∈ T^n denote an arbitrary type profile, and let t′_i ∈ T be any possible report for agent i. A mechanism M: T^n → O is ε-differentially private if for all S ⊆ O:
Pr[M(t) ∈ S] ≤ e^ε · Pr[M(t′_i, t_−i) ∈ S]
In particular, for any u: O → R_≥0:
E_{x ∼ M(t)}[u(x)] ≤ e^ε · E_{x ∼ M(t′_i, t_−i)}[u(x)]
Algorithmically enforced informational smallness.

13 A Helpful Change in Perspective: Admissions Thresholds
Think of school preferences ≻_c as represented by assigning a rating r_i^c ∈ {1, …, U} to each student i: i ≻_c j ⇔ r_i^c > r_j^c.
A set of admissions thresholds T = (t_1, …, t_m) induces a matching:
μ_T^≻(i) = argmax_{≻_i} {c : r_i^c ≥ t_c}
(i.e., students go to their favorite school that will have them).
Say thresholds T are α-approximately stable if μ_T^≻ is.
Idea: Try to find α-approximately stable, school-dominant thresholds, subject to differential privacy.
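The threshold-to-matching map μ_T is easy to transcribe directly. This is an illustrative sketch (the dictionary encodings of ratings and preferences are my own, not the talk's):

```python
def match_from_thresholds(student_prefs, ratings, thresholds):
    """student_prefs[i]: list of schools, best first;
    ratings[(i, c)]: school c's rating r_i^c of student i;
    thresholds[c]: admission threshold t_c.
    Each student attends their favorite school whose threshold
    their rating clears, or stays unmatched (None) if none does."""
    return {i: next((c for c in prefs if ratings[(i, c)] >= thresholds[c]), None)
            for i, prefs in student_prefs.items()}
```

Note that student i's match depends only on T, i's own preferences, and i's own ratings; this is what makes the "thresholds first, matching as post-processing" decomposition work.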

14 Differential Privacy Yields Approximate DSIC.
Theorem: Let M: (≻)^n → [0, U]^m be an ε-differentially private algorithm for computing admissions thresholds. The algorithm A which takes as input preferences ≻_1, …, ≻_n, computes T = M(≻), and outputs μ_T^≻ is ε-approximately dominant-strategy truthful for all students.
(The matching itself is computed subject to "joint differential privacy".)

15 Differential Privacy Yields Approximate DSIC.
Proof: Fix a set of preferences ≻, a student i, a deviation ≻′_i, and a utility function u_i consistent with ≻_i.
E_{c ∼ A(≻)_i}[u_i(c)]
= E_{T ∼ M(≻)}[u_i(argmax_{≻_i} {c : r_i^c ≥ t_c})]
≥ e^{−ε} · E_{T ∼ M(≻′_i, ≻_−i)}[u_i(argmax_{≻_i} {c : r_i^c ≥ t_c})]   (differential privacy)
≥ e^{−ε} · E_{T ∼ M(≻′_i, ≻_−i)}[u_i(argmax_{≻′_i} {c : r_i^c ≥ t_c})]   (argmax and consistency)
= e^{−ε} · E_{c ∼ A(≻′_i, ≻_−i)_i}[u_i(c)]
≥ E_{c ∼ A(≻′_i, ≻_−i)_i}[u_i(c)] − ε   (e^{−ε} ≥ 1 − ε and u_i ∈ [0,1])
Goal: Design a private algorithm to compute approximately stable, school-dominant thresholds.

16 School Proposing Deferred Acceptance
Set all school thresholds t_c = n + 1, an initial empty matching μ, and initial enrollment counts E_c = 0 for each school.
While there exists an under-enrolled school c (E_c < s and t_c > 0):
Lower the threshold for school c: t_c ← t_c − 1.
For each student i, if μ(i) ≠ argmax_{≻_i} {c′ : r_i^{c′} ≥ t_{c′}}, then:
E_{μ(i)} ← E_{μ(i)} − 1; μ(i) ← argmax_{≻_i} {c′ : r_i^{c′} ≥ t_{c′}}; E_{μ(i)} ← E_{μ(i)} + 1.
Output T = (t_1, …, t_m).
How can we make this differentially private?
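A direct, non-private transcription of the loop above (an illustrative sketch: I start thresholds at U + 1, one notch above the largest rating on the {1, …, U} scale from the previous slide, and recompute the induced matching from scratch each step rather than maintaining it incrementally):

```python
def school_proposing_da(student_prefs, ratings, schools, s, U):
    """Threshold-descent deferred acceptance: lower the threshold of any
    under-enrolled school one notch at a time. Terminates because each
    iteration lowers some threshold and thresholds are bounded below by 0."""
    t = {c: U + 1 for c in schools}  # initially no student clears any bar

    def induced_matching():
        # Each student goes to their favorite school that will have them.
        return {i: next((c for c in prefs if ratings[(i, c)] >= t[c]), None)
                for i, prefs in student_prefs.items()}

    while True:
        mu = induced_matching()
        # The only data access: per-school enrollment counts.
        enroll = {c: sum(1 for v in mu.values() if v == c) for c in schools}
        under = [c for c in schools if enroll[c] < s and t[c] > 0]
        if not under:
            return t, mu
        t[under[0]] -= 1  # lower one under-enrolled school's threshold
```

Isolating the enrollment counts as the sole data access is exactly what sets up the private version: replace those counts with private counters and the rest of the loop is post-processing.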

17 Some Useful Privacy Properties
Theorem (Post-processing): If M(≻) is ε-differentially private, and f is any (randomized) function, then f(M(≻)) is ε-differentially private.

18 Some Useful Privacy Properties
Theorem (Composition): If M_1, …, M_k are each ε-differentially private, then M(≻) ≡ (M_1(≻), …, M_k(≻)) is ≈ kε-differentially private.

19 So… We can go about designing algorithms as we normally would. Just access the data using differentially private "subroutines", and keep track of your "privacy budget" as a resource. Private algorithm design, like regular algorithm design, can be modular.
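A toy sketch of this modular style (entirely illustrative; the accountant and subroutine names are mine, not a real library): every data access goes through an ε-DP Laplace subroutine, and an accountant charges basic composition against a fixed budget.

```python
import random

class PrivacyAccountant:
    """Track a privacy budget spent under basic composition."""
    def __init__(self, total_eps):
        self.remaining = total_eps

    def laplace_count(self, true_count, eps):
        # Basic composition: spending eps here adds eps to the total cost.
        if eps > self.remaining + 1e-12:
            raise RuntimeError("privacy budget exhausted")
        self.remaining -= eps
        # Counting queries have sensitivity 1, so Laplace scale 1/eps;
        # a Laplace draw is the difference of two Exp(eps) draws.
        return true_count + random.expovariate(eps) - random.expovariate(eps)

acct = PrivacyAccountant(total_eps=1.0)
answers = [acct.laplace_count(100, eps=0.25) for _ in range(4)]  # spends it all
```

The point is the bookkeeping discipline, not this particular accountant: once each subroutine is private, the overall algorithm's privacy follows from composition and post-processing without re-analyzing the whole thing.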

20 School Proposing Deferred Acceptance
Set all school thresholds 𝑑 𝑐 =𝑛+1, an initial empty matching πœ‡, and initial counts 𝐸 𝑐 =0 of enrollment for each school. While there exists an under-enrolled school 𝑐 : 𝐸 𝑐 <𝑠 and 𝑑 𝑐 >0: Lower the threshold for school 𝑐: 𝑑 𝑐 ← 𝑑 𝑐 βˆ’1 For each student 𝑖, if πœ‡ 𝑖 β‰  arg max ≻ 𝑖 𝑐 π‘Ÿ 𝑖 𝑐 β‰₯ 𝑑 𝑐 } then: 𝐸 πœ‡(𝑖) ← 𝐸 πœ‡(𝑖) βˆ’1, πœ‡ 𝑖 ← arg max ≻ 𝑖 𝑐 π‘Ÿ 𝑖 𝑐 β‰₯ 𝑑 𝑐 } , 𝐸 πœ‡(𝑖) ← 𝐸 πœ‡(𝑖) +1 Output 𝑇=( 𝑑 1 ,…, 𝑑 π‘š ) Only data access: Keeping track of enrollment counts.

21 Privately Maintaining Counts
[Dwork-Naor-Pitassi-Rothblum 10] and [Chan-Shi-Song 10] give exactly the tool we need: a private algorithm for maintaining a running count. Given a stream of n bits, it maintains an estimate of the running count to accuracy ±Δ · polylog(n)/ε, where each person can affect at most Δ entries in the stream.
For us, Δ = 2: no student changes enrollment status at any school more than twice.

22 Privately Maintaining Counts
[Figure: the tree-based counter. A binary tree sits over the bit stream; each node stores the sum of the bits below it plus independent noise of magnitude ≈ (log n)/ε, and any running count is assembled from O(log n) noisy nodes.]
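A compact sketch of the binary ("tree-based") mechanism from those papers — my own illustrative implementation, not the talk's code, with the noise source injectable so the dyadic bookkeeping can be checked with noise switched off. Each stream element enters at most ⌈log₂ T⌉ dyadic blocks, each block is released once with its own noise, and any running count is a sum of at most log T noisy blocks.

```python
import math
import random

def laplace(scale):
    # A Laplace(0, scale) draw as the difference of two exponentials.
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)

class BinaryMechanism:
    """Private running counter over a bit stream of length <= T."""
    def __init__(self, eps, T, noise=laplace):
        # Each bit touches one block per level, so split eps across levels.
        self.eps_level = eps / max(1, math.ceil(math.log2(T + 1)))
        self.noise = noise
        self.alpha = {}   # exact dyadic block sums, keyed by level
        self.noisy = {}   # their noisy releases
        self.t = 0

    def feed(self, bit):
        """Consume the next bit; return the noisy count of the stream so far."""
        self.t += 1
        t = self.t
        # The new bit closes all blocks below level i = lowest set bit of t.
        i = (t & -t).bit_length() - 1
        total = bit + sum(self.alpha.get(j, 0) for j in range(i))
        for j in range(i):
            self.alpha[j] = 0
            self.noisy.pop(j, None)
        self.alpha[i] = total
        self.noisy[i] = total + self.noise(1.0 / self.eps_level)
        # Running count: one noisy block per set bit in the binary form of t.
        return sum(self.noisy[j] for j in range(t.bit_length()) if (t >> j) & 1)
```

With `noise=lambda s: 0.0` the mechanism reproduces exact prefix sums, a convenient correctness check; with Laplace noise each prefix estimate combines O(log T) draws of scale O((log T)/ε), matching the polylog(n)/ε accuracy quoted above.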

23 Private School Proposing Deferred Acceptance
Idea: Run school-proposing deferred acceptance, but maintain the enrollment counts privately. Privacy of the counters + post-processing + composition implies privacy of the whole algorithm, and η-DP implies η-approximate dominant-strategy truthfulness.
There are m schools to keep track of, so the total counting error is E = O(m · (log n)/η).
So as to never over-enroll, run as if capacity were shaded down by E. As long as capacity s ≥ E/α = O(m · (log n)/(ηα)), the under-enrollment due to capacity shading and counter error is ≤ α·s.

24 Private School Proposing Deferred Acceptance
Privacy ⇒ approximate dominant-strategy truthfulness. Utility guarantees?
Enrollments are always underestimated, so the sequence of proposals is always a subsequence of the proposals made by some trajectory of the (exact) school-proposing deferred acceptance algorithm. Hence:
No blocking pairs with filled seats
School dominance
Excess under-enrollment of at most E; the only blocking pairs with empty seats are at almost fully enrolled schools.

25 Stepping back… Differential privacy is a tool that can be used to design robust mechanisms in large markets:
Ex-post guarantees for all players, even in settings of incomplete information
No distributional assumptions
A shift in perspective for mechanism design: explicitly perturb mechanisms to yield distributional robustness, rather than proving structural properties about exact solutions on random instances.

26 Stepping back… Other applications:
Privately computing Walrasian equilibrium prices: asymptotically truthful combinatorial auctions with item pricings.
Privately computing correlated/Nash equilibria: mediators for equilibrium selection that make truth-telling an ex-post Nash equilibrium.
Privately selecting alternatives: a general recipe for mechanism design without money [McSherry Talwar 07, Nissim Smorodinsky Tennenholtz 11].
There should be more! Let's involve mechanism/market designers!

27 Stepping back more…
"Markets for Privacy": Can we find a "market price" for ε? It depends on individual costs of privacy risk, as well as the value of the resulting data analysis. Disclosures viewed as public goods? (Talk to John.)
"Markets for Data": Information is very interesting as a commodity. There are lots of complicated complementarities, because of inferences. Differential privacy removes some kinds of complementarities (by making reconstruction impossible) but leaves others. Privacy trades off in non-trivial ways with the "price of data".
Let's involve economists!

28 Thanks!

