1 Preference Elicitation and Communication Burden (based on papers by Nisan, Segal, Lahaie and Parkes). October 27th, 2004, Jella Pfeiffer

2 Outline: Motivation, Communication, Lindahl prices, Communication complexity, Preference Classes, Applying Learning Algorithms to Preference Elicitation, Applications, Conclusion, Future Work

4 Motivation: The number of bundles is exponential in the number of goods, which burdens both the communication of values and the determination of valuations; moreover, agents are reluctant to reveal their valuations entirely. Goal: minimize communication and information revelation* (*incentives are not considered).

5 Agenda: Motivation, Communication (Burden, Protocols), Lindahl prices, Communication complexity, Preference Classes, Applying Learning Algorithms to Preference Elicitation, Applications, Conclusion, Future Work

6 Communication burden: the minimum number of messages that must be transmitted in a (nondeterministic) protocol realizing the communication task. Here the "worst-case" burden is used, i.e. the maximum number of messages over all states.

7 Communication protocols: messages are sent sequentially. 1. Deterministic protocol: the message sent is determined by the agent's type and the preceding messages. 2. Nondeterministic protocol: an omniscient oracle knows the state of the world ≽ and a desirable alternative x ∈ F(≽).

8 Definition (nondeterministic protocol): A nondeterministic protocol is a triple Γ = (M, μ, h), where M is the message set, μ: ℜ ⇉ M is the message correspondence, and h: M → X is the outcome function, and the message correspondence μ has the following two properties: 1. Existence: μ(≽) ≠ ∅ for all ≽ ∈ ℜ. 2. Privacy preservation: μ(≽) = ∩_i μ_i(≽_i) for all ≽ ∈ ℜ, where μ_i: ℜ_i ⇉ M for all i ∈ N.
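To make the definition concrete, here is a minimal toy sketch in Python: the types, message set, outcome function and choice rule F are invented placeholders (not from the paper), and the check simply verifies existence, privacy preservation, and that h selects an outcome in F(≽) for every state.

```python
from itertools import product

# Toy nondeterministic protocol Gamma = (M, mu, h) for two agents.
# Types, messages, outcomes and the rule F are illustrative placeholders.

types_1 = ["hi", "lo"]                   # agent 1's possible preferences
types_2 = ["hi", "lo"]                   # agent 2's possible preferences
M = ["m_hh", "m_hl", "m_lh", "m_ll"]     # message set

# Individual message correspondences mu_i: each agent accepts exactly the
# messages consistent with her own preference (privacy preservation).
mu_1 = {"hi": {"m_hh", "m_hl"}, "lo": {"m_lh", "m_ll"}}
mu_2 = {"hi": {"m_hh", "m_lh"}, "lo": {"m_hl", "m_ll"}}

# Outcome function h: M -> X.
h = {"m_hh": "split", "m_hl": "agent1", "m_lh": "agent2", "m_ll": "split"}

# Choice rule F to be realized (placeholder rule).
def F(t1, t2):
    if t1 == "hi" and t2 == "lo":
        return {"agent1"}
    if t1 == "lo" and t2 == "hi":
        return {"agent2"}
    return {"split"}

for t1, t2 in product(types_1, types_2):
    mu = mu_1[t1] & mu_2[t2]             # mu(state) = intersection of the mu_i
    assert mu, "existence: mu(state) must be nonempty"
    assert all(h[m] in F(t1, t2) for m in mu), "h must pick an outcome in F"
print("toy protocol realizes F with privacy preservation")
```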

9 Agenda: Motivation, Communication, Lindahl prices (Equilibria, Importance of Lindahl prices), Communication complexity, Preference Classes, Applying Learning Algorithms to Preference Elicitation, Applications, Conclusion, Future Work

10 Lindahl Equilibria. Lindahl prices are nonlinear and non-anonymous (personalized, bundle-specific prices p_i). Definition: prices p = (p_1, ..., p_n) together with an allocation x form a Lindahl equilibrium in state ≽ ∈ ℜ if 1. (L1) every agent i ∈ N weakly prefers her assigned bundle to any other bundle at her prices (with quasi-linear valuations: x_i maximizes v_i(S) − p_i(S) over all bundles S), and 2. (L2) the allocation maximizes the seller's revenue Σ_i p_i(x_i) over all feasible allocations. Lindahl equilibrium correspondence: E: ℜ ↠ X.
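A brute-force checker for the two conditions, under the quasi-linear reading used later in the deck (personalized bundle prices p_i, condition (L1) as utility maximization and (L2) as revenue maximization by the seller). The data layout (dicts from bundles to numbers) and the function names are assumptions for illustration, not the paper's notation.

```python
from itertools import combinations

def all_bundles(items):
    """All subsets of the item set."""
    items = list(items)
    return [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

def feasible_allocations(items, n):
    """All ways to give each item to one of n agents or leave it unallocated."""
    allocs = [tuple(frozenset() for _ in range(n))]
    for item in items:
        extended = []
        for a in allocs:
            extended.append(a)                      # item stays unallocated
            for i in range(n):
                extended.append(tuple(b | {item} if j == i else b
                                      for j, b in enumerate(a)))
        allocs = extended
    return allocs

def is_lindahl_equilibrium(items, valuations, prices, allocation):
    """valuations[i] and prices[i] map bundles to numbers; allocation is a
    tuple of disjoint bundles, one per agent."""
    n, B = len(valuations), all_bundles(items)
    # (L1) each agent's bundle maximizes her quasi-linear utility v_i - p_i.
    for i in range(n):
        u_star = valuations[i][allocation[i]] - prices[i][allocation[i]]
        if any(valuations[i][S] - prices[i][S] > u_star + 1e-9 for S in B):
            return False
    # (L2) the allocation maximizes the seller's revenue sum_i p_i(S_i).
    revenue = sum(prices[i][allocation[i]] for i in range(n))
    return all(sum(prices[i][a[i]] for i in range(n)) <= revenue + 1e-9
               for a in feasible_allocations(items, n))
```

Calling is_lindahl_equilibrium on small instances lets one verify a candidate price/allocation pair by exhaustive search.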

11 Importance of Lindahl prices: A protocol realizes the weakly Pareto efficient correspondence F* if and only if there exists an assignment of budget sets to its messages such that the protocol realizes the Lindahl equilibrium correspondence E. Hence the communication burden of efficiency equals the burden of finding Lindahl prices.

12 Agenda: Motivation, Communication, Lindahl prices, Communication complexity (Alice and Bob, Proof for the Lower Bound, Communication complexity), Preference Classes, Applying Learning Algorithms to Preference Elicitation, Applications, Conclusion, Future Work

13 Alice and Bob

14 Communication Complexity (1): The lower bound obtained in the simple "Alice and Bob" two-party model also covers the richer settings: including an auctioneer, a larger number of bidders, communication via queries to the bidders, communication of real numbers, and deterministic protocols.

15 The proof. Lemma: Let v ≠ u be arbitrary 0/1 valuations. Then the sequence of bits transmitted on inputs (v, v*) is not identical to the sequence of bits transmitted on inputs (u, u*), where v*(S) = 1 − v(S^c). Theorem: Every protocol that finds the optimal allocation for every pair of 0/1 valuations v_1, v_2 must therefore produce a distinct transcript for every 0/1 valuation, so in the worst case it uses at least log_2 of the number of 0/1 valuations as total communication, which is exponential in the number of items.
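The dual construction v*(S) = 1 − v(S^c) can be checked mechanically: for any 0/1 valuation and its dual, every split (S, M \ S) has total value exactly 1, which is the algebraic fact the fooling-set argument builds on. A small illustrative script (the item count and the random seed are arbitrary):

```python
from itertools import combinations
import random

m = 4
items = frozenset(range(m))

def all_bundles(items):
    items = list(items)
    return [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

# A random 0/1 valuation (normalized: the empty bundle is worth 0).
random.seed(0)
v = {S: (0 if not S else random.randint(0, 1)) for S in all_bundles(items)}

# Dual valuation v*(S) = 1 - v(complement of S).
v_star = {S: 1 - v[items - S] for S in all_bundles(items)}

# For a dual pair, every split (S, M \ S) has total value exactly 1.
for S in all_bundles(items):
    assert v[S] + v_star[items - S] == 1
print("every split of a dual pair (v, v*) achieves total value 1")
```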

16 Comments on the proof. In the main paper: obtaining any better allocation than auctioning off all objects as a single bundle in a two-bidder auction already requires communication exponential in the number of items. This holds for valuations with no externalities and normalization. With L = 50 items, the required number of bits already corresponds to about 500 gigabytes of data.

17 Communication Complexity (2). Theorem*: Exact efficiency requires communicating at least one price for each of the possible bundles (there are 2^L of them for L items), so the dimension of the message space is exponential in the number of items. *Holds for general valuations.

18 Agenda: Motivation, Communication, Lindahl prices, Communication complexity, Preference Classes, Applying Learning Algorithms to Preference Elicitation, Applications, Conclusion, Future Work

19 Preference Classes. Submodular valuations: the dimension of the message space in any efficient protocol is still exponential in the number of items. Homogeneous valuations (agents care only about the number of items received): dimension L. Additive valuations: dimension L.

20 Agenda: Motivation, Communication, Lindahl prices, Communication complexity, Preference Classes, Applying Learning Algorithms to Preference Elicitation (Learning algorithms, Preference elicitation, Parallels: polynomial-query learnable/elicitable, Converting learning algorithms), Applications, Conclusion, Future Work

21 Applying Learning Algorithms: correspondence between query types. Learning theory ↔ Preference elicitation: Membership query ↔ Value query; Equivalence query ↔ Demand query.

22 What is a Learning Algorithm? Learning an unknown function f: X → Y by asking questions to an oracle, where the function class C is known. Typically X = {0,1}^m and Y is either {0,1} or a subset of ℜ. Manifest hypotheses: the learner's current guess of f. size(f) is measured with respect to the chosen representation. Example: f: {0,1}^m → ℜ with f(x) = 2 if x consists of m 1's, and f(x) = 0 otherwise, represented either 1) as a list of all 2^m values, or 2) as the single monomial 2·x_1·…·x_m.
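A sketch of the example's two representations, showing why size(f) depends on the representation: an explicit table with 2^m entries versus a single monomial of size about m. The function and variable names are illustrative.

```python
from itertools import product

m = 4

def f(x):
    """f(x) = 2 if x is the all-ones vector, 0 otherwise."""
    return 2 if all(x) else 0

# Representation 1: an explicit list of all values (size grows as 2**m).
table = {x: f(x) for x in product([0, 1], repeat=m)}
print(len(table))                       # 2**m entries

# Representation 2: the single monomial 2 * x_1 * ... * x_m (size about m).
def f_poly(x):
    value = 2
    for xi in x:
        value *= xi
    return value

assert all(f_poly(x) == f(x) for x in product([0, 1], repeat=m))
```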

23 Learning Algorithm - Queries (learner ↔ oracle). Membership query: the learner sends an instance x ∈ X and the oracle returns f(x). Equivalence query: the learner proposes a hypothesis; the oracle answers YES if the hypothesis agrees with f everywhere, and otherwise NO together with a counterexample x on which the hypothesis and f differ.
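An assumed, illustrative oracle interface for the two query types; the counterexample search is brute force over {0,1}^m, which is only meant to show the semantics of the answers, not an efficient implementation.

```python
from itertools import product

class Oracle:
    """Answers membership and equivalence queries about a hidden target f."""

    def __init__(self, target, m):
        self.target = target
        self.m = m

    def membership(self, x):
        # Membership query: return f(x) for the supplied instance x.
        return self.target(x)

    def equivalence(self, hypothesis):
        # Equivalence query: YES if the hypothesis equals the target
        # everywhere, otherwise NO plus a counterexample x where they differ.
        for x in product([0, 1], repeat=self.m):
            if hypothesis(x) != self.target(x):
                return False, x
        return True, None

oracle = Oracle(lambda x: int(all(x)), m=3)
print(oracle.membership((1, 1, 0)))        # 0
print(oracle.equivalence(lambda x: 0))     # (False, (1, 1, 1))
```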

24 Preference elicitation. Assumptions: normalized valuations, no externalities, quasi-linear utility functions, and bundle values computable from the representation in polynomial time. Goal: a sufficient set of manifest valuations to compute an optimal allocation.

25 Preference elicitation - Queries (auctioneer ↔ agent). Value query: the auctioneer sends a bundle S ⊆ M and the agent returns v(S). Demand query: the auctioneer proposes a bundle S and prices p; the agent answers YES if S is a most preferred bundle at prices p, and otherwise NO together with a more preferred bundle S'.
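The agent-side counterpart for the elicitation queries, again as an illustrative sketch: the demand query is answered with respect to the quasi-linear utility v(S) − p(S), and a NO comes with some strictly better bundle. The bundle-indexed dicts and the class name are my own conventions, not the paper's.

```python
from itertools import combinations

def all_bundles(items):
    items = list(items)
    return [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

class Agent:
    """Answers value and demand queries about a private valuation v."""

    def __init__(self, valuation, items):
        self.v = valuation          # dict: bundle -> value
        self.items = items

    def value_query(self, S):
        return self.v[S]

    def demand_query(self, S, prices):
        # prices: dict bundle -> price. YES if S maximizes v(T) - p(T),
        # otherwise NO plus some strictly preferred bundle.
        best = max(all_bundles(self.items),
                   key=lambda T: self.v[T] - prices[T])
        if self.v[S] - prices[S] >= self.v[best] - prices[best]:
            return True, None
        return False, best
```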

26 Parallels: learning and preference elicitation. 1) Membership query ↔ value query. 2) Equivalence query ↔ demand query? Yes: Lindahl prices are only a constant away from the manifest valuations, and from a preferred bundle S' a counterexample for the learning algorithm can be computed.

27 Polynomial-query learnable. Definition: The representation class C is polynomial-query exactly learnable from membership and equivalence queries if there is a fixed polynomial p and an algorithm L with access to membership and equivalence queries of an oracle such that, for any target function f ∈ C, L outputs after at most p(size(f), m) queries a function f' with f'(x) = f(x) for all instances x.

28 Polynomial-query elicitation. Similar to the definition of polynomial-query learnable, but: value and demand queries are used, the agents' valuations are the target functions, and the algorithm outputs an optimal allocation after at most p(size(v_1, ..., v_n), m) queries. The valuation functions need not be determined exactly!

29 Converting learning algorithms. Idea proved in the paper: if each representation class V_1, ..., V_n can be polynomial-query exactly learned from membership and equivalence queries, then V_1, ..., V_n can be polynomial-query elicited from value and demand queries.

30 Converted Algorithm. 1) Run the learning algorithms on the agents' valuation classes until each one requires a response to an equivalence query.

31 Converted Algorithm. 2) Compute an optimal allocation S* and Lindahl prices p* with respect to the manifest valuations. 3) Present the demand query, built from S* and p*, to each agent.

32 Converted Algorithm. 4) Quit if all agents answer YES; otherwise pass each counterexample from agent i to learning algorithm i and go to step 1. (A runnable sketch of this loop follows below.)
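A compressed, runnable sketch of steps 1 to 4. It substitutes a trivial "memorizing" learner for the paper's query-learning algorithms, and it uses the manifest valuations themselves as the supporting prices (a simple valid choice in the quasi-linear setting; the slides only say that Lindahl prices differ from the manifest valuations by constants). The brute-force allocation search and the tiny two-agent example are for illustration only.

```python
from itertools import combinations

def all_bundles(items):
    items = list(items)
    return [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

def feasible_allocations(items, n):
    """All ways to give each item to one of n agents or leave it unallocated."""
    allocs = [tuple(frozenset() for _ in range(n))]
    for item in items:
        extended = []
        for a in allocs:
            extended.append(a)                      # item stays unallocated
            for i in range(n):
                extended.append(tuple(b | {item} if j == i else b
                                      for j, b in enumerate(a)))
        allocs = extended
    return allocs

class MemorizingLearner:
    """Stand-in for the paper's query learners: remembers corrected bundle
    values and guesses 0 for every bundle it has never been corrected on."""
    def __init__(self):
        self.known = {}
    def hypothesis(self):
        return lambda S: self.known.get(S, 0)
    def correct(self, S, true_value):
        self.known[S] = true_value

def elicit(items, true_valuations):
    """Steps 1-4 of the converted algorithm, using value and demand queries."""
    n = len(true_valuations)
    learners = [MemorizingLearner() for _ in range(n)]
    while True:
        manifest = [L.hypothesis() for L in learners]                # step 1
        # Step 2: optimal allocation S* for the manifest valuations; the
        # manifest valuations themselves serve as supporting prices here.
        S_star = max(feasible_allocations(items, n),
                     key=lambda a: sum(manifest[i](a[i]) for i in range(n)))
        prices = manifest
        all_yes = True
        for i in range(n):                                           # step 3
            v, p = true_valuations[i], prices[i]
            best = max(all_bundles(items), key=lambda S: v[S] - p(S))
            if v[S_star[i]] - p(S_star[i]) < v[best] - p(best):
                all_yes = False                      # NO: counterexample bundle
                learners[i].correct(best, v[best])   # learned via a value query
        if all_yes:                                                  # step 4
            return S_star

# Tiny illustrative run: two agents, two items.
items = frozenset({0, 1})
v1 = {frozenset(): 0, frozenset({0}): 3, frozenset({1}): 0, frozenset({0, 1}): 3}
v2 = {frozenset(): 0, frozenset({0}): 0, frozenset({1}): 2, frozenset({0, 1}): 2}
print(elicit(items, [v1, v2]))   # agent 1 gets item 0, agent 2 gets item 1
```

When every agent answers YES, the allocation is optimal for the true valuations, because each agent's bundle maximizes v_i − manifest_i and the allocation maximizes the manifest welfare; each NO teaches the corresponding learner the true value of at least one new bundle, so the loop terminates on small instances.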

33 Agenda: Motivation, Communication, Lindahl prices, Communication complexity, Preference Classes, Applying Learning Algorithms to Preference Elicitation, Applications (Polynomial representation, XOR/DNF, Linear-Threshold), Conclusion, Future Work

34 Polynomials. T-sparse multivariate polynomials: polynomials with at most T terms, where each term is a product of variables (e.g. x_1 x_3 x_5). "Every valuation function can be uniquely written as a polynomial" [Schapire and Sellie]. Example: additive valuations are polynomials of size m (m = number of items): x_1 + ... + x_m. The learning algorithm uses at most polynomially many equivalence queries and polynomially many membership queries (polynomial in m and T).
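A small evaluator for T-sparse polynomials on 0/1 bundle indicators, with the additive valuation written as the m-term polynomial from the slide; the encoding of terms as (coefficient, indices) pairs is an assumption for illustration.

```python
def eval_sparse_poly(terms, x):
    """Evaluate a T-sparse multivariate polynomial on a 0/1 vector x.
    terms: list of (coefficient, indices), one entry per monomial."""
    total = 0
    for coeff, indices in terms:
        prod = coeff
        for i in indices:
            prod *= x[i]
        total += prod
    return total

m = 5
# Additive valuation as the m-term polynomial x_0 + ... + x_{m-1}.
additive = [(1, (i,)) for i in range(m)]
print(eval_sparse_poly(additive, (1, 0, 1, 1, 0)))   # bundle {0, 2, 3} -> 3
```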

35 XOR/DNF Representations (1). XOR bids represent valuations which satisfy free disposal. The analog in learning theory: DNF formulae, i.e. disjunctions of conjunctions of unnegated variables (monotone DNF), e.g. x_1 x_3 ∨ x_2 x_4, where the atomic bids of the XOR bid have value 1.
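A sketch of how an XOR bid values a bundle (the best atomic bid contained in it, which encodes free disposal) and how, with all atomic values equal to 1, the same bid reads as a monotone DNF formula. The encoding as (bundle, value) pairs is my own convention.

```python
def xor_bid_value(atomic_bids, S):
    """Value of bundle S under an XOR bid: the best atomic bid (B, w)
    whose bundle B is contained in S; 0 if no atomic bid fits."""
    return max((w for B, w in atomic_bids if B <= S), default=0)

def as_monotone_dnf(atomic_bids, S):
    """With all atomic values equal to 1, the XOR bid is a monotone DNF:
    a disjunction of conjunctions of unnegated item variables."""
    return any(B <= S for B, _ in atomic_bids)

bids = [(frozenset({0, 2}), 1), (frozenset({1, 3}), 1)]
S = frozenset({0, 2, 3})
print(xor_bid_value(bids, S))     # 1  (atomic bid {0, 2} is contained in S)
print(as_monotone_dnf(bids, S))   # True
```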

36 XOR/DNF Representations (2). An XOR bid containing t atomic bids can be exactly learned with t+1 equivalence queries and at most t·m membership queries: each equivalence query leads to one new atomic bid, which is found with at most m membership queries (excluding, one by one, the items of the counterexample that do not belong to the atomic bid).

37 Linear-Threshold Representations. r-of-S valuations: value 1 for a bundle that contains at least r items of the set S, and 0 otherwise; the learning-theory analog are r-of-k threshold functions (k = |S|). If r is known, they can be learned with polynomially many equivalence queries, or elicited with polynomially many demand queries.
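An r-of-S valuation as I read the slide: value 1 when the bundle contains at least r items of the set S, 0 otherwise. Illustrative code:

```python
def r_of_S_valuation(r, S, bundle):
    """Value 1 if the bundle contains at least r of the items in S, else 0."""
    return 1 if len(bundle & S) >= r else 0

S = frozenset({0, 1, 2, 3})
print(r_of_S_valuation(2, S, frozenset({1, 3, 7})))   # 1: two items from S
print(r_of_S_valuation(2, S, frozenset({0, 9})))      # 0: only one item from S
```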

38 Important Results by Nisan and Segal: Prices play an important role (an efficient allocation must reveal supporting Lindahl prices). Efficient communication must name at least one Lindahl price for each of the possible bundles. The lower bound means there is no good communication design in general, so the focus turns to specific classes of preferences.

39 Important Results by Lahaie and Parkes: A learning algorithm with membership and equivalence queries can serve as the basis for a preference elicitation algorithm. If a polynomial-query learning algorithm exists for the valuation classes, preferences can be elicited with a number of queries polynomial in m and size(v_1, ..., v_n). Such algorithms exist for polynomials, XOR bids, and linear-threshold representations.

40 Future Work: Finding more specific classes of preferences that can be elicited efficiently; addressing the issue of incentives; determining which Lindahl prices may be used for the queries.

41 Thank you for your attention. Any questions?

