
1 Mining Preferences from Superior and Inferior Examples (KDD’08)

2 Outline
Introduction
Problem Definition
Greedy Method
Term-Based Algorithm
Condition-Based Algorithm
Experimental Results
Conclusion

3 Introduction
Objects lie in a multidimensional space where the user's preferences on some categorical attributes are unknown.
Example:
ID  Price  Age  Developer
a   1600   2    X
b   2400   1    Y
c   3000   5    Z

4 Cont.
Superior Example (S): for each superior example o, according to the customer's preference there does not exist another realty o' that is as good as o in every aspect and better than o in at least one aspect.
Inferior Example (Q): for each inferior example o, according to the customer's preference there exists at least one realty o' that is as good as o in every aspect and better than o in at least one aspect.
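The two definitions above are a dominance (skyline-style) test. The sketch below is not from the paper; it only makes the definitions concrete under the assumption that each attribute's preference is supplied either as "smaller is better" or as an explicit set of better-than value pairs.

```python
# Minimal sketch of the dominance test behind the superior/inferior definitions.
# Assumption (not from the slides): one preference per attribute, either the
# string "min" (smaller is better) or a set of (better, worse) value pairs.

def at_least_as_good(u, v, pref):
    """True if value u is at least as good as value v under pref."""
    if u == v:
        return True
    if pref == "min":
        return u < v
    return (u, v) in pref

def dominates(o1, o2, prefs):
    """o1 dominates o2: as good in every aspect and better in at least one."""
    as_good = all(at_least_as_good(a, b, p) for a, b, p in zip(o1, o2, prefs))
    return as_good and o1 != o2

def split_superior_inferior(objects, prefs):
    """Superior = dominated by no other object; inferior = dominated by some."""
    superior, inferior = [], []
    for o in objects:
        if any(dominates(p, o, prefs) for p in objects if p is not o):
            inferior.append(o)
        else:
            superior.append(o)
    return superior, inferior
```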

5 Problem Definition
D = (D1, …, Dd), where each attribute Di is either a determined attribute (its preference is known) or an undetermined attribute (its preference is unknown and must be mined).
SPS (Satisfying Preference Set): a set of preferences R on the undetermined attributes such that, together with the known preferences, every example in S is superior and every example in Q is inferior.
d : the dimensionality of D
d': the number of determined attributes

6 Example
A: determined attribute
B, C: undetermined attributes
S = { }
Q = { }
Object  A   B   C
o1      a1  b1  c1
o2      a1  b1  c2
o3      a2  b1  c2
o4      a2  b2  c2
o5      a2  b3  c2

7 Cont.
Problem 1 (SPS Existence): given a set of superior examples S and a set of inferior examples Q, determine whether there exists at least one SPS R with respect to S and Q.
Problem 2 (Minimal SPS): given a set of superior examples S and a set of inferior examples Q, find an SPS R with respect to S and Q such that |R| is minimized. R is called a minimal SPS.
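Problem 1 can be made concrete with a brute-force check. The sketch below is not the paper's algorithm: it assumes smaller values are preferred on the determined attributes and, for simplicity, it only searches total orders on the undetermined attributes (an SPS may in general be a partial order). It is meant only to pin down what "there exists an SPS R with respect to S and Q" means on toy data.

```python
# Brute-force illustration of Problem 1 (SPS existence), not the paper's method.
# Restriction: only total orders on the undetermined attributes are tried.
from itertools import permutations, product

def sps_exists(objects, S, Q, det_attrs, undet_attrs):
    """objects: dict id -> tuple of values; S, Q: sets of ids.
    det_attrs: indexes of determined attributes (smaller value assumed better).
    undet_attrs: indexes of attributes whose preference must be mined."""
    domains = [sorted({o[a] for o in objects.values()}) for a in undet_attrs]

    for orders in product(*(permutations(d) for d in domains)):
        # rank[a][v] = position of value v in the tried order (0 = most preferred)
        rank = {a: {v: i for i, v in enumerate(order)}
                for a, order in zip(undet_attrs, orders)}

        def as_good(u, v, a):
            return u <= v if a in det_attrs else rank[a][u] <= rank[a][v]

        def dominates(p, q):
            return p != q and all(as_good(p[a], q[a], a) for a in range(len(p)))

        dominated = {i for i, o in objects.items()
                     if any(dominates(p, o) for j, p in objects.items() if j != i)}
        if not (S & dominated) and Q <= dominated:
            return True
    return False
```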

8 Cont.
Theorem (Multidimensional Preference): in space D = (D1, …, Dd), let ≺i be a preference on attribute Di and … . Then …, where … is the number of distinct values in attribute Di.

9 Term-Based Algorithm
Object-id  D1  D2  D3  D4  Label
o1         1   5   a3  b3
o2         1   6   a2  b1  Inferior
o3         1   6   a2  b3
o4         2   2   a1  b1
o5         2   5   a2  b2
o6         3   1   a4  b3
o7         3   4   a2  b2
o8         6   1   a5  b1
o9         6   1   a5  b3
o10        6   2   a1  b1  Inferior
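For the code sketches that follow, here is the table above encoded as data. Reading D1 and D2 as two determined numeric attributes on which smaller is better, and D3, D4 as the undetermined categorical attributes, is my interpretation of the slide rather than something it states explicitly.

```python
# The slide-9 example encoded as data for the sketches below.
# Assumption: D1, D2 are determined attributes (smaller is better);
# D3, D4 are the undetermined categorical attributes.
objects = {
    "o1":  (1, 5, "a3", "b3"),
    "o2":  (1, 6, "a2", "b1"),   # labeled inferior on the slide
    "o3":  (1, 6, "a2", "b3"),
    "o4":  (2, 2, "a1", "b1"),
    "o5":  (2, 5, "a2", "b2"),
    "o6":  (3, 1, "a4", "b3"),
    "o7":  (3, 4, "a2", "b2"),
    "o8":  (6, 1, "a5", "b1"),
    "o9":  (6, 1, "a5", "b3"),
    "o10": (6, 2, "a1", "b1"),   # labeled inferior on the slide
}
DET_ATTRS, UNDET_ATTRS = [0, 1], [2, 3]
```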

10 Cont.
Inferior  P(q)  Condition C_q(p)
o2        o1    …
o2        o3    …
o5        o1    …
o5        o4    …
o7        o4    …
o7        o6    …
o8        o6    …
o8        o9    …
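A sketch of how a table like the one above can be computed. It reads P(q) as the objects p that are at least as good as the inferior example q on every determined attribute, and C_q(p) as the set of value-pair terms on the undetermined attributes that would have to hold for p to dominate q. This is my reading of the slide, not code from the paper.

```python
# Compute, for one inferior example q, the candidate dominators P(q) and the
# condition C_q(p) each of them needs. A term is written (attr, better, worse),
# meaning "value `better` is preferred over value `worse` on attribute attr".

def candidate_conditions(objects, q_id, det_attrs, undet_attrs):
    """Return {p_id: C_q(p)} for the inferior example q_id."""
    q = objects[q_id]
    result = {}
    for p_id, p in objects.items():
        if p_id == q_id or p == q:
            continue
        # p can only dominate q if it is at least as good on every determined
        # attribute (smaller is better, as assumed above).
        if any(p[a] > q[a] for a in det_attrs):
            continue
        terms = {(a, p[a], q[a]) for a in undet_attrs if p[a] != q[a]}
        result[p_id] = terms   # empty set: p already dominates q outright
    return result

# Example: candidate_conditions(objects, "o2", DET_ATTRS, UNDET_ATTRS)
# yields conditions for p = o1 and p = o3, matching the P(o2) rows above.
```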

11 Cont.
Complexity increment CI: the increase in complexity incurred when a term is selected. Example: if … is selected on D3, then …

12 Cont.
Inferior example coverage Cov(t): the number of inferior examples newly satisfied if t is selected.
Inferior  P(q)  Condition C_q(p)
o2        o1    …
o2        o3    …
o5        o1    …
o5        o4    …
o7        o4    …
o7        o6    …
o8        o6    …
o8        o9    …

13 Cont.
Term \ Iteration   1      2       3
D3:
  (term)           1/3    1/4*    \
  (term)           1/3    1/8
  (term)           1/6    1/8
  (term)           1/6    \       \
D4:
  (term)           1/5    1/10    2/12*
  (term)           2/5*   \       \
  (term)           1/5
  (term)           1/6
(* marks the score of the term selected in that iteration.)
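A sketch of the term-based greedy loop that the iteration table walks through. The slides define Cov(t), but the exact complexity-increment formula CI(t) did not come through, so it is passed in as a function here; scoring terms by the ratio Cov(t)/CI(t) is my interpretation of the fraction entries in the table, not a formula quoted from the paper.

```python
# Term-based greedy: repeatedly add the single term t with the best score
# Cov(t) / CI(t) until every inferior example is satisfied (some condition
# C_q(p) is fully selected) or no term helps. Consistency checks (e.g. cycles
# among selected terms) are omitted in this sketch.

def greedy_terms(conditions_by_q, ci):
    """conditions_by_q: {q_id: {p_id: set of terms}}; ci(term, selected) -> cost > 0."""
    all_terms = {t for conds in conditions_by_q.values()
                 for terms in conds.values() for t in terms}

    def satisfied_under(terms):
        return {q for q, conds in conditions_by_q.items()
                if any(c <= terms for c in conds.values())}

    selected, satisfied = set(), satisfied_under(set())
    while len(satisfied) < len(conditions_by_q):
        best, best_score = None, 0.0
        for t in all_terms - selected:
            cov = len(satisfied_under(selected | {t}) - satisfied)   # Cov(t)
            score = cov / ci(t, selected)                            # Score(t)
            if score > best_score:
                best, best_score = t, score
        if best is None:
            break   # no remaining term makes progress
        selected.add(best)
        satisfied = satisfied_under(selected)
    return selected
```

Here conditions_by_q would be built with candidate_conditions from the earlier sketch, one entry per inferior example.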

14 Condition-Based Algorithm
Inferior  Condition   1       2        3
o2        C_o2(o1)    3/9     \        \
o2        C_o2(o3)    2/5*    \        \
o5        C_o5(o1)    2/9     1.5/10*  \
o5        C_o5(o4)    2/9     2/16
o7        C_o7(o4)    2/9     2/16     1/12
o7        C_o7(o6)    1.5/9   1.5/10   1/5*
o8        C_o8(o6)    2.5/9   \        \
o8        C_o8(o9)    2/5     \        \
(* marks the score of the condition selected in that iteration.)
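The condition-based variant selects a whole condition C_q(p) per step instead of a single term. The fractional numerators in the table above (e.g. 1.5/9) show that the paper's score is more refined than a simple count, so the count-per-new-term ratio used below is only a stand-in to show the control flow.

```python
# Condition-based greedy (simplified scoring): in each iteration pick the
# condition C_q(p) that satisfies the most not-yet-satisfied inferior examples
# per newly added term, then add all of its terms at once.

def greedy_conditions(conditions_by_q):
    """conditions_by_q: {q_id: {p_id: set of terms}}; returns selected terms."""
    def satisfied_under(terms):
        return {q for q, conds in conditions_by_q.items()
                if any(c <= terms for c in conds.values())}

    selected, satisfied = set(), satisfied_under(set())
    while len(satisfied) < len(conditions_by_q):
        best, best_score = None, 0.0
        for conds in conditions_by_q.values():
            for terms in conds.values():
                newly = len(satisfied_under(selected | terms) - satisfied)
                cost = max(len(terms - selected), 1)
                if newly / cost > best_score:
                    best, best_score = terms, newly / cost
        if best is None:
            break   # no condition makes progress
        selected |= best
        satisfied = satisfied_under(selected)
    return selected
```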

15 Experimental Results

16 Conclusion
Mining preferences from superior and inferior examples.
Greedy methods: terms or conditions are selected by the score Score(t).

