1
Title: Suggestion Strategies for Constraint-Based Matchmaker Agents
Authors: Eugene C. Freuder and Richard J. Wallace. Published at CP 1998.
2
Motivation: Internet searches are driven by a single client query, which is not user-friendly.
Example: an interior decorator working with a client.
3
Matchmaker
Matchmaker: “Is this what you want?” Client: “Sort of….”
Matchmaker: “How about this?” Client: “YES!”
Model a user who doesn’t know exactly what they want, but knows whether any given suggestion is or isn’t it.
4
Matchmakers: two entities.
Client: has a set of internal criteria.
Matchmaker: tries to figure out and satisfy the client’s criteria.
5
How? [Diagram: both the matchmaker and the client contain an internal CSP; the matchmaker’s CSP is a subset of the client’s CSP.]
The matchmaker sends a solution to its CSP as a suggestion. The client replies with the constraints in its own CSP that the suggestion violated. The matchmaker adds those constraints, solves the modified CSP, and offers the new solution. The exchange repeats until the client answers YES. [Speaker note: draw the domain on the board; ask what the universe is.]
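To make the loop concrete, here is a minimal sketch in Python. The solve function and the satisfied method on constraints are assumed interfaces, not the paper’s code:

```python
# A minimal sketch of the suggestion loop, assuming a generic CSP solver.
# solve() and Constraint.satisfied() are illustrative names, not from the paper.

def matchmake(client_constraints, known_constraints, variables, domains, solve):
    """Propose suggestions until the client accepts one.

    client_constraints: the client's full (hidden) constraint set P
    known_constraints:  the constraints K the matchmaker starts with
    solve:              any CSP solver mapping (variables, domains, constraints)
                        to a complete assignment
    """
    acquired = set(known_constraints)
    iterations = 0
    while True:
        suggestion = solve(variables, domains, acquired)  # solve the matchmaker CSP
        iterations += 1
        # Feedback assumption: the client reports every one of its constraints
        # that the suggestion violates.
        violated = {c for c in client_constraints if not c.satisfied(suggestion)}
        if not violated:
            return suggestion, acquired, iterations  # the client says YES
        acquired |= violated  # add the reported constraints and re-solve
```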
6
Terminology
CASP – Constraint Acquisition and Satisfaction Problem: a CSP in which not all of the constraints are known at the outset.
Domain – all variables and their values.
Universe – the set of all possible constraints.
Suggestion – a solution to the matchmaker’s CSP.
7
Knowledge
Universe – all possible constraints.
P – the constraints held by the customer.
K – the constraints known to both the customer and the solver.
[Diagram: K sits inside P, which sits inside the universe shared by client and matchmaker.]
8
Assumptions
P = the customer’s constraints; K = the matchmaker’s constraints.
The matchmaker and the client share the same universe.
Domain knowledge is expressed as constraints.
When the customer receives a suggestion, they report all of their constraints that the suggestion violates.
9
Goals of the Matchmaker
Find a suitable solution as quickly as possible (fewest suggestions and least work for a client who may not like answering all those questions).
Find out as much about the customer as possible (discover as many of the client’s constraints as possible; good for repeat business).
10
Criteria
Number of steps to solution – want to minimize.
Customer knowledge (number of customer constraints acquired) – want to maximize.
There may be a tradeoff between these two.
11
Implementation I
[Diagram: within the universe, the client holds Constraint 1 through Constraint P; the matchmaker holds Constraint 1 through Constraint K, a subset of the client’s.]
12
Implementation II
Two agents, both programs: one client, one matchmaker.
The customer has P constraints and the solver has K constraints, with K a subset of P and both K and P subsets of the universe.
13
Implementation III
For the client: a decision problem (accept or reject each suggestion).
For the matchmaker: an iterative decision/optimization problem – maximize the information gathered while minimizing the steps needed to get there.
14
Test Problems I
10 problems, 50 variables each, every domain of size 5.
343 constraints (tightness .18 on average) made up the universe. The problems lie near the critical region for computational complexity; the first 10 generated that had complete solutions were kept. [Speaker note: remind the audience what computational complexity is and where the phase transition lies.]
15
Test Problems II
K = .2 or .4, shared by matchmaker and client.
K was generated by choosing each constraint in the universe with probability .2 or .4; these constraints were added to both the customer’s and the solver’s sets.
16
Test Problems III
P = .4 or .8, client only.
The client-only constraints were generated the same way, using probabilities .4 and .8, but added only to the client’s set.
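Putting the two generation steps together, here is a hypothetical sketch of how K and P might be sampled; sample_constraints and its parameter names are illustrative, not from the paper:

```python
import random

# A hypothetical sketch of sampling the test sets K and P from the universe,
# following the slides' description; all names here are illustrative.

def sample_constraints(universe, p_shared, p_client_only, rng=None):
    """Return (K, P): K goes to both agents; P (a superset of K) to the client."""
    rng = rng or random.Random(0)
    K = [c for c in universe if rng.random() < p_shared]  # e.g. .2 or .4
    client_only = [c for c in universe
                   if c not in K and rng.random() < p_client_only]  # e.g. .4 or .8
    P = K + client_only  # the client holds everything the matchmaker knows, plus more
    return K, P
```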
17
Methods for Solving I
Systematic search: FC-CBJ (Prosser).
Variable ordering: dynamic, least domain first.
Value ordering: lexical, max-promise (Geelen), min-promise, or random (shuffled).
Note: only the matchmaker solves a CSP. [Speaker note: what is max-promise going to give us? What is min-promise going to give us?]
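As a rough illustration of the promise heuristics: the promise of assigning a value is the product, over the future variables, of how many of their values remain consistent with that assignment. The sketch below assumes a consistent(var, val, other_var, other_val) predicate over binary constraints; all names are assumptions, not the paper’s code:

```python
# A minimal sketch of Geelen-style "promise" value ordering.

def promise(var, val, future_vars, domains, consistent):
    """Product over future variables of how many of their values survive var=val."""
    p = 1
    for other in future_vars:
        p *= sum(1 for w in domains[other] if consistent(var, val, other, w))
    return p

def order_values(var, future_vars, domains, consistent, strategy="max"):
    # max-promise tries the least restrictive value first (quick path to a
    # solution); min-promise tries the most restrictive first, which tends to
    # provoke more client feedback.
    key = lambda v: promise(var, v, future_vars, domains, consistent)
    return sorted(domains[var], key=key, reverse=(strategy == "max"))
```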
18
Methods for Solving II
Hill-climbing: random walk (Walker 1996).
Either retain past suggestions and walk locally from the last one, or restart from a new random location on each iteration (with preprocessing).
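For flavor, here is a rough sketch of a generic min-conflicts random walk, not necessarily the exact variant used in the paper; violations(assignment) and the .variables attribute on constraints are assumed interfaces:

```python
import random

# A rough sketch of a min-conflicts random walk; illustrative only.

def random_walk(variables, domains, violations, max_steps=10000, walk_prob=0.1,
                rng=None):
    rng = rng or random.Random(0)
    assignment = {v: rng.choice(domains[v]) for v in variables}  # random restart
    for _ in range(max_steps):
        conflicted = violations(assignment)
        if not conflicted:
            return assignment  # a consistent suggestion
        var = rng.choice([v for c in conflicted for v in c.variables])
        if rng.random() < walk_prob:
            assignment[var] = rng.choice(domains[var])  # random-walk step
        else:
            # greedy min-conflicts step: pick the value violating the least
            assignment[var] = min(
                domains[var],
                key=lambda val: len(violations({**assignment, var: val})))
    return assignment  # best effort if no solution found within max_steps
```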
19
Results I

| Constraint problem | Value ordering | Iterations to solution | Violations / iteration | Undiscovered constraints | Solution similarity | Runtime |
|---|---|---|---|---|---|---|
| .2, .4 | lexical | 8 | | 59 | .73 | .03 |
| .2, .4 | max-promise | 6 | 3 | 85 | .83 | .02 |
| .2, .4 | min-promise | 11 | | 37 | .61 | |
| .2, .4 | shuffled | 14 | | 13 | .22 | |

[Speaker note: explain what .2, .4 means and the difference between .2, .4 and .2, .8. What do we expect from max-promise and min-promise? Does it hold true?]
20
Results II

| Constraint problem | Value ordering | Iterations to solution | Violations / iteration | Undiscovered constraints | Solution similarity | Runtime |
|---|---|---|---|---|---|---|
| .2, .4 | lexical | 8 | | 59 | .73 | .03 |
| .2, .4 | max-promise | 6 | 3 | 85 | .83 | .02 |
| .2, .4 | min-promise | 11 | | 37 | .61 | |
| .2, .4 | shuffled | 14 | | 13 | .22 | |
| .2, .8 | lexical | 12 | | 69 | .54 | .55 |
| .2, .8 | max-promise | 9 | | 105 | .65 | .43 |
| .2, .8 | min-promise | 17 | | 47 | .53 | |
| .2, .8 | shuffled | 19 | | 21 | .24 | .41 |
21
Results III
Max-promise reaches a solution fastest when the gap between the constraints the solver knows and the constraints the customer holds is small.
Use shuffled ordering to discover the most constraints overall.
Min-promise gives the best trade-off between discovering constraints and reaching a solution quickly.
22
Results IV
Similarity of solutions: the proportion of values that successive suggestions during a run have in common.
Results were similar for the hill-climbing matchmaker.
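Measured on two successive suggestions, the metric reduces to a one-liner; this sketch assumes suggestions are dicts mapping each variable to a value:

```python
# The similarity measure as described: the proportion of variables assigned
# the same value in two successive suggestions.

def similarity(prev, curr):
    return sum(prev[v] == curr[v] for v in prev) / len(prev)
```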
23
Contributions
The matchmaking paradigm itself is not new (Gomez 1996). New here:
a CSP implementation of the matchmaking paradigm;
strategies for solving it;
metrics for evaluation;
a reported empirical evaluation.
24
What’s missing
Presentation of multiple solutions at a time.
Customer preferences beyond a simple accept/reject decision.