A User-Guided Cognitive Agent for Wireless Service Selection in Pervasive Computing
George Lee, May 5, 2004

1 A User-Guided Cognitive Agent for Wireless Service Selection in Pervasive Computing
George Lee, May 5, 2004
G. Lee, P. Faratin, S. Bauer, and J. Wroclawski. A User-Guided Cognitive Agent for Network Service Selection in Pervasive Computing Environments. In Proceedings of the Second IEEE International Conference on Pervasive Computing and Communications (PerCom ’04), 2004.

2 Publications
–G. Lee, S. Bauer, P. Faratin, and J. Wroclawski. An Agent for Interactively Learning User Preferences On-Line for Wireless Services Provisioning. To appear in Proceedings of AAMAS, 2004.
–G. Lee, P. Faratin, S. Bauer, and J. Wroclawski. A User-Guided Cognitive Agent for Network Service Selection in Pervasive Computing Environments. In Proceedings of the IEEE Conference on Pervasive Computing and Communications (PerCom), 2004.
–P. Faratin, G. Lee, J. Wroclawski, and S. Parsons. Social User Agents for Dynamic Access to Wireless Networks. In Proceedings of the AAAI Spring Symposium, 2003.
–P. Faratin, J. Wroclawski, G. Lee, and S. Parsons. The Personal Router: An Agent for Wireless Access. In Proceedings of the AAAI Fall Symposium, 2002.

3 Overview
The Personal Router connectivity model
–An open, competitive market for network access
The service selection problem
–Network and user challenges
A learning agent approach
–Design and implementation
Experimental results
–Agent is accurate and unobtrusive
Solving other systems and networking problems
–Helping users deal with complexity using learning

4 The Problems with Current Connectivity Models
Regional providers
–Limited competition
–Long-term contracts
–Difficult to switch
Local providers
–Limited coverage
–No market

5 A New Connectivity Model
Advantages:
–Competition
–No long-term contracts
–Bottom-up
[Diagram: the user buys access from providers ranging from Verizon to MIT to a local pizza shop]

6 The Personal Router
[Diagram: the Personal Router sits between applications on your personal digital accessories and access providers — your home network, a national provider, a local pizza shop — handling service negotiation and composition]

7 Economic and Technical Challenges
Service discovery
Network-level mobility/handoff
Payment mechanisms
Service provisioning
Security
Service selection

8 Service Selection
[Diagram: inputs (user input, service info, user activity) and the set of available services feed the Personal Router, which outputs a service selection]

9 Network Service Challenges
Incomplete service information
–Service providers may be unwilling or unable to provide detailed service information
Dynamic environment
–Existing services may become unavailable and new services may become available

10 Dealing with User Preferences
Utility depends on:
Service features, including
–Bandwidth
–Latency
–Complex pricing plans
User context, including
–Location
–Applications
–Urgency of the task
[Example services: 11 Mbps at $0.25/min, 128 kbps at $10/month, 1 Mbps at $0.05/MB]

11 Service Selection Must Be Unobtrusive
Manual selection does not work
–Cognitively demanding
Static rules do not work
–Do not accommodate individual user preferences
–Difficult for application developers
Offline preference elicitation (e.g. configuration files) does not work
–Time consuming
–Users may not be able to specify their preferences

12 A Three-Step Learning Service Selection Approach
Learn from user
Compute utility
Select service
[Diagram: inputs (user input, service info, user activity) update the user model (perceived quality, perceived cost, Q/C tradeoff), which yields a user utility over services and a selection]

13 Agent Architecture
Components: Input Translator, Q/C Tradeoff Modeler, Service Value Estimator, Service Value Predictor, Utility Calculator, Service Selector
[Diagram: the user model (perceived quality, perceived cost, Q/C tradeoff) connects the inputs to the utility computation and service selection]

14 Interpreting User Input
[Architecture diagram from slide 13, highlighting the Input Translator]

15 Interpreting User Input
The Input Translator translates high-level user input into specific updates:
–Preference updates (Δq, Δc) to the Service Value Estimator and Predictor
–Q/C tradeoff updates (Δw) to the Q/C Tradeoff Modeler
Provides an intuitive and unobtrusive interface

16 Unobtrusiveness is Essential
Goal is opposite of traditional user interfaces
User interface must not distract or interrupt user
Explicit rating is too obtrusive and cognitively demanding
Implicit rating based on simple input

17 An Intuitive Interface
High level, simple, and intuitive: “better” and “cheaper”
“No input” implies satisfaction
“Taxi meter”
–Shows total paid since last reset
“Money speedometer”
–Shows the derivative of the taxi meter (the spending rate)
“Better” button
–Noticeably improves quality
“Cheaper” button
–Noticeably reduces cost
–Note that “better” and “cheaper” actions are not symmetric
[Mock-up: display showing total paid ($01.14) and spending rate, with Cheaper and Better buttons]
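One plausible sketch of how the two buttons could be translated into feedback deltas for the learner. The slide does not give the actual mapping, and it notes that the real “better”/“cheaper” actions are not symmetric; the symmetric mapping and the magnitudes below are illustrative assumptions only.

```python
# Hypothetical translation of button presses into (dq, dc) deltas.
# The real Input Translator's mapping is not given on the slide;
# this symmetric unit-magnitude version is an assumption.

def translate_input(button):
    """Map a button press to (dq, dc) preference updates.
    "better" means the current service's quality fell short;
    "cheaper" means its cost was too high; no input means the
    user is satisfied, so no update is produced."""
    if button == "better":
        return (-1.0, 0.0)  # mark perceived quality as too low
    if button == "cheaper":
        return (0.0, -1.0)  # mark perceived cost as too high
    return None             # no input: no update
```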

18 Learning Quality and Cost
[Architecture diagram from slide 13, highlighting the Service Value Estimator and Service Value Predictor]

19 Estimating Service Value
The Service Value Estimator learns user-perceived quality Q(s,a) and cost C(s,a) for service s and activity a from user feedback (Δq, Δc):
Q(s,a) ← (1 – α)Q(s,a) + αΔq
C(s,a) ← (1 – α)C(s,a) + αΔc
The Predictor provides initial estimates
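The exponential-moving-average updates above can be sketched as follows; the learning rate α and the service/activity keys are illustrative assumptions.

```python
# Sketch of the estimator's exponential-moving-average updates.
# ALPHA is a hypothetical learning rate; the slide gives no value.

ALPHA = 0.3

def update_estimates(Q, C, s, a, dq, dc, alpha=ALPHA):
    """Blend the feedback deltas (dq, dc) into the stored
    perceived-quality Q[s, a] and perceived-cost C[s, a],
    treating a missing entry as an initial estimate of 0.0."""
    Q[(s, a)] = (1 - alpha) * Q.get((s, a), 0.0) + alpha * dq
    C[(s, a)] = (1 - alpha) * C.get((s, a), 0.0) + alpha * dc

Q, C = {}, {}
update_estimates(Q, C, "wifi-A", "browsing", dq=1.0, dc=-0.5)
update_estimates(Q, C, "wifi-A", "browsing", dq=1.0, dc=-0.5)
```

Repeated identical feedback moves the estimate geometrically toward the feedback value, so older observations decay at rate (1 – α) per update.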

20 Predicting Service Value
The Service Value Predictor predicts quality QP(I,a) and cost CP(I,a) by nonlinear regression: a neural network maps from activity a and service features I to perceived quality and cost
The predictor is trained on user feedback (Δq, Δc)
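A minimal sketch of the predictor's role. The slide specifies a neural network; for brevity this stand-in swaps in an online linear regression trained by gradient descent, and the feature encoding, learning rate, and training targets are all assumptions.

```python
# Simplified stand-in for the slides' neural-network predictor:
# online linear regression trained on feedback-derived targets.
# The real agent uses a nonlinear (neural-network) regressor.

class ServiceValuePredictor:
    def __init__(self, n_features, lr=0.01):
        self.wq = [0.0] * n_features  # weights for quality prediction QP(I, a)
        self.wc = [0.0] * n_features  # weights for cost prediction CP(I, a)
        self.lr = lr

    def predict(self, features):
        """Predict (perceived quality, perceived cost) from the
        encoded service-info and activity features."""
        qp = sum(w * x for w, x in zip(self.wq, features))
        cp = sum(w * x for w, x in zip(self.wc, features))
        return qp, cp

    def train(self, features, q_target, c_target):
        """One gradient step toward the feedback-derived targets."""
        qp, cp = self.predict(features)
        for i, x in enumerate(features):
            self.wq[i] += self.lr * (q_target - qp) * x
            self.wc[i] += self.lr * (c_target - cp) * x

predictor = ServiceValuePredictor(n_features=2)
for _ in range(2000):
    predictor.train([1.0, 0.5], q_target=1.0, c_target=0.2)
qp, cp = predictor.predict([1.0, 0.5])
```

This captures the data flow (features in, initial quality/cost estimates out, trained on feedback), even though the function class differs from the slide's neural network.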

21 Calculating Utility and Selecting a Service
[Architecture diagram from slide 13, highlighting the Utility Calculator and Service Selector]

22 Calculating Utility
The Utility Calculator computes utility as a linear weighted average of perceived quality Q(s,a) and cost C(s,a), using the learned tradeoff weight w:
U(w,q,c) = wq + (1 – w)c
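The linear weighted average is direct to compute; the sign convention on cost below is an assumption.

```python
def utility(w, q, c):
    """U(w, q, c) = w*q + (1 - w)*c, where w in [0, 1] is the learned
    quality/cost tradeoff weight. Here c is read as a "goodness of
    cost" (cheaper services score higher) so that higher utility is
    better; that sign convention is an assumption, not from the slide."""
    return w * q + (1 - w) * c

u = utility(w=0.7, q=0.8, c=0.4)  # a quality-leaning user (w > 0.5)
```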

23 Selecting a Service
Balances exploration and exploitation by selecting services stochastically according to a Gibbs softmax distribution:
P(s) = e^(U(s)/T) / Σ_s′ e^(U(s′)/T)
(T: “temperature” controls exploration level)
Only switches on user input, activity change, or a change in the set of available services
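The softmax selection rule can be sketched directly; the default temperature value is an assumption.

```python
import math
import random

def select_service(utilities, T=0.1):
    """Pick a service stochastically with P(s) proportional to
    exp(U(s)/T) (Gibbs/Boltzmann softmax). Higher T explores more,
    lower T exploits the current utility estimates; the default T
    is an assumed value, not taken from the slide."""
    # Subtract the max utility before exponentiating, for
    # numerical stability; this leaves the distribution unchanged.
    u_max = max(utilities.values())
    weights = {s: math.exp((u - u_max) / T) for s, u in utilities.items()}
    total = sum(weights.values())
    r = random.random() * total
    for s, wgt in weights.items():
        r -= wgt
        if r <= 0:
            return s
    return s  # guard against floating-point rounding
```

At very low temperature this degenerates to picking the highest-utility service almost surely, which is what makes T an exploration knob.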

24 Learning Addresses Network and User Complexity
Handles incomplete service information using the estimator
Handles dynamic environments using the predictor
User model accommodates complex quality and cost functions
[Architecture diagram from slide 13]

25 Evaluating the Agent with User Experiments
To answer these questions:
–Can the agent learn?
–Is it better than manual selection?
–How can we improve the agent model?
Dynamic network
–8 services available
–Features: bandwidth, price/min, price/kb
–Available services change during the experiment
17 subjects
–9 learning agent users
–8 manual selection users
Only 30 minutes long

26 Experimental Procedure
View 20 pages in 5 minutes
Spend as few credits as possible
–Subjects were paid according to how well they did
Select services using only the given interface

27 The Agent vs. Manual Selection

28 The Agent Learns Utility

29 Summary
Our fundamental goals:
–Catalyze a ubiquitous, competitive wireless access market
–Make it simple for users and diverse digital devices to use it
Our approach:
–A Personal Router that selects services using a learning agent
–Accurate
–Unobtrusive

30 Future Work
Group learning
–Collaborative filtering
–Clustering based on user preferences
More experiments
–Modeling seller behavior
–More sophisticated learning algorithms
Other market models
–Auctions
–Negotiation
Solving other systems and networking problems
–Helping users deal with complexity using learning

31 The Agent vs. Manual Selection

32 The Agent Learns Utility

