CS246: Leveraging User Feedback


CS246: Leveraging User Feedback
Junghoo "John" Cho, UCLA

Improving a Search Engine via User Feedback
Q: Once we deploy a search engine, can we improve it through user interaction?
Q: How can users' "feedback" be used?
A: Possibilities
- Rerank documents
- Evaluate ranking function
- Expand benchmark query set

Evaluating Ranking
Q: How can we use users' interactions to evaluate whether our new ranking is better than the old one?
A/B testing: a common technique to compare the performance of two versions
- Serve two different versions, A and B, to users (without telling them)
- Measure the performance difference between the two, as in the sketch below
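A minimal sketch of how such a traffic split might be implemented, assuming users can be identified by some id; the bucket-hashing helper and the logging comment are illustrative assumptions, not part of the lecture:

```python
import hashlib

def assign_bucket(user_id: str) -> str:
    """Deterministically assign a user to bucket 'A' or 'B'.

    Hashing the user id (rather than flipping a coin per request)
    keeps each user in the same bucket across sessions.
    """
    digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

def serve_results(user_id: str, query: str, rank_a, rank_b):
    """Serve ranking A or B depending on the user's bucket.

    rank_a and rank_b are the two ranking functions being compared.
    """
    bucket = assign_bucket(user_id)
    ranking = rank_a if bucket == "A" else rank_b
    results = ranking(query)
    # In a real system we would also log (user_id, bucket, query, results,
    # clicks) so the two buckets can be compared offline.
    return bucket, results
```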

A/B Testing Example
Q: Which colored button do users notice and click more?
[Two button variants shown side by side: one with a 35% click rate, the other with a 17% click rate]
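To decide whether a gap like 35% vs. 17% is real rather than noise, one common check is a two-proportion z-test; the sketch below assumes hypothetical traffic of 1,000 impressions per button, since the slide does not give sample sizes:

```python
from math import sqrt, erf

def two_proportion_z_test(clicks_a, n_a, clicks_b, n_b):
    """Two-sided z-test for the difference between two click rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical traffic: 1,000 impressions per button color.
z, p = two_proportion_z_test(clicks_a=350, n_a=1000, clicks_b=170, n_b=1000)
print(f"z = {z:.2f}, p = {p:.4f}")  # large z, tiny p: the 35% vs. 17% gap is unlikely to be chance
```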

A/B Testing for Ranking Evaluation
[Screenshot: search results for the example query "UCLA CS"]
Q: What should we measure to compare?
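One possible answer is to compare simple click-based metrics between the two buckets, for example how often users abandon the result page, how often they click the top result, and how far down their first click lands. A hypothetical sketch over a per-impression click log (the log format is an assumption, purely for illustration):

```python
def summarize_bucket(impressions):
    """Aggregate simple click-based metrics for one bucket of an A/B test.

    `impressions` is a list of dicts such as {"clicked_positions": [1, 3]},
    where the list holds the 1-based ranks the user clicked.
    """
    n = len(impressions)
    abandoned = sum(1 for imp in impressions if not imp["clicked_positions"])
    ctr_at_1 = sum(1 for imp in impressions if 1 in imp["clicked_positions"]) / n
    # Mean reciprocal rank of the first clicked result (0 when nothing was clicked).
    mrr = sum(
        1.0 / min(imp["clicked_positions"]) if imp["clicked_positions"] else 0.0
        for imp in impressions
    ) / n
    return {
        "abandonment_rate": abandoned / n,
        "ctr_at_position_1": ctr_at_1,
        "mrr_of_first_click": mrr,
    }

# Compare summarize_bucket(log_bucket_a) against summarize_bucket(log_bucket_b).
```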

Difficulty of Ranking Evaluation
Users' click behavior is a very noisy signal
Ambiguity in interpreting users' behavior:
- Do more clicks mean more relevant results?
- Are users' clicks influenced by the position of a result in the ranking?
- Did users stop clicking because they found what they were looking for, or because they gave up?

A/B Testing via Merged Ranking [Joachims 03]
- Instead of showing two separate ranked results, show one merged result
- The ranking that gets more clicks within the merged result is the better one
- Removes many ambiguities arising from showing two separate lists
Q: Is it true?
[Illustration: the merged list interleaves results from the two rankings, A B A B A B A]
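A minimal sketch of the merged-ranking idea, written in the spirit of team-draft interleaving; [Joachims 03] describes a balanced interleaving scheme, so the drafting order and click-credit details below are an illustrative variant rather than the paper's exact algorithm:

```python
import random

def interleave(ranking_a, ranking_b):
    """Merge two ranked lists into one, remembering which ranking contributed each result."""
    merged, credit = [], {}
    i = j = 0
    while i < len(ranking_a) or j < len(ranking_b):
        # Randomize which ranking "drafts" first in each round to avoid bias.
        for side in random.sample(["A", "B"], 2):
            src, idx = (ranking_a, i) if side == "A" else (ranking_b, j)
            # Skip documents the other ranking has already placed.
            while idx < len(src) and src[idx] in credit:
                idx += 1
            if idx < len(src):
                merged.append(src[idx])
                credit[src[idx]] = side
            if side == "A":
                i = idx + 1
            else:
                j = idx + 1
    return merged, credit

def winner(credit, clicked_docs):
    """The ranking whose contributed results attracted more clicks wins the comparison."""
    votes = {"A": 0, "B": 0}
    for doc in clicked_docs:
        if doc in credit:
            votes[credit[doc]] += 1
    if votes["A"] == votes["B"]:
        return "tie", votes
    return max(votes, key=votes.get), votes
```

Over many queries, the ranking that wins more of these per-query comparisons is judged the better one.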

Expanding the Benchmark Query Set
Expand the benchmark query set through users' click data
Q: How can we identify "relevant" pages?
Again, very tricky due to the ambiguity of users' behavior, but one signal helps:
- Are users' clicks heavily focused on a particular page even after changes in its position?
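A hypothetical sketch of the heuristic hinted at on the slide: from a click log of (query, doc, position, clicked) records, flag documents whose click rate stays high even as their displayed position changes; the log format and thresholds are assumptions for illustration:

```python
from collections import defaultdict

def persistent_favorites(log, min_positions=3, min_click_rate=0.5):
    """Find (query, doc) pairs that attract clicks regardless of where they are shown.

    `log` is an iterable of (query, doc, position, clicked) tuples. A document
    that keeps a high click rate across several distinct positions is a
    candidate "relevant" page for the benchmark query set.
    """
    # (query, doc) -> position -> [clicks, views]
    stats = defaultdict(lambda: defaultdict(lambda: [0, 0]))
    for query, doc, position, clicked in log:
        cell = stats[(query, doc)][position]
        cell[0] += int(clicked)
        cell[1] += 1

    candidates = []
    for (query, doc), by_pos in stats.items():
        rates = [clicks / views for clicks, views in by_pos.values() if views > 0]
        # Require the document to have been shown at several different ranks,
        # and to have been clicked often at every one of them.
        if len(rates) >= min_positions and min(rates) >= min_click_rate:
            candidates.append((query, doc))
    return candidates
```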

Summary
Improve a search engine through user feedback
- Implicit vs. explicit vs. "pseudo" feedback
- Users are reluctant to provide any explicit feedback
- User behavior is ambiguous and difficult to interpret
Three possible ways to use users' feedback:
- Rerank documents
- Evaluate ranking
- Expand benchmark query set

References
[Joachims 03] Thorsten Joachims: Evaluating Retrieval Performance Using Clickthrough Data. In Text Mining, 2003.