Discussion Class 11: Clickthrough Data as Implicit Feedback

Paper: Joachims, Granka, Pan, Hembrooke, and Gay, "Accurately Interpreting Clickthrough Data as Implicit Feedback" (SIGIR 2005).

1 Discussion Class 11 Clickthrough Data as Implicit Feedback

2 Discussion Classes
Format:
Question. Ask a member of the class to answer. Provide opportunity for others to comment.
When answering:
Stand up. Give your name. Make sure that the TA hears it. Speak clearly so that all the class can hear.
Suggestions:
Do not be shy about presenting partial answers. Differing viewpoints are welcome.

3 Question 1: Training data
(a) Why is training data important in tuning a search engine?
(b) Explain the difference between the use of "explicit relevance data" and "implicit feedback."
(c) What is "clickthrough data" and why might it be useful?
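As a point of reference for (c), the sketch below shows one way a clickthrough log entry could be represented and summarized. The record layout, field names, and the mean_clicked_rank helper are illustrative assumptions, not a format prescribed by the paper.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class QueryLogEntry:
        # One search interaction: the query issued, the ranked results shown,
        # and the 1-based ranks the user clicked. Field names are illustrative.
        query: str
        ranked_urls: List[str]
        clicked_ranks: List[int] = field(default_factory=list)

    def mean_clicked_rank(log: List[QueryLogEntry]) -> float:
        # Average rank of clicked results over the log -- one of the simple
        # aggregate statistics that can be read directly off clickthrough data.
        ranks = [r for entry in log for r in entry.clicked_ranks]
        return sum(ranks) / len(ranks) if ranks else float("nan")

    # Example: one query where the user clicked the 1st and 3rd results.
    log = [QueryLogEntry("implicit feedback",
                         ["example.org/a", "example.org/b", "example.org/c"],
                         clicked_ranks=[1, 3])]
    print(mean_clicked_rank(log))  # 2.0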

4 Question 2: Eye tracking
(a) What is eye tracking?
(b) Define "fixation" and "saccade."
(c) How are they measured?
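For (c), one common way to separate fixation samples from saccade samples in raw gaze data is a simple velocity threshold. The sketch below is illustrative only; the threshold, input format, and algorithm are assumptions and not the procedure used by the eye tracker in the study.

    from typing import List, Tuple

    def classify_samples(gaze: List[Tuple[float, float, float]],
                         velocity_threshold: float = 30.0) -> List[str]:
        # gaze: (time_s, x_deg, y_deg) samples in degrees of visual angle.
        # Samples moving faster than the threshold (deg/s) are labelled saccade,
        # slower ones fixation. Returns one label per sample after the first.
        labels = []
        for (t0, x0, y0), (t1, x1, y1) in zip(gaze, gaze[1:]):
            velocity = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / (t1 - t0)
            labels.append("saccade" if velocity > velocity_threshold else "fixation")
        return labels

    # Two nearly stationary samples (a fixation) followed by a large, fast jump.
    samples = [(0.000, 1.00, 1.00), (0.004, 1.01, 1.00), (0.008, 5.00, 4.00)]
    print(classify_samples(samples))  # ['fixation', 'saccade']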

5 Question 3: Explicit relevance judgments
The method for obtaining explicit relevance judgments was very different from the TREC methodology. Explain the difference in:
(a) Selection of assessors
(b) Documents reviewed by assessors
(c) Task required of assessors for each document
(d) Technique for ensuring consistency between assessors
Are these differences suitable for this experiment?

6 Question 4: Regular Google ranks
How do the fixations differ from the clicks?

7 Question 5: Relevance, ranking, and clickthrough data
(a) When the rankings were reversed, how did the user behavior change?
i) Average rank of pages clicked
ii) Number of pages clicked
(b) The evidence that users' clickthrough behavior is influenced by trust in Google was studied by a statistical experiment.
i) What was the hypothesis?
ii) What was the methodology?
iii) What were the conclusions?
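For (b), one simple way such a "trust bias" hypothesis could be tested is a binomial test on how often users click the higher-ranked of two abstracts even when the lower-ranked abstract was judged more relevant. The counts and the test below are illustrative assumptions, not the paper's actual analysis.

    # Illustrative binomial test for a "trust bias" hypothesis; requires scipy >= 1.7.
    from scipy.stats import binomtest

    # Hypothetical counts: in n cases the lower-ranked of two abstracts was judged
    # more relevant, yet in k of them the user still clicked the higher-ranked one.
    k, n = 29, 40
    # Null hypothesis: rank has no influence, so either abstract is equally likely
    # to be clicked when the lower-ranked one is more relevant.
    result = binomtest(k, n, p=0.5, alternative="greater")
    print(f"clicked the higher-ranked abstract in {k}/{n} cases, p = {result.pvalue:.4f}")
    # A small p-value is evidence that users' clicks follow rank (trust in Google)
    # rather than relevance alone.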

8 Question 6: Clicks as relative relevance judgments
Section 5.3 looks at a number of relative measures:
click > skip above
last click > click above
click > click earlier
last click > click previous
click > no-click next
(a) What are these tests attempting to measure?
(b) Explain the rationale behind: click > skip above.
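As a rough illustration of how a strategy of this kind turns a click log into pairwise preferences, here is a minimal sketch of the "click > skip above" rule; the function and variable names are hypothetical.

    from typing import List, Tuple

    def click_gt_skip_above(ranking: List[str],
                            clicked_ranks: List[int]) -> List[Tuple[str, str]]:
        # "click > skip above": every clicked result is preferred over each result
        # ranked above it that was shown but not clicked (skipped).
        # ranking: URLs in presented order (rank 1 first); clicked_ranks: 1-based.
        clicked = set(clicked_ranks)
        preferences = []
        for rank in sorted(clicked):
            for above in range(1, rank):
                if above not in clicked:  # a skipped result above the click
                    preferences.append((ranking[rank - 1], ranking[above - 1]))
        return preferences

    # Example: the user clicked results 3 and 5, skipping results 1, 2, and 4.
    ranking = ["url1", "url2", "url3", "url4", "url5"]
    print(click_gt_skip_above(ranking, clicked_ranks=[3, 5]))
    # [('url3', 'url1'), ('url3', 'url2'), ('url5', 'url1'), ('url5', 'url2'), ('url5', 'url4')]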

9 Question 7:

10 Question 8:

11 Question 9: