A Case For Interaction: A Study of Interactive Information Retrieval Behavior and Effectiveness, by Jürgen Koenemann and Nicholas J. Belkin (1996). Presented by John Clougherty.

Similar presentations
DEVELOPING A METHODOLOGY FOR MS3305 CW2 Some guidance.
Evaluating the Robustness of Learning from Implicit Feedback Filip Radlinski Thorsten Joachims Presentation by Dinesh Bhirud
How to Pick Up Women…. Using IR Strategies By Mike Wooldridge May 9, 2006.
Chapter 5: Introduction to Information Retrieval
Query Chains: Learning to Rank from Implicit Feedback Paper Authors: Filip Radlinski Thorsten Joachims Presented By: Steven Carr.
Exercising these ideas  You have a description of each item in a small collection. (30 web sites)  Assume we are looking for information about boxers,
“ The Anatomy of a Large-Scale Hypertextual Web Search Engine ” Presented by Ahmed Khaled Al-Shantout ICS
Search Engines and Information Retrieval
© Tefko Saracevic, Rutgers University1 Search strategy & tactics Governed by effectiveness & feedback.
Part 4: Evaluation Days 25, 27, 29, 31 Chapter 20: Why evaluate? Chapter 21: Deciding on what to evaluate: the strategy Chapter 22: Planning who, what,
SLIDE 1IS 240 – Spring 2009 Prof. Ray Larson University of California, Berkeley School of Information Principles of Information Retrieval.
SLIDE 1IS 202 – FALL 2004 Lecture 13: Midterm Review Prof. Ray Larson & Prof. Marc Davis UC Berkeley SIMS Tuesday and Thursday 10:30 am -
A machine learning approach to improve precision for navigational queries in a Web information retrieval system Reiner Kraft
© Tefko Saracevic, Rutgers University 1 EVALUATION in searching IR systems Digital libraries Reference sources Web sources.
Modern Information Retrieval Chapter 5 Query Operations.
INFM 700: Session 12 Summative Evaluations Jimmy Lin The iSchool University of Maryland Monday, April 21, 2008 This work is licensed under a Creative Commons.
Information Retrieval Ch Information retrieval Goal: Finding documents Search engines on the world wide web IR system characters Document collection.
SIMS 202 Information Organization and Retrieval Prof. Marti Hearst and Prof. Ray Larson UC Berkeley SIMS Tues/Thurs 9:30-11:00am Fall 2000.
An investigation of query expansion terms Gheorghe Muresan Rutgers University, School of Communication, Information and Library Science 4 Huntington St.,
An Overview of Relevance Feedback, by Priyesh Sudra 1 An Overview of Relevance Feedback PRIYESH SUDRA.
CS580: Building Web Based Information Systems Roger Alexander & Adele Howe The purpose of the course is to teach theory and practice underlying the construction.
Information Retrieval
Modern Information Retrieval Relevance Feedback
Search Engines and Information Retrieval Chapter 1.
INF 141 COURSE SUMMARY Crista Lopes. Lecture Objective Know what you know.
PERSONALIZED SEARCH Ram Nithin Baalay. Personalized Search? Search Engine: A Vital Need Next level of Intelligent Information Retrieval. Retrieval of.
Nielsen’s Ten Usability Heuristics
Thanks to Bill Arms, Marti Hearst Documents. Last time Size of information –Continues to grow IR an old field, goes back to the ‘40s IR iterative process.
CS276A Text Information Retrieval, Mining, and Exploitation Lecture 10 7 Nov 2002.
1 Information Retrieval Acknowledgements: Dr Mounia Lalmas (QMW) Dr Joemon Jose (Glasgow)
Implicit Acquisition of Context for Personalization of Information Retrieval Systems Chang Liu, Nicholas J. Belkin School of Communication and Information.
Personalized Search Xiao Liu
Search Engine Architecture
IR Theory: Relevance Feedback. Relevance Feedback: Example  Initial Results Search Engine2.
LANGUAGE MODELS FOR RELEVANCE FEEDBACK Lee Won Hee.
Developed by Tim Bell Department of Computer Science and Software Engineering University of Canterbury Human Computer Interaction.
Company LOGO Digital Infrastructure of RPI Personal Library Qi Pan Digital Infrastructure of RPI Personal Library Qi Pan.
Information Retrieval in Context of Digital Libraries - or DL in Context of IR Peter Ingwersen Royal School of LIS Denmark –
Performance Measurement. 2 Testing Environment.
Information Retrieval
L&I SCI 110: Information science and information theory Instructor: Xiangming(Simon) Mu Sept. 9, 2004.
Supporting Knowledge Discovery: Next Generation of Search Engines Qiaozhu Mei 04/21/2005.
By R. O. Nanthini and R. Jayakumar.  tools used on the web to find the required information  Akeredolu officially described the Web as “a wide- area.
Why IR test collections are so bad Mark Sanderson University of Sheffield.
User-Friendly Systems Instead of User-Friendly Front-Ends Present user interfaces are not accepted because the underlying systems are too difficult to.
WHAT IS IT & HOW DOES IT WORK?. SEO = search engine optimization optimizing content for search engines, right? Therefore if a search engine's job is to.
Relevance Feedback Prof. Marti Hearst SIMS 202, Lecture 24.
Document Clustering for Natural Language Dialogue-based IR (Google for the Blind) Antoine Raux IR Seminar and Lab Fall 2003 Initial Presentation.
WHIM- Spring ‘10 By:-Enza Desai. What is HCIR? Study of IR techniques that brings human intelligence into search process. Coined by Gary Marchionini.
Ten Usability Heuristics These are ten general principles for user interface design. They are called "heuristics" because they are more in the nature of.
Free SEO for Blogs & YouTube Channels.
Presented by Archana Kumari ( ) | Supervised By Mr Vikram Singh
Search Engine Architecture
Relevance Feedback Hongning Wang
Search Pages and Results
Thanks to Bill Arms, Marti Hearst
IR Theory: Evaluation Methods
Project topic: Information Retrieval
Evaluation of IR Performance
Web Information retrieval (Web IR)
Learning Literature Search Models from Citation Behavior
10 Design Principles.
Search Engine Architecture
CS246: Leveraging User Feedback
A Suite to Compile and Analyze an LSP Corpus
Nielsen 10 heuristics.
Relevance and Reinforcement in Interactive Browsing
Retrieval Utilities Relevance feedback Clustering
Topic 6- Basic Computer Literacy
Presentation transcript:

 A Case For Interaction: A Study of Interactive Information Retrieval Behavior and Effectiveness, by Jürgen Koenemann and Nicholas J. Belkin (1996). Presented by John Clougherty

Experiment to Measure the Effectiveness of Relevance Feedback and User Interaction

Relevance Feedback and User Interaction Improve Retrieval Effectiveness  Relevance feedback increased retrieval effectiveness  Increased user interaction with relevance feedback made searches more efficient

A Rapidly Growing Number of New Users Are Gaining Access to IR Tools  Information is everywhere, and its availability is growing rapidly  Anyone with a computer, smartphone, or tablet can use Bing to Google something  New users have little or no training

Increase in Computerized Tools to Support Users' Information Needs  Users have an information need  Search engines are designed to satisfy that need  Improvements to search engines are developed every day  Speed, best-match ranking, relevance, flexibility, push and pull

Novice IR System Users Perform Query Reformulation  INQUERY system  Novice users  2,000 documents  Two topics  Precision and recall
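The precision and recall measures used to score the sessions can be sketched as follows (a minimal illustration, not the paper's exact evaluation code; the document-id sets are invented for the example):

```python
def precision_recall(retrieved, relevant):
    """Compute precision and recall for a single query.

    retrieved: set of document ids returned by the system
    relevant:  set of document ids judged relevant for the topic
    """
    hits = len(retrieved & relevant)  # relevant documents actually retrieved
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

# Toy example: 3 of the 4 retrieved documents are relevant,
# out of 6 relevant documents in total.
p, r = precision_recall({1, 2, 3, 4}, {1, 2, 3, 5, 6, 7})
# p == 0.75, r == 0.5
```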

Four Levels of Information Retrieval System
1. Baseline  No relevance feedback
2. Opaque  Hidden effect of relevance feedback  No user interaction
3. Transparent  Visible effect of relevance feedback  No user interaction
4. Penetrable  Visible effect of relevance feedback  User interaction with feedback
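The "effect of relevance feedback" that the opaque, transparent, and penetrable conditions expose in different ways can be illustrated with the classic Rocchio query-reformulation step (a standard textbook formulation, not the INQUERY-specific method used in the study):

```python
import numpy as np

def rocchio(query_vec, rel_docs, nonrel_docs,
            alpha=1.0, beta=0.75, gamma=0.15):
    """One round of Rocchio relevance feedback: move the query vector
    toward the centroid of documents the user marked relevant and away
    from the centroid of non-relevant ones.  alpha/beta/gamma are the
    conventional default weights, not values from the paper."""
    q = alpha * np.asarray(query_vec, dtype=float)
    if len(rel_docs):
        q = q + beta * np.mean(rel_docs, axis=0)
    if len(nonrel_docs):
        q = q - gamma * np.mean(nonrel_docs, axis=0)
    return np.clip(q, 0.0, None)  # negative term weights are usually dropped

# Two relevant documents that both weight the second term pull the
# query toward that term:
q_new = rocchio([1.0, 0.0],
                rel_docs=[[0.0, 1.0], [0.0, 1.0]],
                nonrel_docs=[])
# q_new == [1.0, 0.75]
```

In a penetrable system, the user would see (and could edit) the expanded term weights in `q_new` before the reformulated query is run; in an opaque system the same update happens invisibly.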

Penetrable System Performed Best for Precision and Number of Iterations

Relevance Feedback Is Heavily Utilized, but User Interaction Is Minimal  Feedback is obtained directly or indirectly from the user  Most relevance feedback algorithms operate implicitly  User interaction has yet to be implemented in a big way
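An implicit scheme of the kind alluded to above might, for example, treat clicked documents as relevance votes and mine them for query-expansion terms. This is a hypothetical, simplified sketch (the function and its scoring are invented for illustration, not taken from the paper):

```python
from collections import Counter

def implicit_feedback_terms(clicked_docs, top_k=5):
    """Infer feedback implicitly: terms that recur across documents the
    user clicked are treated as likely relevant, and the most frequent
    ones are returned as query-expansion candidates."""
    counts = Counter()
    for doc in clicked_docs:
        # Count each term once per document so long documents don't dominate.
        counts.update(set(doc.lower().split()))
    return [term for term, _ in counts.most_common(top_k)]

# "feedback" occurs in both clicked documents, so it ranks first:
terms = implicit_feedback_terms(
    ["relevance feedback works", "feedback helps search"], top_k=1)
# terms == ["feedback"]
```

Unlike the penetrable condition in the study, the user never sees or controls this expansion, which is exactly the gap the slide points out.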


Thank You