Retroactive Answering of Search Queries Beverly Yang, Glen Jeh, Google, Inc.

Presentation transcript:

Retroactive Answering of Search Queries Beverly Yang, Glen Jeh, Google, Inc. Presented by: Raghavendra Aekka, Kranthi Kumar Kontham

Overview
- Introduction
- QSR Architecture
- Identifying Standing Interests: Problem Definition, Signals
- Determining Recommendations
- User Studies: first-person study design, third-person study design
- Results of User Studies
- Related Work
- Conclusion

Recommendations

Personalization

QSR
QSR = Query-Specific (Web) Recommendations: a personalization service that alerts the user when interesting new results to selected previous queries have appeared, e.g., "Britney Spears concert San Francisco."

Visualization of Recommended Web Pages

Challenges
- Automatically identify queries that represent standing interests.
- Identify new results that the user would be interested in.

QSR Architecture (diagram). Limits: at most M queries tracked, at most N recommendations shown.

Functioning of QSR
1. Read the user's actions.
2. Identify the top M queries.
3. Submit each query to the search engine.
4. Compare the first ten new results with the old results.
5. Score the candidate recommendations.
6. Display the top N recommendations to the user.
A minimal sketch of this pipeline follows.
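The sketch below shows how the six steps could fit together, under illustrative assumptions: the TrackedQuery structure and the search_fn/score_fn callables are hypothetical placeholders, not the published implementation.

```python
from dataclasses import dataclass, field

@dataclass
class TrackedQuery:
    text: str
    interest_score: float
    seen_urls: set = field(default_factory=set)

def run_qsr(tracked_queries, search_fn, score_fn, m=10, n=5):
    """Return up to n (score, query_text, url) recommendations drawn from the
    user's top m standing-interest queries.

    search_fn(text)       -> ordered list of result URLs (first results page)
    score_fn(query, url)  -> recommendation quality score
    """
    # Steps 1-2: pick the top m queries by interest score.
    top = sorted(tracked_queries, key=lambda q: q.interest_score, reverse=True)[:m]

    candidates = []
    for q in top:
        # Step 3: re-submit the query. Step 4: keep only results the user
        # has not already seen for this query.
        for url in search_fn(q.text)[:10]:
            if url not in q.seen_urls:
                # Step 5: score the unseen result as a candidate recommendation.
                candidates.append((score_fn(q, url), q.text, url))

    # Step 6: return the top n recommendations for display.
    candidates.sort(key=lambda c: c[0], reverse=True)
    return candidates[:n]
```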

Identifying Standing Interests
Definition: if a user would still be interested in seeing new, interesting results for a query, that query is said to represent a standing interest. Users' standing interests determine how useful recommendations are.

Problem Definition: factors indicating a standing interest
- Prior fulfillment: whether the user already found a satisfactory result.
- Query interest level: whether the user is still interested even after a satisfactory result.
- Need/interest duration: how long the information need remains timely.

Sample Query Session: html encode java (8 s)
  * RESULTCLICK (91.00 s)
  * RESULTCLICK ( s)
  * RESULTCLICK (12.00 s)
  * NEXTPAGE (5.00 s) - start = 10
    o RESULTCLICK ( s)
    o REFINEMENT (21.00 s) - html encode java utility
      + RESULTCLICK (32.00 s)
    o NEXTPAGE (8.00 s) - start = 10
  * NEXTPAGE (30.00 s) - start = 20
(Total time: s)

Query Sessions: all actions associated with a given initial query. A query refinement is a new query that shares at least one term with the previous one. How do we determine the actual query from a query session? A small sketch of grouping actions into sessions follows.
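The following is a minimal sketch, under my own assumptions, of grouping a user's actions into query sessions and detecting refinements by shared terms. The (action, text) event format is illustrative, not the paper's actual log schema, and refinements are matched against the session's initial query as a simplification.

```python
def is_refinement(prev_query, new_query):
    """Treat a new query as a refinement if it shares at least one term
    with the previous query."""
    return bool(set(prev_query.lower().split()) & set(new_query.lower().split()))

def group_into_sessions(events):
    """events: ordered list of (action, text) pairs, where action is one of
    'QUERY', 'RESULTCLICK', 'NEXTPAGE'. Returns a list of sessions, each a
    list of events rooted at an initial (non-refinement) query."""
    sessions = []
    for action, text in events:
        if action == "QUERY":
            last_initial = sessions[-1][0][1] if sessions else None
            if last_initial is not None and is_refinement(last_initial, text):
                # Refinements stay inside the current session.
                sessions[-1].append(("REFINEMENT", text))
                continue
            sessions.append([("QUERY", text)])
        elif sessions:
            sessions[-1].append((action, text))
    return sessions

# Example: the refinement "html encode java utility" stays in the same
# session as the initial query "html encode java".
log = [("QUERY", "html encode java"), ("RESULTCLICK", "result1.example"),
       ("QUERY", "html encode java utility"), ("RESULTCLICK", "result2.example")]
print(len(group_into_sessions(log)))  # -> 1
```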

Signals: factors for identifying query interest
- Number of terms: e.g., "valentino rossi" vs. "theaters screening movie 300 in norfolk".
- Number of clicks and number of refinements: more actions by the user indicate more interest.
- History match: the query matches previous queries or clicks in the user's history.

- Navigational: e.g., "google videos".
- Repeated non-navigational: e.g., "Britney Spears".
Additional signals: session duration, topic of the query, number of long clicks.

Interest Score
Numeric signals are combined into an interest score, while Boolean signals, such as the repeated non-navigational signal, can be used as filters. A hedged sketch of such a scoring function follows.
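This sketch illustrates one way the signals above could be combined into an interest score. The specific weights and field names are my own illustrative assumptions; the paper combines similar signals but does not publish this exact formula.

```python
def interest_score(session):
    """session: dict with keys 'num_terms', 'num_clicks', 'num_refinements',
    'history_match' (bool), 'navigational' (bool), 'repeated' (bool)."""
    # Boolean signals used as filters: a navigational query that is not
    # repeated is assumed not to represent a standing interest.
    if session["navigational"] and not session["repeated"]:
        return 0.0

    # Numeric signals combined with illustrative weights.
    score = 0.0
    score += 0.5 * min(session["num_terms"], 6)        # longer queries are more specific
    score += 1.0 * session["num_clicks"]               # more actions indicate more interest
    score += 1.0 * session["num_refinements"]
    score += 2.0 if session["history_match"] else 0.0  # matches earlier queries/clicks
    return score

example = {"num_terms": 5, "num_clicks": 3, "num_refinements": 2,
           "history_match": True, "navigational": False, "repeated": False}
print(interest_score(example))  # -> 9.5
```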

Determining Interesting Results
- Already addressed in part by current web alert services.
- The quality of a recommendation is determined by the user's interest.
- Factors current web alerts trigger on: a result moving into the top 10 results of a query, which may reflect only a spurious change in the page's ranking.

Characteristics of Good Recommendations
- New to the user
- A good page
- Recently "promoted"

Signals
- History presence: do not recommend pages that already appear in the user's search history.
- Rank: a high rank implies a good recommendation.
- Popularity and relevance score (PR): a high PR score implies a good recommendation.
- Above drop-off: a result above a threshold PR score makes a good recommendation.

Additional Signals
- Days elapsed since query submission.
- Sole changed: the result is the only new result in the top 6.
- All poor: all results having low PR scores implies there are no good pages to recommend.

Quality Score
On its own, the score is a suboptimal indicator of quality, so an alternative score with superior performance is used: the Boolean "above drop-off" signal is applied as a filter. A hedged sketch of this filtering and scoring follows.
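The sketch below shows one way the recommendation signals above (history presence, rank, PR, above drop-off, "all poor") could be combined. Weights, thresholds, and field names are illustrative assumptions rather than the paper's actual formula.

```python
def quality_score(result, query_history_urls, all_results, pr_dropoff=0.3):
    """result: dict with 'url', 'rank' (1-based within the new top 10) and
    'pr' (popularity/relevance score in [0, 1]).
    query_history_urls: URLs the user has already seen for this query.
    all_results: the current top-10 results, used for the 'all poor' signal."""
    # History presence: never recommend a page the user has already seen.
    if result["url"] in query_history_urls:
        return 0.0

    # "All poor": if every result has a low PR score, there is nothing
    # worth recommending for this query.
    if all(r["pr"] < pr_dropoff for r in all_results):
        return 0.0

    # "Above drop-off" used as a Boolean filter on the candidate itself.
    if result["pr"] < pr_dropoff:
        return 0.0

    # Remaining numeric signals: a higher rank (closer to 1) and a higher
    # PR score both indicate a better recommendation.
    rank_component = (11 - result["rank"]) / 10.0   # 1.0 for rank 1, 0.1 for rank 10
    return 0.5 * rank_component + 0.5 * result["pr"]

top10 = [{"url": f"http://example.com/{i}", "rank": i, "pr": 0.8 - 0.05 * i}
         for i in range(1, 11)]
print(quality_score(top10[1], query_history_urls=set(), all_results=top10))  # ~0.8
```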

User Studies
Two human-subject studies were conducted on users of Google's Search History service.
- First-person study: users evaluate the interest level of their own past queries and the quality of recommendations.
- Third-person study: judges evaluate anonymous query sessions and assess the quality of recommendations.

First-Person Study Design
The survey displayed a maximum of 30 query sessions from the user's own history. For each query session, the user was shown a visual representation of the actions, and query recommendations were generated.
Selecting query sessions: eliminate sessions with
- no events;
- no clicks and only one or two refinements;
- navigational queries that are not repeated.

Half of the remaining sessions selected for the survey were the highest-ranked sessions with respect to interest score; the second half were randomly selected (a small sampling sketch appears after this slide).
Selecting recommendations:
- Generate recommendations for the selected queries, using the history-presence signal.
- Consider the current top ten results for a query.
- Apply the Boolean signals as filters to the results the user has not yet seen.
- From the remaining results, select the top recommendations with respect to the qscore.
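A small sketch of how survey sessions might be selected per the design above: half are the sessions with the highest interest scores, the other half are chosen at random from the rest. Function and field names are illustrative assumptions.

```python
import random

def select_survey_sessions(sessions, max_sessions=30, seed=0):
    """sessions: list of (interest_score, session_id) pairs that survived the
    elimination filters. Returns up to max_sessions sessions for the survey."""
    rng = random.Random(seed)
    ranked = sorted(sessions, key=lambda s: s[0], reverse=True)

    half = max_sessions // 2
    top_half = ranked[:half]                              # highest interest scores
    rest = ranked[half:]
    random_half = rng.sample(rest, min(half, len(rest)))  # randomly selected half

    return top_half + random_half
```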

Third-Person Study Design
Five human subjects viewed other users' anonymized query sessions and the associated recommendations. A visual representation of the entire query session was displayed. Half of the recommendations were selected as in the first-person study; the second half consisted of the highest-ranked new result in the top ten.

Identifying Standing Interest

Determining Quality of Recommendations

Conclusion
Presented QSR, a new system that retroactively answers search queries representing standing interests. QSR addresses two subproblems:
- automatic identification of queries that represent standing interests;
- identification of new, interesting results.
Presented algorithms for both and conducted user studies to evaluate them.

Discussion
- Do the security concerns outweigh the advantages of the QSR system?
- What do you think was wrong with the user studies?
- Why do you think Google is investing so much in research on the QSR system?