Introduction to Information Retrieval
September 7, 2000. Information Organization and Retrieval.
Ray Larson & Marti Hearst
University of California, Berkeley
School of Information Management and Systems
SIMS 202: Information Organization and Retrieval
Search and Retrieval
Outline of Part I of SIMS 202:
– The Search Process
– Information Retrieval Models
– Content Analysis / Zipf Distributions
– Evaluation of IR Systems: precision/recall, relevance, user studies
– System and Implementation Issues
– Web-Specific Issues
– User Interface Issues
– Special Kinds of Search
Today
– Finding Out About
– Introduction to Information Retrieval
– Introduction to the Boolean Model and Boolean Queries
Finding Out About
(This discussion is drawn from R. Belew's manuscript.)
Three phases:
– Asking of a question (the Information Need)
– Construction of an answer (IR proper)
– Assessment of the answer (Evaluation)
All part of an iterative process.
Question Asking
The person asking is the "user":
– In a frame of mind, a cognitive state
– Aware of a gap in their knowledge
– May not be able to fully define this gap
The paradox of FOA: if the user knew the exact question to ask, there would often be no work left to do.
"The need to describe that which you do not know in order to find it." (Roland Hjerppe)
The query is the external expression of this ill-defined state.
Question Answering
Suppose the question answerer is human:
– Can they translate the user's ill-defined question into a better one?
– Do they know the answer themselves?
– Are they able to verbalize this answer?
– Will the user understand this verbalization?
– Can they provide the needed background?
Now suppose the answerer is a computer system.
Assessing the Answer
How well does it answer the question?
– Complete answer? Partial?
– Background information?
– Hints for further exploration?
How relevant is it to the user?
IR is an Iterative Process
(Diagram: the user iterates between Goals, a Workspace, and Repositories.)
IR is a Dialog
– The exchange doesn't end with the first answer
– The user can recognize elements of a useful answer
– Questions and understanding change as the process continues
A Sketch of a Searcher
"...moving through many actions towards a general goal of satisfactory completion of research related to an information need." (after Bates 89)
(Diagram: a sequence of evolving queries Q0, Q1, Q2, Q3, Q4, Q5.)
Berry-Picking Model (cont.)
– The query is continually shifting
– New information may yield new ideas and new directions
The information need:
– is not satisfied by a single, final retrieved set
– is satisfied by a series of selections and bits of information found along the way
Information Seeking Behavior
Two parts of the process:
– search and retrieval
– analysis and synthesis of search results
Search Tactics and Strategies
Search tactics: Bates 79
Search strategies: Bates 89; O'Day and Jeffries 93
Tactics vs. Strategies
Tactic: short-term goals and maneuvers
– operators, actions
Strategy: overall planning
– link a sequence of operators together to achieve some end
Later in the Course
– Search process and strategies
– User interfaces to improve the IR process
– Incorporation of content analysis into better systems
Restricted Form of the IR Problem
The system has available only pre-existing, "canned" text passages. Its response is limited to selecting from these passages and presenting them to the user. It must select, say, 10 or 20 passages out of millions or billions!
Information Retrieval
Revised task statement: build a system that retrieves documents that users are likely to find relevant to their queries. This set of assumptions underlies the field of Information Retrieval.
Some IR History
– Roots in the scientific "information explosion" following WWII
– Interest in computer-based IR from the mid-1950s:
  H. P. Luhn at IBM (1958)
  Probabilistic models at RAND (Maron & Kuhns, 1960)
  Boolean system development at Lockheed (1960s)
  Vector space model (Salton at Cornell, 1965)
  Statistical weighting methods and theoretical advances (1970s)
  Refinements and advances in application (1980s)
  User interfaces, large-scale testing and application (1990s)
Structure of an IR System (adapted from Soergel, p. 19)
(Diagram of an information storage and retrieval system, with two parallel lines feeding a matcher:)
Storage line: documents & data are indexed (descriptive and subject indexing) and stored as document representations (Store 2).
Search line: interest profiles & queries are formulated in terms of descriptors and stored as profiles/search requests (Store 1).
Comparison/matching between the two stores yields potentially relevant documents.
"Rules of the game" = rules for subject indexing + a thesaurus (which consists of a lead-in vocabulary and an indexing language).
Relevance (Introduction)
In what ways can a document be relevant to a query?
– Answer a precise question precisely. ("Who is buried in Grant's tomb?" Grant.)
– Partially answer a question. ("Where is Danville?" Near Walnut Creek.)
– Suggest a source for more information. ("What is lymphedema?" Look in this medical dictionary.)
– Give background information.
– Remind the user of other knowledge.
– Others...
Query Languages
A way to express the question (information need). Types:
– Boolean
– Natural language
– Stylized natural language
– Form-based (GUI)
Simple Query Language: Boolean
Terms + connectors (operators).
Terms:
– words
– normalized (stemmed) words
– phrases
– thesaurus terms
Connectors:
– AND
– OR
– NOT
Boolean Queries
Cat
Cat OR Dog
Cat AND Dog
(Cat AND Dog)
(Cat AND Dog) OR Collar
(Cat AND Dog) OR (Collar AND Leash)
(Cat OR Dog) AND (Collar OR Leash)
Boolean Queries
(Cat OR Dog) AND (Collar OR Leash)
Each of the following combinations works, because each contains at least one of {Cat, Dog} and at least one of {Collar, Leash}:
– Cat, Collar
– Cat, Leash
– Dog, Collar
– Dog, Leash
(and any combination containing one of these)
Boolean Queries
(Cat OR Dog) AND (Collar OR Leash)
None of the following combinations works:
– Cat alone, Dog alone, or Cat and Dog (nothing from {Collar, Leash})
– Collar alone, Leash alone, or Collar and Leash (nothing from {Cat, Dog})
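The working and failing combinations above can be checked mechanically. A minimal sketch, modeling each document as a set of terms (the document names and term sets here are illustrative, not from a real collection):

```python
def matches(doc_terms):
    """True iff the document satisfies (Cat OR Dog) AND (Collar OR Leash)."""
    terms = set(doc_terms)
    return bool(terms & {"cat", "dog"}) and bool(terms & {"collar", "leash"})

docs = {
    "d1": {"cat", "collar"},       # works: one from each disjunct
    "d2": {"dog", "leash", "park"},
    "d3": {"cat", "dog"},          # fails: nothing from {collar, leash}
    "d4": {"collar", "leash"},     # fails: nothing from {cat, dog}
}

hits = sorted(name for name, terms in docs.items() if matches(terms))
print(hits)  # ['d1', 'd2']
```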
Boolean Logic
(Venn diagram of two overlapping sets A and B: AND is the intersection, OR is the union.)
Boolean Queries
– Usually expressed with infix operators in IR: ((a AND b) OR (c AND b))
– NOT is a unary prefix operator: ((a AND b) OR (c AND (NOT b)))
– AND and OR can be n-ary operators: (a AND b AND c AND d)
– Some rules (De Morgan revisited):
  NOT(a) AND NOT(b) = NOT(a OR b)
  NOT(a) OR NOT(b) = NOT(a AND b)
  NOT(NOT(a)) = a
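Since each identity involves only two variables, De Morgan's laws and double negation can be verified exhaustively over all truth-value assignments:

```python
from itertools import product

# Exhaustive check of De Morgan's laws and double negation.
for a, b in product([False, True], repeat=2):
    assert ((not a) and (not b)) == (not (a or b))   # NOT(a) AND NOT(b) = NOT(a OR b)
    assert ((not a) or (not b)) == (not (a and b))   # NOT(a) OR NOT(b) = NOT(a AND b)
    assert (not (not a)) == a                        # NOT(NOT(a)) = a
print("all identities hold")
```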
Boolean Logic
(Venn diagram of three index terms t1, t2, t3. The eight regions correspond to the minterms m1 through m8, each a conjunction of t1, t2, t3 or their negations; documents D1 through D11 are placed in the regions whose minterms they satisfy.)
Boolean Searching
Information need: "Measurement of the width of cracks in prestressed concrete beams"
Facets: Cracks (C), Beams (B), Width measurement (W), Prestressed concrete (P)
Formal query: cracks AND beams AND width_measurement AND prestressed_concrete
Relaxed query (any three of the four facets):
(C AND B AND P) OR (C AND B AND W) OR (C AND W AND P) OR (B AND W AND P)
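The relaxed query is just the "any k of n facets" disjunction of conjunctions, which can be generated rather than written by hand. A small sketch (the facet labels C, B, W, P follow the example above):

```python
from itertools import combinations

def relax(facets, k):
    """Build the 'at least k of n facets' query as a disjunction of conjunctions."""
    return " OR ".join(
        "(" + " AND ".join(combo) + ")" for combo in combinations(facets, k)
    )

facets = ["C", "B", "W", "P"]  # Cracks, Beams, Width measurement, Prestressed concrete
print(relax(facets, 3))
# (C AND B AND W) OR (C AND B AND P) OR (C AND W AND P) OR (B AND W AND P)
```

With n facets, relaxing to k of them produces C(n, k) conjunctions, so this blows up quickly; coordination-level ranking (later in this lecture) achieves the same effect without enumerating them.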
33
September 7, 2000Information Organization and Retrieval Psuedo-Boolean Queries A new notation, from web search –+cat dog +collar leash Does not mean the same thing! Need a way to group combinations. Phrases: –“stray cat” AND “frayed collar” –+“stray cat” + “frayed collar”
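One common reading of the + notation, sketched below, treats +terms as required and bare terms as optional (affecting only ranking); actual web engines varied in their exact semantics, so this is an assumption, not a specification:

```python
def matches(query, doc_terms):
    """'+term' must appear in the document; bare terms are optional."""
    required = {tok[1:] for tok in query.split() if tok.startswith("+")}
    return required <= set(doc_terms)

# +cat dog +collar leash : cat and collar required; dog and leash optional
print(matches("+cat dog +collar leash", {"cat", "collar"}))        # True
print(matches("+cat dog +collar leash", {"cat", "dog", "leash"}))  # False: no collar
```

Note the contrast with the earlier Boolean query: here {dog, leash} alone never matches, because the required terms are cat and collar specifically, with no way to express the (Cat OR Dog) grouping.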
34
Information need Index Pre-process Parse Collections Rank Query text input
Result Sets
Run a query, get a result set. Two choices:
– Reformulate the query and run it on the entire collection
– Reformulate the query and run it on the result set
Example (Dialog):
(Redford AND Newman) -> S1, 1450 documents
(S1 AND Sundance) -> S2, 898 documents
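Running a reformulated query on the result set amounts to searching within a smaller collection. A minimal sketch of the Dialog-style narrowing above (the documents and term sets are illustrative; the real S1 and S2 had 1450 and 898 documents):

```python
def search(docs, required_terms):
    """Return the subset of docs containing all required terms."""
    return {name: terms for name, terms in docs.items()
            if required_terms <= terms}

collection = {
    "d1": {"redford", "newman", "sundance"},
    "d2": {"redford", "newman"},
    "d3": {"sundance"},
}
s1 = search(collection, {"redford", "newman"})  # full collection -> S1
s2 = search(s1, {"sundance"})                   # narrow within S1 -> S2
print(sorted(s1), sorted(s2))  # ['d1', 'd2'] ['d1']
```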
36
Information need Index Pre-process Parse Collections Rank Query text input Reformulated Query Re-Rank
Ordering of Retrieved Documents
Pure Boolean has no ordering. In practice:
– order chronologically
– order by total number of "hits" on query terms
What if one term has more hits than the others? Is it better to have one hit on each term, or many hits on a single term?
Fancier methods have been investigated:
– the p-norm is the most famous, but it is usually impractical to implement and hard for users to understand
Boolean
Advantages:
– simple queries are easy to understand
– relatively easy to implement
Disadvantages:
– difficult to specify what is wanted
– too much returned, or too little
– ordering not well determined
Boolean was the dominant query language in commercial systems until the WWW.
Faceted Boolean Query
Strategy: break the query into facets (a different sense of "facet" than in faceted classification earlier):
– a conjunction of disjunctions:
  (a1 OR a2 OR a3) AND (b1 OR b2) AND (c1 OR c2 OR c3 OR c4)
– each facet expresses one topic:
  ("rain forest" OR jungle OR amazon) AND (medicine OR remedy OR cure) AND (Smith OR Zhou)
Faceted Boolean Query
The query still fails if one facet is missing.
Alternative: coordination-level ranking
– order results by how many facets (disjuncts) are satisfied
– also called quorum ranking, overlap ranking, and best match
Problem: the facets are still undifferentiated.
Alternative: assign weights to the facets.
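Coordination-level ranking can be sketched directly: score each document by the number of facets it satisfies, then sort. The facets below reuse the rain-forest example; the documents are illustrative:

```python
def coordination_rank(docs, facets):
    """Rank documents by how many facets (disjunct term sets) each satisfies."""
    scored = []
    for name, terms in docs.items():
        score = sum(1 for facet in facets if terms & facet)  # facet hit if any term overlaps
        scored.append((score, name))
    return sorted(scored, reverse=True)

facets = [{"rain forest", "jungle", "amazon"},
          {"medicine", "remedy", "cure"},
          {"smith", "zhou"}]
docs = {
    "d1": {"jungle", "cure", "smith"},  # satisfies all 3 facets
    "d2": {"amazon", "remedy"},         # satisfies 2 facets
    "d3": {"zhou"},                     # satisfies 1 facet
}
print(coordination_rank(docs, facets))  # [(3, 'd1'), (2, 'd2'), (1, 'd3')]
```

A document missing one facet is demoted rather than discarded, which is exactly the behavior the strict conjunction lacks; weighting the facets would simply replace the `sum(1 ...)` with a weighted sum.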
Proximity Searches
Proximity: terms occur within K positions of one another
– pen w/5 paper
A "near" function can be more vague:
– near(pen, paper)
Sometimes the order can be specified.
Also, phrases and collocations:
– "United Nations", "Bill Clinton"
Phrase variants:
– "retrieval of information" vs. "information retrieval"
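A within-K check needs the token positions of each term, as stored in a positional index. A minimal unordered sketch (a real `w/5` operator in systems like Westlaw had its own conventions; this just compares token offsets):

```python
def within(doc_tokens, term1, term2, k):
    """True iff term1 and term2 occur within k token positions of each other."""
    pos1 = [i for i, tok in enumerate(doc_tokens) if tok == term1]
    pos2 = [i for i, tok in enumerate(doc_tokens) if tok == term2]
    return any(abs(i - j) <= k for i in pos1 for j in pos2)

tokens = "she put the pen down on a sheet of paper".split()
print(within(tokens, "pen", "paper", 5))  # False: 6 positions apart
print(within(tokens, "pen", "paper", 6))  # True
```

An ordered variant would require `0 < j - i <= k` instead of `abs(i - j) <= k`, and a phrase match is the special case k = 1 with order enforced.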
Filters
Filters reduce the set of candidate documents. They are often specified along with the query, usually as restrictions on metadata. Restrict by:
– date range
– internet domain (.edu, .com, .berkeley.edu)
– author
– size limit
– number of documents returned
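Metadata filters apply independently of the query terms. A hedged sketch, assuming each candidate carries `date` and `url` metadata fields (the field names and sample documents are invented for illustration):

```python
def apply_filters(docs, after=None, before=None, domain=None):
    """Keep only candidates whose metadata satisfies every restriction."""
    kept = []
    for doc in docs:
        if after is not None and doc["date"] < after:      # date-range lower bound
            continue
        if before is not None and doc["date"] > before:    # date-range upper bound
            continue
        if domain is not None and not doc["url"].endswith(domain):
            continue
        kept.append(doc["id"])
    return kept

candidates = [
    {"id": "d1", "date": "2000-09-01", "url": "www.berkeley.edu"},
    {"id": "d2", "date": "1998-01-15", "url": "example.com"},
    {"id": "d3", "date": "2000-08-20", "url": "sims.berkeley.edu"},
]
print(apply_filters(candidates, after="2000-01-01", domain=".edu"))  # ['d1', 'd3']
```

ISO-format date strings compare correctly as plain strings, which keeps the sketch dependency-free; a size limit or result-count cap would be applied the same way, after matching.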
Next Week
– Statistical properties of text
– Preparing information for search: lexical analysis
– Introduction to the vector space model of IR