Slide 1
What is missing? Reasons that ideal effectiveness is hard to achieve:
1. Users' inability to describe their queries precisely.
2. Document representation loses information.
3. The same term may have multiple meanings, and different terms may have similar meanings.
4. The similarity function used may not be good enough.
5. The importance/weight of a term in representing a document or query may be inaccurate.
Slide 2
Some improvements
- Query expansion techniques (for 1)
  - relevance feedback
    - Vector model
    - Probabilistic model
  - co-occurrence analysis (local and global thesauri)
- Improving the quality of terms [(2), (3) and (5)]
  - Latent Semantic Indexing
  - Phrase detection
Slide 3
Dimensionality Reduction: insight through
- Principal Components Analysis
- KL Transform
- Neural Networks
Slide 4
Latent Semantic Indexing
- Classic IR might lead to poor retrieval because:
  - unrelated documents might be included in the answer set
  - relevant documents that do not contain at least one index term are not retrieved
  - Reasoning: retrieval based on index terms is vague and noisy
- The user information need is more related to concepts and ideas than to index terms
- A document that shares concepts with another document known to be relevant might be of interest
Slide 5
Latent Semantic Indexing
- Creates a modified vector space
- Captures transitive co-occurrence information
  - If docs A and B don't share any words with each other, but both share lots of words with doc C, then A and B will be considered similar
  - Handles polysemy (Adam's apple) and synonymy
- Simulates query expansion and document clustering (sort of)
Slide 6
A motivating example
- Suppose we have the keywords
  - car, automobile, driver, elephant
- We want queries on car to also get docs about drivers, but not about elephants
  - We need to realize that driver and car are related while elephant is not
- When you scrunch down the dimensions, small differences get glossed over, and you get the desired behavior
Slide 7
Latent Semantic Indexing
- Definitions
  - Let t be the total number of index terms
  - Let N be the number of documents
  - Let (Mij) be a term-document matrix with t rows and N columns
  - To each element of this matrix is assigned a weight wij associated with the pair [ki, dj]
  - The weight wij can be based on a tf-idf weighting scheme (see the sketch below)
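A minimal sketch of how such a weighted term-document matrix could be built with numpy; the toy corpus and vocabulary are hypothetical, chosen to echo the car/driver example later in the deck:

```python
# Build a t x N term-document matrix with tf-idf weights w_ij.
import numpy as np

docs = ["car driver car", "elephant driver", "car automobile"]  # toy corpus
terms = sorted({w for d in docs for w in d.split()})             # index terms

t, N = len(terms), len(docs)
tf = np.zeros((t, N))                       # raw term frequencies tf_ij
for j, d in enumerate(docs):
    for w in d.split():
        tf[terms.index(w), j] += 1

df = (tf > 0).sum(axis=1)                   # document frequency of term k_i
idf = np.log(N / df)                        # inverse document frequency
M = tf * idf[:, None]                       # w_ij = tf_ij * idf_i
```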
Slide 8
Everything You Always Wanted to Know About LSI, and More
1. Singular Value Decomposition (SVD): convert the term-document matrix into 3 matrices U, S, and V
2. Reduce dimensionality: throw out low-order rows and columns
3. Recreate matrix: multiply to produce an approximate term-document matrix
4. Use the new matrix to process queries
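These steps are only a few lines with numpy, whose svd returns the singular values as a vector and V already transposed. A sketch, not tied to any particular weighting scheme:

```python
# Rank-k LSI approximation of a t x N term-document matrix M.
import numpy as np

def lsi_approximation(M: np.ndarray, k: int) -> np.ndarray:
    U, s, Vt = np.linalg.svd(M, full_matrices=False)   # M = U S V^t
    # keep the k largest singular values and the matching
    # columns of U / rows of V^t, then multiply back together
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
```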
Slide 9
Latent Semantic Indexing
- The matrix (Mij) can be decomposed into 3 matrices (singular value decomposition) as follows:
  - (Mij) = (U)(S)(V)^t
    - (U) is the matrix of eigenvectors derived from (M)(M)^t
    - (V)^t is the transpose of the matrix of eigenvectors derived from (M)^t(M)
    - (S) is an r x r diagonal matrix of singular values, where r = min(t, N), i.e., the rank of (Mij)
- Singular values are the positive square roots of the eigenvalues of (M)(M)^t (also of (M)^t(M))
- For the special case where M is a square symmetric matrix, S is the diagonal eigenvalue matrix and U and V are eigenvector matrices
- U and V are orthogonal matrices
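The eigenvalue relationship can be checked numerically; a small sketch using a random matrix:

```python
# Singular values of M are the positive square roots of the
# eigenvalues of M^t M (equivalently, of M M^t).
import numpy as np

rng = np.random.default_rng(0)
M = rng.random((9, 8))                      # any t x N matrix
s = np.linalg.svd(M, compute_uv=False)      # singular values, descending
eig = np.linalg.eigvalsh(M.T @ M)           # eigenvalues of M^t M, ascending
print(np.allclose(np.sort(s**2), eig))      # True, up to round-off
```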
Slide 10
Latent Semantic Indexing
- The key idea is to map documents and queries into a lower-dimensional space (i.e., composed of higher-level concepts, which are fewer in number than the index terms)
- Retrieval in this reduced concept space might be superior to retrieval in the space of index terms
Slide 11
Latent Semantic Indexing
- In the matrix (S), select only the k largest singular values
- Keep the corresponding columns in (U) and (V)^t
- The resultant matrix is called (M)_k and is given by
  - (M)_k = (U)_k (S)_k (V)_k^t
  - where k, k < r, is the dimensionality of the concept space
- The parameter k should be
  - large enough to allow fitting the characteristics of the data
  - small enough to filter out the non-relevant representational details
- The classic over-fitting issue (one heuristic for picking k is sketched below)
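One common heuristic for this trade-off (an illustration; the slides do not prescribe a rule) is to keep just enough singular values to cover a fixed share of the squared singular-value mass:

```python
import numpy as np

def choose_k(singular_values: np.ndarray, energy: float = 0.9) -> int:
    """Smallest k whose top-k singular values cover `energy` of the
    total squared mass (singular_values assumed sorted descending)."""
    sq = singular_values ** 2
    cumulative = np.cumsum(sq) / sq.sum()
    return int(np.searchsorted(cumulative, energy)) + 1
```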
Slide 13
Computing an Example
- Let (Mij) be given by the matrix on the next slide
- Compute the matrices (U), (S), and (V)^t
Slide 14
Example

Term-document matrix (terms x book chapters):

term               ch2 ch3 ch4 ch5 ch6 ch7 ch8 ch9
controllability     1   1   0   0   1   0   0   1
observability       1   0   0   0   1   1   0   1
realization         1   0   1   0   1   0   1   0
feedback            0   1   0   0   0   1   0   0
controller          0   1   0   0   1   1   0   0
observer            0   1   1   0   1   1   0   0
transfer function   0   0   0   0   1   1   0   0
polynomial          0   0   0   0   1   0   1   0
matrices            0   0   0   0   1   0   1   1

U (9x7) =
 0.3996 -0.1037  0.5606 -0.3717 -0.3919 -0.3482  0.1029
 0.4180 -0.0641  0.4878  0.1566  0.5771  0.1981 -0.1094
 0.3464 -0.4422 -0.3997 -0.5142  0.2787  0.0102 -0.2857
 0.1888  0.4615  0.0049 -0.0279 -0.2087  0.4193 -0.6629
 0.3602  0.3776 -0.0914  0.1596 -0.2045 -0.3701 -0.1023
 0.4075  0.3622 -0.3657 -0.2684 -0.0174  0.2711  0.5676
 0.2750  0.1667 -0.1303  0.4376  0.3844 -0.3066  0.1230
 0.2259 -0.3096 -0.3579  0.3127 -0.2406 -0.3122 -0.2611
 0.2958 -0.4232  0.0277  0.4305 -0.3800  0.5114  0.2010

S (7x7) =
 3.9901 0      0      0      0      0      0
 0      2.2813 0      0      0      0      0
 0      0      1.6705 0      0      0      0
 0      0      0      1.3522 0      0      0
 0      0      0      0      1.1818 0      0
 0      0      0      0      0      0.6623 0
 0      0      0      0      0      0      0.6487

V (8x7) =
 0.2917 -0.2674  0.3883 -0.5393  0.3926 -0.2112 -0.4505
 0.3399  0.4811  0.0649 -0.3760 -0.6959 -0.0421 -0.1462
 0.1889 -0.0351 -0.4582 -0.5788  0.2211  0.4247  0.4346
-0.0000 -0.0000 -0.0000 -0.0000  0.0000 -0.0000  0.0000
 0.6838 -0.1913 -0.1609  0.2535  0.0050 -0.5229  0.3636
 0.4134  0.5716 -0.0566  0.3383  0.4493  0.3198 -0.2839
 0.2176 -0.5151 -0.4369  0.1694 -0.2893  0.3161 -0.5330
 0.2791 -0.2591  0.6442  0.1593 -0.1648  0.5455  0.2998

This happens to be a rank-7 matrix, so only 7 dimensions are required.
Singular values = square roots of the eigenvalues of (M)(M)^t.
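The decomposition can be reproduced with numpy from the table above. Note that SVD factors are unique only up to the sign of the singular vectors, so individual entries of U and V may differ in sign from the slide even though the singular values agree:

```python
import numpy as np

# 9 terms x 8 chapters (ch2..ch9), copied from the table above
M = np.array([
    [1, 1, 0, 0, 1, 0, 0, 1],   # controllability
    [1, 0, 0, 0, 1, 1, 0, 1],   # observability
    [1, 0, 1, 0, 1, 0, 1, 0],   # realization
    [0, 1, 0, 0, 0, 1, 0, 0],   # feedback
    [0, 1, 0, 0, 1, 1, 0, 0],   # controller
    [0, 1, 1, 0, 1, 1, 0, 0],   # observer
    [0, 0, 0, 0, 1, 1, 0, 0],   # transfer function
    [0, 0, 0, 0, 1, 0, 1, 0],   # polynomial
    [0, 0, 0, 0, 1, 0, 1, 1],   # matrices
], dtype=float)

U, s, Vt = np.linalg.svd(M, full_matrices=False)
# ch5 contains none of the terms, so the rank is 7 and the 8th
# singular value is 0; the first 7 should match the slide:
# 3.9901 2.2813 1.6705 1.3522 1.1818 0.6623 0.6487
print(np.round(s, 4))
```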
Slide 15
Truncating to k = 2:

U2 (9x2) =
 0.3996 -0.1037
 0.4180 -0.0641
 0.3464 -0.4422
 0.1888  0.4615
 0.3602  0.3776
 0.4075  0.3622
 0.2750  0.1667
 0.2259 -0.3096
 0.2958 -0.4232

S2 (2x2) =
 3.9901 0
 0      2.2813

V2 (8x2) =
 0.2917 -0.2674
 0.3399  0.4811
 0.1889 -0.0351
-0.0000 -0.0000
 0.6838 -0.1913
 0.4134  0.5716
 0.2176 -0.5151
 0.2791 -0.2591

(The full U, S, and V are as on the previous slide.)

U2 S2 V2^t will be a 9x8 matrix that approximates the original matrix.
Slide 16
[Figure: reconstructions of the term-document matrix at increasing rank, compared with the full U S V^t = U7 S7 V7^t: K=2 (U2 S2 V2^t, 5 components ignored), K=4 (U4 S4 V4^t, 3 components ignored), K=6 (U6 S6 V6^t, one component ignored).]

What should be the value of k?
Slide 17
Coordinate transformation inherent in LSI: M = U S V^t

The mapping of keywords into LSI space is given by U S. For k = 2, the mapping is:

term               LSIx        LSIy
controllability    1.5944439  -0.2365708
observability      1.6678618  -0.14623132
realization        1.3821706  -1.0087909
feedback           0.7533309   1.05282
controller         1.4372339   0.86141896
observer           1.6259657   0.82628685
transfer function  1.0972775   0.38029274
polynomial         0.90136355 -0.7062905
matrices           1.1802715  -0.96544623

The mapping of a doc d = [w1 ... wt] into LSI space is given by d U S^-1: the base keywords of the doc are first mapped to LSI keywords and then differentially weighted by S^-1.

[Figure: 2-D plot of terms (e.g., controllability, controller) and documents (e.g., ch3) on the LSIx/LSIy axes.]
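Both mappings in numpy, a sketch that assumes M is the 9 x 8 example matrix built earlier:

```python
import numpy as np

U, s, Vt = np.linalg.svd(M, full_matrices=False)
k = 2
Uk, Sk = U[:, :k], np.diag(s[:k])

term_coords = Uk @ Sk                      # one 2-d point per keyword (U S)
# term_coords[0] should be about [1.5944, -0.2366] (controllability),
# up to the sign ambiguity of the SVD

d = M[:, 1]                                # ch3's column, treated as a doc
doc_coords = d @ Uk @ np.linalg.inv(Sk)    # fold the doc in via d U S^-1
```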
Slide 18
[Figure: Medline data from Berry's paper.]
Slide 19
Querying

To query for feedback controller, the query vector would be q = [0 0 0 1 1 0 0 0 0]' (' indicates transpose), since feedback and controller are the 4th and 5th terms in the index, and no other terms are selected.

Let q be the query vector. Then the document-space vector corresponding to q is given by q' * U2 * inv(S2) = Dq. (Dq is the centroid of the terms in the query, with scaling.) For the feedback controller query vector, the result is:

Dq = [0.1376 0.3678]

To find the best document match, we compare the Dq vector against all the document vectors in the 2-dimensional V2 space. The document vector that is nearest in direction to Dq is the best match. The cosine values for the eight document vectors and the query vector are:

-0.3747 0.9671 0.1735 -0.9413 0.0851 0.9642 -0.7265 -0.3805
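The same computation as a numpy sketch, again assuming M is the example matrix from earlier; a small epsilon guards ch5's row of V, which is numerically zero and has no well-defined direction:

```python
import numpy as np

U, s, Vt = np.linalg.svd(M, full_matrices=False)
k = 2
Uk, Sk = U[:, :k], np.diag(s[:k])
Vk = Vt[:k, :].T                           # 8 docs x 2 concepts

q = np.array([0, 0, 0, 1, 1, 0, 0, 0, 0], dtype=float)  # feedback controller
Dq = q @ Uk @ np.linalg.inv(Sk)            # about [0.1376, 0.3678], up to sign

# cosine between Dq and each document row of Vk
norms = np.linalg.norm(Vk, axis=1) * np.linalg.norm(Dq) + 1e-12
cosines = (Vk @ Dq) / norms
print(np.round(cosines, 4))                # best: doc 2 (ch3), then doc 6 (ch7)
```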
Slide 20
[Figure: matches within a .40 cosine threshold; K is the number of singular values used.]
Slide 21
Latent Ranking (a la text)
- The user query can be modelled as a pseudo-document in the original (M) matrix
- Assume the query is modelled as the document numbered 0 in the (M) matrix
- The matrix (M)_k^t (M)_k quantifies the relationship between any two documents in the reduced concept space
- The first row of this matrix provides the rank of all the documents with regard to the user query (represented as the document numbered 0)
- Note: this is an inefficient way (sketched below)
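A sketch of this (admittedly inefficient) scheme, appending the query to the example matrix M from earlier and recomputing the truncated SVD:

```python
import numpy as np

q = np.array([0, 0, 0, 1, 1, 0, 0, 0, 0], dtype=float)
M0 = np.column_stack([q, M])               # query as document number 0
U0, s0, Vt0 = np.linalg.svd(M0, full_matrices=False)

k = 2
Mk = U0[:, :k] @ np.diag(s0[:k]) @ Vt0[:k, :]

sims = Mk.T @ Mk                           # (M_k)^t (M_k): doc-doc similarities
print(np.round(sims[0, 1:], 3))            # row 0: query against every document
```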
Slide 23
Folding docs: convert new documents into LSI space using the d U S^-1 method.
Folding terms: find the vectors for new terms as a weighted sum of the docs in which they occur.

Practical issue: how often do you re-compute the SVD when terms or documents are added to the collection? Folding is a cheaper solution but will worsen quality over time. (Both operations are sketched below.)
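A sketch of both folding operations, assuming M is the example matrix from earlier. The document form d U S^-1 is from the slides; the symmetric term form w V S^-1 (a weighted sum of the docs containing the term) is the standard folding-in counterpart and is an assumption here:

```python
import numpy as np

U, s, Vt = np.linalg.svd(M, full_matrices=False)
k = 2
Uk, Sk = U[:, :k], np.diag(s[:k])
Vk = Vt[:k, :].T

def fold_doc(d: np.ndarray) -> np.ndarray:
    """Term vector (length t) of a new document -> LSI coordinates."""
    return d @ Uk @ np.linalg.inv(Sk)

def fold_term(w: np.ndarray) -> np.ndarray:
    """Occurrence vector (length N) of a new term -> LSI coordinates."""
    return w @ Vk @ np.linalg.inv(Sk)
```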
Slide 24
Summary of LSI
- Latent semantic indexing provides an interesting conceptualization of the IR problem
- No stemming needed; spelling errors are tolerated
- Can do true conceptual retrieval
  - Retrieval of documents that do not share any keywords with the query!
Slide 25
The best fit for the feedback controller query vector is with the second document, which is Chapter 3. The sixth document, or Chapter 7, is also a good match.

A query for feedback realization yields the query vector Dq = [0.1341 0.0084] and cosine values

0.6933 0.6270 0.9698 -0.0762 0.9443 0.6357 0.3306 0.6888

The best matches for feedback realization are Chapters 4 and 6.