Item-based Collaborative Filtering

- Idea: a user is likely to have the same opinion about similar items [if I like Canon cameras, I might also like Canon video cameras]
- Similarity between items: determined by how other users have rated these items [no item features are used]
- Advantages (compared to user-based CF):
  - Reduces the cold-start problem for new users
  - Improves scalability (similarity between items is more stable than similarity between users, because a user's interests may change over time)

Reference: Item-based Collaborative Filtering Recommendation Algorithms. Badrul Sarwar, George Karypis, Joseph Konstan, and John Riedl. WWW'01.
Item-based CF Example: infer the missing rating (User 1, Item 3)

          Item 1  Item 2  Item 3  Item 4  Item 5
User 1       8       1       ?       2       7
User 2       2       ?       5       7       5
User 3       5       4       7       4       7
User 4       7       1       7       3       8
User 5       1       7       4       6       ?
User 6       8       3       8       3       7
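The later slides compute item similarities and a prediction from this matrix. As a minimal sketch (the dictionary layout and the names are illustrative, not from the slides), the matrix can be held as a user-to-item mapping in Python, with missing ratings simply left out:

    # The example ratings matrix above, stored as a dict-of-dicts
    # (user -> item -> rating); missing ratings are simply absent.
    ratings = {
        "user1": {"item1": 8, "item2": 1, "item4": 2, "item5": 7},
        "user2": {"item1": 2, "item3": 5, "item4": 7, "item5": 5},
        "user3": {"item1": 5, "item2": 4, "item3": 7, "item4": 4, "item5": 7},
        "user4": {"item1": 7, "item2": 1, "item3": 7, "item4": 3, "item5": 8},
        "user5": {"item1": 1, "item2": 7, "item3": 4, "item4": 6},
        "user6": {"item1": 8, "item2": 3, "item3": 8, "item4": 3, "item5": 7},
    }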
Item-based Prediction Algorithms

- Similar to the user-based approach: use the nearest item neighbours of the target item
- Aggregation function: often a weighted sum of the user's ratings for those neighbouring items
- The weight of each neighbour depends on its similarity to the target item

[Diagram: target Item 3 connected to its neighbours Items 1, 2, 4, and 5, each labelled with User 1's rating for that item]
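As a rough sketch of the neighbour-selection step (the `ratings` layout follows the illustrative dictionary after the example matrix, and `item_similarity` stands for any item-item similarity measure, e.g. the cosine measure introduced two slides later; none of these names come from the slides):

    def nearest_item_neighbours(target_item, user, ratings, item_similarity, k=2):
        """Return the k items most similar to target_item among the items the user rated."""
        # Candidate neighbours: items this user has rated, excluding the target itself.
        candidates = [item for item in ratings[user] if item != target_item]
        # Rank candidates by their similarity to the target item, most similar first.
        candidates.sort(key=lambda item: item_similarity(target_item, item, ratings),
                        reverse=True)
        return candidates[:k]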
How to Calculate Similarity (Item 3 and Item 5)?

          Item 1  Item 2  Item 3  Item 4  Item 5
User 1       8       1       ?       2       7
User 2       2       ?       5       7       5
User 3       5       4       7       4       7
User 4       7       1       7       3       8
User 5       1       7       4       6       ?
User 6       8       3       8       3       7
Similarity between Items

          Item 3  Item 4  Item 5
User 1       ?       2       7
User 2       5       7       5
User 3       7       4       7
User 4       7       3       8
User 5       4       6       ?
User 6       8       3       7

How similar are Items 3 and 5? How to calculate their similarity?
Similarity between items

          Item 3  Item 5
User 1       ?       7
User 2       5       5
User 3       7       7
User 4       7       8
User 5       4       ?
User 6       8       7

- Only consider users who have rated both items (here: Users 2, 3, 4, and 6)
- For each such user, calculate the difference in ratings between the two items, then take the average of this difference over the users
- Pearson correlation coefficients can also be used, as in the user-based approach
- Cosine similarity over the co-rated ratings:

sim(Item 3, Item 5) = cosine((5, 7, 7, 8), (5, 7, 8, 7))
                    = (5·5 + 7·7 + 7·8 + 8·7) / (sqrt(5² + 7² + 7² + 8²) · sqrt(5² + 7² + 8² + 7²))
                    = 186 / 187 ≈ 0.995
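A minimal sketch of this cosine computation over co-rated users, reusing the illustrative `ratings` dictionary from the example-matrix sketch (the function name and layout are assumptions, not from the slides):

    from math import sqrt

    def cosine_item_similarity(item_a, item_b, ratings):
        """Cosine similarity between two items over the users who rated both."""
        co_raters = [u for u in ratings if item_a in ratings[u] and item_b in ratings[u]]
        if not co_raters:
            return 0.0
        vec_a = [ratings[u][item_a] for u in co_raters]
        vec_b = [ratings[u][item_b] for u in co_raters]
        dot = sum(a * b for a, b in zip(vec_a, vec_b))
        norm_a = sqrt(sum(a * a for a in vec_a))
        norm_b = sqrt(sum(b * b for b in vec_b))
        return dot / (norm_a * norm_b)

    # For the example matrix: sim(item3, item5) = 186 / 187 ≈ 0.995
    print(cosine_item_similarity("item3", "item5", ratings))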
Prediction: Calculating the Rating r(User 1, Item 3)

[Diagram: target Item 3 connected to its neighbours Items 1, 2, 4, and 5, each labelled with User 1's rating for that item]

r(user 1, item 3) = κ · Σ_i sim(item i, item 3) · r(user 1, item i)

where κ is a normalization factor, which is 1/[the sum of all sim(item i, item 3)], and the sum runs over the neighbour items that User 1 has rated.
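A minimal sketch of this weighted-sum prediction, reusing the illustrative `ratings` dictionary and the `cosine_item_similarity` sketch above (the function name and signature are assumptions, not from the slides):

    def predict_rating(user, target_item, ratings, item_similarity):
        """Weighted sum of the user's ratings, weighted by item-item similarity."""
        num, den = 0.0, 0.0
        for item, rating in ratings[user].items():
            if item == target_item:
                continue
            sim = item_similarity(target_item, item, ratings)
            num += sim * rating   # weighted sum of the user's known ratings
            den += sim            # normalization factor: kappa = 1 / den
        return num / den if den else None

    # Predicted rating for the running example: r(user 1, item 3)
    print(predict_rating("user1", "item3", ratings, cosine_item_similarity))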