1 Lessons from the Netflix Prize. Robert Bell, AT&T Labs-Research, in collaboration with Chris Volinsky, AT&T Labs-Research, and Yehuda Koren, Yahoo! Research

2 “We’re quite curious, really. To the tune of one million dollars.” – Netflix Prize rules
Goal: improve on Netflix’s existing movie recommendation technology
Contest began October 2, 2006
Prize
– Based on reduction in root mean squared error (RMSE) on test data
– $1,000,000 grand prize for a 10% drop in RMSE (19% for MSE)
– Or a $50,000 progress prize for the best result each year
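
Since MSE is the square of RMSE, a 10% RMSE reduction corresponds to a 1 − 0.9² ≈ 19% MSE reduction. A minimal sketch of the metric and the improvement target, assuming plain numpy arrays of ratings (the numbers below are toy values, not contest data):

```python
import numpy as np

def rmse(actual, predicted):
    """Root mean squared error over a set of ratings."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.sqrt(np.mean((actual - predicted) ** 2)))

def percent_improvement(candidate_rmse, baseline_rmse):
    """Relative RMSE reduction versus a baseline recommender."""
    return 100.0 * (baseline_rmse - candidate_rmse) / baseline_rmse

# Toy usage: the grand prize required a 10% RMSE reduction relative to Netflix's system.
y_true = [4, 3, 5, 1, 2]
y_pred = [3.8, 3.2, 4.5, 1.5, 2.2]
print(rmse(y_true, y_pred))
print(percent_improvement(candidate_rmse=0.857, baseline_rmse=0.952))  # roughly the contest scale
```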

3 Data Details
Training data
– 100 million ratings (from 1 to 5 stars)
– 6 years (2000–2005)
– 480,000 users
– 17,770 “movies”
Test data
– Last few ratings of each user
– Split as shown on the next slide

4 Test Data Split into Three Pieces
Probe
– Ratings released
– Allows participants to assess methods directly
Quiz/Test: daily submissions allowed for the combined data
– Identity of Quiz cases withheld
– RMSE released for Quiz
– Test RMSE withheld
– Prizes based on Test RMSE

5 Higher Mean Rating in Probe Data

6 Something Happened in Early 2004

7 Data about the Movies
Most Loved Movies (count of ratings / average rating)
– The Shawshank Redemption: 137,812 / 4.593
– Lord of the Rings: The Return of the King: 133,597 / 4.545
– The Green Mile: 180,883 / 4.306
– Lord of the Rings: The Two Towers: 150,676 / 4.460
– Finding Nemo: 139,050 / 4.415
– Raiders of the Lost Ark: 117,456 / 4.504
Most Rated Movies: Miss Congeniality, Independence Day, The Patriot, The Day After Tomorrow, Pretty Woman, Pirates of the Caribbean
Highest Variance: The Royal Tenenbaums, Lost in Translation, Pearl Harbor, Miss Congeniality, Napoleon Dynamite, Fahrenheit 9/11

8 Most Active Users
User ID    # Ratings   Mean Rating
305344     17,651      1.90
387418     17,432      1.81
2439493    16,560      1.22
1664010    15,811      4.26
2118461    14,829      4.08
1461435     9,820      1.37
1639792     9,764      1.33
1314869     9,739      2.95

9 Major Challenges
1. Size of data
– Places a premium on efficient algorithms
– Stretched memory limits of standard PCs
2. 99% of data are missing
– Eliminates many standard prediction methods
– Certainly not missing at random
3. Training and test data differ systematically
– Test ratings are later
– Test cases are spread uniformly across users

10 Major Challenges (cont.)
4. Countless factors may affect ratings
– Genre; movie vs. TV series vs. other
– Style of action, dialogue, plot, music, etc.
– Director, actors
– Rater’s mood
5. Large imbalance in training data
– Number of ratings per user or movie varies by several orders of magnitude
– Information to estimate individual parameters varies widely

11 Ratings per Movie in Training Data (avg # ratings/movie: 5,627)

12 Ratings per User in Training Data (avg # ratings/user: 208)

13 The Fundamental Challenge: How can we estimate as much signal as possible where there are sufficient data, without overfitting where data are scarce?

14 Recommender Systems
Personalized recommendations of items (e.g., movies) to users
Increasingly common, to deal with the explosive number of choices on the internet
– Netflix
– Amazon
– Many others

15 Content-Based Systems
A pre-specified list of attributes
Score each item on all attributes
User interest obtained for the same attributes
– Direct solicitation, or
– Estimated from user ratings, purchases, or other behavior
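
A minimal sketch of the content-based matching idea described above, assuming made-up attribute names, toy scores, and a simple dot-product match (not any particular production system):

```python
import numpy as np

# Hypothetical pre-specified attributes (illustration only).
ATTRIBUTES = ["action", "romance", "comedy", "orchestral_score"]

# Each item is scored on every attribute.
item_profiles = {
    "Movie A": np.array([0.9, 0.1, 0.3, 0.7]),
    "Movie B": np.array([0.1, 0.8, 0.6, 0.2]),
}

# User interest in the same attributes, solicited directly or estimated from behavior.
user_interest = np.array([0.2, 0.9, 0.4, 0.1])

# Recommend items with the highest attribute-match score.
scores = {title: float(profile @ user_interest) for title, profile in item_profiles.items()}
print(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))
```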

16 Pandora
Music recommendation system
Songs rated on 400+ attributes
– Music Genome Project
– Roots, instrumentation, lyrics, vocals
Two types of user feedback
– Seed songs
– Thumbs up/down for recommended songs

17 Collaborative Filtering (CF)
Avoids need for:
– Determining “proper” content
– Collecting information about items or users
Infers user-item relationships from purchases or ratings
Used by Amazon and Netflix
Two main CF tools
– Nearest neighbors
– Latent factor models

18 Nearest Neighbor Methods
Most common CF tool at the beginning of the contest
Predict the rating for a specific user-item pair based on ratings of
– Similar items
– By the same user
– Or vice versa (similar users, same item)
Similarity measured by Pearson correlation or cosine similarity
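
A minimal item-based neighborhood sketch under common simplifying assumptions: a small dense ratings matrix with 0 for missing entries and raw cosine similarity (real systems use sparse storage, mean-centering, and shrinkage of the similarities):

```python
import numpy as np

def predict_item_knn(R, user, item, k=20):
    """Predict R[user, item] from the user's ratings of the k most similar items.

    R is a (num_users x num_items) array with 0 marking missing ratings.
    Similarity here is cosine over item columns; Pearson correlation is the
    other common choice mentioned on the slide.
    """
    rated = np.flatnonzero(R[user])                    # items this user has rated
    target_col = R[:, item]
    sims = np.empty(len(rated))
    for idx, j in enumerate(rated):
        col = R[:, j]
        denom = np.linalg.norm(target_col) * np.linalg.norm(col)
        sims[idx] = (target_col @ col) / denom if denom else 0.0
    top = np.argsort(-sims)[:k]                        # k most similar rated items
    weights, ratings = sims[top], R[user, rated[top]]
    total = weights.sum()
    return float(ratings.mean()) if total == 0 else float(weights @ ratings / total)

# Toy usage: 4 users x 5 movies, 0 = unrated.
R = np.array([[5, 4, 0, 1, 0],
              [4, 5, 1, 0, 2],
              [1, 0, 5, 4, 5],
              [0, 1, 4, 5, 4]], dtype=float)
print(predict_item_knn(R, user=0, item=2, k=2))
```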

19 Merits of Nearest Neighbors
Few modeling assumptions
Few tuning parameters to learn
Easy to explain to users:
“Dear Amazon.com Customer, We've noticed that customers who have purchased or rated How Does the Show Go On: An Introduction to the Theater by Thomas Schumacher have also purchased Princess Protection Program #1: A Royal Makeover (Disney Early Readers).”

20 Latent Factor Models
Models with latent classes of items and users
– Individual items and users are assigned to either a single class or a mixture of classes
Neural networks
– Restricted Boltzmann machines
Singular Value Decomposition (SVD)
– AKA matrix factorization
– Items and users described by unobserved factors
– Main method used by leaders of the competition

21 SVD
Dimension reduction technique for matrices
Each item i summarized by a d-dimensional vector q_i
Similarly, each user u summarized by p_u
Choose d much smaller than the number of items or users
– e.g., d = 50 << 18,000 or 480,000
Predicted rating for item i by user u: the inner product of q_i and p_u, i.e., r̂_ui = q_i · p_u
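
A minimal sketch of the prediction step only, assuming the factor matrices have already been learned (toy sizes and random factors for illustration; in the contest d was on the order of 50 against roughly 480,000 users and 17,770 movies):

```python
import numpy as np

# Toy sizes; real factor matrices would be learned, not random.
num_users, num_items, d = 1000, 200, 50
rng = np.random.default_rng(0)
P = rng.normal(scale=0.1, size=(num_users, d))   # user factors p_u
Q = rng.normal(scale=0.1, size=(num_items, d))   # item factors q_i

def predict(u, i):
    """Predicted rating for item i by user u: the inner product of q_i and p_u."""
    return float(Q[i] @ P[u])

print(predict(u=3, i=7))
```

In practice a global mean and per-user/per-item bias terms are added to this inner product; the sketch omits them for brevity.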

22–23 [Figure, shown over two slides: movies, and then users Gus and Dave, placed in a two-dimensional latent factor space. One axis runs from “geared towards females” to “geared towards males”, the other from “serious” to “escapist”. Example movies: The Princess Diaries, The Lion King, Braveheart, Lethal Weapon, Independence Day, Amadeus, The Color Purple, Dumb and Dumber, Ocean’s 11, Sense and Sensibility.]

24 Regularization for SVD
Want to minimize SSE for the Test data
One idea: minimize SSE for the Training data
– Want large d to capture all of the signal
– But Test RMSE begins to rise for d > 2
Regularization is needed
– Allow a rich model where there are sufficient data
– Shrink aggressively where data are scarce
Minimize the penalized training error: Σ_(u,i) (r_ui − q_i · p_u)² + λ (‖q_i‖² + ‖p_u‖²), summing over observed ratings

25–28 [Figure, repeated over four build-up slides following the regularization discussion: the same two-dimensional latent factor map (“geared towards females” vs. “geared towards males”, “serious” vs. “escapist”) with the same example movies, now showing only user Gus.]

29 Estimation for SVD
Fit by gradient descent
– Loop over observed ratings
– Update each relevant parameter
– Small step in each parameter, proportional to the gradient
– Repeat until convergence
Alternatively, fit by a sequence of ridge regressions
– Fix the item factors
– Loop over users, estimating user factors
– Do the same to estimate item factors
– Repeat until convergence
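
A minimal sketch of the gradient-descent recipe above, applied to the regularized objective from slide 24 (stochastic updates per observed rating; the learning rate, regularization weight, and epoch count are illustrative, and a real implementation would also fit global and per-user/per-item bias terms):

```python
import numpy as np

def fit_svd_sgd(ratings, num_users, num_items, d=50, lr=0.005, reg=0.02, epochs=20, seed=0):
    """Stochastic gradient descent for the regularized SVD objective.

    ratings: iterable of (user, item, rating) triples observed in training.
    lr, reg, epochs are illustrative values only.
    """
    rng = np.random.default_rng(seed)
    P = rng.normal(scale=0.1, size=(num_users, d))   # user factors p_u
    Q = rng.normal(scale=0.1, size=(num_items, d))   # item factors q_i
    for _ in range(epochs):                          # "repeat until convergence"
        for u, i, r in ratings:                      # loop over observed ratings
            pu, qi = P[u].copy(), Q[i].copy()
            err = r - qi @ pu                        # prediction error
            # Small step in each parameter, proportional to the gradient.
            P[u] += lr * (err * qi - reg * pu)
            Q[i] += lr * (err * pu - reg * qi)
    return P, Q

# Toy usage with a handful of (user, item, rating) triples.
triples = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (1, 2, 1.0), (2, 1, 2.0)]
P, Q = fit_svd_sgd(triples, num_users=3, num_items=3, d=2, epochs=200)
print(Q[0] @ P[0])   # prediction for the first training pair after fitting
```

The ridge-regression alternative (alternating least squares) fixes Q, solves a small regularized least-squares problem for each user's p_u over that user's rated movies, then swaps the roles of users and items; the localized SVD sketch after slide 31 shows what one such ridge solve looks like.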

30 Improvements to Collaborative Filtering
– Fine-tune existing methods
– Incorporate alternative “effects”
– Incorporate a variety of modeling methods
– Careful regularization to avoid overfitting

31 Localized SVD
SVD uses all of a user’s ratings to train the user’s factors
But what if the user is multiple people?
– Different factor values may apply to movies rated by Mom vs. Dad vs. the Kids
This approach computes user factors p_u specific to the movie being predicted
– Given all the {q_i}, p_u is the solution of a ridge regression
– Weighted ridge regressions, with higher weights for movies similar to the target movie
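
A minimal sketch of the weighted ridge-regression step, assuming the item factors {q_i} are held fixed and that `weights` already encodes how similar each rated movie is to the target movie (the slide does not specify the exact weighting scheme or regularization strength, so those are placeholders):

```python
import numpy as np

def localized_user_factor(Q_rated, r_user, weights, reg=10.0):
    """Solve a weighted ridge regression for p_u specific to one target movie.

    Q_rated : (n, d) factors of the movies this user rated
    r_user  : (n,)  the user's ratings of those movies
    weights : (n,)  higher weight for movies similar to the target movie
    """
    d = Q_rated.shape[1]
    W = np.diag(weights)
    A = Q_rated.T @ W @ Q_rated + reg * np.eye(d)
    b = Q_rated.T @ W @ r_user
    return np.linalg.solve(A, b)          # p_u; the prediction is q_target @ p_u

# Toy usage: 6 rated movies, d = 4 factors, heavier weight on the first two movies.
rng = np.random.default_rng(0)
Qr = rng.normal(scale=0.1, size=(6, 4))
r = np.array([5.0, 4.0, 3.0, 2.0, 4.0, 1.0])
w = np.array([1.0, 0.9, 0.2, 0.2, 0.1, 0.1])
print(localized_user_factor(Qr, r, w))
```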

32 Improvement from Localized SVD

33 Lesson #1: Data >> Models
Very limited feature set
– User, movie, date
– Places the focus on models/algorithms
Major steps forward were associated with incorporating new data features
– What movies a user rated
– Temporal effects

34 You Are What You Rate
What you rate (and don’t) provides information about your preferences
Paterek’s NSVD explicitly characterizes users by which movies they like
Incorporating what a user rated into the user factor substantially reduces RMSE
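
A sketch in the spirit of Paterek's NSVD and the later SVD++ formulation: the user representation is built, at least in part, from per-item vectors of the movies the user chose to rate, so the act of rating itself carries signal. The normalization by the square root of the number of rated items is the commonly published form, not necessarily the exact variant used in the talk:

```python
import numpy as np

def implicit_user_factor(rated_items, Y):
    """Represent a user by the normalized sum of vectors y_j of rated items.

    rated_items : indices of movies the user rated (regardless of the score given)
    Y           : (num_items, d) implicit-feedback item vectors, learned like Q
    """
    return Y[rated_items].sum(axis=0) / np.sqrt(len(rated_items))

def predict_with_implicit(p_u, rated_items, Y, q_i):
    """Prediction using the explicit p_u augmented by the implicit part."""
    return float(q_i @ (p_u + implicit_user_factor(rated_items, Y)))

# Toy usage with random vectors standing in for learned parameters.
rng = np.random.default_rng(0)
Y = rng.normal(scale=0.1, size=(100, 8))          # implicit item vectors
p_u = rng.normal(scale=0.1, size=8)               # explicit user factor
q_i = rng.normal(scale=0.1, size=8)               # target item factor
print(predict_with_implicit(p_u, rated_items=[3, 17, 42], Y=Y, q_i=q_i))
```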

35 Temporal Effects
User behavior may change over time
– Ratings go up or down
– Interests change
– For example, with the addition of a new rater
Allow user biases and/or factors to change over time
– Model a_u(t) and p_u(t) as linear, unrestricted, or a sum of both types
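
A minimal sketch of a linear-in-time user bias, following the commonly published parameterization b_u(t) = b_u + α_u · dev_u(t); the damping exponent and all numeric values below are illustrative assumptions, not figures from the talk:

```python
import numpy as np

def time_dev(t, t_user_mean, beta=0.4):
    """Signed, damped deviation of rating date t from the user's mean rating date (days)."""
    diff = t - t_user_mean
    return np.sign(diff) * abs(diff) ** beta

def user_bias_at(t, b_u, alpha_u, t_user_mean):
    """Linear-in-time user bias: a static term plus a slope on the date deviation."""
    return b_u + alpha_u * time_dev(t, t_user_mean)

# Example: a user whose ratings drift upward over time.
print(user_bias_at(t=200.0, b_u=0.1, alpha_u=0.02, t_user_mean=50.0))
```

The same idea can be applied to the user factors p_u(t), either with a linear drift term, with fully unrestricted per-date parameters, or with a sum of both, as the slide notes.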

36–39 [Figure, repeated over four build-up slides following the temporal-effects discussion: the same two-dimensional latent factor map with the example movies and user Gus; the final frame adds a “+” marker near Gus.]

40 #2: The Power of Regularized SVD Fit by Gradient Descent
Allowed anyone to approach the early leaders
– Powerful predictor
– Efficient
– Easy to program
Flexibility to incorporate additional features
– Implicit feedback
– Temporal effects
– Neighborhood effects
Accurate regularization is essential

42 #3: The Wisdom of Crowds (of Models)
“All models are wrong; some are useful.” – G. Box
Used linear blends of many prediction sets
– 107 in Year 1
– Over 800 at the end
Difficult, or impossible, to build one grand unified model
Mega-blends are not needed in practice
– A handful of simple models achieves 80 percent of the improvement of the full blend
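
A minimal sketch of a linear blend: fit blending weights by ridge regression of the true probe ratings on each model's probe predictions, then apply the same weights to the quiz/test predictions. The exact regression and feature set the teams used are not specified here; this is the generic form:

```python
import numpy as np

def fit_blend(pred_probe, y_probe, reg=1e-3):
    """pred_probe: (n_ratings, n_models) model predictions on the probe set."""
    X = np.column_stack([np.ones(len(y_probe)), pred_probe])   # intercept + one column per model
    A = X.T @ X + reg * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ y_probe)                   # blend weights

def apply_blend(weights, pred_new):
    """Apply the learned weights to predictions on new (quiz/test) cases."""
    X = np.column_stack([np.ones(len(pred_new)), pred_new])
    return X @ weights

# Toy usage: blend three noisy models' probe predictions against true probe ratings.
rng = np.random.default_rng(0)
y = rng.uniform(1, 5, size=50)
preds = np.column_stack([y + rng.normal(0, s, 50) for s in (0.6, 0.8, 1.0)])
w = fit_blend(preds, y)
print(apply_blend(w, preds)[:3])
```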

43 #4: Find Good Teammates
Yehuda Koren
– The engine of progress for the Netflix Prize
– Implicit feedback
– Temporal effects
– Nearest neighbor modeling
Big Chaos: Michael Jahrer, Andreas Toscher (Year 2)
– Optimization of tuning parameters
– Blending methods
Pragmatic Theory: Martin Chabbert, Martin Piotte (Year 3)
– Some movies age better than others
– Link functions

44 The Final Leaderboard

45–48 Test Set Results (revealed over four slides)
– The Ensemble: 0.856714
– BellKor’s Pragmatic Chaos: 0.856704
– Both scores round to 0.8567
– The tie-breaker is submission date/time

49 Final Test Set Leaderboard

50 Who Got the Money?
AT&T donated its full share to organizations supporting science education:
– Young Science Achievers Program
– New Jersey Institute of Technology pre-college and educational opportunity programs
– North Jersey Regional Science Fair
– Neighborhoods Focused on African American Youth

51 #5: Is This the Way to Do Science?
Big success for Netflix
– Lots of cheap labor, good publicity
– Already incorporated a 6 percent improvement
– Potential for much more using other data they have
Big advances to the science of recommender systems
– Regularized SVD
– Identification of new features
– Understanding nearest neighbors
– Contributions to the literature

52 Why Did This Work So Well?
Industrial-strength data
Very good design
Accessibility to anyone with a PC
Free flow of ideas
– Leaderboard
– Forum
– Workshop and papers
Money?

53 But There Are Limitations
– Need a conceptually simple task
– Winner-take-all has drawbacks
– Intellectual property and liability issues
– How many prizes can overlap?

54 Thank You!
rbell@research.att.com
www.netflixprize.com
– …/leaderboard
– …/community
Click BellKor’s Pragmatic Chaos or The Ensemble on the Leaderboard for details

