GAUSSIAN PROCESS FACTORIZATION MACHINES FOR CONTEXT-AWARE RECOMMENDATIONS Trung V. Nguyen, Alexandros Karatzoglou, Linas Baltrunas SIGIR 2014 Presentation: Vu Tien Duong

Similar presentations
Pseudo-Relevance Feedback For Multimedia Retrieval By Rong Yan, Alexander G. and Rong Jin Mwangi S. Kariuki

Neural networks Introduction Fitting neural networks
Biointelligence Laboratory, Seoul National University
1 Learning User Interaction Models for Predicting Web Search Result Preferences Eugene Agichtein Eric Brill Susan Dumais Robert Ragno Microsoft Research.
The Wisdom of the Few A Collaborative Filtering Approach Based on Expert Opinions from the Web Xavier Amatriain Telefonica Research Nuria Oliver Telefonica.
Lecture 14: Collaborative Filtering Based on Breese, J., Heckerman, D., and Kadie, C. (1998). Empirical analysis of predictive algorithms for collaborative.
Collaborative Ordinal Regression Shipeng Yu Joint work with Kai Yu, Volker Tresp and Hans-Peter Kriegel University of Munich, Germany Siemens Corporate.
Pattern Recognition. Introduction. Definitions.. Recognition process. Recognition process relates input signal to the stored concepts about the object.
1 Collaborative Filtering: Latent Variable Model LIU Tengfei Computer Science and Engineering Department April 13, 2011.
Ensemble Learning (2), Tree and Forest
Jeff Howbert Introduction to Machine Learning Winter Machine Learning Feature Creation and Selection.
Collaborative Filtering Matrix Factorization Approach
Item-based Collaborative Filtering Recommendation Algorithms
Cao et al. ICML 2010 Presented by Danushka Bollegala.
A NON-IID FRAMEWORK FOR COLLABORATIVE FILTERING WITH RESTRICTED BOLTZMANN MACHINES Kostadin Georgiev, VMware Bulgaria Preslav Nakov, Qatar Computing Research.
Performance of Recommender Algorithms on Top-N Recommendation Tasks RecSys 2010 Intelligent Database Systems Lab. School of Computer Science & Engineering.
Introduction to variable selection I Qi Yu. 2 Problems due to poor variable selection: Input dimension is too large; the curse of dimensionality problem.
WEMAREC: Accurate and Scalable Recommendation through Weighted and Ensemble Matrix Approximation Chao Chen, Dongsheng Li
Yan Yan, Mingkui Tan, Ivor W. Tsang, Yi Yang,
Focused Matrix Factorization for Audience Selection in Display Advertising BHARGAV KANAGAL, AMR AHMED, SANDEEP PANDEY, VANJA JOSIFOVSKI, LLUIS GARCIA-PUEYO,
Wancai Zhang, Hailong Sun, Xudong Liu, Xiaohui Guo.
Improving Web Search Ranking by Incorporating User Behavior Information Eugene Agichtein Eric Brill Susan Dumais Microsoft Research.
RecSys 2011 Review Qi Zhao Outline Overview Sessions – Algorithms – Recommenders and the Social Web – Multi-dimensional Recommendation, Context-
Training and Testing of Recommender Systems on Data Missing Not at Random Harald Steck at KDD, July 2010 Bell Labs, Murray Hill.
EMIS 8381 – Spring Netflix and Your Next Movie Night Nonlinear Programming Ron Andrews EMIS 8381.
Outline 1-D regression Least-squares Regression Non-iterative Least-squares Regression Basis Functions Overfitting Validation 2.
CIKM’09 Date:2010/8/24 Advisor: Dr. Koh, Jia-Ling Speaker: Lin, Yi-Jhen 1.
GAPfm: Optimal Top-N Recommendations for Graded Relevance Domains Yue Shi, Alexandros Karatzoglou, Linas Baltrunas, Martha Larson, Alan Hanjalic.
Chengjie Sun, Lei Lin, Yuan Chen, Bingquan Liu Harbin Institute of Technology School of Computer Science and Technology.
User Interests Imbalance Exploration in Social Recommendation: A Fitness Adaptation Authors : Tianchun Wang, Xiaoming Jin, Xuetao Ding, and Xiaojun Ye.
Shanda Innovations Context-aware Ensemble of Multifaceted Factorization Models for Recommendation Kevin Y. W. Chen.
Online Learning for Collaborative Filtering
CISC Machine Learning for Solving Systems Problems Presented by: Alparslan SARI Dept of Computer & Information Sciences University of Delaware
CS 782 – Machine Learning Lecture 4 Linear Models for Classification  Probabilistic generative models  Probabilistic discriminative models.
EigenRank: A Ranking-Oriented Approach to Collaborative Filtering IDS Lab. Seminar Spring 2009 강민석 May 21st, 2009 Nathan.
Multifactor GPs Suppose now we wish to model different mappings for different styles. We will add a latent style vector s along with x, and define the.
COT: Contextual Operating Tensor for Context-aware Recommender Systems Center for Research on Intelligent Perception And Computing (CRIPAC) National Lab.
CiteSight: Contextual Citation Recommendation with Differential Search Avishay Livne 1, Vivek Gokuladas 2, Jaime Teevan 3, Susan Dumais 3, Eytan Adar 1.
EigenRank: A ranking oriented approach to collaborative filtering By Nathan N. Liu and Qiang Yang Presented by Zachary 1.
Effective Automatic Image Annotation Via A Coherent Language Model and Active Learning Rong Jin, Joyce Y. Chai Michigan State University Luo Si Carnegie.
Xutao Li1, Gao Cong1, Xiao-Li Li2
Pairwise Preference Regression for Cold-start Recommendation Speaker: Yuanshuai Sun
Automating Readers’ Advisory to Make Book Recommendations for K-12 Readers by Alicia Wood.
Recommender Systems with Social Regularization Hao Ma, Dengyong Zhou, Chao Liu Microsoft Research Michael R. Lyu The Chinese University of Hong Kong Irwin.
FISM: Factored Item Similarity Models for Top-N Recommender Systems
GeoMF: Joint Geographical Modeling and Matrix Factorization for Point-of-Interest Recommendation Defu Lian, Cong Zhao, Xing Xie, Guangzhong Sun, EnhongChen,
A Novel Relational Learning-to- Rank Approach for Topic-focused Multi-Document Summarization Yadong Zhu, Yanyan Lan, Jiafeng Guo, Pan Du, Xueqi Cheng Institute.
NTU & MSRA Ming-Feng Tsai
Collaborative Filtering via Euclidean Embedding M. Khoshneshin and W. Street Proc. of ACM RecSys, pp , 2010.
ICONIP 2010, Sydney, Australia 1 An Enhanced Semi-supervised Recommendation Model Based on Green’s Function Dingyan Wang and Irwin King Dept. of Computer.
Tree and Forest Classification and Regression Tree Bagging of trees Boosting trees Random Forest.
A Method to Approximate the Bayesian Posterior Distribution in Singular Learning Machines Kenji Nagata, Sumio Watanabe Tokyo Institute of Technology.
Hao Ma, Dengyong Zhou, Chao Liu Microsoft Research Michael R. Lyu
1 Dongheng Sun 04/26/2011 Learning with Matrix Factorizations By Nathan Srebro.
StressSense: Detecting Stress in Unconstrained Acoustic Environments using Smartphones Hong Lu, Mashfiqui Rabbi, Gokul T. Chittaranjan, Denise Frauendorfer,
Neural Collaborative Filtering
Recommending Forum Posts to Designated Experts
Deep Feedforward Networks
LECTURE 11: Advanced Discriminant Analysis
Learning Recommender Systems with Adaptive Regularization
Asymmetric Correlation Regularized Matrix Factorization for Web Service Recommendation Qi Xie1, Shenglin Zhao2, Zibin Zheng3, Jieming Zhu2 and Michael.
Location Recommendation — for Out-of-Town Users in Location-Based Social Network Yina Meng.
Probabilistic Models with Latent Variables
Movie Recommendation System
Overfitting and Underfitting
Probabilistic Latent Preference Analysis
Ensemble learning Reminder - Bagging of Trees Random Forest
Ch 3. Linear Models for Regression (2/2) Pattern Recognition and Machine Learning, C. M. Bishop, Previously summarized by Yung-Kyun Noh Updated.
Jonathan Elsas LTI Student Research Symposium Sept. 14, 2007
Learning to Rank with Ties
Presentation transcript:

GAUSSIAN PROCESS FACTORIZATION MACHINES FOR CONTEXT-AWARE RECOMMENDATIONS Trung V. Nguyen, Alexandros Karatzoglou, Linas Baltrunas SIGIR 2014 Presentation: Vu Tien Duong

CONTENT Introduction Gaussian processes GPFM GPPW Evaluation Conclusion

Introduction Context: the environment in which a recommendation is provided. Multidimensional latent factors: variables are represented as latent features in a low-dimensional space. Context-aware recommendation (CAR): the user-item-context interactions are modeled with factor models such as Tensor Factorization and Factorization Machines.
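As a point of reference, here is a minimal sketch (not the authors' code) of how a factorization-machine style model scores a single (user, item, context) triple using pairwise dot products of latent vectors; the latent dimensionality and values are illustrative assumptions.

```python
# Hypothetical sketch of a factorization-machine style utility for one
# (user, item, context) triple: a sum of pairwise dot products of latent
# vectors. Latent dimension and values are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
d = 8                                   # assumed latent dimensionality
v_user, v_item, v_ctx = rng.normal(size=(3, d))

# FM restricts interactions to pairwise (multi)linear terms.
utility = v_user @ v_item + v_user @ v_ctx + v_item @ v_ctx
print(f"FM-style utility: {utility:.3f}")
```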

Introduction (1) Problem: given the many possible types of interactions between users, items and contextual variables, it seems unrealistic to restrict these interactions to be linear. Solution: Gaussian Process Factorization Machines (GPFM), a non-linear context-aware collaborative filtering method based on Gaussian Processes.
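A hedged sketch of where the non-linearity comes from: placing a GP prior (here with an RBF kernel, one possible covariance choice) over the utilities of one user's (item, context) pairs, with the kernel applied to concatenated latent vectors. Names and dimensions are assumptions, not the paper's exact parameterization.

```python
# Hedged sketch: a per-user GP prior over utilities of (item, context) pairs.
# Kernel inputs are concatenated item and context latent vectors; the RBF
# kernel makes the modeled interactions non-linear. Illustrative only.
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between rows of A and rows of B."""
    sq_dist = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * sq_dist / lengthscale ** 2)

rng = np.random.default_rng(1)
d, n_pairs = 4, 5                              # assumed sizes
item_lat = rng.normal(size=(n_pairs, d))       # latent vectors of items
ctx_lat = rng.normal(size=(n_pairs, d))        # latent vectors of contexts
X = np.hstack([item_lat, ctx_lat])             # one row per (item, context) pair

K = rbf(X, X)                                  # GP prior covariance of the utilities
f = rng.multivariate_normal(np.zeros(n_pairs), K + 1e-8 * np.eye(n_pairs))
print("sampled utilities:", np.round(f, 3))
```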

Introduction (2) Contributions: applicable to both explicit and implicit feedback; uses stochastic gradient descent (SGD) optimization to keep the model scalable; the first GP-based approach to context-aware recommendations.

Steps of the method: 1. Convert the observed data to a latent representation. 2. Specify the prior and likelihood of the model. 3. Learn the latent features and hyperparameters. 4. Predict utility. (A toy sketch of this pipeline follows.)
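The sketch below walks through the four steps on toy data using plain GP regression over latent features; it is illustrative only, since the paper's model is per-user and uses its own kernel and likelihood choices, and learning is stubbed out.

```python
# Toy end-to-end illustration of the four steps (not the paper's code).
import numpy as np

rng = np.random.default_rng(2)
n_items, n_ctx, d = 6, 3, 4                    # assumed sizes

# 1. Latent representation: look up and concatenate latent vectors.
item_lat = rng.normal(size=(n_items, d))
ctx_lat = rng.normal(size=(n_ctx, d))
def to_latent(item, ctx):
    return np.concatenate([item_lat[item], ctx_lat[ctx]])

# Observed (item, context, rating) triples for one user (toy data).
obs = [(0, 0, 4.0), (1, 1, 2.0), (2, 2, 5.0), (3, 0, 3.0)]
X = np.stack([to_latent(i, c) for i, c, _ in obs])
y = np.array([r for _, _, r in obs])

# 2. Prior and likelihood: GP prior with an RBF kernel, Gaussian noise.
def rbf(A, B, ls=1.0):
    return np.exp(-0.5 * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1) / ls ** 2)
noise = 0.1
K = rbf(X, X) + noise ** 2 * np.eye(len(X))

# 3. Learning (omitted here): fit latent features and hyperparameters, e.g. by SGD.

# 4. Predict the utility of an unseen (item, context) pair.
x_star = to_latent(4, 1)[None, :]
mu = rbf(x_star, X) @ np.linalg.solve(K, y)
print("predicted utility:", mu.item())
```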

Gaussian Processes Widely used for modeling relational data; use flexible covariance functions; an important tool for modeling complex non-linear patterns.

Gaussian Processes (1)
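For reference, a Gaussian process is fully specified by a mean function m and a covariance (kernel) function k; in standard textbook notation (not necessarily the slide's symbols):

```latex
f(\mathbf{x}) \sim \mathcal{GP}\big(m(\mathbf{x}),\, k(\mathbf{x},\mathbf{x}')\big),
\qquad
\mathbf{f} = \big(f(\mathbf{x}_1),\dots,f(\mathbf{x}_n)\big)^\top \sim \mathcal{N}(\mathbf{m}, \mathbf{K}),
\quad \mathbf{m}_i = m(\mathbf{x}_i),\; \mathbf{K}_{ij} = k(\mathbf{x}_i,\mathbf{x}_j).
```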

Gaussian Process Factorization Machines (GPFM) First, convert the data to a latent representation. Applying GPs to CAR, GPFM is specified by its prior and likelihood. Many types of covariance function k can be chosen.
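A sketch of this specification in generic notation (the paper's exact symbols may differ): each user i gets a GP prior over a utility function of the joint item-context latent representation, with a Gaussian likelihood for the observed ratings.

```latex
f^{i} \sim \mathcal{GP}\big(0,\, k(\mathbf{x},\mathbf{x}')\big),
\qquad
y_{ijc} = f^{i}(\mathbf{x}_{jc}) + \varepsilon,
\quad \varepsilon \sim \mathcal{N}(0, \sigma^2),
```

where x_{jc} collects the latent features of item j and context c, and k can be any valid covariance function (e.g. RBF or linear).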

Pairwise Comparison for Implicit Feedback A pairwise comparison (j1, c1) >_i (j2, c2) states that user i has higher utility for item j1 in context c1 than for item j2 in context c2.
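A small, hypothetical sketch of how such comparisons can be derived from one user's ratings (a pair is preferred when its rating is strictly higher); this mirrors the "converted" implicit datasets described in the evaluation, not the authors' exact preprocessing.

```python
# Hypothetical sketch: derive pairwise comparisons (j1, c1) >_i (j2, c2) for one
# user from rated (item, context, rating) triples. Not the authors' preprocessing.
from itertools import combinations

ratings = [                 # (item, context, rating) for user i -- toy data
    ("j1", "c1", 5),
    ("j2", "c2", 3),
    ("j3", "c1", 4),
]

pairs = [
    ((a[0], a[1]), (b[0], b[1])) if a[2] > b[2] else ((b[0], b[1]), (a[0], a[1]))
    for a, b in combinations(ratings, 2)
    if a[2] != b[2]          # ties give no preference
]
for preferred, other in pairs:
    print(f"{preferred} > {other}")
```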

Pairwise Preference Model (GPPW) Similar to GPFM, but defined over pairwise comparisons rather than individual utility scores.

LEARNING
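For context, gradient-based learning (such as the SGD mentioned in the contributions) fits the latent features and kernel hyperparameters by maximizing GP log marginal likelihoods; the standard single-GP form is shown below, and the paper's per-user objectives would take this form (summed over users).

```latex
\log p(\mathbf{y} \mid X, \boldsymbol{\theta})
= -\tfrac{1}{2}\,\mathbf{y}^\top \big(\mathbf{K} + \sigma^2 \mathbf{I}\big)^{-1} \mathbf{y}
  - \tfrac{1}{2}\,\log\big|\mathbf{K} + \sigma^2 \mathbf{I}\big|
  - \tfrac{n}{2}\,\log 2\pi .
```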

PREDICTIVE DISTRIBUTION
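For reference, the standard GP predictive distribution at a new latent input x_* is given below; the per-user predictive distribution in GPFM would take the same form.

```latex
f_* \mid X, \mathbf{y}, \mathbf{x}_*
\sim \mathcal{N}\!\Big(
  \mathbf{k}_*^\top \big(\mathbf{K} + \sigma^2 \mathbf{I}\big)^{-1} \mathbf{y},\;
  k(\mathbf{x}_*, \mathbf{x}_*) - \mathbf{k}_*^\top \big(\mathbf{K} + \sigma^2 \mathbf{I}\big)^{-1} \mathbf{k}_*
\Big),
\qquad
\mathbf{k}_* = \big(k(\mathbf{x}_*, \mathbf{x}_1), \dots, k(\mathbf{x}_*, \mathbf{x}_n)\big)^\top .
```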

Evaluation
Implicit datasets: FRAPPE, converted FOOD, converted COMODA
Explicit datasets: ADOM, COMODA, FOOD, SUSHI
Compared methods: fm, multiverse, mf, constant
Metrics: overall quality (MAE, RMSE); top items in a list (NDCG, ERR)

Dataset            Detail
ADOM               Movies (from students)
COMODA             Movies
FOOD               Food menus
SUSHI              Sushi types (from Japanese)
FRAPPE             Android applications
Converted FOOD     Rating > 3 is treated as positive
Converted COMODA   Rating > 4 is treated as positive

Evaluation Split each dataset into 5 folds and iterate 5 times: each time, one fold is used for testing and 4 for training. Parameters are tuned empirically using one fold as validation, then fixed for the experiments on the other 4 folds. The reported performance is the average over the 5 folds. (A sketch of this protocol follows.)
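An illustrative sketch of the 5-fold protocol, with a trivial mean-rating predictor and MAE standing in for the real models and metrics (both are placeholders, not the paper's implementations).

```python
# Illustrative sketch of the 5-fold protocol described above.
import numpy as np

rng = np.random.default_rng(3)
ratings = rng.integers(1, 6, size=100).astype(float)    # toy ratings in 1..5
folds = np.array_split(rng.permutation(len(ratings)), 5)

maes = []
for i, test_idx in enumerate(folds):
    train_idx = np.concatenate([f for j, f in enumerate(folds) if j != i])
    prediction = ratings[train_idx].mean()               # placeholder "model"
    maes.append(np.abs(ratings[test_idx] - prediction).mean())
print("MAE averaged over 5 folds:", round(float(np.mean(maes)), 3))
```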

Evaluation of GPFM for Explicit Feedback The context-aware methods gpfm and fm significantly outperform the context-agnostic mf => benefit of contextual information. Multiverse outperforms mf on ADOM and FOOD but performs poorly on COMODA and SUSHI => it struggles with high-dimensional context. gpfm significantly outperforms fm (the best competing context-aware method) => non-linear modeling with GPFM leads to substantial performance gains in CAR.

GPFM - Explicit Feedback

Evaluation of GPPW for Implicit Feedback gppw significantly outperforms both gpfm and fm on FOOD and COMODA, and is comparable on FRAPPE. Learning with paired comparisons can lead to substantial improvements in ranking (compared to optimizing item-based scores). GPPW is more effective than GPFM in the implicit feedback setting, with little computational overhead.

CONCLUSION The utility of an item under a context is modeled as a function over the latent feature space of the item and the context. By introducing Gaussian processes as priors for these utility functions, GPFM captures complex, non-linear user-item-context interactions, leading to powerful and flexible modeling capacity.