Learning with Green's Function, with Application to Semi-Supervised Learning and Recommender Systems

Based on: Chris Ding, R. Jin, T. Li, and H. D. Simon. A Learning Framework Using Green's Function and Kernel Regularization with Application to Recommender System. KDD'07.
Outline
  Green's Function
  Graph-Based Semi-Supervised Learning with Green's Function
  Item-Based Recommendation Using Green's Function
  Extensions
Green's Function
Given a weighted graph G = (V, E):
  W: the symmetric edge-weight matrix, with W_ij the weight of edge (i, j)
  D: the diagonal degree matrix, with D_ii = Σ_j W_ij
  The graph Laplacian matrix: L = D − W
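A minimal NumPy sketch of these definitions; the toy weight matrix W below is made-up example data, not from the paper:

```python
import numpy as np

def graph_laplacian(W):
    """Graph Laplacian L = D - W, where D is the diagonal degree
    matrix with D_ii = sum_j W_ij."""
    D = np.diag(W.sum(axis=1))
    return D - W

# Toy symmetric weight matrix on a 4-node graph (illustrative only).
W = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = graph_laplacian(W)
```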
Green's Function
Defined as the inverse of L = D − W with the zero mode discarded:
  G = Σ_{λ_i > 0} (1/λ_i) v_i v_iᵀ,  where L v_i = λ_i v_i (the zero eigenvalue is dropped)
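One way to realize "inverse with the zero mode discarded" is an eigendecomposition that zeroes out the near-zero eigenvalues; the tolerance `tol` and the function name are implementation choices of this sketch, not taken from the paper:

```python
import numpy as np

def greens_function(L, tol=1e-10):
    """Green's function of the graph: invert L = D - W on the subspace
    orthogonal to its zero mode(s); eigenvalues below `tol` are discarded."""
    vals, vecs = np.linalg.eigh(L)       # L is symmetric positive semi-definite
    keep = vals > tol                    # drop the zero mode(s)
    inv_vals = np.zeros_like(vals)
    inv_vals[keep] = 1.0 / vals[keep]
    return (vecs * inv_vals) @ vecs.T    # sum_i (1/lambda_i) v_i v_i^T

# Example (composes with the Laplacian sketch above):
#   G = greens_function(graph_laplacian(W))
```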
Semi-Supervised Learning with Green's Function
The Green's function can be interpreted as an electric resistor network, and viewed as a similarity metric on the graph.
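The resistor-network reading can be made concrete: the effective resistance between nodes i and j is R_ij = G_ii + G_jj − 2 G_ij, so a large G_ij corresponds to low resistance, i.e. high similarity. A small sketch of this relation (the slide itself does not give this formula explicitly):

```python
import numpy as np

def effective_resistance(G):
    """Pairwise effective resistances R_ij = G_ii + G_jj - 2*G_ij
    computed from the Green's function (Laplacian pseudo-inverse) G."""
    d = np.diag(G)
    return d[:, None] + d[None, :] - 2 * G

# Small R_ij  <->  large similarity G_ij on the graph.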
Semi-Supervised Learning with Green's Function: Label Propagation
Given labeled and unlabeled data:
  For 2-class problems: set y_i = ±1 for labeled points and y_i = 0 for unlabeled points; propagate via f = G y and classify by the sign of f_i.
  For k-class problems: build the n×k indicator matrix Y (Y_ij = 1 if point i is labeled with class j, 0 otherwise); propagate via F = G Y and assign point i to argmax_j F_ij.
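A hedged sketch of the propagation step as read from the slide: labels enter as an indicator matrix Y, scores are F = G Y, and each point takes the class with the largest score. The function and variable names are mine, not from the paper:

```python
import numpy as np

def propagate_labels(G, labels, n_classes):
    """Green's-function label propagation.

    labels: length-n array with a class index for labeled points, -1 for unlabeled.
    Returns a predicted class index for every point.
    """
    n = len(labels)
    Y = np.zeros((n, n_classes))
    for i, c in enumerate(labels):
        if c >= 0:
            Y[i, c] = 1.0        # labeled points contribute their class indicator
    F = G @ Y                    # spread label mass through the Green's function
    return F.argmax(axis=1)

# Example on the 4-node toy graph: node 0 labeled class 0, node 3 labeled class 1.
#   pred = propagate_labels(G, labels=np.array([0, -1, -1, 1]), n_classes=2)
```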
Semi-Supervised Learning with Green's Function: Comparison with the Harmonic Function Method
  The harmonic function method is an iterative procedure, whereas the Green's function labels are computed directly.
  The Green's function method outperforms the harmonic function on 7 datasets, with 10% of the data labeled.
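For contrast, here is a sketch of the iterative harmonic-function baseline as I understand it (repeatedly average each point's score over its neighbors while clamping the labeled points); this is my paraphrase of that baseline, not code from the paper, and the iteration count is arbitrary:

```python
import numpy as np

def harmonic_function(W, y, labeled_mask, n_iter=200):
    """Iterative harmonic solution: each score is repeatedly replaced by the
    weighted average of its neighbors' scores, with labeled values clamped."""
    P = W / W.sum(axis=1, keepdims=True)    # row-normalized transition matrix
    f = y.astype(float).copy()              # y: +/-1 for labeled points, 0 otherwise
    for _ in range(n_iter):
        f = P @ f
        f[labeled_mask] = y[labeled_mask]   # clamp the labeled data each iteration
    return f
```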
Recommendation with Green's Function: Item-Based Recommendation
  Predict an unknown rating by averaging the test user's ratings of similar items.
  User-item matrix R, where R_ij is user i's rating of item j.
  Item graph G = (V, E), with edge weights given by an item-item similarity.
  Typical similarity measures: cosine similarity, conditional probability, …
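A sketch of building the item graph from the user-item matrix R with cosine similarity as the edge weight (column j of R is item j's rating vector across users); the zero-handling and the zeroed diagonal are my choices for this illustration:

```python
import numpy as np

def item_cosine_graph(R):
    """Item-item weight matrix from user-item ratings R (users x items):
    W_jk = cos(R[:, j], R[:, k]), with zero self-similarity on the diagonal."""
    norms = np.linalg.norm(R, axis=0)
    norms[norms == 0] = 1.0                  # avoid division by zero for unrated items
    S = (R.T @ R) / np.outer(norms, norms)   # cosine similarity between item columns
    np.fill_diagonal(S, 0.0)                 # no self-edges in the item graph
    return S
```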
Recommendation with Green’s Function
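The body of this slide did not survive extraction. As I read the paper, the idea is to compute the Green's function on the item graph and use it in place of the hand-picked item similarity, so that an unknown rating is a similarity-weighted average of the user's known ratings. A hedged sketch under that reading; the function name, fallback, and masking conventions are mine:

```python
import numpy as np

def predict_rating(R, G_item, user, item):
    """Item-based prediction: weighted average of the user's known ratings,
    with weights taken from the Green's function of the item graph."""
    rated = R[user] > 0                      # items this user has already rated
    w = G_item[item, rated]
    if w.sum() <= 0:
        return R[R > 0].mean()               # fall back to the global mean rating
    return float(w @ R[user, rated] / w.sum())
```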
Experiments
  Dataset: MovieLens (943 users, 1,682 movies, ratings from 1 to 5)
  Training set: 90,570 rating records
  Test set: 9,430 rating records
Recommendation with Green's Function: Results
Compared to traditional methods on two measures:
  MAE: Mean Absolute Error
  M0E: Mean Zero-one Error
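For reference, the two error measures as I read them; treating the zero-one error as "prediction rounded to the 1-5 scale misses the true rating" is my assumption:

```python
import numpy as np

def mae(pred, truth):
    """Mean Absolute Error."""
    return np.abs(pred - truth).mean()

def mean_zero_one_error(pred, truth):
    """Fraction of predictions that, after rounding, miss the true rating."""
    return (np.rint(pred) != truth).mean()
```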
Extensions
  Can semi-supervised learning and recommendation be combined?
  Can the approach be combined with other recommendation algorithms?
  Can graph-based semi-supervised learning be improved with other algorithms?
Discussion and Suggestion
Thank You!