1
Matrix Factorization and its applications By Zachary, 16th Nov 2010
2
Outline Expressive power of matrices Various matrix factorization methods Applications of matrix factorization
3
What can a matrix represent? Systems of equations User rating matrices Images Matrix structures in graph theory ◦ Adjacency matrix ◦ Distance matrix
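For illustration (not part of the original slides), here is a tiny graph's adjacency matrix and its shortest-path distance matrix in NumPy/SciPy:

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path

# Undirected path graph on 4 nodes: edges (0,1), (1,2), (2,3).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])          # adjacency matrix: A[i, j] = 1 iff i and j are connected

D = shortest_path(A, directed=False)  # pairwise shortest-path distance matrix
print(D)                              # e.g. distance from node 0 to node 3 is 3
```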
4
Different matrix factorization methods LU decomposition Singular Value Decomposition (SVD) Probabilistic Matrix Factorization (PMF) Non-negative Matrix Factorization (NMF)
5
Applications of matrix factorization LU decomposition ◦ Solving systems of linear equations SVD ◦ Low-rank matrix approximation ◦ Pseudo-inverse
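As a brief illustration (not in the original slides), the following NumPy/SciPy sketch shows both uses: solving a linear system from an LU factorization, and using the SVD for a rank-k approximation and the Moore-Penrose pseudo-inverse. The small matrices are made-up examples.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Solve Ax = b via LU decomposition (toy 2x2 system).
A = np.array([[4.0, 3.0], [6.0, 3.0]])
b = np.array([10.0, 12.0])
lu, piv = lu_factor(A)            # factor A = P L U once
x = lu_solve((lu, piv), b)        # reuse the factorization for each right-hand side

# Low-rank approximation and pseudo-inverse via SVD.
M = np.random.rand(5, 4)
U, s, Vt = np.linalg.svd(M, full_matrices=False)
k = 2
M_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]   # best rank-k approximation (Eckart-Young)
M_pinv = Vt.T @ np.diag(1.0 / s) @ U.T        # Moore-Penrose pseudo-inverse of M
```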
6
Applications of matrix factorization PMF ◦ Recommender systems NMF ◦ Learning the parts of objects
7
PMF Consider a typical recommendation problem ◦ Given an n by m matrix R with some entries unknown n rows represent n users m columns represent m movies Entry R_ij represents the ith user's rating of the jth movie ◦ We are interested in the possible values of the unknown entries, i.e. predicting users' ratings
8
PMF We can model the problem as R ≈ UᵀV ◦ U (k by n) is the latent feature matrix for users How much does the user like action movies? How much does the user like comedy movies? ◦ V (k by m) is the latent feature matrix for movies To what extent is the movie an action movie? To what extent is the movie a comedy movie?
9
PMF If we can learn U and V from existing ratings, then we can compute unknown entries by multiplying these two matrices. Let’s consider a probabilistic approach.
10
PMF Probabilistic model (standard PMF formulation) ◦ Observed ratings are Gaussian around the latent-factor prediction: p(R | U, V, σ²) = ∏_{(i,j) observed} N(R_ij | U_iᵀV_j, σ²) ◦ Zero-mean spherical Gaussian priors on the latent features: p(U | σ_U²) = ∏_i N(U_i | 0, σ_U² I), p(V | σ_V²) = ∏_j N(V_j | 0, σ_V² I)
11
PMF We want to maximize the log-posterior over U and V given the observed ratings Equivalent to minimizing the regularized sum of squared errors E = ½ Σ_{(i,j) observed} (R_ij − U_iᵀV_j)² + (λ_U/2) Σ_i ‖U_i‖² + (λ_V/2) Σ_j ‖V_j‖² Can be solved using the steepest descent method
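A minimal NumPy sketch of this minimization by plain gradient (steepest) descent; the rank k, learning rate, regularization weight, and the toy rating matrix are illustrative assumptions, not values from the slides.

```python
import numpy as np

def pmf_gradient_descent(R, mask, k=10, lr=0.01, lam=0.1, iters=200, seed=0):
    """Fit R ≈ U.T @ V on the observed entries (mask == 1) by steepest descent."""
    rng = np.random.default_rng(seed)
    n, m = R.shape
    U = 0.1 * rng.standard_normal((k, n))    # latent user features
    V = 0.1 * rng.standard_normal((k, m))    # latent movie features
    for _ in range(iters):
        E = mask * (R - U.T @ V)             # residual on observed entries only
        grad_U = -(V @ E.T) + lam * U        # gradient of the regularized squared error w.r.t. U
        grad_V = -(U @ E) + lam * V          # gradient w.r.t. V
        U -= lr * grad_U
        V -= lr * grad_V
    return U, V

# Toy example: 4 users x 5 movies, 0 marks an unknown rating.
R = np.array([[5, 3, 0, 1, 0],
              [4, 0, 0, 1, 0],
              [1, 1, 0, 5, 4],
              [0, 1, 5, 4, 0]], dtype=float)
mask = (R > 0).astype(float)
U, V = pmf_gradient_descent(R, mask, k=2)
predictions = U.T @ V                        # fills in the unknown entries
```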
12
Extension to PMF We can augment the model as long as we have additional data matrices that share a common latent feature matrix (for example, a matrix of user side information factorized with the same user feature matrix U)
13
NMF Consider the following problem ◦ m = 2429 facial images ◦ Each image has n = 19 × 19 = 361 pixels ◦ The n by m matrix V holds the original dataset, one image per column ◦ We want to approximate V by two lower-rank matrices W (n by 49) and H (49 by m): V ≈ WH Constraint: all entries of W and H are non-negative
14
NMF How well can W and H approximate V? How can we interpret the result?
15
NMF Assumption ◦ Each entry V_ij is generated from a Poisson distribution with mean (WH)_ij ◦ Maximize the log-likelihood and we get the objective function F = Σ_ij [ V_ij log(WH)_ij − (WH)_ij ]
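Below is a minimal NumPy sketch of the multiplicative updates Lee and Seung derived for this Poisson-likelihood (divergence) objective; the random matrix standing in for the face data and the iteration counts are assumptions for illustration.

```python
import numpy as np

def nmf_divergence(V, r=49, iters=500, eps=1e-9, seed=0):
    """Multiplicative updates for the Poisson-likelihood (divergence) NMF objective."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, r)) + eps             # non-negative initialization
    H = rng.random((r, m)) + eps
    for _ in range(iters):
        WH = W @ H + eps
        H *= (W.T @ (V / WH)) / W.sum(axis=0)[:, None]   # update H; stays non-negative
        WH = W @ H + eps
        W *= ((V / WH) @ H.T) / H.sum(axis=1)[None, :]   # update W; stays non-negative
    return W, H

# Stand-in for the 361 x 2429 face matrix (each column one 19x19 image).
V = np.random.rand(361, 2429)
W, H = nmf_divergence(V, r=49, iters=100)
# Columns of W are the learned "parts"; H gives the per-image encodings.
```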
16
Criticism of NMF NMF doesn't always give a parts-based result Sparseness constraints can help For more information, refer to "Non-negative matrix factorization with sparseness constraints" (Hoyer, 2004)
17
Questions? Thank you