Non-Negative Matrix Factorization (NMF)
Reporter: Ma Peng
Paper: D.D. Lee and H.S. Seung, "Learning the parts of objects by non-negative matrix factorization", Nature, vol. 401, 1999
Author information
Daniel D. Lee, Ph.D., Associate Professor, Dept. of Electrical and Systems Engineering; Dept. of Bioengineering (Secondary); GRASP (General Robotics, Automation, Sensing, Perception) Lab, University of Pennsylvania, 200 S. 33rd Street, Philadelphia, PA
H. Sebastian Seung, Professor of Computational Neuroscience, MIT; Investigator, Howard Hughes Medical Institute; Vassar St., Cambridge, MA
Problem Statement
Given a set of images:
1. Create a set of basis images that can be linearly combined to create new images
2. Find the set of weights that reproduces every input image from the basis images
3. Achieve dimension reduction
Outline
PCA
NMF (mainly discussed)
LNMF
FNMF
WNMF
PCA Find a set of orthogonal basis images The reconstructed image is a linear combination of the basis images
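A minimal NumPy sketch of this idea (my illustration, not code from the paper; the random data and the rank k = 10 are assumptions): the orthogonal basis images are the top right singular vectors of the centered data, and each image is reconstructed as a linear combination of them.

```python
# Sketch: PCA reconstruction as a linear combination of orthogonal
# basis images (illustrative toy data, not the face dataset).
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((100, 19 * 19))          # 100 flattened 19x19 "images"
Xc = X - X.mean(axis=0)                 # center the data

# Orthogonal basis images = top-k right singular vectors of the data.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 10                                  # assumed number of components
basis = Vt[:k]                          # (k, 361) orthonormal rows

# Projection weights may be negative: PCA subtracts some basis images.
coeffs = Xc @ basis.T                   # (100, k)
recon = coeffs @ basis + X.mean(axis=0) # linear combination + mean

print(coeffs.min() < 0)                 # True: some coefficients are negative
```

Note that `coeffs` contains negative entries, which is exactly the property the next slide objects to.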
What don't we like about PCA?
PCA adds up some basis images and then subtracts others
The basis images aren't physically intuitive
Subtraction doesn't make sense in the context of some applications: how do you subtract a face? What does subtraction mean in document classification?
Non-negative Matrix Factorization Like PCA, except the coefficients in the linear combination cannot be negative
Non-negative matrix factorization (NMF) gives a parts-based representation (Lee & Seung, Nature 1999)
NMF is based on gradient descent
NMF: V ≈ WH  s.t.  W_{i,d} ≥ 0, H_{d,j} ≥ 0
Let C be a given cost function; then update the parameters according to:
W ← W − η_W ∂C/∂W,  H ← H − η_H ∂C/∂H
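A toy sketch of one such additive gradient step (my illustration; the matrix sizes, step size `eta`, and the explicit clipping are assumptions, not the paper's algorithm). It uses the Frobenius cost C = ||V − WH||², whose gradients are ∂C/∂W = −2(V − WH)Hᵀ and ∂C/∂H = −2Wᵀ(V − WH); note that a plain additive step can push entries negative, so they must be clipped back.

```python
# Sketch: one additive gradient-descent step on C = ||V - WH||^2,
# with clipping to keep W, H non-negative (toy data, assumed sizes).
import numpy as np

rng = np.random.default_rng(1)
V = rng.random((20, 30))                # data matrix
W = rng.random((20, 5))                 # basis
H = rng.random((5, 30))                 # encodings
eta = 1e-3                              # assumed step size

grad_W = -2 * (V - W @ H) @ H.T         # dC/dW
grad_H = -2 * W.T @ (V - W @ H)         # dC/dH
W = np.maximum(W - eta * grad_W, 0.0)   # clip: plain descent can go negative
H = np.maximum(H - eta * grad_H, 0.0)

print(np.linalg.norm(V - W @ H))
```

The need for this clipping is what motivates the multiplicative updates on the next slide.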
The idea behind multiplicative updates: split the gradient into a positive and a negative term, ∂C/∂θ = ∇⁺C − ∇⁻C, and update θ ← θ · (∇⁻C / ∇⁺C). Because the update is multiplicative, a non-negative parameter stays non-negative, and θ is left unchanged exactly at stationary points, where ∇⁺C = ∇⁻C.
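For the Frobenius cost ||V − WH||², this recipe gives the well-known Lee & Seung updates H ← H ⊙ (WᵀV)/(WᵀWH) and W ← W ⊙ (VHᵀ)/(WHHᵀ) (reference [2]). A minimal NumPy sketch (my illustration; the toy data, rank, iteration count, and `eps` guard are assumptions):

```python
# Sketch: multiplicative updates for C = ||V - WH||^2 (Lee & Seung,
# NIPS 2000). Each factor is multiplied by (negative term)/(positive
# term), so non-negativity is preserved with no clipping.
import numpy as np

rng = np.random.default_rng(2)
V = rng.random((20, 30))
W = rng.random((20, 5)) + 0.1
H = rng.random((5, 30)) + 0.1
eps = 1e-12                              # guard against division by zero

cost = [np.linalg.norm(V - W @ H)]
for _ in range(200):
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)
    cost.append(np.linalg.norm(V - W @ H))

print(cost[0], cost[-1])                 # the cost never increases
```

These updates are proven in [2] to make the Frobenius cost non-increasing at every step.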
The NMF decomposition is not unique: NMF is only unique when the data adequately spans the positive orthant (Donoho & Stodden)
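A quick sketch of why the factorization is not unique (my illustration, assumed toy sizes): for any positive diagonal matrix D, V = WH = (WD)(D⁻¹H), and both rescaled factors remain non-negative.

```python
# Sketch: NMF factors are not unique. Rescaling by a positive diagonal
# matrix D gives a different non-negative factorization of the same V.
import numpy as np

rng = np.random.default_rng(3)
W = rng.random((6, 3))
H = rng.random((3, 8))
V = W @ H

D = np.diag([2.0, 0.5, 3.0])            # any positive diagonal works
W2 = W @ D
H2 = np.linalg.inv(D) @ H

print(np.allclose(V, W2 @ H2))          # True: same V, different factors
print(W2.min() >= 0 and H2.min() >= 0)  # True: still non-negative
```

More generally, any invertible M with WM ≥ 0 and M⁻¹H ≥ 0 yields another valid factorization, which is the ambiguity Donoho & Stodden analyze.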
NMF Basis Images
Only allowing addition of basis images makes intuitive sense and has a physical analogue in neurons
Forcing the reconstruction coefficients to be positive leads to nice basis images: to reconstruct an image, all you can do is add in more basis images
This leads to basis images that represent parts
Faces
Training set: 2429 examples of 19x19 centered face images; first 25 examples shown at right
Faces Basis Images: – Rank: 49 – Iterations: 50
Faces: reconstruction shown as basis images x encodings = original image (figure)
Example
Local non-negative matrix factorization
LNMF aims to learn local features by imposing three additional constraints on the NMF basis [3]: each basis image should contain as few non-zero elements as possible (locality), different basis images should be as orthogonal as possible (minimal redundancy), and the components carrying the most information should be retained (maximal expressiveness of the encodings)
LNMF basis images (figure)
Fisher non-negative matrix factorization (FNMF) adds Fisher discriminant constraints to the NMF objective, encouraging small within-class scatter and large between-class scatter of the encodings
Weighted NMF (WNMF) introduces weights into the factorization; for example, doubly weighted NMF has been applied to imbalanced face recognition (Lu & Tan [4])
Conclusions and future work
In summary, non-negative matrix factorization is an effective method for extracting local features of images; it is now widely applied in many fields and deserves our attention.
Open problems:
(1) Low recognition rates on imbalanced sample sets
(2) How to choose the weights
References
[1] D.D. Lee and H.S. Seung, "Learning the parts of objects by non-negative matrix factorization", Nature, vol. 401, pp. 788-791, 1999.
[2] D.D. Lee and H.S. Seung, "Algorithms for non-negative matrix factorization", in Proceedings of Neural Information Processing Systems, 2000.
[3] S.Z. Li, X. Hou, H.J. Zhang, and Q. Cheng, "Learning spatially localized, parts-based representation", Proc. IEEE Int. Conf. Computer Vision and Pattern Recognition, 2001.
[4] J. Lu and Y.-P. Tan, "Doubly weighted nonnegative matrix factorization for imbalanced face recognition", Proc. IEEE Int. Conf. Acoustics, Speech, and Signal Processing, 2009, pp. 877-880.