Approximate L0 constrained NMF/NTF — Morten Mørup, Informatics and Mathematical Modelling / Intelligent Signal Processing, ISCAS 2008

Presentation transcript:

Approximate L0 constrained NMF/NTF. Morten Mørup, Informatics and Mathematical Modeling, Technical University of Denmark. Work done in collaboration with Professor Lars Kai Hansen and PhD Kristoffer Hougaard Madsen, Informatics and Mathematical Modeling, Technical University of Denmark.

Non-negative Matrix Factorization (NMF): V ≈ WH with V ≥ 0, W ≥ 0, H ≥ 0. NMF gives a part-based representation! (Lee & Seung, Nature 1999)

NMF based on multiplicative updates: a gradient descent step H ← H − η ⊙ ∇H C with the step size parameter chosen as η = H ⊘ (WᵀWH) turns the update into a multiplication, H ← H ⊙ (WᵀV) ⊘ (WᵀWH), which automatically preserves non-negativity (and similarly for W).
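The multiplicative updates above can be sketched as follows; this is a minimal NumPy illustration of the standard Lee & Seung least-squares rules, with the function name and initialization chosen here for illustration:

```python
import numpy as np

def nmf_multiplicative(V, rank, n_iter=200, eps=1e-9):
    """Lee & Seung multiplicative updates for least-squares NMF.

    Each update is a gradient step whose step size is chosen so that
    the additive update becomes an elementwise multiplication, which
    keeps W and H non-negative throughout.
    """
    rng = np.random.default_rng(0)
    m, n = V.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # H <- H * (W'V) / (W'WH)
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # W <- W * (VH') / (WHH')
    return W, H
```

The small `eps` guards against division by zero; it is a common practical safeguard, not part of the original derivation.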

fast Non-Negative Least Squares (fNNLS): an active-set procedure (Lawson and Hanson, 1974).
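For reference, SciPy ships the classic Lawson–Hanson active-set solver as `scipy.optimize.nnls`; fNNLS (Bro & De Jong, 1997) is a faster variant that precomputes AᵀA and Aᵀb when many right-hand sides share the same A. A small sketch:

```python
import numpy as np
from scipy.optimize import nnls

# Solve min_x ||Ax - b||_2 subject to x >= 0 with the Lawson-Hanson
# active-set method. The second entry of the constraint problem below
# is infeasible for a non-negative x, so that coefficient is clamped to 0.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, -1.0, 0.5])
x, rnorm = nnls(A, b)   # x = [0.75, 0.0]
```

Here the unconstrained least-squares solution has a negative second coefficient, so the active-set method fixes it at zero and re-solves for the rest.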

NMF is in general not unique! V = WH = (WP)(P⁻¹H) = W′H′ for any invertible P such that WP and P⁻¹H remain non-negative (Donoho & Stodden, 2003).
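A two-line NumPy demonstration of this ambiguity; a positive diagonal scaling is the simplest invertible P that keeps both factors non-negative (permutations work too):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.random((5, 3))
H = rng.random((3, 4))

# An invertible P with WP >= 0 and P^{-1}H >= 0: positive diagonal scaling.
P = np.diag([2.0, 0.5, 3.0])
W2 = W @ P
H2 = np.linalg.inv(P) @ H

# Different factor pairs, identical reconstruction V = WH = W'H'.
```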

FIX: Impose sparseness (Hoyer 2001, 2004; Eggert et al. 2004)
Ensures uniqueness
Eases interpretability (sparse representation: factor effects pertain to fewer dimensions)
Can work as model selection (sparseness can turn off excess factors by letting them become zero)
Resolves overcomplete representations (when the model has many more free variables than data points)
L1 is used as a convex proxy for the L0 norm, i.e. card(H).
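A small numeric illustration of why L1 serves as a convex proxy for L0: the L1-penalized least-squares solution soft-thresholds coefficients, driving small ones exactly to zero (reducing card(H)), whereas an L2 penalty only shrinks them. The helper below is a generic sketch, not code from the paper:

```python
import numpy as np

def soft_threshold(x, lam):
    """Closed-form solution of min_h 0.5*(h - x)^2 + lam*|h|."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

h = np.array([3.0, 0.4, -0.2, 1.5])
h_l1 = soft_threshold(h, 0.5)   # -> [2.5, 0., 0., 1.]
# card dropped from 4 to 2: the L1 penalty zeroes small coefficients.
```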

Least Angle Regression and Selection (LARS) / Homotopy Method
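NLARS traces the exact piecewise-linear regularization path of the non-negativity-constrained lasso. As a stand-in sketch (not the NLARS homotopy itself), the same problem can be solved at a grid of λ values with non-negative coordinate descent, which approximates the path; the function name is chosen here for illustration:

```python
import numpy as np

def nn_lasso_cd(X, y, lam, n_iter=200):
    """Non-negative lasso by coordinate descent (stand-in for NLARS):
    min_h 0.5*||y - Xh||^2 + lam*sum(h), subject to h >= 0."""
    n_features = X.shape[1]
    h = np.zeros(n_features)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(n_features):
            # Residual with feature j's contribution removed.
            r = y - X @ h + X[:, j] * h[j]
            # Closed-form coordinate minimizer, clamped at zero.
            h[j] = max(0.0, (X[:, j] @ r - lam) / col_sq[j])
    return h

# Sweeping lam from large to small traces an approximate regularization
# path: coefficients enter the active set one by one as lam decreases.
```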

Controlling the sparsity degree (Patrik Hoyer, 2004) vs. controlling the sparsity degree (Mørup et al., 2008): sparsity can now be controlled by evaluating the full regularization path of the NLARS.
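For comparison, Hoyer's (2004) approach fixes the following sparseness measure by projection; it evaluates to 0 for a uniform vector and 1 for a vector with a single nonzero entry. A minimal sketch:

```python
import numpy as np

def hoyer_sparseness(x):
    """Hoyer's (2004) sparseness measure:
    (sqrt(n) - ||x||_1 / ||x||_2) / (sqrt(n) - 1), in [0, 1]."""
    n = x.size
    l1 = np.abs(x).sum()
    l2 = np.linalg.norm(x)
    return (np.sqrt(n) - l1 / l2) / (np.sqrt(n) - 1)

hoyer_sparseness(np.ones(4))                 # -> 0.0 (fully dense)
hoyer_sparseness(np.array([1., 0., 0., 0.]))  # -> 1.0 (maximally sparse)
```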

New algorithm for sparse NMF:
1: Solve for each column of H using NLARS and obtain solutions for all values of λ (i.e. the entire regularization path)
2: Select the λ-solution giving the desired degree of sparsity
3: Update W such that ||W d || F = 1, according to (Eggert et al. 2004)
Repeat from step 1 until convergence.
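Step 3 keeps each column of W at unit norm. Eggert et al.'s update builds this normalization into the multiplicative rule itself; as a simpler stand-in that shows why the constraint costs nothing, one can rescale the columns of W and compensate in H, leaving the product WH unchanged (the helper name is illustrative):

```python
import numpy as np

def normalize_columns(W, H):
    """Rescale so each column of W has unit norm, compensating in H
    so that the reconstruction W @ H is unchanged."""
    scale = np.linalg.norm(W, axis=0)   # ||W_d|| for each column d
    return W / scale, H * scale[:, None]
```

The unit-norm constraint removes the scaling ambiguity between W and H, so the sparsity penalty on H is meaningful.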

Results on the CBCL face database and the USPS handwritten digits.


Conclusion
New efficient algorithm for sparse NMF based on the proposed non-negative version of the LARS algorithm (NLARS).
The obtained full regularization path admits using L1 as a convex proxy for the L0 norm to control the degree of sparsity.
The proposed method is more efficient than previous methods for controlling the degree of sparsity. Furthermore, NLARS is even comparable in speed to the classic efficient fNNLS method.
The proposed method directly generalizes to tensor decompositions through models such as Tucker and PARAFAC when using an alternating least squares approach.
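The tensor generalization works because each factor update in an alternating least-squares PARAFAC fit is itself a linear least-squares problem against an unfolded tensor, so the sparse non-negative solver drops in unchanged. A sketch of the key identity, with the Khatri–Rao helper written here for illustration:

```python
import numpy as np

def khatri_rao(B, C):
    """Column-wise Kronecker (Khatri-Rao) product: column r is kron(B[:, r], C[:, r])."""
    r = B.shape[1]
    return np.einsum('jr,kr->jkr', B, C).reshape(-1, r)

# For a rank-r PARAFAC model T[i,j,k] = sum_r A[i,r] B[j,r] C[k,r],
# the mode-1 unfolding satisfies T_(1) = A @ khatri_rao(B, C).T,
# so updating A (with sparsity and non-negativity) is an ordinary
# constrained least-squares problem -- exactly what NLARS solves.
```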