Learning sparse representations to restore, classify, and sense images and videos
Guillermo Sapiro, University of Minnesota
Supported by NSF, NGA, NIH, ONR, DARPA, ARO, and the McKnight Foundation
Martin Duarte Rodriguez Ramirez Lecumberry
Overview
Introduction
–Denoising, Demosaicing, Inpainting
–Mairal, Elad, Sapiro, IEEE-TIP, January 2008
Learn multiscale dictionaries
–Mairal, Elad, Sapiro, SIAM-MMS, April 2008
Sparsity + Self-similarity
–Mairal, Bach, Ponce, Sapiro, Zisserman, pre-print
Incoherent dictionaries and universal coding
–Ramirez, Lecumberry, Sapiro, June 2009, pre-print
Learning to classify
–Mairal, Bach, Ponce, Sapiro, Zisserman, CVPR 2008, NIPS 2008
–Rodriguez and Sapiro, pre-print
Learning to sense sparse signals
–Duarte and Sapiro, pre-print, May 2008, IEEE-TIP, to appear
Introduction I: Sparse and Redundant Representations
Webster Dictionary: "of few and scattered elements"
Restoration by Energy Minimization
y: given measurements
x: unknown to be recovered
Restoration/representation algorithms are often related to the minimization of an energy function combining a relation to the measurements with a prior (regularization) term.
A Bayesian type of approach (Thomas Bayes): what is the prior? What is the image model?
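A minimal sketch of such an energy, assuming the common quadratic data-fidelity term (the exact expression on the original slide is not recoverable from this text):

\hat{x} = \arg\min_{x} \; \underbrace{\tfrac{1}{2}\,\|y - Hx\|_2^2}_{\text{relation to measurements}} \; + \; \underbrace{\lambda\,\mathrm{Pr}(x)}_{\text{prior / regularization}}

Here H is a hypothetical degradation operator (blur, subsampling, masking) and Pr(x) is the image prior that the rest of the talk replaces with a sparsity model.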
The Sparseland Model for Images
A fixed dictionary D of size N x K (redundant, K > N): every column in D is a prototype signal (atom).
Each signal is a combination of a few atoms: the coefficient vector is sparse and random, containing very few (say L) non-zeros.
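In symbols (standard Sparseland notation, assuming D \in \mathbb{R}^{N \times K} with K > N):

x = D\alpha, \qquad \|\alpha\|_0 \le L \ll K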
What Should the Dictionary D Be?
D should be chosen such that it sparsifies the representations (for a given task!).
One approach is to choose D from a known set of transforms (steerable wavelets, curvelets, contourlets, bandlets, …).
The alternative here: learn D, with multiscale structure, color image examples, task/sensing adaptation, and internal structure.
Introduction II: Dictionary Learning
Measure of Quality for D
Each example (column of X) is a linear combination of atoms from D, i.e., X ≈ DA.
Each example has a sparse representation with no more than L atoms.
Field & Olshausen ('96), Engan et al. ('99), Lewicki & Sejnowski ('00), Cotter et al. ('03), Gribonval et al. ('04), Aharon, Elad & Bruckstein ('04, '05), Ng et al. ('07), Mairal, Sapiro, Elad ('08)
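The resulting dictionary learning objective (a standard formulation consistent with the slide, with A = [\alpha_1, \dots, \alpha_m] the matrix of sparse codes):

\min_{D,\,A} \; \|X - DA\|_F^2 \quad \text{s.t.} \quad \|\alpha_i\|_0 \le L \;\; \forall i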
The K–SVD Algorithm – General
Initialize D.
Sparse coding: Orthogonal Matching Pursuit (or L1) with D fixed.
Dictionary update: column-by-column, by an SVD computation over the relevant examples.
Aharon, Elad & Bruckstein ('04)
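A minimal sketch of the two alternating steps in code (an illustrative re-implementation with a simplified OMP, not the authors' code; function names and the fixed iteration count are assumptions):

import numpy as np

def omp(D, x, L):
    """Greedy Orthogonal Matching Pursuit: approximate x with at most L atoms of D."""
    residual = x.copy()
    support = []
    alpha = np.zeros(D.shape[1])
    for _ in range(L):
        # Pick the atom most correlated with the current residual.
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit the coefficients on the current support by least squares.
        coeffs, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coeffs
    alpha[support] = coeffs
    return alpha

def ksvd(X, K, L, n_iter=10):
    """Learn a dictionary D (n x K) so that each column of X has an L-sparse code."""
    n, m = X.shape
    rng = np.random.default_rng(0)
    D = rng.standard_normal((n, K))
    D /= np.linalg.norm(D, axis=0)                  # unit-norm atoms
    for _ in range(n_iter):
        # Sparse coding step: OMP on every training example.
        A = np.column_stack([omp(D, X[:, i], L) for i in range(m)])
        # Dictionary update step: one atom at a time via a rank-1 SVD.
        for k in range(K):
            users = np.nonzero(A[k, :])[0]          # examples that use atom k
            if users.size == 0:
                continue
            # Residual of those examples without the contribution of atom k.
            E = X[:, users] - D @ A[:, users] + np.outer(D[:, k], A[k, users])
            U, S, Vt = np.linalg.svd(E, full_matrices=False)
            D[:, k] = U[:, 0]
            A[k, users] = S[0] * Vt[0, :]
    return D, A

In the denoising application, the per-patch reconstructions D·α are then averaged back into the image over overlapping patches.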
Show me the pictures
Change the Metric in the OMP
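A hedged sketch of the idea (the exact weighting used on the slide is not recoverable from this text): measure the OMP data-fidelity in a weighted norm,

\min_{\alpha} \; \|W^{1/2}(y - D\alpha)\|_2^2 \quad \text{s.t.} \quad \|\alpha\|_0 \le L,

with the diagonal weights in W set from the per-pixel noise level, and set to zero on unobserved pixels for inpainting and demosaicing.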
Non-uniform noise
Example: Non-uniform noise
Example: Inpainting
Example: Demosaicing
Example: Inpainting
Not enough fun yet? Multiscale Dictionaries
Learned multiscale dictionary
Color multiscale dictionaries
Example
Video inpainting
Extending the Models
Universal Coding and Incoherent Dictionaries
–Consistent
–Improved generalization properties
–Improved active set computation
–Improved coding speed
–Improved reconstruction
See poster by Ramirez and Lecumberry…
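For reference, the quantity usually controlled in this setting (a standard definition, not copied from the slide) is the mutual coherence of a unit-norm dictionary,

\mu(D) = \max_{i \ne j} \; |d_i^{\top} d_j|,

and incoherence can be promoted during learning with a penalty such as \|D^{\top}D - I\|_F^2; the exact form of the term used in this work is an assumption here, given only to illustrate the idea.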
Sparsity + Self-similarity = Group Sparsity
Combine two of the most successful models for images.
Mairal, Bach, Ponce, Sapiro, Zisserman, pre-print, 2009
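A hedged sketch of the grouped formulation (assuming the simultaneous sparse-coding form used in this line of work; the exact mixed norm on the slide is not recoverable): similar patches are grouped and encouraged to share atoms,

\min_{\{A_g\}} \; \sum_{g} \Big( \|Y_g - D A_g\|_F^2 + \lambda\, \|A_g\|_{p,q} \Big),

where Y_g stacks the patches found similar to each other (self-similarity) and the row-wise mixed norm \|A_g\|_{p,q} makes them select a common sparsity pattern (group sparsity).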
Learning to Classify
Global Dictionary
Barbara
Boat
Digits
Which dictionary? How to learn them?
–Multiple reconstructive dictionaries? (Peyré)
–Single reconstructive dictionary? (Ng et al., LeCun et al.)
–Dictionaries for classification!
See also Winn et al., Holub et al., Lasserre et al., Hinton et al. for joint discriminative/generative probabilistic approaches.
Learning multiple reconstructive and discriminative dictionaries
With J. Mairal, F. Bach, J. Ponce, and A. Zisserman, CVPR '08, NIPS '08
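The simplest decision rule behind per-class dictionaries (the purely reconstructive baseline; the CVPR '08 work adds a discriminative term to the learning itself) assigns a signal to the class whose dictionary reconstructs it best:

R^{\ast}(x, D_j) = \min_{\alpha} \|x - D_j \alpha\|_2^2 \;\; \text{s.t.} \;\; \|\alpha\|_0 \le L, \qquad \hat{\jmath}(x) = \arg\min_{j} R^{\ast}(x, D_j).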
Texture classification
Semi-supervised detection learning (MIT)
Learning a Single Discriminative and Reconstructive Dictionary
Exploit the representation coefficients for classification
–Include this in the optimization
–Class-supervised simultaneous OMP
With F. Rodriguez
Digit images: robust to noise and occlusions
Supervised Dictionary Learning
With J. Mairal, F. Bach, J. Ponce, and A. Zisserman, NIPS '08
Learning to Sense Sparse Images
Motivation
Compressed sensing (Candès & Tao, Donoho, et al.)
–Sparsity
–Random sampling: universality, stability
Shall the sensing be adapted to the data type?
–Yes! (Elad, Peyré, Weiss et al., Applebaum et al., this talk)
Shall the sensing and dictionary be learned simultaneously?
Some formulas…
RIP (identity Gram matrix)
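A hedged reconstruction of the formulas typically meant here (an assumption; the exact expressions are not recoverable from this text): with sensing matrix \Phi and dictionary D, the measurements and the RIP-style condition on the effective dictionary \Phi D read

y = \Phi x = \Phi D \alpha, \qquad G = (\Phi D)^{\top}(\Phi D) \approx I.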
Design the dictionary and sensing together
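One way to phrase the joint design, given as an illustrative sketch rather than the paper's exact objective: make the Gram matrix of the effective dictionary as close as possible to the identity,

\min_{\Phi,\,D} \; \|(\Phi D)^{\top}(\Phi D) - I\|_F^2,

in practice alternated or combined with the reconstructive dictionary-learning fit to the training data.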
Just Believe the Pictures
Conclusions
State-of-the-art denoising results for still images (shared with Dabov et al.) and video
General: vectorial and multiscale learned dictionaries
Dictionaries with internal structure
Dictionary learning for classification
–See also Szlam and Sapiro, ICML 2009
–See also Carin et al., ICML 2009
Dictionary learning for sensing
A lot of work still to be done!
Please do not use the wrong dictionaries…
12 Mpixel image, 7 million patches
LARS + online learning: ~8 minutes
Mairal, Bach, Ponce, Sapiro, ICML 2009
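The online dictionary learning algorithm referenced here (Mairal, Bach, Ponce, Sapiro, ICML 2009) is available in scikit-learn; a minimal usage sketch, assuming scikit-learn is installed (the data and parameter values below are illustrative, not those of the 12 Mpixel experiment):

import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

# Toy data: 10,000 "patches" of 8x8 pixels, flattened to 64-dimensional vectors.
X = np.random.rand(10000, 64)
X -= X.mean(axis=1, keepdims=True)          # remove the mean of each patch

# Online (mini-batch) dictionary learning with LARS-based sparse coding.
dico = MiniBatchDictionaryLearning(n_components=256, batch_size=128,
                                   fit_algorithm='lars',
                                   transform_algorithm='omp')
D = dico.fit(X).components_                 # learned dictionary, one atom per row
codes = dico.transform(X)                   # sparse codes of the patches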