Sparse and Overcomplete Data Representation


Sparse and Overcomplete Data Representation
Michael Elad
The CS Department, The Technion – Israel Institute of Technology, Haifa 32000, Israel
Israel Statistical Association 2005 Annual Meeting
Tel-Aviv University (Dan David bldg.), May 17th, 2005

Agenda
1. A Visit to Sparseland: Motivating Sparsity & Overcompleteness
2. Answering the 4 Questions: How & why should this work?

M Data Synthesis in Sparseland K N A fixed Dictionary Every column in D (dictionary) is a prototype data vector (Atom). M The vector  is generated randomly with few non-zeros in random locations and random values. A sparse & random vector N Sparse and Overcomplete Data Representation

M Sparseland Data is Special Simple: Every generated vector is built as a linear combination of few atoms from our dictionary D Rich: A general model: the obtained vectors are a special type mixture-of-Gaussians (or Laplacians). Multiply by D M Sparse and Overcomplete Data Representation

M T Transforms in Sparseland ? M Assume that x is known to emerge from . . We desire simplicity, independence, and expressiveness. M How about “Given x, find the α that generated it in ” ? T Sparse and Overcomplete Data Representation

Difficulties with the Transform
Recovering the sparse & random vector α from x = Dα raises 4 major questions:
1. Is the sparsest solution unique? Under which conditions?
2. Are there practical ways to get it?
3. How effective are those ways?
4. How would we get D?
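Stated formally (in the standard notation of this line of work; the formula itself does not survive in the transcript), the task behind these questions is the (P0) problem:

```latex
(P_0):\qquad \hat{\alpha} \;=\; \arg\min_{\alpha}\,\|\alpha\|_{0}
\quad\text{subject to}\quad D\alpha = x,
```

where $\|\alpha\|_0$ counts the non-zero entries of $\alpha$.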

Why Is It Interesting?
Several recent trends in signal/image processing are worth looking at:
From JPEG to JPEG2000: from the (L2-norm) KLT to wavelets and non-linear approximation.
From Wiener filtering to robust restoration: from the L2-norm (Fourier) to L1 (e.g., TV, Beltrami, wavelet shrinkage, …).
From unitary to richer representations: frames, shift-invariance, bilateral, steerable, curvelets.
Approximation theory: non-linear approximation.
ICA and related models.
The recurring themes: Sparsity. Overcompleteness. Sparsity & Overcompleteness. Independence and Sparsity. Sparseland is HERE.

Agenda
1. A Visit to Sparseland: Motivating Sparsity & Overcompleteness
2. Answering the 4 Questions: How & why should this work?

Question 1 – Uniqueness?
Suppose we can solve the (P0) problem exactly for x = Dα. Why should we necessarily get α back? It might happen that we eventually find a different, equally sparse representation.

Matrix "Spark" Donoho & Elad ('02)
Definition: Given a matrix D, σ = Spark{D} is the smallest number of columns of D that are linearly dependent.
Example (the matrix shown on the slide): Spark = 3, while Rank = 4.
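A minimal brute-force sketch of the Spark (feasible only for small dictionaries, since it scans column subsets). The example matrix below is a hypothetical stand-in with the same Spark = 3, Rank = 4 as the slide's figure:

```python
import numpy as np
from itertools import combinations

def spark(D, tol=1e-10):
    # Smallest number of linearly dependent columns of D.
    # Brute force over all column subsets, smallest first.
    K = D.shape[1]
    for k in range(1, K + 1):
        for cols in combinations(range(K), k):
            if np.linalg.matrix_rank(D[:, cols], tol=tol) < k:
                return k
    return K + 1  # full column rank: no dependent subset exists

# e1, e2, e3, e4 and e1 + e2: the triplet {e1, e2, e1+e2} is dependent
D = np.array([[1, 0, 0, 0, 1],
              [0, 1, 0, 0, 1],
              [0, 0, 1, 0, 0],
              [0, 0, 0, 1, 0]], dtype=float)
```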

Uniqueness Rule Donoho & Elad ('02)
Suppose the (P0) problem has been solved somehow. If we found a representation α that satisfies ‖α‖₀ < Spark{D}/2, then necessarily it is unique (the sparsest). This result implies that if Sparseland generates vectors using a "sparse enough" α, solving (P0) will find it exactly.

M Question 2 – Practical P0 Solver? Are there reasonable ways to find ? Multiply by D Sparse and Overcomplete Data Representation

Matching Pursuit (MP) Mallat & Zhang (1993)
MP is a greedy algorithm that finds one atom at a time.
Step 1: find the one atom that best matches the signal.
Next steps: given the previously found atoms, find the next one that best fits the residual.
The Orthogonal MP (OMP) is an improved version that re-evaluates the coefficients after each round.
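A minimal NumPy sketch of OMP under the description above; the small dictionary and signal in the usage example are illustrative, not from the talk:

```python
import numpy as np

def omp(D, x, k):
    # Orthogonal Matching Pursuit: greedily pick k atoms, re-fitting
    # all coefficients by least squares after each selection.
    residual = x.astype(float).copy()
    support = []
    alpha = np.zeros(D.shape[1])
    for _ in range(k):
        # the atom most correlated with the current residual
        j = int(np.argmax(np.abs(D.T @ residual)))
        support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coef
    alpha[support] = coef
    return alpha

# toy overcomplete dictionary: e1..e4 plus a normalized all-ones atom
D = np.hstack([np.eye(4), np.full((4, 1), 0.5)])
x = np.array([3.0, 0.0, 2.0, 0.0])   # = 3*e1 + 2*e3
a = omp(D, x, 2)
```

Because the coefficients are re-fitted by least squares, the residual stays orthogonal to the span of the selected atoms, so an atom is never picked twice.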

Basis Pursuit (BP) Chen, Donoho, & Saunders (1995)
Instead of solving min ‖α‖₀ s.t. Dα = x, solve min ‖α‖₁ s.t. Dα = x.
The newly defined problem is convex and has a Linear Programming structure. Very efficient solvers can be deployed:
Interior point methods [Chen, Donoho, & Saunders ('95)],
Sequential shrinkage for a union of ortho-bases [Bruce et al. ('98)],
Iterated shrinkage, if multiplying by D and DT is fast [Elad ('05)].
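One way to see the LP structure is the standard split α = u − v with u, v ≥ 0, so that ‖α‖₁ = 1ᵀu + 1ᵀv. A sketch using SciPy's generic LP solver (a stand-in for the interior-point solvers cited above):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(D, x):
    # min ||a||_1  s.t.  D a = x, via the LP split a = u - v, u, v >= 0.
    N, K = D.shape
    c = np.ones(2 * K)                       # objective: sum(u) + sum(v)
    res = linprog(c, A_eq=np.hstack([D, -D]), b_eq=x,
                  bounds=(0, None), method="highs")
    return res.x[:K] - res.x[K:]

D = np.hstack([np.eye(4), np.full((4, 1), 0.5)])
x = np.array([3.0, 0.0, 2.0, 0.0])           # = 3*e1 + 2*e3
a = basis_pursuit(D, x)
```

For this toy instance the L1 minimizer coincides with the sparse generator, so BP returns (3, 0, 2, 0, 0).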

M Question 3 – Approx. Quality? How effective are the MP/BP Multiply by D How effective are the MP/BP in finding ? Sparse and Overcomplete Data Representation

Mutual Coherence
Assume normalized columns and compute the Gram matrix DTD. The Mutual Coherence µ is the largest entry in absolute value outside the main diagonal of DTD. The Mutual Coherence is a property of the dictionary (just like the "Spark"). The smaller it is, the better the dictionary.
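The definition translates directly to code; the two-atoms-plus-one example below is illustrative:

```python
import numpy as np

def mutual_coherence(D):
    # Largest off-diagonal entry (in absolute value) of the Gram
    # matrix of the column-normalized dictionary.
    Dn = D / np.linalg.norm(D, axis=0)
    G = np.abs(Dn.T @ Dn)
    np.fill_diagonal(G, 0.0)
    return G.max()

# e1, e2, and the diagonal direction: coherence = 1/sqrt(2)
D = np.hstack([np.eye(2), np.ones((2, 1))])
mu = mutual_coherence(D)
```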

BP and MP Equivalence Donoho & Elad ('02), Gribonval & Nielsen ('03), Tropp ('03), Temlyakov ('03)
Given a vector x with a representation x = Dα, and assuming that ‖α‖₀ < (1 + 1/µ)/2, BP and MP are guaranteed to find the sparsest solution.
MP is typically inferior to BP! The above result corresponds to the worst case. Average-performance results are available too, showing much better bounds [Donoho ('04), Candes et al. ('04), Elad and Zibulevsky ('04)].

M Question 4 – Finding D? Multiply by D Given these P examples and a fixed size [NK] dictionary D: Is D unique? (Yes) How to find D? Train!! The K-SVD algorithm Skip? Sparse and Overcomplete Data Representation

Training: The Objective
Arrange the examples as the columns of X and their representations as the columns of A, so that X ≈ DA. Each example is a linear combination of atoms from D; each example has a sparse representation with no more than L atoms.
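In the notation of the slide (X the examples, A the representations, L the sparsity bound), this objective is conventionally written as:

```latex
\min_{D,\,A}\; \|X - DA\|_F^2
\quad\text{subject to}\quad \|\alpha_j\|_0 \le L \;\;\forall j,
```

where $\alpha_j$ denotes the $j$-th column of $A$.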

The K-SVD Algorithm Aharon, Elad, & Bruckstein ('04)
Initialize D, then iterate two stages:
Sparse Coding: use MP or BP to compute A for the current D.
Dictionary Update: update D column by column, each via an SVD computation.
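One possible sketch of this two-stage loop, with OMP for the coding stage and the rank-1 SVD atom update; all dimensions in the usage are illustrative:

```python
import numpy as np

def omp_code(D, x, L):
    # Greedy sparse coding helper (OMP) with at most L atoms.
    residual, support = x.astype(float).copy(), []
    alpha = np.zeros(D.shape[1])
    for _ in range(L):
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coef
    alpha[support] = coef
    return alpha

def ksvd(X, K, L, iters=5, seed=0):
    # Alternate OMP sparse coding with column-by-column SVD updates.
    rng = np.random.default_rng(seed)
    N, P = X.shape
    D = rng.standard_normal((N, K))
    D /= np.linalg.norm(D, axis=0)            # normalized atoms
    A = np.zeros((K, P))
    for _ in range(iters):
        # Sparse Coding stage (MP/BP; here OMP)
        A = np.column_stack([omp_code(D, X[:, j], L) for j in range(P)])
        # Dictionary Update stage: one atom at a time
        for k in range(K):
            users = np.nonzero(A[k])[0]       # examples using atom k
            if users.size == 0:
                continue
            # error with atom k's own contribution restored
            E = X[:, users] - D @ A[:, users] + np.outer(D[:, k], A[k, users])
            U, s, Vt = np.linalg.svd(E, full_matrices=False)
            D[:, k] = U[:, 0]                 # best rank-1 fit: new atom
            A[k, users] = s[0] * Vt[0]        # ...and its coefficients
    return D, A
```

Note that the SVD update only rescales coefficients on the existing support, so each column of A stays L-sparse.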

Today We Discussed
1. A Visit to Sparseland: Motivating Sparsity & Overcompleteness
2. Answering the 4 Questions: How & why should this work?

Summary
Sparsity and overcompleteness are important ideas that can be used in designing better tools in data/signal/image processing. There are difficulties in using them! We are working on resolving those difficulties: performance of pursuit algorithms, speedup of those methods, training the dictionary, demonstrating applications, …
The dream? Future transforms and regularizations will be data-driven, non-linear, overcomplete, and promoting sparsity.