Sparse and Redundant Representations and Their Applications in Signal and Image Processing (236862)
Section 5: Theoretical Study of the Approximate Pursuit Problem
Winter Semester, 2018/2019
Michael (Miki) Elad

Meeting Plan
- Quick review of the material covered
- Answering questions from the students and getting their feedback
- Addressing issues raised by other learners
- Discussing new material: the Dantzig Selector & near-oracle performance
- Administrative issues

Overview of the Material
Analyzing the Approximate Pursuit Problem:
- Uniqueness vs. Stability – Gaining Intuition
- The Restricted Isometry Property (RIP)
- Key Properties of the RIP
- Theoretical Study of P0 in the Noisy Case
- Performance of Pursuit Algorithms – General
- Basis-Pursuit Stability Guarantee
- Thresholding Stability Guarantee: Worst-Case
- OMP Stability Guarantee
- Rate of Decay of the Residual in Greedy Methods
Course Summary and a Glimpse to the Future

Issues Raised by Other Learners
Bisection strategy in the IRLS solution of the Basis Pursuit: in the video "IRLS Solution of the Basis Pursuit", at time 5:49, a bisection strategy is mentioned as a way to improve the effectiveness of the search. Could you please add a few words to explain why this trick is so helpful?
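To make the discussion concrete, here is a minimal sketch of the idea (my own code and parameter names, not the course's): in the penalized form of the Basis Pursuit, the residual norm ||Ax(λ) − b||_2 grows monotonically with the multiplier λ, so a bisection over λ finds the value matching a target error level ε using only a logarithmic number of IRLS solves.

```python
import numpy as np

def irls_bpdn(A, b, lam, iters=50, eps=1e-8):
    # IRLS for min_x lam*||x||_1 + 0.5*||Ax - b||_2^2, using the
    # majorizer |x_i| <= (x_i^2 / w_i + w_i) / 2 with w_i = |x_i| + eps.
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        w = np.abs(x) + eps
        # Minimizing the quadratic surrogate gives the normal equations
        # (A^T A + lam * diag(1/w)) x = A^T b
        x = np.linalg.solve(A.T @ A + lam * np.diag(1.0 / w), A.T @ b)
    return x

def bp_bisection(A, b, eps_target, lam_lo=1e-6, lam_hi=1e3, steps=30):
    # The residual ||A x(lam) - b||_2 is monotone in lam: a larger lam means
    # stronger shrinkage and a larger residual. Bisect (geometrically) on lam
    # until the solution meets the constraint ||Ax - b||_2 ~ eps_target.
    for _ in range(steps):
        lam = np.sqrt(lam_lo * lam_hi)
        r = np.linalg.norm(A @ irls_bpdn(A, b, lam) - b)
        if r > eps_target:
            lam_hi = lam   # residual too large -> shrink less
        else:
            lam_lo = lam   # residual below target -> can shrink more
    return irls_bpdn(A, b, np.sqrt(lam_lo * lam_hi))
```

Without the bisection one would have to scan a dense grid of λ values; the monotonicity of the residual in λ is what makes the binary search valid and efficient.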

Your Questions and Feedback

New Material? The Dantzig Selector (DS)
- This is a new pursuit algorithm
- It was proposed in 2007 by Candes and Tao, and termed the Dantzig-Selector (DS)
- The chosen name pays tribute to George Dantzig, the father of the simplex algorithm that solves Linear Programming (LP) problems
- The connection to LP will become evident shortly

The DS: Introduction
Our task is the approximate sparse recovery problem: min_x ||x||_0 s.t. ||Ax − b||_2 ≤ ε. We already met various approximation algorithms for solving this task. One natural option we had is the Basis-Pursuit: min_x ||x||_1 s.t. ||Ax − b||_2 ≤ ε.
We present now an appealing and surprising alternative pursuit algorithm for sparse approximation: the Dantzig-Selector. This algorithm is competitive with all the algorithms we have met, and it has a few interesting advantages.

The DS: Intuition
The original L2 constraint speaks in terms of the noise power, but says nothing about the noise shape: does the residual behave like random Gaussian noise?
The DS computes the residual and forces it to be nearly uncorrelated with the atoms in A. This is formulated via the L∞ norm: the maximal inner product between the residual and the atoms must be below some threshold.
Intuition: if we look at the residual r = Ax − b and recognize shapes of atoms in it, it means that we have not yet peeled all the true content from the noise.

The DS as Linear Programming
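The slide's formulas did not survive this transcript. As a sketch (assuming the standard DS formulation min ||x||_1 s.t. ||Aᵀ(b − Ax)||_∞ ≤ λ, with my own variable names), splitting x = u − v with u, v ≥ 0 turns the objective into a linear one, and the two-sided ∞-norm constraint becomes 2n linear inequalities, i.e., an LP:

```python
import numpy as np
from scipy.optimize import linprog

def dantzig_selector(A, b, lam):
    # LP form of  min ||x||_1  s.t.  ||A^T (b - A x)||_inf <= lam.
    # Split x = u - v with u, v >= 0, so ||x||_1 = sum(u) + sum(v).
    n = A.shape[1]
    G = A.T @ A
    Atb = A.T @ b
    c = np.ones(2 * n)  # objective: sum(u) + sum(v)
    # -lam <= A^T b - G(u - v) <= lam  becomes two stacked inequalities:
    #    G u - G v <= lam + A^T b   and   -G u + G v <= lam - A^T b
    A_ub = np.vstack([np.hstack([G, -G]), np.hstack([-G, G])])
    b_ub = np.concatenate([lam + Atb, lam - Atb])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (2 * n), method="highs")
    u, v = res.x[:n], res.x[n:]
    return u - v
```

This is why the LP connection is immediate: unlike the Basis-Pursuit, whose quadratic constraint makes it a second-order cone program, the DS is a plain LP and can be handed to any simplex or interior-point solver.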

The DS: The Unitary Case
In the unitary case the DS admits a closed-form solution: a classic soft-shrinkage, just like the one we get for the Basis Pursuit in the unitary case. Thus, in the unitary case BP = DS.
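A minimal sketch of this closed form (assuming A is unitary, so AᵀA = I and the DS constraint becomes ||Aᵀb − x||_∞ ≤ λ, which ||x||_1 minimizes coordinate-wise by soft-thresholding):

```python
import numpy as np

def ds_unitary(A, b, lam):
    # With A^T A = I the DS constraint reads ||beta - x||_inf <= lam,
    # where beta = A^T b. Minimizing ||x||_1 coordinate-wise then gives
    # the soft-shrinkage x_i = sign(beta_i) * max(|beta_i| - lam, 0).
    beta = A.T @ b
    return np.sign(beta) * np.maximum(np.abs(beta) - lam, 0.0)
```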

The DS: Theoretical Guarantee
The DS comes with a near-oracle performance guarantee, proven under two assumptions: the noise is random (not adversarial), and A satisfies RIP/ROP-type conditions.
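The theorem itself is lost in this transcript; for reference, the guarantee of Candes and Tao (2007) takes roughly the following form (treat the constants as indicative): for an s-sparse x, white Gaussian noise of STD σ, and λ proportional to σ√(2 log m), with high probability

$$\|\hat{x}_{DS} - x\|_2^2 \le C^2 \cdot 2\log m \cdot \Big(\sigma^2 + \sum_i \min(x_i^2, \sigma^2)\Big).$$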

The DS: Theoretical Guarantee
What is the ROP condition appearing in this guarantee?
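The slide's definition did not survive; the standard statement (which I am assuming matches the course's) is: the Restricted Orthogonality constant θ_{s,s'} is the smallest quantity such that

$$|\langle A x_1, A x_2 \rangle| \le \theta_{s,s'} \, \|x_1\|_2 \, \|x_2\|_2$$

for every pair of vectors x₁, x₂ with disjoint supports of cardinalities at most s and s'. It complements the RIP: the RIP controls how A acts on a single sparse support, while the ROP controls the interaction between two disjoint supports.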

Near Oracle Performance?
Could we get similar "near-oracle" performance guarantees for the OMP or the Basis-Pursuit? The answer is positive, and there are many papers offering such an analysis. Obviously, the key is to assume random noise instead of an adversarial one, and to rely on probabilistic inequalities. We present one such paper from 2010.

Near Oracle Performance?

A Sample from This Paper's Results
Notation: s – cardinality; m – number of atoms; plus a parameter (> 0) to choose.
Annotations on the slide: "Where is this coming from?"; "This is the oracle performance – why does it look so different from things we have seen?"; "Is this a familiar condition?"
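For orientation (a standard fact I am adding; not necessarily the slide's exact formula): the oracle estimator knows the true support S and solves a least-squares problem restricted to it, achieving

$$E\|\hat{x}_{oracle} - x\|_2^2 = \sigma^2 \, \mathrm{tr}\big((A_S^T A_S)^{-1}\big) \approx s\,\sigma^2.$$

A "near-oracle" guarantee is then a bound of the form const · log m · sσ²: we pay only a logarithmic factor for not knowing the support.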

Zoom In (1): Probability of Success

Zoom In (1): The Condition
Gathering the two parts, our worst-case analysis gives the condition shown above. What is different as we move to Gaussian noise? The noise term max_k |wᵀa_k| cannot be bounded by ε anymore. Instead, we set an arbitrary threshold and bound (from below) the probability that the event max_k |wᵀa_k| ≤ threshold holds. The specific choice of the threshold is made for reasons that will become clear later.
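To fill in this step (my reconstruction; the slide's exact threshold is lost): for a Gaussian variable Z ~ N(0, σ²) the standard tail bound

$$P(|Z| > t) \le \sqrt{\tfrac{2}{\pi}} \, \frac{\sigma}{t} \, e^{-t^2 / 2\sigma^2}$$

suggests choosing the threshold t = σ√(2(1+α) log m) for some α > 0, since then e^{−t²/2σ²} = m^{−(1+α)}, and even after accounting for all m atoms the failure probability decays like m^{−α}.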

Zoom In (2): Probability of Success
The key tool is Sidak’s Lemma. The inner products wᵀa_k are Gaussian RVs with zero mean and σ as their STD (since the atoms are normalized).
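For completeness (a standard statement of the lemma, added here since the slide's formulas are missing): if (X₁, …, X_m) are jointly Gaussian with zero mean, then for any thresholds t₁, …, t_m > 0

$$P\big(|X_1| \le t_1, \ldots, |X_m| \le t_m\big) \ge \prod_{k=1}^{m} P\big(|X_k| \le t_k\big).$$

Applied with X_k = wᵀa_k, it lower-bounds the probability that all m inner products stay below the threshold by a product of m one-dimensional Gaussian probabilities, which is easy to evaluate.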

Zoom In (3): Oracle Performance

We will NOT work with expectations. Instead, we invoke the two bounds established above, and the resulting oracle-like error bound is a random event holding with the probability derived above.

Administrative Issues
- The final project in Course 1 is due on December 16th
- Moving to the Second Course: we meet again on 20/12 to resume the course with its second half
- Please arrive that day after doing Section 1 of Course 2
- Your Research-Project Assignment ….