Learning With Dynamic Group Sparsity
Junzhou Huang (Rutgers University), Xiaolei Huang (Lehigh University), Dimitris Metaxas (Rutgers University)

Outline
 Problem: applications where the useful information is very small compared with the given data (sparse recovery)
 Previous work and related issues
 Proposed method: Dynamic Group Sparsity (DGS)
   DGS definition and one theoretical result
   One greedy algorithm for DGS
   Extension to Adaptive DGS (AdaDGS)
 Applications: compressive sensing, video background subtraction

Previous Work: Standard Sparsity
 Problem: given the linear measurements y = Φx of a sparse signal x ∈ R^n, where Φ ∈ R^{m×n} and m << n, how can we recover x from y?
 No priors on the nonzero entries
 Measurement complexity O(k log(n/k)), too high for large n
 Existing work:
   L1-norm minimization (Lasso, GPSR, SPGL1, etc.)
   Greedy algorithms (OMP, ROMP, SP, CoSaMP, etc.)
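
For concreteness, here is a minimal NumPy sketch (mine, not the authors') of this measurement model: a k-sparse x, a random Gaussian Φ, and y = Φx with m << n.

import numpy as np

rng = np.random.default_rng(0)
n, m, k = 1000, 200, 20                    # ambient dimension, measurements, sparsity; m << n

x = np.zeros(n)                            # k-sparse signal: random support, random values
support = rng.choice(n, size=k, replace=False)
x[support] = rng.standard_normal(k)

Phi = rng.standard_normal((m, n)) / np.sqrt(m)  # random Gaussian sensing matrix
y = Phi @ x                                # linear measurements; recovery asks for x given (y, Phi)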

Previous Work: Group Sparsity
 The indices {1, ..., n} are divided into m disjoint groups G_1, G_2, ..., G_m; suppose only g groups cover the k nonzero entries
 Prior on the nonzero entries: entries within a group are either all zero or all nonzero
 Group complexity: O(k + g log(m))
 Too restrictive for practical applications: the group setting must be known in advance, so it cannot handle dynamic groups
 Existing work: Yuan & Lin '06, Wipf & Rao '07, Bach '08, Ji et al. '08

Proposed Work: Motivation
 More knowledge about the nonzero entries leads to lower complexity:
   No information about the nonzero positions: O(k log(n/k))
   Group priors on the nonzero positions: O(g log(m))
   Known nonzero positions: O(k)
 Advantages: the reduced complexity of group sparsity with the flexibility of standard sparsity

Dynamic Group Sparse Data
 Nonzero entries tend to be clustered in groups
 However, we do not know the group sizes or locations:
   group sparsity cannot be applied directly
   standard sparsity has too high a complexity
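
A small illustrative generator (my assumption of what such data looks like, not code from the paper) for a 1D dynamic group sparse signal: a handful of contiguous clusters with random sizes and locations, so the solver knows neither the group boundaries nor the group count.

import numpy as np

rng = np.random.default_rng(1)
n, q = 1000, 4                       # signal length; number of clusters (unknown to the solver)

x = np.zeros(n)
for _ in range(q):                   # drop q contiguous clusters at random positions
    size = int(rng.integers(5, 20))  # random cluster size
    start = int(rng.integers(0, n - size))
    x[start:start + size] = rng.standard_normal(size)  # clusters may overlap, merging groups

k = np.count_nonzero(x)              # the total sparsity k is clustered, not scattered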

Theoretical Result for DGS
 Lemma: suppose we have dynamic group sparse data with k nonzero entries clustered into q disjoint groups, where q << k; then the DGS complexity is O(k + q log(n/q))
 Better than the standard sparsity complexity O(k + k log(n/k))
 More useful than group sparsity in practice
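
A quick back-of-the-envelope check of how much the clustering prior buys (my own arithmetic; natural logs, constants ignored), for n = 10^6, k = 100, q = 5:

import numpy as np

n, k, q = 10**6, 100, 5
dgs = k + q * np.log(n / q)    # O(k + q log(n/q)): about 161
std = k + k * np.log(n / k)    # O(k + k log(n/k)): about 1021
print(dgs, std)                # roughly a 6x reduction in measurement complexity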

DGS Recovery
 Five main steps (sketched below):
   1. Prune the residue estimation using DGS approximation
   2. Merge the support sets
   3. Estimate the signal using least squares
   4. Prune the signal estimation using DGS approximation
   5. Update the signal/residue estimation and the support set
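
A schematic reconstruction of this loop (my sketch in the style of CoSaMP/Subspace Pursuit, not the authors' code); the pruning routine here is a plain top-k placeholder, with the neighbor-aware DGS version sketched after the next slide.

import numpy as np

def prune_topk(v, k):
    # Placeholder pruning: keep the k largest-magnitude entries (standard sparsity).
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def dgs_recover(y, Phi, k, n_iter=20):
    x = np.zeros(Phi.shape[1])
    residue = y.copy()
    for _ in range(n_iter):
        proxy = prune_topk(Phi.T @ residue, 2 * k)       # step 1: prune the residue estimate
        support = np.union1d(np.flatnonzero(proxy),      # step 2: merge the support sets
                             np.flatnonzero(x))
        coef = np.linalg.lstsq(Phi[:, support], y, rcond=None)[0]  # step 3: least squares
        b = np.zeros(Phi.shape[1])
        b[support] = coef
        x = prune_topk(b, k)                             # step 4: prune the signal estimate
        residue = y - Phi @ x                            # step 5: update residue and support
    return x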

Steps 1, 4: DGS Approximation Pruning
 A nonzero pixel implies that its adjacent pixels are more likely to be nonzero
 Key point: prune the data according to both the value of the current pixel and the values of its adjacent pixels
 Weights can be added to adjust the balance; if the weights on the adjacent pixels are zero, this reduces to standard sparse approximation pruning
 The number of nonzero entries K must be known
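
As I read this slide, the pruning rule scores each entry by a weighted sum of its own energy and its neighbors' energies and keeps the K best; the weights and the 1D neighborhood below are illustrative assumptions, and setting the neighbor weight to zero recovers standard pruning. For 2D images the neighborhood would be the 4- or 8-connected pixels instead.

import numpy as np

def dgs_prune(v, K, w_self=1.0, w_nbr=0.5):
    # Keep the K entries of a 1D float signal with the largest neighbor-aware scores:
    #   score[i] = w_self * v[i]^2 + w_nbr * (v[i-1]^2 + v[i+1]^2)
    # With w_nbr = 0 this reduces to standard sparse approximation pruning.
    e = v ** 2
    nbr = np.zeros_like(e)
    nbr[1:] += e[:-1]                 # energy of the left neighbor
    nbr[:-1] += e[1:]                 # energy of the right neighbor
    score = w_self * e + w_nbr * nbr
    keep = np.argsort(score)[-K:]     # indices of the K best scores
    out = np.zeros_like(v)
    out[keep] = v[keep]
    return out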

AdaDGS Recovery
 Suppose the sparsity range [k_min, k_max] is known
 Choose a sparsity step size
 Iteratively run the DGS recovery algorithm with an increasing sparsity number until a halting criterion is met
 In practice, choosing the halting condition is very important, and there is no optimal way to do it

Two Useful Halting Conditions
 The residue norm in the current iteration is no smaller than that in the previous iteration: fast in practice; used in the inner loop of AdaDGS
 The relative change of the recovered data between two consecutive iterations falls below a threshold (it is not worth taking more iterations if the improvement is small); used in the outer loop of AdaDGS
Both conditions appear in the sketch below.
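
A hedged sketch of the adaptive sweep (mine, not the authors' code; the inner/outer loop structure from the slides is flattened into a single loop for brevity, and the tolerance is illustrative):

import numpy as np

def ada_dgs(y, Phi, dgs_recover, k_min, k_max, step, tol=1e-3):
    # Sweep the sparsity number from k_min to k_max in increments of `step`;
    # dgs_recover(y, Phi, k) is the inner DGS solver from the previous slides.
    x_prev = np.zeros(Phi.shape[1])
    res_prev = np.inf
    for k in range(k_min, k_max + 1, step):
        x = dgs_recover(y, Phi, k)
        res = np.linalg.norm(y - Phi @ x)
        if res >= res_prev:                   # condition 1: residue norm stopped shrinking
            return x_prev
        rel_change = np.linalg.norm(x - x_prev) / max(np.linalg.norm(x), 1e-12)
        if rel_change < tol:                  # condition 2: negligible improvement
            return x
        x_prev, res_prev = x, res
    return x_prev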

Application to Compressive Sensing
 Experiment setup:
   Quantitative evaluation: relative difference between the estimated sparse data and the ground truth
   Running on a 3.2 GHz PC in Matlab
 Goal: demonstrate the advantage of DGS over standard sparsity for compressive sensing of DGS data

Example: 1D Simulated Signals

Statistics: 1D Simulated Signals

Example: 2D Images
Figure: (a) original image; (b) image recovered with MCS [Ji et al. '08]; (c) image recovered with SP [Dai '08]; (d) image recovered with DGS.

Statistics: 2D Images

Video Background Subtraction
 The foreground is typical DGS data:
   The nonzero coefficients are clustered into unknown groups, which correspond to the foreground objects
   Unknown group sizes, locations, and group number
   Temporal and spatial sparsity
Figure: (a) one frame; (b) the foreground; (c) the foreground mask; (d) our result

AdaDGS Background Subtraction
 Previous video frames: let f_t be the foreground image and b_t the background image of frame t, and suppose background subtraction has already been done for frames 1 through t
 New frame:
   Temporal sparsity: the new frame differs from the current background estimate by a sparse x (a Sparsity Constancy assumption in place of the usual Brightness Constancy assumption)
   Spatial sparsity: f_{t+1} is dynamic group sparse

Formulation
 Problem: the unknown z is dynamic group sparse data
 Efficiently solved by the proposed AdaDGS algorithm
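
Putting the formulation together, a per-frame sketch under stated assumptions: ada_dgs_approx is a hypothetical handle to the AdaDGS approximation of a difference image, and the background refresh rule is my guess at a reasonable update, not the paper's.

import numpy as np

def subtract_background(frames, b0, ada_dgs_approx):
    # Per-frame loop: the difference between the new frame and the current
    # background estimate is dynamic group sparse; its DGS approximation is
    # taken as the foreground.
    b = b0.astype(float)
    foregrounds = []
    for frame in frames:
        z = frame.astype(float) - b        # candidate foreground plus background change
        f = ada_dgs_approx(z)              # keep only the dynamic group sparse part
        foregrounds.append(f)
        bg_pixels = (f == 0)               # pixels not claimed by the foreground
        b[bg_pixels] = frame[bg_pixels]    # refresh the background there (assumed rule)
    return foregrounds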

Video Results
(a) original video; (b) our result; (c) result of [C. Stauffer and W. Grimson 1999]

Video Results
(a) original video; (b) our result; (c) result of [C. Stauffer and W. Grimson 1999]; (d) result of [Monnet et al. 2003]

Video Results
(a) original; (b) our result; (c) result of [Elgammal et al. 2002]; (d) result of [C. Stauffer and W. Grimson 1999]
(a) original; (b) proposed; (c) result of [J. Zhong and S. Sclaroff 2003]; (d) result of [C. Stauffer and W. Grimson 1999]

Summary
 Proposed work:
   Definition of DGS and a theoretical result
   The DGS and AdaDGS recovery algorithms
   Two applications
 Future work: real-time implementation of AdaDGS background subtraction (about 3 seconds per frame in the current Matlab implementation)
 Thanks!