Multi-Label Prediction via Compressed Sensing
Daniel Hsu, Sham M. Kakade, John Langford, Tong Zhang (NIPS 2009)
Presented by: Lingbo Li, ECE, Duke University
* Some notes are directly copied from the original paper.

Outline
- Introduction
- Preliminaries
- Learning Reduction
- Compression and Reconstruction
- Empirical Results
- Conclusion

Introduction
- Setting: a large database of images. Goal: predict who or what is in a given image.
- Samples: images with their corresponding labels; $d$ is the total number of entities (possible labels) in the whole database.
- One-against-all algorithm: learn a binary predictor for each label (class). Computation is expensive when $d$ is large (e.g., $d \approx 22{,}000$ for the image data used later); see the baseline sketch below.
- Key assumption: the output label vector is sparse.
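To make the baseline's cost concrete, here is a minimal one-against-all sketch in Python (my illustration, not the paper's code; the data sizes and model choice are arbitrary). It fits $d$ independent binary classifiers, so training and prediction cost grow linearly in $d$:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

rng = np.random.default_rng(0)
n, p, d = 200, 20, 50                          # samples, features, labels
X = rng.standard_normal((n, p))
Y = (rng.random((n, d)) < 0.1).astype(int)     # sparse 0/1 label matrix

# One binary classifier per label: cost scales linearly with d.
ovr = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)
print(ovr.predict(X[:3]).shape)                # (3, 50)
```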

Introduction
Main idea: "learn to predict compressed label vectors, and then use a sparse reconstruction algorithm to recover uncompressed labels from these predictions."
Compressed sensing: for any $k$-sparse vector $y \in \mathbb{R}^d$, it is possible (with high probability) to compress $y$ down to $m = O(k \log d)$ linear measurements, logarithmic in the dimension $d$, and still achieve perfect reconstruction of $y$.
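A minimal numerical illustration of that compressed sensing fact (my sketch; the dimensions and the choice of OMP as the decoder are assumptions, not the paper's):

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
d, k = 1000, 5
m = 4 * k * int(np.log(d))                     # ~ k log d measurements (m = 120)

y = np.zeros(d)                                # a k-sparse signal
support = rng.choice(d, size=k, replace=False)
y[support] = rng.standard_normal(k)

A = rng.standard_normal((m, d)) / np.sqrt(m)   # random Gaussian: RIP w.h.p.
z = A @ y                                      # m compressed measurements

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
omp.fit(A, z)                                  # reconstruct y from z
print(np.allclose(omp.coef_, y, atol=1e-6))    # exact recovery (w.h.p.)
```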

Preliminaries : input space; : output (label) space, where Training data: Goal: to learn the predictor with low mean- squared error Assume is very large; Expected value is sparse, with only a few non-zero entries.

Learning reduction
- Linear compression function $A \in \mathbb{R}^{m \times d}$, where $m \ll d$.
- Rather than predicting the label $y$ with a predictor $F$, predict the compressed label $Ay$ with a predictor $H: \mathcal{X} \to \mathbb{R}^m$.
- Samples $(x, y)$ become compressed samples $(x, Ay)$.
- Train $H$ to minimize $\mathbb{E}_x \|H(x) - A\,\mathbb{E}[y \mid x]\|_2^2$ (see the training sketch below).
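Continuing with X and Y from the sketch above, the training step of the reduction might look like this (a sketch under my assumptions: ridge regression as the base learner and a Gaussian compression matrix, neither mandated by the paper):

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
m = 60                                   # compressed dimension, m << d = 1000
A = rng.standard_normal((m, Y.shape[1])) / np.sqrt(m)

Z = Y @ A.T                              # compressed labels (x_i, A y_i), shape (n, m)
H = Ridge(alpha=1.0).fit(X, Z)           # one multi-output fit = m regression problems
```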

Reduction: training and prediction
Reconstruction algorithm $R$: given the predicted compressed label $H(x)$, output a sparse vector $\hat{y} = R(H(x))$. If $H(x)$ is close to $A\,\mathbb{E}[y \mid x]$, then $R(H(x))$ should be close to $\mathbb{E}[y \mid x]$.
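Prediction then chains $H$ with a reconstruction algorithm $R$; continuing the training sketch above, with OMP playing the role of $R$ (again my choice of decoder):

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

def predict_labels(H, A, x, k):
    z_hat = H.predict(x.reshape(1, -1))[0]   # H(x), an estimate of A E[y|x]
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
    omp.fit(A, z_hat)                        # R: find a sparse y with A y ~ z_hat
    return omp.coef_                         # k-sparse estimate of E[y|x]

y_hat = predict_labels(H, A, X[0], k=4)
print(np.flatnonzero(y_hat))                 # indices of the predicted labels
```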

Compression Functions
Examples of valid compression functions: random matrices that satisfy the restricted isometry property (RIP) with high probability when $m = O(k \log d)$, e.g., matrices with i.i.d. Gaussian entries, matrices with i.i.d. $\pm 1$ (Rademacher) entries, and random row subsets of Fourier or Hadamard matrices (the last is what the experiments use).
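Sketches of the first two families (the $1/\sqrt{m}$ scaling is the usual normalization, my assumption rather than the paper's notation); the Hadamard-row construction is sketched with the experiments below:

```python
import numpy as np

def gaussian_matrix(m, d, rng):
    """i.i.d. N(0, 1/m) entries."""
    return rng.standard_normal((m, d)) / np.sqrt(m)

def rademacher_matrix(m, d, rng):
    """i.i.d. +/- 1/sqrt(m) entries."""
    return rng.choice([-1.0, 1.0], size=(m, d)) / np.sqrt(m)
```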

Reconstruction Algorithms
Examples of valid reconstruction algorithms, all iterative and greedy (a from-scratch OMP sketch follows this list):
- Orthogonal Matching Pursuit (OMP)
- Forward-Backward Greedy (FoBa)
- Compressive Sampling Matching Pursuit (CoSaMP)
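To make the greedy recipe concrete, a minimal from-scratch OMP (a teaching sketch, not the paper's implementation):

```python
import numpy as np

def omp(A, z, k):
    """Recover a k-sparse y from z = A y by greedy column selection."""
    m, d = A.shape
    residual, support = z.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))   # column most correlated with residual
        support.append(j)
        S = A[:, support]
        coef, *_ = np.linalg.lstsq(S, z, rcond=None) # least-squares refit on the support
        residual = z - S @ coef                      # update the residual
    y = np.zeros(d)
    y[support] = coef
    return y
```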

General Robustness Guarantees
What if the reduction creates a problem harder to solve than the original one? The guarantees are stated in terms of the sparsity error, defined as $\mathrm{sperr}_k(y) = \|y - y^{(k)}\|_2^2$, where $y^{(k)}$ is the best $k$-sparse approximation of $y$ (keep the $k$ largest-magnitude entries of $y$ and zero out the rest).
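In code, the sparsity error is just the squared mass outside the $k$ largest entries (a direct transcription of the definition):

```python
import numpy as np

def sperr(y, k):
    """Squared distance from y to its best k-sparse approximation."""
    y_k = np.zeros_like(y)
    top = np.argsort(np.abs(y))[-k:]       # keep the k largest-magnitude entries
    y_k[top] = y[top]
    return float(np.sum((y - y_k) ** 2))

print(sperr(np.array([3.0, -2.0, 0.5, 0.1]), k=2))   # 0.5^2 + 0.1^2 = 0.26
```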

Linear Prediction
If there is a perfect linear predictor of $\mathbb{E}[y \mid x]$, i.e., $\mathbb{E}[y \mid x] = Bx$ for some matrix $B$, then there is a perfect linear predictor of the compressed label $A\,\mathbb{E}[y \mid x]$, namely $AB$, since $A\,\mathbb{E}[y \mid x] = (AB)x$.
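This is just associativity of matrix multiplication; a quick numeric check (the dimensions are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
p, d, m = 10, 100, 20
B = rng.standard_normal((d, p))                # a perfect linear predictor: E[y|x] = B x
A = rng.standard_normal((m, d)) / np.sqrt(m)   # compression matrix
x = rng.standard_normal(p)

print(np.allclose(A @ (B @ x), (A @ B) @ x))   # True: A(Bx) = (AB)x
```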

Experimental Results
Experiment 1: image data (collected by the ESP Game). 65k images with 22k unique labels; keep the 1k most frequent labels. The least frequent of these occurs 39 times, the most frequent about 12k times; there are 4 labels per image on average. Half of the data is used for training and half for testing.
Experiment 2: text data collected from 16k labeled web pages, with 983 unique labels. The least frequent label occurs 21 times, the most frequent about 6,500 times; there are 19 labels per web page on average. Half of the data is used for training and half for testing.
Compression function $A$: select $m$ random rows of the Hadamard matrix (see the sketch below).
Reconstruction: test the greedy and iterative algorithms OMP, FoBa, CoSaMP, and Lasso, and use correlation decoding (CD) as a baseline for comparison.
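A sketch of that compression matrix (padding the label dimension up to a power of two, so that a Hadamard matrix exists, is my assumption about handling $d = 1000$ or $983$):

```python
import numpy as np
from scipy.linalg import hadamard

def hadamard_compression(m, d, seed=0):
    """m random rows of a Hadamard matrix, truncated to d columns."""
    rng = np.random.default_rng(seed)
    n = 1 << (d - 1).bit_length()          # next power of two >= d (1024 for d = 1000)
    H = hadamard(n)                        # +/- 1 entries, mutually orthogonal rows
    rows = rng.choice(n, size=m, replace=False)
    return H[rows, :d] / np.sqrt(m)

A = hadamard_compression(m=100, d=1000)
print(A.shape)                             # (100, 1000)
```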

Experimental Results
Measure: the precision of the predicted labels (a plausible version of this measure is sketched below). [Figures omitted: the top two panels show results on the image data; the bottom panels show results on the text data.]
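A plausible version of the precision measure (the exact definition is in the paper; this one scores how many of the top-$k$ predicted labels are true):

```python
import numpy as np

def precision_at_k(y_true, y_score, k):
    """Fraction of the k highest-scoring predicted labels that are correct."""
    top = np.argsort(y_score)[-k:]
    return y_true[top].sum() / k

y_true  = np.array([0, 1, 0, 1, 0, 0])
y_score = np.array([0.1, 0.9, 0.2, 0.3, 0.8, 0.0])
print(precision_at_k(y_true, y_score, k=2))    # 0.5: one of the top two is correct
```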

Conclusion
- An application of compressed sensing to the multi-label prediction problem with output sparsity.
- An efficient reduction algorithm: the number of regression problems to solve is only logarithmic in the number of original labels.
- Robustness guarantees carry over from the compressed problem to the original problem, and vice versa in the linear prediction setting.