Fast and robust sparse recovery: New Algorithms and Applications. The Institute of Network Coding, The Chinese University of Hong Kong. Sheng Cai, Eric Chan.


Fast and robust sparse recovery: New Algorithms and Applications. Sheng Cai, Eric Chan, Minghua Chen, Sidharth Jaggi, Mohammad Jahangoshahi, Venkatesh Saligrama, Mayank Bakshi. The Institute of Network Coding (INC), The Chinese University of Hong Kong (CUHK).

Fast and robust sparse recovery: an unknown vector x of length n is observed through a measurement process whose output has length m < n; the goal is to reconstruct the k-sparse x.


A. Compressive sensing: recover a k-sparse vector x of length n from m measurements, where k ≤ m < n.
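As a concrete (purely illustrative) instance of the setup above, here is a minimal sketch: a k-sparse x of length n measured through a random matrix A into an output y of length m < n. All parameters and the choice of a random ±1 matrix are made up for illustration, not the talk's construction.

```python
import random

# Illustrative sketch: a k-sparse length-n signal x observed through
# m < n linear measurements y = A x, with A a random {-1, +1} matrix.
random.seed(0)
n, m, k = 20, 8, 2

# k-sparse unknown signal
x = [0.0] * n
for i in random.sample(range(n), k):
    x[i] = random.uniform(1.0, 5.0)

# random dense measurement matrix
A = [[random.choice([-1.0, 1.0]) for _ in range(n)] for _ in range(m)]

# measurement output: y = A x (much shorter than x)
y = [sum(A[r][c] * x[c] for c in range(n)) for r in range(m)]

print(len(y) < len(x))          # m < n: fewer measurements than unknowns
print(sum(1 for v in x if v))   # but only k entries of x are nonzero
```

The point of the sketch is only the shape of the problem: y has far fewer entries than x, yet x has only k degrees of freedom.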

A. Robust compressive sensing: handles approximate sparsity and measurement noise.

Tomography: Computerized Axial Tomography (CAT scan).

B. Tomography: given measurements y and transform T with y = Tx, estimate x.

B. Network Tomography
– Measurements y: end-to-end packet delays
– Transform T: network connectivity matrix (known a priori)
– Infer x: link/node congestion, hopefully "k-sparse"
Compressive sensing? Challenge: the matrix T is "fixed", so only "some" types of measurements can be taken.

C. Robust group testing. What's known [CCJS11]: each test outcome is flipped with probability q; a lower bound on the number of tests for Pr(error) < ε; the Noisy Combinatorial OMP algorithm. (Figure: d defective and n-d non-defective items, tested through a noisy channel with crossover probability q.)

A. Robust compressive sensing: handles approximate sparsity and measurement noise.

Apps: 1. Compression. Compress x + z via BW(x + z) = A(x + z). [M.A. Davenport, M.F. Duarte, Y.C. Eldar, and G. Kutyniok, "Introduction to Compressed Sensing," in Compressed Sensing: Theory and Applications, 2012]

Apps: 2. Fast(er) Fourier Transform. [H. Hassanieh, P. Indyk, D. Katabi, and E. Price, "Nearly Optimal Sparse Fourier Transform," in Proceedings of the 44th Symposium on Theory of Computing (STOC '12)]

Apps: 3. One-pixel camera

Measurement model: y = A(x + z) + e (z models the approximate sparsity of the signal, e the measurement noise).


y = A(x + z) + e: (information-theoretically) order-optimal.

(Information-theoretically) order-optimal support recovery.

SHO-FA: SHO(rt)-FA(st)

O(k) measurements, O(k) time

SHO(rt)-FA(st): O(k) measurements, O(k) decoding steps.

1. Graph → Matrix: a sparse bipartite graph with n left nodes and ck right nodes, each left node of degree d = 3, defines the sensing matrix A.


2. (Most) x-expansion: (almost) every small set S of left nodes has at least 2|S| right neighbors.

3. "Many" leaves: let L be the number of leaf neighbors of S (right nodes with exactly one edge into S) and L' the number of non-leaf neighbors. Counting edges, L + 2L' ≤ 3|S|; expansion gives L + L' ≥ 2|S|. Subtracting, L' ≤ |S| ≤ L, so L/(L + L') ≥ 1/2: at least half of S's neighbors are leaves.
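The counting argument on this slide can be checked mechanically for small sizes: with left degree 3, any feasible (|S|, L, L') satisfying the edge-count and expansion inequalities is forced to have a leaf fraction of at least 1/2. A brute-force verification sketch:

```python
# Mechanical check of the leaf-counting bound for left degree d = 3:
# if L + 2*Lp <= 3*S (edge count) and L + Lp >= 2*S (expansion),
# then L >= S >= Lp, hence the leaf fraction L / (L + Lp) >= 1/2.
for S in range(1, 30):
    for L in range(0, 3 * S + 1):
        for Lp in range(0, 3 * S + 1):
            if L + 2 * Lp <= 3 * S and L + Lp >= 2 * S:
                assert L >= S and Lp <= S
                assert 2 * L >= L + Lp  # leaf fraction >= 1/2
print("bound verified for all small cases")
```

This is only a finite sanity check of the algebra; the slide's subtraction argument proves it for all sizes.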

4. Matrix

Encoding – Recap

Decoding – Initialization

Decoding – Leaf Check (2-Failed-ID)

Decoding – Leaf Check (4-Failed-VER)

Decoding – Leaf Check (1-Passed)

Decoding – Step 4 (4-Passed/STOP)


Decoding – Recap
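The leaf-check-and-peel procedure recapped above can be illustrated end to end on a tiny hand-built instance. This is a simplified sketch with assumed parameters (left degree 2 and hand-picked weights; the construction in the talk uses degree-3 random graphs): each measurement node keeps a plain sum s (a leaf's value), an identification-weighted sum w (names which coordinate the leaf carries), and a verification-weighted sum u (rejects false leaves), after which the recovered coordinate is peeled off its neighbors.

```python
# Tiny hand-built sketch of SHO-FA-style peeling decoding (illustrative
# instance, not the paper's exact construction).
n, m = 6, 4
nbrs = [[0, 1], [1, 2], [0, 2], [0, 3], [2, 3], [1, 3]]  # left degree 2
weight = [2, 3, 4, 5, 6, 7]         # identification weights (distinct)
vweight = [11, 13, 17, 19, 23, 29]  # verification weights (distinct)

x = [0, 5, 0, 0, 2, 0]              # 2-sparse unknown signal

s = [0] * m; w = [0] * m; u = [0] * m
for i in range(n):
    for j in nbrs[i]:
        s[j] += x[i]; w[j] += weight[i] * x[i]; u[j] += vweight[i] * x[i]

xhat = [0] * n
done = set()
progress = True
while progress:
    progress = False
    for j in range(m):
        if s[j] == 0 or w[j] % s[j]:
            continue
        ratio = w[j] // s[j]        # a true leaf has ratio == weight[i]
        cand = [i for i in range(n)
                if j in nbrs[i] and weight[i] == ratio and i not in done]
        if len(cand) != 1 or u[j] != vweight[cand[0]] * s[j]:
            continue                # identification or verification failed
        i = cand[0]
        xhat[i] = s[j]; done.add(i)
        for jj in nbrs[i]:          # peel the recovered coordinate away
            s[jj] -= xhat[i]; w[jj] -= weight[i] * xhat[i]
            u[jj] -= vweight[i] * xhat[i]
        progress = True

print(xhat == x)  # True: exact recovery on this instance
```

On this instance, node 1 is a leaf for coordinate 1 (value 5); peeling it turns node 2 into a leaf for coordinate 4 (value 2), and decoding finishes in a single pass.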


Noise / approximate sparsity

Measurement/phase error

Correlated phase measurements



Network Tomography
Goal: infer network characteristics (edge or node delays).
Difficulties:
– Edge-by-edge (or node-by-node) monitoring is too slow
– Inaccessible nodes
Wanted: network tomography with very few end-to-end measurements, quickly, for arbitrary network topologies.

B. Network Tomography
– Measurements y: end-to-end packet delays
– Transform T: network connectivity matrix (known a priori)
– Infer x: link/node congestion, hopefully "k-sparse"
Compressive sensing? Idea: "mimic" a random matrix. Challenge: the matrix T is "fixed", so only "some" types of measurements can be taken.
Our algorithm: FRANTIC (Fast Reference-based Algorithm for Network Tomography vIa Compressive sensing).

A Better TOMORROW: fast TOMOgRaphy oveR netwOrks with feW probes

SHO-FA: bipartite graph with n left nodes, ck right nodes, and left degree d = 3, giving the matrix A.

1. Integer-valued CS [BJCC12]: "SHO-FA-INT"

2. Better mimicking of desired T

Node delay estimation

Edge delay estimation

Idea 1: Cancellation
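The cancellation idea can be shown with a toy sketch (made-up topology and delay values, not the talk's example): an end-to-end probe only sees the sum of edge delays along its path, but subtracting two overlapping path measurements cancels the shared edges and isolates the rest.

```python
# Illustrative cancellation sketch: subtracting two probe measurements
# cancels the edges the paths share.
delay = {("a", "b"): 3, ("b", "c"): 1, ("b", "d"): 6}  # unknown edge delays

def probe(path):
    """End-to-end delay of a probe packet along a list of nodes."""
    return sum(delay[(p, q)] for p, q in zip(path, path[1:]))

y1 = probe(["a", "b", "c"])   # (a,b) + (b,c)
y2 = probe(["a", "b", "d"])   # (a,b) + (b,d)

# the shared edge (a,b) cancels, leaving (b,d) - (b,c)
print(y2 - y1 == delay[("b", "d")] - delay[("b", "c")])  # True
```

This is the building block that lets fixed-topology probes emulate measurement rows the matrix T could not provide directly.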

Idea 2: "Loopy" measurements
– Fewer measurements
– Arbitrary packet injection/reception
– Not just 0/1 matrices (SHO-FA)

SHO-FA + Cancellations + Loopy measurements
Path delay: O(MDn/k); O(MD'n/k) with Steiner or "average Steiner" trees; open (???) for graph decompositions.
Parameters:
– n = |V| or |E|
– M = "loopiness"
– k = sparsity
Results:
– Measurements: O(k log(n)/log(M))
– Decoding time: O(k log(n)/log(M))
– General graphs, node/edge delay estimation

C. GROTESQUE: Noisy GROup TESting (QUick and Efficient)

What's known [CCJS11]: each test outcome is flipped with probability q; a lower bound on the number of tests for Pr(error) < ε; the Noisy Combinatorial OMP algorithm. (Figure: d defective and n-d non-defective items, tested through a noisy channel with crossover probability q.)

(Chart: decoding complexity vs. number of tests for adaptive, non-adaptive, and 2-stage adaptive schemes, against the lower bounds; complexities shown include O(poly(D) log(N)), O(D² log(N)), O(DN), and O(D log(N)) [NPR12], alongside this work.)

(Chart: decoding complexity vs. number of tests, highlighting this work.)

Hammer: GROTESQUE testing

Multiplicity testing

Localization: noiseless and noisy variants.

Nail: "Good" Partitioning + GROTESQUE, for n items with d defectives.

Adaptive Group Testing O(n/d)

Adaptive Group Testing: O(n/d) groups; GROTESQUE takes O(d log(n)) time and tests, with a constant fraction of defectives recovered.

Adaptive Group Testing: each stage recovers a constant fraction of the remaining defectives, so the number of tests and the running time decay geometrically across stages.
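The geometric decay is why the totals stay on the order of the first stage. A worked arithmetic check (with made-up constants c, n, d, assuming half the defectives are resolved per stage): the per-stage costs form a geometric series bounded by twice the first stage's cost.

```python
import math

# Illustrative arithmetic for geometrically decaying stages: stage i
# starts with about d / 2**i unresolved defectives and spends
# c * remaining * log(n) tests, so the total cost is at most twice the
# cost of stage one.
n, d, c = 10**6, 512, 1.0
stage_costs = []
remaining = d
while remaining >= 1:
    stage_costs.append(c * remaining * math.log(n))
    remaining //= 2  # a constant fraction (here half) resolved per stage

total = sum(stage_costs)
print(total <= 2 * stage_costs[0])  # True: geometric series bound
```

The same bound holds for any constant resolved fraction, only with a different constant in front.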

Adaptive Group Testing: T = O(log D) stages.

Non-Adaptive Group Testing: a constant fraction of groups are "good"; O(D log(D)).

Non-Adaptive Group Testing Iterative Decoding

2-Stage Adaptive Group Testing: hash the items into S = D² bins; when S = poly(D), with high probability no two defectives share the same "birthday" (bin). Stage 1 takes O(D log(D) log(D²)) tests and time; overall the scheme uses O(D log(N)) tests and time.
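The birthday-paradox intuition behind the bin count can be checked by simulation (illustrative constants, not the talk's parameters): hashing D defectives into S bins leaves them all in distinct bins with probability roughly exp(-D(D-1)/(2S)), a constant for S = D² and close to 1 for larger poly(D).

```python
import random

# Illustrative birthday-paradox check for the first stage: hash D
# defectives into S bins and estimate how often all land in distinct
# bins.
random.seed(0)
D, trials = 50, 2000
results = {}
for S in (D**2, D**3):
    distinct = sum(
        len({random.randrange(S) for _ in range(D)}) == D
        for _ in range(trials)
    )
    results[S] = distinct / trials

print(results[D**2] > 0.5, results[D**3] > 0.9)
```

With S = D² the no-collision probability is already a constant, which is enough for the second stage to clean up the few colliding defectives.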

Network Tomography. Observation: only a few edges (or nodes) are "unknown", so this is a sparse recovery problem.

Compressive Sensing: recover a k-sparse x of length n from m random measurements, k ≤ m < n.

Network Tomography as a Compressive Sensing Problem: end-to-end delays (y) are linear combinations of edge or node delays (x). But the network topology is fixed, whereas compressive sensing assumes random measurements.

Faster, Higher, Stronger

1. Better CS [BJCC12]: "SHO(rt)-FA(st)". (Chart: decoding complexity vs. number of measurements, placing RS'60, TG'07, CM'06, C'08, IR'08, SBB'06, GSTV'06, MV'12, KP'12, and DJM'11 against our work and the lower bound.)

SHO(rt)-FA(st): O(k) measurements, O(k) time.

High-Level Overview: a bipartite graph with n left nodes and ck right nodes, shown for a k = 2 sparse example, with matrix A.

High-Level Overview:
– How to find the leaf nodes and use them for decoding?
– How to guarantee that leaf nodes exist?

Bipartite Graph → Sensing Matrix: n left nodes, ck right nodes, left degree d = 3; distinct weights yield the "sparse & random" matrix A.


Sensing Matrix → Measurement Design

2. Better mimicking of the desired A

Node delay estimation. Problems:
– General graphs
– Inaccessible nodes
– Edge delay estimation

Edge delay estimation

Idea 1: Cancellation

Idea 2: "Loopy" measurements
– Fewer measurements
– Works even with inaccessible nodes (e.g. v_3)
– Goes beyond 0/1 matrices (SHO-FA)

SHO-FA + Cancellations + Loopy measurements
Path delay: O(MDn/k); O(MD'n/k) with Steiner trees; O(MD''n/k) with "average" Steiner trees; open (???) for graph decompositions.
Parameters:
– n = |V| or |E|
– M = "loopiness"
– k = sparsity
Results:
– Measurements: O(k log(n)/log(M))
– Decoding time: O(k log(n)/log(M))
– General graphs, node/edge delay estimation

D. Threshold Group Testing: n items, d defectives. Each test's probability of a positive output depends on the number of defective items in the group. Goal: find all d defectives. Our result: … tests suffice (previous best algorithms: …).
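The threshold-test model can be sketched as follows (assumed thresholds l and u and an assumed linear interpolation in the gap; the talk's exact parameters are not reproduced here): a test is negative when the group holds at most l defectives, positive when it holds at least u, and randomly positive in between.

```python
import random

# Illustrative threshold-test model with assumed thresholds l < u.
random.seed(0)
l, u = 2, 5

def threshold_test(num_defectives_in_group):
    if num_defectives_in_group <= l:
        return False               # too few defectives: always negative
    if num_defectives_in_group >= u:
        return True                # enough defectives: always positive
    # gap regime: outcome is random (here: linearly interpolated odds,
    # an assumption for illustration)
    p = (num_defectives_in_group - l) / (u - l)
    return random.random() < p

print(threshold_test(0), threshold_test(7))  # False True (deterministic)
```

Only the gap regime is stochastic; the deterministic extremes are what a decoder can rely on.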

Summary: Fast and Robust Sparse Recovery algorithms
– Compressive sensing: order-optimal complexity and number of measurements
– Network tomography: nearly optimal complexity and number of measurements
– Group testing: optimal complexity, nearly optimal number of tests
– Threshold group testing: nearly optimal number of tests

THANK YOU 謝謝