Compressive sensing meets group testing: LP decoding for non-linear (disjunctive) measurements. Chun Lam Chan, Sidharth Jaggi and Samar Agnihotri, The Chinese University of Hong Kong.

Presentation transcript:

Compressive sensing meets group testing: LP decoding for non-linear (disjunctive) measurements Chun Lam Chan, Sidharth Jaggi and Samar Agnihotri The Chinese University of Hong Kong Venkatesh Saligrama Boston University

2 — Compressive sensing: what's known. A length-n signal with d non-zero entries (n−d zeros); the slide compares the number of measurements needed by the lower bound, OMP, and BP (formulas shown as images).

3 — Group testing: what's known. n items, of which d are defective; each test outcome is flipped with probability q (left intact with probability 1 − q). The slide compares the lower bound, Noisy Combinatorial OMP, and this work, Noisy Combinatorial BP [CCJS11] (test-count formulas shown as images).

4 — Group-testing model [CCJS11]: each item is included in each test independently with probability p = 1/D.
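The measurement model on this slide can be sketched in code: each item joins each test independently with probability 1/d (the slide writes 1/D), a test is positive iff it pools at least one defective (a disjunctive/OR measurement), and in the noisy setting each outcome is flipped with probability q. Function and variable names are illustrative, not from the paper.

```python
import random

def make_instance(n, d, T, q=0.0, seed=0):
    """Random group-testing instance: Bernoulli(1/d) pools,
    disjunctive (OR) outcomes, each outcome flipped with probability q."""
    rng = random.Random(seed)
    defectives = set(rng.sample(range(n), d))
    p = 1.0 / d
    # pools[t] is the set of items included in test t
    pools = [{i for i in range(n) if rng.random() < p} for _ in range(T)]
    y = [int(bool(pool & defectives)) for pool in pools]    # noiseless OR outcomes
    y_hat = [b ^ int(rng.random() < q) for b in y]          # noisy observed outcomes
    return pools, defectives, y, y_hat
```

With q = 0 the observed outcomes coincide with the noiseless ones; with q > 0 each test independently lies with probability q.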

5 — CBP-LP relaxation: minimize total weight, subject to constraints from the positive tests and the negative tests (the LP appears as an image).
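The LP on this slide appears only as an image; a standard rendering of the noiseless CBP-LP relaxation (my reconstruction, with $S_t$ denoting the set of items pooled in test $t$) is:

```latex
\[
\begin{aligned}
\min_{x} \quad & \textstyle\sum_{i=1}^{n} x_i \\
\text{s.t.} \quad & \textstyle\sum_{i \in S_t} x_i \ge 1 && \text{for every positive test } t,\\
& x_i = 0 && \text{for every } i \text{ appearing in a negative test},\\
& 0 \le x_i \le 1 && \text{for all } i.
\end{aligned}
\]
```

Negative tests pin their items to zero, each positive test must be "covered" by total weight at least 1, and the objective keeps the solution sparse.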

6 — NCBP-LP: add "slack"/noise variables so that noisy tests can be violated at a cost; minimizing total slack implements minimum-distance decoding (the LP appears as an image).
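Correspondingly, a plausible rendering of NCBP-LP (again my reconstruction, not necessarily the paper's exact formulation): give each test $t$ a slack variable $\xi_t$ that absorbs a flipped outcome, and minimize total slack — minimum-distance decoding, i.e. choosing an $x$ whose ideal outcomes disagree with the observed ones on as few tests as possible:

```latex
\[
\begin{aligned}
\min_{x,\,\xi} \quad & \textstyle\sum_{t=1}^{T} \xi_t \\
\text{s.t.} \quad & \textstyle\sum_{i \in S_t} x_i + \xi_t \ge 1 && \text{for every positive test } t,\\
& \textstyle\sum_{i \in S_t} x_i - \xi_t \le 0 && \text{for every negative test } t,\\
& 0 \le x_i \le 1, \quad \xi_t \ge 0 .
\end{aligned}
\]
```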

7 — "Perturbation analysis":
1. For all perturbations, … ("conservation of mass").
2. LP change under a single ρᵢ (case analysis).
3. LP change under all n(n−d) ρᵢ's (Chernoff/union bounds).
4. LP change under all (∞) perturbations (convexity).
(5.) If d is unknown but bounded, try them all ("info theory").

8 — 1. Perturbation vectors: a figure shows the NCBP-LP feasible set around the true x, with perturbation vectors ρᵢ, ρⱼ acting on the d defective and n−d non-defective coordinates.

9 — 2. LP value change with ONE perturbation vector (figure).

10 — 3. LP value change with EACH of the n(n−d) perturbation vectors: a Chernoff bound per perturbation plus a union bound give Prob(error) < … (bound shown as image).

11 — 4. LP value change under ALL (∞) perturbations: by convexity of the LP minimum, the finitely many perturbation directions suffice; Prob(error) < … (bound shown as image).

12 — (5.) NCBP-LPs: an information-theoretic argument shows that just a single d "works".

13 — Bonus: NCBP-SLPs: simplified LPs that use ONLY negative tests, or ONLY positive tests.

14 — (figure).

Noiseless CBP (slides 15–23, built up incrementally): n−d non-defective items and d defective items.
- Sample g times to form a group; groups that test positive are discarded.
- Total non-defective items drawn: … (formula shown as image).
- Coupon collection: … (formula shown as image).
- Conclusion: … (formula shown as image).
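The coupon-collection argument can be checked numerically: every item that appears in at least one negative pool is certified non-defective, and the scheme succeeds once all n−d non-defectives have been "collected" this way, which Θ(d log n) random pools achieve with high probability. A sketch with illustrative parameters and constants (not the paper's):

```python
import math
import random

def cbp_noiseless(n, d, T, seed=0):
    """Sample T random pools (each item included w.p. 1/d).  Every item seen
    in a pool that tests negative is certified non-defective; once all n-d
    non-defectives are collected (coupon collection), the uncertified
    remainder is exactly the defective set."""
    rng = random.Random(seed)
    defectives = set(rng.sample(range(n), d))
    certified = set()
    for _ in range(T):
        pool = {i for i in range(n) if rng.random() < 1.0 / d}
        if not pool & defectives:      # negative test: all members are clean
            certified |= pool
    return set(range(n)) - certified, defectives

# T = c * d * ln(n) pools; c = 16 is a generous illustrative constant
decoded, truth = cbp_noiseless(n=200, d=5, T=int(16 * 5 * math.log(200)), seed=3)
```

Note that a defective can never be certified (any pool containing one tests positive), so errors are one-sided: the only failure mode is an uncollected non-defective.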

Noisy CBP (slides 24–27): the same grouping scheme under test noise, illustrated in figures.

Noiseless COMP (slides 28–32): decoding illustrated step by step in figures.
Noisy COMP (slides 33–39): decoding illustrated step by step in figures.

Simulations (slides 40–41): plots.

Summary (slide 42): with small error probability, … (test-count formulas shown as images).

Noiseless COMP, worked example (matrix figures): input vector x, measurement matrix M, outcome vector y. Successive frames highlight individual items (x₉, x₇, x₄), showing how each negative test certifies the items it pools as non-defective, leaving the remaining items declared defective.
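The worked example above is the COMP rule: any item that appears in a negative test cannot be defective, and every item never so cleared is declared defective. A minimal sketch (the 9-item instance below is made up for illustration, not the slide's exact matrix):

```python
def comp_noiseless(tests, y):
    """COMP rule: clear every item appearing in a negative test;
    whatever is never cleared is declared defective."""
    items = set().union(*tests)
    cleared = set()
    for t, outcome in enumerate(y):
        if outcome == 0:               # negative test: all its items are clean
            cleared |= tests[t]
    return items - cleared

# Illustrative instance: items 1..9, true defectives {4, 7}
tests = [{1, 2, 3}, {4, 5, 6}, {7, 8, 9}, {2, 5, 8}, {3, 6, 9}, {1, 4, 7}]
y = [int(bool(pool & {4, 7})) for pool in tests]   # -> [0, 1, 1, 0, 0, 1]
```

Here the three negative tests clear items {1, 2, 3, 5, 6, 8, 9}, so COMP outputs exactly {4, 7}.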

Noisy COMP, worked example (matrix figures): observed outcomes ŷ differ from y by a noise vector ν. Successive frames highlight individual items (x₃, x₂, x₇), showing how decoding must tolerate flipped test outcomes.
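In the noisy frames the clear/keep logic becomes a threshold test: noise can make a test containing a defective read 0 (and vice versa), so instead of demanding that every test containing an item be positive, declare item i defective when the fraction of i's tests reading positive is high enough, e.g. at least 1 − q(1 + τ). This threshold form follows the usual noisy-COMP recipe; τ and the tiny instance below are illustrative, not the slide's exact constants.

```python
def ncomp(tests, y_hat, n, q, tau=0.1):
    """Noisy COMP: declare item i defective iff the fraction of tests
    containing i that read positive is at least 1 - q*(1 + tau)."""
    defective = set()
    for i in range(n):
        mine = [t for t, pool in enumerate(tests) if i in pool]
        if not mine:
            continue                     # untested item: no evidence either way
        positives = sum(y_hat[t] for t in mine)
        if positives >= len(mine) * (1 - q * (1 + tau)):
            defective.add(i)
    return defective

# Tiny illustrative instance: true defective is item 2;
# the outcomes of tests 3 and 9 have been flipped by noise
tests = [{0, 2}, {1, 2}, {2, 3}, {0, 1}, {3, 4},
         {4, 5}, {0, 5}, {1, 3}, {2, 4}, {2, 5}]
y_hat = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0]
```

Item 2 sees 4 of its 5 tests positive (0.8 ≥ 0.78 with q = 0.2, τ = 0.1) and is kept; every other item falls below the threshold despite the two flipped tests.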