Fast and robust sparse recovery New Algorithms and Applications The Chinese University of Hong Kong The Institute of Network Coding Sheng Cai Eric Chan.


1 Fast and robust sparse recovery: New Algorithms and Applications. The Chinese University of Hong Kong, The Institute of Network Coding (INC, CUHK). Sheng Cai, Eric Chan, Minghua Chen, Sidharth Jaggi, Mohammad Jahangoshahi, Venkatesh Saligrama, Mayank Bakshi.

2 Fast and robust sparse recovery: unknown x of length n → measurement (m < n rows) → measurement output → reconstruct x (x is k-sparse).

3 Fast and robust sparse recovery (m < n measurements of a length-n unknown).

4 A. Compressive sensing: recover an unknown k-sparse x of length n from m measurements, with k ≤ m < n.
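As a toy illustration of this setup (the parameters n, m, k and the Gaussian matrix below are made-up assumptions for the sketch, not the construction from the talk), the measurement model can be written in a few lines of NumPy:

```python
import numpy as np

# Toy compressive-sensing setup (assumed parameters): an unknown k-sparse
# vector x of length n is observed through m < n linear measurements
# y = A x; the reconstruction task is to recover x from (y, A).
rng = np.random.default_rng(0)
n, m, k = 100, 30, 3                      # ambient dim, measurements, sparsity

x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.normal(size=k)           # the k nonzero entries

A = rng.normal(size=(m, n)) / np.sqrt(m)  # random measurement matrix (m < n)
y = A @ x                                 # the m measurement outputs
```

With m much smaller than n the system is underdetermined, and it is the sparsity of x that makes recovery possible at all.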

5 A. Robust compressive sensing: approximate sparsity; measurement noise.

6 Tomography: Computerized Axial Tomography (CAT scan).

7 B. Tomography: estimate x given y and T, where y = Tx.

8 B. Network Tomography
Measurements y: end-to-end packet delays
Transform T: network connectivity matrix (known a priori)
Infer x: link/node congestion, hopefully "k-sparse"
Compressive sensing? Challenge: matrix T is "fixed"; can only take "some" types of measurements

9 C. Robust group testing: n items, d defective; each test outcome passes through a noisy channel (output flipped with probability q). What's known [CCJS11]: a lower bound on the number of tests for Pr(error) < ε, and Noisy Combinatorial OMP.


11 A. Robust compressive sensing Approximate sparsity Measurement noise 11 ?

12 Apps: 1. Compression. W(x+z), BW(x+z) = A(x+z). M.A. Davenport, M.F. Duarte, Y.C. Eldar, and G. Kutyniok, "Introduction to Compressed Sensing," in Compressed Sensing: Theory and Applications, 2012.

13 Apps: 2. Fast(er) Fourier Transform. H. Hassanieh, P. Indyk, D. Katabi, and E. Price, "Nearly optimal sparse Fourier transform," in Proceedings of the 44th Symposium on Theory of Computing (STOC '12).

14 Apps: 3. One-pixel camera. http://dsp.rice.edu/sites/dsp.rice.edu/files/cs/cscam.gif

15 y = A(x+z) + e


19 y = A(x+z) + e: (information-theoretically) order-optimal.

20 (Information-theoretically) order-optimal support recovery.

21 SHO-FA: SHO(rt)-FA(st)

22 O(k) measurements, O(k) time

23 SHO(rt)-FA(st): O(k) measurements, O(k) steps.

24 1. Graph ↔ Matrix: bipartite graph with n left nodes and ck right nodes, left degree d = 3; its adjacency pattern defines the matrix A.


27 2. (Most) x-expansion: every small set S of left nodes has at least 2|S| right neighbors (|N(S)| ≥ 2|S|).

28 3. "Many" leafs: let L be the neighbors of S with exactly one edge into S (leafs) and L′ the rest. Expansion gives L + L′ ≥ 2|S|; counting the 3|S| edges gives L + 2L′ ≤ 3|S|. Subtracting, L′ ≤ |S|, hence L ≥ 2|S| − L′ ≥ |S| ≥ L′, so L/(L + L′) ≥ 1/2. (The weaker count L ≥ |S| with L + L′ ≤ 3|S| already gives L/(L + L′) ≥ 1/3.)

29 4. Matrix

30 Encoding – Recap. 0101001010

31 Decoding – Initialization 31

32 Decoding – Leaf Check (2-Failed-ID)

33 Decoding – Leaf Check (4-Failed-VER) 33

34 Decoding – Leaf Check (1-Passed)

35 Decoding – Step 4 (4-Passed/STOP) 35

36 Decoding – Recap. 0000000000 → ? ? ? → 0001000010

37 Decoding – Recap. 0101001010
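The leaf-peeling idea behind this decoding recap can be sketched on a tiny, hand-built noiseless instance (the graph, weights, and signal below are invented for illustration; the actual SHO-FA construction uses random graphs, phase-based weights, and verification steps):

```python
import numpy as np

# Hand-built toy instance: 6 signal nodes, 4 check nodes, left degree d = 2,
# distinct integer weights per signal node, a 2-sparse signal.
n, ck = 6, 4
edges = {0: [0, 1], 1: [1, 2], 2: [2, 3],
         3: [3, 0], 4: [0, 2], 5: [1, 3]}
w = [float(j + 1) for j in range(n)]        # distinct weights

x = np.zeros(n)
x[0], x[2] = 2.0, -1.0                      # k = 2 sparse signal

# Two measurements per check: plain sum and weighted sum of its neighbors.
y1, y2 = np.zeros(ck), np.zeros(ck)
for j in range(n):
    for c in edges[j]:
        y1[c] += x[j]
        y2[c] += w[j] * x[j]

# Peeling: a "leaf" check sees exactly one surviving nonzero neighbor,
# so the ratio y2/y1 equals that neighbor's unique weight.
xhat = np.zeros(n)
for _ in range(n):
    done = True
    for c in range(ck):
        if abs(y1[c]) < 1e-9:
            continue
        ratio = y2[c] / y1[c]
        j = int(round(ratio)) - 1            # candidate signal node
        if 0 <= j < n and c in edges[j] and abs(ratio - w[j]) < 1e-6:
            val = y1[c]
            xhat[j] += val                   # decode this coordinate
            for c2 in edges[j]:              # remove its contribution
                y1[c2] -= val
                y2[c2] -= w[j] * val
            done = False
    if done:
        break
```

On this instance every nonzero coordinate eventually appears alone at some check, so peeling recovers x exactly; the point of the expansion/leaf argument on the earlier slides is that this keeps happening with high probability for the random construction.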


39 Noise/approximate sparsity

40 Measurement/phase error

41 Correlated phase measurements


44 Network Tomography
Goal: infer network characteristics (edge or node delay)
Difficulties:
– Edge-by-edge (or node-by-node) monitoring too slow
– Inaccessible nodes

45 Network Tomography
Goal: infer network characteristics (edge or node delay)
Difficulties:
– Edge-by-edge (or node-by-node) monitoring too slow
– Inaccessible nodes
Network tomography:
– with very few end-to-end measurements
– quickly
– for arbitrary network topology

46 B. Network Tomography
Measurements y: end-to-end packet delays
Transform T: network connectivity matrix (known a priori)
Infer x: link/node congestion, hopefully "k-sparse"
Compressive sensing? Idea: "mimic" a random matrix. Challenge: matrix T is "fixed"; can only take "some" types of measurements
Our algorithm: FRANTIC (Fast Reference-based Algorithm for Network Tomography vIa Compressive sensing)


48 A Better TOMORROW: fast TOMOgRaphy oveR netwOrks with feW probes

49 SHO-FA: bipartite graph with n left nodes, ck right nodes, d = 3; matrix A.

50 1. Integer-valued CS [BJCC12]: "SHO-FA-INT"

51 2. Better mimicking of desired T

52 Node delay estimation


55 Edge delay estimation

56 Idea 1: Cancellation

57 Idea 2: "Loopy" measurements
– Fewer measurements
– Arbitrary packet injection/reception
– Not just 0/1 matrices (as in SHO-FA)
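One concrete reading of "not just 0/1 matrices": a probe whose route loops may traverse an edge several times, so its row of the measurement matrix holds traversal counts. A made-up three-edge example:

```python
# Toy "loopy" tomography measurement (edges, delays, and route are
# invented for illustration): the probe's route revisits edge e1, so the
# corresponding measurement-matrix row has an entry of 2, not just 0/1.
edges = ["e1", "e2", "e3"]
delay = {"e1": 1.5, "e2": 0.2, "e3": 3.0}   # unknown per-edge delays
route = ["e1", "e2", "e1", "e3"]            # e1 traversed twice (a loop)

row = {e: route.count(e) for e in edges}    # one row of the matrix
y = sum(row[e] * delay[e] for e in edges)   # observed end-to-end delay
```

Integer-valued rows like this are exactly what the SHO-FA-INT variant mentioned earlier is designed to handle.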


59 SHO-FA + Cancellations + Loopy measurements
Path delay: O(MDn/k)
Parameters:
– n = |V| or |E|
– M = "loopiness"
– k = sparsity
Results:
– Measurements: O(k log(n)/log(M))
– Decoding time: O(k log(n)/log(M))
– General graphs, node/edge delay estimation

60 SHO-FA + Cancellations + Loopy measurements
Path delay: O(MD′n/k) (Steiner/"average Steiner" trees)
Parameters:
– n = |V| or |E|
– M = "loopiness"
– k = sparsity
Results:
– Measurements: O(k log(n)/log(M))
– Decoding time: O(k log(n)/log(M))
– General graphs, node/edge delay estimation

61 SHO-FA + Cancellations + Loopy measurements
Path delay: ??? (graph decompositions)
Parameters:
– n = |V| or |E|
– M = "loopiness"
– k = sparsity
Results:
– Measurements: O(k log(n)/log(M))
– Decoding time: O(k log(n)/log(M))
– General graphs, node/edge delay estimation

62 C. GROTESQUE: Noisy GROup TESting (QUick and Efficient)

63 C. Robust group testing (recap): n items, d defective; each test outcome passes through a noisy channel (output flipped with probability q). What's known [CCJS11]: a lower bound on the number of tests for Pr(error) < ε, and Noisy Combinatorial OMP.

64 Decoding complexity vs. # tests (chart): lower bounds marked for the adaptive, non-adaptive, and 2-stage adaptive settings; prior points include O(poly(D)log(N)) decoding with O(D² log(N)) tests, and O(DN) decoding with O(D log(N)) tests [NPR12]; this work plotted against both.

65 Decoding complexity vs. # tests: this work.

66 Hammer: GROTESQUE testing

67 Multiplicity ?

68 Localization ? Noiseless: Noisy:
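In the noiseless case, the localization step can be illustrated with the classic binary-splitting trick for a single defective: about log2(n) tests whose outcomes spell out the defective's index in binary. This is a simplified sketch, not GROTESQUE's noisy procedure:

```python
import math

# Simplified noiseless localization sketch (not GROTESQUE's noisy version):
# test t pools every item whose t-th binary digit is 1, so the pattern of
# positive/negative outcomes reads out the defective's index in binary.
def localize_single_defective(n, defective):
    bits = max(1, math.ceil(math.log2(n)))
    idx = 0
    for t in range(bits):
        pool = [i for i in range(n) if (i >> t) & 1]
        if defective in pool:       # one (noiseless) group test
            idx |= 1 << t
    return idx
```

The noisy version on the slide has to repeat and error-correct these tests, which is where the extra logarithmic factors in the test counts come from.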

69 Nail: "good" partitioning. Apply GROTESQUE to n items with d defectives.

70 Adaptive Group Testing O(n/d)

71 Adaptive Group Testing: O(n/d) groups; GROTESQUE: O(d log(n)) time and tests, constant fraction of defectives recovered.

72 Adaptive Group Testing: each stage recovers a constant fraction of the remaining defectives; # tests and time decay geometrically across stages.
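The geometric-decay accounting can be checked numerically (the factor 1/2 per stage and the per-defective cost below are assumed constants, for illustration only): the total number of tests is dominated by the first stage.

```python
# Illustration of the adaptive scheme's accounting: if each stage recovers
# half the remaining defectives and spends tests proportional to what
# remains, per-stage costs decay geometrically, so the total is at most
# twice the first stage's cost.
def total_tests(d, tests_per_defective=1.0, stages=20):
    total, remaining = 0.0, float(d)
    for _ in range(stages):
        total += tests_per_defective * remaining   # this stage's tests
        remaining /= 2                             # half recovered
    return total
```

This is why the overall test/time budget stays O(d log(n)) even though the procedure runs for multiple stages.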

73 Adaptive Group Testing: T = O(log D)

74 Non-Adaptive Group Testing: constant fraction of groups "good"; O(D log(D)).

75 Non-Adaptive Group Testing: iterative decoding.

76 2-Stage Adaptive Group Testing: S = D² bins.

77 2-Stage Adaptive Group Testing: S = D² bins; O(D log(D) log(D²)) tests and time.

78 2-Stage Adaptive Group Testing: no defectives share the same "birthday" when S = poly(D) = D²; O(D log(D) log(D²)) tests and time.
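The "birthday" claim rests on standard birthday-paradox arithmetic: hashing D defectives uniformly into S = D² bins leaves all of them in distinct bins with probability better than 1/2, as a quick computation confirms (illustrative only):

```python
# Birthday-paradox arithmetic: probability that D defectives hashed
# uniformly into S = D*D bins all land in distinct bins.
def no_collision_probability(D):
    S = D * D
    p = 1.0
    for i in range(D):
        p *= (S - i) / S    # the i-th defective avoids the i occupied bins
    return p
```

With S = D² the probability stays bounded away from 0 as D grows, which is what lets the second stage treat each bin as holding at most one defective.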

79 2-Stage Adaptive Group Testing: S = D²; O(D log(N)) tests and time.


81 Network Tomography. Observation: only a few edges (or nodes) are "unknown" ⇒ a sparse recovery problem.

82 Compressive Sensing: random measurement matrix; recover a k-sparse x of length n from m measurements, k ≤ m < n.

83 Network Tomography as a Compressive Sensing Problem: end-to-end delay, edge delay.

84 Network Tomography as a Compressive Sensing Problem: end-to-end delay, node delay.

85 Network Tomography as a Compressive Sensing Problem: end-to-end delay, node delay; fixed network topology vs. random measurements.

86 Faster, Higher, Stronger

87 1. Better CS [BJCC12]: "SHO(rt)-FA(st)". Chart: decoding complexity vs. # of measurements, plotting RS'60, TG'07, CM'06, C'08, IR'08, SBB'06, GSTV'06, MV'12, KP'12, DJM'11, the lower bound, and our work.

88 SHO(rt)-FA(st): O(k) measurements, O(k) time.

89 High-Level Overview: bipartite graph with n left nodes and ck right nodes (k = 2 example); matrix A.

90 High-Level Overview: how to guarantee the existence of leaf nodes, and how to find the leaf nodes and use them for decoding.

91 Bipartite Graph → Sensing Matrix: n left nodes, ck right nodes, d = 3; distinct weights define A.

92 Bipartite Graph → Sensing Matrix: distinct weights yield a "sparse & random" matrix A.

93 Sensing Matrix → Measurement Design

94 2. Better mimicking of the desired A

95 Node delay estimation

97 Node delay estimation
Problems:
– General graph
– Inaccessible nodes
– Edge delay estimation

98 Edge delay estimation

99 Idea 1: Cancellation

100 Idea 2: "Loopy" measurements: fewer measurements; works even if there are inaccessible nodes (e.g. v3); go beyond 0/1 matrices (SHO-FA).

101 SHO-FA + Cancellations + Loopy measurements
Path delay: O(MDn/k); O(MD′n/k) (Steiner trees); O(MD′′n/k) ("average" Steiner trees); ??? (graph decompositions)
Parameters:
– n = |V| or |E|
– M = "loopiness"
– k = sparsity
Results:
– Measurements: O(k log(n)/log(M))
– Decoding time: O(k log(n)/log(M))
– General graphs, node/edge delay estimation

102 D. Threshold Group Testing: n items, d defectives. Each test: the probability that the output is positive depends on the # of defective items in the group. Goal: find all d defectives. Our result: tests suffice (bound in figure); previous best algorithms: (bound in figure).

103 Summary: Fast and Robust Sparse Recovery algorithms
– Compressive sensing: order-optimal complexity and # of measurements
– Network tomography: nearly optimal complexity and # of measurements
– Group testing: optimal complexity, nearly optimal # of tests
– Threshold group testing: nearly optimal # of tests

104 THANK YOU 謝謝

