
1 Brief Introduction to Measurement Matrix
Presenter: Yumin (林祐民)  Advisor: Prof. An-Yeu Wu  Date: 2014/04/08

2 Outline
Compressive Sensing
Construct Sensing Matrix
- Criteria of RIP Matrices
- Random Sensing
- Deterministic Sensing
Application of Compressive Sensing
- Medical Imaging
- Compressive Imaging
Midterm Presentation
- Information
- Paper Survey

3 Compressive Sensing

4 Compressive Sensing (1/2)
[1][2][3]
Traditional digital data acquisition
- Sample data at the Nyquist rate
- Compress the data
Compressive sensing
- Main idea: compression within sampling

5 Compressive Sensing (2/2)
Measure what should be measured

6 Construct Sensing Matrix - Criteria of RIP Matrices - Random Sensing - Deterministic Sensing

7 Measurement
Fundamental questions in compressive sensing:
- How to construct suitable sensing matrices Φ
- How to recover the signal
From orthogonal-basis sensing to non-linear sensing:
Orthogonal-basis sensing (V is full rank, so every sample is kept):
x = (x0, x1, x2, x3, ..., xn)
v0 = (1, 0, 0, 0, ..., 0) = δ[k]
v1 = (0, 1, 0, 0, ..., 0) = δ[k-1]
vn = (0, 0, 0, 0, ..., 1) = δ[k-n]
y = V·x  →  y0 = x0, y1 = x1, ..., yn = xn
Compressive sensing (fewer rows than columns; each measurement mixes several samples):
v0 = (1, 0, 0, 1, ..., 0)
v1 = (0, 1, 0, 0, ..., 0)
vm = (0, 0, 1, 0, ..., 1)
y = V·x  →  y0 = x0 + x3, y1 = x1 + x8, ..., ym = x2 + xn
Recovery is a non-deterministic polynomial-time (NP) problem → compressive sensing.
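The NP remark refers to the combinatorial sparse-recovery problem behind this underdetermined system; in standard CS notation (not written out on the slide, added here only for reference) it reads

\min_{x \in \mathbb{R}^{N}} \; \|x\|_{0}
\quad \text{subject to} \quad y = \Phi x,
\qquad \Phi \in \mathbb{R}^{M \times N}, \; M \ll N,

and an exhaustive search over all possible supports is intractable. Compressive sensing sidesteps this by designing Φ carefully and exploiting the sparsity of x.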

8 How Can It Work?
Projection Φ is not full rank (M < N), so in general it loses information
We are only interested in K-sparse signals
Design Φ so that each of its M×K submatrices is full rank (ideally close to an orthobasis on the K selected columns)
Use the pseudoinverse to recover the nonzero coefficients of x
y (M×1) = Φ (M×N) · x (N×1), with x K-sparse
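To make the "M×K submatrix full rank / pseudoinverse" argument concrete, here is a small sketch (my own illustration; the support of x is assumed known, which is the easy case the slide alludes to):

import numpy as np

rng = np.random.default_rng(1)
M, N, K = 30, 100, 5

Phi = rng.normal(size=(M, N)) / np.sqrt(M)    # M < N: Phi alone loses information

x = np.zeros(N)
support = rng.choice(N, K, replace=False)     # K-sparse signal
x[support] = rng.normal(size=K)

y = Phi @ x                                   # compressive measurements

# If the K active columns are known and the M x K submatrix has full column rank,
# the nonzero coefficients are recovered with the pseudoinverse of that submatrix.
x_hat = np.zeros(N)
x_hat[support] = np.linalg.pinv(Phi[:, support]) @ y

print(np.allclose(x_hat, x))                  # True (up to numerical precision)

In practice the support is unknown, which is exactly why recovery algorithms such as OMP are needed.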

9 Restricted Isometry Property
Signal sparsity: the signal is S-sparse (e.g., images are approximately sparse in the transforms used by JPEG2000)
Restricted Isometry Property: Φ is nearly orthonormal when operating on sparse vectors
Random constructions achieve a small δ with high probability
(Figure: signal x and its sparse coefficients α; JPEG2000 example, < 0.1)
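For reference, the RIP of order S is conventionally stated as the two-sided bound below (standard definition; δ here is the δ mentioned on the slide):

(1 - \delta_S)\,\|x\|_2^2 \;\le\; \|\Phi x\|_2^2 \;\le\; (1 + \delta_S)\,\|x\|_2^2
\qquad \text{for all } S\text{-sparse } x,

so a small δ_S means Φ behaves almost like an orthonormal transform on every S-sparse vector.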

10 Criteria of Good Matrices
A good matrix satisfies:
- The column vectors of Φ are only weakly linearly dependent
- The column vectors of Φ have low coherence, i.e., they behave like random vectors
Random matrices satisfy the RIP with high probability
- Nearly orthonormal when operating on sparse vectors
- Random matrix: Gaussian random matrix
- Partial random matrix: random Fourier matrix
δ ≤ (K−1)·μ(Φ) = (K−1)/√M,  μ(Φ) = 1/√M,  spark(Φ) > 2K  →  δ ≤ (1/2)·μ(Φ)·spark(Φ)   [2007, Donoho]
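As a quick numerical check of the low-coherence criterion, the mutual coherence μ(Φ) can be computed directly. The sketch below is my own NumPy illustration (not from the slides): it normalizes the columns and takes the largest off-diagonal entry of the Gram matrix.

import numpy as np

def mutual_coherence(Phi):
    # Largest absolute inner product between distinct, unit-normalized columns of Phi
    cols = Phi / np.linalg.norm(Phi, axis=0, keepdims=True)
    gram = np.abs(cols.T @ cols)
    np.fill_diagonal(gram, 0.0)   # ignore the trivial i == j inner products
    return gram.max()

# Example: a 64 x 256 Gaussian matrix typically has small coherence
M, N = 64, 256
Phi = np.random.randn(M, N) / np.sqrt(M)
print(mutual_coherence(Phi))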

11 Gaussian Random Matrix
Fill the entries of Φ with i.i.d. samples from a Gaussian distribution
This projects the signal onto a "random subspace"
M: number of measurements, S: number of non-zero entries, N: signal dimension
M = O(S·log(N/S)) << N
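A minimal sketch of this construction (my own illustration; the constant 4 in front of S·log(N/S) is an arbitrary assumption, since the slide only gives the order of growth):

import numpy as np

rng = np.random.default_rng(0)

N, S = 1024, 20                        # signal dimension and sparsity
M = int(4 * S * np.log(N / S))         # M = O(S log(N/S)); the factor 4 is an assumption

# Gaussian measurement matrix; the 1/sqrt(M) scaling gives roughly unit-norm columns
Phi = rng.normal(0.0, 1.0 / np.sqrt(M), size=(M, N))

# S-sparse test signal
x = np.zeros(N)
x[rng.choice(N, S, replace=False)] = rng.normal(size=S)

y = Phi @ x                            # M compressive measurements, M << N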

12 Random Fourier Matrix
Partial random measurement matrices
- Generate an N×N matrix Φ0 and choose M rows at random to construct the M×N measurement matrix Φ
- N×N matrix Φ0 → random row set → M×N matrix Φ
M = O(S·log^p(N/S)) << N
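A small sketch of the row-subsampling step (my own illustration; in practice one would apply an FFT to x and keep M outputs instead of forming the dense matrix):

import numpy as np

rng = np.random.default_rng(0)
N, M = 1024, 128

# Full N x N DFT matrix Phi0 (orthonormal with the 1/sqrt(N) scaling)
Phi0 = np.fft.fft(np.eye(N)) / np.sqrt(N)

# Keep M randomly chosen rows -> M x N partial Fourier measurement matrix Phi
rows = rng.choice(N, size=M, replace=False)
Phi = Phi0[rows, :]

x = np.zeros(N)
x[rng.choice(N, 10, replace=False)] = 1.0   # a 10-sparse test signal
y = Phi @ x                                  # M compressive measurements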

13 From Random to Deterministic
Random Sensing
- Non-mainstream of signal processing: worst case
- Less efficient recovery time
- Larger storage
- Fewer measurements for K-sparse signals
Deterministic Sensing
- Mainstream of signal processing: average case
- More efficient recovery time
- Efficient/compact storage
- More measurements for K-sparse signals [4]
Speaker notes: In the early days, most RIP matrices in use were random; loosely speaking, a sufficiently random matrix satisfies the RIP more easily. Later, a large body of work showed that specially designed deterministic matrices can also satisfy the RIP in a statistical sense, or some weaker forms of the RIP, for example those generated from Delsarte–Goethals codes or Reed–Muller codes. This research, and the many new definitions it required, was far from easy, but the main reason deterministic matrices are pursued so actively is the list of advantages above.

14 Issues for Simplifying Measurement Matrices
From complex, to sparse, to:
- Structurally simplified
- Numerically simplified
- Steady recovery performance
Simplifying existing sampling matrices becomes prominent
Speaker notes: The question is whether we can simplify the sampling matrices not only computationally but also in form and in ease of hardware implementation, so as to improve the efficiency and cost-effectiveness of CS at the system front end. One natural idea is to simplify the sampling matrix numerically; another is to simplify its structure. At the same time, these simplifications must not harm the accuracy of back-end recovery.

15 Deterministic Simplification (1/2)
Structurally simplified and numerically simplified constructions:
- DeVore's binary (0/1) matrices
- BCH-bipolar (±1) matrices
- Combinatorial-ternary (±1/0) matrices
Generation complexity = O(kn), sampling complexity = O(kn) vs. generation complexity = O(n), sampling complexity = O(n·log n)
i.i.d. = independently and identically distributed
Speaker notes: As n increases while the number of measurements m is fixed, the sparsity grows, so the probability of successful recovery becomes higher and higher. (Each point averages k = 1000 runs.)

16 Deterministic Simplification (2/2)
(Figures: structural simplification and numerical simplification; empirical probability of success vs. signal length n, and successful recovery rate (SNRrec ≥ 100 dB) vs. number of non-zero entries)
Steady recovery performance!

17 Application of Compressive Sensing - Medical Imaging - Compressive Imaging

18 Applications of Compressive Sensing
Compressive sensing leads to a data acquisition revolution:
- Object Recognition
- Compressive MIMO Radar
- Electronic Gate
- Random Modulator
- Analog-to-Information Conversion
- Modulated Wideband Converter
- Ultrasound Medical Imaging
- Electrocardiography
- Single-pixel Camera
- Compressive Imaging
- Lensless Camera
- High-Speed Periodic Video

19 Portable ECG
[12]
Reduce the data rate in a bio-signal acquisition system
- Sampling rate 256 Hz, resolution 12 bit
- Bandwidth = 256 × 12 = 3072 bit/s ≈ 3 kb/s
- CS can provide up to a 16X compression rate
Ultra-low-power performance
- Bio-signal acquisition devices are usually portable
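A quick back-of-the-envelope check of these numbers (the 16X figure is from the slide; the resulting compressed rate is my own arithmetic):

fs = 256                     # sampling rate in Hz
bits = 12                    # resolution in bits per sample
raw_rate = fs * bits         # 3072 bit/s, i.e. about 3 kb/s
cs_rate = raw_rate / 16      # with 16X CS compression: 192 bit/s
print(raw_rate, cs_rate)     # 3072 192.0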

20 Compare Two Approaches
Adaptive sampling
- Sampling rate is variable
- Additional computation circuit
Compressive sensing
- Lower effective sampling rate
- Threshold circuit to make the signal sparse

21 Ultrasound System Imaging
Portable ultrasound device
- Low power, less memory → use fewer transmitters for beamforming
- High image quality → use more transmitters for beamforming
Trade-off!
How can we use fewer transmitters and still obtain a high-performance ultrasound image?

22 Reconstruction of Ultrasound Imaging
[15]
(Figures: spatial sampling and frequency sampling)

23 Single-Pixel CS Camera
[16] Rice University, 2008
- Random pattern on DMD (digital micromirror device) array
- Single photodetector
- A/D conversion
- Image reconstruction
Each measurement sums the pixels selected by one random binary pattern: y = Φ·x, where the rows of Φ are the 0/1 patterns
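A toy simulation of this measurement process (my own sketch; the 32×32 scene, the number of measurements, and the 0/1 pattern statistics are all illustrative assumptions):

import numpy as np

rng = np.random.default_rng(0)

# Toy 32x32 "scene"; in the real camera this is the optical image focused on the DMD
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
x = img.ravel()                                  # N = 1024 pixels
N = x.size
M = 300                                          # number of single-pixel measurements, M << N

# Each row of Phi is one random 0/1 mirror pattern;
# the single photodetector reading is the sum of the selected pixels
Phi = rng.integers(0, 2, size=(M, N)).astype(float)
y = Phi @ x                                      # M scalar detector readings

Reconstruction would then recover x from (y, Φ) with a sparse-recovery algorithm such as OMP, which is outside the scope of this sketch.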

24 Single-Pixel CS Camera
[16] Image reconstruction

25 Midterm Presentation - Information - How to Find References

26 Information
Date: 4/29 (Tue.) 6:30~8:30
Location: EE2-225
Two people per group; each group presents for 12 minutes, with 3 minutes for questions.
Topics:
1. Image via Compressive Sensing
2. Medical Application via Compressive Sensing
3. Reconstruction Algorithm: Orthogonal Matching Pursuit (OMP)
4. Reconstruction Algorithm: Iterative Thresholding
5. Hardware Implementation of Reconstruction Algorithm
6. Sampling Algorithm: Structured Matrices
Mentor: 陳奕

27 Appendix 4: How to Find References
(1) Google Scholar (a must-know) http://scholar.google.com.tw/
Any book or paper that has an electronic version on the web can be found with this tool. Enter keywords, a journal title, or an author, then press "Search" to find the material you want.

28 (2) Finding IEEE papers
Note: unless you are an IEEE Member, you must be on the campus network to download electronic copies of IEEE papers.
(3) Google
(4) Wikipedia
(5) Mathematics encyclopedia websites, which offer many tables and introductions to mathematical theorems
(6) The traditional way: search at the library — start from the NTU Library homepage, or visit in person

29 (7) Checking whether another library holds the journal you need: NTU Library homepage → Other Union Catalogs → National Union Catalog of Periodicals. If another library has the journal, you can apply for interlibrary loan and ask the NTU Library to obtain a photocopy of the paper for you: NTU Library homepage → Interlibrary Loan.
(8) Checking whether another library holds the book you need: NTU Library homepage → Other Libraries.
(9) Finding e-books: NTU Library homepage → E-books, or Free E-books.

30 (10) Chinese Electronic Theses and Dissertations Service: electronic versions of many master's and PhD theses (especially those from 2006 onward) can be found here.
(11) For an introductory yet reasonably deep understanding of a topic, a book is more suitable than journal papers or Wikipedia; if no suitable book exists, read papers of a "review", "survey", or "tutorial" nature.
(12) Once you have a solid foundation, move on to journal papers (use the paper title, the abstract, and how other papers describe the work to decide whether a journal paper deserves a careful read or only a rough understanding).

