Coarse-to-Fine Image Reconstruction. Rebecca Willett, in collaboration with Robert Nowak and Rui Castro.


1 Coarse-to-Fine Image Reconstruction. Rebecca Willett, in collaboration with Robert Nowak and Rui Castro.

2 Poisson data, ~14 photons/pixel: MSE = 0.0169. Haar tree pruning, O(n) computation: MSE = 0.0033. Wedgelet tree pruning, O(n^(11/6)) computation: MSE = 0.0015.

3 Iterative reconstruction. E-Step: compute the conditional expectation of a new noisy image estimate given the data and the current image estimate. Traditional Shepp-Vardi M-Step: maximum likelihood estimation. Improved M-Step: complexity-regularized multiscale Poisson denoising (Willett & Nowak, IEEE-TMI '03).
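A minimal numpy sketch of the iteration above, using the classical Shepp-Vardi multiplicative ML update (MLEM); the system matrix, sizes, and iteration count are illustrative, and the improved multiscale-denoising M-step is not implemented here:

```python
import numpy as np

def mlem(A, y, n_iters=200, eps=1e-12):
    """Shepp-Vardi EM (MLEM) for a Poisson inverse problem y ~ Poisson(A x).

    E-step ingredient: back-project the ratio of observed counts to the
    counts predicted by the current estimate.
    M-step: multiplicative maximum-likelihood update of the image.
    """
    x = np.ones(A.shape[1])            # flat initial image
    sens = A.T @ np.ones(A.shape[0])   # sensitivity image A^T 1
    for _ in range(n_iters):
        pred = A @ x                              # expected counts under x
        ratio = y / np.maximum(pred, eps)
        x = x * (A.T @ ratio) / np.maximum(sens, eps)
    return x

rng = np.random.default_rng(0)
A = rng.uniform(0.0, 1.0, size=(40, 10))          # toy projection operator
x_true = rng.uniform(0.5, 2.0, size=10)
y = rng.poisson(A @ x_true).astype(float)         # Poisson-distributed data
x_hat = mlem(A, y)                                # nonnegative by construction
```

The multiplicative form is what keeps the iterates nonnegative, which is why MLEM is the standard baseline that the complexity-regularized M-step improves on.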

4 Wedgelet-based tomography: the Shepp-Logan phantom reconstructed by MLE, by Jeff Fessler's PWLS, and by wedgelet-based reconstruction.

5 Tomography

6 A simple image model: a piecewise constant 2-D function with "smooth" edges.

7 Measurement model: access only to n noisy "pixels". Goal: find an estimate of the original image such that its expected squared error is small.
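As a concrete toy instance of this measurement model (signal, noise level, and the deliberately trivial estimator are all illustrative), here is how the squared-error risk quoted throughout these slides is computed:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64 * 64                               # number of noisy "pixels"
f = np.zeros(n)
f[n // 2:] = 1.0                          # piecewise constant ground truth
y = f + rng.normal(0.0, 0.1, size=n)      # one noisy observation per pixel
f_hat = y                                 # trivial estimator: the raw data
mse = np.mean((f_hat - f) ** 2)           # empirical squared-error risk
```

For the raw-data estimator the MSE is just the noise variance (about 0.01 here); the tree-pruned estimators on the earlier slides beat this by an order of magnitude.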

8 Image space

9 Kolmogorov metric entropy

10 Dudley ‘74

11 Minimax lower bound (Korostelev & Tsybakov, '93): approximation error plus estimation error.

12 Adaptively pruned partitions

13 Tree pruning estimation

14 Partitions and estimators. Sum-of-squared-errors empirical risk: the sum over pixels of the squared difference between the data and the candidate estimate.

15 Complexity regularization and the bias-variance trade-off. Complexity penalized estimator: over the set of all possible tree prunings, minimize the sum of a fidelity-to-data term (the empirical risk) and a complexity term proportional to the partition size |P|.
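A runnable 1-D analogue of the complexity-penalized pruning (the 2-D case prunes a quadtree the same way); the signal and penalty value are illustrative sketches, not the authors' implementation:

```python
import numpy as np

def prune(y, pen):
    """Complexity-penalized pruning of a dyadic tree:
    minimize squared-error risk + pen * (number of leaves).
    Returns (cost, leaves); each leaf interval is fit by its sample mean."""
    def rec(lo, hi):
        seg = y[lo:hi]
        leaf_cost = np.sum((seg - seg.mean()) ** 2) + pen
        if hi - lo == 1:
            return leaf_cost, [(lo, hi)]
        mid = (lo + hi) // 2
        lcost, lleaves = rec(lo, mid)
        rcost, rleaves = rec(mid, hi)
        if lcost + rcost < leaf_cost:   # split only if it pays for itself
            return lcost + rcost, lleaves + rleaves
        return leaf_cost, [(lo, hi)]
    return rec(0, len(y))

# noiseless step edge: pruning stops once each leaf is truly constant
y = np.concatenate([np.zeros(8), np.ones(8)])
cost, leaves = prune(y, pen=0.1)   # leaves == [(0, 8), (8, 16)]
```

The bottom-up comparison of "split" versus "leaf" costs is exactly the bias-variance trade-off: the penalty charges for each extra leaf (variance), the squared-error term charges for a partition too coarse to fit the data (bias).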

16 The Li-Barron bound (Li & Barron, '00; Nowak & Kolaczyk, '01): the risk of the penalized estimator is bounded by an approximation error (bias) term plus an estimation error (variance) term.

17 The Kraft inequality: pruned trees can be indexed by binary prefix codewords (e.g. 1 1110 111110000000 0000), and the lengths of any prefix code satisfy the Kraft inequality, which makes codeword length a valid complexity penalty.
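A short sketch of the inequality itself on a toy prefix code (the codewords here are illustrative, not the tree encoding from the slide):

```python
# For any binary prefix code with codeword lengths l_i, Kraft's inequality
# says sum_i 2**(-l_i) <= 1. This is what lets codeword lengths stand in
# for model complexities in the penalized estimator.
codewords = ["0", "10", "110", "111"]   # a complete binary prefix code

# check prefix-freeness: no codeword is a prefix of another
for a in codewords:
    for b in codewords:
        assert a == b or not b.startswith(a)

kraft_sum = sum(2.0 ** -len(w) for w in codewords)
assert kraft_sum <= 1.0                 # Kraft holds for every prefix code
```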

18 Estimating smooth contours - Haar. Decorate each partition set with a constant: the resulting squared approximation error decays slowly, because this class of models is not well matched to the class of images.

19 Approximating smooth contours - wedgelets (Donoho '99).

20 Approximating smoother contours (Donoho '99): original image; Haar wavelet partition (> 850 terms); wedgelet partition (< 370 terms).

21 Estimating smoother contours - wedgelets. Use wedges and decorate each partition set with a constant: the resulting squared approximation error achieves the best possible rate!

22 The problem with estimating smooth contours. Haar-based estimation: simple computation, poor approximation. Wedgelet estimation: complex computation, good approximation.

23 Computational implications

24 A solution: coarse-to-fine model selection. The space of all signal models, from which one model must be selected, is very large; the two-step process first searches over a coarse model space.

25 Coarse-to-fine model selection: the second step searches over a small subset of models.

26 C2F wedgelets: two-stage optimization. Start with a uniform partition. Stage 1: adapt the partition to the data by pruning. Stage 2: apply wedges only in the small boxes that remain.
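A simplified, runnable sketch of the two-stage idea (function name, block size, and variance threshold are all illustrative): stage 1 fits a constant on each coarse block and flags the blocks that a constant fits poorly; a real implementation would fit wedgelets only inside those flagged blocks in stage 2.

```python
import numpy as np

def c2f_preview(img, block=8, thresh=0.05):
    """Stage 1 ("preview") of coarse-to-fine estimation, simplified:
    leave unpruned only the coarse blocks whose residual variance under a
    constant fit suggests an edge passes through them."""
    flagged = []
    for i in range(0, img.shape[0], block):
        for j in range(0, img.shape[1], block):
            patch = img[i:i + block, j:j + block]
            if np.var(patch) > thresh:   # constant fit poor -> likely edge
                flagged.append((i, j))
    return flagged

# toy image: a vertical edge at column 12 crosses one column of blocks
img = np.zeros((32, 32))
img[:, 12:] = 1.0
edge_blocks = c2f_preview(img)   # only the four blocks with j == 8
```

Because only the flagged blocks ever see the expensive wedge search, the second stage runs over a small fraction of the image, which is the source of the computational savings claimed on the following slides.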

27 C2F wedgelets: two-stage optimization

28 Error analysis of two-stage approach: (Castro, Willett, Nowak, ICASSP ‘04)

29 Controlling variance in the preview stage. Starting with a coarse partition in the first stage:
–lowers the variance of the coarse-resolution estimate
–with high probability, the pruned coarse partition is close to the optimal coarse partition
–boxes left unpruned at this stage indicate edges or boundaries

30 Controlling bias in the preview stage. Bias becomes large if a square containing a boundary fragment is pruned in the first stage (this may happen when a boundary lies close to the side of a square). Solution: compute TWO coarse partitions, one normal and one shifted, and refine any region left unpruned in either shift. (Figure: a potential problem area on the normal grid is no longer a problem after the shift.)

31 Computational implications

32 Main result in action. Noisy data: MSE = 0.0052. Stage 1 result: MSE = 0.1214. Stage 2 result: O(n^(7/6)), MSE = 0.00046. Compare with standard wedgelet denoising, O(n^(11/6)), MSE = 0.00073: significant computational savings and a better result!

33 C2F limitations: the "ribbon" (low resolution vs. high resolution).

34 C2F and other greedy methods: matching pursuit; 20 Questions (Geman & Blanchard, '03); boosting.

35 More general image models: platelets, i.e. planar fits (Willett & Nowak, IEEE-TMI '03; Willett & Nowak, Wavelets X; Nowak, Mitra, & Willett, JSAC '03).

36 Platelet approximation theory. For twice continuously differentiable images, the m-term approximation error decay rates are: Fourier, O(m^(-1/2)); wavelets, O(m^(-1)); wedgelets, O(m^(-1)); platelets, O(m^(-2)); curvelets, O(m^(-2)).

37 Confocal microscopy simulation: noisy image, Haar estimate, platelet estimate.

38 C2F limitations: complex images “Images are edges”: many images consist almost entirely of edges C2F model still appropriate for many applications: –nuclear medicine –feature classification –temperature field estimation

39 C2F in multiple dimensions

40 Final remarks and ongoing work. Careful greedy methods can perform as well as exhaustive searches, both in theory and in practice. Coarse-to-fine estimation dramatically reduces computational complexity. Similar ideas can be used in other scenarios: –reduce the amount of data required (e.g., active learning and adaptive sampling) –reduce the number of bits required to encode model locations in compression schemes

