
1 Martin Burger Institut für Numerische und Angewandte Mathematik European Institute for Molecular Imaging CeNoS Total Variation and Related Methods

2 Martin Burger, Total Variation, Cetraro, September 2008. Mathematical Imaging@WWU: Christoph Brune, Alex Sawatzky, Frank Wübbeling, Thomas Kösters, Martin Benning, Bärbel Schlake, Marzena Franek, Christina Stöcker, Mary Wolfram, Thomas Grosser, Jahn Müller

3 Imaging Basics: What and Why?
- Denoising: Given a noisy version of an image, find a smooth approximation (more appealing to the human eye or better suited for some task, e.g. counting cells in a microscope image).
- Decomposition: Given an image, decompose it into different parts, such as smooth structure, texture, edges, noise.
- Deblurring: Given a blurred (and possibly noisy) version of an image, find an approximation of the original image.
- Inpainting: Given an image with holes, fill the holes with a reasonable continuation.

4 Imaging Basics: What and Why?
- Segmentation: Find the edges / the different objects in an image.
- Reconstruction: Given indirect information about an image, e.g. from tomography, find an approximation of the image.
Many of these tasks can be categorized as inverse problems: the reconstruction of the cause of an observed effect (via a mathematical model relating the two). Diagnosis in medicine is a prototypical example. "The grand thing is to be able to reason backwards." - Arthur Conan Doyle (A Study in Scarlet)

5 Noisy Images. Noise arises from measurement devices or transmission losses.

6 Damaged Images: corrupted pixels, dust, scratches.

7 Medical Imaging: CT. Classical image reconstruction example: computerized tomography (CT). Mathematical problem: reconstruction of a density function from its line integrals, i.e. inversion of the Radon transform; cf. Natterer 86, Natterer-Wübbeling 02.

8 Medical Imaging: CT. Classical image reconstruction example: computerized tomography (CT).

9 Medical Imaging: CT
+ Low noise level
+ High spatial resolution
+ Exact reconstruction
+ Reasonable costs
- Restricted to a few seconds (radiation exposure, ~20 mSv)
- No functional information
- Few mathematical challenges left
Soret, Bacharach, Buvat 07; Schäfers et al 07

10 Medical Imaging: MR
+ Low noise level
+ High spatial resolution
+ Reconstruction by Fourier inversion
+ No radiation exposure
+ Good contrast in soft matter
- Low tracer sensitivity
- Limited functional information
- Expensive
- Few mathematical challenges left
Courtesy Carsten Wolters, University Hospital Münster

11 Medical Imaging: Ultrasound
+ Fast and cheap
+ Varying spatial resolution
+ Usually no reconstruction needed
+ No radiation exposure
- High noise level
- Bad contrast / bones

12 Imaging Examples: PET (Human / Small Animal). Positron emission tomography. Data: detected decay events of a radioactive tracer. The decay events are random, but their rate is proportional to the tracer uptake (a Radon transform with random directions). Imaging of molecular properties.

13 Medical Imaging: PET
+ High sensitivity
+ Long measurement time (minutes up to ~1 hour; radiation exposure 8-12 mSv)
+ Functional information
+ Many open mathematical questions
- Little anatomical information
- High noise level and disturbing effects (damping, scattering, ...)
- Low spatial resolution
Soret, Bacharach, Buvat 07; Schäfers et al 07

14 Small Animal PET: Burning down the Mouse

15 Image Reconstruction in PET. Stochastic models are needed: the measurements are typically drawn from a Poisson model. The "image" u equals the density function (uptake) of the tracer; the linear operator K equals the Radon transform; possibly there is additional (Gaussian) measurement noise b.
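The slide's formulas are not preserved in the transcript; in standard notation the description above can be sketched as a count in each detector bin i,

```latex
f_i \sim \operatorname{Poisson}\bigl((Ku)_i + b_i\bigr),
\qquad
-\log p(f \mid u) = \sum_i \Bigl[ (Ku + b)_i - f_i \log (Ku + b)_i \Bigr] + \text{const},
```

so the data-fidelity term for PET is a Kullback-Leibler-type functional rather than a least-squares term.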

16 Cameras and Microscopes. The same model with a different K can be used for imaging with photons (microscopy, CCD cameras, ...). Typically the Poisson statistics are good (many photon counts) and the measurement noise dominates. In some cases the opposite is true!

17 Low SNR. Bad statistics arise from low radioactive activity or from isotopes that decay fast (e.g. H2^15O). Desirable for patients; desirable for certain quantitative investigations (H2^15O is a useful tracer for blood flow). ~10,000 events vs. ~600 events.

18 Basic Paradigms. Typical imaging tasks are solved by a compromise between the following two goals:
- stay close to the data;
- among the candidates close to the data, choose the one that corresponds best to a-priori ideas / knowledge.
The measure of how close one wants to stay to the data is the SNR, respectively the noise level. For zero noise / infinite SNR one would reproduce the data exactly; the higher the noise level / the lower the SNR, the farther the solution may be from the data.

19 Imaging Models: Continuum and Discrete. Image: u. Data: f. Related by a (sometimes nonlinear) operator.

20 Imaging Models. We usually use an abstract treatment with an image space X and a data space Y. The digital (discrete) model is nowadays the realistic one; however, there are several reasons to interpret it as a discretization of an underlying continuum model:
- images come at different resolutions and should be comparable;
- rich mathematical models exist in the continuum (PDEs);
- robustness of numerical methods.

21 Relation between Image and Data. Denoising: the data are the image itself plus noise. Decomposition: the data are the sum of the components to be separated.

22 Relation between Image and Data. Deblurring: the data are a convolution of the image (plus noise). Inpainting: the data are the image restricted to the complement of the inpainting region.

23 Relation between Image and Data. Segmentation: the image is observed directly, the unknowns are the edges / objects. Reconstruction: the data are indirect measurements of the image, e.g. its Radon transform.

24 Bayes Paradigm. The two goals are translated into probabilities:
- the conditional data probability p(f | u);
- the a-priori probability p(u) of an image, in the absence of data.

25 Bayes Paradigm and MAP. Together they determine the a-posteriori probability of an image. The a-priori probability of the data is a scaling factor and can be ignored. A natural estimator is the image maximizing this probability: the maximum a-posteriori probability (MAP) estimator.

26 MAP Estimator. The MAP estimator can be computed by minimizing the negative log-likelihood. The a-priori probability can be related to a regularization term.
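In standard notation (a sketch of the formulas, which the transcript does not preserve), the Bayes/MAP reasoning of these slides reads:

```latex
p(u \mid f) = \frac{p(f \mid u)\, p(u)}{p(f)} \;\propto\; p(f \mid u)\, p(u),
\qquad
\hat u_{\mathrm{MAP}} = \arg\max_u p(u \mid f)
= \arg\min_u \bigl[ -\log p(f \mid u) - \log p(u) \bigr],
```

and a prior of the form p(u) proportional to exp(-alpha R(u)) turns the second term into a regularization functional alpha R(u).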

27 The Log-Likelihood. The probability of observing data f when the exact image is u can be related to the distribution of the noise. Example: additive Gaussian noise (pointwise).

28 The Log-Likelihood. The log-likelihood becomes a sum, which converges to an integral in the continuum limit.
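For pointwise additive Gaussian noise of variance sigma^2 this can be sketched as:

```latex
p(f \mid u) \;\propto\; \prod_i \exp\Bigl(-\frac{(f_i - u_i)^2}{2\sigma^2}\Bigr),
\qquad
-\log p(f \mid u) = \frac{1}{2\sigma^2}\sum_i (f_i - u_i)^2 + \text{const}
\;\longrightarrow\;
\frac{1}{2\sigma^2}\int_\Omega (f - u)^2 \, dx .
```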

29 Variational Model. The above reasoning directly yields a standard variational model: the MAP estimate is determined by minimizing the functional.
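In the denoising case the functional has the standard form (a sketch; with a general operator K the fidelity term becomes the integral of (Ku - f)^2):

```latex
J(u) = \frac{1}{2}\int_\Omega (u - f)^2 \, dx + \alpha R(u) \;\to\; \min_u .
```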

30 Variational Model II. One can show that the above minimization is equivalent to a constrained formulation.
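The equivalent constrained formulation can be sketched as (standard notation; the slide's own formula is not in the transcript):

```latex
\min_u R(u)
\quad \text{subject to} \quad
\int_\Omega (u - f)^2 \, dx \;\le\; \sigma^2 |\Omega| .
```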

31 Discrepancy Principle. The second formulation is a (generalized) discrepancy principle for Gaussian noise: minimize the regularization (i.e. maximize the a-priori probability) among all images whose data discrepancy is of the order of the variance. Alternatively, this can be interpreted as a rule for choosing the regularization parameter: choose it such that the discrepancy of the minimizer matches the noise variance.

32 Image Space and Regularization. The image space and the a-priori probability are directly related: X consists of all images with positive probability or, equivalently, with finite regularization functional. What is the right choice of R?

33 Image Space and Regularization. How can we get reasonable regularization terms? This depends on the goals and on the expectations for the solution. Typical expectation: smoothness, in particular few oscillations (high oscillations = noise, to be eliminated). Few oscillations means a small gradient variance, i.e. a small squared gradient norm.
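The standard quadratic choice, consistent with the Poisson-equation regularity discussed on the following slides (a sketch; the slide's formula is not in the transcript):

```latex
R(u) = \frac{1}{2}\int_\Omega |\nabla u|^2 \, dx .
```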

34 Denoising Example. Consider the MAP estimate for Gaussian noise with the above regularization: an unconstrained optimization problem with a simple optimality condition.

35 Reminder: Gateaux Derivative. The Gateaux derivative of a functional is the collection of all directional derivatives.
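In standard notation, the definition reads:

```latex
J'(u; v) = \lim_{t \to 0} \frac{J(u + t v) - J(u)}{t}
\qquad \text{for all directions } v .
```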

36 Optimality Condition. Compute the Gateaux derivative and set it to zero. For the quadratic gradient regularization, optimality is the weak form of the elliptic equation u - alpha * Laplace(u) = f (with homogeneous Neumann boundary conditions).
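Because this optimality condition is linear, the denoised image can be computed in a single solve. A minimal numerical sketch (the discretization and function name are mine, not from the slides; periodic boundaries are assumed so that the FFT diagonalizes the Laplacian):

```python
import numpy as np

def h1_denoise(f, alpha):
    """Solve u - alpha * Laplace(u) = f on a periodic grid via the FFT.

    Sketch of the quadratic (H^1) denoising model; 'alpha' is the
    regularization parameter."""
    n, m = f.shape
    kx = 2 * np.pi * np.fft.fftfreq(n)
    ky = 2 * np.pi * np.fft.fftfreq(m)
    # Fourier symbol of the 5-point discrete Laplacian (always <= 0)
    symbol = (2 * np.cos(kx)[:, None] - 2) + (2 * np.cos(ky)[None, :] - 2)
    u_hat = np.fft.fft2(f) / (1 - alpha * symbol)
    return np.real(np.fft.ifft2(u_hat))
```

With the Neumann boundary conditions of the continuum model one would use a discrete cosine transform instead of the FFT; the structure of the solver stays the same.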

37 Elliptic Regularity. We were looking for a function in the Sobolev space H^1.

38 Elliptic Regularity. Regularity theory for the Poisson equation implies u is in H^2; hence u even has second derivatives and may be oversmoothed. Note: the derivatives blow up as the regularization parameter tends to zero.

39 Scale Space and Inverse Scale Space. The square root of the Lagrange (regularization) parameter defines a scale: u varies at a scale of that order, and smaller scales in f are suppressed.

40 Scale Space and Inverse Scale Space. Multiple scales are obtained by iterating the variational method with a small regularization parameter.
Scale space methods (diffusion filters): start with the noisy image (finest scales) and gradually coarsen the scales until a certain minimal scale is reached.
Inverse scale space methods (Bregman iterations): start with the roughest information about the image (largest scale = the whole image, i.e. start with the mean value) and gradually refine the scales until a certain minimal scale is reached.

41 Variational Methods. The variational method can be interpreted both as a scale space method and as an inverse scale space method.

42 Scale Space Methods: Diffusion Filters. Alternative construction of a scale space method: reinterpret the optimality condition.

43 Scale Space Methods. Iterate the variational method, using the previous result as data.

44 Scale Space Method. Start with the noisy image u(0) = f; evolve u by the diffusion (heat) equation; the denoised result is u(T) for a suitable stopping time T.
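The diffusion filter of this slide can be sketched with an explicit finite-difference scheme (my own discretization; periodic boundaries and the parameter names are assumptions):

```python
import numpy as np

def heat_scale_space(f, t_end, dt=0.2):
    """Diffusion filter: evolve du/dt = Laplace(u) with u(0) = f.

    Explicit Euler with the 5-point Laplacian and periodic boundaries;
    stable for dt <= 0.25. Stopping time t_end selects the scale."""
    u = f.astype(float).copy()
    n_steps = int(round(t_end / dt))
    for _ in range(n_steps):
        lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0)
               + np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
        u += dt * lap
    return u
```

The flow preserves the mean value of the image and monotonically removes fine-scale oscillations; stopping earlier keeps finer scales.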

45 Inverse Scale Space Method. Alternative: start with the coarsest scale.

46 Inverse Scale Space Method. The opposite limit (oversmoothing) yields a flow; the denoised result is u(T), where the stopping time T depends on the noise level.
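The slide's formula is not in the transcript; in the notation of the Bregman-iteration literature the inverse scale space flow can be sketched as

```latex
\partial_t p(t) = f - u(t),
\qquad p(t) \in \partial R(u(t)),
\qquad u(0) = \text{mean value of } f,
```

stopped at a time T matched to the noise level.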

47 Variational Methods. Alternative smoothing by penalizing the coefficients in orthonormal bases.

48 Fourier Series. Rewrite the functional in terms of Fourier coefficients; the minimization then decouples into an equivalent minimization for each coefficient.

49 Fourier Series. Explicit solution of the minimization problem.
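For the quadratic gradient penalty the coefficient-wise problems have the explicit solution (a sketch in standard notation, consistent with the quadratic regularization above):

```latex
\hat u_k = \frac{\hat f_k}{1 + \alpha |k|^2},
```

i.e. high frequencies are damped, the more strongly the larger |k| is.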

50 Other Orthonormal Bases / Wavelets. Analogous approach for other orthonormal bases: minimization over the coefficients.

51 Problems with Quadratic Regularization. Quadratic regularization yields simple linear equations to solve, but has several disadvantages:
- oversmoothing (see above);
- edges are destroyed;
- bias from the operator A (see later).
Alternative: other functions of the gradient.

52 Nonquadratic Regularization. The optimality condition is now a nonlinear elliptic equation; linearization is possible for smooth and strictly convex G.
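For a regularization of the form R(u) = integral of G(grad u), the optimality condition of the denoising functional can be sketched as

```latex
u - \alpha\, \nabla \cdot \bigl( \nabla G(\nabla u) \bigr) = f,
```

and linearizing around a smooth u yields an elliptic operator with coefficient matrix given by the Hessian of G at grad u, which is uniformly elliptic when G is smooth and strictly convex.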

53 Total Variation. The only way to penalize oscillations without enforcing full elliptic regularity is to choose G not smooth / not strictly convex. Canonical choice: G(p) = |p|, i.e. the total variation.

54 Minimization Problem. The minimization problem has no solution in general (more on this later); the problem needs to be posed on a larger space.

55 Total Variation. Rigorous definition:
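The slide's formula is the standard dual definition of the total variation:

```latex
|u|_{TV} = \sup \Bigl\{ \int_\Omega u \, \nabla \cdot g \, dx
\;:\; g \in C_c^1(\Omega; \mathbb{R}^d), \ \|g\|_\infty \le 1 \Bigr\},
```

which is finite exactly for functions u in the space BV(Omega) of functions of bounded variation.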

56 Why TV Methods? Cartooning: linear filter vs. TV method.

57 Why TV Methods? Cartooning: ROF model with increasing allowed variance.

58 Sparsity. Analogous approach in an orthonormal basis: penalization with a weighted 1-norm.

59 Sparsity. Most coefficients will be zero (a sparse solution); the shrinkage of the coefficients is data-dependent. Total variation leads to sparsity in the gradient: the gradient is zero at most points, and at the remaining points there are usually edges.
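The 1-norm penalty admits an explicit coefficient-wise minimizer: the soft-thresholding (shrinkage) operator. A minimal sketch of the generic operator (not tied to a specific basis from the slides):

```python
import numpy as np

def soft_threshold(c, tau):
    """Coefficient-wise minimizer of 0.5 * (x - c)**2 + tau * |x|:
    shrink each coefficient towards zero by tau, small ones become 0."""
    return np.sign(c) * np.maximum(np.abs(c) - tau, 0.0)

# Small coefficients are mapped exactly to zero -> sparse solution
coeffs = np.array([3.0, -2.0, 0.5, -0.1])
shrunk = soft_threshold(coeffs, 1.0)
```

Iterating such shrinkage steps, with the analysis and synthesis transforms of the chosen orthonormal basis in between, is the core of iterative soft-thresholding algorithms.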

