Probability density function (pdf) estimation using isocontours/isosurfaces • Application to Image Registration • Application to Image Filtering

1

2 • Probability density function (pdf) estimation using isocontours/isosurfaces • Application to Image Registration • Application to Image Filtering • Circular/spherical density estimation in Euclidean space

3 Histograms, kernel density estimates, mixture models. Parameter selection: bin width / bandwidth / number of components. Bias/variance tradeoff (large bandwidth: high bias; small bandwidth: high variance). Sample-based methods: they do not treat a signal as a signal.

4 Continuous image representation using some interpolant. Trace out isocontours of the intensity function I(x,y) at several intensity values.

5

6 Assume a uniform density on (x,y). Apply a random-variable transformation from (x,y) to (I,u), where u is the direction along the level set (a dummy variable). Integrate out u to get the density of the intensity I. Every point in the image domain contributes to the density. Published in CVPR 2006 and PAMI 2009.
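Sketch of the resulting estimator, assuming a uniform density on (x,y) over the image domain Ω (this is the coarea-formula form of the intensity density; s denotes arc length along the isocontour):

p_I(\alpha) = \frac{1}{|\Omega|} \int_{\{(x,y)\,:\,I(x,y)=\alpha\}} \frac{ds}{\|\nabla I(x,y)\|}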

7

8 Relationships between geometric and probabilistic entities.

9 • Similar density estimator developed by Kadir and Brady (BMVC 2005) independently of us. • Similar idea: several differences in implementation, motivation, derivation of results and applications.

10 Densities (derivatives of the cumulative) do not exist where image gradients are zero, or where image gradients run parallel. Compute cumulative interval measures instead.
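Concretely, under the uniform density on the domain Ω, the interval (cumulative) measure is just a normalized area, which is well defined even where the density itself is not:

\Pr(\alpha_1 \le I < \alpha_2) = \frac{1}{|\Omega|}\,\mathrm{Area}\{(x,y)\in\Omega : \alpha_1 \le I(x,y) < \alpha_2\}

i.e. the area between the isocontours at α₁ and α₂, divided by the image area.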

11

12 Standard histograms vs. the isocontour method, at 32, 64, 128, 256, 512, and 1024 bins.

13

14 • Randomized/digital approximation to area calculation. • Strict lower bound on the accuracy of the isocontour method, for a fixed interpolant. • Computationally more expensive than the isocontour method.

15 128 x 128 bins

16 • Simplest one: linear interpolant on each half-pixel (level curves are segments). • Low-order polynomial interpolants: high bias, low variance. • High-order polynomial interpolants: low bias, high variance.
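A minimal sketch of the linear-interpolant case: each pixel square is split into two triangles, the interpolant is affine on each triangle, and the interval measure is an exact area fraction per triangle. Function and variable names here are illustrative, not from the original implementation.

import numpy as np

def frac_below(v, t):
    """Fraction of a triangle's area on which the affine interpolant of the
    vertex values v = (v0, v1, v2) lies below threshold t (exact closed form
    for an affine function on a triangle)."""
    a, b, c = sorted(v)
    if t <= a:
        return 0.0
    if t >= c:
        return 1.0
    if t <= b:  # the level segment cuts off the corner with the smallest value
        return (t - a) ** 2 / ((b - a) * (c - a))
    # otherwise it cuts off the corner with the largest value
    return 1.0 - (c - t) ** 2 / ((c - a) * (c - b))

def interval_measure(img, lo, hi):
    """Pr(lo <= I < hi) for the piecewise-linear interpolant of a 2-D image:
    total area between the isocontours at lo and hi, normalized by the area
    covered by the pixel grid."""
    img = np.asarray(img, dtype=float)
    rows, cols = img.shape
    total, between = 0.0, 0.0
    for i in range(rows - 1):
        for j in range(cols - 1):
            tl, tr = img[i, j], img[i, j + 1]
            bl, br = img[i + 1, j], img[i + 1, j + 1]
            for tri in ((tl, tr, bl), (tr, br, bl)):  # two triangles per pixel square
                total += 0.5
                between += 0.5 * (frac_below(tri, hi) - frac_below(tri, lo))
    return between / total

For example, interval_measure(img, 0.30, 0.35) gives the mass that the estimated pdf assigns to that intensity interval; dividing by the bin width gives a density value for the bin.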

17 Polynomial interpolant: accuracy of the estimated density improves as the signal is sampled at finer resolution. Assumptions on the signal permit a better interpolant. Bandlimited analog signal, Nyquist-sampled digital signal: accurate reconstruction by the sinc interpolant (Whittaker-Shannon sampling theorem).

18 • Probability density function (pdf) estimation using isocontours • Application to Image Registration • Application to Image Filtering • Circular/spherical density estimation in Euclidean space

19 Given two images of an object, find the geometric transformation that "best" aligns one with the other, w.r.t. some image similarity measure. Mutual information: a well-known image similarity measure, Viola and Wells (IJCV 1997) and Maes et al. (TMI 1997). Insensitive to illumination changes: useful in multimodality image registration.

20 Mutual information is expressed in terms of the marginal entropies, the joint entropy, and the conditional entropy, all computed from the joint and marginal probabilities of the two images' intensities.
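For reference, the standard definition in terms of these quantities (p is the joint probability of intensities a in image 1 and b in image 2; p₁ and p₂ are its marginals):

MI(I_1; I_2) = H(I_1) + H(I_2) - H(I_1, I_2) = \sum_{a,b} p(a,b)\,\log\frac{p(a,b)}{p_1(a)\,p_2(b)}

Equivalently, MI(I_1; I_2) = H(I_1) - H(I_1 \mid I_2).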

21 The joint and marginal probabilities (and hence MI) are functions of the geometric transformation. Hypothesis: if the alignment between the images is optimal, then the mutual information is maximal.

22

23 32 bins and 128 bins; PVI = partial volume interpolation (Maes et al., TMI 1997).

24 PD slice, T2 slice, warped T2 slice, warped and noisy T2 slice. Brute-force search for the maximum of MI.

25 MI with standard histograms vs. MI with our method, as functions of the parameters of the affine transformation.

26 Registration errors at 32 bins (avg., var.):

Method                | Error in theta | Error in s  | Error in t
Histograms (bilinear) | 3.7, 18.1      | 0.7, 0      | 0.43, 0.08
Isocontours           | 0, 0.06        | 0, 0        |
PVI                   | 1.9, 8.5       | 0.56, 0.08  | 0.49, 0.1
Histograms (cubic)    | 0.3, 49.4      | 0.7, 0      | 0.2, 0
2DPointProb           | 0.3, 0.22      | 0, 0        |

27 • Probability density function (pdf) estimation using isocontours • Application to Image Registration • Application to Image Filtering • Circular/spherical density estimation in Euclidean space

28 Anisotropic neighborhood filters (kernel-density-based filters), grayscale images. Central pixel (a,b); neighborhood N(a,b) around (a,b); K: a decreasing function (typically Gaussian). The parameter σ controls the degree of anisotropy of the smoothing.
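The slide does not write the filter out; the usual form of such a kernel-weighted neighborhood filter (a sketch, with a Gaussian range kernel as one common choice) is:

\hat I(a,b) = \frac{\sum_{(x,y)\in N(a,b)} K\big(I(x,y)-I(a,b)\big)\, I(x,y)}{\sum_{(x,y)\in N(a,b)} K\big(I(x,y)-I(a,b)\big)}, \qquad K(z) = e^{-z^2/(2\sigma^2)}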

29 Anisotropic neighborhood filters, problems: sensitivity to the parameter σ; sensitivity to the size of the neighborhood; no account taken of gradient information.

30 Anisotropic neighborhood filters, problems: they treat pixels as independent samples.

31 Continuous image representation: interpolate in between the pixel values.

32 Continuous image representation: the area between the isocontours at intensities α and α+Δ, divided by the area of the neighborhood, equals Pr(α < Intensity < α+Δ | N(a,b)).

33 Areas between isocontours contribute to the weights for averaging. Published in EMMCVPR 2009.
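Putting slides 32 and 33 together, one way the interval measures can enter the averaging (a sketch; the bin representatives α_j and the weights w_j are notation introduced here, and the exact weighting in the published method may differ):

\hat I(a,b) = \frac{\sum_j w_j\, K\big(\alpha_j - I(a,b)\big)\, \alpha_j}{\sum_j w_j\, K\big(\alpha_j - I(a,b)\big)}, \qquad w_j = \Pr\big(\alpha_j \le I < \alpha_j+\Delta \mid N(a,b)\big)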

34 Extension to RGB images: the joint probability of (R,G,B) = the area of overlap of isocontour pairs from the R, G, and B images.

35 Mean-shift framework: a clustering method developed by Fukunaga & Hostetler (IEEE Trans. Inf. Theory, 1975), applied to image filtering by Comaniciu and Meer (PAMI 2002). Each pixel is updated independently by maximizing a local estimate of the probability density over the joint spatial and intensity coordinates.

36 Mean-shift framework: one step of the mean-shift update around (a,b,c), where c = I(a,b).
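For reference, one common form of the discrete update with a joint spatial/range kernel (a sketch; K_s and K_r denote the spatial and range kernels):

(a,b,c) \leftarrow \frac{\sum_{(x,y)\in N(a,b)} K_s(x-a,\,y-b)\, K_r\big(I(x,y)-c\big)\,\big(x,\,y,\,I(x,y)\big)}{\sum_{(x,y)\in N(a,b)} K_s(x-a,\,y-b)\, K_r\big(I(x,y)-c\big)}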

37 Our method in the mean-shift setting: I(x,y), X(x,y)=x, Y(x,y)=y.

38 Our method in the mean-shift setting. Facets of the tessellation induced by the isocontours and the pixel grid; for facet k: (x_k, y_k) = centroid of facet k, I_k = intensity (from the interpolated image) at (x_k, y_k), a_k = area of facet k.
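With that notation, a plausible facet-based form of the update around (a,b,c) is the mean-shift mean with each pixel sample replaced by a facet weighted by its area (a sketch; the published update may differ in detail):

(a,b,c) \leftarrow \frac{\sum_k a_k\, K\big(x_k-a,\, y_k-b,\, I_k-c\big)\,\big(x_k,\, y_k,\, I_k\big)}{\sum_k a_k\, K\big(x_k-a,\, y_k-b,\, I_k-c\big)}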

39 Experimental setup, grayscale images. Piecewise-linear interpolation used for our method in all experiments. For our method, kernel K = pillbox kernel, i.e. K(z) = 1 if |z| ≤ σ and K(z) = 0 if |z| > σ. For discrete mean shift, kernel K = Gaussian. Parameters used: neighborhood radius ρ = 3, σ = 3. Noise model: Gaussian noise of variance 0.003 (on a scale of 0 to 1).

40 Original image, noisy image, denoised (isocontour mean shift), denoised (Gaussian-kernel mean shift).

MSE: noisy image 181.27; isocontour (ρ=3, σ=3) 110.95; std. mean shift (ρ=3, σ=3) 175.27; std. mean shift (ρ=5, σ=5) 151.27.

41 Original image, noisy image, denoised (isocontour mean shift), denoised (std. mean shift).

MSE: noisy image 190; isocontour mean shift (ρ=3, σ=3) 113.8; std. mean shift (ρ=3, σ=3) 184.77; std. mean shift (ρ=5, σ=3) 153.5.

42 Experiments on color images. Pillbox kernels for our method; Gaussian kernels for discrete mean shift. Parameters used: neighborhood radius ρ = 6, σ = 6. Noise model: independent Gaussian noise on each channel with variance 0.003 (on a scale of 0 to 1).

43 Experiments on color images. Independent piecewise-linear interpolation on the R, G, and B channels in our method. Smoothing of the R, G, B values done by coupled updates using joint probabilities.

44 Original image, noisy image, denoised (isocontour mean shift), denoised (Gaussian-kernel mean shift).

MSE: noisy image 572.24; isocontour (ρ=3, σ=3) 319.88; std. mean shift (ρ=3, σ=3) 547.96; std. mean shift (ρ=5, σ=5) 496.7.

45 Original image, noisy image, denoised (isocontour mean shift), denoised (Gaussian-kernel mean shift).

MSE: noisy image 547.9; isocontour mean shift (ρ=3, σ=3) 306.14; std. mean shift (ρ=3, σ=3) 526.8; std. mean shift (ρ=5, σ=5) 477.25.

46 Observations: discrete kernel mean shift performs poorly with small neighborhoods and small values of σ. Why? The small-sample-size problem for kernel density estimation. The isocontour-based method performs well even in this scenario (the number of isocontours/facets is far larger than the number of pixels). Large σ or large neighborhoods are not always necessary for smoothing.

47 Observations: superior behavior observed when comparing isocontour-based neighborhood filters with standard neighborhood filters for the same parameter set and the same number of iterations.

48 • Probability density function (pdf) estimation using isocontours • Application to Image Registration • Application to Image Filtering • Circular/spherical density estimation in Euclidean space

49 Examples of unit-vector data: 1. chromaticity vectors of color values; 2. hue (from the HSI color scheme) obtained from the RGB values.
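For concreteness, and matching the next slide's "convert RGB values to unit vectors", the chromaticity vector here is the RGB triple normalized to unit length:

c = \frac{(R, G, B)}{\sqrt{R^2 + G^2 + B^2}} \in S^2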

50 Convert RGB values to unit vectors. Estimate the density of the unit vectors with von Mises-Fisher (vMF) mixture models, Banerjee et al. (JMLR 2005). Other popular kernels: Watson, cosine.

51 Estimate the density of RGB using KDE/mixture models. Obtain the density of (magnitude, chromaticity) by a random-variable transformation; the density of chromaticity follows by integrating out the magnitude (projected normal estimator: Watson, "Statistics on Spheres", 1983; Small, "The Statistical Theory of Shape", 1995). Alternatively, obtain the density of chromaticity by conditioning on m = 1 (variable-bandwidth vMF KDE: Bishop, "Neural Networks for Pattern Recognition", 2006). What's new? The notion that all estimation can proceed in Euclidean space.
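A sketch of the two routes, writing an RGB value as v = m·u with magnitude m = ||v|| and chromaticity u = v/||v|| on the unit sphere S² (the spherical-coordinate Jacobian m² is used; the notation is introduced here):

Transformation and marginalization: p(m, u) = m^2\, p_{RGB}(m\,u), \qquad p(u) = \int_0^{\infty} m^2\, p_{RGB}(m\,u)\, dm

Conditioning on m = 1: p(u \mid m = 1) \propto p_{RGB}(u), normalized over S².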

52 Estimate the density of RGB using KDE/mixture models. Use a random-variable transformation to get the density of HSI (hue, saturation, intensity). Integrate out S and I to get the density of hue.

53 • Consistency between densities of Euclidean and unit-vector data (in terms of random-variable transformation/conditioning). • Potential to use the large body of literature available for statistics of Euclidean data (example: Fast Gauss Transform, Greengard et al. (SIAM Sci. Computing 1991), Duraiswami et al. (IJCV 2003)). • Model selection can be done in Euclidean space.

