
1 Texture Analysis and Synthesis
ENEE631 Digital Image Processing, Spring '09 – Lecture 27 (5/11/2009)
Instructor: Min Wu, Electrical and Computer Engineering Department, University of Maryland, College Park
bb.eng.umd.edu (select ENEE631 S'09); minwu@eng.umd.edu

2 Overview and Logistics
Last time:
– Multi-dimensional lattice sampling
– Sampling rate conversion and applications in video processing
Today:
– Texture analysis and synthesis
– More discussion on image modeling
Project presentations: Thursday, May 21, 2009, 11am, Kim 2211
– 15 min each; arrange to have all members of the team speak; practice
– See the course webpage for guides on writing & presentation
Course evaluation (online)

3 Recap: Sampling Lattice Conversion
[Figure: original, intermediate, and targeted sampling lattices; from Wang's book preprint Fig. 4.4]

4 Video Format Conversion between NTSC and PAL
Requires both temporal and spatial rate conversion
– NTSC: 525 lines per picture, 60 fields per second
– PAL: 625 lines per picture, 50 fields per second
Ideal approach (direct conversion)
– 525 lines, 60 fields/sec → 13125 lines, 300 fields/sec → 625 lines, 50 fields/sec
4-step sequential conversion
– Deinterlace => line rate conversion => frame rate conversion => interlace
Simplified conversion
– 50 fields/sec → 60 fields/sec: deinterlace, then simplify to 5 frames → 6 frames
– 625 lines → 525 lines: simplify to 25 lines → 21 lines
– Conversion involves only two adjacent lines or frames

5 [Figure: from Wang's book preprint Fig. 4.9]

6 Simplified Video Format Conversion
50 fields/sec → 60 fields/sec
– After deinterlacing, simplify to 5 frames → 6 frames
– Conversion involves two adjacent frames only (an interpolation sketch follows below)
625 lines → 525 lines
– Simplify to 25 lines → 21 lines
– Conversion involves two adjacent lines only
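A minimal sketch of the 5-frame-to-6-frame step, assuming each output frame is a linear blend of its two nearest input frames. The slide only states that two adjacent frames are involved; the specific weights below are illustrative, not the exact values used in practice.

```python
import numpy as np

def convert_5_to_6(frames):
    """Upsample frame rate 5 -> 6 by blending the two nearest input frames.

    frames: array of shape (5, H, W). Returns array of shape (6, H, W).
    Simple linear interpolation along time (illustrative only)."""
    n_in, n_out = 5, 6
    out = np.empty((n_out,) + frames.shape[1:], dtype=float)
    for j in range(n_out):
        t = j * (n_in - 1) / (n_out - 1)    # position on the input time axis
        k = int(np.floor(t))
        a = t - k                           # fractional offset between frames k and k+1
        k2 = min(k + 1, n_in - 1)
        out[j] = (1 - a) * frames[k] + a * frames[k2]
    return out

# usage: six output frames from five deinterlaced input frames
group = np.random.rand(5, 576, 720)
print(convert_5_to_6(group).shape)          # (6, 576, 720)
```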

7 [Figure: from Wang's book preprint]

8 Texture Analysis & Synthesis

9 Types of Image Processing Tasks
Image in, image out
– Codec (compression-decompression)
– Image enhancement and restoration
– Digital watermarking
=> May require both analysis and synthesis operations
– The intermediate output may be non-image-like (e.g., a coded stream), but the end output should reconstruct into an image close or related to the input
Image in, features out
– Features may be used for classification, recognition, and other studies

10 Texture
Observed in structural patterns of objects' surfaces
– [Natural] wood, grain, sand, grass, tree leaves, cloth
– [Man-made] tiles, printing patterns
"Texture" ~ repetition of "texels" (basic texture elements)
– Texel placement may be periodic, quasi-periodic, or random
From http://texlib.povray.org/textures.html and Gonzalez 3/e book online resource

11 Properties and Major Approaches to Study Texture
Texture properties: smoothness, coarseness, regularity
Structural approach
– Describe the arrangement of basic image primitives
Statistical approach
– Examine the histogram and other features derived from it
– Characterize textures as smooth, coarse, grainy
Spectral and random-field approach
– Exploit Fourier spectrum properties
– Detect global periodicity

12 Statistical Measures on Textures
x ~ random variable of pixel value
Relative smoothness: R = 1 - 1/(1 + σ_x²)
– ~0 for a constant region; approaches 1 for large variance
Central moments E[(X - μ_x)^k]
– 3rd moment: ~ histogram's skewness
– 4th moment: ~ relative flatness
Uniformity (energy): ~ sum of squared histogram bins
(A computational sketch follows below.)
Figures from Gonzalez's book resource
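A short sketch computing these histogram-based measures for a gray-level patch. The function name is illustrative, and entropy is included as a common companion descriptor even though it is not listed on the slide.

```python
import numpy as np

def texture_stats(patch, levels=256):
    """Histogram-based texture measures for a gray-level patch with values in 0..levels-1:
    relative smoothness R, 3rd/4th central moments, uniformity, and entropy."""
    z = np.arange(levels)
    p, _ = np.histogram(patch, bins=levels, range=(0, levels), density=True)
    mean = np.sum(z * p)
    var = np.sum((z - mean) ** 2 * p)
    R = 1.0 - 1.0 / (1.0 + var)              # ~0 for constant regions, -> 1 for high variance
    mu3 = np.sum((z - mean) ** 3 * p)        # skewness of the histogram
    mu4 = np.sum((z - mean) ** 4 * p)        # relative flatness
    uniformity = np.sum(p ** 2)              # large when the histogram is narrowly concentrated
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))   # not on the slide; a common extra measure
    return {"mean": mean, "var": var, "R": R, "mu3": mu3, "mu4": mu4,
            "uniformity": uniformity, "entropy": entropy}

# usage: a noisy (coarse) patch versus a constant (smooth) patch
rng = np.random.default_rng(0)
print(texture_stats(rng.integers(0, 256, (64, 64))))   # large variance, R near 1
print(texture_stats(np.full((64, 64), 128)))           # constant patch: R = 0, uniformity = 1
```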

13 Characterizing Textures
Structural measures
– Periodic textures
  - Features of the deterministic texel: gray levels, shape, orientation, etc.
  - Placement rules: period, repetition grid pattern, etc.
– Textures with random nature
  - Features of texels: edge density, histogram features, etc.
Stochastic/spectral measures
– Mainly for textures with random nature; model the texture as a random field (a 2-D sequence of random variables)
– Autocorrelation function, measuring the relations among those random variables:
  R(m,n; m',n') = E[ U(m,n) U(m',n') ]
  "Wide-sense stationary": R(m,n; m',n') = R_U(m-m', n-n') and constant mean
– Fit into random field models ~ analysis and synthesis (an estimation sketch follows below)
  - Focus on second-order statistics for simplicity
  - Two textures with the same second-order statistics often appear similar
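Where second-order statistics are used, the autocorrelation can be estimated directly from a texture patch. A minimal FFT-based sketch, assuming a zero-mean, wide-sense stationary patch and a biased estimate:

```python
import numpy as np

def autocorr2d(patch):
    """Estimate the 2-D autocorrelation R(k,l) of a texture patch via the FFT.
    Zero-padding avoids circular wrap-around; the result is a biased estimate."""
    x = patch - patch.mean()
    F = np.fft.fft2(x, s=(2 * x.shape[0], 2 * x.shape[1]))
    r = np.fft.ifft2(np.abs(F) ** 2).real / x.size
    return np.fft.fftshift(r)                # lag (0,0) moved to the center

# usage: the correlation spread hints at texture coarseness
rng = np.random.default_rng(1)
patch = rng.standard_normal((64, 64))
R = autocorr2d(patch)
print(R.shape, R[64, 64])                    # center value ~ sample variance
```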

14 Examples: Spectral Approaches to Study Texture
[Figures: texture images and their Fourier spectra; from Gonzalez's 2/e book resource]

15 Texture Synthesis
Recall: error concealment of small blocks
– Exploit surrounding edge information to interpolate (edge estimation, then edge-directed interpolation)
– General approach: analysis, then synthesis
Image inpainting
– Fill in missing/occluded regions with a synthesized version
  - Maintain structural consistency (in edges and overall color/brightness)
  - Maintain the texture's statistical continuity (such as oscillation pattern) for improved visual effect
Ref: M. Bertalmio, L. Vese, G. Sapiro, and S. Osher, "Simultaneous Structure and Texture Image Inpainting," IEEE Trans. on Image Processing, vol. 12, no. 8, August 2003.
Image examples from Bertalmio et al. TIP'03 paper

16 Image Inpainting: Basic Approach and Example
[Figures from Bertalmio et al. TIP'03 paper]

17 Texture Synthesis Approach by Efros-Leung
Model texture as a Markov Random Field (MRF)
– The probability distribution of a pixel's brightness given its spatial neighborhood is independent of the rest of the image
– Model the neighborhood of a pixel with a square window around it
  - Window size controls how stochastic the synthesized texture will be
  - E.g., choose the window size on the scale of the biggest regular feature
2-step estimate of the conditional p.d.f. (a simplified code sketch follows below)
– Match the neighborhood against the sample texture with some allowable distortion
– Build a histogram of the corresponding pixels from the matched neighborhoods
– Produce the estimate based on that histogram
Figures from Efros-Leung ICCV'99 paper
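A simplified sketch of the Efros-Leung idea: for each unfilled pixel, collect all sample windows whose masked distance to the current neighborhood is within a tolerance of the best match, and draw the new value from their center pixels. For brevity it grows pixels in raster order, whereas the original paper grows outward along the boundary of the filled region; the window size, tolerance, and toy sample texture are illustrative.

```python
import numpy as np

def synthesize(sample, out_size=48, win=11, eps=0.1, seed=0):
    """Grow an out_size x out_size texture from a small sample patch
    by non-parametric neighborhood matching (simplified sketch)."""
    rng = np.random.default_rng(seed)
    h = win // 2
    H, W = sample.shape
    out = np.zeros((out_size, out_size))
    filled = np.zeros_like(out, dtype=bool)
    # seed the top-left corner with a random patch from the sample
    si, sj = rng.integers(0, H - win), rng.integers(0, W - win)
    out[:win, :win] = sample[si:si + win, sj:sj + win]
    filled[:win, :win] = True

    # all candidate windows from the sample texture
    cand = np.lib.stride_tricks.sliding_window_view(sample, (win, win)).reshape(-1, win, win)

    pad_out = np.pad(out, h)
    pad_msk = np.pad(filled, h)
    for i in range(out_size):
        for j in range(out_size):
            if filled[i, j]:
                continue
            nb = pad_out[i:i + win, j:j + win]               # partially filled neighborhood
            mk = pad_msk[i:i + win, j:j + win]               # which neighbors are valid
            d = np.sum(((cand - nb) ** 2) * mk, axis=(1, 2)) / max(mk.sum(), 1)
            ok = np.flatnonzero(d <= d.min() * (1 + eps))    # matches within allowable distortion
            val = cand[rng.choice(ok)][h, h]                 # sample from the matched centers
            out[i, j] = val
            filled[i, j] = True
            pad_out[i + h, j + h] = val
            pad_msk[i + h, j + h] = True
    return out

# usage: grow a larger texture from a small sample patch
sample = (np.indices((32, 32)).sum(0) % 8 < 4).astype(float)   # toy periodic texture
print(synthesize(sample).shape)                                # (48, 48)
```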

18 Recall: Characterizing the Ensemble of 2-D Signals
Specify by a joint probability distribution function
– Difficult to measure and specify the joint distribution for images of practical size => too many random variables (e.g., 512 x 512 = 262,144)
Specify by the first few moments
– Mean (1st moment) and covariance (2nd moment)
– May still be non-trivial to measure over the entire image size
By various stochastic models
– Use a few parameters to describe the relations among all pixels
– E.g., 2-D extensions of the 1-D autoregressive (AR) model
Important for a variety of image processing tasks
– Image compression, enhancement, restoration, understanding, ...

19 Recall: Discrete Random Field
We call a 2-D sequence a discrete random field if each of its elements is a random variable
– When the random field represents an ensemble of images, we often call it a random image
Mean and covariance of a complex random field
– Mean: E[u(m,n)] = μ(m,n)
– Covariance: Cov[u(m,n), u(m',n')] = E[ (u(m,n) - μ(m,n)) (u(m',n') - μ(m',n'))* ] = r_u(m,n; m',n')
– For a zero-mean random field, the autocorrelation function equals the covariance function
Wide-sense stationary (or wide-sense homogeneous)
– μ(m,n) = μ = constant
– r_u(m,n; m',n') = r_u(m-m', n-n'; 0, 0) = r(m-m', n-n')
– Also called shift-invariant or spatially invariant in some literature

20 Recall: Special Random Fields of Interest
White noise field
– A stationary random field
– Any two elements at different locations x(m,n) and x(m',n') are mutually uncorrelated:
  r_x(m-m', n-n') = σ_x²(m,n) δ(m-m', n-n')
Gaussian random field
– Every segment defined on an arbitrary finite grid is Gaussian, i.e., every finite segment of u(m,n), when mapped into a vector, has a joint Gaussian p.d.f. (standard form given below)
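The joint Gaussian p.d.f. referred to at the end of the slide appeared as an equation image in the original; for a real-valued field it takes the standard multivariate form below, with N the number of samples in the segment, μ the mean vector, and R the covariance matrix of the vectorized segment u.

```latex
p(\mathbf{u}) \;=\; \frac{1}{(2\pi)^{N/2}\,\lvert \mathbf{R} \rvert^{1/2}}
  \exp\!\Bigl( -\tfrac{1}{2}\,(\mathbf{u}-\boldsymbol{\mu})^{T}\,\mathbf{R}^{-1}\,(\mathbf{u}-\boldsymbol{\mu}) \Bigr)
```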

21 Recall: Spectral Density Function
The spectral density function (SDF) is defined as the Fourier transform of the covariance function r_x
– Also known as the power spectral density (p.s.d.)
Example: SDF of a stationary white noise field with r(m,n) = σ² δ(m,n): S(ω1, ω2) = σ²
SDF properties
– Real and nonnegative: S(ω1, ω2) = S*(ω1, ω2); S(ω1, ω2) ≥ 0
  - Realness follows from the conjugate symmetry of the covariance function: r(m,n) = r*(-m,-n)
  - Nonnegativity follows from the non-negative definiteness of the covariance function
– SDF of the output of an LSI system with frequency response H(ω1, ω2):
  S_y(ω1, ω2) = |H(ω1, ω2)|² S_x(ω1, ω2)
  (a numerical sketch of this property follows below)
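A quick numerical sketch of the last property, S_y = |H|² S_x: filter 2-D white noise with a small FIR kernel and compare the averaged periodogram of the output against |H|² σ². The kernel, grid size, and number of trials are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, sigma2, trials = 64, 1.0, 200
h = np.array([[1.0, 2.0, 1.0],
              [2.0, 4.0, 2.0],
              [1.0, 2.0, 1.0]]) / 16.0        # separable smoothing kernel
H = np.fft.fft2(h, s=(N, N))                  # frequency response on an N x N grid

Sy_est = np.zeros((N, N))
for _ in range(trials):
    x = rng.standard_normal((N, N)) * np.sqrt(sigma2)
    y = np.fft.ifft2(np.fft.fft2(x) * H).real # circular convolution, fine for a stationarity check
    Sy_est += np.abs(np.fft.fft2(y)) ** 2 / (N * N)
Sy_est /= trials

Sy_theory = (np.abs(H) ** 2) * sigma2         # |H|^2 * S_x, with S_x = sigma^2 for white noise
print(np.max(np.abs(Sy_est - Sy_theory)))     # small residual due to finite averaging
```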

22 More on Image Modeling
A good image model can facilitate many image processing tasks
– Coding/compression; restoration; estimation/interpolation; ...
Image models we have seen/used so far
– Pixel values as realizations of a random variable
  - Color/grayscale histogram
– Predictive models
  - Use a linear combination of a (causal) neighborhood to estimate a pixel
– Random field u(m,n)
  - Characterized by the 2-D correlation function or p.s.d.
In general, we can write u(m,n) = u'(m,n) + e(m,n), where u'(m,n) is some prediction of u(m,n) and e(m,n) is another random field
– Minimum Variance Representation (MVR): e(m,n) is the error of the minimum-variance prediction
– White-noise-driven representation: e(m,n) is chosen as a white noise field
– ARMA representation: e(m,n) is a 2-D moving average of a white noise field

23 Recall: Linear Predictor
Causality is required for coding purposes
– Can't use samples that the decoder hasn't received as a reference
– Using the last sample u_q(n-1) is equivalent to coding the difference (DPCM)
p-th order autoregressive (AR) model
– Linear predictor from past samples
Prediction neighborhood
– Line-by-line DPCM: predict from past samples in the same line
– 2-D DPCM: predict from past samples in the same line and from previous lines (a sketch follows below)
– Non-causal neighborhood: use surrounding samples for prediction/estimation => for filtering, restoration, etc.
Predictor coefficients in the MMSE sense: obtained from the orthogonality condition (from the Wiener filtering discussions)
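A sketch of 2-D causal DPCM with a three-neighbor predictor (west, north, north-west). The coefficients below correspond to a separable AR model and are illustrative only; in practice they would come from the MMSE orthogonality (normal) equations mentioned above, and a quantizer would sit between encoder and decoder.

```python
import numpy as np

def dpcm_2d(img, a=0.95, b=0.95, c=-0.9025):
    """Encode: predict each pixel from its west, north, and north-west neighbors
    and return the prediction error (quantization omitted for clarity)."""
    img = img.astype(float)
    H, W = img.shape
    err = np.zeros_like(img)
    for i in range(H):
        for j in range(W):
            west = img[i, j - 1] if j > 0 else 0.0
            north = img[i - 1, j] if i > 0 else 0.0
            nwest = img[i - 1, j - 1] if i > 0 and j > 0 else 0.0
            err[i, j] = img[i, j] - (a * west + b * north + c * nwest)
    return err

def dpcm_2d_decode(err, a=0.95, b=0.95, c=-0.9025):
    """Decode: rebuild each pixel from previously reconstructed neighbors plus the error."""
    H, W = err.shape
    img = np.zeros_like(err)
    for i in range(H):
        for j in range(W):
            west = img[i, j - 1] if j > 0 else 0.0
            north = img[i - 1, j] if i > 0 else 0.0
            nwest = img[i - 1, j - 1] if i > 0 and j > 0 else 0.0
            img[i, j] = err[i, j] + a * west + b * north + c * nwest
    return img

# usage: the prediction error has much smaller variance than the original image
x = np.cumsum(np.cumsum(np.random.default_rng(2).standard_normal((32, 32)), 0), 1)
e = dpcm_2d(x)
print(x.var(), e.var(), np.allclose(dpcm_2d_decode(e), x))
```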

24 Commonly Used Image Models
Gaussian model and Gaussian mixture model
– Every segment defined on an arbitrary finite grid is Gaussian, or follows a distribution formed by linearly combining several Gaussians
– Reduces modeling to estimating the mean(s), variance(s), and weightings
– (Ref: Prof. R. Gray's IEEE talk S'07, http://www-ee.stanford.edu/~gray/umcpqcc.pdf)
Markov random field
– Markovianity: conditional independence
  - Define past, present, and future pixel sets for each pixel location
  - Given the present, the future is independent of the past
– 2-D spatial causal AR model (under Gaussian noise or MVR)
Gauss-Markov random field model
– Gaussian: conditional independence => conditional uncorrelatedness
Bring in multi-scale and wavelet/multi-resolution ideas
Ref: Sections 4.1-4.5 of Bovik's Handbook

25 Comparison of Various Predictive Models
[Figure: from Jain's Fig. 11.12; see also Jain pp. 495]

26 Recursive Estimation: Basic Ideas
Kalman filter: a recursive linear MMSE estimator
1-D example: linear estimation for an AR signal
– System model AR(M): x(n) = c_1 x(n-1) + ... + c_M x(n-M) + w(n)
  - State equation: x(n) = C x(n-1) + w(n) for n = 0, 1, ..., with x(-1) = 0 (an AR signal)
  - Observation equation: y(n) = h^T x(n) + v(n)
  - M-dimensional state vector x(n); model noise w(n) and observation noise v(n) are white and orthogonal
– The signal model is M-th order Markov under Gaussian noise w
– The linear MMSE estimator is globally optimal if the model noise w and observation noise v are both Gaussian
– C can be time-variant and physics-motivated by the application
General MMSE solution: equivalent to finding the conditional mean
– Filtering estimate: E[ x(n) | y(n), y(n-1), ..., y(0) ], denoted x_a(n)
– One-step predictor estimate: E[ x(n) | y(n-1), ..., y(0) ], denoted x_b(n)

27 Recursive Estimation (cont'd)
1-D Kalman filter equations (a code sketch follows below)
– Prediction: x_b(n) = C x_a(n-1)
– Update: x_a(n) = x_b(n) + g(n) [ y(n) - h^T x_b(n) ]
Error covariance equations
– P_b(n) = C P_a(n-1) C^T + Q_w
– P_a(n) = (I - g(n) h^T) P_b(n)
– Initialization: x(0) = [w(0), 0, ..., 0]^T, x_b(0) = 0; P_b(0) = Q_w, all zero except the first entry σ_w²
Kalman gain vector
– g(n) = P_b(n) h ( h^T P_b(n) h + σ_v² )^(-1)
2-D Kalman filtering: define a proper state vector
– Raster-scan the observations and map to the equivalent 1-D case
– Restrict the Kalman gain terms to just the region surrounding the current observation to reduce computational complexity
[Diagram: past / present state / future]
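A compact sketch of the 1-D Kalman filter equations above, applied to an AR(M) signal written in companion (state-space) form and observed in white noise. The AR coefficients and noise variances in the usage example are illustrative.

```python
import numpy as np

def kalman_ar(y, c, sigma_w2, sigma_v2):
    """1-D Kalman filter for an AR(M) signal x(n) = c_1 x(n-1) + ... + c_M x(n-M) + w(n),
    observed as y(n) = x(n) + v(n). Returns the filtering estimates x_a(n)."""
    M = len(c)
    C = np.zeros((M, M))                        # companion-form state transition matrix
    C[0, :] = c
    C[1:, :-1] = np.eye(M - 1)
    h = np.zeros(M); h[0] = 1.0                 # observation picks the newest sample
    Qw = np.zeros((M, M)); Qw[0, 0] = sigma_w2

    xb = np.zeros(M)                            # x_b(0) = 0, P_b(0) = Q_w as on the slide
    Pb = Qw.copy()
    est = np.zeros(len(y))
    for n, yn in enumerate(y):
        # update: Kalman gain, filtering estimate x_a(n), error covariance P_a(n)
        g = Pb @ h / (h @ Pb @ h + sigma_v2)
        xa = xb + g * (yn - h @ xb)
        Pa = (np.eye(M) - np.outer(g, h)) @ Pb
        est[n] = xa[0]
        # prediction for the next step: x_b(n+1), P_b(n+1)
        xb = C @ xa
        Pb = C @ Pa @ C.T + Qw
    return est

# usage: denoise a noisy AR(2) signal (coefficients and variances are illustrative)
rng = np.random.default_rng(3)
c, sw2, sv2, N = [1.5, -0.7], 0.1, 1.0, 500
x = np.zeros(N)
for n in range(2, N):
    x[n] = c[0] * x[n - 1] + c[1] * x[n - 2] + rng.normal(0, np.sqrt(sw2))
y = x + rng.normal(0, np.sqrt(sv2), N)
xhat = kalman_ar(y, c, sw2, sv2)
print(np.mean((y - x) ** 2), np.mean((xhat - x) ** 2))   # filtered error is smaller
```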

28 Summary of Today's Lecture
Texture analysis and synthesis
More on image modeling
Readings
– Texture: Gonzalez's book 11.3.3; see also Jain's 9.11
– Image modeling: Woods' book Chapters 7 & 9.4; Jain's book Chapter 6; Bovik's Handbook 4.1-4.5
– Recursive/Kalman estimation: Woods' book Chapter 7; EE621 (Poor's book)

29 Beyond ENEE631
ENEE633 Statistical Pattern Recognition
ENEE731 Image Understanding
Audio/speech: ENEE632
Adaptive algorithms: ENEE634
Special-topic research-oriented courses (sometimes offered)
– On medical image processing
– On media security & forensics
