M. Wu: ENEE631 Digital Image Processing (Spring'09) Texture Analysis and Synthesis Spring ’09 Instructor: Min Wu Electrical and Computer Engineering Department,


M. Wu: ENEE631 Digital Image Processing (Spring'09)
Texture Analysis and Synthesis
Spring '09 Instructor: Min Wu
Electrical and Computer Engineering Department, University of Maryland, College Park
bb.eng.umd.edu (select ENEE631 S'09)
ENEE631 Spring'09 Lecture 27 (5/11/2009)

[2] Overview and Logistics
- Last time:
  - Multi-dimensional lattice sampling
  - Sampling-rate conversion and its applications in video processing
- Today:
  - Texture analysis and synthesis
  - More discussion of image modeling
- Project presentation: Thursday May 21, 2009, 11am, Kim 2211
  - 15 min; arrange for all members of the team to speak; practice
  - See the course webpage for guides on writing & presentation
- Course evaluation (online)
UMCP ENEE631 Slides (created by M.Wu © 2004)

[3] Recap: Sampling Lattice Conversion
- Original lattice → intermediate lattice → targeted lattice
(Figure from Wang's book preprint, Fig. 4.4)
UMCP ENEE631 Slides (created by M.Wu © 2001)

[4] Video Format Conversion for NTSC ↔ PAL
- Requires both temporal and spatial rate conversion
  - NTSC: 525 lines per picture, 60 fields per second
  - PAL: 625 lines per picture, 50 fields per second
- Ideal approach (direct conversion):
  - 525 lines, 60 fields/sec → … lines, 300 fields/sec → 625 lines, 50 fields/sec
- 4-step sequential conversion:
  - Deinterlace → line-rate conversion → frame-rate conversion → interlace
- Simplified conversion:
  - 50 fields/sec → 60 fields/sec: deinterlace, then simplify to 5 frames → 6 frames
  - 625 lines → 525 lines: simplify to 25 lines → 21 lines
  - Conversion involves only two adjacent lines or frames
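The simplified 50 → 60 fields/sec step above can be sketched numerically: after deinterlacing, each group of 5 input frames yields 6 output frames, each a weighted average of two adjacent input frames. A minimal sketch, assuming plain linear interpolation weights (the slides do not specify the exact weights):

```python
import numpy as np

def convert_5_to_6(frames):
    """Map 5 input frames to 6 output frames by linearly interpolating
    between the two adjacent input frames (illustrative weights)."""
    frames = np.asarray(frames, dtype=float)
    out = []
    for k in range(6):
        t = k * 5.0 / 6.0                  # output time on the input axis
        i = int(np.floor(t))
        a = t - i                          # fractional distance to frame i+1
        nxt = frames[min(i + 1, len(frames) - 1)]  # clamp at the last frame
        out.append((1 - a) * frames[i] + a * nxt)
    return np.stack(out)

# a constant input sequence stays constant after conversion
seq = np.ones((5, 4, 4))
res = convert_5_to_6(seq)
ramp = convert_5_to_6(np.arange(5.0).reshape(5, 1, 1))
```

Each output frame depends on at most two adjacent input frames, matching the "two adjacent frames only" property on the slide.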

[5] (Figure from Wang's book preprint, Fig. 4.9)

[6] Simplified Video Format Conversion
- 50 fields/sec → 60 fields/sec
  - After deinterlacing, simplify to 5 frames → 6 frames
  - Conversion involves only two adjacent frames
- 625 lines → 525 lines
  - Simplify to 25 lines → 21 lines
  - Conversion involves only two adjacent lines

[7] (Figure from Wang's book preprint)

[8] Texture Analysis & Synthesis

[9] Types of Image Processing Tasks
- Image in, image out
  - Codec (compression/decompression)
  - Image enhancement and restoration
  - Digital watermarking
  - May require both analysis and synthesis operations
  - The intermediate output may be non-image-like (e.g., a coded stream), but the end output should reconstruct into an image close/related to the input
- Image in, features out
  - Features may be used for classification, recognition, and other studies

[10] Texture
- Observed in the structural patterns of objects' surfaces
  - [Natural] wood, grain, sand, grass, tree leaves, cloth
  - [Man-made] tiles, printing patterns
- "Texture" ~ repetition of "texels" (basic texture elements)
  - Texels' placement may be periodic, quasi-periodic, or random
(From Gonzalez 3/e book online resource)

[11] Properties and Major Approaches to Study Texture
- Texture properties: smoothness, coarseness, regularity
- Structural approach
  - Describe the arrangement of basic image primitives
- Statistical approach
  - Examine the histogram and other features derived from it
  - Characterize textures as smooth, coarse, grainy
- Spectral and random-field approach
  - Exploit Fourier spectrum properties
  - Detect global periodicity

[12] Statistical Measures on Textures
- x ~ r.v. of pixel value
- Smoothness: R = 1 − 1/(1 + σ_x²) ~ 0 for a constant region; → 1 for large variance
- Central moments E[(X − μ_x)^k]
  - 3rd moment: ~ histogram's skewness
  - 4th moment: ~ relative flatness
- Uniformity (energy) ~ sum of the squared histogram bins
(Figures from Gonzalez's book resource)
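These histogram-based descriptors are straightforward to compute. A minimal sketch; normalizing the variance by (L−1)² before forming R follows Gonzalez's convention and is an assumption here (the slide gives only R = 1 − 1/(1 + σ²)):

```python
import numpy as np

def texture_stats(region, levels=256):
    """Histogram-based texture measures from the slide: smoothness R,
    3rd/4th central moments of the histogram, and uniformity (energy)."""
    x = np.asarray(region).ravel()
    hist = np.bincount(x.astype(int), minlength=levels) / x.size
    g = np.arange(levels)
    mean = (g * hist).sum()
    var = ((g - mean) ** 2 * hist).sum()
    var_n = var / (levels - 1) ** 2        # normalized variance (assumption)
    R = 1 - 1 / (1 + var_n)                # 0 for flat, -> 1 for high variance
    mu3 = ((g - mean) ** 3 * hist).sum()   # ~ histogram skewness
    mu4 = ((g - mean) ** 4 * hist).sum()   # ~ relative flatness
    uniformity = (hist ** 2).sum()         # energy: 1 for a constant region
    return {"mean": mean, "R": R, "mu3": mu3, "mu4": mu4,
            "uniformity": uniformity}

flat = texture_stats(np.full((8, 8), 100))
```

For a constant region, R is 0 and uniformity is 1, matching the slide's characterization.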

[13] Characterizing Textures
- Structural measures
  - Periodic textures
    - Features of the deterministic texel: gray levels, shape, orientation, etc.
    - Placement rules: period, repetition grid pattern, etc.
  - Textures with a random nature
    - Features of the texel: edge density, histogram features, etc.
- Stochastic/spectral measures
  - Mainly for textures with a random nature; model as a random field (a 2-D sequence of random variables)
  - Autocorrelation function, measuring the relations among those r.v.s: R(m,n; m',n') = E[U(m,n) U(m',n')]
  - "Wide-sense stationary": R(m,n; m',n') = R_U(m−m', n−n') and constant mean
  - Fit into random-field models ~ analysis and synthesis
    - Focus on second-order statistics for simplicity
    - Two textures with the same 2nd-order statistics often appear similar
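The autocorrelation of a wide-sense stationary field depends only on the lags (m−m', n−n'), so it can be estimated from a single texture sample. A sketch using the common (biased) FFT-based circular estimate, which is one possible estimator rather than the slide's prescription:

```python
import numpy as np

def autocorr_2d(u):
    """Sample autocorrelation R(k,l) of a field assumed wide-sense
    stationary; biased circular estimate computed via the FFT."""
    u = np.asarray(u, dtype=float)
    u = u - u.mean()                       # enforce zero mean
    M, N = u.shape
    U = np.fft.fft2(u)
    # ifft2(|U|^2) is the circular autocorrelation; divide for the
    # biased 1/(M*N) normalization, so r[0,0] is the sample variance
    return np.fft.ifft2(U * np.conj(U)).real / (M * N)

rng = np.random.default_rng(0)
white = rng.standard_normal((64, 64))      # stand-in for a random texture
r = autocorr_2d(white)
```

For a white field the estimate is (near) zero at all nonzero lags, consistent with the white noise field on the next recap slide.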

[14] Examples: Spectral Approaches to Study Texture
(Figures from Gonzalez's 2/e book resource)

[15] Texture Synthesis
- Recall: error concealment of small blocks exploits surrounding edge information to interpolate (edge estimation, then edge-directed interpolation)
- General approach: analysis, then synthesis
- Image inpainting
  - Fill in missing/occluded regions with a synthesized version
    - Maintain structural consistency (in edges & overall color/brightness)
    - Maintain the texture's statistical continuity (such as its oscillation pattern) for improved visual effect
- Ref: M. Bertalmio, L. Vese, G. Sapiro, and S. Osher, "Simultaneous Structure and Texture Image Inpainting," IEEE Trans. on Image Processing, vol. 12, no. 8, August 2003.
(Image examples from the Bertalmio et al. TIP'03 paper)
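For the structure part of inpainting, a much simpler stand-in than the Bertalmio et al. method is harmonic (diffusion) filling: repeatedly replace each hole pixel by the average of its 4 neighbors, which propagates smooth surrounding structure into the hole but, unlike the paper's method, reproduces no texture. A sketch:

```python
import numpy as np

def diffuse_fill(img, mask, iters=500):
    """Fill masked pixels by repeated 4-neighbor averaging (harmonic
    inpainting). A simple stand-in, not the structure+texture method
    of Bertalmio et al.; np.roll wraps at the border, which is fine
    for interior holes."""
    out = img.astype(float).copy()
    out[mask] = out[~mask].mean()          # crude initialization
    for _ in range(iters):
        avg = (np.roll(out, 1, 0) + np.roll(out, -1, 0) +
               np.roll(out, 1, 1) + np.roll(out, -1, 1)) / 4
        out[mask] = avg[mask]              # update only the hole
    return out

# a constant image with a hole is recovered exactly
img = np.full((16, 16), 7.0)
mask = np.zeros_like(img, dtype=bool)
mask[6:10, 6:10] = True
rec = diffuse_fill(img, mask)
```

This maintains structural consistency in color/brightness; maintaining the texture's statistical continuity requires a synthesis step such as the Efros-Leung method on the following slides.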

[16] Image Inpainting: Basic Approach and Example
(Figures from the Bertalmio et al. TIP'03 paper)

[17] Texture Synthesis Approach by Efros-Leung
- Model texture as a Markov Random Field (MRF)
  - The probability distribution of a pixel's brightness given its spatial neighborhood is independent of the rest of the image
  - Model the neighborhood of a pixel with a square window around it
    - The window size controls how stochastic the texture will be; e.g., choose a window size on the scale of the biggest regular feature
- 2-step estimate of the conditional p.d.f.
  - Match the neighborhood with some allowable distortion
  - Build a histogram of the corresponding pixels from matched neighborhoods
  - Produce an estimate based on the histogram
(Figures from the Efros-Leung ICCV'99 paper)
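The matching step can be sketched for a single unfilled pixel: find exemplar neighborhoods whose SSD on the known pixels is within an allowable tolerance of the best match, then sample the center of one of them. A sketch under simplifying assumptions; the Gaussian weighting of the SSD used in the paper is omitted, and `eps` (the distortion tolerance) is an illustrative parameter:

```python
import numpy as np

def efros_leung_pixel(sample, window, mask, eps=0.1, rng=None):
    """Synthesize one pixel at the center of `window` from exemplar
    `sample`. `mask` marks which window pixels are already known.
    Collect exemplar patches whose SSD over known pixels is within
    (1+eps) of the best, then draw one center value at random."""
    if rng is None:
        rng = np.random.default_rng(0)
    w = window.shape[0]
    h = w // 2
    known = mask.astype(float)
    best, cands = np.inf, []
    for i in range(h, sample.shape[0] - h):
        for j in range(h, sample.shape[1] - h):
            patch = sample[i - h:i + h + 1, j - h:j + h + 1]
            d = (known * (patch - window) ** 2).sum()
            cands.append((d, patch[h, h]))
            best = min(best, d)
    ok = [v for d, v in cands if d <= best * (1 + eps) + 1e-12]
    return ok[rng.integers(len(ok))]       # sample the matched histogram

# trivial check: a constant texture synthesizes its own value
sample = np.full((9, 9), 3.0)
window = np.full((3, 3), 3.0)
mask = np.ones((3, 3), bool)
mask[1, 1] = False                         # center is the unknown pixel
val = efros_leung_pixel(sample, window, mask)
```

Drawing from the set of near-best matches, rather than always taking the single best, is what keeps the synthesized texture stochastic.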

[18] Recall: Characterize the Ensemble of 2-D Signals
- Specify by a joint probability distribution function
  - Difficult to measure and specify the joint distribution for images of practical size ⇒ too many r.v.s (e.g., 512 × 512 = 262,144)
- Specify by the first few moments
  - Mean (1st moment) and covariance (2nd moment)
  - May still be non-trivial to measure for the entire image size
- By various stochastic models
  - Use a few parameters to describe the relations among all pixels, e.g., 2-D extensions of the 1-D autoregressive (AR) model
- Important for a variety of image processing tasks: image compression, enhancement, restoration, understanding, …

[19] Recall: Discrete Random Field
- We call a 2-D sequence a discrete random field if each of its elements is a random variable
  - When the random field represents an ensemble of images, we often call it a random image
- Mean and covariance of a complex random field
  - Mean: E[u(m,n)] = μ(m,n)
  - Covariance: Cov[u(m,n), u(m',n')] = E[(u(m,n) − μ(m,n)) (u(m',n') − μ(m',n'))*] = r_u(m,n; m',n')
  - For a zero-mean random field, the autocorrelation function equals the covariance function
- Wide-sense stationarity (wide-sense homogeneity)
  - μ(m,n) = μ = constant
  - r_u(m,n; m',n') = r_u(m−m', n−n'; 0,0) = r(m−m', n−n')
  - Also called shift-invariant or spatially invariant in some literature

[20] Recall: Special Random Fields of Interest
- White noise field
  - A stationary random field
  - Any two elements at different locations x(m,n) and x(m',n') are mutually uncorrelated:
    r_x(m−m', n−n') = σ_x² δ(m−m', n−n')
- Gaussian random field
  - Every segment defined on an arbitrary finite grid is Gaussian, i.e., every finite segment of u(m,n), when mapped into a vector, has a joint Gaussian p.d.f.

[21] Recall: Spectral Density Function
- The spectral density function (SDF) is defined as the Fourier transform of the covariance function r_x
  - Also known as the power spectral density (p.s.d.)
- Example: for a stationary white noise field with r(m,n) = σ² δ(m,n), the SDF is S(ω1, ω2) = σ²
- SDF properties
  - Real and nonnegative: S(ω1, ω2) = S*(ω1, ω2); S(ω1, ω2) ≥ 0
    - Real, by the conjugate symmetry of the covariance function: r(m,n) = r*(−m,−n)
    - Nonnegative, by the non-negative definiteness of the covariance function
  - SDF of the output of an LSI system with frequency response H(ω1, ω2): S_y(ω1, ω2) = |H(ω1, ω2)|² S_x(ω1, ω2)
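These SDF properties can be checked numerically for an assumed covariance model. The separable exponential covariance r(m,n) = ρ^(|m|+|n|) below is illustrative only (not from the slides), evaluated on a DFT grid with lags truncated at ±N/2:

```python
import numpy as np

# SDF as the Fourier transform of an assumed covariance function
rho, N = 0.5, 64
k = np.arange(N)
lags = np.minimum(k, N - k)            # circular lag magnitudes 0..N/2
r1 = rho ** lags                       # 1-D covariance samples, even symmetric
r2 = np.outer(r1, r1)                  # separable 2-D covariance r(m,n)
S = np.fft.fft2(r2).real               # SDF: real since r(m,n) = r*(-m,-n)
```

The resulting S is real (the imaginary part of the DFT vanishes by the even symmetry of r2), nonnegative everywhere, and its DC value equals the sum of the covariance samples, in line with the listed properties.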

[22] More on Image Modeling
- A good image model can facilitate many image processing tasks
  - Coding/compression; restoration; estimation/interpolation; …
- Image models we've seen/used so far
  - Consider pixel values as realizations of a r.v.: color/grayscale histogram
  - Predictive models: use a linear combination of the (causal) neighborhood to estimate
  - Random field u(m,n): characterized by a 2-D correlation function or p.s.d.
- Generally we can characterize u(m,n) = u'(m,n) + e(m,n), where u'(m,n) is some prediction of u(m,n) and e(m,n) is another random field
  - Minimum-variance representation (MVR): e(m,n) is the error of the minimum-variance prediction
  - White-noise-driven representation: e(m,n) is chosen as a white noise field
  - ARMA representation: e(m,n) is a 2-D moving average of a white noise field

[23] Recall: Linear Predictor
- Causality is required for coding purposes
  - Can't use samples the decoder hasn't received as reference
- Use the last sample u_q(n−1): equivalent to coding the difference (DPCM)
- p-th order autoregressive (AR) model
  - A linear predictor from past samples
- Prediction neighborhood
  - Line-by-line DPCM: predict from past samples in the same line
  - 2-D DPCM: predict from past samples in the same line and from previous lines
  - Non-causal neighborhood: use the surrounding samples for prediction/estimation ⇒ for filtering, restoration, etc.
- Predictor coefficients in the MMSE sense: obtained from the orthogonality condition (from the Wiener filtering discussions)
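The 2-D DPCM idea above can be sketched with a three-sample causal predictor. The coefficients a1 = a2 = ρ, a3 = −ρ² are the classic separable choice for a first-order Markov image model with correlation ρ; they are illustrative here, not MMSE-fitted to the data via the orthogonality condition:

```python
import numpy as np

def dpcm_2d(img, a=(0.95, 0.95, -0.9025)):
    """2-D causal DPCM prediction:
    u_hat(m,n) = a1*u(m,n-1) + a2*u(m-1,n) + a3*u(m-1,n-1).
    Returns the prediction and the prediction-error field."""
    u = np.asarray(img, dtype=float)
    pred = np.zeros_like(u)
    pred[1:, 1:] = (a[0] * u[1:, :-1]      # left neighbor
                    + a[1] * u[:-1, 1:]    # upper neighbor
                    + a[2] * u[:-1, :-1])  # upper-left neighbor
    return pred, u - pred

rng = np.random.default_rng(1)
# a smooth, highly correlated field: 2-D cumulative sums of small noise
img = np.cumsum(np.cumsum(rng.standard_normal((64, 64)) * 0.1, 0), 1)
pred, err = dpcm_2d(img)
```

For a correlated image the prediction-error variance is far below the signal variance, which is exactly why DPCM codes the difference.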

[24] Commonly Used Image Models
- Gaussian model and Gaussian mixture model
  - Every segment defined on an arbitrary finite grid is Gaussian, or follows a distribution that linearly combines several Gaussians
  - Reduces modeling to estimating the mean(s), variance(s), and weightings (Ref: Prof. R. Gray's IEEE talk, Spring '07)
- Markov random field
  - Markovianity: conditional independence
    - Define past, present, and future pixel sets for each pixel location
    - Given the present, the future is independent of the past
  - 2-D spatial causal AR model (under Gaussian noise, or MVR)
- Gauss-Markov random field model
  - Gaussian: conditional independence ⇒ conditional uncorrelatedness
- Bring in multi-scale and wavelet/multi-resolution ideas
- Ref: Bovik's Handbook

[25] Comparison of Various Predictive Models
(From Jain's figure; ref. Jain's book, p. 495)
UMCP ENEE631 Slides (created by M.Wu © 2001)

[26] Recursive Estimation: Basic Ideas
- Kalman filter: a recursive linear MMSE estimator
- 1-D example: linear estimation of an AR signal
  - System model AR(M): x(n) = c_1 x(n−1) + … + c_M x(n−M) + w(n)
    - State equation: x(n) = C x(n−1) + w(n) for n = 0, 1, …, with x(−1) = 0 ~ AR signal
    - Observation equation: y(n) = h^T x(n) + v(n)
    - M-dimensional state vector x(n); the model noise w(n) and observation noise v(n) are white and orthogonal
  - The signal model is Mth-order Markov under Gaussian noise w
  - The linear MMSE estimator is globally optimal if the model and observation noises w and v are both Gaussian
  - C can be time-variant and motivated by the physics of the application
- General MMSE solution: equivalent to finding the conditional mean
  - Filtering estimate: E[x(n) | y(n), y(n−1), …, y(0)] ≜ x_a(n)
  - One-step predictor estimate: E[x(n) | y(n−1), …, y(0)] ≜ x_b(n)

[27] Recursive Estimation (cont'd)
- 1-D Kalman filter equations
  - Prediction: x_b(n) = C x_a(n−1)
  - Update: x_a(n) = x_b(n) + g(n) [y(n) − h^T x_b(n)]
- Error covariance equations
  - P_b(n) = C P_a(n−1) C^T + Q_w
  - P_a(n) = (I − g(n) h^T) P_b(n)
  - Initialization: x(0) = [w(0), 0, …, 0]^T, x_b(0) = 0; P_b(0) = Q_w ~ all zeros except the 1st entry σ_w²
- Kalman gain vector
  - g(n) = P_b(n) h (h^T P_b(n) h + σ_v²)^(−1)
- 2-D Kalman filtering: define a proper state vector
  - Raster-scan the observations and map to the equivalent 1-D case
  - Restrict the Kalman gain terms to just the surroundings of the current observation, to reduce computational complexity
(Figure: past / present state / future)
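The predict/update recursion above can be sketched in the simplest scalar case: an AR(1) signal (M = 1, so C = c, h = 1, and the covariances reduce to scalars), with the initialization the slide gives, x_b(0) = 0 and P_b(0) = Q_w:

```python
import numpy as np

def kalman_ar1(y, c, q, r):
    """Scalar Kalman filter for x(n) = c*x(n-1) + w(n), y(n) = x(n) + v(n):
      predict:  xb = c*xa,  Pb = c*Pa*c + q
      gain:     g  = Pb / (Pb + r)
      update:   xa = xb + g*(y - xb),  Pa = (1 - g)*Pb
    Initialization as on the slide: xb(0) = 0, Pb(0) = q."""
    xa, Pa = 0.0, 0.0
    xb, Pb = 0.0, q
    est = []
    for n, yn in enumerate(y):
        if n > 0:
            xb = c * xa
            Pb = c * Pa * c + q
        g = Pb / (Pb + r)                  # scalar Kalman gain
        xa = xb + g * (yn - xb)
        Pa = (1 - g) * Pb
        est.append(xa)
    return np.array(est)

# simulate an AR(1) signal in white observation noise
rng = np.random.default_rng(2)
c, q, r, N = 0.9, 1.0, 1.0, 2000
w = rng.standard_normal(N) * np.sqrt(q)
x = np.zeros(N)
for n in range(1, N):
    x[n] = c * x[n - 1] + w[n]
y = x + rng.standard_normal(N) * np.sqrt(r)
xhat = kalman_ar1(y, c, q, r)
```

The filtered estimate has a markedly lower mean-square error than the raw observations, which is the point of the recursive linear MMSE estimator.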

[28] Summary of Today's Lecture
- Texture analysis and synthesis
- More on image modeling
- Readings
  - Texture: Gonzalez's book; see also Jain's Section 9.11
  - Image modeling: Woods' book Chapters 7 & 9.4; Jain's book Chapter 6; Bovik's Handbook
  - Recursive/Kalman estimation: Woods' book Chapter 7; EE621 (Poor's book)

[29] Beyond ENEE631
- ENEE633 Statistical Pattern Recognition
- ENEE731 Image Understanding
- Audio/Speech: ENEE632
- Adaptive algorithms: ENEE634
- Special-topic research-oriented courses (sometimes offered)
  - On medical image processing
  - On media security & forensics
