Outline
S. C. Zhu, X. Liu, and Y. Wu, "Exploring Texture Ensembles by Efficient Markov Chain Monte Carlo", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 22, No. 6, 2000
Limitations of Linear Representations
Linear representations do not depend on the spatial relationships among pixels. For example, if we shuffle the pixels (and the corresponding entries of the representation) consistently, the classification results remain the same. But in images, spatial relationships are important.
November 21, 2018 Computer Vision
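A minimal sketch of this invariance, using a toy flattened "image" and a hypothetical linear classifier score (a dot product with a weight vector): applying the same permutation to the pixels and to the weights leaves the score unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy "image" flattened to a vector, and a linear classifier's weight vector.
image = rng.random(64)
weights = rng.random(64)

# Apply the same shuffle to the pixels and to the classifier weights.
perm = rng.permutation(64)
score_original = weights @ image
score_shuffled = weights[perm] @ image[perm]

# The linear score is a sum over pixels, so it is invariant to the shuffle.
print(np.isclose(score_original, score_shuffled))  # True
```

The shuffled image looks like noise to a human, yet any classifier built on such linear scores cannot tell the difference.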
Image Features
Spectral Representation of Images
Spectral histogram: given a bank of filters F(a), a = 1, …, K, the spectral histogram of an image is defined as the set of marginal distributions of its filter responses.
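A sketch of computing a spectral histogram, assuming a tiny hypothetical filter bank (intensity plus two gradient filters); the actual filter bank in the paper is richer (e.g., Gabor and Laplacian-of-Gaussian filters), and the bin count and value range here are illustrative choices.

```python
import numpy as np
from scipy.ndimage import convolve

def spectral_histogram(image, filters, bins=16, value_range=(-1.0, 1.0)):
    """Concatenate the normalized histograms of the image's filter responses."""
    hists = []
    for f in filters:
        response = convolve(image, f, mode='wrap')
        h, _ = np.histogram(response, bins=bins, range=value_range)
        hists.append(h / h.sum())  # marginal distribution of this filter's responses
    return np.concatenate(hists)

# Hypothetical filter bank: identity (intensity) plus horizontal/vertical gradients.
filters = [np.array([[1.0]]),
           np.array([[-1.0, 1.0]]),
           np.array([[-1.0], [1.0]])]

rng = np.random.default_rng(0)
image = rng.uniform(-0.5, 0.5, size=(32, 32))
H = spectral_histogram(image, filters)
print(H.shape)  # (48,): K = 3 filters x 16 bins
```

Because each per-filter histogram is normalized, the concatenated vector sums to K, and images with the same spectral histogram share the same filter-response statistics regardless of where the responses occur.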
Spectral Representation of Images - continued
An example of a spectral histogram
Image Modeling - continued
Given observed feature statistics {H(a)obs}, we associate an energy with any image I, E(I) = Σa |H(a)(I) − H(a)obs|. The corresponding Gibbs distribution is q(I) ∝ exp(−E(I)/T). q(I) can be sampled using a Gibbs sampler or other Markov chain Monte Carlo algorithms.
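A sketch of the energy and the Gibbs weighting over a discrete set of candidate states, assuming the L1 norm as the histogram-matching distance (one common choice):

```python
import numpy as np

def energy(h_syn, h_obs):
    """E(I): sum over filters of the L1 distance between the synthesized
    and observed spectral histograms."""
    return np.abs(h_syn - h_obs).sum()

def gibbs_prob(energies, T):
    """q(I) ∝ exp(-E(I)/T), normalized over candidate states."""
    w = np.exp(-(energies - energies.min()) / T)  # subtract min for stability
    return w / w.sum()

# Toy candidate energies at temperature T = 1: lower energy -> higher probability.
p = gibbs_prob(np.array([0.2, 1.0, 3.0]), T=1.0)
print(p.argmax())  # 0
```

Lowering T sharpens q toward the minimum-energy states, which is what the annealing schedule in the synthesis algorithm exploits.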
Image Modeling - continued
Image Synthesis Algorithm
1. Compute {H(a)obs} from an observed texture image
2. Initialize Isyn as any image, and T as T0
3. Repeat:
   - Randomly pick a pixel v in Isyn
   - Calculate the conditional probability q(Isyn(v) | Isyn(−v))
   - Draw a new value of Isyn(v) from q(Isyn(v) | Isyn(−v))
   - Reduce T gradually
4. Until E(Isyn) < ε
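The steps above can be sketched as a simplified Gibbs sampler. To stay self-contained, this toy version uses only the intensity histogram as the feature (a one-filter spectral histogram) on a small 8-level image; the target histogram, cooling rate, and stopping threshold are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def synthesize(obs_hist, shape, levels=8, T0=1.0, cooling=0.999,
               eps=0.05, max_steps=20000, seed=0):
    """Gibbs-sampler texture synthesis sketch driven by a single
    intensity histogram feature."""
    rng = np.random.default_rng(seed)
    syn = rng.integers(0, levels, size=shape)  # initialize Isyn as any image

    def hist(img):
        h = np.bincount(img.ravel(), minlength=levels)
        return h / h.sum()

    def energy(img):
        return np.abs(hist(img) - obs_hist).sum()

    T = T0
    for _ in range(max_steps):
        # Randomly pick a pixel v and score every candidate gray level there.
        v = tuple(rng.integers(0, s) for s in shape)
        energies = np.empty(levels)
        for g in range(levels):
            syn[v] = g
            energies[g] = energy(syn)
        # Conditional distribution q(Isyn(v) | Isyn(-v)) at temperature T.
        w = np.exp(-(energies - energies.min()) / T)
        syn[v] = rng.choice(levels, p=w / w.sum())
        T *= cooling                      # reduce T gradually
        if energies.min() < eps:          # until E(Isyn) < eps
            break
    return syn

# Hypothetical target: half the pixels at level 0, half at level 1.
obs = np.array([0.5, 0.5, 0, 0, 0, 0, 0, 0])
out = synthesize(obs, (16, 16))
print(np.abs(np.bincount(out.ravel(), minlength=8) / out.size - obs).sum() < 0.1)
```

A real implementation would update the spectral histograms incrementally for each candidate value instead of recomputing them, since only the local filter responses around v change.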
A Texture Synthesis Example
(Figures: observed image; initial synthesized image)
A Texture Synthesis Example
(Figure: image patch, energy, conditional probability, and temperature - the energy and conditional probability of the marked pixel)
A Texture Synthesis Example - continued
(Figure: average spectral histogram error) A white noise image was transformed into a perceptually similar texture by matching the spectral histogram.
A Texture Synthesis Example - continued
(Figures: synthesized images from different initial conditions)
Texture Synthesis Examples - continued
(Figures: observed image and synthesized image for each example)
- A random texture image
- An image with periodic structures
- A mud image with some animal footprints
- A random texture image with elements
- An image consisting of two regions (note that wrap-around boundary conditions were used)
- A cheetah skin image
- An image consisting of circles
- An image consisting of crosses
- A pattern with long-range structures
Object Synthesis Examples
As in texture synthesis, we start from a random image. In addition, similar object images are used as boundary conditions, in that the corresponding pixel values are not updated during the sampling process.
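One way to sketch such a boundary condition is a boolean mask of clamped pixels: candidate pixels for updating are drawn only from the unmasked positions, so the clamped values survive the whole sampling run. The mask layout below is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
syn = rng.integers(0, 8, size=(16, 16))

# Boundary condition: pixels where mask is True keep their observed values
# and are never proposed for update during sampling.
mask = np.zeros((16, 16), dtype=bool)
mask[:2, :] = True                      # e.g., clamp the top two rows

free = np.argwhere(~mask)               # only these pixels are sampled
v = tuple(free[rng.integers(len(free))])
print(mask[v])  # False: a chosen pixel is always a free one
```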
Object Synthesis Examples - continued
(Figures)
Linear Transformations of Images
Linear transformations include:
- Principal component analysis
- Independent component analysis
- Fisher discriminant analysis
- Optimal component analysis
They have been widely used to reduce the dimensionality of images for appearance-based recognition applications. Each image is viewed as a long vector and projected onto a set of bases that have certain properties.
Principal Component Analysis
Defined with respect to a training set such that the average reconstruction error is minimized
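A minimal sketch of this property via the SVD of centered data: the top k right singular vectors are the principal components, and projecting onto them gives the best rank-k reconstruction in the least-squares sense. The synthetic rank-20 "image" data here is purely illustrative.

```python
import numpy as np

def pca_basis(X, k):
    """Top-k principal directions of row-vector data X: the k-dimensional
    subspace minimizing average reconstruction error."""
    Xc = X - X.mean(axis=0)
    # Rows of Vt are the principal directions, ordered by variance explained.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:k]

def reconstruct(X, basis):
    mean = X.mean(axis=0)
    coeffs = (X - mean) @ basis.T       # project onto the basis
    return mean + coeffs @ basis        # map back to image space

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20)) @ rng.normal(size=(20, 64))  # rank-20 "images"
err = np.abs(X - reconstruct(X, pca_basis(X, 20))).max()
print(err < 1e-8)  # True: 20 components reconstruct rank-20 data exactly
```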
Principal Component Analysis - continued
(Figure)
Eigenvalues of 400 Eigenvectors
(Figure)
Principal Component Analysis - continued
(Figures: original image; reconstruction using 50 PCs; reconstruction using 200 PCs)
Principal Component Analysis - continued
Is the PCA representation a good representation of images for recognition, in the sense that images with similar principal-component representations are similar? Image generation through sampling: roughly speaking, we try to generate images that have the given coefficients along the PCs.
Principal Component Analysis - continued
(Figures)
Difference Between Reconstruction and Sampling
Reconstruction is not sufficient to show the adequacy of a representation; sampling from the set of images with the same representation is more informative.
Object Recognition Experiments
We compare linear methods in the image space, including:
- Principal component analysis (PCA)
- Independent component analysis (ICA)
- Fisher discriminant analysis (FDA)
- Random component analysis (RCA) - for fun, and to show that the actual gain from using different bases is relatively small
and the corresponding linear methods in the spectral histogram space: SPCA, SICA, SFDA, and SRCA.
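The comparison can be grounded with a toy recognition step: nearest-neighbor classification in feature space, using the L1 distance, which is a natural choice for comparing histograms. The class names and three-bin "spectral histograms" below are hypothetical stand-ins.

```python
import numpy as np

def nearest_neighbor_classify(train_feats, train_labels, query):
    """1-NN in feature space with L1 distance."""
    d = np.abs(train_feats - query).sum(axis=1)
    return train_labels[d.argmin()]

# Hypothetical "spectral histograms" for two texture classes.
train = np.array([[0.8, 0.2, 0.0],
                  [0.1, 0.1, 0.8]])
labels = np.array(['grass', 'water'])
pred = nearest_neighbor_classify(train, labels, np.array([0.7, 0.3, 0.0]))
print(pred)  # grass
```

Swapping the raw image space for the spectral histogram space changes only what `train_feats` holds; the recognition step itself is unchanged, which is what makes the PCA/ICA/FDA/RCA vs. SPCA/SICA/SFDA/SRCA comparison clean.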
COIL Dataset
3D Recognition Results
Experimental Results - continued
To further demonstrate the effectiveness of our method for different types of images, we create a combined dataset from the texture dataset, the face dataset, and the COIL dataset, resulting in a dataset of 180 categories.
Linear Subspaces of Spectral Representation
Experimental Results - continued
Combined dataset - continued: not only is the recognition rate very good, it is also very reliable and robust, as the average entropy of p0(i|I) is 0.60 bit (the entropy of the corresponding uniform distribution is 7.49 bits).
Experimental Results - continued
(Figure: example posterior distributions, with entropy = 0.60 bit and entropy = 6.78 bits)
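The 7.49-bit figure can be checked directly: the entropy of a uniform distribution over the combined dataset's 180 categories is log2(180) ≈ 7.49 bits, so an average posterior entropy of 0.60 bit means the classifier's probability mass is highly concentrated.

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits, ignoring zero-probability entries."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Uniform posterior over the 180 categories of the combined dataset.
uniform = np.full(180, 1 / 180)
print(round(entropy_bits(uniform), 2))  # 7.49
```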