Image Features
Heejune Ahn, SeoulTech (last updated 2015.04.12)
Outline

Goal of feature extraction and representation
External features
- Shape vectors
- Single-parameter shape descriptors
- Signatures and radial Fourier expansion
- Statistical moments of a region
Internal features
- Statistical measures of texture
Principal Component Analysis
0. Feature extraction

Purpose of feature extraction: reduce a big image (N x M pixels) to a concise set of features (points, boundaries, texture, etc.) for recognition and classification.
Feature types (representations):
- External: boundaries or shapes of objects
- Internal: texture (variation of intensity/colors)
- Statistical: e.g. PCA
To consider: which features to extract, and how to extract them.
Pipeline: Image (big data, N x M) -> image processing -> features (concise, small number).
1. Landmarks and shape vectors
Desired properties
- A small number of points (not all pixels on the boundary)
- 'Robust' landmarks (similar results from different observers)
Types of landmarks
- Mathematical (black): extremes of gradient or curvature, e.g. Harris corner points
- Anatomical/true (red): placed by experts
- Pseudo landmarks (green): constructed points, e.g. spaced between the other landmarks
Shape vector: the sequence of landmark points, as in the sketch below.
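A minimal MATLAB sketch of the shape-vector representation, using made-up landmark coordinates (all values here are illustrative): the k landmark points are simply concatenated into one 2k-element vector.

    landmarks = [10 12; 35 14; 40 30; 22 41; 8 28];  % k x 2 landmark points (x, y)
    shapeVec  = reshape(landmarks', [], 1);          % shape vector [x1; y1; x2; y2; ...]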
2. Shape descriptors: single parameters

Simple, concise scalar values for classification: numeric counterparts of verbal descriptions such as 'long and curved', 'roughly square', or 'thin'.
Ex 9.1 & Fig 9.2: MATLAB regionprops(labeledImage, 'Area', 'Perimeter', ...) computes such parameters per labeled region; see the code, and the sketch below.
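A minimal sketch of the Ex 9.1 idea, assuming a binary image bw is already available (the variable names are illustrative); 'Area', 'Perimeter', and 'Eccentricity' are standard regionprops measurements.

    L     = bwlabel(bw);                 % label the connected regions
    stats = regionprops(L, 'Area', 'Perimeter', 'Eccentricity');
    % compactness (circularity): 4*pi*Area/Perimeter^2, equal to 1 for a disc
    compactness = 4*pi*[stats.Area] ./ [stats.Perimeter].^2;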
3. Signatures and radial Fourier expansion

Definition: the radial signature r(θ) gives the distance from the region centroid to the boundary as a function of angle; it is periodic with period 2π and reduces a shape (points in 2-D) to a 1-D function.
A limited number of Fourier coefficients of r(θ) is enough to describe the shape.
Strength: robustness to transforms
- Translation invariant: r(θ) is calculated from the centroid
- Scale: a simple constant multiple of the coefficients
- Rotation: only a phase shift in the coefficients
Weakness: cannot handle shapes that are not single-valued in θ (more than one boundary point per angle). A sketch follows.
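A minimal sketch of the radial signature and its truncated Fourier expansion, assuming B is a k x 2 array of boundary points of a star-shaped region with enough samples (names are illustrative; uses implicit expansion, MATLAB R2016b+).

    c = mean(B, 1);                                  % centroid (cx, cy)
    d = B - c;                                       % offsets from the centroid
    [theta, order] = sort(atan2(d(:,2), d(:,1)));    % order points by angle
    r = hypot(d(order,1), d(order,2));               % signature r(theta)
    F = fft(r);                                      % Fourier coefficients of r(theta)
    K = 8;                                           % keep only 2K+1 low-order coefs
    Fk = zeros(size(F));
    Fk([1:K+1, end-K+1:end]) = F([1:K+1, end-K+1:end]);
    rApprox = real(ifft(Fk));                        % shape from few coefficients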
4. Statistical moments as region descriptors
Definitions
- Moment: m_pq = Σ_x Σ_y x^p y^q I(x,y)
- Central moment (translation invariant): μ_pq = Σ_x Σ_y (x - x̄)^p (y - ȳ)^q I(x,y), with centroid x̄ = m_10/m_00, ȳ = m_01/m_00
- Normalized moment (scale invariant): η_pq = μ_pq / μ_00^γ with γ = (p+q)/2 + 1
- Rotation-invariant (Hu) moments are built from the η_pq.
Hu moments, Ex 9.3 & Fig 9.5: label the regions with bwlabel, compute the moments m_pq and normalized moments η_pq, then the invariants H1 and H2 for each region; a sketch follows.
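A minimal sketch of the definitions above for one binary region bw (an illustrative name): compute the central and normalized moments, then Hu's first two invariants.

    [y, x] = find(bw);                    % pixel coordinates of the region
    m00 = numel(x);                       % zeroth moment (area)
    xb = mean(x);  yb = mean(y);          % centroid
    mu  = @(p, q) sum((x - xb).^p .* (y - yb).^q);   % central moment
    eta = @(p, q) mu(p, q) / m00^((p + q)/2 + 1);    % normalized moment
    H1 = eta(2,0) + eta(0,2);                        % rotation invariants
    H2 = (eta(2,0) - eta(0,2))^2 + 4*eta(1,1)^2;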
5. Texture features

Texture: the fluctuation of intensity among neighboring pixels.
Statistical texture features, computed over a local neighborhood:
- Range = max - min
- Variance = <I²> - <I>²
- Entropy (from information theory) = -Σ p_i log2 p_i over the neighborhood histogram
A sketch follows.
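A minimal sketch of the three measures over a sliding window, assuming the Image Processing Toolbox is available (the window size is an arbitrary choice).

    I = imread('cameraman.tif');          % sample grayscale image
    w = true(9);                          % 9x9 neighborhood
    R = rangefilt(I, w);                  % local range: max - min
    V = stdfilt(I, w).^2;                 % local variance: <I^2> - <I>^2
    E = entropyfilt(I, w);                % local entropy: -sum(p .* log2(p))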
6. Principal component analysis

Concept: an expert can explain in 10 words what a novice needs 100 words for. PCA is eigenvalue analysis for dimension reduction, used for classification and for synthesis.
Example: a 100x100 binary image has 2^10000 possible cases, but not every combination is a meaningful human face. Why? Because pixels are correlated.
So instead of storing, e.g., 10k faces at full size, keep a small set of principal components (e.g. eye colors and shapes, face shapes, skin colors) and synthesize faces by weighting and combining them (e.g. 1000 unique faces).
7. An illustrative PCA example
- Korean traditional medicine: four types of human bodies
- Psychology: the MBTI test categorizes 16 types
- Physical measures: {age, height, weight, waist size, arm length, neck circumference, finger length} => {age, 0.7*height + 0.3*weight}
Algorithm for PCA
1. Calculate the i-th principal axis of the (residual) data.
2. Calculate the residuals by projecting that axis out of the data.
3. Repeat from step 1 while i <= N and the error > eTarget.
Note: the new coordinates do not have physical meanings. A sketch follows.
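A minimal sketch of this sequential scheme using power iteration, assuming X is an N x M data matrix with one sample per row (names and iteration counts are illustrative): each axis is found on the residuals left by the previous ones.

    Xr = X - mean(X, 1);                  % center the data
    nAxes = 3;                            % number of principal axes wanted
    A = zeros(size(X, 2), nAxes);
    for i = 1:nAxes
        a = randn(size(Xr, 2), 1);        % random starting direction
        for it = 1:200                    % power iteration on Xr'*Xr
            a = Xr' * (Xr * a);
            a = a / norm(a);
        end
        A(:, i) = a;                      % i-th principal axis
        Xr = Xr - (Xr * a) * a';          % residuals: project the axis out
    end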
8. Theory of PCA

Goal and problem: given N (>= M) samples of M-dimensional vectors x = <x_1, x_2, ..., x_M>, find principal axes (PAs) a_k and the transform y = A^T (x - x̄), such that the components of y are uncorrelated, i.e. Cov(y_i, y_j) = 0 for i ≠ j.
Solution: this is a typical eigenvector/eigenvalue problem: the principal axes are the eigenvectors of the covariance matrix R, and the eigenvalues are the corresponding variances.
Explanation: the definition of an eigenvector/eigenvalue of a matrix R is R x = λ x; writing this for all eigenvectors in matrix form gives R A = A Λ, with the eigenvectors as the columns of A and the eigenvalues on the diagonal of Λ.
Note: we can choose to keep only the components with the largest eigenvalues.
9. PCA calculation procedure

Ex 9.5, Fig. 9.6: MATLAB [U,S,V] = svd(X) returns a diagonal matrix S, of the same dimensions as X, with nonnegative diagonal elements in decreasing order, and unitary matrices U and V such that X = U*S*V'. A sketch follows.
Question: for eigenfaces M = W x H, which is large; how can the computation stay tractable? (See '13. PCA in image processing'.)
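A minimal sketch of PCA via svd, assuming X is N x M with one observation per row (illustrative names): the columns of V are the principal axes, and the squared singular values give the variances.

    mu = mean(X, 1);
    [U, S, V] = svd(X - mu, 'econ');            % economy-size decomposition
    Y = (X - mu) * V;                           % principal components (= U*S)
    variances = diag(S).^2 / (size(X, 1) - 1);  % variance along each axis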
10. Principal axes and components

Dimension: M = the number of variables; N = the number of observations.
Principal components: the projections of the data onto the principal axes.
11. PCA key properties

- The principal axes are mutually orthogonal.
- The data in the new axes are uncorrelated.
- Successive axes add maximal variance.
- The eigenvalues of the diagonalized covariance matrix are the variances along the axes.
Ex 9.6 & Fig. 9.11: the data set is the pixel directions from the centroid in binary images; PCA finds the two key directions. A numerical check follows.
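A minimal sketch that checks these properties numerically for a data matrix X with one sample per row (illustrative names).

    Xc = X - mean(X, 1);
    [V, D] = eig(cov(Xc));                % principal axes and eigenvalues
    Y = Xc * V;                           % data in the new axes
    disp(V' * V);                         % ~identity: axes are orthogonal
    disp(cov(Y));                         % ~diagonal: new coords uncorrelated
    disp(diag(D)' - var(Y));              % ~zero: eigenvalues are variances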
12. PCA dimensionality reduction

Sort the components in order of variance and discard those with small variance; this is the main purpose of PCA.
(Figure: variances of the physical variables X1..Xn, e.g. height, weight, waist, versus variances of the principal-axis variables Y1..Yn; the principal axes concentrate most of the variance in the first few components.) A sketch follows.
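A minimal sketch of the reduction step, assuming X is N x M (illustrative names): sort the axes by variance, keep the top k, and report the fraction of variance retained.

    Xc = X - mean(X, 1);
    [V, D] = eig(cov(Xc));
    [lam, order] = sort(diag(D), 'descend');   % variances, largest first
    k = 2;                                     % number of components kept
    Vk = V(:, order(1:k));                     % top-k principal axes
    Yk = Xc * Vk;                              % reduced representation
    retained = sum(lam(1:k)) / sum(lam);       % variance fraction kept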
13. PCA in image processing

Covariance can be calculated over images or over elements (pixels):
- Cov[image i, image j]: an N x N matrix
- Cov[position i, position j]: an M x M matrix
M (the number of pixels) is often >> N (the number of images), so the image-domain (N x N) form is used, as in the sketch below.
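A minimal sketch of the image-domain trick, assuming X is N x M with one vectorized image per row and M >> N (illustrative names): diagonalize the small N x N matrix and map its eigenvectors back to pixel space.

    Xc = X - mean(X, 1);                  % centered images, N x M
    [W, D] = eig(Xc * Xc');               % N x N problem, not M x M
    V = Xc' * W;                          % columns: M-dim principal axes
    V = V ./ sqrt(sum(V.^2, 1));          % renormalize each axis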
14. Out of sample

- Training data: the data set used for model generation (the mean and principal axes).
- Test data (i.e. out-of-sample data): used to test performance.
Procedure: a test vector is projected using the mean and axes obtained from the training data only, trading accuracy against compactness; a sketch follows.
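A minimal sketch of projecting one out-of-sample vector xTest, assuming muTrain and the top-k axes Vk were computed from the training set only (illustrative names).

    yTest = Vk' * (xTest(:) - muTrain(:));   % new coordinates of the test sample
    xHat  = muTrain(:) + Vk * yTest;         % reconstruction from k components
    err   = norm(xTest(:) - xHat);           % accuracy vs compactness trade-off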
15. Eigenfaces

- Face library: N registered face images, each stored as its K principal-axis weights <a_k>.
- Face identification: compute the K weights <a_k> of a test face and compare them with the registered weights using a similarity measure (Euclidean distance), as sketched below.
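A minimal sketch of the identification step, assuming Faces is M x N (one vectorized registered face per column), A holds the K eigenfaces as columns, and testFace is the probe image (all names are illustrative).

    mu = mean(Faces, 2);                       % mean face, M x 1
    Wreg = A' * (Faces - mu);                  % K x N weights of registered faces
    wTest = A' * (testFace(:) - mu);           % K x 1 weights of the test face
    dist = sum((Wreg - wTest).^2, 1);          % squared Euclidean distances
    [~, id] = min(dist);                       % best-matching registered face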
(Figure: a sample of the registered faces and the top 6 principal components, i.e. eigenfaces.)