Face alignment using Boosted Appearance Model (BAM)
Satya Mahesh Muddamsetty. Supervisor: Tommaso Gritti, Video Processing & Analysis group. Examiner: Mikael Nilsson, Department of Signal Processing, BTH. September 30, 2009
Outline: Introduction; Brief summary of previous methods; Shape model learning in BAM; Appearance model learning in BAM; Alignment using BAM; Experiments & Results; Conclusion
Introduction: Image alignment/fitting
Image alignment is the process of moving and deforming a template to minimize the distance between the template and an image. Alignment involves three choices. Step 1: model choice (template, ASM, AAM). Step 2: distance metric, e.g. MSE (Mean Squared Error). Step 3: optimization, e.g. gradient descent methods.
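The three steps above can be sketched in their simplest form: a pure-translation template model, an MSE distance metric, and a greedy discrete descent over pixel offsets. This is an illustrative toy setup, not the thesis implementation; all names below are invented for the sketch.

```python
import numpy as np

def mse(image, template, r, c):
    """MSE distance between the template and the image patch at (r, c)."""
    h, w = template.shape
    patch = image[r:r + h, c:c + w]
    return np.mean((patch - template) ** 2)

def align_translation(image, template, r0, c0, iters=50):
    """Greedy discrete descent: step to whichever neighbour lowers the MSE."""
    r, c = r0, c0
    h, w = template.shape
    for _ in range(iters):
        best = (mse(image, template, r, c), r, c)
        for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
            nr, nc = r + dr, c + dc
            if 0 <= nr <= image.shape[0] - h and 0 <= nc <= image.shape[1] - w:
                e = mse(image, template, nr, nc)
                if e < best[0]:
                    best = (e, nr, nc)
        if (best[1], best[2]) == (r, c):   # local minimum reached
            break
        r, c = best[1], best[2]
    return r, c
```

Real alignment methods replace the discrete search with a continuous gradient step and the translation with a full shape warp, as the later slides describe.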
Introduction: applications
Face fitting [Baker & Matthews’04IJCV] Tracking [Hager & Belhumeur’98PAMI] Medical Image Interpretation [Mitchell et al.’02TMI] Industrial Inspection
Brief summary of previous methods
Point distribution model (PDM) [Cootes et al.’92BMVC] Active shape model (ASM) [Cootes & Taylor’92BMVC] Active appearance model (AAM) [Cootes et al.’01PAMI] Inverse compositional (IC) and simultaneous inverse compositional (SIC) AAM fitting [Baker & Matthews’04IJCV]
Brief summary of previous methods
Active shape model (ASM) [Cootes & Taylor'92BMVC]: it uses the shape model (PDM) as a template, with shape parameters, a mean shape and eigenvectors. It seeks to minimize the distance between model points and the corresponding points found in the image. Drawback: only the local appearance information around each landmark is learned, which may not be an effective way of modeling the appearance.
Brief summary of previous methods
Inverse compositional (IC) and Simultaneous Inverse compositional (SIC) AAM fitting [Baker & Matthews’04IJCV] AAM template:
Brief summary of previous methods
Inverse compositional (IC) and simultaneous inverse compositional (SIC) AAM fitting [Baker & Matthews'04IJCV]. The AAM distance compares the image observation I(W(x; p)) against the appearance model, i.e. the mean appearance plus the appearance basis weighted by the appearance parameters, where W is the warping function, p the shape parameters and x the image coordinates. (* Gross et al.'05IVC)
Problems of Previous methods
All these AAM methods are known to have a generalization problem: they degrade quickly when trained on a large dataset, and performance is poor on unseen data. These models are generative models. How to solve this? A new method known as the Boosted Appearance Model (BAM): it is a discriminative model, with a shape model, an appearance model and a specific alignment method.
Shape model learning in BAM
The same shape model (PDM) is used as in previous methods. The shape model is learned by applying principal component analysis (PCA) to the set of N shape vectors S. Each shape consists of landmark points stacked as si = (xi,0, xi,1, ..., xi,n, yi,0, yi,1, ..., yi,n)T, with observations i = 1, ..., N. (Sample shape from training set.) Steps: 1. Compute the mean of the training data. 2. Compute the covariance matrix C of the data (nd x nd matrix). 3. Compute the eigenvectors Φi and the corresponding eigenvalues of C.
Shape model learning in BAM
Eigenvectors: fi = (fi,1, fi,2, ..., fi,nd-1, fi,nd), ordered in the same way as the coordinates (x0, x1, ..., xn, y0, y1, ..., yn). Eigenvalues: λ1, λ2, ..., λnd. The eigenvector with the highest eigenvalue is the most dominant shape variation in the training set, so the eigenvectors are ordered by decreasing eigenvalue. Matrix of eigenvectors: F = [f1, f2, ..., fk]. Finally, the parametric shape model s can be expressed as the mean shape plus a linear combination of the first k eigenvectors. (Sample shape from training set.)
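The PCA steps above can be sketched in NumPy. This is a minimal sketch assuming the shapes are already Procrustes-aligned and stacked as rows of a matrix; the function names are illustrative.

```python
import numpy as np

def learn_shape_model(S, k):
    """S: (N, nd) matrix of aligned shape vectors.
    Returns the mean shape, the top-k eigenvectors and their eigenvalues."""
    mean = S.mean(axis=0)
    C = np.cov(S - mean, rowvar=False)      # (nd x nd) covariance matrix
    evals, evecs = np.linalg.eigh(C)        # eigh returns ascending order
    order = np.argsort(evals)[::-1]         # reorder by decreasing eigenvalue
    return mean, evecs[:, order[:k]], evals[order[:k]]

def synthesize(mean, F, p):
    """Parametric shape model: s = mean shape + F p (linear combination
    of the k dominant eigenvectors, weighted by shape parameters p)."""
    return mean + F @ p
```

Setting p = 0 recovers the mean shape; varying one component of p at a time produces the shape-variation modes shown on the next slide.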
Shape model learning in BAM
Learned shape variations: obtained by varying each shape parameter pi with respect to the mean shape, for i = 1, 2, 3, 4, 5, ..., k.
Appearance Model learning in BAM
Similar to AAM, the appearance model is defined on the warped image I(W(x; p)). In BAM, the appearance model is a set of M weak classifiers that learn the decision boundary between correct alignment (positive class) and incorrect alignment (negative class). It is a function of the image warped with the shape parameters p.
Appearance Model learning in BAM
Training samples. Positive samples: compute the shape parameters by projecting the ground-truth shape onto the model, where s is the shape vector and F the matrix of eigenvectors. Negative samples: perturb each element of the original shape parameters randomly, p + μ v λ, where v is a k-dimensional vector with each element uniformly distributed in [-1, 1], λ is the vector of the k eigenvalues, and q is the number of perturbed shapes per original shape.
Appearance Model learning in BAM
Positive samples: N original shapes, giving N original warped images. Negative samples: Nq perturbed shapes, giving Nq perturbed warped images.
Appearance Model learning in BAM
Boosting: positive samples have label = 1, negative samples label = -1. Rectangular Haar features are computed on the warped images via the integral image (Viola and Jones).
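Computing a rectangular Haar feature via the integral image can be sketched as below; the two-rectangle feature shown is one of the Viola-Jones feature types, and the function names are illustrative.

```python
import numpy as np

def integral_image(img):
    """ii[r, c] = sum of img over the rectangle [0..r, 0..c]."""
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, r, c, h, w):
    """Sum of img[r:r+h, c:c+w] in O(1) using four integral-image lookups."""
    total = ii[r + h - 1, c + w - 1]
    if r > 0:
        total -= ii[r - 1, c + w - 1]
    if c > 0:
        total -= ii[r + h - 1, c - 1]
    if r > 0 and c > 0:
        total += ii[r - 1, c - 1]
    return total

def haar_two_rect(ii, r, c, h, w):
    """Horizontal two-rectangle Haar feature: left half minus right half."""
    return rect_sum(ii, r, c, h, w // 2) - rect_sum(ii, r, c + w // 2, h, w - w // 2)
```

On a uniform image the two halves cancel, so the feature responds only to intensity structure inside its window.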
Appearance Model learning in BAM
Boosting: positive samples (N original images, K features each) and negative samples (Nq perturbed images, K features each) are fed to gentle boosting, which selects the Haar features.
Appearance Model learning in BAM
Weak classifier design: each weak classifier (m = 1, 2, ..., M) is built from one Haar feature selected by gentle boost, computed on the warped image, together with a threshold and a sign.
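One weak classifier as described above can be sketched as a decision stump on a single selected Haar feature. The exact stump form (a hard ±1 threshold rather than, e.g., a soft regression stump) is an assumption of this sketch.

```python
def weak_classifier(feature_value, threshold, sign):
    """Decision stump: vote +1 (toward 'well aligned') or -1 depending on
    which side of the threshold the Haar feature value falls; `sign` in
    {+1, -1} flips the direction of the comparison."""
    return sign if feature_value > threshold else -sign
```

The feature index, threshold and sign of each stump are what gentle boosting fits during training.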
Appearance Model learning in BAM
Final weak classifiers: the selected features for m = 1 to 100.
Appearance Model learning in BAM
Appearance model: finally, the appearance model is a collection of parameters able to distinguish correct alignment (positive class) from incorrect alignment (negative class). For each weak classifier it stores the Haar feature type, the feature details (r, c, width, height), the location, a sign and a threshold.
Outlines Introduction Breif summary of Previous methods Shape model & learning in BAM Appearance model & learning in BAM Experiments & Results Conclusion Alignment using BAM
Alignment using BAM: how is alignment done using BAM?
Use the classification score from the trained strong classifier as the distance metric. How is this score computed? Input: the warped image. Output: a score that indicates the quality of alignment. Training samples: 400 positive samples (red), 4000 negative samples (blue).
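The alignment score above, i.e. the strong classifier, is the sum of the M weak classifiers evaluated on the warped image. In this sketch the weak classifiers are represented as (threshold, sign) stumps over a vector of precomputed Haar feature values; the stump form is an assumption.

```python
import numpy as np

def alignment_score(features, thresholds, signs):
    """Strong-classifier score F(p) = sum over m of the weak votes h_m.
    `features` are the M selected Haar feature values on I(W(x; p));
    a positive score indicates good alignment, negative indicates bad."""
    votes = np.where(features > thresholds, signs, -signs)
    return votes.sum()
```

Because the score is a sum of many weak votes, it varies smoothly enough with p to be climbed by gradient ascent, which is what the following slides exploit.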
Alignment using BAM: given initial parameters that yield a negative score, alignment means searching for new parameters that yield a maximal positive score. Finding these new parameters is a non-linear optimization problem; to solve it iteratively we use a gradient ascent method.
Alignment using BAM: alignment/fitting via the gradient ascent method.
The correct parameters are solved for iteratively; fitting quality is measured by the RMSE of the landmarks.
Alignment using BAM Gradient Ascent Solution
Our trained two-class strong classifier F(p); its gradient with respect to the shape parameters drives the update.
Alignment using BAM: summary
Inputs: input image I, initial shape parameters p, warp Jacobian; BAM: shape model {mean shape, eigenvectors} and appearance model {weak classifiers, m = 1, 2, ..., M}.
Step 0: compute the gradient of the image.
Repeat:
1. Warp I with p to compute the warped image.
2. Compute the selected feature for each weak classifier on the warped input image.
3. Warp the gradient image with p.
4. Compute the steepest descent image SD.
5. Compute the integral images for each column of SD and obtain the rectangular features for each weak classifier.
6. Compute the parameter update.
7. Update p.
Until convergence.
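The fitting loop above can be condensed into a generic gradient ascent on the score function. In this sketch the analytic gradient computed via the steepest-descent images (steps 3-6) is replaced by a finite-difference approximation, an assumption made so the sketch stays self-contained; all names are illustrative.

```python
import numpy as np

def fit(score, p0, step=0.1, eps=1e-4, max_iters=100, tol=1e-6):
    """Gradient ascent on the alignment score F(p), starting from p0.
    `score` maps shape parameters to the strong-classifier score."""
    p = p0.astype(float).copy()
    for _ in range(max_iters):
        grad = np.zeros_like(p)
        for i in range(p.size):          # finite-difference dF/dp_i
            d = np.zeros_like(p)
            d[i] = eps
            grad[i] = (score(p + d) - score(p - d)) / (2 * eps)
        if np.linalg.norm(step * grad) < tol:
            break                        # converged: update is negligible
        p += step * grad
    return p
```

With a well-behaved score the loop climbs toward the parameters of maximal positive score, which is exactly the alignment criterion defined earlier.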
Experiments & Results: we used the challenging FERET dataset, which contains frontal images with different variations in pose, race, illumination and expression. (Sample images shown.)
Experiments & Results Training Shape model Appearance model
We trained the shape model (PDM) with 1636 annotated images.
Appearance model training sets:
  TRAINset1: 400 images, 400 positive samples, 4000 negative samples
  TRAINset2: 800 images, 800 positive samples, 8000 negative samples
Negative samples were created for different perturbation ranges {0.8, 1, 1.2} for TRAINset1 and TRAINset2.
Experiments & Results: alignment example (fitting quality measured by RMSE).
Experiments & Results: performance on the test data.
Test sets:
  TESTset1: 300 images, from the training data
  TESTset2: 300 images, unseen data
Experiments & Results: performance of the model trained on TRAINset1.

TESTset1 (columns: train perturbation 0.8 / 1 / 1.2):
  Test perturbation 0.4: 84% / 58% / 54%
  Test perturbation 0.6: 78% / 34% / 37%
  Test perturbation 0.8: 71% / 30% / (missing)
  Test perturbation 1:   56% / 26% / 28%
  Test perturbation 1.2: 50% / 17% / 21%
  Test perturbation 1.4: 35% / 15% / 12%

TESTset2, unseen data (columns: train perturbation 0.8 / 1 / 1.2):
  Test perturbation 0.4: 85% / 57% / 47%
  Test perturbation 0.6: 86% / 45% / 39%
  Test perturbation 0.8: 78% / 37% / 33%
  Test perturbation 1:   69% / 30% / 28%
  Test perturbation 1.2: (missing) / 22% / 23%
  Test perturbation 1.4: (missing) / 19% / 15%
Experiments & Results: performance of the model trained on TRAINset2.

TESTset1 (columns: train perturbation 0.8 / 1 / 1.2):
  Test perturbation 0.4: 68% / 82% / 37%
  Test perturbation 0.6: 58% / (missing) / 26%
  Test perturbation 0.8: 46% / 54% / 18%
  Test perturbation 1:   42% / 40% / 12%
  Test perturbation 1.2: 32% / 33% / 9%
  Test perturbation 1.4: 24% / 22% / 5%

TESTset2, unseen data (columns: train perturbation 0.8 / 1 / 1.2):
  Test perturbation 0.4: 62% / 85% / 35%
  Test perturbation 0.6: 52% / 70% / 25%
  Test perturbation 0.8: 41% / 59% / 20%
  Test perturbation 1:   (missing) / 48% / 17%
  Test perturbation 1.2: (missing) / 30% / 11%
  Test perturbation 1.4: 21% / 24% / 7%
Experiments & Results: test on an image database with different illumination (Yale database). We collected 30 images and generated 5 random initializations per image, giving 150 trials in total.
  Test perturbation: 0.4 / 0.6 / 0.8 / 1 / 1.2 / 1.4
  Converged:         65% / 45% / 30% / 16% / 11% / 9%
Conclusions & Future work
The idea of a discriminative method in AAM seems like a powerful extension to the classical methods. Computational complexity is still quite high. The influence of the amount of perturbation on the training set is not discussed in the literature, but it is very strong. The integration of Procrustes analysis is not mentioned in the papers, even though it could help in building better shape models.
Future work: compare results with classical AAM implementations; test with a very large training database.