Robust Incremental Linear Discriminant Analysis Learning by Autonomous Outlier Detection
Martina Uray, Heinz Mayer (Joanneum Research Graz, Institute of Digital Image Processing)
Horst Bischof (Graz University of Technology, Institute for Computer Graphics and Vision)
2 Overview
- Motivation
- Standard LDA: How is it defined? What is its purpose? Where are the drawbacks?
- Robust Incremental LDA: What is the idea? How is it constructed? Why does it work?
- Results: What type of noise can be handled? How is the classification performance? Does the data itself influence the efficiency?
- Conclusion
3 Motivation
- Goal
  - Find an appropriate description of images
  - Enable recognition of trained objects
- Properties of LDA
  - Stores only image representations; allows the original images to be discarded
  - Enables a simple classification
- Incorporate new information
  - Update instead of a new construction
  - Discard images after addition
- Handle noisy data (occlusions)
4 Development
- Idea
  - Incorporate some reconstructive information
  - Adapt robust PCA approaches
- Deviation
  - Automatic outlier detection
  - Subspace update from inliers only
- Final achievement: a reliable subspace
[Figure: training and test data, non-robust vs. robust results]
5 Linear Discriminant Analysis
LDA is obtained by optimizing the Fisher criterion (equivalently stated as a minimization of its inverse):

    J(W) = (W^T S_b W) / (W^T S_w W)

    S_w = sum_{i=1..c} sum_{x in X_i} (x - mu_i)(x - mu_i)^T   ... within-class scatter
    S_b = sum_{i=1..c} n_i (mu_i - mu)(mu_i - mu)^T            ... between-class scatter
    mu ... data mean,  mu_i ... class mean

This leads to the generalized eigenvalue problem S_b w = lambda S_w w.
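The Fisher criterion above can be solved in a few lines of NumPy. This is a minimal sketch, not the authors' implementation; the function name and the pseudo-inverse shortcut for the generalized eigenvalue problem are illustrative choices:

```python
import numpy as np

def lda_basis(X, y, d):
    """Fisher LDA sketch: build S_w and S_b from data rows X and labels y,
    then take the d leading eigenvectors of pinv(S_w) @ S_b.
    At most c - 1 directions carry discriminative information."""
    mu = X.mean(axis=0)                       # data mean
    p = X.shape[1]
    S_w = np.zeros((p, p))
    S_b = np.zeros((p, p))
    for c in np.unique(y):
        Xc = X[y == c]
        mu_c = Xc.mean(axis=0)                # class mean
        S_w += (Xc - mu_c).T @ (Xc - mu_c)    # within-class scatter
        S_b += len(Xc) * np.outer(mu_c - mu, mu_c - mu)  # between-class scatter
    evals, evecs = np.linalg.eig(np.linalg.pinv(S_w) @ S_b)
    order = np.argsort(-evals.real)
    return evecs[:, order[:d]].real           # p x d projection matrix
```

Projecting two well-separated classes onto the single leading direction keeps them well separated, which is the "good classification" property the next slide contrasts with PCA.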
6 The Augmented Basis I
- LDA: efficiently separates the data -> good classification
- PCA: approximates the data -> good reconstruction
- Combine the reconstructive model and the discriminative classifier
- Embed LDA learning and classification into a PCA framework (Fidler et al., PAMI'06)
- The first k << n principal vectors contain most of the visual variability
7 The Augmented Basis II
- Augment the k-dimensional PCA subspace with c-1 additional basis vectors
- Keep the discriminative information
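One way to read this construction (an assumption on my part, not necessarily the exact recipe of Fidler et al.): keep the components of the c-1 discriminative directions that the truncated PCA basis has lost, orthonormalized against the existing span:

```python
import numpy as np

def augment_basis(U_k, W_lda):
    """Append c-1 extra orthonormal vectors to the truncated PCA basis U_k
    (p x k) so that the LDA directions W_lda (p x (c-1)) lie in the span
    of the augmented basis. Hypothetical helper, not the authors' code."""
    # part of each LDA direction not already representable in span(U_k)
    R = W_lda - U_k @ (U_k.T @ W_lda)
    Q, _ = np.linalg.qr(R)          # orthonormalize the residual directions
    return np.hstack([U_k, Q])      # p x (k + c - 1) augmented basis
```

The augmented basis stays orthonormal, and the discriminative directions become exactly representable, so no discriminative information is lost by the PCA truncation.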
8 Robust Incremental LDA
- Robust learning: the basis for reliable recognition
  - Two types of outliers (due to non-optimal conditions during data acquisition):
    - Global noise: outlying images
    - Local noise: outlying pixels (non-Gaussian noise)
- Incremental learning
  - No recalculation of the model: build representations incrementally (update the representation)
  - Not all images are given in advance: keep only image representations (discard the training images)
9 Outlier Detection I
A) Creation of N hypotheses
  1. Choose a subset of pixels from the image
  2. Iterate (until the error is small):
     - Calculate the aPCA coefficient from the current subset
     - Reconstruct the image
     - Remove the pixels with the largest reconstruction error
  3. Calculate the aPCA coefficient
B) Selection of the best hypothesis
  1. Calculate the reconstruction error for each hypothesis
  2. Count the values exceeding a predefined threshold
  3. Select the hypothesis with the smallest error count
10 Outlier Detection II
C) Iterate (until the error is small):
  1. Reconstruct the image
  2. Remove the pixels with the largest reconstruction error
  3. Recalculate the aPCA coefficient
D) Finalize
  1. Reconstruct the image
  2. Calculate the reconstruction error
  3. Determine outliers using a predefined threshold
  4. Replace the missing pixels
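Putting steps A)-D) together, a hedged sketch in NumPy. All parameter names and defaults are illustrative assumptions, and the least-squares fit stands in for the paper's aPCA coefficient computation:

```python
import numpy as np

def robust_coeffs(x, U, n_hyp=10, subset=0.5, trim=0.1, iters=3, tau=3.0, seed=None):
    """Sketch of the outlier-detection loop: fit subspace coefficients of the
    flattened image x (length p) in basis U (p x m) on random pixel subsets,
    iteratively trimming the worst-reconstructed pixels (A), keep the
    hypothesis with the fewest large residuals (B, C), then flag outlying
    pixels and replace them by their reconstruction (D)."""
    rng = np.random.default_rng(seed)
    p, m = U.shape
    hypotheses = []
    for _ in range(n_hyp):                              # A) N hypotheses
        idx = rng.choice(p, int(subset * p), replace=False)
        for _ in range(iters):
            a, *_ = np.linalg.lstsq(U[idx], x[idx], rcond=None)
            err = np.abs(U[idx] @ a - x[idx])           # reconstruction error
            keep = np.argsort(err)[: max(m, int((1 - trim) * len(idx)))]
            idx = idx[keep]                             # drop the worst pixels
        hypotheses.append(a)
    counts = [np.sum(np.abs(U @ a - x) > tau) for a in hypotheses]
    a = hypotheses[int(np.argmin(counts))]              # B) best hypothesis
    outliers = np.abs(U @ a - x) > tau                  # D) flag outliers
    x_clean = np.where(outliers, U @ a, x)              # replace missing pixels
    return a, outliers, x_clean
```

On an image with a few grossly corrupted pixels, the trimmed fits recover the clean coefficients and the thresholded residual marks exactly the corrupted pixels.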
11 Update I
1. Project the outlier-free image into the current eigenspace
2. The difference between the reconstruction and the input image is orthogonal to the current eigenspace
3. Enlarge the eigenspace with the normalized residual
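Steps 1-3 in NumPy. This is a sketch: mean handling is simplified and the variable names are mine:

```python
import numpy as np

def enlarge_eigenspace(U, mu, x):
    """Steps 1-3: project the outlier-free image x into the current
    eigenspace (basis U, mean mu), form the reconstruction residual
    (orthogonal to span(U)), and append its normalized direction."""
    a = U.T @ (x - mu)            # 1. coefficients in the current eigenspace
    r = (x - mu) - U @ a          # 2. residual, orthogonal to span(U)
    nr = np.linalg.norm(r)
    if nr < 1e-12:                # image already lies in the subspace
        return U
    return np.hstack([U, (r / nr)[:, None]])   # 3. enlarged basis
```

After the update the basis is still orthonormal and the new image is exactly representable in it.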
12 Update II
Update the PCA space:
4. Perform PCA on the actual representation -> new principal axes
5. Project the coefficient vectors into the new eigenspace
6. Rotate the current basis vectors to match the new basis vectors
Perform LDA:
7. Perform LDA on the coefficient vectors in the new eigenspace
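Steps 4-6 operate only on the low-dimensional coefficients, which is what makes the update cheap. A sketch, assuming the coefficient vectors are stored as columns of A:

```python
import numpy as np

def rotate_eigenspace(U, A):
    """Steps 4-6: perform PCA on the coefficient vectors (columns of A),
    project the coefficients into the new eigenspace, and rotate the
    basis vectors to match. LDA (step 7) then runs on the new coefficients."""
    mu_a = A.mean(axis=1, keepdims=True)
    C = (A - mu_a) @ (A - mu_a).T / A.shape[1]   # covariance in coefficient space
    evals, R = np.linalg.eigh(C)
    R = R[:, np.argsort(evals)[::-1]]            # 4. new principal axes
    A_new = R.T @ (A - mu_a)                     # 5. coefficients in new space
    U_new = U @ R                                # 6. rotated basis vectors
    return U_new, A_new
```

Because R is a rotation, the reconstructions are unchanged while the new coefficient covariance becomes diagonal, i.e. the axes are again principal axes.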
13 Experiments
- Aim
  - Autonomously detect wrong pixels
  - Achieve recognition rates approaching those obtained from clean data
  - Compare to the non-robust approach and to known missing pixels (MPL)
- Datasets
  1. Faces: Sheffield Face Database (Graham & Allinson, Face Recognition: From Theory to Applications '98)
  2. Objects: COIL-20 (Nene et al., TR, Columbia University '96)
14 Facts
- Start with a reliable basis
  - COIL-20: 4 images
  - SFD: 2 images
- Test several occlusions: salt-and-pepper, horizontal bar, vertical bar, square
- Test several intensities: white, black, (random) gray
- Parameters
[Figure: occlusion examples — black square, gray vertical bar, salt-and-pepper, white horizontal bar]
15 Results: Salt-and-Pepper
16 Results: Black Square
17 Results: White Horizontal Bar
18 Results: Gray Vertical Bar
19 Conclusion and Future Work
- Augmented PCA subspace combines:
  - Reconstructive information: allows for incremental & robust learning
  - Discriminative information: allows for efficient classification
- Enables:
  - Incremental learning: recognition rates similar to batch learning
  - Outlier detection: reliable occlusion handling (a clear improvement over the non-robust approach)
- Room for improvement:
  - Optimization of parameters
  - Handling labelling noise
20 Thanks for your attention!
21 Maintain Subspace Size
- The model grows by one dimension when adding a new image
- Truncate by one (similar to augmenting the basis)
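Combining the update steps (slides 11-12) with this truncation gives a fixed-size model. A sketch under the same assumptions as before (mean handling omitted, basis sorted by decreasing eigenvalue):

```python
import numpy as np

def update_and_truncate(U, A, x):
    """Fixed-size update sketch: enlarge the eigenspace with the residual of
    the new image x, re-diagonalize in the low-dimensional coefficient space,
    then drop the least-significant direction so the model does not grow.
    PCA is done on uncentered coefficients for brevity."""
    m = U.shape[1]
    a = U.T @ x
    r = x - U @ a
    nr = np.linalg.norm(r)
    U1 = np.hstack([U, (r / nr)[:, None]])             # enlarge by one
    A1 = np.hstack([np.vstack([A, np.zeros((1, A.shape[1]))]),
                    np.append(a, nr)[:, None]])        # old + new coefficients
    C = A1 @ A1.T / A1.shape[1]
    evals, R = np.linalg.eigh(C)
    R = R[:, np.argsort(evals)[::-1]]
    U2, A2 = U1 @ R, R.T @ A1                          # rotate (cf. Update II)
    return U2[:, :m], A2[:m, :]                        # truncate by one
```

The returned basis has the original size and stays orthonormal; only the least-significant principal direction is discarded.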
22 Classification
- Classification is a two-step procedure:
  1. Project the novel image into the PCA space
  2. Project the obtained coefficient vector into the LDA space
- The classification function
  - Remains unchanged
  - Preserves all discriminative information
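The two projection steps can be sketched as follows, with nearest-class-mean assignment in LDA space (the distance rule and all names are illustrative assumptions, not necessarily the paper's classifier):

```python
import numpy as np

def classify(x, mu, U_apca, W_lda, class_means):
    """Two-step classification sketch: project a novel image into the
    (augmented) PCA space, then into the LDA space, and assign the nearest
    class mean there. class_means holds one row per class, in LDA space."""
    a = U_apca.T @ (x - mu)        # 1. coefficients in the (augmented) PCA space
    z = W_lda.T @ a                # 2. coefficients in the LDA space
    d = np.linalg.norm(class_means - z, axis=1)
    return int(np.argmin(d))
```

Since only low-dimensional vectors are touched at test time, the classifier is cheap regardless of image size.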