Meeting 8: Features for Object Classification (Ullman et al.)
PSYC 6130A, Prof. J. Elder
Visual Features of Intermediate Complexity (Nature Neuroscience 2002)
Classifying Objects
Features: Image Fragments
Feature Detection
Features are detected by comparing each fragment to the image using a weighted sum of gray-level gradient and orientation differences. The detection threshold is selected to maximize mutual information with the class (see the sketch below).
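A minimal sketch of the threshold-selection step, assuming a fragment's raw detection scores have already been computed for a set of labeled training images; the scoring itself, the variable names, and the candidate-threshold grid are illustrative assumptions rather than the authors' code:

```python
import numpy as np

def mutual_information(detected, labels):
    """Mutual information I(C;F) between a binary detection variable F and a
    binary class variable C, estimated from co-occurrence frequencies."""
    mi = 0.0
    for f in (0, 1):
        for c in (0, 1):
            p_fc = np.mean((detected == f) & (labels == c))
            p_f, p_c = np.mean(detected == f), np.mean(labels == c)
            if p_fc > 0:
                mi += p_fc * np.log2(p_fc / (p_f * p_c))
    return mi

def select_threshold(scores, labels, candidate_thresholds):
    """Pick the detection threshold that maximizes I(C;F), where F = 1 iff the
    fragment's similarity score exceeds the threshold."""
    best_t, best_mi = None, -np.inf
    for t in candidate_thresholds:
        mi = mutual_information((scores > t).astype(int), labels)
        if mi > best_mi:
            best_t, best_mi = t, mi
    return best_t, best_mi
```

With scores and labels as equal-length NumPy arrays, something like select_threshold(scores, labels, np.linspace(scores.min(), scores.max(), 100)) returns the MI-maximizing threshold for that fragment.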
Selection of Features
How should image features be selected? Idea: maximize the mutual information between the feature (image fragment) F and the object class C.
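Written out for binary variables, with C the class (object present or absent) and F the fragment (detected or not), the quantity being maximized is the standard mutual information:

```latex
I(C;F) \;=\; \sum_{c\in\{0,1\}} \sum_{f\in\{0,1\}} P(C{=}c, F{=}f)\,
\log\frac{P(C{=}c, F{=}f)}{P(C{=}c)\,P(F{=}f)}
\;=\; H(C) - H(C \mid F)
```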
Selection of Features
The first rectangular feature maximizes the mutual information between class and feature, over all locations, sizes, shapes, and resolutions. Feature i+1 maximizes the minimum gain in mutual information when added pairwise to each of the i features already selected (see the sketch below).
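A sketch of that greedy max-min selection, reusing the mutual_information helper from the earlier sketch; the binary detection matrix, the pairwise_mi routine, and the fixed feature budget are illustrative assumptions:

```python
import numpy as np
from itertools import product

def pairwise_mi(f_pair, labels):
    """Mutual information between the class and the joint response of two
    binary fragments (the pair is treated as one variable with four states)."""
    joint = f_pair[:, 0] * 2 + f_pair[:, 1]
    mi = 0.0
    for j, c in product(range(4), (0, 1)):
        p_jc = np.mean((joint == j) & (labels == c))
        p_j, p_c = np.mean(joint == j), np.mean(labels == c)
        if p_jc > 0:
            mi += p_jc * np.log2(p_jc / (p_j * p_c))
    return mi

def greedy_select(detections, labels, n_features):
    """detections: (n_images, n_candidates) binary matrix of fragment responses.
    The first fragment maximizes I(C;F) on its own; each later fragment
    maximizes the minimum pairwise gain over the fragments already selected."""
    single_mi = [mutual_information(detections[:, k], labels)
                 for k in range(detections.shape[1])]
    selected = [int(np.argmax(single_mi))]
    while len(selected) < n_features:
        best_k, best_gain = None, -np.inf
        for k in range(detections.shape[1]):
            if k in selected:
                continue
            gain = min(pairwise_mi(detections[:, [i, k]], labels) - single_mi[i]
                       for i in selected)
            if gain > best_gain:
                best_k, best_gain = k, gain
        selected.append(best_k)
    return selected
```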
Selected Features
Hierarchical Representation of Features
Features of Intermediate Size are Optimal
Why are Intermediate Features Optimal?
Tradeoff:
– Small features are not specific enough (they may occur frequently outside the class).
– Large features do not generalize (they do not occur frequently enough in novel in-class images).
Intermediate Resolutions are Optimal (For Faces)
Reconstruction
Classification Algorithm
Multiple features are combined to classify novel images, based on a weighted sum of binary feature responses. Features are grouped according to the region of the object they cover (e.g., regions of a face).
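One plausible reading of this rule, sketched under the assumption that fragments covering the same object region are first combined by a max (any member of the group may respond) and that the group responses are then combined by a weighted sum against a threshold; the grouping, weights, and threshold here are illustrative:

```python
def classify(responses, groups, weights, threshold):
    """responses: dict fragment_id -> 0/1 detection in the novel image.
    groups: one list of fragment ids per object region (e.g. eye, mouth, hairline).
    Returns True if the weighted sum of group responses exceeds the threshold."""
    group_responses = [max(responses[f] for f in group) for group in groups]
    return sum(w * r for w, r in zip(weights, group_responses)) >= threshold
```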
Hits and Misses
Some Interesting False Positives
Multiscale Overlapping Features Reduce Responses to Scrambled Objects
Feature Hierarchies (ICCV 2005)
Idea: generate feature hierarchies by the same process of maximizing mutual information for classification.
– Variations in the configuration of sub-parts are learned from training data.
– Negative examples for each level of the hierarchy are derived from false positives at the higher level.
– Additional features are selected for each sub-image until the gain in information falls below a lower limit, up to a maximum of 10 features (see the sketch below).
– If the decomposition at the previous level yielded an information gain, the process is repeated on the resulting sub-parts.
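A structural sketch of that recursion; the fragment-selection, cropping, and false-positive-mining steps are supplied by the caller (select_subfragments, crop, and mine_false_positives are placeholder names, not routines from the paper):

```python
def build_hierarchy(fragment, positives, negatives, select_subfragments,
                    mine_false_positives, crop, min_gain, max_features=10):
    """Recursively decompose a fragment into sub-fragments.
    select_subfragments(fragment, positives, negatives, min_gain, max_features)
    is expected to return (subfragments, information_gain), e.g. the greedy
    MI-based selection sketched earlier; crop and mine_false_positives are
    likewise caller-supplied placeholders."""
    node = {"fragment": fragment, "children": []}
    subfragments, gain = select_subfragments(fragment, positives, negatives,
                                             min_gain, max_features)
    if gain < min_gain:
        return node  # decomposition added no information: keep the fragment atomic
    for sub in subfragments:
        sub_pos = [crop(img, sub) for img in positives]
        # Negative examples for the next level are this level's false positives.
        sub_neg = mine_false_positives(sub, negatives)
        node["children"].append(
            build_hierarchy(sub, sub_pos, sub_neg, select_subfragments,
                            mine_false_positives, crop, min_gain, max_features))
    return node
```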
Feature Hierarchy
Atomic Fragments
Only the atomic fragments need to be correlated against the input image; responses at higher levels of the hierarchy are determined by weighted sums of lower-level responses.
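A minimal sketch of that bottom-up evaluation over node dictionaries like those built above, assuming each internal node stores one weight per child (node["weights"]) and that atom_response(fragment, image) is the caller's atomic detector (e.g. the gradient/orientation comparison from the detection slide):

```python
def hierarchy_response(node, image, atom_response):
    """Only leaves (atomic fragments) are compared against the image; every
    internal node's response is a weighted sum of its children's responses."""
    if not node["children"]:
        return atom_response(node["fragment"], image)
    return sum(w * hierarchy_response(child, image, atom_response)
               for w, child in zip(node["weights"], node["children"]))
```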
Optimizing Region of Interest
Optimizing Parameters
Combination weights and feature positions are optimized in alternating fashion (see the skeleton below):
– Positions (within feature ROIs) are optimized top-down, using dynamic programming.
– Weights are optimized using back-propagation.
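A skeleton of that alternation, with the two sub-steps passed in as callables; optimize_positions_dp and optimize_weights_backprop are placeholder names for the dynamic-programming and back-propagation steps named on the slide, not implementations of them:

```python
def alternate_optimize(hierarchy, images, labels,
                       optimize_positions_dp, optimize_weights_backprop,
                       n_rounds=5):
    """Each round holds the weights fixed while re-optimizing fragment positions
    within their ROIs (top-down dynamic programming), then holds the positions
    fixed while re-optimizing the combination weights (back-propagation)."""
    positions, weights = None, None
    for _ in range(n_rounds):
        positions = optimize_positions_dp(hierarchy, images, weights)
        weights = optimize_weights_backprop(hierarchy, images, labels, positions)
    return positions, weights
```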
Feature Example
Classification Results