Hybrid Classifiers for Object Classification with a Rich Background
M. Osadchy, D. Keren, and B. Fadida-Specktor, ECCV 2012
Computer Vision and Video Analysis: an international workshop in honor of Prof. Shmuel Peleg, The Hebrew University of Jerusalem, October 21, 2012

In a nutshell… One-against-all classification: the positive class is, say, cars; the negative class is everything else, i.e., the (rich) background. SVM and similar discriminative methods require samples from both classes (and one-class SVM is too simple to work here), but it is hard to sample representatively from the huge background class. Proposed solution: represent the background by a distribution, and construct a “hybrid” classifier that separates the positive samples from the background distribution.

Class Diversity in Natural Images

Previous Work. Standard remedies for class imbalance:
1. Cost-sensitive methods (e.g., weighted SVM); a minimal example is sketched below.
2. Undersampling the majority class.
3. Oversampling the minority class.
4. …
Alas, these methods do not solve the complexity issue. Work on making large-scale SVM training efficient includes: linear SVM (Joachims, 2006); PEGASOS (Shalev-Shwartz et al., 2007); kernel-matrix approximation (Keerthi et al., 2006; Joachims et al., 2009); special kernel forms (Maji et al., 2008; Perronnin et al., 2010); and Discriminative Decorrelation for Clustering and Classification (Hariharan et al., 2012).
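For concreteness, a cost-sensitive baseline of the first kind can be set up in a few lines; this is a minimal sketch using scikit-learn's class_weight option, with synthetic data standing in for an imbalanced one-against-all problem:

    # Weighted SVM on an imbalanced problem: errors on the rare positive
    # class are up-weighted so it is not swamped by the background.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X_pos = rng.normal(loc=2.0, size=(30, 50))     # few positive samples
    X_neg = rng.normal(loc=0.0, size=(3000, 50))   # many background samples
    X = np.vstack([X_pos, X_neg])
    y = np.r_[np.ones(30), -np.ones(3000)]

    # 'balanced' sets class weights inversely proportional to class frequency.
    clf = SVC(kernel="linear", class_weight="balanced")
    clf.fit(X, y)

Note that training time still grows with the number of background samples, which is exactly the complexity issue the hybrid approach avoids.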

M. Osadchy & D. Keren (CVPR 2006): instead of penalizing misclassified background samples one by one, minimize the overall probability volume of the background prior inside the acceptance region of the object class. Consequences: no negative samples, fewer constraints in the optimization, and no negative support vectors; the background is modeled just once, which is very useful if you want many one-against-all classifiers. A sketch of the resulting formulation follows.
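In symbols, the idea replaces the many per-sample negative constraints of a standard SVM with a single probabilistic one. A sketch consistent with the slide's description, with soft margins omitted and \varepsilon a user-chosen tolerance on background leakage:

    \min_{w,b}\ \tfrac{1}{2}\|w\|^2
    \quad \text{s.t.}\quad w^\top x_i + b \ge 1 \ \text{ for every positive sample } x_i,
    \qquad \Pr_{x \sim \text{background}}\!\left(w^\top x + b \ge 0\right) \le \varepsilon.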

M. Osadchy & D. Keren (CVPR 2006), cont. “Hybrid SVM”: positive samples, negative prior.

Problem formulation. A “Boltzmann” prior characterizes grey-level features: a Gaussian, smoothness-based probability over natural images. It yields an expression for the probability that w^T x + b >= 0 for a natural image x, weight vector w, and scalar b, giving ONE constraint on the probability instead of many constraints on negative samples.
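For any Gaussian background model with mean \mu and covariance \Sigma (the smoothness prior above is one such model), the projection w^\top x + b is itself a one-dimensional Gaussian, so the probability has a closed form:

    \Pr\!\left(w^\top x + b \ge 0\right)
      = \Phi\!\left(\frac{w^\top \mu + b}{\sqrt{w^\top \Sigma\, w}}\right),

where \Phi is the standard normal CDF.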

Contributions of Current Work: move from grey levels to SIFT features, and kernelize the approach. The result is a kernel hybrid classifier that is more efficient than kernel SVM, without compromising accuracy.

To separate the positive samples from the background, we must first model the background. Problem: the background distribution is known to be extremely complicated. BUT classification is done post-projection! So the real question is: what do projections of natural images look like? Under certain independence conditions, low-dimensional projections of high-dimensional data are close to Gaussian, and experiments show that projections of SIFT BoW vectors (with the histogram-intersection kernel) are indeed Gaussian-like.
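This effect is easy to reproduce numerically. A sketch with synthetic BoW-like histograms (not the paper's SIFT data): even though the inputs are sparse and highly non-Gaussian, a random projection of them is already close to normal.

    # Project high-dimensional, non-Gaussian "BoW-like" vectors onto a
    # random direction and check how Gaussian the projection looks.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    X = rng.exponential(scale=1.0, size=(5000, 1000))   # non-negative counts
    X *= rng.random(size=X.shape) < 0.1                 # sparsify
    X /= X.sum(axis=1, keepdims=True) + 1e-12           # L1-normalize histograms

    w = rng.normal(size=1000)                           # random projection
    proj = X @ w

    # Both statistics are near 0 for a Gaussian.
    print("skewness:", stats.skew(proj))
    print("excess kurtosis:", stats.kurtosis(proj))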

Linear Classifier - Probability Constraint. The Gaussian model of the projections shows a good correspondence with reality, so the probability constraint can be evaluated in closed form from the background mean and covariance.
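Bounding the closed-form probability by \varepsilon turns the single probabilistic constraint into a deterministic one (a standard rearrangement, assuming \varepsilon < 1/2):

    \Phi\!\left(\frac{w^\top \mu + b}{\sqrt{w^\top \Sigma\, w}}\right) \le \varepsilon
    \quad\Longleftrightarrow\quad
    w^\top \mu + b \le \Phi^{-1}(\varepsilon)\,\sqrt{w^\top \Sigma\, w}.

Since \Phi^{-1}(\varepsilon) < 0 for \varepsilon < 1/2, the constraint pushes the background mean well into the rejection half-space, by a margin measured in background standard deviations along w.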

Hybrid Kernel Classifier
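A generic route to such a kernel classifier, sketched here as the standard representer-style construction (not necessarily the paper's exact derivation): write w in the span of the mapped positive samples,

    w = \sum_i \alpha_i\, \phi(x_i), \qquad
    w^\top \phi(x) + b = \sum_i \alpha_i\, k(x_i, x) + b,

so the margin constraints, and the feature-space mean and covariance terms w^\top \mu_\phi and w^\top \Sigma_\phi\, w in the probability constraint, all reduce to kernel evaluations against the training set and the background sample used for estimation.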

Experiments
- Caltech256 dataset.
- SIFT BoW with a 1000-word vocabulary; SPM kernel.
- Performance of the linear and kernel hybrid classifiers was compared to linear and kernel SVMs and their weighted versions.
- 30 positive samples, plus 1280 samples for covariance matrix and mean estimation; the SVMs used 7650 samples.
- EER for binary classification was computed with 25 samples from each class: predict the absence/presence of a specific class in the test image (a sketch of the EER computation follows).
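Equal error rate (EER) is the operating point where the false-positive and false-negative rates coincide. A minimal sketch of computing it from classifier scores; the function name and toy scores are illustrative:

    import numpy as np

    def equal_error_rate(scores, labels):
        # scores: real-valued classifier outputs; labels: +1 / -1.
        scores = np.asarray(scores, dtype=float)
        labels = np.asarray(labels)
        best_eer, best_gap = 1.0, np.inf
        for t in np.unique(scores):
            pred = scores >= t                  # accept if score clears threshold
            fpr = np.mean(pred[labels == -1])   # background wrongly accepted
            fnr = np.mean(~pred[labels == +1])  # positives wrongly rejected
            if abs(fpr - fnr) < best_gap:
                best_gap, best_eer = abs(fpr - fnr), (fpr + fnr) / 2
        return best_eer

    # Toy usage with 25 samples per class, mirroring the protocol above.
    rng = np.random.default_rng(0)
    scores = np.r_[rng.normal(1.0, 1.0, 25), rng.normal(-1.0, 1.0, 25)]
    labels = np.r_[np.ones(25), -np.ones(25)]
    print(equal_error_rate(scores, labels))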

Results (EER):

              SVM      Weighted SVM   Hybrid
    Linear    71%      73.9%          73.8%
    Kernel    83.4%    83.6%          84%

The hybrid classifier was also compared with weighted SVM on the number of kernel evaluations, the number of parameters in the optimization, and the number of constraints in the optimization, as well as memory usage: 450M for weighted SVM versus 4.5M for the hybrid.