Using Image Priors in Maximum Margin Classifiers
Tali Brayer, Margarita Osadchy, Daniel Keren

Object Detection. Problem: locate instances of an object category in a given image. This is an asymmetric classification problem. Background vs. object (category):
- Size: the background is very large, the object category is relatively small.
- Complexity: the background is complex (thousands of categories), the object is simple (a single category).
- Prior: the background has a large prior of appearing in an image, the object a small prior.
- Data: background examples are easy to collect (but not easy to learn from examples), object examples are hard to collect.

Intuition: denote by H the acceptance region of a classifier. We propose to minimize Pr(all images) in H, which is essentially Pr(background) in H, except for the object samples. We have a prior on the distribution of all natural images. [Figure: the set of all images, with the background and the object class shown inside it.]
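The slide's idea can be written as a small optimization problem; this is a hedged formalization in my own notation (H, p, and the form of the constraint are not spelled out on the slide):

```latex
% Hedged formalization of the slide's intuition (notation is mine).
% H = \{x : f(x) > 0\} is the acceptance region; p(x) is the natural-image prior.
\begin{aligned}
\min_{f}\quad & \Pr_{x \sim p}\big[x \in H\big] \;=\; \int_{H} p(x)\,dx \\
\text{s.t.}\quad & f(x_i) > 0 \quad \text{for every object sample } x_i .
\end{aligned}
```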

Other work combines a small labeled training set with a large unlabeled set (semi-supervised learning): EM with generative mixture models, Fisher kernels, self-training, co-training, transductive SVMs, and graph-based methods. These are all suited to the symmetric case, but here we have more information: the marginal distribution of the background.

Distribution of Natural Images ("Boltzmann-like"): the larger an image's smoothness measure (i.e., the less smooth the image), the lower its probability. The measure can be written in the frequency domain.
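A hedged reconstruction of the prior the slide most likely refers to, following the smoothness-based Boltzmann model used in the authors' earlier anti-faces work; the exact constants and exponent are an assumption on my part:

```latex
% Boltzmann-like prior: probability decays with the (non-)smoothness energy E(I).
% \hat{I}(k) are the Fourier coefficients of image I; C is a normalizing constant.
p(I) \;=\; C\,\exp\!\big(-E(I)\big),
\qquad
E(I) \;=\; \sum_{k}\,\lVert k\rVert^{2}\,\big|\hat{I}(k)\big|^{2}
\;\;\big(=\lVert \nabla I\rVert^{2}\ \text{by Parseval}\big).
```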

Linear SVM finds the maximal-margin separator between Class 1 and Class 2. With enough training data the learned boundary is well placed; with not enough training data it may not be.
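For reference, the standard maximal-margin (hard-margin) linear SVM that the slide illustrates; this is the textbook formulation, not copied from the slide:

```latex
\begin{aligned}
\min_{w,\,b}\quad & \tfrac{1}{2}\lVert w\rVert^{2} \\
\text{s.t.}\quad & y_i\,(w^\top x_i + b) \ge 1,\qquad i = 1,\dots,n,
\end{aligned}
```

where y_i in {+1, -1} labels Class 1 and Class 2; the geometric margin 2/||w|| is maximized.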

Linear SVM. [Figure: decision boundary separating Class 1 and Class 2.]

MM classifier with prior. [Figure: the maximum-margin decision boundary between Class 1 and Class 2 when the image prior is taken into account.]

Minimize the probability of natural images over H; after some manipulations it reduces to the quantity Q.
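A hedged guess at what this reduction looks like: if the Boltzmann-like prior is treated as a zero-mean Gaussian with covariance Sigma, then the linear score w^T x + b is one-dimensional Gaussian, so the prior mass of the half-space H has a closed form (the slide's exact expression for Q may differ):

```latex
% Mass of a zero-mean Gaussian prior (covariance \Sigma) in the half-space
% H = \{x : w^\top x + b > 0\}; \Phi is the standard normal CDF.
\Pr_{x \sim \mathcal{N}(0,\,\Sigma)}\!\big[w^\top x + b > 0\big]
 \;=\; \Phi\!\left(\frac{b}{\sqrt{\,w^\top \Sigma\, w\,}}\right).
```

Making this probability small then amounts to controlling the quadratic form w^T Sigma w relative to b.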

[Plot: for random w with unit norm and random b drawn from [-0.5, 0.5], the percentage of natural images with wx + b > 0 against the corresponding integral of the prior over the positive half-space.]
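A minimal sketch of the sanity check this plot describes, assuming the Gaussian form above; the image data and covariance here are synthetic stand-ins, not the datasets used in the paper:

```python
# Hedged sketch: for random hyperplanes, compare the fraction of images falling
# in the positive half-space with the probability predicted by the prior.
# `images` and `Sigma` are synthetic placeholders, not the authors' data.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
images = rng.standard_normal((10_000, 256))   # stand-in for vectorized natural images
Sigma = np.cov(images, rowvar=False)          # stand-in for the prior covariance

for _ in range(5):
    w = rng.standard_normal(images.shape[1])
    w /= np.linalg.norm(w)                    # random direction with unit norm
    b = rng.uniform(-0.5, 0.5)                # random offset, as on the slide

    empirical = np.mean(images @ w + b > 0)                    # fraction with w.x + b > 0
    predicted = norm.cdf(b / np.sqrt(w @ Sigma @ w))           # Gaussian mass in the half-space
    print(f"empirical={empirical:.3f}  predicted={predicted:.3f}")
```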

Training Algorithm. Probability constraint (with δ → 0).
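A hedged sketch of the resulting training problem, in my own notation; the slide's exact formulation, and how δ enters it, may differ. It is a soft-margin SVM over object and background samples with an added constraint bounding the prior mass of the acceptance region:

```latex
\begin{aligned}
\min_{w,\,b,\,\xi}\quad & \tfrac{1}{2}\lVert w\rVert^{2} + C\sum_i \xi_i \\
\text{s.t.}\quad & y_i\,(w^\top x_i + b) \ge 1 - \xi_i,\qquad \xi_i \ge 0, \\
                 & \Pr_{x \sim p}\!\big[w^\top x + b > 0\big] \;\le\; \delta,
                 \qquad \delta \to 0 .
\end{aligned}
```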

The probability constraint can be rewritten as a convex constraint.
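Under the Gaussian reading above, the probability constraint becomes a second-order cone constraint, so the whole problem stays convex. A minimal sketch with cvxpy, using synthetic stand-in data and an identity prior covariance (all names here are illustrative, not the authors' code):

```python
# Hedged sketch: soft-margin SVM with the chance constraint Pr[w.x + b > 0] <= delta
# rewritten as a second-order cone constraint, making the training problem convex.
import cvxpy as cp
import numpy as np
from scipy.stats import norm

d = 64
rng = np.random.default_rng(0)
X_pos = rng.standard_normal((20, d)) + 1.0    # stand-in object samples
X_neg = rng.standard_normal((200, d)) - 1.0   # stand-in background samples
Sigma_sqrt = np.eye(d)                        # stand-in square root of the prior covariance
delta, C = 1e-3, 1.0
kappa = -norm.ppf(delta)                      # positive, since delta < 0.5

w, b = cp.Variable(d), cp.Variable()
xi_p = cp.Variable(X_pos.shape[0], nonneg=True)
xi_n = cp.Variable(X_neg.shape[0], nonneg=True)

constraints = [
    X_pos @ w + b >= 1 - xi_p,                     # object samples on the positive side
    -(X_neg @ w + b) >= 1 - xi_n,                  # background samples on the negative side
    -b >= kappa * cp.norm(Sigma_sqrt @ w, 2),      # convex (SOC) form of the probability constraint
]
objective = cp.Minimize(0.5 * cp.sum_squares(w) + C * (cp.sum(xi_p) + cp.sum(xi_n)))
cp.Problem(objective, constraints).solve()
print("||w|| =", np.linalg.norm(w.value))
```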

Results. Tested categories: cars (side view, UIUC dataset) and faces (CBCL dataset). Training: 5/10/20/60/(all available) object images, plus all available background images. Test sets: faces, 472 face and 23,573 background images; cars, 299 car and 10,111 background images. We ran 50 trials for each set with different random choices of training data. A weighted SVM was used to deal with the asymmetry in class sizes.
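The class-imbalance handling mentioned here can be reproduced with standard tooling; a minimal sketch of a weighted SVM baseline using scikit-learn's class weights, on placeholder arrays rather than the actual UIUC/CBCL data:

```python
# Minimal sketch of a weighted SVM for the asymmetric setting: far more
# background (-1) than object (+1) examples. The arrays are placeholders.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_train = np.vstack([rng.standard_normal((10, 32)) + 1.0,    # few object samples
                     rng.standard_normal((500, 32)) - 1.0])  # many background samples
y_train = np.concatenate([np.ones(10), -np.ones(500)])

# class_weight="balanced" reweights errors inversely to class frequency,
# one common way to implement a "weighted SVM".
clf = SVC(kernel="linear", C=1.0, class_weight="balanced")
clf.fit(X_train, y_train)
print("training accuracy:", clf.score(X_train, y_train))
```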

Average recognition rate (%), faces. [Table: rows are Weighted Linear SVM, Weighted Kernel SVM, MM_prior Linear, MM_prior Kernel; columns are 5, 10, 60, and all training images.]

Average recognition rate (%), cars. [Table: rows are Weighted Linear SVM, Weighted Kernel SVM, MM_prior Linear, MM_prior Kernel; columns are 5, 10, 60, and all training images.]

Future Work: video; exploring additional and more robust features; refining the priors (using background examples); kernelization.