Visual Tracking with Online Multiple Instance Learning. Boris Babenko, Ming-Hsuan Yang, Serge Belongie.

Visual Tracking with Online Multiple Instance Learning
Boris Babenko 1, Ming-Hsuan Yang 2, Serge Belongie 1
1 University of California, San Diego, USA; 2 University of California, Merced, USA

Outline: Introduction; Multiple Instance Learning; Online Multiple Instance Boosting; Tracking with Online MIL; Experiments; Conclusions

First frame is labeled

Start with an online classifier (e.g., Online AdaBoost).

Grab one positive patch and some negative patches, and train/update the classifier.

Get the next frame.

Evaluate the classifier inside a search window around the old location.

Find the location of maximum classifier response; this becomes the new location.

Repeat on each new frame…
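The tracking-by-detection loop described on the slides above can be sketched as follows. `score`, `track_step`, and the toy example are illustrative placeholders, not the authors' code: `score` stands in for the classifier's response to the patch at a given location.

```python
def track_step(score, frame, old_loc, search_radius=25):
    """Evaluate the classifier at every offset inside the search
    window around old_loc and return the location of maximum response."""
    best_loc, best_score = old_loc, float("-inf")
    ox, oy = old_loc
    for dx in range(-search_radius, search_radius + 1):
        for dy in range(-search_radius, search_radius + 1):
            loc = (ox + dx, oy + dy)
            s = score(frame, loc)  # classifier response for the patch at loc
            if s > best_score:
                best_loc, best_score = loc, s
    return best_loc

# Toy classifier whose response peaks at the true object location (7, -3).
score = lambda frame, loc: -((loc[0] - 7) ** 2 + (loc[1] + 3) ** 2)
print(track_step(score, frame=None, old_loc=(0, 0)))  # (7, -3)
```

A real tracker would then update the classifier with patches sampled around the new location before moving to the next frame.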

Outline: Introduction; Multiple Instance Learning; Online Multiple Instance Boosting; Tracking with Online MIL; Experiments; Conclusions

What if the classifier is a bit off? The tracker starts to drift. How should we choose training examples?

Idea: replace the online classifier with an online MIL classifier.

Ambiguity in the training data: instead of instance/label pairs, we get bags of instances with bag labels. A bag is positive if one or more of its members is positive.

Problem: labeling with rectangles is inherently ambiguous, and labeling is sloppy.

Solution: take all of these patches and put them into a positive bag. At least one patch in the bag is “correct”.
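The bag construction can be sketched as sampling every patch location within a small radius of the current tracker location into the positive bag, with negatives drawn from an annulus farther out. The radii below are illustrative assumptions, not the paper's exact settings:

```python
import itertools

def make_bags(loc, pos_radius=4, neg_inner=8, neg_outer=10):
    """Positive bag: every patch location within pos_radius of the tracker
    location. Negatives: locations in an annulus farther away."""
    x, y = loc
    pos_bag, negatives = [], []
    for dx, dy in itertools.product(range(-neg_outer, neg_outer + 1), repeat=2):
        d2 = dx * dx + dy * dy
        if d2 <= pos_radius ** 2:
            pos_bag.append((x + dx, y + dy))
        elif neg_inner ** 2 <= d2 <= neg_outer ** 2:
            negatives.append((x + dx, y + dy))
    return pos_bag, negatives

pos_bag, negatives = make_bags((100, 50))
print(len(pos_bag), len(negatives))  # patches in the positive bag vs. negatives
```

Even if some patches in the bag are off-center, the bag label stays correct as long as one of them covers the object.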

Supervised learning training input: instance/label pairs. MIL training input: bags of instances with bag labels.

A positive bag contains at least one positive instance. Goal: learn an instance classifier, which has the same form as in standard supervised learning.

Outline: Introduction; Multiple Instance Learning; Online Multiple Instance Boosting; Tracking with Online MIL; Experiments; Conclusions

We need an online MIL algorithm; combine ideas from MILBoost and Online Boosting.

Train a classifier of the form H(x) = sum_k alpha_k h_k(x), where h_k is a weak classifier. Binary predictions can be made using sign(H(x)).
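A minimal sketch of this additive form, using decision stumps as the weak learners (the stumps and weights here are toy assumptions, not the trained model):

```python
def strong_response(x, weak_learners):
    """H(x) = sum_k alpha_k * h_k(x); each h_k returns -1 or +1 here."""
    return sum(alpha * h(x) for alpha, h in weak_learners)

def predict(x, weak_learners):
    """Binary prediction: sign(H(x))."""
    return 1 if strong_response(x, weak_learners) > 0 else -1

# Toy stumps thresholding individual feature dimensions (illustrative only).
def stump(dim, thresh):
    return lambda x: 1 if x[dim] > thresh else -1

weak_learners = [(0.7, stump(0, 0.5)), (0.3, stump(1, -1.0))]
print(predict((0.9, 0.0), weak_learners))   # H = 0.7 + 0.3 = 1.0 -> 1
print(predict((0.1, -2.0), weak_learners))  # H = -0.7 - 0.3 = -1.0 -> -1
```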

Objective to maximize: the log-likelihood of the bags, log L = sum_i [ y_i log p_i + (1 - y_i) log(1 - p_i) ], where the bag probability p_i is given by the Noisy-OR model p_i = 1 - prod_j (1 - p(y = 1 | x_ij)), and the instance probability is p(y = 1 | x) = sigma(H(x)), a sigmoid of the classifier response (as in LogitBoost).
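The Noisy-OR bag likelihood can be written out directly; a small sketch (variable names are mine, and instances are represented by their classifier scores H(x)):

```python
import math

def sigmoid(h):
    return 1.0 / (1.0 + math.exp(-h))

def bag_probability(instance_scores):
    """Noisy-OR: p = 1 - prod_j (1 - sigmoid(H(x_ij)))."""
    p_none = 1.0
    for h in instance_scores:
        p_none *= 1.0 - sigmoid(h)
    return 1.0 - p_none

def log_likelihood(bags):
    """bags: list of (label, instance_scores); sums y*log(p) + (1-y)*log(1-p)."""
    ll = 0.0
    for y, scores in bags:
        p = bag_probability(scores)
        p = min(max(p, 1e-12), 1.0 - 1e-12)  # guard against log(0)
        ll += math.log(p) if y == 1 else math.log(1.0 - p)
    return ll

# One confident positive instance makes the whole bag confidently positive.
print(bag_probability([5.0, -5.0, -5.0]))  # close to 1
```

Note how the Noisy-OR lets a single well-classified patch explain a positive bag, which is exactly what the sloppy-labeling setup needs.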

Train weak classifiers in a greedy fashion. Batch MILBoost can be optimized using functional gradient descent; we need an online version…

At all times, keep a pool of weak classifier candidates

At time t, get more training data, update all candidate classifiers, and pick the best K in a greedy fashion.

From frame t to frame t+1: get data (bags), update all classifiers in the pool, and greedily add the best K to the strong classifier.
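The greedy selection step can be sketched as follows. This is a simplified stand-in, not the paper's implementation: weak classifiers are plain score functions, the pool and bags are toy one-dimensional examples, and the criterion is the Noisy-OR bag log-likelihood.

```python
import math

def sigmoid(h):
    return 1.0 / (1.0 + math.exp(-h))

def bag_ll(weaks, bags):
    """Bag log-likelihood under Noisy-OR for a set of weak classifiers."""
    ll = 0.0
    for y, instances in bags:
        p_none = 1.0
        for x in instances:
            p_none *= 1.0 - sigmoid(sum(h(x) for h in weaks))
        p = min(max(1.0 - p_none, 1e-12), 1.0 - 1e-12)
        ll += math.log(p) if y == 1 else math.log(1.0 - p)
    return ll

def greedy_select(pool, bags, K):
    """Greedily pick K weak classifiers: at each step add the candidate
    that most increases the bag log-likelihood of the classifier so far."""
    chosen, remaining = [], list(pool)
    for _ in range(K):
        best = max(remaining, key=lambda h: bag_ll(chosen + [h], bags))
        chosen.append(best)
        remaining.remove(best)
    return chosen

# Toy pool of 1-D weak classifiers and one positive / one negative bag.
pool = [lambda x: x, lambda x: -x, lambda x: 0.0]
bags = [(1, [2.0]), (0, [-2.0])]  # (bag label, instance features)
print(greedy_select(pool, bags, K=1)[0](3.0))  # identity learner wins: 3.0
```

In the online setting each pool member is also updated with the new bags before selection, so the chosen K can change from frame to frame.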

Outline: Introduction; Multiple Instance Learning; Online Multiple Instance Boosting; Tracking with Online MIL; Experiments; Conclusions

MILTrack = Online MILBoost + Stumps for weak classifiers + Randomized Haar features + greedy local search
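The randomized Haar features can be sketched as sums of a few random rectangles with random weights, computed in constant time from an integral image. The rectangle count and weight range below are illustrative guesses, not the paper's exact parameters:

```python
import random
import numpy as np

def integral_image(img):
    """Cumulative sums over both axes, so any rectangle sum is O(1)."""
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, x0, y0, x1, y1):
    """Sum of img[y0:y1, x0:x1] from the integral image (exclusive ends)."""
    total = ii[y1 - 1, x1 - 1]
    if x0 > 0:
        total -= ii[y1 - 1, x0 - 1]
    if y0 > 0:
        total -= ii[y0 - 1, x1 - 1]
    if x0 > 0 and y0 > 0:
        total += ii[y0 - 1, x0 - 1]
    return total

def random_haar(w, h, n_rects=3, rng=random):
    """A randomized Haar-like feature: a few random rectangles inside a
    w-by-h patch, each weighted by a random value in [-1, 1]."""
    rects = []
    for _ in range(n_rects):
        x0, y0 = rng.randrange(w - 1), rng.randrange(h - 1)
        x1, y1 = rng.randrange(x0 + 1, w + 1), rng.randrange(y0 + 1, h + 1)
        rects.append((rng.uniform(-1.0, 1.0), (x0, y0, x1, y1)))
    def feature(patch):
        ii = integral_image(patch)
        return sum(wt * rect_sum(ii, *r) for wt, r in rects)
    return feature

f = random_haar(6, 6, rng=random.Random(0))
print(f(np.ones((6, 6))))  # a weighted sum of rectangle areas
```

Each weak-classifier stump then thresholds one such feature's response.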

Outline: Introduction; Multiple Instance Learning; Online Multiple Instance Boosting; Tracking with Online MIL; Experiments; Conclusions

Compare MILTrack to:
OAB1 = Online AdaBoost w/ 1 pos. per frame
OAB5 = Online AdaBoost w/ 45 pos. per frame
SemiBoost = Online Semi-supervised Boosting
FragTrack = Static appearance model

[Table of quantitative results; best and second-best scores highlighted]

Outline: Introduction; Multiple Instance Learning; Online Multiple Instance Boosting; Tracking with Online MIL; Experiments; Conclusions

Conclusions: proposed the Online MILBoost algorithm. Using MIL to train an appearance model results in more robust tracking.