CSE 473 Ensemble Learning

Ensemble Learning
Sometimes each learning technique yields a different hypothesis (or function), but no single hypothesis is perfect. Could we combine several imperfect hypotheses to get a better hypothesis?

Example
A single line is one simple classifier: everything to its left is classified + and everything to its right is -. Combining 3 such linear classifiers gives a more complex classifier.

Ensemble Learning: Motivation
Analogies:
- Elections combine voters' choices to pick a good candidate (hopefully).
- Committees combine experts' opinions to make better decisions.
- Students working together on the Othello project.
Intuitions:
- Individuals make mistakes, but the "majority" may be less likely to.
- Individuals often have partial knowledge; a committee can pool expertise to make better decisions.

Technique 1: Bagging
Combine hypotheses via majority voting.
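
A minimal sketch of bagging in Python, assuming scikit-learn decision trees as the base learner (the helper names are my own, not from the slides): each hypothesis is trained on a bootstrap resample of the training set, and predictions are combined by an unweighted majority vote.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_fit(X, y, n_hypotheses=5, seed=0):
    """Train each hypothesis on a bootstrap resample of (X, y)."""
    rng = np.random.default_rng(seed)
    n = len(X)
    hypotheses = []
    for _ in range(n_hypotheses):
        idx = rng.integers(0, n, size=n)  # draw n examples with replacement
        hypotheses.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return hypotheses

def bagging_predict(hypotheses, X):
    """Majority vote; labels are assumed to be in {-1, +1}."""
    votes = np.sum([h.predict(X) for h in hypotheses], axis=0)
    return np.sign(votes)  # use an odd n_hypotheses to avoid ties
```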

Bagging: Analysis
Error probability went down from 0.1 to 0.01!
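
The slide does not say how many hypotheses are combined, but one consistent reading is five independent classifiers that each err with probability 0.1: the majority vote is wrong only when three or more err, and the binomial tail puts that near 0.01.

```python
from math import comb

p, n = 0.1, 5  # per-classifier error, number of independent classifiers
# The majority errs iff at least 3 of the 5 classifiers err.
err = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(3, n + 1))
print(err)  # ~0.0086, i.e., roughly 0.01
```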

Weighted Majority Voting
In practice, hypotheses are rarely independent, and some hypotheses make fewer errors than others, so all votes should not be equal. Idea: take a weighted majority.
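
As a sketch, the weighted vote changes only one line relative to bagging: each hypothesis's ±1 vote is scaled by that hypothesis's weight before summing. The weights here are placeholders; AdaBoost below shows one principled way to set them.

```python
import numpy as np

def weighted_majority(hypotheses, weights, X):
    """Each hypothesis votes -1/+1; votes are scaled by the hypothesis's weight."""
    votes = sum(w * h.predict(X) for h, w in zip(hypotheses, weights))
    return np.sign(votes)
```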

Technique 2: Boosting
- The most popular ensemble learning technique.
- Computes a weighted majority of hypotheses.
- Can "boost" the performance of a "weak learner".
- Operates on a weighted training set: each training example (instance) has a weight, and the learning algorithm takes the weight of an input into account.
- Idea: when an input is misclassified by a hypothesis, increase its weight so that the next hypothesis is more likely to classify it correctly.

Boosting Example with Decision Trees (DTs)
[Figure: a sequence of decision trees h_1, …, h_4; annotations mark a training case that is correctly classified, a training case that has a large weight in this round, and a DT that has a strong vote.]
The output of h_final is a weighted majority of the outputs of h_1, …, h_4.

AdaBoost Algorithm
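
The algorithm itself was shown as a figure on the original slide. Below is a minimal sketch of discrete AdaBoost in the style of Freund and Schapire, using depth-1 decision trees (stumps) as the weak learner; the helper names are illustrative, and labels are assumed to be in {-1, +1}.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier  # depth-1 tree = decision stump

def adaboost_fit(X, y, n_rounds=3):
    n = len(X)
    w = np.full(n, 1.0 / n)  # D_1: equal weights on all training inputs
    hypotheses, alphas = [], []
    for _ in range(n_rounds):
        # Weak learner trained on the weighted set via sample_weight.
        h = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = h.predict(X)
        eps = w[pred != y].sum()              # weighted training error of h_t
        eps = np.clip(eps, 1e-10, 1 - 1e-10)  # guard against eps = 0 or 1
        alpha = 0.5 * np.log((1 - eps) / eps) # vote weight of h_t
        w *= np.exp(-alpha * y * pred)        # raise weights of mistakes
        w /= w.sum()                          # renormalize to a distribution
        hypotheses.append(h)
        alphas.append(alpha)
    return hypotheses, alphas

def adaboost_predict(hypotheses, alphas, X):
    """h_final: sign of the alpha-weighted sum of the weak hypotheses' votes."""
    score = sum(a * h.predict(X) for h, a in zip(hypotheses, alphas))
    return np.sign(score)
```

Since y and pred are both in {-1, +1}, y * pred is +1 on correct examples and -1 on mistakes, so the update multiplies misclassified weights by e^(+alpha) and correct ones by e^(-alpha), which is exactly the reweighting idea from the previous slide.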

AdaBoost Example
Original training set D_1: equal weights on all training inputs.
Goal: in round t, learn a classifier h_t that minimizes error with respect to the weighted training set; h_t maps each input to True (+1) or False (-1).
(Taken from "A Tutorial on Boosting" by Yoav Freund and Rob Schapire.)

AdaBoost Example: Round 1
The misclassified points have their weights increased for the next round. z_1 = 0.42

AdaBoost Example: Round 2
z_2 = 0.65

AdaBoost Example: Round 3
z_3 = 0.92

AdaBoost Example: Final Hypothesis
h_final(x) = sign(0.42 h_1(x) + 0.65 h_2(x) + 0.92 h_3(x)), where sign(x) = +1 if x > 0 and -1 otherwise.
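
The z values on the round slides match AdaBoost's vote-weight formula α_t = ½ ln((1 − ε_t)/ε_t). Assuming the round errors reported in the Freund and Schapire tutorial (ε = 0.30, 0.21, 0.14), a quick check recovers the slide numbers up to rounding:

```python
from math import log

for t, eps in enumerate([0.30, 0.21, 0.14], start=1):  # tutorial round errors
    print(f"round {t}: alpha = {0.5 * log((1 - eps) / eps):.2f}")
# prints 0.42, 0.66, 0.91 -- matching the slides' 0.42, 0.65, 0.92 up to rounding
```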

Example 1: Semantic Mapping

Motivation
Human-robot interaction. User: "Go to the corridor."
[Figure: an indoor map labeled with the classes Room, Corridor, and Doorway.]

Shape
[Figure: the shape of a laser scan observed in a Room.]

Observations
[Figure: laser-scan observations in a Room, a Corridor, and at a Doorway.]

Simple Features
Features f computed from a laser range scan with beam lengths d_i:
- Gap: a beam with d > θ; f = number of gaps
- f = minimum beam length d
- f = area enclosed by the scan
- f = perimeter of the scan
- f = average beam length, (1/N) Σ d_i
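
A hypothetical sketch of how such features could be computed from a single 360° scan with beam lengths d_i (the function name and exact feature set are illustrative, not taken from the slides):

```python
import numpy as np

def scan_features(d, theta=2.0):
    """Simple geometric features of a laser scan; d holds the beam lengths."""
    angles = np.linspace(0.0, 2.0 * np.pi, len(d), endpoint=False)
    x, y = d * np.cos(angles), d * np.sin(angles)
    # Shoelace formula for the area of the polygon traced by the beam endpoints.
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
    perimeter = np.sum(np.hypot(np.diff(x, append=x[0]), np.diff(y, append=y[0])))
    return {
        "num_gaps": int(np.sum(d > theta)),  # gap: beam longer than threshold theta
        "min_beam": float(d.min()),
        "area": float(area),
        "perimeter": float(perimeter),
        "mean_beam": float(d.mean()),        # (1/N) * sum of d_i
    }
```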

Experiments
Training map (top) and test map (bottom): Building 079, Uni. Freiburg, labeled as Room, Corridor, and Doorway.
Classification accuracy: 93.94%.

Application to a New Environment
[Figure: classifier trained on the Freiburg training map, applied to the Intel Research Lab in Seattle; the new map is classified into Room, Corridor, and Doorway.]

Example 2: Wearable Multi-Sensor Unit
Hardware: battery, camera (on ribbon cable), GPS receiver, iMote2 plus two sensor boards, microphone, light sensors, 2 GB SD card, and indicator LEDs.
Records 4 hours of audio, images (1/sec), GPS, and sensor data (accelerometer, barometric pressure, light intensity, gyroscope, magnetometer).

Data Stream
[Figure courtesy of G. Borriello.]

Activity Recognition Model
Boosted classifiers [Lester, Choudhury, et al.: IJCAI-05]
Virtual evidence boosting [Liao, Choudhury, Fox, Kautz: IJCAI-07]
Accuracy: 88% on activities, 93% on environment.

Boosting
- Extremely flexible framework.
- Handles high-dimensional continuous data.
- Easy to implement.
- Limitation: only models local classification problems.