START OF DAY 6 Reading: Chap. 8. Group Project Progress Report.


Group Project Progress Report

3 Minute Synopsis
– What have you done?
– Where are you going?
– Thoughts on how you are going to get there

Model Combination

Prophetic Warning
"Now it is not common that the voice of the people desireth anything contrary to that which is right; but it is common for the lesser part of the people to desire that which is not right; therefore this shall ye observe and make it your law--to do your business by the voice of the people." (Mosiah 29:26)
What is the point?
– One person may get it wrong
– Many are less likely to
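The "many are less likely to get it wrong" intuition can be made quantitative; this is essentially Condorcet's jury theorem. If each of n voters is independently correct with probability p > 0.5, a majority vote is correct more often than any single voter. A small sketch (the function name is illustrative):

```python
from math import comb

def majority_vote_accuracy(p, n):
    """Probability that a majority of n independent voters,
    each correct with probability p, is correct (n odd)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# Three independent 60%-accurate classifiers already beat any one of them:
# majority_vote_accuracy(0.6, 3) = 0.6**3 + 3 * 0.6**2 * 0.4 = 0.648
```

Note the assumption of independent errors, which real ensembles only approximate; that is exactly why diversity matters below.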

Following the Prophet
Learning algorithms have different biases
– They probably do not make the same mistakes
– If one makes a mistake, the others may not
Solution: model combination (sometimes called metalearning)
– Exploit variation in data: Bagging, Boosting
– Exploit variation in algorithms: Ensemble, Stacking, Cascade Generalization, Cascading, Delegating, Arbitrating

Bagging (I)

Bagging (II)
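The bagging figures did not survive in this transcript. As a sketch: bagging trains each base model on a bootstrap sample (drawn with replacement, same size as the training set) and combines the models by majority vote. The one-dimensional mean-threshold base learner here is purely illustrative:

```python
import random

def train_base(data):
    """Illustrative weak learner on (x, label) pairs, labels in {0, 1}:
    threshold at the midpoint of the two class means; falls back to a
    constant classifier if a bootstrap sample happens to be one-class."""
    xs0 = [x for x, y in data if y == 0]
    xs1 = [x for x, y in data if y == 1]
    if not xs0 or not xs1:
        label = 0 if xs0 else 1
        return lambda x: label
    t = (sum(xs0) / len(xs0) + sum(xs1) / len(xs1)) / 2
    return lambda x: 1 if x >= t else 0

def bagging(data, n_models, seed=0):
    rng = random.Random(seed)
    # each model sees its own bootstrap sample of the training data
    models = [train_base([rng.choice(data) for _ in data])
              for _ in range(n_models)]
    def predict(x):  # majority vote over the bootstrap-trained models
        return 1 if sum(m(x) for m in models) * 2 > n_models else 0
    return predict
```

Bagging helps most with unstable learners (ones whose hypotheses change a lot under small data perturbations), since the bootstrap samples then yield genuinely different voters.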

Boosting (I)

Boosting (II)
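The boosting figures are likewise lost here. As a sketch of the idea (reweight the examples the current ensemble gets wrong, and weight each round's classifier by its accuracy), a minimal AdaBoost over one-dimensional decision stumps; labels are in {-1, +1} and the stump learner is illustrative:

```python
import math

def learn_stump(xs, ys, w):
    """Weighted-error-minimizing 1-D threshold stump; ys in {-1, +1}."""
    best = None
    for t in xs:
        for sign in (1, -1):
            err = sum(wi for x, y, wi in zip(xs, ys, w)
                      if (1 if sign * (x - t) >= 0 else -1) != y)
            if best is None or err < best[0]:
                best = (err, t, sign)
    return best  # (weighted error, threshold, sign)

def stump_predict(t, sign, x):
    return 1 if sign * (x - t) >= 0 else -1

def adaboost(xs, ys, rounds):
    n = len(xs)
    w = [1.0 / n] * n
    models = []
    for _ in range(rounds):
        err, t, sign = learn_stump(xs, ys, w)
        err = max(err, 1e-12)          # avoid log(inf) on a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        models.append((alpha, t, sign))
        # increase weight on mistakes, decrease it on correct predictions
        w = [wi * math.exp(-alpha * y * stump_predict(t, sign, x))
             for wi, x, y in zip(w, xs, ys)]
        total = sum(w)
        w = [wi / total for wi in w]
    def predict(x):
        score = sum(a * stump_predict(t, s, x) for a, t, s in models)
        return 1 if score >= 0 else -1
    return predict
```

Unlike bagging, boosting is sequential: each classifier is trained to focus on the cases its predecessors misclassified.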

Ensemble (I)

Ensemble (II) Key issue: diversity

Classifier Output Distance (COD)
Measures the difference in behavior between two classifiers
Accuracy alone can be misleading:
– Suppose A and B are each 50% accurate on test set T
– They appear equally good, yet A misses exactly what B gets right, and vice versa!
COD = the number of disagreements between A and B divided by the total number of instances
– In the example above, COD(A,B) = 1 (the maximum)
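The COD computation itself is short; a sketch, using the complementary 50%-accurate pair from the slide as the worked case:

```python
def cod(preds_a, preds_b):
    """Classifier Output Distance: fraction of instances
    on which classifiers A and B disagree."""
    return sum(a != b for a, b in zip(preds_a, preds_b)) / len(preds_a)
```

Two classifiers with identical outputs have COD 0 regardless of accuracy; the pair from the slide, each right exactly where the other is wrong, reaches the maximum COD of 1.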

Stacking (I)

Stacking (II)
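The stacking figures do not survive here. The scheme trains level-0 base models, then trains a level-1 (meta) learner whose inputs are the base models' outputs. The sketch below uses a simple decision-table meta-learner and, for brevity, skips the cross-validation normally used to generate unbiased level-1 training data; all names are illustrative:

```python
def train_stacking(X, y, base_learners):
    # Level 0: train each base learner (here on the full training set;
    # real stacking builds the level-1 data with cross-validation).
    models = [learn(X, y) for learn in base_learners]
    # Level 1: decision table from base-output tuples to the majority class.
    table = {}
    for x, yi in zip(X, y):
        table.setdefault(tuple(m(x) for m in models), []).append(yi)
    meta = {k: max(set(v), key=v.count) for k, v in table.items()}
    default = max(set(y), key=list(y).count)
    def predict(x):
        return meta.get(tuple(m(x) for m in models), default)
    return predict
```

The point of the meta-level is that it can learn *when* to trust each base model, not just average them: in the test below each base model alone is only 50% accurate on XOR-style data, yet the stacked combination is perfect.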

Cascade Generalization (I)

Cascade Generalization (II) 2-step

Cascade Generalization (III) n-step
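The cascade generalization slides are figures-only in this transcript. In Gama and Brazdil's formulation, each level extends the original attributes with the outputs of the previous level's classifier and trains the next learner on the extended data; chaining the step below gives the 2-step and n-step cascades. The learners here are illustrative stand-ins:

```python
def cascade_step(X, y, base_learn, next_learn):
    """One cascade step: train base on (X, y), then train the next learner
    on X extended with the base model's output as an extra attribute."""
    base = base_learn(X, y)
    X_ext = [x + (base(x),) for x in X]
    nxt = next_learn(X_ext, y)
    return lambda x: nxt(x + (base(x),))

def memorize(X_, y_):
    """Illustrative next-level learner: a lookup table over its inputs."""
    table = dict(zip(X_, y_))
    return lambda x: table[x]
```

Unlike stacking, the higher level sees both the original attributes and the lower level's prediction, so it can correct the base model where the raw features show it to be wrong.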

Cascading (I)

Cascading (II)
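The cascading figures are also lost. In the usual formulation (e.g. Kaynak and Alpaydin's), an instance passes through a sequence of classifiers, each emitting a label and a confidence; the first sufficiently confident stage decides, and the final stage always decides. A sketch with illustrative stage functions:

```python
def make_cascade(stages, threshold):
    """stages: functions x -> (label, confidence);
    the last stage always gets to decide."""
    def predict(x):
        for stage in stages[:-1]:
            label, conf = stage(x)
            if conf >= threshold:
                return label
        return stages[-1](x)[0]
    return predict
```

A common design choice is to order stages by cost: a cheap classifier handles the easy bulk of instances, and only the hard ones pay for the expensive later stages.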

Delegating (I)

Delegating (II)
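For delegating (Ferri, Flach, and Hernández-Orallo's scheme), a classifier answers only the instances it is confident about and delegates the rest to a second classifier trained only on the delegated examples. A sketch; the confidence-producing learners below are illustrative:

```python
def train_delegating(X, y, learn_main, learn_sub, threshold):
    main = learn_main(X, y)            # returns x -> (label, confidence)
    # train the delegate only on the examples main is unsure about
    deleg = [(x, yi) for x, yi in zip(X, y) if main(x)[1] < threshold]
    sub = (learn_sub([x for x, _ in deleg], [yi for _, yi in deleg])
           if deleg else None)
    def predict(x):
        label, conf = main(x)
        return label if conf >= threshold or sub is None else sub(x)
    return predict
```

The contrast with cascading: here the second classifier never sees the instances the first one handles, so it specializes in exactly the region the first classifier finds hard.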

Arbitrating (I)

Arbitrating (II)
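For arbitrating (in the spirit of Ortega, Koppel, and Argamon's referee scheme), each expert classifier is paired with a referee that judges how reliable that expert is on the given instance, and an arbiter takes the answer of the expert whose referee is most confident. A sketch with hand-built experts and referees:

```python
def make_arbiter(experts, referees):
    """experts[i]: x -> label;
    referees[i]: x -> confidence that experts[i] is right on x."""
    def predict(x):
        best = max(range(len(experts)), key=lambda i: referees[i](x))
        return experts[best](x)
    return predict
```

In a full system the referees are themselves learned, each trained to predict where its expert succeeds; the stubs in the test stand in for those learned referees.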

END OF DAY 6 Homework: Classification Model Evaluation