Decision Trees

1 Decision Trees

Example  Sky     AirTemp  Humidity  Wind    Water  Forecast  EnjoySport
1        Sunny   Warm     Normal    Strong  Warm   Same      Yes
2        Sunny   Warm     High      Strong  Warm   Same      Yes
3        Rainy   Cold     High      Strong  Warm   Change    No
4        Sunny   Warm     High      Strong  Cool   Change    Yes
5        Cloudy  Warm     High      Weak    Cool   Same      Yes
6        Cloudy  Cold     High      Weak    Cool   Same      No
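For the worked calculations on slides 9-11 it helps to have this table in machine-readable form. A minimal sketch, assuming a list-of-dicts encoding with the column headers as keys (the variable name S and this encoding are my choices, not part of the slides):

```python
# Training examples from the table above, one dict per row.
S = [
    {"Sky": "Sunny",  "AirTemp": "Warm", "Humidity": "Normal", "Wind": "Strong", "Water": "Warm", "Forecast": "Same",   "EnjoySport": "Yes"},
    {"Sky": "Sunny",  "AirTemp": "Warm", "Humidity": "High",   "Wind": "Strong", "Water": "Warm", "Forecast": "Same",   "EnjoySport": "Yes"},
    {"Sky": "Rainy",  "AirTemp": "Cold", "Humidity": "High",   "Wind": "Strong", "Water": "Warm", "Forecast": "Change", "EnjoySport": "No"},
    {"Sky": "Sunny",  "AirTemp": "Warm", "Humidity": "High",   "Wind": "Strong", "Water": "Cool", "Forecast": "Change", "EnjoySport": "Yes"},
    {"Sky": "Cloudy", "AirTemp": "Warm", "Humidity": "High",   "Wind": "Weak",   "Water": "Cool", "Forecast": "Same",   "EnjoySport": "Yes"},
    {"Sky": "Cloudy", "AirTemp": "Cold", "Humidity": "High",   "Wind": "Weak",   "Water": "Cool", "Forecast": "Same",   "EnjoySport": "No"},
]
```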

2 Decision Trees

Sky
├── Sunny  → Yes
├── Rainy  → No
└── Cloudy → AirTemp
             ├── Warm → Yes
             └── Cold → No

This tree classifies an example as Yes exactly when (Sky = Sunny) ∨ (Sky = Cloudy ∧ AirTemp = Warm).
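The drawn tree can be mirrored as a nested dict, together with a small classifier that walks it. A sketch only; the representation and the classify helper are my illustration, not something the slides define:

```python
# Internal nodes: {attribute: {value: subtree}}; leaves: class labels.
tree = {"Sky": {
    "Sunny":  "Yes",
    "Rainy":  "No",
    "Cloudy": {"AirTemp": {"Warm": "Yes", "Cold": "No"}},
}}

def classify(node, example):
    """Follow the branch selected by each attribute test until a leaf."""
    while isinstance(node, dict):
        attribute, branches = next(iter(node.items()))
        node = branches[example[attribute]]
    return node
```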

3 Decision Trees

Sky
├── Sunny  → Yes
├── Rainy  → No
└── Cloudy → AirTemp
             ├── Warm → Yes
             └── Cold → No

Two new examples to classify with this tree:

Example  Sky     AirTemp  Humidity  Wind    Water  Forecast  EnjoySport
7        Rainy   Warm     Normal    Weak    Cool   Same      ?
8        Cloudy  Warm     High      Strong  Cool   Change    ?
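Running the classify sketch from the previous slide on these two rows (values copied from the table) yields the answers the tree dictates:

```python
x7 = {"Sky": "Rainy",  "AirTemp": "Warm", "Humidity": "Normal",
      "Wind": "Weak",   "Water": "Cool", "Forecast": "Same"}
x8 = {"Sky": "Cloudy", "AirTemp": "Warm", "Humidity": "High",
      "Wind": "Strong", "Water": "Cool", "Forecast": "Change"}

print(classify(tree, x7))  # No  (Sky = Rainy branch)
print(classify(tree, x8))  # Yes (Sky = Cloudy, then AirTemp = Warm)
```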

4 Decision Trees

Another tree consistent with the same training data:

Humidity
├── Normal → Yes
└── High   → Sky
             ├── Sunny  → Yes
             ├── Rainy  → No
             └── Cloudy → AirTemp
                          ├── Warm → Yes
                          └── Cold → No

5 Decision Trees

Each root-to-leaf path is a conjunction of attribute tests, (A1 = v1) ∧ (A2 = v2) ∧ …, and the tree as a whole represents the disjunction (∨) of the conjunctions along its Yes-paths.
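To make the disjunction-of-conjunctions reading concrete, a sketch that enumerates the root-to-leaf paths of the nested-dict tree from slide 2 (the paths helper is hypothetical, mine only):

```python
def paths(node, conditions=()):
    """Yield (conjunction of tests, leaf label) for every root-to-leaf path."""
    if not isinstance(node, dict):
        yield conditions, node
        return
    attribute, branches = next(iter(node.items()))
    for value, subtree in branches.items():
        yield from paths(subtree, conditions + ((attribute, value),))

for conds, label in paths(tree):
    print(" AND ".join(f"{a} = {v}" for a, v in conds), "->", label)
# The Yes-paths, OR-ed together, are exactly the expression on slide 2:
# (Sky = Sunny) OR (Sky = Cloudy AND AirTemp = Warm)
```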

6 Homogeneity of Examples

Entropy(S) = −p₊ log₂ p₊ − p₋ log₂ p₋

where p₊ and p₋ are the proportions of positive and negative examples in S. [Figure: plot of Entropy(S) against p₊; the curve peaks at 1.0 when p₊ = 0.5.]

7 Homogeneity of Examples

For c classes: Entropy(S) = − Σ_{i=1..c} p_i log₂ p_i

Entropy is an impurity measure: 0 for a pure sample, maximal when all classes are equally frequent.
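Both entropy formulas transcribe directly into Python. A minimal sketch (the function name, Counter-based counting, and the EnjoySport default are my choices):

```python
from collections import Counter
from math import log2

def entropy(examples, target="EnjoySport"):
    """Entropy(S) = -sum_i p_i * log2(p_i) over the class proportions in S."""
    counts = Counter(ex[target] for ex in examples)
    total = sum(counts.values())
    # Only classes that actually occur are counted, so p_i > 0 throughout.
    return -sum((n / total) * log2(n / total) for n in counts.values())
```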

8 Information Gain

Gain(S, A) = Entropy(S) − Σ_{v ∈ Values(A)} (|S_v| / |S|) · Entropy(S_v)

[Figure: attribute A splitting S into subsets S_v1, S_v2, …]
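Information gain follows the same pattern, reusing the entropy sketch above (again an illustrative sketch, with names of my choosing):

```python
def gain(examples, attribute, target="EnjoySport"):
    """Gain(S, A) = Entropy(S) - sum_v (|S_v|/|S|) * Entropy(S_v)."""
    total = len(examples)
    remainder = 0.0
    for v in {ex[attribute] for ex in examples}:
        Sv = [ex for ex in examples if ex[attribute] == v]
        remainder += (len(Sv) / total) * entropy(Sv, target)
    return entropy(examples, target) - remainder
```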

9 Example

Entropy(S) = −p₊ log₂ p₊ − p₋ log₂ p₋
           = −(4/6) log₂ (4/6) − (2/6) log₂ (2/6)
           = 0.390 + 0.528 = 0.918

Gain(S, Sky) = Entropy(S) − Σ_{v ∈ {Sunny, Rainy, Cloudy}} (|S_v|/|S|) · Entropy(S_v)
             = Entropy(S) − [(3/6)·Entropy(S_Sunny) + (1/6)·Entropy(S_Rainy) + (2/6)·Entropy(S_Cloudy)]
             = Entropy(S) − (2/6)·Entropy(S_Cloudy)        (S_Sunny and S_Rainy are pure)
             = Entropy(S) − (2/6)·[−(1/2) log₂ (1/2) − (1/2) log₂ (1/2)]
             = 0.918 − 0.333 = 0.585

10 Example

Entropy(S) = −p₊ log₂ p₊ − p₋ log₂ p₋ = −(4/6) log₂ (4/6) − (2/6) log₂ (2/6) = 0.918

Gain(S, Water) = Entropy(S) − Σ_{v ∈ {Warm, Cool}} (|S_v|/|S|) · Entropy(S_v)
               = Entropy(S) − [(3/6)·Entropy(S_Warm) + (3/6)·Entropy(S_Cool)]
               = Entropy(S) − 2·(3/6)·[−(2/3) log₂ (2/3) − (1/3) log₂ (1/3)]
               = 0.918 − 0.918 = 0
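With the dataset S from slide 1 and the helpers above, both worked computations can be checked numerically:

```python
print(round(entropy(S), 3))        # 0.918
print(round(gain(S, "Sky"), 3))    # 0.585
print(round(gain(S, "Water"), 3))  # 0.0
```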

11 Example

Choosing Sky at the root (the highest-gain attribute) and growing the tree recursively:

Sky
├── Sunny  → Yes
├── Rainy  → No
└── Cloudy → ?

For the remaining node, S_Cloudy = {5, 6}:

Gain(S_Cloudy, AirTemp)  = Entropy(S_Cloudy) − Σ_{v ∈ {Warm, Cold}} (|S_v|/|S_Cloudy|) · Entropy(S_v) = 1
Gain(S_Cloudy, Humidity) = Entropy(S_Cloudy) − Σ_{v ∈ {Normal, High}} (|S_v|/|S_Cloudy|) · Entropy(S_v) = 0

so AirTemp is chosen to split the Cloudy branch.
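The recursion shown on this slide is the heart of ID3: pick the highest-gain attribute, split, and repeat on each impure branch. A compact sketch on top of the helpers above (no pruning, no empty-branch handling; structure and names are mine):

```python
def id3(examples, attributes, target="EnjoySport"):
    labels = {ex[target] for ex in examples}
    if len(labels) == 1:                      # pure node -> leaf
        return labels.pop()
    if not attributes:                        # no tests left -> majority label
        return Counter(ex[target] for ex in examples).most_common(1)[0][0]
    best = max(attributes, key=lambda a: gain(examples, a, target))
    branches = {}
    for v in {ex[best] for ex in examples}:
        subset = [ex for ex in examples if ex[best] == v]
        branches[v] = id3(subset, [a for a in attributes if a != best], target)
    return {best: branches}

print(id3(S, ["Sky", "AirTemp", "Humidity", "Wind", "Water", "Forecast"]))
# {'Sky': {'Sunny': 'Yes', 'Rainy': 'No',
#          'Cloudy': {'AirTemp': {'Warm': 'Yes', 'Cold': 'No'}}}}
# (branch order may vary; it comes from set iteration)
```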

12 Inductive Bias

Hypothesis space: complete! (A decision tree can represent any finite discrete-valued function of the attributes.)

13 Inductive Bias

Hypothesis space: complete!
Shorter trees are preferred over larger trees: prefer the simplest hypothesis that fits the data.

14 Inductive Bias

The decision tree algorithm searches incompletely through a complete hypothesis space: a preference bias.
Candidate-Elimination searches completely through an incomplete hypothesis space: a restriction bias.

15 Overfitting

h ∈ H is said to overfit the training data if there exists h′ ∈ H such that h has smaller error than h′ over the training examples, but h′ has smaller error than h over the entire distribution D of instances:

error_train(h) < error_train(h′)  but  error_D(h′) < error_D(h)

16 Overfitting

h ∈ H is said to overfit the training data if there exists h′ ∈ H such that h has smaller error than h′ over the training examples, but h′ has smaller error than h over the entire distribution of instances. This typically happens because:
– there is noise in the data
– the number of training examples is too small to produce a representative sample of the target concept

17 Homework

Exercises 3.1–3.4 (Chapter 3, ML textbook)