1
Induction of Decision Trees and ID3
2
Agenda
Example of Decision Trees
What is ID3?
Entropy
Calculating Entropy with Code
Information Gain
Advantages and Disadvantages
Examples
3
An Example of a Decision Tree
[Diagram: a medical decision tree. The root tests the location of the pain (abdomen, throat, chest, none); inner nodes test symptoms such as fever and cough (yes/no) and whether an infection is bacterial or viral; the leaves are diagnoses such as appendicitis, heart attack, influenza, and the common cold.]
Each leaf of this tree designates a class or category. A training example is classified by the decision tree as follows: start at the root of the tree; test the attribute specified by that node; then move down the branch that matches the value of that attribute in the given example. This process is repeated for the subtree rooted at the new node.
4
An Example Data Set and Decision Tree
[Diagram: a small learning set and the decision tree induced from it, built from the attributes outlook (sunny/rainy), company (big/med/small), and sailboat (small/big); the class is yes/no.]
5
Classification
[Diagram: classifying a new example by walking the tree: test outlook (sunny/rainy), then company (big/med), then sailboat (small/big), until a yes or no leaf is reached.]
6
Induction of Decision Trees
Data Set (Learning Set): each example = attributes + class
TDIDT: Top-Down Induction of Decision Trees
7
Decision Trees
Rules for classifying data using attributes.
The tree consists of decision nodes and leaf nodes. A decision node has two or more branches, each representing a value of the attribute tested. A leaf node holds a homogeneous result (all examples in one class), which requires no further classification testing.
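The two node types just described can be pictured with a small data-structure sketch; this is an illustrative Python layout of my own, not code from the original slides:

```python
from dataclasses import dataclass, field

@dataclass
class Leaf:
    """Leaf node: all examples reaching it belong to one class, so no further test is needed."""
    label: str

@dataclass
class DecisionNode:
    """Decision node: tests one attribute and has one branch per value of that attribute."""
    attribute: str
    branches: dict = field(default_factory=dict)  # attribute value -> Leaf or DecisionNode
```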
8
Some TDIDT Systems ID3 (Quinlan 79) CART (Breiman et al. 84)
Assistant (Cestnik et al. 87) C4.5 (Quinlan 93) ...
9
What is ID3? A mathematical algorithm for building the decision tree.
Invented by J. Ross Quinlan in 1979. Uses Information Theory invented by Shannon in 1948. Builds the tree from the top down, with no backtracking. Information Gain is used to select the most useful attribute for classification.
10
TDIDT Algorithm Also known as ID3 (Quinlan)
To construct decision tree T from learning set S:
If all examples in S belong to some class C, then make a leaf labeled C.
Otherwise: select the "most informative" attribute A, partition S according to A's values, and recursively construct subtrees T1, T2, ... for the subsets of S.
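A minimal Python sketch of this recursion (my own illustration of the slide's pseudocode, not Quinlan's code); it reuses the Leaf and DecisionNode classes sketched earlier, and the best_attribute helper is sketched later under the information-gain slides:

```python
from collections import Counter

def id3(examples, attributes, target="class"):
    """TDIDT / ID3 skeleton: examples are dicts mapping attribute names to values."""
    classes = {e[target] for e in examples}
    if len(classes) == 1:                       # all examples in S belong to one class C
        return Leaf(classes.pop())              # -> make a leaf labeled C
    if not attributes:                          # no attributes left: fall back to the majority class
        return Leaf(Counter(e[target] for e in examples).most_common(1)[0][0])
    a = best_attribute(examples, attributes, target)        # the "most informative" attribute
    node = DecisionNode(a)
    for value in {e[a] for e in examples}:                  # partition S according to A's values
        subset = [e for e in examples if e[a] == value]
        remaining = [x for x in attributes if x != a]
        node.branches[value] = id3(subset, remaining, target)  # recursively build subtrees
    return node
```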
11
TDIDT Algorithm
[Diagram: the resulting tree T: attribute A at the root, one branch for each of A's values v1, v2, ..., vn, each branch leading to a subtree T1, T2, ..., Tn.]
12
Another Example
13
Simple Tree
[Diagram: Outlook at the root (sunny / overcast / rainy); the sunny branch tests Humidity (high → N, normal → P), the overcast branch is a P leaf, and the rainy branch tests Windy (yes → N, no → P).]
14
Complicated Tree
[Diagram: a much larger tree for the same data that starts with Temperature (hot / moderate / cold) and then repeatedly tests Outlook, Windy, and Humidity along each path (one branch is even null) before reaching the same classes P and N.]
15
Attribute Selection Criteria
Main principle: select the attribute which partitions the learning set into subsets that are as "pure" as possible.
Various measures of purity: information-theoretic, χ², ...
16
Information-Theoretic Approach
To classify an object, a certain amount of information I is needed.
After we have learned the value of attribute A, only a remaining amount of information is needed to classify the object: Ires(A), the residual information.
Gain: Gain(A) = I − Ires(A).
The most 'informative' attribute is the one that minimizes Ires, i.e., maximizes the Gain.
17
Entropy
The average amount of information I needed to classify an object is given by the entropy measure: $I = -\sum_i p(c_i)\,\log_2 p(c_i)$.
For a two-class problem: $I = -p(c_1)\log_2 p(c_1) - p(c_2)\log_2 p(c_2)$.
[Plot: entropy as a function of p(c1); it is maximal (1 bit) at p(c1) = 0.5 and zero when either class has probability 1.]
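The agenda item "Calculating Entropy with Code" suggests a small example here; the following is a sketch in Python (the language and the function name are my assumptions, not taken from the slides):

```python
import math
from collections import Counter

def entropy(labels):
    """I = -sum_i p(c_i) * log2 p(c_i), computed from a list of class labels."""
    total = len(labels)
    return -sum((n / total) * math.log2(n / total) for n in Counter(labels).values())

print(entropy(["P", "N"]))            # 1.0 bit: a 50/50 two-class problem is maximally uncertain
print(entropy(["P", "P", "P", "P"]))  # 0.0 bits: a pure set needs no further information
```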
18
Residual Information
After applying attribute A, S is partitioned into subsets according to the values v of A. Ires(A) is the weighted sum of the amounts of information for the subsets: $I_{res}(A) = -\sum_{v} p(v) \sum_i p(c_i \mid v)\,\log_2 p(c_i \mid v)$.
19
Information Gain (IG)
The information gain is based on the decrease in entropy after a dataset is split on an attribute. Which attribute creates the most homogeneous branches?
First the entropy of the total dataset is calculated.
The dataset is then split on the different attributes. The entropy for each branch is calculated, and added in proportion to the branch sizes, to get the total entropy for the split.
The resulting entropy is subtracted from the entropy before the split. The result is the Information Gain, or decrease in entropy.
The attribute that yields the largest IG is chosen for the decision node. (A code sketch of these steps follows below.)
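These steps translate almost line for line into code; the sketch below assumes the entropy function from the earlier slide, and the helper names are my own:

```python
from collections import defaultdict

def residual_information(examples, attribute, target="class"):
    """I_res(A): the entropies of the branches, added in proportion to branch size."""
    subsets = defaultdict(list)
    for e in examples:
        subsets[e[attribute]].append(e[target])          # split the dataset on the attribute
    total = len(examples)
    return sum(len(lbls) / total * entropy(lbls) for lbls in subsets.values())

def information_gain(examples, attribute, target="class"):
    """Gain(A) = entropy before the split - entropy after the split."""
    before = entropy([e[target] for e in examples])
    return before - residual_information(examples, attribute, target)
```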
20
Information Gain (cont’d)
A branch set with entropy of 0 is a leaf node. Otherwise, the branch needs further splitting to classify its dataset. The ID3 algorithm is run recursively on the non-leaf branches, until all data is classified.
21
Triangles and Squares
22
Example: Triangles and Squares
Data Set: a set of classified objects (triangles and squares), each described by the attributes Color, Outline, and Dot.
23
Entropy
5 triangles, 9 squares; class probabilities p(triangle) = 5/14, p(square) = 9/14.
Entropy: $I = -\frac{5}{14}\log_2\frac{5}{14} - \frac{9}{14}\log_2\frac{9}{14} = 0.940$ bits.
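The same number can be checked with the entropy sketch from the earlier slide (the label strings are arbitrary placeholders):

```python
labels = ["triangle"] * 5 + ["square"] * 9
print(round(entropy(labels), 3))   # 0.94 bits for 5 triangles and 9 squares
```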
24
Entropy Reduction by Data Set Partitioning
[Diagram: the data set partitioned by the attribute Color into red, yellow, and green subsets, with the entropy of each subset.]
25
[Diagram: the Color? split into its red, green, and yellow subsets, used to compute the residual information Ires(Color) as the weighted sum of the subset entropies.]
26
Information Gain
[Diagram: the same Color? split; the information gain of Color is the entropy of the whole set minus the residual information after splitting on Color.]
27
Information Gain of the Attributes
Gain(Color) = 0.246, Gain(Outline) = 0.151, Gain(Dot) = 0.048.
Heuristic: the attribute with the highest gain is chosen. This heuristic is local (local minimization of impurity); a one-line sketch of the choice follows.
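The "highest gain" heuristic is one line on top of the information-gain sketch above; best_attribute is the helper name assumed earlier in the recursion sketch:

```python
def best_attribute(examples, attributes, target="class"):
    """Greedy, local choice: the attribute with the largest information gain."""
    return max(attributes, key=lambda a: information_gain(examples, a, target))

# e.g. best_attribute(shapes, ["Color", "Outline", "Dot"]) would return "Color" for the
# data set above, since Gain(Color) = 0.246 is the largest gain ("shapes" is hypothetical).
```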
28
Gain(Outline) = 0.971 – 0 = 0.971 bits
[Diagram: the red, green, and yellow branches of the Color split; within one impure branch the gains of Outline and Dot are computed (only the gain in the slide title survived extraction), and a pure branch already ends in a square leaf.]
29
Gain(Outline) = 0.971 – 0.951 = 0.020 bits
[Diagram: one branch has already been split on Outline (dashed → triangle, solid → square); in the remaining impure branch Gain(Outline) = 0.971 - 0.951 = 0.020 bits while Gain(Dot) = 0.971 - 0 = 0.971 bits, so Dot is the better choice there.]
30
[Diagram: the partially built tree: Color? at the root; one branch is further tested with Dot? (yes/no) and another with Outline? (solid/dashed).]
31
Decision Tree
[Diagram: the final decision tree. Color is tested at the root; one color branch is a square leaf, one branch tests Dot (yes → triangle, no → square), and one branch tests Outline (dashed → triangle, solid → square).]
32
Advantages of using ID3
Understandable prediction rules are created from the training data.
Builds the tree fast, and the tree is short.
Only needs to test enough attributes until all data is classified.
Finding leaf nodes enables test data to be pruned, reducing the number of tests.
The whole dataset is searched to create the tree.
33
Disadvantages of using ID3
Data may be over-fitted or over-classified if only a small sample is used.
Only one attribute at a time is tested for making a decision.
Classifying continuous data may be computationally expensive, as many trees must be generated to see where to break the continuum (a sketch of the candidate break points follows).
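To make the last point concrete: for a continuous attribute, every midpoint between consecutive sorted values is a candidate place to break the continuum, and each candidate split has to be evaluated. A rough sketch of how those candidates could be enumerated (my own illustration, not part of ID3 itself):

```python
def candidate_thresholds(values):
    """Midpoints between consecutive distinct sorted values of a continuous attribute."""
    v = sorted(set(values))
    return [(a + b) / 2 for a, b in zip(v, v[1:])]

# Each threshold turns the attribute into a binary test (value <= threshold),
# and every one of them must be scored (e.g. by information gain) before splitting.
print(candidate_thresholds([64, 65, 68, 69, 70, 71]))   # [64.5, 66.5, 68.5, 69.5, 70.5]
```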
34
Exercise. Submission deadline: end of the month of Aban.