Chapter 8 Tutorial
Expected information (entropy) needed to classify a tuple in D:

    Info(D) = − Σ_{i=1..m} p_i log2(p_i)

where p_i is the probability that an arbitrary tuple in D belongs to class C_i, estimated by |C_{i,D}| / |D|.

Information needed (after using A to split D into v partitions) to classify D:

    Info_A(D) = Σ_{j=1..v} (|D_j| / |D|) × Info(D_j)

Information gained by branching on attribute A:

    Gain(A) = Info(D) − Info_A(D)
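These three formulas translate directly into code. Below is a minimal Python sketch of them; the function names entropy and info_gain are my own, not from the slides:

```python
import math
from collections import Counter

def entropy(labels):
    """Info(D): expected bits needed to classify a tuple in D."""
    total = len(labels)
    return -sum((n / total) * math.log2(n / total)
                for n in Counter(labels).values())

def info_gain(attr_values, labels):
    """Gain(A) = Info(D) - Info_A(D), given one attribute value per label."""
    total = len(labels)
    # Partition the class labels by the value of attribute A (v partitions).
    partitions = {}
    for v, c in zip(attr_values, labels):
        partitions.setdefault(v, []).append(c)
    # Info_A(D): entropy of each partition D_j, weighted by |D_j| / |D|.
    info_a = sum((len(part) / total) * entropy(part)
                 for part in partitions.values())
    return entropy(labels) - info_a
```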
Question: Consider the training examples shown in the table for a binary classification problem.
(a) What is the entropy of this collection of training examples with respect to the positive class?

There are four positive examples and five negative examples, so P(+) = 4/9 and P(−) = 5/9. The entropy of the training examples is −(4/9) log2(4/9) − (5/9) log2(5/9) ≈ 0.9911 bits.
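As a quick check, the entropy helper sketched above reproduces this value from the class counts alone:

```python
labels = ["+"] * 4 + ["-"] * 5    # four positive, five negative examples
print(round(entropy(labels), 4))  # 0.9911
```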
(b) What are the information gains of a1 and a2 relative to these training examples?
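The a1 and a2 columns live in the table, which is not reproduced in this extract, so the gains cannot be computed here. As a sketch only, the info_gain helper from above would produce both answers once the real columns replace the placeholder lists below:

```python
# Placeholder columns: NOT the table's actual values. Substitute the real
# a1 and a2 columns, one entry per training example, in row order.
a1 = ["T", "F", "T", "F", "T", "F", "T", "F", "T"]
a2 = ["T", "T", "F", "F", "T", "T", "F", "F", "T"]
# The class column must follow the same row order as the table; this
# grouping of the four positives first is itself an assumption.
labels = ["+"] * 4 + ["-"] * 5

print("Gain(a1) =", round(info_gain(a1, labels), 4))
print("Gain(a2) =", round(info_gain(a2, labels), 4))
```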
End