Decision Theory (Teori Keputusan)


1 Decision Theory (Teori Keputusan)
Artificial Intelligence (Kecerdasan Buatan)

2 Decision trees
A decision tree can be defined as a map of the reasoning process.

3 An example of a decision tree

4 A decision tree consists of nodes, branches and leaves.
The top node is called the root node. All nodes are connected by branches. Nodes that are at the end of branches are called terminal nodes, or leaves.
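
This structure maps naturally onto a small recursive data type. A minimal sketch in Python (the class and field names are illustrative, not from the slides):

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class Node:
    """One node of a decision tree.

    An internal node tests `attribute` and has one child per branch label;
    a terminal node (leaf) carries a class `label` instead.
    """
    attribute: Optional[str] = None               # attribute tested here (None at a leaf)
    branches: Dict[str, "Node"] = field(default_factory=dict)  # branch label -> child
    label: Optional[str] = None                   # class label (set only at a leaf)

    def is_leaf(self) -> bool:
        return not self.branches
```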

5 Example of a Decision Tree
Model learned from the training data. The splitting attributes Refund and MarSt are categorical, TaxInc is continuous, and Cheat is the class:
Refund?
├─ Yes → NO
└─ No → MarSt?
   ├─ Married → NO
   └─ Single, Divorced → TaxInc?
      ├─ < 80K → NO
      └─ > 80K → YES

6–11 Apply Model to Test Data
Start from the root of the tree and, at each node, follow the branch that matches the test record's attribute value. (Slides 6–11 repeat the tree figure, highlighting the traversal one step at a time; for the slides' test record the path is Refund = No → MarSt = Married, so Cheat is assigned to "No".)
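
A sketch of this traversal, reusing the Node class above. The tree mirrors the figure on slide 5, and the test record is the one the slides trace; treating TaxInc = 80K as falling on the upper side is an assumption, since the figure only labels the branches "< 80K" and "> 80K":

```python
# The tree from slide 5, built from the Node sketch above.
taxinc = Node("TaxInc", {"< 80K": Node(label="No"), ">= 80K": Node(label="Yes")})
tree = Node("Refund", {
    "Yes": Node(label="No"),                 # Refund = Yes     -> Cheat = No
    "No": Node("MarSt", {
        "Married": Node(label="No"),         # Married          -> Cheat = No
        "Single": taxinc,                    # Single, Divorced -> test TaxInc
        "Divorced": taxinc,
    }),
})

def classify(node, record):
    """Start from the root; at each node follow the branch matching the record."""
    while not node.is_leaf():
        value = record[node.attribute]
        if node.attribute == "TaxInc":       # continuous attribute: apply the split
            value = "< 80K" if value < 80_000 else ">= 80K"  # 80K -> upper side (assumption)
        node = node.branches[value]
    return node.label

# The slides' test record: Refund = No, MarSt = Married, TaxInc = 80K.
print(classify(tree, {"Refund": "No", "MarSt": "Married", "TaxInc": 80_000}))  # "No"
```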

12 Weather Data: Play or not Play?
Outlook   Temperature  Humidity  Windy  Play?
sunny     hot          high      false  No
sunny     hot          high      true   No
overcast  hot          high      false  Yes
rain      mild         high      false  Yes
rain      cool         normal    false  Yes
rain      cool         normal    true   No
overcast  cool         normal    true   Yes
sunny     mild         high      false  No
sunny     cool         normal    false  Yes
rain      mild         normal    false  Yes
sunny     mild         normal    true   Yes
overcast  mild         high      true   Yes
overcast  hot          normal    false  Yes
rain      mild         high      true   No

13 Example Tree for “Play?”
Outlook?
├─ sunny → Humidity? (high → No; normal → Yes)
├─ overcast → Yes
└─ rain → Windy? (true → No; false → Yes)

14 Entropy at a given node t:
Entropy(t) = −Σ_j p(j | t) · log2 p(j | t)
(NOTE: p(j | t) is the relative frequency of class j at node t.)

15 Examples for computing Entropy
P(C1) = 0/6 = 0, P(C2) = 6/6 = 1. Entropy = −0 log 0 − 1 log 1 = −0 − 0 = 0
P(C1) = 1/6, P(C2) = 5/6. Entropy = −(1/6) log2(1/6) − (5/6) log2(5/6) = 0.65
P(C1) = 2/6, P(C2) = 4/6. Entropy = −(2/6) log2(2/6) − (4/6) log2(4/6) = 0.92
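
The formula and the three worked examples can be checked with a few lines of Python (a sketch; `entropy` takes raw class counts):

```python
import math

def entropy(counts):
    """Entropy = -sum_j p_j * log2(p_j), evaluating 0*log(0) as zero."""
    n = sum(counts)
    return -sum((c / n) * math.log2(c / n) for c in counts if c)

print(round(entropy([0, 6]), 2))  # 0.0  : P(C1)=0,   P(C2)=1
print(round(entropy([1, 5]), 2))  # 0.65 : P(C1)=1/6, P(C2)=5/6
print(round(entropy([2, 4]), 2))  # 0.92 : P(C1)=2/6, P(C2)=4/6
```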

16 Information Gain:
GAIN_split = Entropy(p) − Σ_{i=1..k} (n_i / n) · Entropy(i)
where the parent node p is split into k partitions and n_i is the number of records in partition i.
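
As a sketch, the gain of a split is the parent's entropy minus the weighted average of the partitions' entropies (reusing `entropy` from above):

```python
def information_gain(parent_counts, partitions):
    """GAIN_split = Entropy(p) - sum_i (n_i / n) * Entropy(partition i)."""
    n = sum(parent_counts)
    weighted = sum((sum(part) / n) * entropy(part) for part in partitions)
    return entropy(parent_counts) - weighted
```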

17 Which attribute to select?
(Witten & Eibe)

18 Example: attribute “Outlook”
“Outlook” = “Sunny”: info([2,3]) = entropy(2/5, 3/5) = 0.971 bits
“Outlook” = “Overcast”: info([4,0]) = entropy(1, 0) = 0 bits
“Outlook” = “Rainy”: info([3,2]) = entropy(3/5, 2/5) = 0.971 bits
Expected information for the attribute: info([2,3], [4,0], [3,2]) = (5/14) · 0.971 + (4/14) · 0 + (5/14) · 0.971 = 0.693 bits
Note: log(0) is not defined, but we evaluate 0 · log(0) as zero. (Witten & Eibe)
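
These numbers follow directly from the sketches above; the [yes, no] counts per Outlook value come from the weather table on slide 12:

```python
sunny, overcast, rainy = [2, 3], [4, 0], [3, 2]   # [yes, no] counts per value

expected = sum((sum(p) / 14) * entropy(p) for p in (sunny, overcast, rainy))
print(f"{expected:.3f}")  # 0.694 (the slide truncates this to 0.693 bits)

# Gain(Outlook): entropy of the full data (9 yes, 5 no) minus the expected info.
print(f"{information_gain([9, 5], [sunny, overcast, rainy]):.3f}")  # 0.247
```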


Download ppt "Teori Keputusan (Decision Theory)"

Similar presentations


Ads by Google