Decision Theory (Artificial Intelligence)
Decision trees
A decision tree can be defined as a map of the reasoning process.
An example of a decision tree
A decision tree consists of nodes, branches, and leaves. The top node is called the root node. All nodes are connected by branches. Nodes at the ends of branches are called terminal nodes, or leaves.
Example of a Decision Tree (a model built from training data; Refund and MarSt are categorical splitting attributes, TaxInc is continuous, and Cheat is the class):
- Refund = Yes → NO
- Refund = No → test MarSt
  - MarSt = Married → NO
  - MarSt = Single or Divorced → test TaxInc
    - TaxInc < 80K → NO
    - TaxInc > 80K → YES
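To make the structure concrete, here is a minimal Python sketch of one way this tree could be represented; the nested-dict encoding and variable names are illustrative choices, not from the slides:

```python
# One possible encoding: an internal node is a dict mapping an
# attribute name to a dict of branch label -> subtree; a leaf is
# just the class label ("NO" / "YES").
taxinc = {"TaxInc": {"< 80K": "NO", "> 80K": "YES"}}

decision_tree = {
    "Refund": {
        "Yes": "NO",
        "No": {
            "MarSt": {
                "Married": "NO",
                "Single": taxinc,    # "Single, Divorced" share one branch
                "Divorced": taxinc,
            }
        },
    }
}
```

Here the continuous TaxInc split is stored as two branch labels for brevity; a fuller implementation would keep the 80K threshold and compare it against the record's numeric value.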
Apply Model to Test Data: start from the root of the tree and, at each node, follow the branch that matches the test record's attribute value until a leaf is reached. For the test record Refund = No, MarSt = Married, the path goes Refund → MarSt and ends at the leaf NO, so Cheat is assigned to "No".
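A sketch of that walk in code, reusing the decision_tree dict from the sketch above; classify is a hypothetical helper name, and a record simply stores a branch label per attribute:

```python
def classify(tree, record):
    """Walk the tree from the root: at each internal node, follow the
    branch matching the record's value for that attribute, until a
    leaf (class label) is reached."""
    while isinstance(tree, dict):
        attribute, branches = next(iter(tree.items()))
        tree = branches[record[attribute]]
    return tree

# The test record from the slides: Refund = No, MarSt = Married.
# The walk visits Refund -> MarSt and stops at the leaf "NO".
print(classify(decision_tree, {"Refund": "No", "MarSt": "Married"}))  # NO
```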
Weather Data: Play or not Play? Each record has four attributes: Outlook (sunny, overcast, rain), Temperature (hot, mild, cool), Humidity (high, normal), and Windy (true, false), plus the class Play? (Yes, No). The full dataset has 14 records, of which 9 are Yes and 5 are No.
Example Tree for "Play?":
- Outlook = sunny → test Humidity (high → No, normal → Yes)
- Outlook = overcast → Yes
- Outlook = rain → test Windy (true → No, false → Yes)
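The same encoding and classify helper from the sketches above can represent this tree as well (again an illustration, with branch labels taken from the slide):

```python
# The "Play?" tree in the same nested-dict encoding.
play_tree = {
    "Outlook": {
        "sunny": {"Humidity": {"high": "No", "normal": "Yes"}},
        "overcast": "Yes",
        "rain": {"Windy": {"true": "No", "false": "Yes"}},
    }
}

print(classify(play_tree, {"Outlook": "sunny", "Humidity": "normal"}))  # Yes
```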
Entropy at a given node t:

Entropy(t) = - Σ_j p(j | t) log2 p(j | t)

(NOTE: p(j | t) is the relative frequency of class j at node t.)
Examples for computing entropy (two classes C1, C2 at a node with 6 records):

1. P(C1) = 0/6 = 0, P(C2) = 6/6 = 1:
   Entropy = - 0 log2 0 - 1 log2 1 = - 0 - 0 = 0
2. P(C1) = 1/6, P(C2) = 5/6:
   Entropy = - (1/6) log2 (1/6) - (5/6) log2 (5/6) = 0.65
3. P(C1) = 2/6, P(C2) = 4/6:
   Entropy = - (2/6) log2 (2/6) - (4/6) log2 (4/6) = 0.92
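A small sketch of this computation that reproduces the three examples (the function name and count-list representation are illustrative):

```python
from math import log2

def entropy(counts):
    """Entropy of a node from its class counts. Terms with c == 0
    (or c == n) contribute 0 * log2(0) = 0 (or 1 * log2(1) = 0),
    matching the convention on these slides, so they are skipped."""
    n = sum(counts)
    return sum(-(c / n) * log2(c / n) for c in counts if 0 < c < n)

print(entropy([0, 6]))  # 0
print(entropy([1, 5]))  # 0.650...
print(entropy([2, 4]))  # 0.918... (rounded to 0.92 above)
```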
Information Gain, when a parent node p with n records is split into k partitions, with n_i the number of records in partition i:

GAIN_split = Entropy(p) - Σ_{i=1..k} (n_i / n) × Entropy(i)

The gain measures the reduction in entropy achieved by the split.
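Continuing the sketch, the gain of a split can be computed from the parent's class counts and the per-partition class counts, using entropy() from above (information_gain is an illustrative helper name):

```python
def information_gain(parent_counts, partitions):
    """GAIN_split = Entropy(parent) minus the record-count-weighted
    sum of the partition entropies."""
    n = sum(parent_counts)
    expected = sum(sum(part) / n * entropy(part) for part in partitions)
    return entropy(parent_counts) - expected
```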
Which attribute to select?
Example: attribute "Outlook"
- "Outlook" = "Sunny": info([2, 3]) = entropy(2/5, 3/5) = 0.971 bits
- "Outlook" = "Overcast": info([4, 0]) = entropy(1, 0) = 0 bits
- "Outlook" = "Rainy": info([3, 2]) = entropy(3/5, 2/5) = 0.971 bits
Expected information for the attribute:
info([2, 3], [4, 0], [3, 2]) = (5/14) × 0.971 + (4/14) × 0 + (5/14) × 0.971 = 0.693 bits
Note: log(0) is not defined, but we evaluate 0 × log(0) as zero.
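As a check, the helpers sketched above reproduce this computation; the [9, 5] parent counts are the overall Yes/No totals of the 14-record weather data, so the gain of the split is 0.940 - 0.693 ≈ 0.247 bits:

```python
# Class counts per Outlook value: Sunny [2 yes, 3 no],
# Overcast [4 yes, 0 no], Rainy [3 yes, 2 no].
parent = [9, 5]
outlook = [[2, 3], [4, 0], [3, 2]]

expected = sum(sum(p) / sum(parent) * entropy(p) for p in outlook)
print(expected)                           # ~0.693 bits, as above
print(information_gain(parent, outlook))  # ~0.247 bits gained by the split
```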