CO3301 - Games Development 2, Week 19: Probability Trees + Decision Trees (Learning Trees). Gareth Bellaby.


1 CO3301 Games Development 2 Week 19 Probability Trees + Decision Trees (Learning Trees) Gareth Bellaby

2 Probability Trees A probability tree is a tree which has probabilities associated with each branch.

3 Probability Trees

4 Probabilities are propagated down the tree. A probability which follows on from another probability is multiplied by it, i.e. the probabilities along a path are multiplied together in order to calculate the final probability of that outcome. At each level of the tree the probabilities across the branches must sum to 1.
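As a small sketch of the two rules above (the tree shape and the probability values here are invented for illustration):

```python
# Minimal probability-tree sketch: a path's probability is the product
# of the branch probabilities along it, and the branches leaving any
# node must sum to 1.

def path_probability(branch_probs):
    """Multiply the probabilities along one path through the tree."""
    result = 1.0
    for p in branch_probs:
        result *= p
    return result

# Hypothetical path: P(first branch) = 0.3, P(second branch) = 0.5
print(path_probability([0.3, 0.5]))   # ≈ 0.15

# Sibling branches at any level must sum to 1:
assert abs(sum([0.3, 0.7]) - 1.0) < 1e-9
```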

5 Decision Trees A decision tree is a way of representing knowledge. A decision tree is a way of using inputs to predict future outputs. Decision trees are a good way of expressing decisions in computer games, not just for AI but for general gameplay. A decision tree is a classification method. Decision trees learn from examples using induction.

6

7 Decision Trees Each internal node is a test. Each leaf node is a classification. The intention is that an unknown example can be classified by traversing the tree. Decision trees can be created in real time extremely efficiently, which makes them a practical option for machine learning in games.
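A sketch of classification by traversal; the node structure, attribute names and the example tree are invented for illustration:

```python
# Each internal node tests one attribute; each leaf holds a classification.
# Traversal follows the branch matching the example's attribute value.

class Node:
    def __init__(self, attribute=None, branches=None, label=None):
        self.attribute = attribute      # test at an internal node
        self.branches = branches or {}  # attribute value -> child Node
        self.label = label              # classification at a leaf

def classify(node, example):
    """Walk from the root to a leaf, following the matching branches."""
    while node.label is None:
        node = node.branches[example[node.attribute]]
    return node.label

# Hypothetical tree: attack only an enemy town with a weak defence.
tree = Node("town", branches={
    "friendly": Node(label="do not attack"),
    "enemy": Node("defence", branches={
        "weak": Node(label="attack"),
        "strong": Node(label="do not attack"),
    }),
})

print(classify(tree, {"town": "enemy", "defence": "weak"}))  # attack
```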

8 Decision Trees A branch can carry the response "unknown", so decision trees can deal with uncertainty. Decision trees cannot use continuous ranges directly because of the large number of branching alternatives, e.g. floating-point numbers. Instead you must provide a set of "buckets", each with a specified range. We'll see an example of this below with Black and White, which talks about "none", "medium" and "maximum". Tests followed in sequence down the tree can be considered to be logical AND; the branching can be considered to be logical OR.
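Bucketing a continuous value into discrete categories might look like this; the category names follow the Black & White example, but the thresholds are invented:

```python
# Map a continuous defence strength onto the discrete "buckets" the
# tree can branch on. The 0.0 and 0.7 thresholds are assumptions.

def bucket(defence_strength):
    if defence_strength <= 0.0:
        return "none"
    elif defence_strength < 0.7:
        return "medium"
    else:
        return "maximum"

print(bucket(0.4))   # medium
print(bucket(0.9))   # maximum
```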

9 Black & White

What he ate     Feedback ("How nice it tasted")
A big rock
A small rock    -0.5
A small rock    -0.4
A tree          -0.2
A cow           +0.6

The values are averaged. Taken from Evans, R., (2002), "Varieties of Learning".
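The averaging step can be sketched in code; the values are copied from the table above, and the grouping logic is an assumption about how the averaging is done:

```python
# Average repeated feedback for the same action, so each item ends up
# with a single learned "how nice it tasted" value.
from collections import defaultdict

feedback = [
    ("small rock", -0.5),
    ("small rock", -0.4),
    ("tree", -0.2),
    ("cow", 0.6),
]

totals = defaultdict(list)
for item, value in feedback:
    totals[item].append(value)

averages = {item: sum(vs) / len(vs) for item, vs in totals.items()}
print(averages["small rock"])   # ≈ -0.45
```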

10 Black & White

What creature attacked                        Feedback from player
Friendly town, weak defence, tribe Celtic
Enemy town, weak defence, tribe Celtic        +0.4
Friendly town, strong defence, tribe Norse
Enemy town, strong defence, tribe Norse       -0.2
Friendly town, medium defence, tribe Greek
Enemy town, medium defence, tribe Greek       +0.2
Enemy town, strong defence, tribe Greek       -0.4
Enemy town, medium defence, tribe Aztec       0.0
Friendly town, weak defence, tribe Aztec

11 Black & White Taken from Evans, R., (2002), "Varieties of Learning".

12 Black & White Some of the criteria are lost because they turn out to be irrelevant, e.g. the information about the tribe. The decision tree is created in real time. Each time the creature receives new input from the player the tree will be rebuilt, and each new input will change the values. The rebuilding can be significant: information that was previously jettisoned as irrelevant could become relevant.

13 Black & White The Evans article provides more detail as to how decision trees were used in Black and White. One important observation he adds is that, in order to iterate through all of the attributes of an object efficiently, it is necessary to define the objects by their attributes.

14 ID3 The ID3 algorithm was presented by Quinlan (1986). It uses an iterative method. From the training examples a random subset is selected. Build a tree and test it on the training examples. If all of the examples are classified successfully then stop. Otherwise add some more training examples to the subset and repeat the process.

15 ID3 Start with a root node. Assign to the root node the best attribute. A branch is then generated for each value of that attribute, and a node is created at the end of each branch. Each training example is assigned to one of these new nodes. If no examples are assigned to a node then the node and its branch can be removed. Each remaining node is then treated as a new root and the process repeated.
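The steps above can be sketched as follows: a simplified ID3 assuming discrete attribute values, using entropy-based information gain (introduced on the next slides) to choose the "best" attribute. The training data is invented, in the style of the Black & White example:

```python
import math
from collections import Counter

def entropy(examples):
    """Shannon entropy of the class labels in a set of (example, label) pairs."""
    counts = Counter(label for _, label in examples)
    total = len(examples)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def best_attribute(examples, attributes):
    """Pick the attribute giving the greatest information gain."""
    def gain(attr):
        remainder = 0.0
        for value in {ex[attr] for ex, _ in examples}:
            subset = [(ex, lbl) for ex, lbl in examples if ex[attr] == value]
            remainder += len(subset) / len(examples) * entropy(subset)
        return entropy(examples) - remainder
    return max(attributes, key=gain)

def id3(examples, attributes):
    labels = {lbl for _, lbl in examples}
    if len(labels) == 1:    # pure node: make a leaf
        return labels.pop()
    if not attributes:      # no tests left: majority vote
        return Counter(lbl for _, lbl in examples).most_common(1)[0][0]
    attr = best_attribute(examples, attributes)
    tree = {attr: {}}
    # Branches are only created for values seen in the examples,
    # so no empty branches arise and none need removing.
    for value in {ex[attr] for ex, _ in examples}:
        subset = [(ex, lbl) for ex, lbl in examples if ex[attr] == value]
        tree[attr][value] = id3(subset, [a for a in attributes if a != attr])
    return tree

# Hypothetical training data.
data = [
    ({"town": "enemy", "defence": "weak"}, "attack"),
    ({"town": "enemy", "defence": "strong"}, "avoid"),
    ({"town": "friendly", "defence": "weak"}, "avoid"),
    ({"town": "friendly", "defence": "strong"}, "avoid"),
]
print(id3(data, ["town", "defence"]))
```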

16 ID3 It should be apparent that different trees can be constructed from the same examples. It is desirable to derive the smallest tree, since this will be the most efficient one. The topmost choices therefore need to be the most informative, aiming for the greatest information gain. Information theory provides a mathematical measurement of the information content of a message. Information theory was presented by Shannon in 1948.

17 Information Theory Shannon defines the amount of information in a message as a function of the probability of occurrence of each possible message.
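Concretely, a standard statement of Shannon's measure (the formulas are standard but not spelled out on the slide): a message with probability p carries -log2(p) bits of information, and the expected information of a source is the probability-weighted sum over its messages. A quick check, with example probabilities invented:

```python
import math

def information(p):
    """Information content, in bits, of a message with probability p."""
    return -math.log2(p)

def expected_information(probs):
    """Expected information (entropy) of a source with these message probabilities."""
    return sum(p * information(p) for p in probs)

print(information(0.5))               # 1.0 bit: a fair coin flip
print(expected_information([0.5, 0.5]))   # 1.0 bit
print(expected_information([0.9, 0.1]))   # ≈ 0.469 bits: less uncertainty
```

Note how a near-certain source carries far less information than a 50/50 one; this is why ID3 favours attributes whose branches leave the most nearly pure subsets.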

18 ID3 ID3 was extended by Quinlan to provide probabilistic classification using Bayesian statistics.

19 Sources & Further reading
DeSylva, C., (2005), "Optimizing a Decision Tree Query Algorithm for Multithreaded Architectures", Game Programming Gems 5, Charles River Media: Hingham, Mass., USA.
Evans, R., (2002), "Varieties of Learning", AI Game Programming Wisdom, Charles River Media: Hingham, Mass., USA.
Fu, D., & Houlette, R., (2003), "Constructing a Decision Tree Based on Past Experience", AI Game Programming Wisdom 2, Charles River Media: Hingham, Mass., USA.

20 Sources & Further reading
Manslow, J., (2006), "Practical Algorithms for In-Game Learning", AI Game Programming Wisdom 3, Charles River Media: Hingham, Mass., USA.
Quinlan, J. R., (1986), "Induction of decision trees", Machine Learning, 1.
Shannon, C., (1948), "A mathematical theory of communication", Bell System Technical Journal.