Incorporating New Information into Decision Trees (Posterior Probabilities) MGS3100 - Chapter 6 Part 3


How We Will Use Bayes' Theorem
Prior information can be based on the results of previous experiments, or on expert opinion, and can be expressed as probabilities. If it is desirable to improve on this state of knowledge, an experiment can be conducted. Bayes' Theorem is the mechanism used to update the state of knowledge with the results of the experiment to provide a posterior distribution.

Bayes' Theorem
Bayes' Theorem is used to revise probabilities based upon new data: prior probabilities are updated into posterior probabilities.

How Bayes' Theorem Works
Let the experiment be A and the prediction be B, and assume that both have occurred. The probability of both A and B together is P(A∩B), or simply P(AB). The law of conditional probability says that this probability can be found as the product of the conditional probability of one event, given the other, times the probability of the other. That is:

P(A|B) * P(B) = P(AB) = P(B|A) * P(A)

Simple algebra shows that:

P(B|A) = P(A|B) * P(B) / P(A)

This is Bayes' Theorem.
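To make the algebra concrete, here is a minimal Python sketch (the function name and the example values are my own, not from the slides):

    def bayes(p_a_given_b, p_b, p_a):
        """Return P(B|A) = P(A|B) * P(B) / P(A), per Bayes' Theorem."""
        return p_a_given_b * p_b / p_a

    # Hypothetical inputs: P(A|B) = 0.6, P(B) = 0.45, P(A) = 0.435
    print(bayes(0.6, 0.45, 0.435))  # prints 0.6206..., i.e. P(B|A) is about 0.62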

Sequential Decisions
Would you hire a market research group or a consultant (or a psychic) to get more information about the states of nature? How would the additional information cause you to revise your probabilities of the states of nature occurring? Draw a new tree depicting the complete problem.

Problem: Marketing Cellular Phones
The design and product-testing phase has just been completed for Sonorola's new line of cellular phones. Three alternatives are being considered for a marketing/production strategy for this product:
1. Aggressive (A)
2. Basic (B)
3. Cautious (C)
Management decides to categorize the level of demand as either strong (S) or weak (W).

Here, we reproduce the last slide of the Sonorola problem from lecture slides Part 2. Of the three expected values, choose 12.85, the branch associated with the Basic strategy. This decision is indicated in TreePlan by the number 2 in the decision node.

Marketing Department
The marketing department reports on the state of the market. The report will be either Encouraging (E) or Discouraging (D).

Find the Conditional Probability Based on the Prior Track Record
First, find out the reliability of the source of information (in this case, the marketing research group). For two events A and B, the conditional probability P(A|B) is the probability of event A given that event B occurs. For example, P(E|S) is the conditional probability that marketing gives an encouraging report given that the market is in fact going to be strong.
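Formally, for P(B) > 0, the conditional probability is defined as:

    P(A|B) = P(A∩B) / P(B)

This is the definitional form that Bayes' Theorem rearranges.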

If marketing were perfectly reliable, P(E|S) = 1. However, marketing has the following "track record" in predicting the market:

P(E|S) = 0.6
P(D|S) = 1 - P(E|S) = 0.4
P(D|W) = 0.7
P(E|W) = 1 - P(D|W) = 0.3

Here is the same information displayed in tabular form (each row sums to 1):

                 Encouraging (E)   Discouraging (D)
Strong (S)            0.6                0.4
Weak (W)              0.3                0.7

Calculating the Posterior Probabilities
Suppose that marketing has come back with an encouraging report. Knowing this, what is the probability that the market is in fact strong, P(S|E)? Note that probabilities such as P(S) and P(W) are initial estimates called prior probabilities, while conditional probabilities such as P(S|E) are called posterior probabilities. Management has already estimated the prior probabilities as P(S) = 0.45 and P(W) = 0.55. Now, use Bayes' Theorem (see the appendix for a formal description) to determine the posterior probabilities.
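Working through the numbers (direct arithmetic from the values above):

    P(S|E) = P(E|S) * P(S) / P(E)
           = P(E|S) * P(S) / [P(E|S) * P(S) + P(E|W) * P(W)]
           = (0.6)(0.45) / [(0.6)(0.45) + (0.3)(0.55)]
           = 0.27 / 0.435
           ≈ 0.621

So an encouraging report raises the probability of a strong market from the prior 0.45 to roughly 0.62.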

[Spreadsheet slide: the conditional probabilities P(E|S), P(E|W), P(D|S), and P(D|W) are entered in a table. Each joint probability is computed as the conditional probability times the prior (e.g., =B3*B$8), the marginal probabilities are computed by summing the joints (e.g., =SUM(B12:C12) and =SUM(B12:B13)), and each posterior probability is the joint divided by the marginal (e.g., =B12/$D12).]
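The same calculation can be sketched in Python (a minimal sketch of the spreadsheet logic; the variable names are my own):

    # Prior probabilities of the states of nature
    prior = {"S": 0.45, "W": 0.55}
    # Track record: P(report | state)
    likelihood = {("E", "S"): 0.6, ("E", "W"): 0.3,
                  ("D", "S"): 0.4, ("D", "W"): 0.7}

    for report in ("E", "D"):
        # Joint probabilities: P(report and state) = P(report | state) * P(state)
        joint = {s: likelihood[(report, s)] * prior[s] for s in ("S", "W")}
        # Marginal probability of the report (law of total probability)
        marginal = sum(joint.values())
        # Posterior probabilities: P(state | report) = joint / marginal
        for s in ("S", "W"):
            print(f"P({s}|{report}) = {joint[s] / marginal:.4f}")

This prints P(S|E) = 0.6207, P(W|E) = 0.3793, P(S|D) = 0.3186, and P(W|D) = 0.6814, which are the revised probabilities that would be attached to the chance nodes of the new decision tree.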

Appendix: Bayes' Theorem
Bayes' theorem is a result in probability theory which gives the conditional probability distribution of a random variable A given B in terms of the conditional probability distribution of variable B given A and the marginal probability distribution of A alone. In the context of Bayesian probability theory and statistical inference, the marginal probability distribution of A alone is usually called the prior probability distribution, or simply the prior. The conditional distribution of A given the "data" B is called the posterior probability distribution, or just the posterior.
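In symbols (a standard statement of the theorem, matching the description above):

    P(A|B) = P(B|A) * P(A) / P(B)

Here P(A) is the prior, P(B|A) is the probability of the data given A, P(B) is the marginal probability of the data, and P(A|B) is the posterior.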