
1 Incorporating New Information to Decision Trees (posterior probabilities) MGS3100 - Chapter 6 Part 3

2 Here, we reproduce the last slide of the Sonorola problem from lecture slides part 2. Of the three expected values, choose 12.85, the branch associated with the Basic strategy. This decision is indicated in TreePlan by the number 2 in the decision node.

3 Sequential Decisions Would you hire a market research group or a consultant (or a psychic) to get more information about the states of nature? How would additional information cause you to revise your probabilities of the states of nature occurring? Draw a new tree depicting the complete problem.

4 First, find out the reliability of the source of information (in this case, the marketing research group). Find the conditional probability based on the prior track record: for two events A and B, the conditional probability P(A|B) is the probability of event A given that event B occurs. For example, P(E|S) is the conditional probability that marketing gives an encouraging report given that the market is in fact going to be strong.
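In standard notation, this definition reads:

  P(A \mid B) = \frac{P(A \cap B)}{P(B)}, \qquad P(B) > 0.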

5 If marketing were perfectly reliable, P(E|S) = 1. However, marketing has the following “track record” in predicting the market: P(E|S) = 0.6, P(D|S) = 1 - P(E|S) = 0.4, P(D|W) = 0.7, P(E|W) = 1 - P(D|W) = 0.3. Here is the same information displayed in tabular form:

                      E (encouraging)   D (discouraging)
  S (strong market)        0.6                0.4
  W (weak market)          0.3                0.7

6 Calculating the Posterior Probabilities: Suppose that marketing has come back with an encouraging report. Knowing this, what is the probability that the market is in fact strong [P(S|E)]? Note that probabilities such as P(S) and P(W) are initial estimates called prior probabilities. Conditional probabilities such as P(S|E) are called posterior probabilities. The domestic tractor division has already estimated the prior probabilities as P(S) = 0.45 and P(W) = 0.55. Now, use Bayes’ Theorem (see the appendix for a formal description) to determine the posterior probabilities.
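A minimal Python sketch of this calculation, using the priors and the marketing track record from the slides above (the variable and function names are illustrative, not from the original deck):

# Priors and track record are taken from slides 5 and 6; names are illustrative.
prior = {"S": 0.45, "W": 0.55}        # P(S), P(W): prior probabilities
likelihood = {                        # P(report | state): marketing's track record
    ("E", "S"): 0.6, ("D", "S"): 0.4,
    ("E", "W"): 0.3, ("D", "W"): 0.7,
}

def posteriors(report):
    """Return P(state | report) for each state via Bayes' theorem."""
    # Marginal probability of the report: sum of (likelihood * prior) over states
    marginal = sum(likelihood[(report, s)] * prior[s] for s in prior)
    # Posterior: joint probability divided by the marginal
    return {s: likelihood[(report, s)] * prior[s] / marginal for s in prior}

print(posteriors("E"))   # {'S': 0.6207..., 'W': 0.3793...}
print(posteriors("D"))   # {'S': 0.3186..., 'W': 0.6814...}

So an encouraging report raises the probability of a strong market from 0.45 to about 0.62, while a discouraging report lowers it to about 0.32.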

7 [Spreadsheet: posterior probability table for P(E|S), P(E|W), P(D|S), and P(D|W). Each joint probability is a conditional probability times a prior (e.g., =B3*B$8), the marginal report probabilities are sums of the joints (=SUM(B12:B13), =SUM(B12:C12)), and each posterior divides a joint probability by its marginal (=B12/$D12).]
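Working those formulas through with the numbers from slides 5 and 6 gives the following values (computed here; the original spreadsheet image is not reproduced):

  Joint probabilities       S        W      Marginal
  E (encouraging)         0.270    0.165     0.435
  D (discouraging)        0.180    0.385     0.565

  Posteriors: P(S|E) = 0.270/0.435 ≈ 0.621, P(W|E) ≈ 0.379;
              P(S|D) = 0.180/0.565 ≈ 0.319, P(W|D) ≈ 0.681.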

8 Appendix: Bayes’ Theorem Bayes’ theorem is a result in probability theory that gives the conditional probability distribution of a random variable A given B in terms of the conditional probability distribution of variable B given A and the marginal probability distribution of A alone. In the context of Bayesian probability theory and statistical inference, the marginal probability distribution of A alone is usually called the prior probability distribution, or simply the prior. The conditional distribution of A given the “data” B is called the posterior probability distribution, or just the posterior.
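In symbols:

  P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}

so that, for this problem,

  P(S \mid E) = \frac{P(E \mid S)\,P(S)}{P(E \mid S)\,P(S) + P(E \mid W)\,P(W)}.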

