L09: Car Buyer Example
Joe is going to buy a used car, which could be good with probability 0.8 or a lemon with probability 0.2. Joe's profit will be $60 if the car is good and -$100 if it is a lemon. Before buying the car, he has the option of having one test or two tests done on it. The first test costs $9, and both together cost $13. The first test has a 90% chance of returning positive if the car is good, and a 40% chance if it is a lemon. If the first test returns positive, the second test has an 88.89% chance of returning positive if the car is good, and a 33.33% chance if it is a lemon. If the first test returns negative, the second test has a 100% chance of returning positive if the car is good, and a 44.44% chance if it is a lemon. The problem is to make two decisions: whether to do the tests, and whether to buy the car.
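The numbers above can be written down directly and used to compute the expected profit of buying with no test at all; this is a minimal Python sketch, with variable names of my own choosing rather than anything taken from the slides.

```python
# Problem data as given on the slide (illustrative names).
P_GOOD, P_LEMON = 0.8, 0.2            # prior over the car's condition
PROFIT = {"good": 60, "lemon": -100}  # profit from buying, by condition
TEST_COST = {"none": 0, "first": 9, "both": 13}
P_FIRST_POS = {"good": 0.9, "lemon": 0.4}  # P(first test positive | condition)

# Expected profit of buying blind (no test): 0.8*60 + 0.2*(-100) = 28.
ev_buy_no_test = P_GOOD * PROFIT["good"] + P_LEMON * PROFIT["lemon"]
print(ev_buy_no_test)  # 28.0
```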

The Model: Decision Graph
Random nodes: Condition (C): {good, lemon}; First Test (F) and Second Test (S): {not done, positive, negative}.
Decision nodes: Do Tests (T): {none, first, both}; Buy It (B): {buy, don't buy}.
Utility nodes: U: cost of the tests; V: profit from the car.
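A compact way to keep track of these nodes and their domains is a set of plain dictionaries; this is a sketch with hypothetical names, not Netica's own representation of the network.

```python
# Node domains of the decision graph, grouped by node type.
CHANCE = {
    "C": ["good", "lemon"],                     # Condition of the car
    "F": ["not done", "positive", "negative"],  # First Test result
    "S": ["not done", "positive", "negative"],  # Second Test result
}
DECISION = {
    "T": ["none", "first", "both"],             # Do Tests
    "B": ["buy", "don't buy"],                  # Buy It
}
UTILITY = ["U", "V"]  # U: cost of the tests, V: profit from the car
```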

The Model: Decision Graph
Probability distributions for the random nodes: P(C), P(F | C, T), and P(S | F, C, T).
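Filling those distributions in from the numbers in the problem statement gives tables like the following Python sketch; the convention that an undone test reads "not done" with probability 1 is my assumption about how the model is set up.

```python
# P(C): prior over the car's condition.
P_C = {"good": 0.8, "lemon": 0.2}

# P(F = positive | C, T): probability the first test comes back positive.
# If T is "none", F is "not done" with probability 1 (assumed convention).
P_F_POS = {"good": 0.9, "lemon": 0.4}

# P(S = positive | F, C, T = both): probability the second test comes back
# positive, given the first test's outcome and the car's condition.
P_S_POS = {
    ("positive", "good"): 0.8889,
    ("positive", "lemon"): 0.3333,
    ("negative", "good"): 1.0,
    ("negative", "lemon"): 0.4444,
}
```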

The Model: Decision Graph
Utility functions for the utility nodes: U(T), the cost of the tests, and V(C, B), the profit from the car.
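Both utility tables are small enough to write out in full; this is a sketch, and the zero profit for not buying is implied by the slides rather than stated explicitly.

```python
# U(T): cost of the tests (negative utility).
U = {"none": 0, "first": -9, "both": -13}

# V(C, B): profit from the car; not buying yields 0 regardless of condition (assumed).
V = {
    ("good", "buy"): 60,
    ("lemon", "buy"): -100,
    ("good", "don't buy"): 0,
    ("lemon", "don't buy"): 0,
}
```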

Analysis with Netica
Finding optimal policies with Netica: compile the network and check the "table" of each decision node.
Best decision regarding Tests: none.

Best decision regarding Buy:
No test: buy.
First test only: buy if it is positive.
Both tests: buy if both are positive.
Impossible scenarios are crossed out in the table.
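The same policy can be recovered without Netica by brute-force enumeration: for each Tests decision, sum over the car's condition and the test outcomes, pick the Buy action that maximizes expected utility in each observable scenario, and compare. The sketch below uses the assumptions already noted (undone tests read "not done", not buying yields zero profit, rounded second-test figures); it is not how Netica itself computes the policy.

```python
from itertools import product

P_C = {"good": 0.8, "lemon": 0.2}
P_F_POS = {"good": 0.9, "lemon": 0.4}
P_S_POS = {("positive", "good"): 0.8889, ("positive", "lemon"): 0.3333,
           ("negative", "good"): 1.0, ("negative", "lemon"): 0.4444}
U = {"none": 0, "first": -9, "both": -13}          # cost of tests
V = {("good", "buy"): 60, ("lemon", "buy"): -100}  # profit when buying


def p_results(f, s, c, t):
    """P(F=f, S=s | Condition=c, Do Tests=t)."""
    if t == "none":
        return 1.0 if (f, s) == ("not done", "not done") else 0.0
    if f == "not done":
        return 0.0                      # the first test is always run when t != "none"
    pf = P_F_POS[c] if f == "positive" else 1.0 - P_F_POS[c]
    if t == "first":
        return pf if s == "not done" else 0.0
    if s == "not done":
        return 0.0                      # with t == "both" the second test is always run
    ps = P_S_POS[(f, c)] if s == "positive" else 1.0 - P_S_POS[(f, c)]
    return pf * ps


RESULTS = ["not done", "positive", "negative"]
for t in ["none", "first", "both"]:
    eu = 0.0
    for f, s in product(RESULTS, RESULTS):
        joint = {c: P_C[c] * p_results(f, s, c, t) for c in P_C}
        p_fs = sum(joint.values())
        if p_fs == 0.0:
            continue                    # impossible scenario ("crossed out" in Netica)
        post = {c: joint[c] / p_fs for c in P_C}
        ev_buy = sum(post[c] * V[(c, "buy")] for c in P_C) + U[t]
        ev_skip = U[t]                  # don't buy: only the test cost is incurred
        action, best = max([("buy", ev_buy), ("don't buy", ev_skip)],
                           key=lambda a: a[1])
        print(f"  T={t}, F={f}, S={s}: best Buy = {action} (EU = {best:.1f})")
        eu += p_fs * best
    print(f"Expected utility of T={t}: {eu:.2f}")
# Roughly: none -> 28.00, first -> 26.20, both -> 22.73, so the best Tests decision
# is none, and the Buy policy matches the slide (buy only when every test done is positive).
```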

Analysis with Netica
Netica can also compute expected values for different scenarios: just set the node values and let Netica do the rest.
T = none: B = buy has expected value 28; B = don't buy has expected value 0.

Analysis with Netica
T = first:
F = positive (probability 0.8): B = buy has expected value 35 (best second decision); B = don't buy has expected value -9.
F = negative (probability 0.2): B = buy has expected value -45; B = don't buy has expected value -9 (best second decision).
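These numbers follow from Bayes' rule on the first test alone; the short check below is a sketch using the figures given on the earlier slides.

```python
# Prior, first-test likelihoods, payoffs and test cost from the problem statement.
P_GOOD, P_LEMON = 0.8, 0.2
P_POS = {"good": 0.9, "lemon": 0.4}   # P(F = positive | C)
PROFIT = {"good": 60, "lemon": -100}
COST_FIRST = 9

# P(F = positive) = 0.9*0.8 + 0.4*0.2 = 0.8
p_pos = P_POS["good"] * P_GOOD + P_POS["lemon"] * P_LEMON

# Posterior P(good | F) and expected value of buying after each outcome.
post_good_pos = P_POS["good"] * P_GOOD / p_pos               # 0.9
post_good_neg = (1 - P_POS["good"]) * P_GOOD / (1 - p_pos)   # 0.4

ev_buy_pos = (post_good_pos * PROFIT["good"]
              + (1 - post_good_pos) * PROFIT["lemon"] - COST_FIRST)  # 35
ev_buy_neg = (post_good_neg * PROFIT["good"]
              + (1 - post_good_neg) * PROFIT["lemon"] - COST_FIRST)  # -45
ev_dont_buy = -COST_FIRST                                            # -9
print(p_pos, ev_buy_pos, ev_buy_neg, ev_dont_buy)
```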