Bayes Rule for probability

A generalization of Bayes Rule

Let A1, A2, …, Ak denote a set of events such that S = A1 ∪ A2 ∪ … ∪ Ak and Ai ∩ Aj = ∅ for all i ≠ j. Then for each i

P[Ai | B] = P[B | Ai] P[Ai] / ( P[B | A1] P[A1] + P[B | A2] P[A2] + … + P[B | Ak] P[Ak] ).

Example: We have three urns. Urn 1 contains 14 red balls and 12 black balls. Urn 2 contains 6 red balls and 20 black balls. Urn 3 contains 3 red balls and 23 black balls. An urn is selected at random and a ball is selected from that urn. If the ball turns out to be red, what is the probability that it came from the first urn? The second urn? The third urn?

Solution: Note that the desired conditional probability is in the reverse direction of the given conditional probabilities. This is the case where Bayes rule should be used. Let Ai = the event that we select urn i, and let B = the event that we select a red ball.

Bayes rule states

P[Ai | B] = P[B | Ai] P[Ai] / ( P[B | A1] P[A1] + P[B | A2] P[A2] + P[B | A3] P[A3] ).

Here P[A1] = P[A2] = P[A3] = 1/3, while P[B | A1] = 14/26, P[B | A2] = 6/26 and P[B | A3] = 3/26. Hence

P[A1 | B] = 14/23 ≈ 0.609, P[A2 | B] = 6/23 ≈ 0.261, P[A3 | B] = 3/23 ≈ 0.130.
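The same calculation can be written in a few lines of Python. This is a minimal sketch of the computation above, using only the numbers given in the example; the variable names are my own, not part of the slides.

```python
# A minimal sketch of the urn calculation above.
priors = [1/3, 1/3, 1/3]            # P[A_i]: each urn is equally likely
likelihoods = [14/26, 6/26, 3/26]   # P[B|A_i]: chance of drawing a red ball from urn i

# Law of total probability: P[B]
p_red = sum(p * l for p, l in zip(priors, likelihoods))

# Bayes rule: P[A_i|B] for each urn
posteriors = [p * l / p_red for p, l in zip(priors, likelihoods)]
print(posteriors)   # approximately [0.609, 0.261, 0.130], i.e. 14/23, 6/23, 3/23
```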

Example: Suppose that an electronic device is manufactured by a company. During a period of a week:
– 15% of this product is manufactured on Monday,
– 23% on Tuesday,
– 26% on Wednesday,
– 24% on Thursday and
– 12% on Friday.

Also, during a period of a week:
– 5% of the product manufactured on Monday is defective,
– 3% of the product manufactured on Tuesday is defective,
– 1% of the product manufactured on Wednesday is defective,
– 2% of the product manufactured on Thursday is defective and
– 6% of the product manufactured on Friday is defective.
If an electronic device manufactured by this plant turns out to be defective, what is the probability that it was manufactured on Monday, Tuesday, Wednesday, Thursday or Friday?

Solution: Let
A1 = the event that the product is manufactured on Monday,
A2 = the event that the product is manufactured on Tuesday,
A3 = the event that the product is manufactured on Wednesday,
A4 = the event that the product is manufactured on Thursday,
A5 = the event that the product is manufactured on Friday, and
B = the event that the product is defective.

Now P[A1] = 0.15, P[A2] = 0.23, P[A3] = 0.26, P[A4] = 0.24 and P[A5] = 0.12. Also P[B|A1] = 0.05, P[B|A2] = 0.03, P[B|A3] = 0.01, P[B|A4] = 0.02 and P[B|A5] = 0.06. We want to find P[A1|B], P[A2|B], P[A3|B], P[A4|B] and P[A5|B]. We will apply Bayes Rule.

i       P[Ai]    P[B|Ai]    P[Ai]P[B|Ai]    P[Ai|B]
1       0.15     0.05       0.0075          0.2586
2       0.23     0.03       0.0069          0.2379
3       0.26     0.01       0.0026          0.0897
4       0.24     0.02       0.0048          0.1655
5       0.12     0.06       0.0072          0.2483
Total                       0.0290          1.0000
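A quick way to reproduce the table is shown in the sketch below; the day labels and variable names are hypothetical, only the probabilities come from the example.

```python
# Check of the table above: P[A_i]P[B|A_i] and the posteriors P[A_i|B].
days = ["Mon", "Tue", "Wed", "Thu", "Fri"]
p_day = [0.15, 0.23, 0.26, 0.24, 0.12]             # P[A_i]
p_def_given_day = [0.05, 0.03, 0.01, 0.02, 0.06]   # P[B|A_i]

joint = [p * q for p, q in zip(p_day, p_def_given_day)]   # P[A_i] P[B|A_i]
p_def = sum(joint)                                        # P[B] = 0.029

for day, j in zip(days, joint):
    print(f"P[made on {day} | defective] = {j / p_def:.4f}")
```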

The sure thing principle and Simpson’s paradox

The sure thing principle. Suppose P[A | C] ≥ P[B | C] and P[A | C̄] ≥ P[B | C̄]. Then P[A] ≥ P[B].

Example to illustrate: Let A = the event that horse A wins the race, B = the event that horse B wins the race, C = the event that the track is dry, and C̄ = the event that the track is muddy.

Proof: P[A] = P[A | C] P[C] + P[A | C̄] P[C̄] ≥ P[B | C] P[C] + P[B | C̄] P[C̄] = P[B].

Simpson’s Paradox. Example to illustrate: D = death due to lung cancer, S = smoker, C = lives in the city, C̄ = lives in the country. Does P[D | S ∩ C] > P[D | S̄ ∩ C] and P[D | S ∩ C̄] > P[D | S̄ ∩ C̄] (smokers fare worse than non-smokers both in the city and in the country) imply P[D | S] > P[D | S̄] (smokers fare worse than non-smokers overall)?

If we let A denote the “event” D | S and B the “event” D | S̄, then the statement would follow from the Sure Thing Principle. This logic is incorrect: the “events” D | S and D | S̄ are not defined and do not make sense; only the conditional probabilities P[D | S] and P[D | S̄] are defined.

Solution:

P[D | S] = P[D | S ∩ C] P[C | S] + P[D | S ∩ C̄] P[C̄ | S]

and similarly

P[D | S̄] = P[D | S̄ ∩ C] P[C | S̄] + P[D | S̄ ∩ C̄] P[C̄ | S̄].

Whether P[D | S] is greater than P[D | S̄] therefore depends not only on the conditional probabilities P[D | S ∩ C], P[D | S̄ ∩ C], P[D | S ∩ C̄] and P[D | S̄ ∩ C̄], but also on the values of P[C | S] and P[C | S̄]. So the answer is no: the two inequalities within the city and within the country do not by themselves imply P[D | S] > P[D | S̄].
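A small numerical illustration may help. The numbers below are my own, chosen only to show that the reversal can occur; they are not taken from the slides.

```python
# Illustrative (made-up) numbers: within each location smokers have the higher
# death rate, yet the overall rate is higher for non-smokers, because most
# smokers in this example live in the country, where the baseline risk is low.
p_death = {
    ("smoker", "city"): 0.10, ("non-smoker", "city"): 0.05,
    ("smoker", "country"): 0.02, ("non-smoker", "country"): 0.01,
}
p_city = {"smoker": 0.1, "non-smoker": 0.9}   # P[C|S] and P[C|S-bar]

for group in ("smoker", "non-smoker"):
    overall = (p_death[(group, "city")] * p_city[group]
               + p_death[(group, "country")] * (1 - p_city[group]))
    print(f"P[D | {group}] = {overall:.3f}")
# smoker: 0.028, non-smoker: 0.046 -> the inequality reverses overall
```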

The Monty Hall Problem. Behind one of three doors there is a valuable prize. Behind the other two doors is a worthless prize. You are asked to pick one of the doors. After you have selected, Monty Hall opens one of the remaining doors and reveals a worthless prize. He then asks you whether you want to switch your choice.

1. Should you change your choice? 2. Should you keep your first choice? 3. Or does it not matter?

Solution: Suppose your choice is door #1, and Monty reveals that door #3 has a worthless prize behind it. (We can always renumber the doors so that this is the case.) Let Ai = the event that the valuable prize is behind door number i, i = 1, 2, 3. Then P[A1] = P[A2] = P[A3] = 1/3, S = A1 ∪ A2 ∪ A3 and Ai ∩ Aj = ∅ for i ≠ j.

Another solution (the correct solution). The probability that you pick the correct door is 1/3. If you pick the correct door, Monty picks randomly between the two worthless doors. If you did not pick the correct door, Monty opens the single remaining worthless door with probability 1. Let Bi = the event that Monty opens door i, i = 1, 2, 3. Again P[A1] = P[A2] = P[A3] = 1/3, S = A1 ∪ A2 ∪ A3 and Ai ∩ Aj = ∅ for i ≠ j.

Also P[B3 | A1] = 1/2, P[B3 | A2] = 1 and P[B3 | A3] = 0. We want to compute P[A1 | B3] and P[A2 | B3]. By Bayes rule,

P[A1 | B3] = (1/2)(1/3) / ( (1/2)(1/3) + (1)(1/3) + (0)(1/3) ) = (1/6)/(1/2) = 1/3,

and

P[A2 | B3] = (1)(1/3) / (1/2) = 2/3.

So you should switch: keeping your first choice wins with probability 1/3, while switching wins with probability 2/3.
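The 1/3 versus 2/3 answer is easy to confirm by simulation. The sketch below is my own Monte Carlo check, not part of the original slides.

```python
import random

def monty_hall(switch, trials=100_000):
    """Estimate the probability of winning when we always switch (or always stay)."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)
        choice = random.randrange(3)
        # Monty opens a door that is neither the chosen door nor the prize door
        opened = next(d for d in range(3) if d != choice and d != prize)
        if switch:
            # switch to the one remaining unopened door
            choice = next(d for d in range(3) if d != choice and d != opened)
        wins += (choice == prize)
    return wins / trials

print(monty_hall(switch=False))  # about 1/3
print(monty_hall(switch=True))   # about 2/3
```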

Another Problem. We have three chests, each having 2 drawers. In chest 1 there is a gold coin in each drawer. In chest 2 there is a silver coin in each drawer. In chest 3 there is a gold coin in the top drawer and a silver coin in the bottom drawer.

One of the chests is selected at random. Then a drawer is selected at random. The coin in that drawer turns out to be gold. What is the probability that the coin in the other drawer is also gold? Is it ½?

Solution: Let Ci = the event that we select chest i, i = 1, 2, 3. Then P[C1] = P[C2] = P[C3] = 1/3, S = C1 ∪ C2 ∪ C3 and Ci ∩ Cj = ∅ for i ≠ j.

We want to compute P[C1 | G]. Let D1 = the event that we select the top drawer in the chest, D2 = the event that we select the bottom drawer in the chest, and G = the event that the coin in the selected drawer is gold. Then G = (C1 ∩ D1) ∪ (C1 ∩ D2) ∪ (C3 ∩ D1).

Thus P[G] = 1/6 + 1/6 + 1/6 = 1/2 and P[C1 ∩ G] = P[C1] = 1/3, so

P[C1 | G] = P[C1 ∩ G] / P[G] = (1/3) / (1/2) = 2/3.

Comment: There are 6 drawers and three of those drawers contain gold coins. Of those three drawers, two are in a chest that has a gold coin in the other drawer, so the answer 2/3 makes sense.
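The 2/3 answer can also be checked by simulating the experiment; the sketch below is my own code, using the chest contents described above.

```python
import random

chests = [("gold", "gold"), ("silver", "silver"), ("gold", "silver")]

gold_seen = other_also_gold = 0
for _ in range(100_000):
    chest = random.choice(chests)     # pick a chest at random
    drawer = random.randrange(2)      # pick a drawer at random
    if chest[drawer] == "gold":       # condition on seeing a gold coin
        gold_seen += 1
        other_also_gold += (chest[1 - drawer] == "gold")

print(other_also_gold / gold_seen)    # about 2/3, matching P[C1 | G]
```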

Random Variables: an important concept in probability

A random variable, X, is a numerical quantity whose value is determined by a random experiment. Examples:
1. Two dice are rolled and X is the sum of the two upward faces.
2. A coin is tossed n = 3 times and X is the number of times that a head occurs.
3. We count the number of earthquakes, X, that occur in the San Francisco region from 2000 A.D. to 2050 A.D.
4. Today the TSX composite index is 11,050.00; X is the value of the index in thirty days.

Examples – R.V.’s – continued
5. A point is selected at random from a square whose sides are of length 1. X is the distance of the point from the lower left-hand corner.
6. A chord is selected at random from a circle. X is the length of the chord.

Definition – The probability function, p(x), of a random variable, X. For any random variable, X, and any real number, x, we define

p(x) = P[X = x] = P[{X = x}],

where {X = x} = the set of all outcomes (an event) with X = x.

Definition – The cumulative distribution function, F(x), of a random variable, X. For any random variable, X, and any real number, x, we define

F(x) = P[X ≤ x] = P[{X ≤ x}],

where {X ≤ x} = the set of all outcomes (an event) with X ≤ x.

Examples
1. Two dice are rolled and X is the sum of the two upward faces. The sample space S is shown below with the value of X for each outcome:

(1,1) 2   (1,2) 3   (1,3) 4   (1,4) 5   (1,5) 6   (1,6) 7
(2,1) 3   (2,2) 4   (2,3) 5   (2,4) 6   (2,5) 7   (2,6) 8
(3,1) 4   (3,2) 5   (3,3) 6   (3,4) 7   (3,5) 8   (3,6) 9
(4,1) 5   (4,2) 6   (4,3) 7   (4,4) 8   (4,5) 9   (4,6) 10
(5,1) 6   (5,2) 7   (5,3) 8   (5,4) 9   (5,5) 10  (5,6) 11
(6,1) 7   (6,2) 8   (6,3) 9   (6,4) 10  (6,5) 11  (6,6) 12

[Graph of the probability function p(x): p(2) = 1/36, p(3) = 2/36, p(4) = 3/36, p(5) = 4/36, p(6) = 5/36, p(7) = 6/36, p(8) = 5/36, p(9) = 4/36, p(10) = 3/36, p(11) = 2/36, p(12) = 1/36.]

The cumulative distribution function, F(x). For any random variable, X, and any real number, x, we define F(x) = P[X ≤ x], where {X ≤ x} = the set of all outcomes (an event) with X ≤ x. Note:
{X ≤ x} = ∅ if x < 2, thus F(x) = 0.
{X ≤ x} = {(1,1)} if 2 ≤ x < 3, thus F(x) = 1/36.
{X ≤ x} = {(1,1), (1,2), (2,1)} if 3 ≤ x < 4, thus F(x) = 3/36.

Continuing in this way we find that F(x) is a step function: it jumps by p(k) at each value k = 2, 3, …, 12 and equals 1 for x ≥ 12.
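For a discrete random variable like this, p(x) and F(x) can be tabulated by enumerating the sample space. The following is a short sketch of that enumeration (my own code, not from the slides).

```python
from itertools import product
from fractions import Fraction

# X = sum of two dice; enumerate the 36 equally likely outcomes
outcomes = list(product(range(1, 7), repeat=2))
p = {s: Fraction(sum(1 for a, b in outcomes if a + b == s), 36) for s in range(2, 13)}

# F(x) on each interval [s, s+1) is the running total of p up to s
F, cum = {}, Fraction(0)
for s in range(2, 13):
    cum += p[s]
    F[s] = cum

print(p[7], F[4])   # 1/6 and 1/6 (i.e. 6/36 and 6/36)
```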

2. A coin is tossed n = 3 times and X is the number of times that a head occurs. The sample space is S = {HHH (3), HHT (2), HTH (2), THH (2), HTT (1), THT (1), TTH (1), TTT (0)}; the value of X for each outcome is shown in brackets.

[Graph of the probability function p(x): p(0) = 1/8, p(1) = 3/8, p(2) = 3/8, p(3) = 1/8.]

[Graph of the cumulative distribution function F(x): F(x) = 0 for x < 0, 1/8 for 0 ≤ x < 1, 4/8 for 1 ≤ x < 2, 7/8 for 2 ≤ x < 3, and 1 for x ≥ 3.]
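These values also follow directly from the binomial formula p(x) = C(n, x) / 2^n for a fair coin. A short check, assuming (as in the example) that the coin is fair:

```python
from math import comb

n = 3   # number of tosses of a fair coin
p = {x: comb(n, x) / 2**n for x in range(n + 1)}                # probability function
F = {x: sum(p[k] for k in range(x + 1)) for x in range(n + 1)}  # cumulative distribution

print(p)   # {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}
print(F)   # {0: 0.125, 1: 0.5, 2: 0.875, 3: 1.0}
```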

Examples – R.V.’s – continued
5. A point is selected at random from a square whose sides are of length 1. X is the distance of the point from the lower left-hand corner.
6. A chord is selected at random from a circle. X is the length of the chord.

Examples – R.V.’s – continued
5. A point is selected at random from a square whose sides are of length 1. X is the distance of the point from the lower left-hand corner. An event, E, is any subset of the square, S, and P[E] = (area of E)/(area of S) = area of E.

The probability function: for any single value x, the event {X = x} is an arc of a circle, which has area 0. Thus p(x) = 0 for all values of x. The probability function for this example is not very informative.

The cumulative distribution function: F(x) = P[X ≤ x] = the area of the part of the square S lying within distance x of the lower left-hand corner. [Figure: a quarter circle of radius x intersected with the square S.]


The probability density function, f(x), of a continuous random variable. Suppose that X is a random variable. Let f(x) denote a function defined for -∞ < x < ∞ with the following properties:
1. f(x) ≥ 0,
2. the total area under f(x) is 1, that is, ∫ f(x) dx over (-∞, ∞) = 1, and
3. P[a ≤ X ≤ b] = ∫ f(x) dx over [a, b], the area under f(x) between a and b.
Then f(x) is called the probability density function of X. The random variable X is called continuous.

Probability density function, f(x)

Cumulative distribution function, F(x)

Thus if X is a continuous random variable with probability density function f(x), then the cumulative distribution function of X is given by

F(x) = P[X ≤ x] = ∫ f(t) dt over (-∞, x].

Also F′(x) = f(x), because of the fundamental theorem of calculus.
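As a numerical illustration of these two relationships, the sketch below uses a simple density of my own choosing, f(x) = 2x on [0, 1]; it is not one of the examples in the slides.

```python
def f(x):
    """Density f(x) = 2x on [0, 1], 0 elsewhere (total area under the curve is 1)."""
    return 2 * x if 0 <= x <= 1 else 0.0

def F(x, steps=10_000):
    """F(x) = integral of f from -infinity to x, via a midpoint Riemann sum."""
    hi = min(max(x, 0.0), 1.0)
    if hi <= 0.0:
        return 0.0
    h = hi / steps
    return sum(f((i + 0.5) * h) for i in range(steps)) * h

x = 0.6
print(F(x))                            # about x**2 = 0.36
print((F(x + 1e-4) - F(x)) / 1e-4)     # about f(x) = 1.2, illustrating F'(x) = f(x)
```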

Example: A point is selected at random from a square whose sides are of length 1. X is the distance of the point from the lower left-hand corner.

Now, for 0 ≤ x ≤ 1, the event {X ≤ x} is a quarter circle of radius x lying entirely inside the square, so

F(x) = P[X ≤ x] = (area of the quarter circle) = πx²/4.

Also, for 1 ≤ x ≤ √2, part of the quarter circle of radius x lies outside the square. Subtracting the two circular segments that fall outside the square gives

F(x) = πx²/4 − x² arccos(1/x) + √(x² − 1).

Now F(x) = 0 for x < 0 and F(x) = 1 for x ≥ √2, and the two expressions above agree at x = 1, where both equal π/4.

Finally, differentiating F(x) gives the probability density function:

f(x) = πx/2 for 0 < x < 1,
f(x) = πx/2 − 2x arccos(1/x) for 1 < x < √2,
f(x) = 0 otherwise.
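A Monte Carlo check of these formulas is sketched below; the simulation code is mine and simply samples points uniformly from the unit square, comparing the empirical distribution of X with the analytic F(x).

```python
import math
import random

def F(x):
    """Analytic CDF of the distance from a uniform point in the unit square to the corner."""
    if x <= 0:
        return 0.0
    if x <= 1:
        return math.pi * x**2 / 4
    if x <= math.sqrt(2):
        return math.pi * x**2 / 4 - x**2 * math.acos(1 / x) + math.sqrt(x**2 - 1)
    return 1.0

n = 200_000
for x in (0.5, 1.0, 1.2):
    hits = sum(math.hypot(random.random(), random.random()) <= x for _ in range(n))
    print(x, hits / n, F(x))   # empirical and analytic values should agree closely
```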

Graph of f(x): it increases linearly from 0 to π/2 on [0, 1] and then decreases to 0 at x = √2.