ENGG 2040C: Probability Models and Applications. Andrej Bogdanov, Spring 2013. 4. Random variables part two.


Review. A discrete random variable X assigns a discrete value to every outcome in the sample space. Probability mass function of X: p(x) = P(X = x). Expected value of X: E[X] = ∑_x x p(x). [Plot: the p.m.f. and E[N] for N = number of heads in two coin flips.]

One die. Example from last time: F = face value of a fair 6-sided die. E[F] = 1·(1/6) + 2·(1/6) + … + 6·(1/6) = 3.5.

Two dice. S = sum of face values of two fair 6-sided dice. Solution 1: we calculate the p.m.f. of S:

s:      2     3     4     5     6     7     8     9     10    11    12
p_S(s): 1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36

E[S] = 2·(1/36) + 3·(2/36) + … + 12·(1/36) = 7
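The p.m.f. and the expectation are easy to verify by enumeration; here is a quick Python check (an addition for this transcript, not part of the original slides):

    from fractions import Fraction
    from collections import defaultdict

    # Tally the p.m.f. of S over all 36 equally likely outcomes of two fair dice.
    pmf = defaultdict(Fraction)
    for f1 in range(1, 7):
        for f2 in range(1, 7):
            pmf[f1 + f2] += Fraction(1, 36)

    print(dict(pmf))                           # {2: Fraction(1, 36), ..., 7: Fraction(1, 6), ...}
    print(sum(s * p for s, p in pmf.items()))  # 7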

Two dice again. S = sum of face values of two fair 6-sided dice. Let F1 = outcome of the first die and F2 = outcome of the second die, so that S = F1 + F2.

Sum of random variables. Let X, Y be two random variables. X assigns value X(ω) to outcome ω, and Y assigns value Y(ω) to outcome ω. X + Y is the random variable that assigns value X(ω) + Y(ω) to outcome ω.

Sum of random variables. [Table: the 36 outcomes of the two dice with the corresponding values of F1, F2, and S = F1 + F2.]

Linearity of expectation. For every two random variables X and Y, E[X + Y] = E[X] + E[Y].
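Note that no independence between X and Y is required. A small sketch (added here, not from the slides) checking linearity exactly for two deliberately dependent variables, a die F and the indicator G that F is even:

    from fractions import Fraction

    # F = face of a fair die; G = 1 if F is even, else 0, so G depends on F.
    faces = range(1, 7)
    p = Fraction(1, 6)

    E_F  = sum(f * p for f in faces)                    # 7/2
    E_G  = sum((f % 2 == 0) * p for f in faces)         # 1/2
    E_FG = sum((f + (f % 2 == 0)) * p for f in faces)   # E[F + G] = 4
    print(E_F + E_G == E_FG)                            # True: linearity despite dependence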

Two dice again. S = sum of face values of two fair 6-sided dice, S = F1 + F2. Solution 2: E[S] = E[F1] + E[F2] = 3.5 + 3.5 = 7.

Balls. We draw 3 balls without replacement from an urn containing nine balls: four labeled −1, two labeled 0, and three labeled 1. What is the expected sum of the values on the 3 balls?

Balls. S = B1 + B2 + B3, where Bi is the value of the i-th ball drawn. E[S] = E[B1] + E[B2] + E[B3]. By symmetry, each draw is equally likely to produce any of the nine balls, so B1, B2, B3 all have the same p.m.f. even though the draws are made without replacement:

x:     −1    0    1
p(x):  4/9  2/9  3/9

E[B1] = −1·(4/9) + 0·(2/9) + 1·(3/9) = −1/9, and the same for B2 and B3. Therefore E[S] = 3·(−1/9) = −1/3.
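Since there are only C(9, 3) = 84 equally likely draws, the answer can also be confirmed by brute force (an added check, not from the slides):

    from fractions import Fraction
    from itertools import combinations

    # The urn: four balls labeled -1, two labeled 0, three labeled 1.
    urn = [-1] * 4 + [0] * 2 + [1] * 3

    # Without replacement, all C(9, 3) = 84 unordered draws are equally likely.
    draws = list(combinations(urn, 3))
    print(Fraction(sum(sum(d) for d in draws), len(draws)))  # -1/3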

Three dice. N = number of dice that show a particular face, say 1. Find E[N]. Solution: let Ik = 1 if the face value of the k-th die equals 1, and 0 if not, so that N = I1 + I2 + I3. E[I1] = 1·(1/6) + 0·(5/6) = 1/6, and likewise E[I2] = E[I3] = 1/6. So E[N] = E[I1] + E[I2] + E[I3] = 3·(1/6) = 1/2.
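The same answer falls out of enumerating all 6^3 = 216 outcomes (an added check):

    from fractions import Fraction
    from itertools import product

    # N = number of dice, out of three, showing the chosen face (here: 1).
    outcomes = list(product(range(1, 7), repeat=3))
    print(Fraction(sum(dice.count(1) for dice in outcomes), len(outcomes)))  # 1/2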

Problem for you to solve. Five balls are chosen from an urn with 8 blue balls and 10 red balls. What is the expected number of blue balls chosen, (a) if the balls are chosen without replacement? (b) if the balls are chosen with replacement?

The indicator (Bernoulli) random variable. Perform a trial that succeeds with probability p and fails with probability 1 − p. The indicator X equals 1 if the trial succeeds and 0 if it fails:

x:     0       1
p(x):  1 − p   p

[Plots: the p.m.f. for p = 0.5 and for p = 0.4.] If X is Bernoulli(p), then E[X] = p.

The binomial random variable. Binomial(n, p): perform n independent trials, each of which succeeds with probability p, and let X = number of successes. Examples: toss n coins; the number of heads is Binomial(n, ½). Toss n dice; the number of dice showing a particular face, say 1, is Binomial(n, 1/6).

A less obvious example. Toss n coins. Let C be the number of consecutive changes (HT or TH). Examples:

ω          C(ω)
HTHHHHT    3
THHHHHT    2
HHHHHHH    0

Then C is Binomial(n − 1, ½): each of the n − 1 adjacent pairs of tosses is a change with probability ½, independently of the other pairs.
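The claim can be verified by enumerating all 2^n coin strings and comparing with the Binomial(n − 1, ½) p.m.f.; a sketch (added, not from the slides):

    from collections import Counter
    from fractions import Fraction
    from itertools import product
    from math import comb

    n = 7  # same length as the slide's examples

    # For every length-n string, count the adjacent positions where the coin changes.
    counts = Counter(
        sum(a != b for a, b in zip(s, s[1:]))
        for s in product("HT", repeat=n)
    )

    for k in range(n):
        assert Fraction(counts[k], 2 ** n) == Fraction(comb(n - 1, k), 2 ** (n - 1))
    print("C has the Binomial(n - 1, 1/2) p.m.f.")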

A non-example. Draw 10 cards from a 52-card deck, and let N = the number of aces among the drawn cards. Is N a Binomial(10, 1/13) random variable? No! Different trial outcomes are not independent: drawing an ace early makes aces less likely later.

Properties of binomial random variables. If X is Binomial(n, p), its p.m.f. is

p(k) = P(X = k) = C(n, k) p^k (1 − p)^(n − k)

We can write X = I1 + … + In, where Ii is an indicator random variable for the success of the i-th trial. Then E[X] = E[I1] + … + E[In] = p + … + p = np.
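Both the p.m.f. formula and E[X] = np are easy to check numerically (an added sketch):

    from math import comb

    def binomial_pmf(n, p, k):
        """P(X = k) for X ~ Binomial(n, p)."""
        return comb(n, k) * p ** k * (1 - p) ** (n - k)

    n, p = 10, 0.3
    print(sum(binomial_pmf(n, p, k) for k in range(n + 1)))      # ~1.0 (p.m.f. sums to 1)
    print(sum(k * binomial_pmf(n, p, k) for k in range(n + 1)))  # ~3.0 = n * p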

Probability mass function. [Plots: p.m.f.s of Binomial(10, 0.5), Binomial(50, 0.5), Binomial(10, 0.3), and Binomial(50, 0.3).]

Investments. You have two investment choices: A: put $25 in one stock. B: put $½ in each of 50 unrelated stocks. Which do you prefer?

Investments. Probability model: each stock doubles in value with probability ½ and loses all value with probability ½. Different stocks perform independently.

Investments. NA = amount you end up with under choice A; NB = amount under choice B. A: put $25 in one stock, so NA is 50 × Bernoulli(½). B: put $½ in each of 50 stocks, so NB is Binomial(50, ½). E[NA] = 50 · ½ = 25 and E[NB] = 50 · ½ = 25.

Variance and standard deviation. Let μ = E[X] be the expected value of X. The variance of X is the quantity Var[X] = E[(X − μ)²]. The standard deviation of X is σ = √Var[X]. They measure how far X typically is from μ.

Calculating variance. The p.m.f. of NA:

x:     0   50
p(x):  ½   ½

μ = E[NA] = 25. Since (0 − 25)² = (50 − 25)² = 25², the random variable (NA − μ)² takes the value 25² with probability 1:

y:     25²
q(y):  1

Var[NA] = E[(NA − μ)²] = 25², so σ = std. dev. of NA = 25.

Another formula for variance:

Var[X] = E[(X − μ)²]
       = E[X² − 2μX + μ²]
       = E[X²] + E[−2μX] + E[μ²]
       = E[X²] − 2μE[X] + μ²     (using E[cX] = cE[X] and E[c] = c for constant c)
       = E[X²] − 2μ² + μ²        (since E[X] = μ)
       = E[X²] − μ²

Var[X] = E[X²] − E[X]²
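A quick sanity check of the two variance formulas on the single-die example (added, not from the slides):

    from fractions import Fraction

    # F = face value of a fair die; mu = E[F] = 7/2.
    faces = range(1, 7)
    p = Fraction(1, 6)
    mu = sum(f * p for f in faces)

    var_def = sum((f - mu) ** 2 * p for f in faces)    # E[(F - mu)^2]
    var_alt = sum(f * f * p for f in faces) - mu ** 2  # E[F^2] - E[F]^2
    print(var_def, var_alt, var_def == var_alt)        # 35/12 35/12 True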

Variance of binomial random variable. Suppose X is Binomial(n, p). Then X = I1 + … + In, where Ii = 1 if trial i succeeds and 0 if trial i fails. μ = E[X] = np, so Var[X] = E[X²] − μ² = E[X²] − (np)². Now

E[X²] = E[(I1 + … + In)²]
      = E[I1² + … + In² + I1I2 + … + In−1In]
      = E[I1²] + … + E[In²] + E[I1I2] + … + E[In−1In]

Each of the n diagonal terms satisfies E[Ii²] = E[Ii] = p. Each of the n(n − 1) cross terms satisfies E[IiIj] = P(Ii = 1 and Ij = 1) = P(Ii = 1) P(Ij = 1) = p², by independence of the trials. Therefore E[X²] = np + n(n − 1)p², and

Var[X] = np + n(n − 1)p² − (np)²

Variance of binomial random variable. Suppose X is Binomial(n, p), with μ = E[X] = np. Then

Var[X] = np + n(n − 1)p² − (np)² = np − np² = np(1 − p)

and the standard deviation of X is σ = √(np(1 − p)).
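Checking np(1 − p) against a direct computation from the p.m.f. (an added sketch):

    from math import comb

    n, p = 10, 0.3
    pmf = [comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)]

    mean = sum(k * q for k, q in enumerate(pmf))
    var  = sum(k * k * q for k, q in enumerate(pmf)) - mean ** 2
    print(var, n * p * (1 - p))  # both 2.1, up to float rounding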

Investments. NA = amount under choice A (put $25 in one stock): 50 × Bernoulli(½), with σA = 25. NB = amount under choice B (put $½ in each of 50 stocks): Binomial(50, ½), with σB = √(50 · ½ · ½) = 3.536… Both choices have μ = 25, but choice B keeps the outcome much more tightly concentrated in [μ − σ, μ + σ].

Apples. About 10% of the apples on your farm are rotten. You sell 10 apples. How many are rotten? Probability model: the number N of rotten apples you sold is Binomial(n = 10, p = 1/10), so E[N] = np = 1.

Apples. You improve productivity; now only 5% of apples rot. You can now sell 20 apples, and only one will be rotten on average: N is now Binomial(20, 1/20).

[Plots: p.m.f.s of Binomial(10, 1/10) and Binomial(20, 1/20).]

The Poisson random variable. A Poisson(λ) random variable has this p.m.f.:

p(k) = e^(−λ) λ^k / k!,  k = 0, 1, 2, 3, …

Poisson random variables do not occur “naturally” in the sample spaces we have seen. They approximate Binomial(n, p) random variables when λ = np is fixed and n is large (so p is small):

p_Poisson(λ)(k) = lim_(n → ∞) p_Binomial(n, λ/n)(k)
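The convergence is visible numerically; a sketch (added, not from the slides) comparing the two p.m.f.s at a fixed k:

    from math import comb, exp, factorial

    lam, k = 2.8, 3
    for n in (10, 100, 1000, 10000):
        p = lam / n
        print(n, comb(n, k) * p ** k * (1 - p) ** (n - k))  # Binomial(n, lam/n) at k
    print(exp(-lam) * lam ** k / factorial(k))              # Poisson(lam) at k: ~0.2225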

Raindrops. Rain is falling on your head at an average rate of 2.8 drops/second. Divide the second evenly into n intervals of length 1/n, and let Ei be the event “a raindrop hits during interval i.” Assuming E1, …, En are independent, the number N of drops in the second is a Binomial(n, p) random variable. Since E[N] = 2.8 and E[N] = np, p must equal 2.8/n.

Raindrops. The number of drops N is Binomial(n, 2.8/n). As n gets larger, the number of drops within the second “approaches” a Poisson(2.8) random variable. [Plots: the Binomial(n, 2.8/n) p.m.f. for increasing n alongside the Poisson(2.8) p.m.f.]

Expectation and variance of Poisson. If X is Binomial(n, p), then E[X] = np and Var[X] = np(1 − p). When p = λ/n, we get E[X] = λ and Var[X] = λ(1 − λ/n). As n → ∞, E[X] → λ and Var[X] → λ. This suggests: when X is Poisson(λ), E[X] = λ and Var[X] = λ.

Problem for you to solve. Rain falls on you at an average rate of 3 drops/sec. You walk for 30 sec from the MTR to the bus stop. Once 100 drops have hit you, your hair gets wet. What is the probability that your hair got wet?

Problem for you to solve. Solution: On average, 90 drops fall in 30 seconds, so we model the number of drops N you receive as a Poisson(90) random variable. Using an online Poisson calculator or the poissonpmf(n, L) function in 13L07.py, we get

P(N > 100) = 1 − ∑_{i=0}^{100} P(N = i) ≈ 13.49%
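Since the course file 13L07.py is not reproduced here, a self-contained version of the same computation (an added sketch; poisson_pmf plays the role of the course helper, and the sum runs to 100 to match the 13.49% on the slide):

    from math import exp

    def poisson_pmf(k, lam):
        """P(N = k) for N ~ Poisson(lam), computed iteratively to avoid huge factorials."""
        p = exp(-lam)
        for i in range(1, k + 1):
            p *= lam / i
        return p

    lam = 90  # 3 drops/sec * 30 sec
    print(1 - sum(poisson_pmf(i, lam) for i in range(101)))  # ~0.1349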