ENGG 2040C: Probability Models and Applications Andrej Bogdanov Spring 2014 7. Properties of expectation

Calculating expectation

1. From the definition: E[X] = ∑_x x P(X = x)
2. Using linearity of expectation: E[X_1 + … + X_n] = E[X_1] + … + E[X_n]
3. Expectation of derived random variables: E[g(X, Y)] = ∑_{x, y} g(x, y) P(X = x, Y = y)

Runs

You toss a coin 10 times. What is the expected number of runs R with at least 3 heads?

Examples:
HHHHTHHHTH  R = 2
HHHHHHHHHH  R = 1
HHHTTTTTTT  R = 1

Which method to use? 1. Definition, 2. Linearity of expectation, 3. Derived random variables.

Runs

Solution: R = I_1 + I_2 + … + I_8, where I_i is an indicator that a run with at least 3 heads starts at position i.

In the example HHHHTHHHTH, I_1 and I_6 equal 1, and all others equal 0.

E[R] = E[I_1] + E[I_2] + … + E[I_8]

Runs

E[I_1] = P(I_1 = 1) = P(run of ≥ 3 Hs starts at position 1) = P(first three tosses are HHH) = 1/8

E[I_2] = P(I_2 = 1) = P(run of ≥ 3 Hs starts at position 2) = P(first four tosses are THHH) = 1/16

Runs

By the same reasoning, E[I_3] = E[I_4] = … = E[I_8] = 1/16, so

E[R] = E[I_1] + E[I_2] + … + E[I_8] = 1/8 + 7 × 1/16 = 9/16.
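
As a sanity check, here is a minimal Monte Carlo sketch of the 9/16 answer, assuming a fair coin (the helper runs_at_least_3 is an illustrative name, not from the lecture):

```python
import random

def runs_at_least_3(tosses):
    """Count maximal runs of at least 3 consecutive 'H' in a toss sequence."""
    count, streak = 0, 0
    for t in tosses:
        streak = streak + 1 if t == 'H' else 0
        if streak == 3:  # each run of length >= 3 passes through length 3 exactly once
            count += 1
    return count

trials = 200_000
total = sum(runs_at_least_3(random.choices('HT', k=10)) for _ in range(trials))
print(total / trials)  # should be close to 9/16 = 0.5625
```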

Problem for you to solve

You toss a coin 10 times. What is the expected number of runs R with exactly 3 heads?

Examples:
HHHHTHHHTH  R = 1
HHHHHHHHHH  R = 0

Two cars on a road

Two cars are at random positions along a 1-mile long road. Find the expected distance D between them.

Which method to use? 1. Definition, 2. Linearity of expectation, 3. Derived random variables.

Two cars on a road

Probability model: the car positions X, Y are independent Uniform(0, 1), and the distance between them is D = |Y – X|.

E[D] = ∫_0^1 ∫_0^1 |y – x| dy dx

For fixed x, ∫_0^1 |y – x| dy = ∫_0^x (x – y) dy + ∫_x^1 (y – x) dy = x²/2 + (1 – x)²/2, so

E[D] = ∫_0^1 (x²/2 + (1 – x)²/2) dx = 1/3.
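
A two-line simulation makes the 1/3 answer easy to believe; a sketch assuming Uniform(0, 1) positions exactly as in the model above:

```python
import random

trials = 1_000_000
total = sum(abs(random.random() - random.random()) for _ in range(trials))
print(total / trials)  # should be close to 1/3 ≈ 0.3333
```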

Conditional p.m.f.

Let X be a random variable and A be an event. The conditional p.m.f. of X given A is

P(X = x | A) = P(X = x and A) / P(A)

The conditional expectation of X given A is

E[X | A] = ∑_x x P(X = x | A)

Example

You flip 3 coins. What is the expected number of heads X given that there is at least one head (A)?

Solution

p.m.f. of X:
x           0    1    2    3
P(X = x)   1/8  3/8  3/8  1/8

P(A) = 7/8

p.m.f. of X given A:
x              1    2    3
P(X = x | A)  3/7  3/7  1/7

E[X | A] = 1 ∙ 3/7 + 2 ∙ 3/7 + 3 ∙ 1/7 = 12/7
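
Because the 8 outcomes are equally likely, the conditional expectation is just the average head count over the outcomes in A; a brute-force enumeration sketch (the name with_head is illustrative):

```python
from itertools import product

outcomes = list(product('HT', repeat=3))        # all 8 equally likely outcomes
with_head = [o for o in outcomes if 'H' in o]   # the event A: at least one head
e_given_a = sum(o.count('H') for o in with_head) / len(with_head)
print(e_given_a)  # 12/7 ≈ 1.714
```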

Average of conditional expectations

E[X] = E[X|A] P(A) + E[X|A^c] P(A^c)

More generally, if A_1, …, A_n partition S then

E[X] = E[X|A_1] P(A_1) + … + E[X|A_n] P(A_n)

[Diagram: the sample space S partitioned into A_1, …, A_5]

A gambling strategy

You play 10 rounds of roulette. You start with $100 and bet 10% of your cash on red in every round. How much money do you expect to be left with?

Solution

Let X_n be the cash you have after the n-th round. Let W_n be the event of a win in the n-th round.

A gambling strategy

E[X_n] = E[X_n | W_n] P(W_n) + E[X_n | W_n^c] P(W_n^c)

Given W_n, X_n = 1.1 X_{n-1}; given W_n^c, X_n = 0.9 X_{n-1}. A European roulette wheel has 18 red pockets out of 37, so P(W_n) = 18/37 and P(W_n^c) = 19/37. Since X_{n-1} is independent of the n-th spin,

E[X_n] = E[1.1 X_{n-1}] ∙ 18/37 + E[0.9 X_{n-1}] ∙ 19/37 = (1.1 × 18/37 + 0.9 × 19/37) E[X_{n-1}] = 369/370 ∙ E[X_{n-1}].

E[X_10] = 369/370 ∙ E[X_9] = (369/370)² E[X_8] = … = (369/370)^10 E[X_0] ≈ $97.3
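
The recursion and the final dollar figure are easy to check numerically; a sketch under the slide's assumptions (European wheel, 18/37 chance of red each round):

```python
import random

# Exact recursion: E[X_n] = (1.1 * 18/37 + 0.9 * 19/37) * E[X_{n-1}] = (369/370) * E[X_{n-1}]
e = 100.0
for _ in range(10):
    e *= 369 / 370
print(e)  # ≈ 97.33

# Monte Carlo check of the same quantity
trials = 100_000
total = 0.0
for _ in range(trials):
    cash = 100.0
    for _ in range(10):
        cash *= 1.1 if random.random() < 18 / 37 else 0.9
    total += cash
print(total / trials)  # also ≈ 97.33
```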

Example

You flip 3 coins. What is the expected number of heads X given that there is at least one head (A)?

Solution 2

E[X] = E[X | A] P(A) + E[X | A^c] P(A^c)

Here E[X] = 3/2, P(A) = 7/8, P(A^c) = 1/8, and E[X | A^c] = 0 (no heads), so 3/2 = E[X | A] ∙ 7/8, giving

E[X | A] = (3/2)/(7/8) = 12/7.

Geometric random variable

Let X_1, X_2, … be independent Bernoulli(p) trials. A Geometric(p) random variable N is the time of the first success among X_1, X_2, …:

N = first (smallest) n such that X_n = 1.

So P(N = n) = P(X_1 = 0, …, X_{n-1} = 0, X_n = 1) = (1 – p)^{n-1} p. This is the p.m.f. of N.

[Plots: p.m.f.s of Geometric(0.5), Geometric(0.7), and Geometric(0.05)]

Geometric random variable

If N is Geometric(p), its expected value is

E[N] = ∑_n n P(N = n) = ∑_n n (1 – p)^{n-1} p = … = 1/p

Here is a better way:

E[N] = E[N | X_1 = 1] P(X_1 = 1) + E[N | X_1 = 0] P(X_1 = 0)

Given X_1 = 1, N = 1. Given X_1 = 0, the trials start afresh, so N is distributed like 1 + N. Therefore

E[N] = 1 ∙ p + E[1 + N] (1 – p), so E[N] = 1/p.
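
For completeness, the "…" in the series computation is the standard derivative-of-a-geometric-series trick, sketched here:

```latex
\mathbf{E}[N] = \sum_{n \ge 1} n(1-p)^{n-1}p
             = p\,\frac{d}{dq}\Big(\sum_{n \ge 0} q^{n}\Big)\Big|_{q=1-p}
             = p\,\frac{d}{dq}\Big(\frac{1}{1-q}\Big)\Big|_{q=1-p}
             = \frac{p}{(1-(1-p))^{2}} = \frac{1}{p}.
```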


Coupon collection

There are n types of coupons. Every day you get one coupon, of a uniformly random type. On what day do you expect to have collected all the coupon types?

Coupon collection

Solution

Let X be the day on which you collect all the coupon types, and let W_i be the number of days you wait between collecting the (i – 1)st distinct type and the i-th distinct type. Then

X = W_1 + W_2 + … + W_n

E[X] = E[W_1] + E[W_2] + … + E[W_n]

Coupon collection

Let's calculate E[W_1], E[W_2], …, E[W_n]:

E[W_1] = 1
W_2 is Geometric((n – 1)/n), so E[W_2] = n/(n – 1)
W_3 is Geometric((n – 2)/n), so E[W_3] = n/(n – 2)
…
W_n is Geometric(1/n), so E[W_n] = n

Coupon collection

E[X] = E[W_1] + E[W_2] + … + E[W_n]
     = 1 + n/(n – 1) + n/(n – 2) + … + n
     = n(1 + 1/2 + … + 1/n)
     = n ln n + γn ± 1, where γ ≈ 0.5772 is Euler's constant.

To collect n = 272 coupons, it takes about 1681 days on average.
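
A simulation of the collection process, checked against the exact harmonic-sum formula n ∙ H_n; a minimal sketch (days_to_collect is an illustrative helper):

```python
import random

def days_to_collect(n):
    """Draw one uniformly random coupon type per day until all n types are seen."""
    seen, days = set(), 0
    while len(seen) < n:
        seen.add(random.randrange(n))
        days += 1
    return days

n, trials = 272, 2_000
avg = sum(days_to_collect(n) for _ in range(trials)) / trials
exact = n * sum(1 / k for k in range(1, n + 1))  # n * H_n
print(avg, exact)  # both around 1682, matching the n ln n + γn ≈ 1681 estimate
```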

Review: Calculating expectation

1. From the definition. Always works, but the calculation is sometimes difficult.
2. Using linearity of expectation. Great when the random variable counts the number of events of some type; the events don't have to be independent!
3. Derived random variables. Useful when method 2 fails, e.g. E[|X – Y|].
4. Average of conditional expectations. Very useful for experiments that happen in stages.

Expectation and independence

Random variables X and Y (discrete or continuous) are independent if and only if

E[g(X) h(Y)] = E[g(X)] E[h(Y)]

for all real-valued functions g and h. In particular, E[XY] = E[X] E[Y] for independent X and Y (but not in general).

Variance and covariance

Recall the variance of X is Var[X] = E[(X – E[X])²] = E[X²] – E[X]².

The covariance of X and Y is

Cov[X, Y] = E[(X – E[X])(Y – E[Y])] = E[XY] – E[X] E[Y]

If X = Y, then Cov[X, Y] = Var[X] ≥ 0. If X, Y are independent, then Cov[X, Y] = 0.

Variance of sums

Var[X + Y] = Var[X] + Var[Y] + Cov[X, Y] + Cov[Y, X]

For any X_1, …, X_n:

Var[X_1 + … + X_n] = Var[X_1] + … + Var[X_n] + ∑_{i ≠ j} Cov[X_i, X_j]

When every pair among X_1, …, X_n is independent:

Var[X_1 + … + X_n] = Var[X_1] + … + Var[X_n].

Hats

n people throw their hats in the air. Let N be the number of people that get back their own hat.

Solution

N = I_1 + … + I_n, where I_i is the indicator for the event that person i gets their own hat back. Then

E[I_i] = P(I_i = 1) = 1/n

E[N] = n ∙ 1/n = 1.

Hats

E[I_i] = 1/n, so Var[I_i] = (1 – 1/n) ∙ 1/n.

Cov[I_i, I_j] = E[I_i I_j] – E[I_i] E[I_j]
             = P(I_i = 1, I_j = 1) – P(I_i = 1) P(I_j = 1)
             = 1/(n(n – 1)) – 1/n²
             = 1/(n²(n – 1))

Var[N] = n ⋅ (1 – 1/n) ∙ 1/n + n(n – 1) ⋅ 1/(n²(n – 1)) = (1 – 1/n) + 1/n = 1.
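
A permutation simulation confirms that both the mean and the variance equal 1; a minimal sketch (fixed_points is an illustrative name):

```python
import random
from statistics import mean, pvariance

def fixed_points(n):
    """Number of people who get their own hat back under a random assignment."""
    perm = list(range(n))
    random.shuffle(perm)
    return sum(perm[i] == i for i in range(n))

samples = [fixed_points(20) for _ in range(100_000)]
print(mean(samples), pvariance(samples))  # both should be close to 1
```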

Patterns

A fair coin is tossed n times. Find the expectation and variance of the number N of occurrences of the pattern HH.

Solution

N = I_1 + … + I_{n-1}, where I_i is the indicator for the event that the i-th and (i + 1)st tosses both come up H. Then

E[I_i] = P(I_i = 1) = 1/4

E[N] = (n – 1)/4

Patterns

E[I_i] = 1/4, so Var[I_i] = 3/4 ∙ 1/4 = 3/16.

Cov[I_i, I_j] = E[I_i I_j] – E[I_i] E[I_j] = P(I_i = 1, I_j = 1) – P(I_i = 1) P(I_j = 1)

Cov[I_1, I_2] = P(HHH???????) – (1/4)² = 1/8 – 1/16 = 1/16
Cov[I_1, I_3] = P(HHHH??????) – (1/4)² = 1/16 – 1/16 = 0, because I_1 and I_3 are independent!

So Cov[I_1, I_2] = Cov[I_2, I_3] = … = Cov[I_{n-2}, I_{n-1}] = 1/16, and likewise Cov[I_2, I_1] = Cov[I_3, I_2] = … = Cov[I_{n-1}, I_{n-2}] = 1/16; all other covariances are 0.

Var[N] = (n – 1) ⋅ 3/16 + 2(n – 2) ⋅ 1/16 = (5n – 7)/16.
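
A simulation of the HH count, checked against (n – 1)/4 and (5n – 7)/16; a sketch assuming a fair coin (count_hh is an illustrative helper):

```python
import random
from statistics import mean, pvariance

def count_hh(n):
    """Count positions i where tosses i and i+1 are both heads."""
    t = random.choices('HT', k=n)
    return sum(t[i] == 'H' and t[i + 1] == 'H' for i in range(n - 1))

n = 20
samples = [count_hh(n) for _ in range(100_000)]
print(mean(samples), (n - 1) / 4)            # both ≈ 4.75
print(pvariance(samples), (5 * n - 7) / 16)  # both ≈ 5.8125
```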

Problem for you to solve

8 husband-wife couples are seated at random around a round table. Let N be the number of couples seated together. Find the expected value and the variance of N.