4. Expectation and Variance Joint PMFs

Expected value
The expected value (expectation) of a random variable X with p.m.f. p is E[X] = ∑x x p(x).
Example: N = number of H's.

Expected value
Example: N = number of H's. The expectation is the average value the random variable takes when the experiment is repeated many times.

F = face value of fair 6-sided die
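The definition E[X] = ∑x x p(x) can be checked for the die with a short script (Python is my choice here; the lecture itself gives no code):

```python
from fractions import Fraction

# F = face value of a fair 6-sided die: p(x) = 1/6 for x = 1, ..., 6.
pmf = {face: Fraction(1, 6) for face in range(1, 7)}

# E[F] = sum_x x * p(x)
EF = sum(x * p for x, p in pmf.items())
print(EF)  # 7/2
```

Exact fractions avoid floating-point noise and make the answer E[F] = 7/2 = 3.5 obvious.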

If it appears k times, you win $k. If it doesn't appear, you lose $1.

Utility
Should I go to tutorial? You are called on with probability 5/40 and not called with probability 35/40; the slide compares the expected utilities of the actions Come and Skip (payoffs shown include +5, −20, +100, and −300).


Expectation of a function
p.m.f. of X: p(0) = p(1) = p(2) = 1/3.
E[X] = ?  E[X − 1] = ?  E[(X − 1)²] = ?

Expectation of a function, again
p.m.f. of X: p(0) = p(1) = p(2) = 1/3.
E[f(X)] = ∑x f(x) p(x)
E[X] = ?  E[X − 1] = ?  E[(X − 1)²] = ?
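The three expectations above can be computed directly from the formula E[f(X)] = ∑x f(x) p(x); a small sketch:

```python
from fractions import Fraction

# p.m.f. of X from the slide: P(X = x) = 1/3 for x in {0, 1, 2}.
p = {0: Fraction(1, 3), 1: Fraction(1, 3), 2: Fraction(1, 3)}

def expect(f, pmf):
    """E[f(X)] = sum_x f(x) * p(x)."""
    return sum(f(x) * px for x, px in pmf.items())

EX = expect(lambda x: x, p)                # E[X] = 1
EXm1 = expect(lambda x: x - 1, p)          # E[X - 1] = 0
EXm1sq = expect(lambda x: (x - 1) ** 2, p) # E[(X - 1)^2] = 2/3
print(EX, EXm1, EXm1sq)
```

Note that E[(X − 1)²] = 2/3 is not (E[X − 1])² = 0: expectation of a function is not the function of the expectation.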

[Slide: a 1 km trip. The weather is ☀️ with probability 40% or ⛈ with probability 60%; you either walk 🚶🏻‍♀️ at 5 km/h or take the bus 🚍 at 30 km/h.]
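The commute slide is an image, so the pairing below is my reading of it (an assumption): sun with probability 0.4 means walking at 5 km/h, rain with probability 0.6 means the bus at 30 km/h. The expected travel time is E[d/V], not d/E[V]:

```python
# Hypothetical reading of the slide: (probability, speed in km/h).
d = 1.0  # trip length in km
scenarios = [(0.4, 5.0), (0.6, 30.0)]

# Expected travel time E[d/V] = sum_w P(w) * d / v(w), in hours.
ET_hours = sum(p * d / v for p, v in scenarios)
print(ET_hours * 60)  # 6.0 minutes under these assumptions

# The naive d / E[V] gives a different (wrong) answer:
EV = sum(p * v for p, v in scenarios)  # E[V] = 20 km/h
print(d / EV * 60)  # 3.0 minutes -- not the expected time
```

This is the same lesson as the previous slide: in general E[f(X)] ≠ f(E[X]).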

Joint probability mass function The joint PMF of random variables X, Y is the bivariate function p(x, y) = P(X = x, Y = y) Example

There is a bag with 4 cards: 1, 2, 3, 4. You draw two without replacement. What is the joint PMF of the face values?

What is the PMF of the sum? What is the expected value?

PMF and expectation of a function Z = f(X, Y) has PMF pZ(z) = ∑x, y: f(x, y) = z pXY(x, y) and expected value E[Z] = ∑x, y f(x, y) pXY(x, y)
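For the two-card draw without replacement, both formulas above can be evaluated by brute force over the 12 equally likely ordered pairs (a sketch, not from the slides):

```python
from fractions import Fraction
from itertools import permutations
from collections import defaultdict

# Joint PMF of the face values (X, Y): each ordered pair x != y from
# the cards 1, 2, 3, 4 has probability 1/12.
p_xy = {pair: Fraction(1, 12) for pair in permutations([1, 2, 3, 4], 2)}

# PMF of Z = X + Y:  p_Z(z) = sum over {(x, y): x + y = z} of p_XY(x, y)
p_z = defaultdict(Fraction)
for (x, y), p in p_xy.items():
    p_z[x + y] += p

# E[Z] = sum over (x, y) of (x + y) * p_XY(x, y)
EZ = sum((x + y) * p for (x, y), p in p_xy.items())
print(dict(p_z), EZ)  # E[Z] = 5
```

The sum ranges over z = 3, …, 7, with z = 5 the most likely value (probability 1/3), and E[Z] = 5 = E[X] + E[Y].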

What if the cards are drawn with replacement?

Marginal probabilities
P(X = x) = ∑y P(X = x, Y = y)    P(Y = y) = ∑x P(X = x, Y = y)
For the two-card draw, every ordered pair (x, y) with x ≠ y has probability 1/12, so each marginal value is P(X = x) = P(Y = y) = 1/4.
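Marginalization is just summing the joint PMF over the other variable; for the card example (a sketch):

```python
from fractions import Fraction
from itertools import permutations

# Joint PMF of the two-card draw without replacement.
cards = [1, 2, 3, 4]
p_xy = {pair: Fraction(1, 12) for pair in permutations(cards, 2)}

# P(X = x) = sum_y P(X = x, Y = y)  and symmetrically for Y.
p_x = {x: sum(p for (a, b), p in p_xy.items() if a == x) for x in cards}
p_y = {y: sum(p for (a, b), p in p_xy.items() if b == y) for y in cards}
print(p_x, p_y)  # every marginal probability is 1/4
```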

Linearity of expectation
For every two random variables X and Y, E[X + Y] = E[X] + E[Y].

E[X + Y] = ?

The indicator (Bernoulli) random variable
Perform a trial that succeeds with probability p and fails with probability 1 − p.
p.m.f.: p(0) = 1 − p, p(1) = p.
If X is Bernoulli(p) then E[X] = p.
[PMF plots for p = 0.5 and p = 0.4.]

Mean of the Binomial Binomial(n, p): Perform n independent trials, each of which succeeds with probability p. X = number of successes

[PMF plots: Binomial(10, 0.5), Binomial(50, 0.5), Binomial(10, 0.3), Binomial(50, 0.3).]
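Writing X = X1 + … + Xn as a sum of n Bernoulli(p) indicators, linearity of expectation predicts E[X] = np; a direct check against the Binomial PMF (my own sketch):

```python
from math import comb

# Mean of Binomial(n, p) computed directly from the PMF:
# E[X] = sum_k k * C(n, k) p^k (1-p)^(n-k).  Linearity says this is n*p.
def binomial_mean(n, p):
    return sum(k * comb(n, k) * p**k * (1 - p) ** (n - k)
               for k in range(n + 1))

print(binomial_mean(10, 0.3))  # ~3.0, i.e. n * p
```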

n people throw their hats in a box and pick one out at random. How many on average get back their own?
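By indicators, each person gets their own hat back with probability 1/n, so linearity gives E[matches] = n · (1/n) = 1 regardless of n. A brute-force check over all n! equally likely assignments (small n only; a sketch):

```python
import math
from itertools import permutations

# Expected number of fixed points of a uniformly random permutation.
def expected_matches(n):
    total = sum(sum(hat == i for i, hat in enumerate(perm))
                for perm in permutations(range(n)))
    return total / math.factorial(n)

print(expected_matches(5))  # 1.0
```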

Mean of the Poisson
p(k) = e^(−λ) λ^k / k!,  k = 0, 1, 2, 3, …
Poisson(λ) approximates Binomial(n, λ/n) for large n.
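The mean of Poisson(λ) is λ itself, which can be verified numerically from the PMF (taking λ = 2.8 as an example; a sketch):

```python
from math import exp, factorial

# Poisson PMF: p(k) = e^{-lam} * lam^k / k!
def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

lam = 2.8
# E[N] = sum_k k * p(k); the tail beyond k = 100 is negligible here.
mean = sum(k * poisson_pmf(k, lam) for k in range(100))
print(mean)  # ~2.8, i.e. lam
```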

Raindrops
Rain is falling on your head at an average speed of 2.8 drops/second.

Raindrops
Number of drops N is Binomial(n, 2.8/n).

Rain falls on you at an average rate of 3 drops/sec. When 100 drops hit you, your hair gets wet. You walk for 30 sec from the MTR to the bus stop. What is the probability your hair gets wet?
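Modelling the number of drops N in 30 seconds as Poisson(3 × 30) = Poisson(90) (the Poisson limit of Binomial(n, 90/n)), the hair gets wet when N ≥ 100; a sketch:

```python
from math import exp

lam = 3 * 30          # expected number of drops in 30 s
term = exp(-lam)      # p(0) = e^{-lam}
p_at_most_99 = term
for k in range(1, 100):
    term *= lam / k   # recurrence: p(k) = p(k-1) * lam / k
    p_at_most_99 += term

p_wet = 1 - p_at_most_99  # P(N >= 100)
print(p_wet)
```

The running recurrence avoids computing huge factorials directly; the answer comes out a bit under 1/6, consistent with 100 being about one standard deviation (√90 ≈ 9.5) above the mean.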

Investments
You have three investment choices:
A: put $25 in one stock
B: put $½ in each of 50 unrelated stocks
C: keep your money in the bank
Which do you prefer?

Investments: probability model
Each stock doubles in value with probability ½ and loses all its value with probability ½. Different stocks perform independently.
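Under this model all three choices have the same expected final wealth, $25, but very different spreads; a quick check of my own (the slide doesn't show code):

```python
from math import sqrt

# Final wealth under the model:
#   A: $25 in one stock      -> 50 with prob 1/2, 0 with prob 1/2
#   B: $0.50 in 50 stocks    -> $1 per doubling stock: Binomial(50, 1/2)
#   C: money in the bank     -> 25 for sure (zero variance)
EA = 0.5 * 50 + 0.5 * 0
VarA = 0.5 * (50 - EA) ** 2 + 0.5 * (0 - EA) ** 2

n, p = 50, 0.5
EB, VarB = n * p, n * p * (1 - p)  # Binomial mean and variance

print(EA, sqrt(VarA))  # 25.0, standard deviation 25.0
print(EB, sqrt(VarB))  # 25.0, standard deviation ~3.5
```

Diversifying across 50 independent stocks keeps the mean but shrinks the standard deviation from 25 to about 3.5.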

Variance and standard deviation
Let μ = E[X] be the expected value of X.
The variance of X is the quantity Var[X] = E[(X − μ)²].
The standard deviation of X is σ = √Var[X]. It measures how far X typically is from its mean μ.

Var[Binomial(n, p)] = np(1 – p)

Another formula for variance: Var[X] = E[X²] − (E[X])².
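The identity Var[X] = E[X²] − (E[X])² and the binomial formula np(1 − p) can both be checked against the PMF (a sketch):

```python
from math import comb

# Binomial(n, p) PMF as a list indexed by k.
def binomial_pmf(n, p):
    return [comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)]

n, p = 10, 0.3
pmf = binomial_pmf(n, p)
EX = sum(k * q for k, q in enumerate(pmf))        # E[X]
EX2 = sum(k * k * q for k, q in enumerate(pmf))   # E[X^2]
var = EX2 - EX**2                                 # Var[X] = E[X^2] - (E[X])^2

print(var, n * p * (1 - p))  # both ~2.1
```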

E[X] = ? Var[X] = ?

In 2011 the average household in Hong Kong had 2.9 people. Take a random person. What is the average number of people in his/her household? A: < 2.9 B: 2.9 C: > 2.9

Average household size vs. average size of a random person's household.

What is the average household size?

household size:    1     2     3     4     5   more
% of households: 16.6  25.6  24.4  21.2   8.7   3.5

(From Hong Kong Annual Digest of Statistics, 2012.)

Probability model 1: households under equally likely outcomes.
X = number of people in the household. E[X] = 2.7

What is the average household size?

household size:    1     2     3     4     5   more
% of households: 16.6  25.6  24.4  21.2   8.7   3.5

Probability model 2: people under equally likely outcomes.
Y = number of people in the household. E[Y] = 3.521
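Sampling a person instead of a household size-biases the distribution: a household of size s is s times as likely to be picked, which gives E[Y] = E[X²]/E[X]. A sketch of both computations, treating the open-ended "more" category as exactly 6 (an assumption; the slide's own handling of that category may differ slightly):

```python
from fractions import Fraction

# Household-size table from the slide; "more" is treated as 6 people.
sizes = [1, 2, 3, 4, 5, 6]
pct = [Fraction(n, 1000) for n in (166, 256, 244, 212, 87, 35)]

# Model 1: household picked uniformly -> E[X].
EX = sum(s * p for s, p in zip(sizes, pct))

# Model 2: person picked uniformly -> size-biased mean E[Y] = E[X^2]/E[X].
EY = sum(s * s * p for s, p in zip(sizes, pct)) / EX

print(float(EX), float(EY))  # ~2.90 and ~3.52
```

As expected, the person-centred average E[Y] is larger than the household-centred average E[X].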

Summary
X = number of people in a random household
Y = number of people in the household of a random person
E[Y] = E[X²]/E[X]
Because Var[X] ≥ 0, E[X²] ≥ (E[X])², so E[Y] ≥ E[X]. The two are equal only if all households have the same size.