Introduction to Probability, Stat 134, Fall 2005, Berkeley. Lectures prepared by: Elchanan Mossel, Yelena Shvets. Follows Jim Pitman's book: Probability, Section 6.5.

Bivariate Normal. Let (X, Y) be independent normal variables. [Figure: scatter plot of a sample of (X, Y) with ρ = 0.]

Bivariate Normal. Question: what is the joint distribution of (X, Y), where X = child height and Y = parent height? We expect X and Y to be Gaussian. However, X and Y are not independent: ρ(X, Y) > 0.

Bivariate Normal. [Figure: an intuitive sample of Gaussian (X, Y) with ρ = 0.7.]

Bivariate Normal. [Figure: an intuitive sample of (X, Y) with ρ(X, Y) = 1.]

Construction of Bivariate Normal. Construction of correlated Gaussians: let X, Z ~ N(0, 1) be independent, and set

Y = X cos θ + Z sin θ.

Then:
E(Y) = E(X) = E(Z) = 0;
Var(Y) = cos²θ + sin²θ = 1, so SD(Y) = SD(X) = SD(Z) = 1;
ρ(X, Y) = E(XY) = E(X²) cos θ + E(XZ) sin θ = cos θ.

[Figure: Y decomposed into components X cos θ and Z sin θ, at angle θ to X.]

ρ = −1 ↔ θ = π; ρ = −0.707 ↔ θ = 3π/4; ρ = 0 ↔ θ = π/2; ρ = 0.707 ↔ θ = π/4; ρ = 1 ↔ θ = 0.
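A minimal simulation sketch of this construction (the variable names and sample size are illustrative, not from the lecture): draw independent standard normals X and Z, set Y = X cos θ + Z sin θ, and check that SD(Y) ≈ 1 and that the sample correlation is close to cos θ.

```python
import numpy as np

# Sketch (not from the lecture): realize Y = X cos(theta) + Z sin(theta)
# and check SD(Y) ~ 1 and corr(X, Y) ~ cos(theta).
rng = np.random.default_rng(0)
n = 100_000
theta = np.pi / 4                     # so rho = cos(theta) ~ 0.707

X = rng.standard_normal(n)            # X ~ N(0, 1)
Z = rng.standard_normal(n)            # Z ~ N(0, 1), independent of X
Y = X * np.cos(theta) + Z * np.sin(theta)

print(Y.std())                        # ~1, since cos^2 + sin^2 = 1
print(np.corrcoef(X, Y)[0, 1])        # ~cos(theta) ~ 0.707
```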

Standard Bivariate Normal. Def: We say that X and Y have the standard bivariate normal distribution with correlation ρ if and only if

Y = ρX + √(1 − ρ²) Z,

where X and Z are independent N(0, 1). Claim: if (X, Y) is ρ-correlated bivariate normal, then:
Marginals: X ~ N(0, 1); Y ~ N(0, 1).
Conditionals: X | Y = y ~ N(ρy, 1 − ρ²); Y | X = x ~ N(ρx, 1 − ρ²).
Independence: X and Y are independent if and only if ρ = 0.
Symmetry: (Y, X) is ρ-correlated bivariate normal.
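A quick empirical check of the claim (a sketch; the cutoff x0 and window width are made up): among samples with X near a fixed x0, the Y-values should have mean ≈ ρ·x0 and variance ≈ 1 − ρ².

```python
import numpy as np

# Sketch: empirically check the marginal and conditional claims.
rng = np.random.default_rng(1)
n, rho, x0 = 1_000_000, 0.7, 1.0

X = rng.standard_normal(n)
Z = rng.standard_normal(n)
Y = rho * X + np.sqrt(1 - rho**2) * Z   # standard bivariate normal pair

print(Y.mean(), Y.std())                # marginal: ~0 and ~1
sel = np.abs(X - x0) < 0.05             # condition on X near x0
print(Y[sel].mean())                    # ~rho * x0 = 0.7
print(Y[sel].var())                     # ~1 - rho^2 = 0.51
```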

Bivariate Normal Distribution. Definition: We say that the random variables U and V have a bivariate normal distribution with parameters μ_U, μ_V, σ²_U, σ²_V and ρ if and only if the standardized variables

X = (U − μ_U)/σ_U, Y = (V − μ_V)/σ_V

have the standard bivariate normal distribution with correlation ρ.
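Read in reverse, the definition gives a recipe for sampling: unstandardize a standard bivariate normal pair. A sketch with made-up parameter values:

```python
import numpy as np

# Sketch with made-up parameters: unstandardize a standard bivariate
# normal pair (X, Y) to get (U, V) with the desired moments.
rng = np.random.default_rng(2)
n = 100_000
mu_u, mu_v, sd_u, sd_v, rho = 10.0, -5.0, 2.0, 3.0, 0.6

X = rng.standard_normal(n)
Z = rng.standard_normal(n)
Y = rho * X + np.sqrt(1 - rho**2) * Z

U = mu_u + sd_u * X                   # invert X = (U - mu_U)/sigma_U
V = mu_v + sd_v * Y                   # invert Y = (V - mu_V)/sigma_V

print(U.mean(), U.std(), V.mean(), V.std())
print(np.corrcoef(U, V)[0, 1])        # ~rho = 0.6
```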

Errors in Voting Machines. There are two candidates in a certain election: Mr. B and Mr. K. We use a voting machine to count the votes. Suppose that the voting machine at a particular polling station is somewhat capricious: it flips each vote with probability ε.

A Voting Problem. Consider a vote between two candidates where, on the morning of the vote, each voter tosses a coin and votes according to the outcome. Assume that the winner of the vote is the candidate with the majority of votes. In other words, let V_i ∈ {±1} be the vote of voter i; if V = Σ_i V_i > 0 then +1 wins, otherwise −1 wins.

A mathematical model of voting machines. Which voting schemes are more robust against noise? Simplest model of noise: the voting machine flips each vote independently with probability ε.

Intended vote   Registered vote   Probability
+1              −1                ε
+1              +1                1 − ε
−1              +1                ε
−1              −1                1 − ε

On Errors in Voting Machines. Question: let V_i = intended vote i and W_i = registered vote i. What is the distribution of V = Σ_{i=1}^n V_i and W = Σ_{i=1}^n W_i for large n?
Answer: V ~ N(0, n) and W ~ N(0, n), approximately, by the Central Limit Theorem.
Question: What is P[V > 0]? P[W > 0]?
Answer: approximately ½ each.
Question: What is the probability that machine errors flipped the election's outcome?
Answer: P[V > 0, W < 0] + P[V < 0, W > 0] = 1 − 2 P[V > 0, W > 0].

On Errors in Voting Machines. Answer continued: take (X, Y) = (V, W)/n^{1/2}. Then (X, Y) is approximately bivariate normal with SD(X) = SD(Y) = 1 and

ρ = ρ(X, Y) = Cov(V, W)/n = Cov(V_i, W_i) = 1 − 2ε.

We need to find 1 − 2 P[X > 0, Y > 0]. Let ρ = cos θ. Then:
P[X > 0, Y > 0] = P[X cos 0 + Z sin 0 > 0, X cos θ + Z sin θ > 0]
= P[0 < Arg(X, Z) < π, −θ < Arg(X, Z) < π − θ]
= P[0 < Arg(X, Z) < π − θ]
= ½ (π − θ)/π = ½ (1 − (arccos ρ)/π).

So 1 − 2 P[X > 0, Y > 0] = (arccos ρ)/π = (arccos(1 − 2ε))/π.
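A Monte Carlo check of this answer (a sketch; ε, the number of voters, and the trial count are illustrative): simulate coin-flip votes, flip each independently with probability ε, and compare the frequency of a flipped outcome with (arccos(1 − 2ε))/π.

```python
import numpy as np

# Sketch: simulate the noisy voting machine and compare the flip
# frequency with the closed form arccos(1 - 2*eps)/pi.
rng = np.random.default_rng(3)
n_voters, eps, trials = 1001, 0.1, 20_000   # odd n avoids ties

flipped = 0
for _ in range(trials):
    V = rng.choice([-1, 1], size=n_voters)   # intended votes
    errors = rng.random(n_voters) < eps      # machine errors
    W = np.where(errors, -V, V)              # registered votes
    flipped += np.sign(V.sum()) != np.sign(W.sum())

print(flipped / trials)                      # Monte Carlo estimate
print(np.arccos(1 - 2 * eps) / np.pi)        # ~0.2048 for eps = 0.1
```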

Conditional Expectation given an interval. Suppose that (X, Y) has the standard bivariate normal distribution with correlation ρ. Question: for a < b, what is E(Y | a < X < b)? [Figure: scatter plot with the strip a < x < b marked.]

Conditional Expectation given an interval. Solution:

E(Y | a < X < b) = ∫_a^b E(Y | X = x) f_X(x) dx / ∫_a^b f_X(x) dx.

We know that E(Y | X = x) = ρx, and ∫_a^b f_X(x) dx = Φ(b) − Φ(a). Since Y = ρX + √(1 − ρ²) Z, where X and Z are independent, we have

E(Y | a < X < b) = ρ ∫_a^b x φ(x) dx / (Φ(b) − Φ(a)) = ρ (φ(a) − φ(b)) / (Φ(b) − Φ(a)),

using φ′(x) = −x φ(x) to evaluate the integral.
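A numerical check of the resulting formula (a sketch with illustrative values of a, b, ρ; it assumes SciPy for the normal pdf and cdf):

```python
import numpy as np
from scipy.stats import norm

# Sketch: compare a Monte Carlo estimate of E(Y | a < X < b) with
# rho * (phi(a) - phi(b)) / (Phi(b) - Phi(a)).
rng = np.random.default_rng(4)
n, rho, a, b = 2_000_000, 0.7, 0.5, 1.5

X = rng.standard_normal(n)
Z = rng.standard_normal(n)
Y = rho * X + np.sqrt(1 - rho**2) * Z

sel = (a < X) & (X < b)
print(Y[sel].mean())                         # simulated value

exact = rho * (norm.pdf(a) - norm.pdf(b)) / (norm.cdf(b) - norm.cdf(a))
print(exact)                                 # ~0.644 for these values
```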