Lectures prepared by: Elchanan Mossel and Yelena Shvets. Introduction to Probability, Stat 134, Fall 2005, Berkeley. Follows Jim Pitman's book: Probability, Section 6.5.

Bivariate Normal
Let (X,Y) be independent normal variables.
[Figure: scatter plot of samples (x, y); ρ = 0.]

Bivariate Normal
Question: What is the joint distribution of (X,Y), where X = child height and Y = parent height? We expect X and Y to be Gaussian. However, X and Y are not independent: ρ(X,Y) > 0.

Bivariate Normal
A sample of Gaussian (X,Y) with ρ = 0.707.
[Figure: scatter plot of samples (x, y); ρ = 0.707.]

Bivariate Normal
A sample of (X,Y) with ρ(X,Y) = 1.
[Figure: scatter plot of samples (x, y); ρ = 1.]

Construction of Bivariate Normal
Construction of correlated Gaussians: let X, Z ~ N(0,1) be independent, and let Y = X cos θ + Z sin θ. Then:
E(Y) = E(X) = E(Z) = 0;
Var(Y) = cos²θ + sin²θ = 1, so SD(Y) = SD(X) = SD(Z) = 1 and Y ~ N(0,1);
ρ(X,Y) = E(XY) = E(X²) cos θ + E(XZ) sin θ = cos θ.
[Figure: Y = X cos θ + Z sin θ drawn in the (X,Z) plane, with sample scatter plots for ρ = -1, ρ = 0, ρ = 0.707, and ρ = 1.]
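A quick way to see this construction in action is to simulate it. The following is a minimal sketch (not from the original slides): draw independent standard normals X and Z, form Y = X cos θ + Z sin θ, and check that the empirical correlation is close to cos θ.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
theta = np.pi / 4  # so rho = cos(theta) ≈ 0.707

# Independent standard normals.
x = rng.standard_normal(n)
z = rng.standard_normal(n)

# Correlated Gaussian: Y = X cos(theta) + Z sin(theta).
y = x * np.cos(theta) + z * np.sin(theta)

print("empirical SD(Y):    ", y.std())                   # ≈ 1
print("empirical corr(X,Y):", np.corrcoef(x, y)[0, 1])   # ≈ cos(theta) ≈ 0.707
```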

Standard Bivariate Normal
Def: We say that X and Y have the standard bivariate normal distribution with correlation ρ if and only if Y = ρX + √(1-ρ²) Z, where X and Z are independent N(0,1).
Claim: If (X,Y) is ρ-correlated bivariate normal, then:
Marginals: X ~ N(0,1); Y ~ N(0,1).
Conditionals: X|Y=y ~ N(ρy, 1-ρ²); Y|X=x ~ N(ρx, 1-ρ²).
Independence: X and Y are independent if and only if ρ = 0.
Symmetry: (Y,X) is ρ-correlated bivariate normal.
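The conditional claim is easy to check by simulation: among samples with X near a fixed value, Y should look like N(ρx, 1-ρ²). A minimal sketch (the values of rho and x0 are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
rho, x0 = 0.8, 1.0

x = rng.standard_normal(2_000_000)
z = rng.standard_normal(2_000_000)
y = rho * x + np.sqrt(1 - rho**2) * z  # standard bivariate normal pair

near = np.abs(x - x0) < 0.01  # condition on X ≈ x0
print("mean of Y | X ≈ x0:", y[near].mean(), "vs rho*x0 =", rho * x0)
print("var  of Y | X ≈ x0:", y[near].var(), "vs 1-rho^2 =", 1 - rho**2)
```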

Bivariate Normal Distribution
Definition: We say that the random variables U and V have bivariate normal distribution with parameters μ_U, μ_V, σ²_U, σ²_V and ρ if and only if the standardized variables
X = (U - μ_U)/σ_U,  Y = (V - μ_V)/σ_V
have standard bivariate normal distribution with correlation ρ.
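Read in reverse, the definition gives a recipe for sampling a general bivariate normal: generate a standard ρ-correlated pair (X,Y) and un-standardize. A minimal sketch (the height parameters are illustrative, not from the slides):

```python
import numpy as np

def bivariate_normal_sample(n, mu_u, mu_v, sd_u, sd_v, rho, seed=0):
    """Sample n points (U, V) with the given means, SDs, and correlation."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)
    z = rng.standard_normal(n)
    y = rho * x + np.sqrt(1 - rho**2) * z    # standard bivariate normal pair
    return mu_u + sd_u * x, mu_v + sd_v * y  # un-standardize

# Illustrative parameters: child and parent heights in inches.
u, v = bivariate_normal_sample(50_000, mu_u=68, mu_v=67, sd_u=3, sd_v=3, rho=0.5)
print(u.mean(), v.mean(), np.corrcoef(u, v)[0, 1])  # ≈ 68, 67, 0.5
```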

Errors in Voting Machines
There are two candidates in a certain election: Mr. B and Mr. K. We use a voting machine to count the votes. Suppose that the voting machine at a particular polling station is somewhat capricious: it flips each vote with probability ε.

A Voting Problem
Consider a vote between two candidates where, on the morning of the vote, each voter tosses a fair coin and votes according to the outcome. Assume that the winner of the vote is the candidate with the majority of votes. In other words, let V_i ∈ {±1} be the vote of voter i; if V = Σ_i V_i > 0 then +1 wins, otherwise -1 wins.

A Mathematical Model of Voting Machines
Which voting schemes are more robust against noise? Simplest model of noise: the voting machine flips each vote independently with probability ε.
Intended vote +1: registered as +1 with probability 1-ε, as -1 with probability ε.
Intended vote -1: registered as -1 with probability 1-ε, as +1 with probability ε.

On Errors in Voting Machines
Question: Let V_i = intended vote i and W_i = registered vote i. What is the distribution of V = Σ_{i=1}^n V_i and W = Σ_{i=1}^n W_i for large n?
Answer: V ~ N(0,n) and W ~ N(0,n), approximately, by the central limit theorem.
Question: What is P[V > 0]? P[W > 0]?
Answer: ~½ each.
Question: What is the probability that machine errors flipped the election's outcome?
Answer: P[V > 0, W < 0] + P[V < 0, W > 0] = 1 - 2 P[V > 0, W > 0], by symmetry.

On Errors in Voting Machines
Answer continued: Take (X,Y) = (V,W)/n^(1/2). Then (X,Y) is bivariate normal with SD(X) = SD(Y) = 1 and
ρ(X,Y) = Cov(V,W)/n = Cov(V_i,W_i) = 1 - 2ε.
We need to find 1 - 2 P[X > 0, Y > 0].
Let ρ = cos θ, so that X = X cos 0 + Z sin 0 and Y = X cos θ + Z sin θ. Then:
P[X > 0, Y > 0] = P[X cos 0 + Z sin 0 > 0, X cos θ + Z sin θ > 0] = P[X > 0, Z > -(cot θ) X] = (π - θ)/(2π) = ½ (1 - (arccos ρ)/π).
So 1 - 2 P[X > 0, Y > 0] = (arccos ρ)/π = (arccos(1-2ε))/π.
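The closed form can be checked by direct simulation; a minimal sketch (the values of n, ε, and the number of trials are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n, eps, trials = 10_001, 0.05, 20_000  # odd n avoids ties

v = rng.choice([-1, 1], size=(trials, n))  # intended votes
flips = rng.random((trials, n)) < eps      # machine errors
w = np.where(flips, -v, v)                 # registered votes

flipped = np.sign(v.sum(axis=1)) != np.sign(w.sum(axis=1))
print("simulated:", flipped.mean())
print("theory   :", np.arccos(1 - 2 * eps) / np.pi)
```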

Majority and Electoral College
For simple majority, the probability of error is (arccos(1-2ε))/π ~ ε^(1/2) for small ε. The result is essentially due to Sheppard (1899): "On the application of the theory of error to cases of normal distribution and normal correlation".
For an n^(1/2) × n^(1/2) electoral college (n^(1/2) blocks of n^(1/2) voters each, taking the majority of the block majorities), the probability of error scales as ε^(1/4).
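The ε^(1/4) scaling has a quick heuristic: each block's majority is flipped with probability on the order of ε^(1/2), and the top-level majority over the blocks then errs with probability on the order of (ε^(1/2))^(1/2) = ε^(1/4). A simulation sketch comparing the two schemes (sizes and ε are illustrative, and the comparison is only up to constants):

```python
import numpy as np

def error_rate(eps, blocks, per_block, trials=5_000, seed=3):
    """Fraction of trials where vote-flipping noise changes the outcome."""
    rng = np.random.default_rng(seed)
    v = rng.choice([-1, 1], size=(trials, blocks, per_block))
    w = np.where(rng.random(v.shape) < eps, -v, v)
    # Majority within each block, then majority of the block winners.
    out_v = np.sign(np.sign(v.sum(axis=2)).sum(axis=1))
    out_w = np.sign(np.sign(w.sum(axis=2)).sum(axis=1))
    return (out_v != out_w).mean()

eps = 0.01
print("simple majority  :", error_rate(eps, blocks=1, per_block=101**2))
print("electoral college:", error_rate(eps, blocks=101, per_block=101))
print("eps**0.5 =", eps**0.5, "  eps**0.25 =", eps**0.25)  # same orders
```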

Conditional Expectation Given an Interval
Suppose that (X,Y) has the standard bivariate normal distribution with correlation ρ.
Question: For a < b, what is E(Y | a < X < b)?
[Figure: scatter plot of (x, y) with the vertical strip a < x < b marked.]

Conditional Expectation Given an Interval
Solution:
E(Y | a < X < b) = ∫_a^b E(Y | X = x) f_X(x) dx / ∫_a^b f_X(x) dx.
Since Y = ρX + √(1-ρ²) Z, where X and Z are independent, we know that E(Y | X = x) = ρx, and ∫_a^b f_X(x) dx = Φ(b) - Φ(a). We have:
E(Y | a < X < b) = ρ ∫_a^b x φ(x) dx / (Φ(b) - Φ(a)) = ρ (φ(a) - φ(b)) / (Φ(b) - Φ(a)),
since φ'(x) = -x φ(x).
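A numeric sanity check of the closed form against simulation; a minimal sketch with illustrative values of rho, a, b:

```python
import numpy as np
from scipy.stats import norm

rho, a, b = 0.6, -0.5, 1.0

# Closed form: rho * (phi(a) - phi(b)) / (Phi(b) - Phi(a)).
exact = rho * (norm.pdf(a) - norm.pdf(b)) / (norm.cdf(b) - norm.cdf(a))

# Monte Carlo: average Y over samples with a < X < b.
rng = np.random.default_rng(4)
x = rng.standard_normal(1_000_000)
z = rng.standard_normal(1_000_000)
y = rho * x + np.sqrt(1 - rho**2) * z
print("exact:", exact, "  simulated:", y[(a < x) & (x < b)].mean())
```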

Linear Combinations of Independent Normals
Claim: Let V = Σ_{i=1}^n a_i Z_i and W = Σ_{i=1}^n b_i Z_i, where the Z_i are independent normal variables N(μ_i, σ_i²). Then (V,W) is bivariate normal.
Problem: calculate μ_V, μ_W, σ_V, σ_W and ρ.
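For reference, the standard answers (a worked solution, not on the original slide) are μ_V = Σ a_i μ_i, μ_W = Σ b_i μ_i, σ_V² = Σ a_i² σ_i², σ_W² = Σ b_i² σ_i², Cov(V,W) = Σ a_i b_i σ_i², and ρ = Cov(V,W)/(σ_V σ_W). A quick numeric check with illustrative coefficients:

```python
import numpy as np

rng = np.random.default_rng(5)
a = np.array([1.0, -2.0, 0.5])
b = np.array([0.5, 1.0, 1.0])
mu = np.array([0.0, 1.0, -1.0])
sd = np.array([1.0, 0.5, 2.0])

z = mu + sd * rng.standard_normal((1_000_000, 3))  # independent N(mu_i, sd_i^2)
v, w = z @ a, z @ b

print("mu_V:", v.mean(), "vs", a @ mu)
print("sd_V:", v.std(), "vs", np.sqrt(a**2 @ sd**2))
print("rho :", np.corrcoef(v, w)[0, 1], "vs",
      (a * b) @ sd**2 / np.sqrt((a**2 @ sd**2) * (b**2 @ sd**2)))
```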