G89.2229 Lect 2M: Examples of Correlation


Slide 1 (G89.2229 Lect 2M): Examples of Correlation
» Random variables and manipulated variables
» Thinking about joint distributions
» Thinking about marginal distributions: expectations
» Covariance as a statistical concept and tool
G89.2229 Multiple Regression in Psychology

Slide 2: Three Examples of Correlation
All from the bar exam study discussed last week:
» Anxiety and Depression from the POMS on day 29 (two days before the bar exam)
» Anger and Vigor from the POMS on day 29 (two days before the bar exam)
» Anxiety and days to the exam during the week before the start of the exam

Slide 3: Anxious and Depressed Mood 2 Days Before Exam
What do you notice about the joint distribution? What is the correlation? r = 0.64

Slide 4: Anger and Vigor 2 Days Before Exam
What do you notice about the joint distribution? What is the correlation? r = -0.19

Slide 5: Anxious Mood in the Days Before the Exam
What do you notice about the joint distribution? What is the correlation? r = 0.25

Slide 6: Random Variables vs. Manipulated Variables
A random variable is a quantity that is not known exactly prior to data collection.
» E.g., anxiety and depression on any given day for a randomly selected subject.
A manipulated variable is a quantity that is determined by a sampling plan or an experimental design.
» E.g., days to exam, level of exposure, gender.
This distinction has implications for the statistical analysis of bivariate association.

Slide 7: Thinking About Bivariate (Joint) Distributions
Suppose we sample persons and measure two behaviors.
» Both are random.
» The variables might be related or independent.
» The joint distribution contains information about each variable and the relation between them.
When we ignore one of the two variables and study the other, we say we are studying the marginal distribution.
» This term simply reminds us that another variable is in the background.

Slide 8: Expectations and Moments for Marginal Distributions
Suppose we measure X and Y but choose to study only X (ignoring Y).
We can describe the marginal distribution of X using the mean, the variance, and other moments such as the coefficients of skewness and kurtosis.
The population moments of the variable are described with expectation operators.
Expectation operators can be used to study means and variances.

Slide 9: Expectation Operators Defined
The population mean, μ = E(X), is the average of all elements in the population. It can be derived knowing only the form of the population distribution.
» Let f(X) be the density function describing the likelihood of different values of X in the population.
» The population mean is the average of all values of X weighted by the likelihood of each value.
If X has finitely many discrete values, each with probability f(X) = P(X):
  E(X) = Σ_i P(x_i)·x_i
If X is continuous, we write:
  E(X) = ∫ x·f(x) dx
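The discrete formula above can be sketched in a few lines of Python. This is an illustration, not part of the lecture; the fair-die distribution is a hypothetical example.

```python
# Sketch: E(X) = sum_i P(x_i) * x_i for a discrete random variable,
# with the pmf given as a {value: probability} dict.
def expectation(pmf):
    """Expected value of a discrete RV."""
    return sum(p * x for x, p in pmf.items())

# Hypothetical example: a fair six-sided die.
die = {x: 1 / 6 for x in range(1, 7)}
mu = expectation(die)  # (1 + 2 + ... + 6) / 6 = 3.5
```

The same one-liner generalizes to any finite pmf; for continuous variables one would integrate x·f(x) instead.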

Slide 10: Rules for Expectation Operators
E(X) = μ_x is the first moment, the mean.
Let k represent some constant number (not random):
» E(k·X) = k·E(X) = k·μ_x
» E(X+k) = E(X)+k = μ_x+k
Let Y represent another random variable (perhaps related to X):
» E(X+Y) = E(X)+E(Y) = μ_x+μ_y
» E(X−Y) = E(X)−E(Y) = μ_x−μ_y
Putting these together:
» E[(X₁+X₂)/2] = (μ₁+μ₂)/2
The expected value of the average of two random variables is the average of their means.

Slide 11: Variance Operators
Analogous to E(Y) = μ is the variance:
  V(Y) = E[(Y−μ)²] = ∫ (y−μ)² f(y) dy
  E[(X−μ_x)²] = V(X) = σ_x²
Let k represent some constant:
» V(k·X) = k²·V(X) = k²·σ_x²
» V(X+k) = V(X) = σ_x²
Let Y represent another random variable that is independent of X:
» V(X+Y) = V(X)+V(Y) = σ_x²+σ_y²
» V(X−Y) = V(X)+V(Y) = σ_x²+σ_y²
A more general form of these formulas requires the concept of covariance.
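The constant rules, in particular that scaling by k multiplies the variance by k² while shifting leaves it unchanged, can be checked numerically. A sketch with a hypothetical pmf (not from the lecture):

```python
# Sketch: V(k*X) = k^2 * V(X) and V(X+k) = V(X) for a discrete RV.
def moments(pmf):
    """Return (mean, variance) of a discrete RV given as {value: prob}."""
    mean = sum(p * x for x, p in pmf.items())
    var = sum(p * (x - mean) ** 2 for x, p in pmf.items())
    return mean, var

X = {0: 0.25, 1: 0.5, 2: 0.25}  # hypothetical pmf: mean 1, variance 0.5
k = 3.0

_, vX = moments(X)
_, v_kX = moments({k * x: p for x, p in X.items()})   # expect k^2 * vX
_, v_Xk = moments({x + k: p for x, p in X.items()})   # expect vX unchanged
```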

Slide 12: Covariance: A Bivariate Moment
E[(X−μ_x)(Y−μ_y)] = Cov(X,Y) = σ_xy is called the population covariance.
» It is the average product of deviations from the means.
» It is zero when the variables are linearly independent.
Formally it depends on the joint bivariate density of X and Y, f(X,Y).
» f(X,Y) says how likely any pair of values of X and Y is.
» Cov(X,Y) = ∫∫ (X−μ_x)(Y−μ_y) f(X,Y) dX dY
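For a discrete joint distribution the double integral becomes a double sum. A minimal sketch (the joint pmf below is a hypothetical example with positive association, not data from the study):

```python
# Sketch: Cov(X,Y) = E[(X - mu_x)(Y - mu_y)] from a joint pmf,
# given as {(x, y): probability}.
def covariance(joint):
    mu_x = sum(p * x for (x, y), p in joint.items())
    mu_y = sum(p * y for (x, y), p in joint.items())
    return sum(p * (x - mu_x) * (y - mu_y) for (x, y), p in joint.items())

# Hypothetical joint pmf: X and Y tend to move together.
joint = {(0, 0): 0.4, (1, 1): 0.4, (0, 1): 0.1, (1, 0): 0.1}
c = covariance(joint)  # positive, since matching pairs are more likely

# Under independence (product pmf), the covariance is zero.
indep = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
c0 = covariance(indep)
```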

Slide 13: Cov(X,Y) as an Expectation Operator
For k₁ and k₂ as constants, there are facts closely parallel to the facts for variances:
» Cov(k₁+X, k₂+Y) = Cov(X,Y) = σ_xy
» Cov(k₁·X, k₂·Y) = k₁·k₂·Cov(X,Y) = k₁·k₂·σ_xy
Important special case: let Y* = (1/σ_y)·Y and X* = (1/σ_x)·X. Then:
» V(X*) = V(Y*) = 1.0
» Cov(X*,Y*) = (1/σ_x)(1/σ_y)·σ_xy = ρ_xy
Cov(X*,Y*) is the population correlation for the variables X and Y, ρ_xy.
Since ρ_xy = (1/σ_x)(1/σ_y)·σ_xy, it follows that σ_xy = σ_x·σ_y·ρ_xy.
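The special case, that standardizing both variables to unit variance turns the covariance into the correlation, can be demonstrated on the same kind of small joint pmf (again a hypothetical distribution, used only for illustration):

```python
# Sketch: Cov(X*, Y*) equals rho_xy when X* = X/sigma_x, Y* = Y/sigma_y.
import math

def stats(joint):
    """Return (V(X), V(Y), Cov(X,Y)) for a joint pmf {(x, y): prob}."""
    mu_x = sum(p * x for (x, y), p in joint.items())
    mu_y = sum(p * y for (x, y), p in joint.items())
    vx = sum(p * (x - mu_x) ** 2 for (x, y), p in joint.items())
    vy = sum(p * (y - mu_y) ** 2 for (x, y), p in joint.items())
    cov = sum(p * (x - mu_x) * (y - mu_y) for (x, y), p in joint.items())
    return vx, vy, cov

joint = {(0, 0): 0.4, (1, 1): 0.4, (0, 1): 0.1, (1, 0): 0.1}  # hypothetical
vx, vy, cov = stats(joint)
rho = cov / (math.sqrt(vx) * math.sqrt(vy))

# Standardize: divide each coordinate by its standard deviation.
std_joint = {(x / math.sqrt(vx), y / math.sqrt(vy)): p
             for (x, y), p in joint.items()}
svx, svy, scov = stats(std_joint)  # variances become 1, covariance becomes rho
```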

Slide 14: An Important Use of Correlation and Covariance
We are often interested in linear functions of two random variables: a·X+b·Y.
» a=1, b=1 gives the sum
» a=0.5, b=0.5 gives the average
» a=1, b=−1 gives the difference
What is the variance of W = a·X+b·Y in general?
» Var(W) = V(a·X+b·Y) = a²·V(X) + b²·V(Y) + 2ab·Cov(X,Y) = a²σ_x² + b²σ_y² + 2ab·σ_x·σ_y·ρ_xy
» This can be used to compute the expected standard error of contrasts of sample statistics.

Slide 15: Example
Suppose we want to average the POMS anxious and depressed moods. What is the expected variance?
In the sample on day 29:
» Var(Anx) = 1.129, Var(Dep) = 0.420, Corr(A,D) = 0.64
» Cov(A,D) = 0.64·(1.129·0.420)^(1/2) = 0.441
» Var(0.5·A + 0.5·D) = (0.25)(1.129) + (0.25)(0.420) + 2(0.25)(0.441) = 0.608
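This arithmetic is easy to reproduce. A short sketch (not part of the original slides) applying Var(aX+bY) = a²V(X) + b²V(Y) + 2ab·Cov(X,Y) to the sample moments above:

```python
# Sketch: variance of the average 0.5*A + 0.5*D from the day-29 moments.
import math

var_a, var_d, r = 1.129, 0.420, 0.64
cov_ad = r * math.sqrt(var_a * var_d)  # sigma_a * sigma_d * rho
a = b = 0.5
var_avg = a**2 * var_a + b**2 * var_d + 2 * a * b * cov_ad
```

Because the two moods are positively correlated, the variance of their average (about 0.61) is larger than it would be if they were independent (0.25·1.129 + 0.25·0.420 ≈ 0.39).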

Slide 16: Final Comment
Standard deviations and variances are particularly useful when variables are normally distributed.
Expectation operators assume that f(X), f(Y), and f(X,Y) can be known, but they do not assume that these densities are bell-shaped (normal).
Covariances and correlations can be estimated with non-normal variables, but be careful about statistical tests.