G89.2229 Multiple Regression, Lect 2w: Review of expectations, conditional distributions, the regression line, marginal and conditional distributions.


1 G89.2229 Lect 2w
Review of expectations
Conditional distributions
Regression line
Marginal and conditional distributions
G89.2229 Multiple Regression in Psychology

2 G89.2229 Lect 2w Review of Expectation Operators
Let X and Y be random variables and let k, k1, k2 be constants:
»E(k·X) = k·E(X) = k·μX
»E(X+k) = E(X)+k = μX+k
»E(X+Y) = E(X)+E(Y) = μX+μY
»E(X−Y) = E(X)−E(Y) = μX−μY
»E[(X−μX)²] = V(X) = σX²
»V(k·X) = k²·V(X) = k²·σX²
»V(X+k) = V(X) = σX²
»E[(X−μX)(Y−μY)] = Cov(X,Y) = σXY
»Cov(k1+X, k2+Y) = σXY
»Cov(k1·X, k2·Y) = k1·k2·σXY
»V(k1·X + k2·Y) = k1²σX² + k2²σY² + 2·k1·k2·σX·σY·ρXY
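These operator rules can be checked numerically on sample moments. A minimal sketch (the variables and constants below are made up for illustration, not taken from the lecture's data), verifying the last and most involved rule:

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative random variables, correlated by construction
X = rng.normal(5.0, 2.0, 100_000)
Y = 0.5 * X + rng.normal(0.0, 1.0, 100_000)
k1, k2 = 3.0, -2.0

# V(k1*X + k2*Y) = k1^2*V(X) + k2^2*V(Y) + 2*k1*k2*Cov(X,Y)
lhs = np.var(k1 * X + k2 * Y)
rhs = (k1**2 * np.var(X) + k2**2 * np.var(Y)
       + 2 * k1 * k2 * np.cov(X, Y, bias=True)[0, 1])
print(np.isclose(lhs, rhs))  # the identity holds for sample moments too
```

With `bias=True` the covariance uses the same divisor as `np.var`, so the identity holds up to floating-point error, echoing the slide's point that expectation operators apply to sample as well as population values.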

3 G89.2229 Lect 2w Example
Suppose we want to contrast POMS anxious and depressed moods using A−D. What is the expected variance? In the sample on day 29,
»Var(Anx) = 1.129, Var(Dep) = 0.420, Corr(A,D) = 0.64
»Cov(A,D) = .64·(1.129·.420)^1/2 = .441
»Var(1·A + (−1)·D) = 1·(1.129) + 1·(.420) + (2)(−1)(.441) = .667
Expectation operators are useful with both population and sample values.
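The slide's arithmetic can be reproduced in a few lines, using only the numbers given on the slide:

```python
import math

# Day-29 POMS sample moments, taken from the slide
var_a, var_d, corr_ad = 1.129, 0.420, 0.64

# Cov from Corr: r * sd_A * sd_D, rounded as on the slide
cov_ad = round(corr_ad * math.sqrt(var_a * var_d), 3)   # 0.441
# V(A - D) = V(A) + V(D) - 2*Cov(A, D)
var_diff = var_a + var_d - 2 * cov_ad
print(cov_ad, round(var_diff, 3))  # 0.441 0.667
```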

4 G89.2229 Lect 2w Second Example
Suppose that Z1 and Z2 are independent random variables
»Each with mean 0
»Each with variance 1
Suppose that someone computes W = Z1 + Z2.
What is E(W)? What is V(W)? How strongly will W be correlated with Z1?
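The operator rules give E(W) = 0, V(W) = 1 + 1 = 2 (by independence), and Corr(W, Z1) = Cov(W, Z1)/[V(W)·V(Z1)]^1/2 = 1/√2 ≈ .707. A quick simulation sketch to confirm:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
Z1 = rng.standard_normal(n)   # mean 0, variance 1
Z2 = rng.standard_normal(n)   # independent of Z1
W = Z1 + Z2

# Theory: E(W) = 0, V(W) = 2, Corr(W, Z1) = 1/sqrt(2) ≈ .707
r = np.corrcoef(W, Z1)[0, 1]
print(W.mean(), W.var(), r)
```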

5 G89.2229 Lect 2w Reminder
Standard deviations and variances are particularly useful when variables are normally distributed.
Expectation operators assume that f(X), f(Y), and f(X,Y) can be known, but they do not assume that these distributions are bell-shaped or normal.
Covariances and correlations can be estimated with non-normal variables, but be careful about statistical tests.

6 G89.2229 Lect 2w E(Y|X) for Y: Depression and X: Anxious Mood
»Note that the conditional distribution of depression shifts with increasing values of anxious mood.
»The linear model is only an approximation.

7 G89.2229 Lect 2w Linear Regression
Approximate E(Y|X) with a linear model
»E(Y|X) = b0 + b1X
»Y = b0 + b1X + e
We choose the values of b0 and b1 that minimize the variance of e in Y = b0 + b1X + e
»Ordinary Least Squares (OLS)
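A minimal OLS sketch (simulated data standing in for the anxiety/depression example, not the lecture's POMS sample), showing that the fitted line leaves a smaller residual variance than any other line:

```python
import numpy as np

rng = np.random.default_rng(2)
# Illustrative data: y depends linearly on x plus noise e
x = rng.normal(2.0, 1.0, 500)
y = 0.5 + 0.4 * x + rng.normal(0.0, 0.6, 500)

b1, b0 = np.polyfit(x, y, 1)        # OLS estimates of slope and intercept
e = y - (b0 + b1 * x)               # residuals from the fitted line

# Perturbing the slope always leaves a larger residual variance
e_other = y - (b0 + (b1 + 0.1) * x)
print(np.var(e) < np.var(e_other))  # True
```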

8 G89.2229 Lect 2w Interpretation of Linear Regression Results
The expected value of depression for a person with Anxiety of zero is the intercept, b0:
»E(Y|X) = b0 + b1X
»When X=0, E(Y|X=0) = b0 + b1·0 = b0
For each unit change in Anxiety, depression is expected to increase by .391.
»Compare X=1 and X=2
»Compare X=2 and X=3
»The increase is always .391
Sometimes we have to rescale X to make the interpretation of coefficients clearer.

9 G89.2229 Lect 2w Sample Estimates vs. Population Parameters
Assume the following:
»The sample is representative of a well-defined population
»Observations are independent
»The variance of the residuals does not depend on X
»Residuals are distributed N(0, σ²)
Then:
»Standard errors of b0 and b1 can be estimated
»The ratio of b to its standard error is distributed as a t statistic on (n−2) df
»Confidence bounds for the regression weights can be estimated
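Under these assumptions the standard error and t ratio can be computed by hand. A sketch on simulated data (the true slope of 0.4 is made up for illustration, and 1.96 is used as a large-sample stand-in for the t critical value on n−2 df):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
x = rng.normal(2.0, 1.0, n)
y = 0.5 + 0.4 * x + rng.normal(0.0, 0.6, n)

b1, b0 = np.polyfit(x, y, 1)
e = y - (b0 + b1 * x)

# Residual sd on n-2 df, then SE(b1) = s_e / sqrt(sum((x - xbar)^2))
s_e = np.sqrt(e @ e / (n - 2))
se_b1 = s_e / np.sqrt(np.sum((x - x.mean()) ** 2))
t_b1 = b1 / se_b1            # referred to a t distribution on n-2 df

# Approximate 95% confidence bounds for the slope
ci = (b1 - 1.96 * se_b1, b1 + 1.96 * se_b1)
print(b1, se_b1, t_b1, ci)
```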

10 G89.2229 Lect 2w Example Revisited (estimates from Excel)
»Data are consistent with an intercept (b0) of zero.
»Data are consistent with a population slope in the range (.28, .51).

11 G89.2229 Lect 2w Note on the Distribution of Residuals
If the distribution of Y is normal, then the residuals tend to be normal.
Even if the distribution of Y is not normal, the residuals may more closely resemble a normal distribution.
[Histograms: Distribution of Y; Distribution of e]

12 G89.2229 Lect 2w Regression Estimates from Sample Moments
Ordinary Least Squares (OLS) estimates always satisfy the following relations. Let SX, SY, and SXY be the sample standard deviations of X and Y, and their sample covariance:
»b1 = SXY / SX²
»b0 = Ȳ − b1·X̄
»The estimated standard error of the slope b1 is given by SE(b1) = Se / [SX·(n−1)^1/2], where Se is the standard deviation of the residuals computed on n−2 df.
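These moment relations can be verified against a direct least-squares fit; a sketch on simulated data (the coefficients and sample size here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 300
x = rng.normal(0.0, 1.5, n)
y = 1.0 + 0.3 * x + rng.normal(0.0, 0.5, n)

s_x = x.std(ddof=1)
s_xy = np.cov(x, y, ddof=1)[0, 1]

b1 = s_xy / s_x**2                   # slope from sample moments
b0 = y.mean() - b1 * x.mean()        # intercept from the two means

slope, intercept = np.polyfit(x, y, 1)   # direct least-squares fit
print(np.isclose(b1, slope), np.isclose(b0, intercept))

# SE(b1) = s_e / (s_x * sqrt(n-1)), with s_e computed on n-2 df
e = y - (b0 + b1 * x)
s_e = np.sqrt(e @ e / (n - 2))
se_b1 = s_e / (s_x * np.sqrt(n - 1))
print(se_b1)
```

Note that SX·(n−1)^1/2 equals [Σ(xi−x̄)²]^1/2, so this is the same standard error as the usual textbook formula.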

13 G89.2229 Lect 2w Marginal from Conditional Expectations
Suppose that 60% of NYU undergrads were female (X=1) and 40% were male (X=0). Suppose E(Y|X=1) = 5'5" and E(Y|X=0) = 5'9". What is E(Y)?
»E(Y) = E(Y|X=1)P(X=1) + E(Y|X=0)P(X=0) = 5'5"(.60) + 5'9"(.40) = 5'6.6"
»E(Y) = E[E(Y|X)], where the expectation outside the brackets is over the distribution of X and the expectation inside is over the distribution of Y (for each given X)
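The law of total expectation on the slide is a two-line computation; the inch values below (65 and 69) are just 5'5" and 5'9" converted:

```python
# P(X) and E(Y|X) from the slide, heights in inches
p_x = {1: 0.60, 0: 0.40}           # female, male
cond_mean = {1: 65.0, 0: 69.0}     # 5'5", 5'9"

# E(Y) = E[E(Y|X)] = sum over x of E(Y|X=x) * P(X=x)
e_y = sum(cond_mean[x] * p_x[x] for x in p_x)
print(round(e_y, 1))  # 66.6 inches, i.e., 5'6.6"
```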