G89.2247 Lecture 2: Regression as Paths and Covariance Structure


Slide 1: G89.2247 Lecture 2
- Regression as paths and covariance structure
- Alternative "saturated" path models
- Using matrix notation to write linear models
- Multivariate expectations
- Mediation

Slide 2: Question: Does exposure to childhood foster care (X) lead to adverse outcomes (Y)?
Example of a purported "causal model": Y = B0 + B1*X + e
[Path diagram: X → Y with path coefficient B1 and residual e]
Regression approach:
- B0 and B1 can be estimated using OLS.
- The estimates depend on the sample standard deviations of Y and X, the sample means, and the covariance between Y and X:
  B1 = S_XY / S_X²
  B0 = M_Y − B1*M_X
- The correlation, r_XY = S_XY / (S_X * S_Y), can be used to estimate the variance of the residual e, V(e):
  S_e² = S_Y²(1 − r_XY²) = S_Y² − S_XY² / S_X²
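The moment formulas above can be checked numerically. Here is a minimal numpy sketch using simulated (hypothetical) data for X and Y; the coefficient values 2.0 and 0.5 are illustrative, not from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical data: predictor X and outcome Y
x = rng.normal(size=500)
y = 2.0 + 0.5 * x + rng.normal(size=500)

# moment-based estimates, using sample moments in the slide's formulas
s_xy = np.cov(x, y, ddof=1)[0, 1]
b1 = s_xy / np.var(x, ddof=1)          # B1 = S_XY / S_X^2
b0 = y.mean() - b1 * x.mean()          # B0 = M_Y - B1 * M_X

# residual variance via the correlation: S_e^2 = S_Y^2 (1 - r_XY^2)
r_xy = np.corrcoef(x, y)[0, 1]
s2_e = np.var(y, ddof=1) * (1 - r_xy**2)

# cross-check against a direct least-squares fit
b1_ls, b0_ls = np.polyfit(x, y, 1)
assert np.isclose(b1, b1_ls) and np.isclose(b0, b0_ls)
```

The moment-based estimates agree with the direct OLS fit because the normalization constants in the sample covariance and variance cancel in the ratio.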

Slide 3: A Covariance Structure Approach
If we have data on Y and X we can compute a covariance matrix. This estimates the population covariance structure, Σ.
- σ_Y² can itself be expressed as B1²σ_X² + σ_e².
- Three statistics in the sample covariance matrix (S_X², S_Y², S_XY) are available to estimate three population parameters (B1, σ_X², σ_e²).

Slide 4: Covariance Structure Approach, Continued
A structural model that has the same number of parameters as unique elements in the covariance matrix is "saturated". Saturated models always fit the sample covariance matrix.

Slide 5: Another Saturated Model: Two Explanatory Variables
The first model is likely not to yield an unbiased estimate of the foster-care effect because of selection factors (isolation failure). Suppose we have a measure of family disorganization (Z) that is known to have an independent effect on Y and also to be related to who is assigned to foster care (X).
[Path diagram: X → Y (β1), Z → Y (β2), X and Z correlated (r_XZ), residual e on Y]

Slide 6: Covariance Structure Expression
The model: Y = b0 + b1*X + b2*Z + e
- If we assume E(X) = E(Z) = E(Y) = 0,
- and V(X) = V(Z) = V(Y) = 1,
- then b0 = 0 and the b's are standardized (betas).
The parameters can be expressed in terms of the correlations:
  β_X = (r_XY − r_XZ*r_ZY) / (1 − r_XZ²)
  β_Z = (r_ZY − r_XZ*r_XY) / (1 − r_XZ²)
When sample correlations are substituted, these expressions give the OLS estimates of the regression coefficients.
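The correlation-based expressions for the standardized coefficients can be verified against an ordinary least-squares fit on standardized variables. A small numpy sketch (the data-generating values 0.6, 0.4, 0.3 are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
z = rng.normal(size=n)                       # e.g., family disorganization (Z)
x = 0.6 * z + rng.normal(size=n)             # X related to Z
y = 0.4 * x + 0.3 * z + rng.normal(size=n)   # Y depends on both

# standardize so b0 = 0 and the slopes are betas
xs = (x - x.mean()) / x.std(ddof=1)
zs = (z - z.mean()) / z.std(ddof=1)
ys = (y - y.mean()) / y.std(ddof=1)

r_xy = np.corrcoef(x, y)[0, 1]
r_zy = np.corrcoef(z, y)[0, 1]
r_xz = np.corrcoef(x, z)[0, 1]

# betas expressed purely in terms of the three correlations
beta_x = (r_xy - r_xz * r_zy) / (1 - r_xz**2)
beta_z = (r_zy - r_xz * r_xy) / (1 - r_xz**2)

# agree with OLS on the standardized variables (no intercept needed)
coef, *_ = np.linalg.lstsq(np.column_stack([xs, zs]), ys, rcond=None)
assert np.allclose([beta_x, beta_z], coef)
```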

Slide 7: Covariance Structure: Two Explanatory Variables
In the standardized case the covariance (correlation) structure is:
  r_XY = β_X + β_Z*r_XZ
  r_ZY = β_Z + β_X*r_XZ
Each correlation is accounted for by two components, one direct and one indirect. There are three regression parameters and three covariances.

Slide 8: The More General Covariance Matrix for Two-IV Multiple Regression
If we do not assume variances of unity, the regression model Y = b0 + b1*X + b2*Z + e implies:
  V(Y) = b1²V(X) + b2²V(Z) + 2*b1*b2*Cov(X,Z) + V(e)
  Cov(X,Y) = b1*V(X) + b2*Cov(X,Z)
  Cov(Z,Y) = b2*V(Z) + b1*Cov(X,Z)
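The implied covariance structure can be built from assumed parameter values and compared with the covariance matrix of a large simulated sample. All numeric values below are illustrative assumptions, not values from the lecture:

```python
import numpy as np

# assumed parameter values for illustration
b1, b2 = 0.4, 0.3                 # slopes for X and Z
v_x, v_z, c_xz = 2.0, 1.5, 0.5    # Var(X), Var(Z), Cov(X,Z)
v_e = 1.0                         # residual variance

# covariance structure implied by Y = b0 + b1*X + b2*Z + e
c_xy = b1 * v_x + b2 * c_xz
c_zy = b2 * v_z + b1 * c_xz
v_y = b1**2 * v_x + b2**2 * v_z + 2 * b1 * b2 * c_xz + v_e

sigma = np.array([[v_y,  c_xy, c_zy],
                  [c_xy, v_x,  c_xz],
                  [c_zy, c_xz, v_z]])

# check against the covariance of a large simulated sample
rng = np.random.default_rng(2)
n = 200_000
xz = rng.multivariate_normal([0, 0], [[v_x, c_xz], [c_xz, v_z]], size=n)
y = b1 * xz[:, 0] + b2 * xz[:, 1] + rng.normal(scale=np.sqrt(v_e), size=n)
emp = np.cov(np.column_stack([y, xz]), rowvar=False)
assert np.allclose(emp, sigma, atol=0.05)
```

With six free parameters (b1, b2, V(X), V(Z), Cov(X,Z), V(e)) and six unique covariance elements, this model is saturated, consistent with Slide 4.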

Slide 9: More Math Review for SEM
Matrix notation is useful for writing linear models compactly.

Slide 10: A Matrix Derivation of OLS Regression
OLS regression estimates make the sum of squared residuals as small as possible.
- If the model is Y = XB + e, then we choose B so that e'e is minimized.
- The minimum occurs when the residual vector is orthogonal to the regression plane; in that case, X'e = 0.

Slide 11: When Will X'e = 0?
When e is the residual from an OLS fit. Setting X'e = X'(Y − XB) = 0 gives the normal equations X'XB = X'Y, so B = (X'X)⁻¹X'Y.
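The normal equations and the orthogonality condition X'e = 0 can be verified in a few lines of numpy. The design and coefficient values here are simulated and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300
# design matrix with an intercept column and two predictors
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, 0.5, -0.3]) + rng.normal(size=n)

# solve the normal equations: B = (X'X)^{-1} X'y
B = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ B

# the OLS residual vector is orthogonal to every column of X
assert np.allclose(X.T @ e, 0, atol=1e-8)
```

Using `np.linalg.solve` on the normal equations rather than an explicit matrix inverse is the standard numerically safer choice.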

Slide 12: Multivariate Expectations
There are simple multivariate generalizations of the expectation facts:
- E(X + k) = E(X) + k = μ_X + k
- E(k*X) = k*E(X) = k*μ_X
- V(X + k) = V(X) = σ_X²
- V(k*X) = k²*V(X) = k²*σ_X²
Let X' = [X1 X2 X3 X4], μ' = [μ1 μ2 μ3 μ4], and let k be a scalar value. Then:
- E(k*X) = k*E(X) = k*μ
- E(X + k*1) = E(X) + k*1 = μ + k*1

Slide 13: Multivariate Expectations, Continued
In the multivariate case, Var(X) is a matrix:
  V(X) = E[(X − μ)(X − μ)'] = Σ

Slide 14: Multivariate Expectations, Continued
The multivariate generalizations of V(X + k) = σ_X² and V(k*X) = k²σ_X² are:
- Var(X + k*1) = Σ
- Var(k*X) = k²Σ
Let c' = [c1 c2 c3 c4]; c'X is a linear combination of the X's.
- Var(c'X) = c'Σc. This is a scalar value.
- If this is positive for all nonzero values of c, then Σ is positive definite.
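A quick numeric check of Var(c'X) = c'Σc and of positive definiteness. The data and the weight vector c are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
# correlated columns: independent normals passed through a random mixing matrix
X = rng.normal(size=(5000, 4)) @ rng.normal(size=(4, 4))
sigma = np.cov(X, rowvar=False)            # sample covariance matrix

c = np.array([1.0, -0.5, 2.0, 0.25])
# Var(c'X) = c' Sigma c, a scalar; identical to the sample variance of X @ c
var_combo = c @ sigma @ c
assert np.isclose(var_combo, np.var(X @ c, ddof=1))

# positive definiteness: c'Sigma c > 0 for all nonzero c,
# equivalently all eigenvalues of Sigma are positive
assert np.all(np.linalg.eigvalsh(sigma) > 0)
```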

Slide 15: Semi-Partial Regression Adjustment
The multiple regression coefficients are estimated taking all variables into account.
- The model assumes that for fixed X, Z has an effect of magnitude β_Z.
- Sometimes people say "controlling for X".
The model explicitly notes that Z has two kinds of association with Y:
- A direct association through β_Z (X fixed)
- An indirect association through X (of magnitude β_X * r_XZ)

Slide 16: Pondering Model 1: Simple Multiple Regression
The semi-partial regression coefficients are often different from the bivariate correlations:
- Adjustment effects
- Suppression effects
Randomization makes r_XZ = 0 in probability.
[Path diagram: X → Y (β1), Z → Y (β2), X and Z correlated (r_XZ), residual e on Y]

Slide 17: Mathematically Equivalent Saturated Models
Two variations of the first model suggest that the correlation between X and Z can itself be represented structurally.
[Path diagrams: Model 2: X → Z (γ), X → Y (β1), Z → Y (β2), residuals e_Z and e_Y. Model 3: Z → X (γ), X → Y (β1), Z → Y (β2), residuals e_X and e_Y.]

Slide 18: Representation of the Covariance Matrix
Both models imply the same correlation structure. The interpretation, however, is very different.

Slide 19: Model 2: X Leads to Z and Y
X is assumed to be causally prior to Z.
- The association between X and Z is due to X's effects.
Z partially mediates the overall effect of X on Y:
- X has a direct effect β1 on Y.
- X has an indirect effect γ*β2 on Y through Z.
- Part of the bivariate association between Z and Y is spurious (due to the common cause X).
[Path diagram: X → Z (γ), X → Y (β1), Z → Y (β2), residuals e_Z and e_Y]
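The decomposition of the total effect into direct and indirect parts can be verified by simulation. The path values (γ, β1, β2) below are assumptions for illustration; in Model 2 the slope of Y regressed on X alone should equal β1 + γ*β2:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
gamma, b1, b2 = 0.5, 0.3, 0.4    # assumed paths: X->Z, X->Y direct, Z->Y

x = rng.normal(size=n)
# keep Var(Z) = 1 so the paths stay on a standardized scale
z = gamma * x + rng.normal(size=n) * np.sqrt(1 - gamma**2)
y = b1 * x + b2 * z + rng.normal(size=n)

# total effect of X on Y = direct (b1) + indirect (gamma * b2)
total = np.polyfit(x, y, 1)[0]   # slope of Y on X alone
assert np.isclose(total, b1 + gamma * b2, atol=0.02)
```

The same code with the roles of X and Z swapped illustrates Model 3; the fitted covariances are identical, which is exactly why data alone cannot choose between the two models.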

Slide 20: Model 3: Z Leads to X and Y
Z is assumed to be causally prior to X.
- The association between X and Z is due to Z's effects.
X partially mediates the overall effect of Z on Y:
- Z has a direct effect β2 on Y.
- Z has an indirect effect γ*β1 on Y through X.
- Part of the bivariate association between X and Y is spurious (due to the common cause Z).
[Path diagram: Z → X (γ), X → Y (β1), Z → Y (β2), residuals e_X and e_Y]

Slide 21: Choosing Between Models
Often authors claim a model is good because it fits the data (sample covariance matrix).
- All of these models fit the same (perfectly!).
Logic and theory must establish causal order. There are other possibilities besides Models 2 and 3:
- In some instances, X and Z are dynamic variables that are simultaneously affecting each other.
- In other instances, both X and Z are outcomes of an additional variable, not shown.

Slide 22: Mediation: A Theory Approach
Sometimes it is possible to argue on theoretical grounds that:
- Z is prior to X and Y,
- X is prior to Y, and
- the effect of Z on Y is completely accounted for by the indirect path through X.
This is an example of total mediation. If the direct path from Z to Y (β2) is fixed to zero, then Model 3 is no longer saturated:
- The question of fit becomes informative.
- Total mediation requires strong theory.

Slide 23: A Flawed Example
Someone might try to argue for total mediation of family disorganization (Z) on low self-esteem (Y) through placement in foster care (X). The Baron and Kenny (1986) criteria might be met:
- Z is significantly related to Y.
- Z is significantly related to X.
- When Y is regressed on Z and X, the coefficient for X is significant but the coefficient for Z is not.
But statistical significance is a function of sample size, and logic suggests that children not assigned to foster care who live in a disorganized family may suffer directly.
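The Baron and Kenny steps can be sketched as three regressions. In this hypothetical simulation (all coefficient values are assumptions), Z has a small but nonzero direct effect on Y, yet at this sample size the third criterion can appear to be met, illustrating the slide's warning about significance and sample size:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 500
z = rng.normal(size=n)                        # family disorganization
x = 0.6 * z + rng.normal(size=n)              # foster-care placement (proxy)
y = 0.5 * x + 0.05 * z + rng.normal(size=n)   # small true direct Z effect

def ols(design, resp):
    """OLS coefficients and standard errors for an intercept-plus-design fit."""
    X = np.column_stack([np.ones(len(resp))] + list(design))
    b = np.linalg.solve(X.T @ X, X.T @ resp)
    resid = resp - X @ b
    s2 = resid @ resid / (len(resp) - X.shape[1])
    se = np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ X)))
    return b, se

# Baron & Kenny steps: regress Y on Z, X on Z, then Y on both X and Z
bzy, se_zy = ols([z], y)
bzx, se_zx = ols([z], x)
bfull, se_full = ols([x, z], y)

t_z_alone = bzy[1] / se_zy[1]          # Z predicts Y marginally
t_z_given_x = bfull[2] / se_full[2]    # Z's "direct" effect after adjusting for X

# the marginal Z effect looks far stronger than the adjusted one,
# even though the true direct effect is not exactly zero
assert abs(t_z_alone) > abs(t_z_given_x)
```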

Slide 24: A More Compelling Example of Complete Mediation
If:
- Z is an experimentally manipulated variable such as a prime,
- X is a measured process variable, and
- Y is an outcome logically subsequent to X,
then it should make sense that X affects Y for all levels of Z.
E.g., Chen and Bargh (1997): Are participants who have been subliminally primed with negative-stereotype words more likely to have partners who interact with them in a hostile manner?