LECTURE 13 PATH MODELING EPSY 640 Texas A&M University

Path Modeling
Effects:
– Direct effect (a path running straight from one variable to another)
– Indirect effect (an effect transmitted through one or more mediating variables)
– Spurious effect (association produced by a common cause)
– Unanalyzed effect (association due to correlated exogenous variables, left unexplained by the model)

REVISING MODELS
Classical regression approach
– Forward: add variables according to improvement in R² for sample or population
– Backward: start with all variables, remove those not contributing significantly to R²
– Stepwise: use forward and backward together
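
A minimal forward-selection sketch (not the course's own code), assuming a pandas DataFrame `df` with an outcome column named "y" and every other column as a candidate predictor (both names are placeholders); backward and stepwise selection differ only in whether variables are removed as well as added.

```python
import statsmodels.api as sm

def forward_select(df, outcome="y", min_improvement=0.01):
    """Greedy forward selection on R^2 (illustrative only)."""
    remaining = [c for c in df.columns if c != outcome]
    selected, best_r2 = [], 0.0
    while remaining:
        # Fit a model with each remaining candidate added and keep the best one.
        scores = []
        for var in remaining:
            X = sm.add_constant(df[selected + [var]])
            scores.append((sm.OLS(df[outcome], X).fit().rsquared, var))
        r2_new, best_var = max(scores)
        if r2_new - best_r2 < min_improvement:  # stop when R^2 stops improving
            break
        selected.append(best_var)
        remaining.remove(best_var)
        best_r2 = r2_new
    return selected, best_r2
```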

REVISING MODELS
SEM approach: model testing – improvement in fitting the data
– Chi-square test for model improvement (reduce model chi-square significantly)
– Goodness-of-fit indices (based on chi-square):
  – GFI, AGFI (proportional reduction in chi-square)
  – NFI, CFI (model improvement, adjusted for df)
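
A short sketch of the nested-model chi-square difference test described above; the chi-square and df values are assumed to come from an SEM program's output, and the numbers in the example are hypothetical.

```python
from scipy import stats

def chi_square_difference(chisq_restricted, df_restricted, chisq_full, df_full):
    """Test whether freeing paths (the less restricted model) improves fit significantly."""
    delta_chisq = chisq_restricted - chisq_full
    delta_df = df_restricted - df_full
    p_value = stats.chi2.sf(delta_chisq, delta_df)
    return delta_chisq, delta_df, p_value

# Hypothetical example: restricted model chi2 = 85.4 (df = 12) vs. freer model chi2 = 61.2 (df = 9)
print(chi_square_difference(85.4, 12, 61.2, 9))
```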

REVISING MODELS
Changing paths in classical or SEM regression and path analysis
– t-test for significance of a regression coefficient (path coefficient) – test in unstandardized form
– Lagrange Multiplier chi-square test that a restricted path should not be zero (i.e., should be freed)
– Wald chi-square test that a free path should be zero (i.e., can be dropped)
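
For a single path, these tests are closely related; a standard identity (not taken from the slides) is that the Wald statistic is approximately the square of the t/z statistic for the unstandardized coefficient:

```latex
t = \frac{\hat b}{SE(\hat b)}, \qquad
W = \left(\frac{\hat b}{SE(\hat b)}\right)^{2} \;\approx\; \chi^{2}_{1}
```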

REVISING MODELS
t-test for significance of a regression coefficient (path coefficient) – test in unstandardized form
– coefficients are notoriously unstable in sample estimation
– worse under forward or backward selection
– sample-to-sample variation is a different issue from sample-to-population variation

REVISING MODELS
Purposes for regression determine interpretation of coefficients:
– Prediction: sets of coefficients are more stable than individual coefficients from one sample to the next. Given .5X + .3Y, it may be better to assume the next sample will have a sum of coefficients near .8, while either individual coefficient may not be close to .5 or .3 (see the simulation sketch below).
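
A quick simulation sketch (not from the lecture) of that claim: with two correlated predictors whose true weights are .5 and .3, the sampling spread of b1 + b2 is smaller than the spread of either coefficient alone.

```python
import numpy as np

rng = np.random.default_rng(0)
b1s, b2s = [], []
for _ in range(1000):
    x1 = rng.normal(size=100)
    x2 = 0.8 * x1 + 0.6 * rng.normal(size=100)   # strongly correlated predictors
    y = 0.5 * x1 + 0.3 * x2 + rng.normal(size=100)
    X = np.column_stack([np.ones(100), x1, x2])
    b = np.linalg.lstsq(X, y, rcond=None)[0]      # OLS fit
    b1s.append(b[1]); b2s.append(b[2])
b1s, b2s = np.array(b1s), np.array(b2s)
print("SD of b1:", b1s.std(), " SD of b2:", b2s.std(), " SD of b1+b2:", (b1s + b2s).std())
```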

REVISING MODELS
Purposes for regression determine interpretation of coefficients:
– Theory-building: a review of studies may provide a distribution of coefficients (effect sizes); try to fit the current research finding into that distribution
– e.g., the range of correlations between SES and IQ may be between -.1 and .6, with a mean of .33 and SD = .15
– Does the current result fit in the distribution?
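
One way to answer that question is a simple z-score against the review distribution; the mean .33 and SD .15 come from the slide, while the "current" correlation of .10 is a hypothetical value for illustration:

```latex
z = \frac{r_{\text{current}} - \bar r}{SD_r}
  = \frac{.10 - .33}{.15} \approx -1.53
```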

REVISING MODELS
OUTLIER ANALYSIS
– Look at the difference between the predicted and actual score for each case
– Which differences are large?
– Which of the predictor scores are most "discrepant" and causing the large difference in the outcome?
– Remove the outlier and rerun the analysis; does it change the meaning or the coefficients?
– SPSS provides such casewise diagnostics, along with collinearity diagnostics (VIF and condition index, CI)
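
A minimal sketch of these checks using statsmodels on made-up data (the names x1, x2, y are placeholders): studentized residuals flag discrepant cases, and VIF gives the collinearity index mentioned for SPSS.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(1)
df = pd.DataFrame({"x1": rng.normal(size=50), "x2": rng.normal(size=50)})
df["y"] = 0.6 * df["x1"] - 0.4 * df["x2"] + rng.normal(size=50)

X = sm.add_constant(df[["x1", "x2"]])
fit = sm.OLS(df["y"], X).fit()

# Casewise diagnostics: a large |studentized residual| flags a potential outlier.
resid = pd.Series(fit.get_influence().resid_studentized_external, index=df.index)
print(resid.abs().sort_values(ascending=False).head())

# Collinearity diagnostics: VIF per predictor (the constant column can be ignored).
for i, name in enumerate(X.columns):
    print(name, variance_inflation_factor(X.values, i))
```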

REVISING MODELS
DROPPING PATHS THAT ARE NOT SIGNIFICANT
– Drop one path only, then reanalyze and review the results
– Drop a second path, reanalyze and review, paying special attention to whether the first path should come back in (modification indices, partial r's)
– Continue the process with the other candidate paths for deletion

REVISING MODELS
COMPARING MODELS
– R² improvement in subset regressions for path analysis [F-test with (# paths dropped) and df(error) degrees of freedom; see the sketch below]
– Model fit analysis for the entire path model – NFI, chi-square change, etc.
– Dropping paths increases MSerror: a tradeoff between gaining degrees of freedom for error (power) and reducing overall fit of the model (loss of power); examine the change in R² or chi-square per degree of freedom changed
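
A hedged sketch of the R²-change F test referred to above, with q the number of paths/predictors dropped and df(error) = n - k_full - 1; the example values are hypothetical.

```python
from scipy import stats

def r2_change_F(r2_full, r2_reduced, n, k_full, q):
    """F test for the R^2 lost by dropping q predictors/paths from the full model."""
    df_error = n - k_full - 1
    F = ((r2_full - r2_reduced) / q) / ((1 - r2_full) / df_error)
    p = stats.f.sf(F, q, df_error)
    return F, p

# Hypothetical example: dropping 2 paths lowers R^2 from .574 to .560 with n = 150 and 5 predictors.
print(r2_change_F(0.574, 0.560, n=150, k_full=5, q=2))
```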

Venn Diagram for Model Change
[Figure: Venn diagram partitioning SSy into the SS for all effects except the path being examined, the SS for the path being examined, and the SS added by that path.]

Biased Regression
In some situations we trade a biased estimate of the regression coefficients for smaller standard errors.
Ridge regression is one approach: b* = b + k, where k is a small amount
– see if the standard error of estimate (se) gets smaller as k is changed
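
A small ridge-regression sketch with scikit-learn (not the slide's own computation). In the usual formulation the small constant is a penalty added to the diagonal of X'X, which scikit-learn calls alpha; increasing it shrinks the coefficients, and cross-validation can suggest a value.

```python
import numpy as np
from sklearn.linear_model import Ridge, RidgeCV

rng = np.random.default_rng(2)
x1 = rng.normal(size=80)
x2 = x1 + 0.2 * rng.normal(size=80)           # nearly collinear predictors
X = np.column_stack([x1, x2])
y = 0.5 * x1 + 0.3 * x2 + rng.normal(size=80)

# Coefficients shrink toward zero as the penalty grows.
for alpha in [0.01, 1.0, 10.0, 100.0]:
    print(alpha, Ridge(alpha=alpha).fit(X, y).coef_)

# Let cross-validation pick the penalty.
best = RidgeCV(alphas=np.logspace(-2, 3, 30)).fit(X, y)
print("alpha chosen by cross-validation:", best.alpha_)
```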

MEDIATION
VAR Y MEDIATES THE RELATIONSHIP BETWEEN X AND Z WHEN
1. X and Z are significantly related
2. X and Y are significantly related
3. Y and Z are significantly related
4. The relationship between X and Z is reduced (partial mediation) or zero (complete mediation) when Y partials the relationship
[Figure: path diagram X -> Y -> Z]
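
A minimal sketch of those four conditions as regressions, using statsmodels on simulated data in which Y partially mediates the X-Z relationship (variable names follow the slide; the data are invented for illustration).

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
X = rng.normal(size=200)
Y = 0.6 * X + rng.normal(size=200)            # X -> Y
Z = 0.4 * Y + 0.2 * X + rng.normal(size=200)  # Y -> Z plus a direct X -> Z path

def slope(dep, *preds):
    """Unstandardized regression slopes (intercept dropped)."""
    exog = sm.add_constant(np.column_stack(preds))
    return sm.OLS(dep, exog).fit().params[1:]

print("1. X -> Z (total):  ", slope(Z, X))
print("2. X -> Y:          ", slope(Y, X))
print("3. Y -> Z:          ", slope(Z, Y))
print("4. X -> Z given Y:  ", slope(Z, X, Y))  # direct effect shrinks under partial mediation
```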

[Figure: path diagram of X, Y, and Z with error terms ex and ez. Caption: Partial correlation of X with Z partialling out Y.]

[Figure: diagram of Z, X, and Y labeled r²XZ.Y, the squared partial correlation of X and Z with Y partialled out.]
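
For reference, the standard formula for the partial correlation of X and Z with Y partialled out (its square is the r²XZ.Y in the figure):

```latex
r_{XZ.Y} = \frac{r_{XZ} - r_{XY}\, r_{YZ}}
                {\sqrt{\left(1 - r_{XY}^{2}\right)\left(1 - r_{YZ}^{2}\right)}}
```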

MEDIATION
[Figure: mediation path diagram among LOC, SE, and DEP; the reported coefficients are (.679) and -.373.]

MEDIATION – Regressions
[Figure: the mediation model among LOC, SE, and DEP represented as two regressions (Regression 1 and Regression 2).]

GENERAL PATH MODELS
[Figure: path model among LOC, SE, DEP, and ATYPICALITY estimated as three regressions; R² = .481, R² = .574, R² = .200.]

GENERAL PATH MODELS
[Figure: path model among LOC, SE, DEP, and ATYPICALITY with standardized path coefficients -.448*, .512*, and -.373* (* significant); R² = .483, R² = .572, R² = ns.]

GENERAL PATH MODELS
[Figure: the same path model among LOC, SE, DEP, and ATYPICALITY with sex added as an exogenous variable; standardized coefficients -.448*, .512*, -.373*; R² = .483, R² = .572, R² = ns.]