Multiple Regression: Advanced Topics David A. Kenny January 23, 2014.

2 Topics You should already be familiar with Multiple Regression. Topics covered here: –Rescaling –No intercept –Adjusted R² –Bilinear effects –Suppression

3 Rescaling a Predictor Imagine the following equation: Y = a + bX + E. If X′ = d + eX, the new regression equation would be: Y = a − d(b/e) + (b/e)X′ + E. The new intercept is a − d(b/e) and the new slope for X′ is b/e. Note that if e = 1, as it is with centering, the new intercept is a − bd and the slope does not change.
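This algebra is easy to verify numerically. The sketch below (plain NumPy with simulated data; all variable names and the constants d and e are illustrative) fits Y on X, rescales X as X′ = d + eX, refits, and confirms the intercept and slope change exactly as stated:

```python
import numpy as np

def ols(x, y):
    """Return (intercept, slope) from a simple OLS fit of y on x."""
    slope = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
    return y.mean() - slope * x.mean(), slope

rng = np.random.default_rng(0)
X = rng.normal(50, 10, size=200)
Y = 3.0 + 0.5 * X + rng.normal(0, 2.0, size=200)

a, b = ols(X, Y)
d, e = 5.0, 2.0               # arbitrary rescaling constants
Xp = d + e * X                # X' = d + eX

a_new, b_new = ols(Xp, Y)
print(np.isclose(b_new, b / e))            # True: new slope is b/e
print(np.isclose(a_new, a - d * (b / e)))  # True: new intercept is a - d(b/e)
```

The relationship is exact, not approximate: rescaling a predictor changes only how the same fitted line is parameterized, so R² and the predicted values are untouched.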

4 What Changes? Coefficients –intercept: almost always –slope: only if the variable is multiplied or divided Tests of coefficients –intercept: almost always –slope: no change R² and predicted values –no change

5 Rescaling the Criterion Imagine the following equation: Y = a + bX + E. If Y′ = d + eY, the new regression equation would be: Y′ = ae + d + beX + eE. The new intercept is ae + d and the new slope for X is be.
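The same kind of numerical check works for rescaling the criterion. A minimal sketch (simulated data; the constants d and e are arbitrary, chosen to resemble a Celsius-to-Fahrenheit-style rescaling):

```python
import numpy as np

def ols(x, y):
    """Return (intercept, slope) from a simple OLS fit of y on x."""
    slope = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
    return y.mean() - slope * x.mean(), slope

rng = np.random.default_rng(1)
X = rng.normal(size=150)
Y = 2.0 + 1.5 * X + rng.normal(size=150)

a, b = ols(X, Y)
d, e = 32.0, 1.8              # arbitrary rescaling constants
Yp = d + e * Y                # Y' = d + eY

a_new, b_new = ols(X, Yp)
print(np.isclose(a_new, a * e + d))  # True: new intercept is ae + d
print(np.isclose(b_new, b * e))      # True: new slope is be
```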

6 No Intercept It is possible to run a multiple regression equation but fix the intercept to zero. This is done for different reasons: –There may be reason to believe that the intercept is zero, e.g., when the criterion is a change score. –You may want two intercepts, one for each level of a dichotomous predictor: the two-intercept model.
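The two-intercept model can be sketched as follows: suppress the overall intercept and instead enter one dummy column per level of the dichotomous predictor, so each dummy's coefficient is that group's own intercept. (Simulated data; the true values 1.0, 1.5, and 0.8 are illustrative.)

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
group = rng.integers(0, 2, size=n)        # dichotomous predictor
X = rng.normal(size=n)
Y = 1.0 + 0.5 * group + 0.8 * X + rng.normal(0, 0.3, size=n)

# One dummy per group; with no column of ones, each dummy's
# coefficient is that group's intercept.
D0 = (group == 0).astype(float)
D1 = (group == 1).astype(float)
design = np.column_stack([D0, D1, X])     # note: no intercept column
b0, b1, slope = np.linalg.lstsq(design, Y, rcond=None)[0]
# b0 ≈ 1.0 (group 0 intercept), b1 ≈ 1.5 (group 1 intercept), slope ≈ 0.8
```

This fits exactly the same predicted values as the usual intercept-plus-one-dummy coding; it only reparameterizes the model so each group's intercept is read off directly.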

7 Adjusted R² The multiple correlation is biased, i.e., on average it is too large. We can adjust R² for bias by [R² – k/(N – 1)][(N – 1)/(N – k – 1)], where N is the number of cases and k the number of predictors. If the result is negative, the adjusted R² is set to zero. The adjustment is bigger when k is large relative to N. Normally, the adjustment is not made and the regular R² is reported.
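The formula is a one-liner; the example values below are made up to show how severe the shrinkage is when k is large relative to N:

```python
def adjusted_r2(r2, n, k):
    """[R^2 - k/(N - 1)] * [(N - 1)/(N - k - 1)], floored at zero.

    Algebraically equal to the more familiar form
    1 - (1 - R^2)(N - 1)/(N - k - 1).
    """
    adj = (r2 - k / (n - 1)) * ((n - 1) / (n - k - 1))
    return max(adj, 0.0)

# 6 predictors, only 25 cases: R^2 = .30 shrinks to about .067
print(round(adjusted_r2(0.30, 25, 6), 3))   # 0.067
# A small R^2 can adjust to a negative value, which is set to zero
print(adjusted_r2(0.10, 25, 6))             # 0.0
```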

8 Bilinear or Piecewise Regression Imagine you want the effect of X to change at a given value X0. Create two variables: X1 = X when X ≤ X0, zero otherwise X2 = X when X > X0, zero otherwise Then regress Y on X1 and X2.
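The recoding above can be sketched with NumPy. The change point X0 = 5 and the true coefficients (2, 1, 3) are invented for the simulation; the fit recovers one slope for each side of the change point:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.uniform(0, 10, size=300)
X0 = 5.0                                   # hypothetical change point

X1 = np.where(X <= X0, X, 0.0)             # X at or below X0, else zero
X2 = np.where(X > X0, X, 0.0)              # X above X0, else zero

# Simulate data that follow this bilinear model:
# intercept 2, slope 1 below the change point, slope 3 above it.
Y = 2.0 + 1.0 * X1 + 3.0 * X2 + rng.normal(0, 0.5, size=300)

design = np.column_stack([np.ones_like(X), X1, X2])
a, b1, b2 = np.linalg.lstsq(design, Y, rcond=None)[0]
# a, b1, b2 recover roughly 2, 1, and 3
```

A test of b1 = b2 then tests whether the effect of X actually changes at X0. Note that with a single intercept this coding permits a jump in the regression line at X0; forcing the two segments to join there requires a different coding.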


10 Suppression It can occur that a predictor has little or no correlation with the criterion but has a moderate to large regression coefficient. For this to happen, two conditions must co-occur: 1) the predictor must be correlated relatively strongly with one (or more) other predictor, and 2) that other predictor must have a non-trivial coefficient. Because the suppressor is correlated with a predictor that has an effect on the criterion, the suppressor "should" correlate with the criterion. Because it does not, it is given a regression coefficient that compensates for the missing correlation.

11 Hypothetical Example Happiness and Sadness correlate −.75. Happiness correlates .4 with Work Motivation (WM) and Sadness correlates 0. The beta (standardized regression weight) for Happiness predicting WM is .914, and the beta for Sadness is .686. Sadness is the suppressor variable: it does not correlate with the criterion, but it has a non-zero regression coefficient. Because Sadness correlates strongly negatively with Happiness, and because Happiness correlates positively with WM, Sadness "should" correlate negatively with WM. Because it does not, it is given a positive regression coefficient.
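These betas can be reproduced from the correlations alone, since standardized regression weights solve the normal equations R_xx β = r_xy. A minimal NumPy check (the −.75 Happiness–Sadness correlation is the value implied by the reported betas of .914 and .686):

```python
import numpy as np

# Correlation between the predictors: Happiness (H) and Sadness (S)
Rxx = np.array([[1.00, -0.75],
                [-0.75, 1.00]])
# Correlation of each predictor with the criterion, Work Motivation
rxy = np.array([0.4, 0.0])

# Standardized regression weights solve Rxx @ beta = rxy
beta = np.linalg.solve(Rxx, rxy)
print(beta.round(3))   # [0.914 0.686]
```

Note that both betas exceed the predictor's own correlation with the criterion, the signature of suppression.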

12 Next Presentation: Example

13 Thank You!