Redundancy and Suppression

Redundancy and Suppression: Trivariate Regression

Predictors Independent of Each Other: [Venn diagram: Y overlaps X1 (area a) and X2 (area c); X1 and X2 do not overlap each other; the portion of Y shared with neither predictor (area b) is error.]

Redundancy: For each X, sri and βi will be smaller than ryi, and the sum of the squared semipartial r's (a + c) will be less than the multiple R² (a + b + c).

Formulas Used Here

Classical Suppression: ry1 = .38, ry2 = 0, r12 = .45. The sign of β and sr for the classical suppressor variable may be opposite that of its zero-order r12. Notice also that for both predictor variables the absolute value of β exceeds that of the predictor's r with Y. [Venn diagram: X2 overlaps X1 but not Y.]

Classical Suppression: WTF? How has adding a predictor that is uncorrelated with Y (for practical purposes, one whose r with Y is close to zero) increased our ability to predict Y? X2 suppresses the variance in X1 that is irrelevant to Y (area d).

Classical Suppression Math: r²y(1.2), the squared semipartial for predicting Y from X1 with X2 partialled out of X1 (sr1²), is the r² between Y and the residual (X1 − X̂1.2). It is increased (relative to r²y1) by removing from X1 the irrelevant variance due to X2: the variance left in partialled X1 is better correlated with Y than is unpartialled X1.

Classical Suppression Math: with ry2 = 0, r²y1 is less than sr1² = r²y1 / (1 − r12²). [Venn diagram: Y, X1, X2, with X2 overlapping X1 but not Y.]

Net Suppression: ry1 = .65, ry2 = .25, and r12 = .70. Note that β2 has a sign opposite that of ry2. It is always the X with the smaller ryi that ends up with a β of opposite sign. Each β falls outside the range from 0 to ryi, which is always true with any sort of suppression. [Venn diagram: Y, X1, and X2 all overlapping.]

Reversal Paradox (a.k.a. Simpson's Paradox): treating severity of fire as the covariate, when we control for severity of fire, the more fire fighters we send, the less damage is suffered in the fire. That is, for the conditional distributions (where severity of fire is held constant at some set value), sending more fire fighters reduces the amount of damage.

Cooperative Suppression Two X’s correlate negatively with one another but positively with Y (or positively with one another and negatively with Y) Each predictor suppresses variance in the other that is irrelevant to Y both predictor’s , pr, and sr increase in absolute magnitude (and retain the same sign as ryi).

Cooperative Suppression example: Y = how much the students in an introductory psychology class learn. Subjects are graduate teaching assistants. X1 is a measure of the graduate student's level of mastery of general psychology. X2 is an SOIS rating of how well the teacher presents simple, easy-to-understand explanations.

Cooperative Suppression: ry1 = .30, ry2 = .25, and r12 = −.35. (The negative r12 is what makes this cooperative suppression; with r12 = +.35 the example would show redundancy instead.)

Summary When i falls outside the range of 0  ryi, suppression is taking place If one ryi is zero or close to zero, it is classic suppression, and the sign of the  for the X with a nearly zero ryi may be opposite the sign of ryi.

Summary: When neither X has ryi close to zero, but one has a β opposite in sign from its ryi and the other a β greater in absolute magnitude than (but of the same sign as) its ryi, net suppression is taking place. If both X's have |βi| > |ryi|, with each β of the same sign as its ryi, then cooperative suppression is taking place.

Psychologist Investigating Suppressor Effects in a Five-Predictor Model