Multiple Correlation & Regression SPSS



Analyze > Regression > Linear. Notice that we have added “ideal” to the model we tested earlier.

Statistics, Part and Partial Correlations

Plots: ZRESID against ZPRED, and a Histogram of the residuals

ANOVA

R²: In our previous model, without idealism, R² = .049. Adding idealism increased R² by .007, not much of a change.
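The R² change can be computed by fitting the model with and without the new predictor and subtracting. A minimal NumPy sketch on simulated data (the variable names follow the slides, but the data are made up, so the numbers will not match the .049 and .007 above):

```python
import numpy as np

# Simulated stand-ins for the course variables; the slides used the
# real class data set, so these values will differ.
rng = np.random.default_rng(0)
n = 200
misanth = rng.normal(size=n)
ideal = rng.normal(size=n)
ar = 0.2 * misanth + 0.1 * ideal + rng.normal(size=n)

def r_squared(y, *predictors):
    """R-squared from an OLS fit that includes an intercept."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ b
    return 1 - (e @ e) / ((y - y.mean()) @ (y - y.mean()))

r2_without = r_squared(ar, misanth)        # model without idealism
r2_with = r_squared(ar, misanth, ideal)    # model with idealism added
print(r2_with - r2_without)                # the increment in R-squared
```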

Intercept and Slopes

When Misanth and Ideal are both zero, predicted Ar equals the intercept. Holding Ideal constant, predicted Ar increases by .185 points for each one-point increase in Misanth. Holding Misanth constant, predicted Ar increases by .086 points for each one-point increase in Ideal.

Holding Ideal constant, predicted Ar increases by .233 standard deviations for each one-standard-deviation increase in Misanth. Holding Misanth constant, predicted Ar increases by .086 standard deviations for each one-standard-deviation increase in Ideal.
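The standardized slopes (SPSS's Beta column) are what you get by regressing z-scores on z-scores; equivalently, beta equals the raw slope rescaled by SD(X)/SD(Y). A sketch on simulated data showing both routes give the same number:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
misanth = rng.normal(size=n)
ideal = 0.3 * misanth + rng.normal(size=n)      # correlated predictors
ar = 0.2 * misanth + 0.1 * ideal + rng.normal(size=n)

def coefs(y, X):
    """OLS coefficients [intercept, b1, b2, ...]."""
    X1 = np.column_stack([np.ones(len(y)), X])
    b, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return b

def z(v):
    return (v - v.mean()) / v.std(ddof=1)

b = coefs(ar, np.column_stack([misanth, ideal]))             # raw slopes
beta = coefs(z(ar), np.column_stack([z(misanth), z(ideal)])) # standardized

# beta_i equals b_i scaled by SD(X_i) / SD(Y)
print(beta[1], b[1] * misanth.std(ddof=1) / ar.std(ddof=1))
```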

Tests of Partial (Unique) Effects: Removing misanthropy from the model would significantly reduce R². Removing idealism from the model would not significantly reduce R².
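These unique-effect tests are equivalent to an F test on the change in R² when one predictor is dropped (with one numerator df, this F is the square of that coefficient's t). A sketch on simulated data, assuming SciPy is available for the p-value:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 200
misanth = rng.normal(size=n)
ideal = 0.3 * misanth + rng.normal(size=n)
ar = 0.2 * misanth + 0.1 * ideal + rng.normal(size=n)

def r_squared(y, *xs):
    X = np.column_stack([np.ones(len(y)), *xs])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ b
    return 1 - (e @ e) / ((y - y.mean()) @ (y - y.mean()))

r2_full = r_squared(ar, misanth, ideal)
r2_drop = r_squared(ar, misanth)          # model with idealism removed

# F test on the change in R-squared: 1 numerator df,
# n - 3 denominator df for a two-predictor full model
df2 = n - 3
F = (r2_full - r2_drop) / ((1 - r2_full) / df2)
p = stats.f.sf(F, 1, df2)
print(F, p)
```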

srᵢ²: The squared semipartial correlation coefficient is the amount of variance in Y that is explained by Xᵢ above and beyond the variance already explained by the other predictors in the model. In other words, it is the amount by which R² would drop if Xᵢ were removed from the model.

In the Venn diagram of the variance in Ar, Mis, and Ideal:
a + b + c + d = 1
a + b = r² for Ar, Mis
c + b = r² for Ar, Ideal
R² = a + b + c
b = redundancy between Mis and Ideal with respect to predicting Ar
a = sr² for Mis, the unique contribution of Mis
c = sr² for Ideal, the unique contribution of Ideal
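These identities can be checked numerically by fitting the full and reduced models and recovering the four areas from their R² values. A NumPy sketch on simulated data (the areas will not match the slide's diagram):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
mis = rng.normal(size=n)
ideal = 0.4 * mis + rng.normal(size=n)
ar = 0.25 * mis + 0.1 * ideal + rng.normal(size=n)

def r_squared(y, *xs):
    X = np.column_stack([np.ones(len(y)), *xs])
    b_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ b_hat
    return 1 - (e @ e) / ((y - y.mean()) @ (y - y.mean()))

R2 = r_squared(ar, mis, ideal)
a = R2 - r_squared(ar, ideal)   # sr² for Mis: its unique contribution
c = R2 - r_squared(ar, mis)     # sr² for Ideal: its unique contribution
b = R2 - a - c                  # redundancy (can go negative: suppression)
d = 1 - R2                      # variance in Ar left unexplained
print(a + b + c + d)            # the four areas sum to 1
```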

“Part” is the square root of sr². The sr² for Misanth is .23² = .0529. The sr² for Ideal is .007. We previously calculated the sr² for Ideal as the reduction in R² when we removed it from the model.
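SPSS's Part column can be reproduced by correlating Y with the residual of one predictor regressed on the other; its square matches the drop in R² exactly. A sketch on simulated data:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
mis = rng.normal(size=n)
ideal = 0.4 * mis + rng.normal(size=n)
ar = 0.25 * mis + 0.1 * ideal + rng.normal(size=n)

def resid(y, x):
    """Residuals of y after an OLS regression on x (with intercept)."""
    X = np.column_stack([np.ones(len(y)), x])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ b

def r_squared(y, *xs):
    X = np.column_stack([np.ones(len(y)), *xs])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ b
    return 1 - (e @ e) / ((y - y.mean()) @ (y - y.mean()))

# "Part" for Ideal: correlate Ar with the piece of Ideal that is
# independent of Mis
part_ideal = np.corrcoef(ar, resid(ideal, mis))[0, 1]
drop = r_squared(ar, mis, ideal) - r_squared(ar, mis)
print(part_ideal ** 2, drop)    # the two quantities agree
```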

pr²: The squared partial correlation coefficient is the proportional reduction in error variance caused by adding a new predictor to the current model. Of the variance in Y that is not already explained by the other predictors, what proportion is explained by Xᵢ?

sr² versus pr²: sr² is the proportion of all of the variance in Y that is explained uniquely by Xᵢ. pr² is the proportion of that part of the variance in Y not already explained by the other predictors that is explained by Xᵢ.

pr² for Mis is a/(a+d); sr² is a/(a+b+c+d) = a/1. pr² for Ideal is c/(c+d); sr² is c/(a+b+c+d) = c/1. pr² will be larger than sr².
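pr² can be obtained either as the squared residual-on-residual correlation or directly from the two R² values, since pr² = (R²_full − R²_reduced) / (1 − R²_reduced). A sketch on simulated data confirming both routes agree and that pr² ≥ sr²:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200
mis = rng.normal(size=n)
ideal = 0.4 * mis + rng.normal(size=n)
ar = 0.25 * mis + 0.1 * ideal + rng.normal(size=n)

def resid(y, x):
    X = np.column_stack([np.ones(len(y)), x])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ b

def r_squared(y, *xs):
    X = np.column_stack([np.ones(len(y)), *xs])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ b
    return 1 - (e @ e) / ((y - y.mean()) @ (y - y.mean()))

r2_full = r_squared(ar, mis, ideal)
r2_red = r_squared(ar, mis)              # model without Ideal
sr2 = r2_full - r2_red                   # c / (a+b+c+d)
pr2 = sr2 / (1 - r2_red)                 # c / (c+d)

# Same number via the partial correlation (residual on residual)
partial = np.corrcoef(resid(ar, mis), resid(ideal, mis))[0, 1]
print(pr2, partial ** 2)                 # these agree, and pr2 >= sr2
```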

The pr² for Misanth is .053. The pr² for Ideal is .008.

The Marginal Distribution of the Residuals (error) We have assumed that this is normal.

Standardized Residuals Plot

As you scan from left to right, is the variance in the columns of dots constant? Are they normally distributed?
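The visual check can be backed up numerically: compute standardized residuals (residual divided by the standard error of estimate, roughly what SPSS's ZRESID gives) and compare their spread in the lower and upper halves of the predicted values. A sketch on simulated, homoscedastic data:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 200
mis = rng.normal(size=n)
ideal = 0.4 * mis + rng.normal(size=n)
ar = 0.25 * mis + 0.1 * ideal + rng.normal(size=n)

X = np.column_stack([np.ones(n), mis, ideal])
b, *_ = np.linalg.lstsq(X, ar, rcond=None)
pred = X @ b
e = ar - pred
se_est = np.sqrt(e @ e / (n - 3))     # standard error of estimate
z_resid = e / se_est                  # roughly SPSS's ZRESID

low = z_resid[pred <= np.median(pred)]
high = z_resid[pred > np.median(pred)]
# Similar variances in the two halves suggest homoscedasticity
print(low.var(ddof=1), high.var(ddof=1))
```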

Put a CI on R²: If you want the CI to be consistent with the test of significance of R², use a confidence coefficient of 1 − 2α, not 1 − α.
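The slides obtain this interval from a noncentral-F procedure; a bootstrap percentile interval is a rough substitute that still illustrates the 1 − 2α idea (a 90% interval to accompany an α = .05 test). A sketch on simulated data:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200
mis = rng.normal(size=n)
ideal = 0.4 * mis + rng.normal(size=n)
ar = 0.25 * mis + 0.1 * ideal + rng.normal(size=n)

def r_squared(y, *xs):
    X = np.column_stack([np.ones(len(y)), *xs])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ b
    return 1 - (e @ e) / ((y - y.mean()) @ (y - y.mean()))

boot = []
for _ in range(2000):
    i = rng.integers(0, n, size=n)          # resample cases with replacement
    boot.append(r_squared(ar[i], mis[i], ideal[i]))

lo, hi = np.percentile(boot, [5, 95])       # 90% = 1 - 2(.05)
print(lo, hi)
```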

The CI extends from.007 to.121.

Effect of Misanth Moderated by Ideal I had predicted that the relationship between Ar and Misanth would be greater among nonidealists than among idealists. Let us see if that is true. Although I am going to dichotomize Idealism here, that is generally not good practice. There is a better way, covered in advanced stats classes.
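Split-file analysis amounts to fitting the regression separately in each group and comparing slopes. A sketch with simulated data in which the Misanth slope really is steeper for nonidealists (group sizes and effect sizes are made up):

```python
import numpy as np

rng = np.random.default_rng(8)
n = 400
mis = rng.normal(size=n)
idealist = rng.integers(0, 2, size=n)              # 0 = nonidealist
true_slope = np.where(idealist == 0, 0.5, 0.1)     # built-in moderation
ar = true_slope * mis + rng.normal(size=n)

def fit(y, x):
    """OLS intercept and slope."""
    X = np.column_stack([np.ones(len(y)), x])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return b

b_non = fit(ar[idealist == 0], mis[idealist == 0])
b_ide = fit(ar[idealist == 1], mis[idealist == 1])
print(b_non[1], b_ide[1])   # slope should be steeper for nonidealists
```

The "better way" the slides allude to is to keep idealism continuous and test a Misanth × Ideal interaction term in a single model instead of splitting the file.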

Split File by Idealism

Predict Ar from Misanth by Ideal

For the NonIdealists

Predicted Ar = b0 + b1(Misanth)

Among Idealists

Predicted Ar = b0 + b1(Misanth)

Confidence Intervals for β: For the NonIdealists

CI for the Idealists
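A confidence interval for a slope is b ± t·SE(b), where SE(b) is the standard error of estimate divided by the square root of the predictor's sum of squares. A sketch for one group's data, simulated here, using SciPy for the t critical value:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
n = 100
mis = rng.normal(size=n)
ar = 0.5 * mis + rng.normal(size=n)     # one group's data, simulated

X = np.column_stack([np.ones(n), mis])
b, *_ = np.linalg.lstsq(X, ar, rcond=None)
e = ar - X @ b
s = np.sqrt(e @ e / (n - 2))                         # std. error of estimate
se_b = s / np.sqrt(((mis - mis.mean()) ** 2).sum())  # SE of the slope
t_crit = stats.t.ppf(0.975, n - 2)                   # 95% two-sided
ci = (b[1] - t_crit * se_b, b[1] + t_crit * se_b)
print(ci)
```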