Prediction/Regression


Prediction/Regression (Chapter 12), Part 3: Apr. 22, 2008

Multiple Regression
Bivariate prediction uses 1 predictor and 1 criterion; multiple regression uses multiple predictors. The regression model/equation is the same — we just use a separate regression coefficient (b) for each predictor. For example, the multiple regression formula with three predictor variables is:
Ŷ = a + (b1)(X1) + (b2)(X2) + (b3)(X3)
a is still the regression constant (where the regression line crosses the y axis); b1 is the regression coefficient for X1; b2 is the regression coefficient for X2, etc.
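The model above can be sketched numerically. This is a minimal pure-Python illustration (SPSS does the equivalent internally), fitting ordinary least squares via the normal equations; the data are made up and constructed so that Y = 1 + 2·X1 + 3·X2 exactly, so the fitted coefficients should recover those values:

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

# made-up data, constructed so that Y = 1 + 2*X1 + 3*X2 exactly
x1 = [1, 2, 3, 4, 5, 6]
x2 = [2, 1, 4, 3, 6, 5]
y  = [1 + 2*a + 3*b for a, b in zip(x1, x2)]
rows = [[1.0, a, b] for a, b in zip(x1, x2)]  # leading 1 carries the constant a

# normal equations: (X'X) coef = X'y
k = len(rows[0])
XtX = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
Xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
a_hat, b1, b2 = solve(XtX, Xty)
print(round(a_hat, 4), round(b1, 4), round(b2, 4))  # recovers 1.0 2.0 3.0
```

With real (noisy) data the coefficients would not be recovered exactly; the point is only that each predictor gets its own b.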

Standardized regression coefficients
With bivariate regression, we discussed finding the slope of the regression line, b. That was an unstandardized regression coefficient (based on the variable's original scale): if the variable was measured on a 1–8 scale, that would be the scale for b as well. But we are often interested in comparing our regression results to other researchers'. They may have measured the same variables but used different measures (maybe a 1–20 scale). Standardized regression coefficients (β, or beta) let us make that comparison (they are more generalizable).

Using standardized coefficients (betas)
There is a formula in the chapter for changing b into β, but you won't be asked to use it. If we use standardized regression coefficients (β), the regression equation (model) is expressed in Z scores and looks like this:
Ẑ_Y = (β1)(Z_X1) + (β2)(Z_X2)

Overlap among predictors
It is common for the predictor variables to be correlated with one another. β gives us the unique contribution of each predictor: β1 gives the unique contribution of X1 in predicting Y, excluding any overlap with the other predictors. R² gives the percentage of variance in Y explained by all of the predictors together. SPSS provides a significance test for R², which determines whether the regression model as a whole explains significant variance in Y. If it does → then examine the individual predictors' βs; there is a significance test for each of these. Is each predictor important, or only some of them?
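As a quick sketch of where R² comes from, here is a made-up bivariate example computed by hand in Python; with several predictors, R² is computed the same way from the model's predicted scores (1 minus the ratio of residual to total sum of squares):

```python
# made-up bivariate data
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
n = len(x)
mx = sum(x) / n
my = sum(y) / n

# fit the bivariate regression line y-hat = a + b*x
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
a = my - b * mx
pred = [a + b * xi for xi in x]

ss_res = sum((yi - pi) ** 2 for yi, pi in zip(y, pred))  # error left over
ss_tot = sum((yi - my) ** 2 for yi in y)                 # total variability in y
r_squared = 1 - ss_res / ss_tot
print(round(r_squared, 4))  # 0.6
```

Here 60% of the variance in y is explained; the significance test in SPSS asks whether that proportion is reliably greater than zero.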

Interpreting beta
In general, β can be interpreted like a correlation between your predictor and criterion: if β is positive, higher scores on the predictor (x) are related to higher scores on the criterion (y); if β is negative, higher scores on x go with lower scores on y. More specifically, β gives us the predicted amount of change (in SD units) in the criterion for every 1 SD increase in the predictor. For example, β = .30 means a 1 SD increase in x predicts a .30 SD increase in y. In bivariate regression (1 predictor), β is equal to the correlation between the predictor and criterion (r), but this equality no longer holds when we use more than 1 predictor.
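The claim that β equals r in the one-predictor case can be checked directly. A minimal sketch with made-up data, using the fact that β is b rescaled into SD units:

```python
import math

# made-up bivariate data
x = [2, 4, 6, 8]
y = [1, 2, 2, 3]
n = len(x)
mx, my = sum(x) / n, sum(y) / n

sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
sxx = sum((xi - mx) ** 2 for xi in x)
syy = sum((yi - my) ** 2 for yi in y)

b = sxy / sxx                    # unstandardized slope
beta = b * math.sqrt(sxx / syy)  # b * (SDx / SDy); the n-1 terms cancel
r = sxy / math.sqrt(sxx * syy)   # Pearson correlation

print(round(beta, 4), round(r, 4))  # identical with a single predictor
```

With two or more correlated predictors, each β is computed controlling for the others, so this equality breaks down.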

Hypothesis tests for regression
We are usually interested in multiple questions. Is β significantly different from 0? (This is similar to the hypothesis test for correlation: is there any relationship?) If β = 0, then knowing someone's score on x (the predictor) tells us nothing about their score on y (the criterion); we can't predict y from x. In multiple regression, we may also be interested in which predictor is the best (has the strongest relationship to the criterion).
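The slope test can be sketched numerically. This is a minimal illustration with made-up data for the bivariate case: the t statistic is the slope divided by its standard error, compared against a t critical value with n − 2 degrees of freedom (SPSS reports the corresponding 'sig' value):

```python
import math

# made-up bivariate data; does the slope differ significantly from 0?
x = [1, 2, 3, 4, 5, 6]
y = [2, 3, 5, 4, 6, 7]
n = len(x)
mx, my = sum(x) / n, sum(y) / n

sxx = sum((xi - mx) ** 2 for xi in x)
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
a = my - b * mx

ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
se_b = math.sqrt((ss_res / (n - 2)) / sxx)  # standard error of the slope
t = b / se_b  # compare to the t critical value with df = n - 2
print(round(t, 2))  # t is around 5.66 here, well past the df = 4 cutoff
```

SPSS performs the analogous test for each predictor's coefficient in a multiple regression.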

Multiple regression (cont.)
How do we judge the relative importance of each predictor variable in predicting the criterion? Consider both the rs and the βs; they will not necessarily rank the predictors in the same order of magnitude, so check both. The βs indicate the unique relationship between a predictor and the criterion, controlling for the other predictors; the rs indicate the general relationship between x and y (which includes the effects of the other predictors).

Extensions of multiple regression
We have discussed the simplest version of multiple regression, in which all predictors are entered into the equation at the same time. Another option is hierarchical multiple regression: enter X1 at step 1, enter X2 at step 2, enter X3 at step 3, and examine the changes in the equation at each step. How does R² change at each step? What happens to the betas for each variable when the others are introduced into the equation? When might you use hierarchical regression?
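The step-by-step logic can be sketched as follows. This pure-Python example (made-up data, ordinary least squares via the normal equations) fits the step-1 model with X1 only, then adds X2, and reports the change in R²; adding a predictor can never decrease R², so the question is whether the increase is meaningful:

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def r_squared(rows, y):
    """Fit OLS via the normal equations and return R^2 for that model."""
    k = len(rows[0])
    XtX = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    coef = solve(XtX, Xty)
    pred = [sum(c * v for c, v in zip(coef, r)) for r in rows]
    my = sum(y) / len(y)
    ss_res = sum((yi - pi) ** 2 for yi, pi in zip(y, pred))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# made-up data
x1 = [1, 2, 3, 4, 5, 6]
x2 = [2, 1, 4, 3, 6, 5]
y  = [8, 9, 18, 18, 30, 27]

r2_step1 = r_squared([[1.0, a] for a in x1], y)                 # step 1: X1 only
r2_step2 = r_squared([[1.0, a, b] for a, b in zip(x1, x2)], y)  # step 2: add X2
print(round(r2_step1, 3), round(r2_step2, 3), round(r2_step2 - r2_step1, 3))
```

In SPSS, hierarchical entry is done with the Block controls in the Linear Regression dialog, and the "R Square Change" statistic reports this ΔR² with its own significance test.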

Prediction in Research Articles
Bivariate prediction models are rarely reported; multiple regression results are commonly reported. Note the example table in the book: it reports r and β for each predictor and reports R² in a note at the bottom.

Reporting multiple regression
From the previous table: The multiple regression equation was significant, R² = .13, p < .05. Depression (β = .30, p < .001) and age (β = .20, p < .001) both significantly predicted intragroup effect, but number of sessions and duration of the disorder were not significant predictors. This indicates that older adults and those with higher levels of depression had higher (better) intragroup effects.

SPSS regression example
Analyze → Regression → Linear. Note the terms SPSS uses: "Independent Variable" is x (the predictor); "Dependent Variable" is y (the criterion). In the class handout of output, here is what to look for:
"Model Summary" section: shows R².
ANOVA section: the 1st line gives the 'sig' value; if it is < .05, the result is significant. This tests the significance of R² (is the whole regression equation significant or not? If yes → it does predict y).
Coefficients section: the 1st line gives the 'constant' = a. The other lines give the 'standardized coefficients' = β (beta) for each predictor. For each predictor there is also a significance test: if its 'sig' value is < .05, that predictor is significantly different from 0 and does predict y. If it is significant, you'd want to interpret the beta (like a correlation).