Multiple Regression Research Methods and Statistics
Intended Learning Outcomes
At the end of this lecture, and with additional reading, you will be able to:
- describe the principles of regression analysis
- describe the assumptions of regression analysis
- describe the principles of the regression equation
Perfect Positive Relationships
In a scattergram of a perfect positive relationship, all the data points fall on a straight line: every time your sister ages by one year, so do you. Note that no cause can be attributed from a correlational analysis.
Perfect Negative Relationships
The points will also fall on a straight line: every time x increases by a certain amount, y decreases by a certain, constant amount.
The Strength/Magnitude of the Relationship
The strength of the relationship ranges from zero to +1 (for positive relationships) and from zero to -1 (for negative relationships). Correlation coefficients are usually reported to two decimal places.
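The perfect positive relationship above can be checked numerically. A minimal sketch with hypothetical ages (the sister is assumed to be three years older):

```python
import numpy as np

# Hypothetical data: your age (x) and your sister's age (y),
# a perfect positive relationship (she is always 3 years older)
x = np.array([10, 11, 12, 13, 14], dtype=float)
y = x + 3

# Pearson correlation coefficient from the 2x2 correlation matrix
r = np.corrcoef(x, y)[0, 1]
print(f"r = {r:.2f}")  # perfectly linear data give r of 1.00
```

Any data falling exactly on an upward-sloping straight line produce a coefficient of +1, regardless of the slope.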
Regression
Regression is an extension of correlation. It allows you to predict the impact of one variable on another:
- bivariate linear regression assesses one predictor variable against one criterion variable
- multiple regression assesses several predictor variables against one criterion variable
Assumptions of Regression
- Multicollinearity (predictors should not be too highly correlated)
- Homoscedasticity
- Outliers
- Independence
- Linearity
- Normal distribution
- Large sample (15+ per variable)
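One assumption, multicollinearity, can be screened quickly by correlating the predictors with each other. A sketch with hypothetical depression and dissociation scores (the .8 cut-off is a common rule of thumb, not a fixed standard):

```python
import numpy as np

# Hypothetical predictor scores for the same eight participants
depression   = np.array([12, 18, 9, 22, 15, 11, 20, 14], dtype=float)
dissociation = np.array([30, 41, 25, 44, 33, 28, 43, 31], dtype=float)

# Very high correlations between predictors signal multicollinearity,
# which makes the individual B values unstable
r = np.corrcoef(depression, dissociation)[0, 1]
print(f"r between predictors = {r:.2f}")
if abs(r) > 0.8:
    print("Warning: possible multicollinearity")
```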
Regression Analysis
In regression, variables are classed as criterion (DVs) or predictor (IVs) variables. Regression shows the amount of change in y that occurs as a result of a change in x (the regression equation); it therefore allows researchers to predict scores on y from changes in x.
Regression Analysis
For linear regression we can calculate someone's score on y from their score on x:
y = bx + a
where:
- y is the variable to be predicted
- x is the score on variable x
- b is the value of the slope of the line
- a is the value of the constant (where the straight line intercepts the y axis, also called the intercept)
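The equation above can be estimated with a least-squares fit. A minimal sketch on hypothetical depression (x) and anxiety (y) scores:

```python
import numpy as np

# Hypothetical scores: depression (x) predicting anxiety (y)
x = np.array([10, 14, 18, 22, 26], dtype=float)
y = np.array([46, 51, 55, 60, 63], dtype=float)

# np.polyfit with degree 1 fits y = bx + a and returns (b, a)
b, a = np.polyfit(x, y, 1)
print(f"y = {b:.2f}x + {a:.2f}")

# Predict a y score from a new x score using the fitted equation
new_x = 20
print(f"predicted y for x = {new_x}: {b * new_x + a:.2f}")
```

The slope b is how much y changes per one-unit change in x; the intercept a is the predicted y when x is zero.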
For Example
If we think back to our study on stress in the police service, suppose we want to predict the anxiety scores from the depression scores. Remember y = bx + a; with b = 1.1 and a = 35.3, the equation becomes:
y = 1.1x + 35.3
Multiple Regression
Multiple regression is an extension of linear regression. It enables researchers to assess several predictor variables against a single criterion variable.
The multiple regression equation allows all predictor variables to contribute to the outcome of the criterion variable:
y = b₁x₁ + b₂x₂ + b₃x₃ + … + a
Therefore, if we were trying to predict anxiety scores from depression and dissociation scores, we would use:
y = b₁x₁ + b₂x₂ + a
where in our example the constant a was 36.03.
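The two-predictor equation can be estimated with least squares. A sketch on hypothetical data that were constructed to follow y = 2x₁ + x₂ − 10 exactly, so the fit recovers those values (the scores and coefficients are illustrative, not from the police study):

```python
import numpy as np

# Hypothetical predictors and criterion for eight participants;
# anxiety is constructed as 2*depression + dissociation - 10
depression   = np.array([12, 18, 9, 22, 15, 11, 20, 14], dtype=float)
dissociation = np.array([30, 41, 25, 44, 33, 28, 43, 31], dtype=float)
anxiety      = 2 * depression + dissociation - 10

# Design matrix: one column per predictor plus a column of ones
# for the constant a
X = np.column_stack([depression, dissociation, np.ones(len(anxiety))])
coef = np.linalg.lstsq(X, anxiety, rcond=None)[0]
b1, b2, a = coef
print(f"y = {b1:.2f}*x1 + {b2:.2f}*x2 + {a:.2f}")
```

Each B value is the change in the criterion per one-unit change in that predictor, holding the other predictors constant.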
Output Terminology
- R: the multiple correlation coefficient
- R squared: the amount of variance explained
- Adjusted R squared: the variance explained, adjusted to give a more realistic estimate
- Standard error: the standard deviation of the amount of error that may occur
- ANOVA: tests whether the regression line is significantly different from 0
- B and Beta values: B values are the slopes of the line for each predictor; Beta values are standardized coefficients and indicate the strength of each predictor
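R squared and adjusted R squared can be computed by hand from the model's residuals, which makes the "variance explained" idea concrete. A sketch with hypothetical two-predictor data:

```python
import numpy as np

# Hypothetical data: two predictors and a criterion, n = 8
depression   = np.array([12, 18, 9, 22, 15, 11, 20, 14], dtype=float)
dissociation = np.array([30, 41, 25, 44, 33, 28, 43, 31], dtype=float)
anxiety      = np.array([41, 55, 36, 60, 44, 39, 57, 45], dtype=float)

n, k = len(anxiety), 2  # sample size and number of predictors

# Fit the model and get the fitted (predicted) values
X = np.column_stack([depression, dissociation, np.ones(n)])
coef = np.linalg.lstsq(X, anxiety, rcond=None)[0]
fitted = X @ coef

ss_res = np.sum((anxiety - fitted) ** 2)          # unexplained variance
ss_tot = np.sum((anxiety - anxiety.mean()) ** 2)  # total variance

r_squared = 1 - ss_res / ss_tot
adj_r_squared = 1 - (1 - r_squared) * (n - 1) / (n - k - 1)
print(f"R squared = {r_squared:.3f}, adjusted R squared = {adj_r_squared:.3f}")
```

Adjusted R squared is always at most R squared: it penalises adding predictors, which is why it gives the more realistic estimate for small samples.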