Multiple Regression: Research Methods and Statistics

Intended Learning Outcomes

At the end of this lecture, and with additional reading, you will be able to:
- describe the principles of a regression analysis
- describe the assumptions of regression analysis
- describe the principles of the regression equation

Perfect Positive Relationships

In a scattergram, all the data points fall on a straight line: every time your sister ages by one year, so do you. Note that correlational analysis cannot establish causation.

Perfect Negative Relationships

The points fall on a straight line: every time x increases by a certain amount, y decreases by a certain, constant amount.

The Strength/Magnitude of the Relationship

The strength of the relationship ranges from zero to +1 for positive relationships, and from zero to -1 for negative relationships. Correlations are usually reported to two decimal places.

Regression

Regression is an extension of correlation: it allows you to predict scores on one variable from scores on another.
- Bivariate (simple) linear regression assesses one variable against another.
- Multiple regression assesses several variables against another.

Assumptions of regression
- Absence of multicollinearity
- Homoscedasticity
- No extreme outliers
- Independence of observations
- Linearity
- Normal distribution of residuals
- A large sample (a common rule of thumb is at least 15 cases per variable)
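One of the assumptions above, absence of multicollinearity, can be screened quickly by inspecting the correlation between predictors. This is a minimal sketch using made-up predictor scores (the variable names and values are illustrative, not data from the lecture); a correlation near +1 or -1 between predictors is a warning sign:

```python
import numpy as np

# Illustrative (made-up) predictor scores; not data from the lecture.
depression = np.array([10.0, 12.0, 8.0, 15.0, 11.0, 9.0])
dissociation = np.array([5.0, 7.0, 4.0, 9.0, 6.0, 5.0])

# Pairwise correlation between the two predictors.
r = np.corrcoef(depression, dissociation)[0, 1]
print(f"predictor correlation r = {r:.2f}")
```

A high correlation like this one suggests the two predictors carry overlapping information, which inflates the standard errors of their coefficients.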

Regression analysis

In regression, variables are classed as the criterion (the DV) or predictors (the IVs). Regression shows the amount of change in y that results from a change in x (the regression equation); it therefore allows researchers to predict scores on y from changes in x.

Regression analysis

For linear regression we can calculate someone's score on y from their score on x:

y = bx + a

where:
- y is the variable to be predicted
- x is the score on variable x
- b is the value for the slope of the line
- a is the value of the constant (the point where the straight line intercepts the y axis, also called the intercept)
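The equation y = bx + a can be sketched in Python with hypothetical data (the scores below are illustrative, not from the lecture); the slope b and constant a are estimated by least squares:

```python
import numpy as np

# Illustrative (made-up) scores on a predictor x and an outcome y.
x = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
y = np.array([3.1, 5.2, 6.8, 9.1, 10.9])

# Least-squares estimates of the slope b and the constant a.
b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a = y.mean() - b * x.mean()

# Predict y for a new score of x = 5 using y = bx + a.
y_pred = b * 5.0 + a
print(f"b = {b:.3f}, a = {a:.3f}, predicted y = {y_pred:.3f}")
```

The same estimates can be obtained with `np.polyfit(x, y, 1)`; the manual version is shown here because it mirrors the equation on the slide.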

For example

If we think back to our study on stress in the police service, suppose we want to predict the anxiety scores from the depression scores. Remember y = bx + a; in this example the slope is b = 1.1, and substituting a depression score into the fitted equation gives a predicted anxiety score of y = 35.3.

Multiple regression

Multiple regression is an extension of linear regression. It enables researchers to assess several predictor variables against a single criterion variable.

The multiple regression equation allows all predictor variables to contribute to the outcome of the criterion variable. Therefore, if we were trying to predict scores on anxiety from depression and dissociation scores, we would use an equation of the form:

y = b₁x₁ + b₂x₂ + b₃x₃ + … + a

Substituting the depression and dissociation scores from the example into the fitted equation gives a predicted anxiety score of y = 36.03.
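The multiple regression coefficients can be estimated with a least-squares solver. This is a minimal sketch with made-up scores (the variable names and values are illustrative, not the lecture's data):

```python
import numpy as np

# Illustrative (made-up) scores: predict anxiety (y) from depression (x1)
# and dissociation (x2).
x1 = np.array([10.0, 12.0, 8.0, 15.0, 11.0, 9.0])
x2 = np.array([5.0, 7.0, 4.0, 9.0, 6.0, 5.0])
y = np.array([20.0, 25.0, 16.0, 31.0, 23.0, 18.0])

# Design matrix: one column per predictor plus a column of 1s for the constant a.
X = np.column_stack([x1, x2, np.ones_like(x1)])

# Least-squares solution gives b1, b2 and a in y = b1*x1 + b2*x2 + a.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
b1, b2, a = coef
y_hat = X @ coef  # predicted (fitted) anxiety scores
```

The column of 1s in the design matrix is what estimates the constant a; without it the fitted line would be forced through the origin.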

Output terminology
- R: the (multiple) correlation coefficient
- R²: the proportion of variance explained
- Adjusted R²: the variance explained, adjusted to give a more realistic estimate for the population
- Standard error of the estimate: the standard deviation of the errors of prediction
- ANOVA: tests whether the regression model predicts significantly better than chance (i.e. whether the regression slope differs from 0)
- B and Beta values: B values are the unstandardized slopes of the line; Beta values are standardized coefficients and indicate the relative strength of each predictor
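The quantities above can be computed by hand for the simple-regression case. This is a sketch using the same made-up scores as earlier (illustrative only; a package such as statsmodels would report all of these in one summary table):

```python
import numpy as np

# Illustrative (made-up) data for computing the output quantities.
x = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
y = np.array([3.1, 5.2, 6.8, 9.1, 10.9])

# Fit y = bx + a by least squares.
b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a = y.mean() - b * x.mean()
y_hat = b * x + a

ss_res = np.sum((y - y_hat) ** 2)      # residual sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)   # total sum of squares
r_squared = 1 - ss_res / ss_tot        # proportion of variance explained

n, k = len(y), 1                       # sample size and number of predictors
adj_r_squared = 1 - (1 - r_squared) * (n - 1) / (n - k - 1)
std_error = np.sqrt(ss_res / (n - k - 1))   # standard error of the estimate
beta = b * x.std(ddof=1) / y.std(ddof=1)    # standardized coefficient (Beta)
```

With a single predictor, Beta equals the correlation r between x and y, so Beta squared equals R²; with several predictors the Betas let you compare predictors measured on different scales.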