Multiple Regression Scott Hudson January 24, 2011

Bivariate Regression Review Formula: Y (dependent variable) = Constant (intercept) + (Coefficient * X, the independent variable). Example (compliments of Dr. Cumella): Income at Age 40 = -24,000 + (30,000 * College GPA)
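
The example equation above can be written as a one-line function. This is a tiny sketch, not part of the slides; the function name is only for illustration.

    # Bivariate example from the slide: Income at Age 40 = -24,000 + 30,000 * GPA
    def predicted_income(gpa):
        return -24000 + 30000 * gpa

    print(predicted_income(4.0))  # 96000, matching the worked example later in the deck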

Review Continued Pearson's r is the correlation between two variables; it ranges from -1 to +1, with 0 meaning no correlation. R squared comes out as a decimal; move the decimal point two places to the right to express it as a percentage. It tells you what percentage of the variance in one variable is explained by the variance in the other; it is not the statistical significance.
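
As a quick illustration of the relationship between r and R squared, here is a short Python sketch; the data values are made up and not from the slides.

    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

    r = np.corrcoef(x, y)[0, 1]   # Pearson's r, between -1 and +1
    r_squared = r ** 2            # proportion of variance explained
    print(f"r = {r:.3f}, R squared = {r_squared:.3f} ({r_squared * 100:.1f}%)")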

SPSS In the SPSS output, scan for the column labeled "Sig." (significance). A number of errors on the last assignment were attributable to reading the wrong value from the SPSS tables.

Multiple Regression Multiple regression is the same as a bivariate regression analysis, except that we add more predictor variables. There is an essentially unlimited number of theoretical predictor variables we could add, but ultimately we only want to include those predictor variables which are significant.

Multiple Regression Example

Equation Y = a + b1*X1 + b2*X2 + ... + bp*Xp Y'i = b0 + b1*X1i + b2*X2i This is the same formula as we used for simple regression; we have just added more predictor variables. We are still solving for the dependent variable "Y".
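
A minimal sketch of fitting this equation by least squares in Python; the data values and the use of NumPy are illustrative assumptions, not part of the slides.

    import numpy as np

    x1 = np.array([2.0, 2.5, 3.0, 3.5, 4.0])        # e.g., college GPA
    x2 = np.array([30e3, 50e3, 60e3, 80e3, 100e3])  # e.g., parent's income
    y  = np.array([40e3, 55e3, 70e3, 85e3, 100e3])  # e.g., income at age 40

    # Design matrix: a column of 1s for the intercept b0, then the predictors.
    X = np.column_stack([np.ones_like(x1), x1, x2])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("b0, b1, b2 =", b)
    print("fitted values:", X @ b)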

Assumptions: normal distribution, linearity, variables measured without error, homoscedasticity.
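
One rough way to check the normality assumption is to test the residuals of the fitted model. The sketch below is not from the slides; the residual values are made up and SciPy is assumed to be available.

    import numpy as np
    from scipy import stats

    residuals = np.array([-1200.0, 800.0, 300.0, -500.0, 900.0, -300.0, 150.0, -150.0])

    # Shapiro-Wilk test: a p value above .05 is consistent with normally
    # distributed residuals. Homoscedasticity is usually checked by plotting
    # residuals against fitted values and looking for a roughly even spread.
    w, p = stats.shapiro(residuals)
    print(f"Shapiro-Wilk W = {w:.3f}, p = {p:.3f}")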

SPSS Example Data

Significance Testing SPSS will provide us with a p value, which we can use to determine whether a predictor variable is significant or not. If it is not statistically significant, we will not include it in our final multiple regression, as its apparent effect is attributable to chance.
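
SPSS is the tool assumed in the slides; as a rough Python analogue (made-up data, statsmodels assumed available), the p value reported for each predictor corresponds to the "Sig." column.

    import numpy as np
    import statsmodels.api as sm

    gpa           = np.array([2.0, 2.5, 3.0, 3.2, 3.5, 3.8, 4.0])
    parent_income = np.array([30e3, 45e3, 50e3, 70e3, 60e3, 90e3, 100e3])
    income_at_40  = np.array([38e3, 52e3, 66e3, 75e3, 80e3, 95e3, 104e3])

    X = sm.add_constant(np.column_stack([gpa, parent_income]))
    fit = sm.OLS(income_at_40, X).fit()
    print(fit.pvalues)  # p values for the intercept and each predictor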

Types of Multiple Linear Regression Forward, backward, and stepwise (methods of entering variables).
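
The slides do not show how these entry methods work in code. Below is a simplified forward-selection sketch; the function, the significance cutoff, and the assumption of a pandas DataFrame are all illustrative, not part of the presentation.

    import statsmodels.api as sm

    def forward_select(data, outcome, candidates, alpha=0.05):
        # `data` is expected to be a pandas DataFrame holding the outcome
        # column and the candidate predictor columns.
        # Start with no predictors; each round, add the remaining candidate
        # with the smallest p value, but only if that p value is below alpha.
        selected, remaining = [], list(candidates)
        while remaining:
            pvals = {}
            for var in remaining:
                X = sm.add_constant(data[selected + [var]])
                pvals[var] = sm.OLS(data[outcome], X).fit().pvalues[var]
            best = min(pvals, key=pvals.get)
            if pvals[best] >= alpha:
                break  # no remaining predictor is statistically significant
            selected.append(best)
            remaining.remove(best)
        return selected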

Multiple Regression Example Estimated income at age 40 based on college GPA (thanks to Dr. Cumella for allowing use of this example). Estimated Income at Age 40 = -$24,000 + $30,000 (GPA). Example: Income = -24,000 + 30,000 (4.0) (remember to do the multiplication first), which should equal $96,000.

Adding an additional Predictor Variable In this equation we will add an additional variable which has been found to influence income at age 40: parent's income when the parent was age 40. This is not a huge variable, but it does have a multiplier of .10 (applied to parent's income at age 40).

Continued Let's assume the parent's income at age 40 was $100,000. Income = -24,000 + 30,000 (4.0) + .10 (100,000) = $106,000

One more example GPA of 3.0, parent income of $40,000. Income = -24,000 + 30,000 (3.0) + .10 (40,000) = -24,000 + 90,000 + 4,000 = $70,000
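
As a quick sanity check (not part of the slides), both worked examples can be verified with a tiny function.

    def predicted_income(gpa, parent_income):
        # Income = -24,000 + 30,000 * GPA + 0.10 * parent's income at age 40
        return -24000 + 30000 * gpa + 0.10 * parent_income

    print(predicted_income(4.0, 100_000))  # 106000.0
    print(predicted_income(3.0, 40_000))   # 70000.0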