Multiple Regression: a single numerical response variable, Y, and multiple numerical explanatory variables, X1, X2, …, Xk.

Presentation transcript:

1 Multiple Regression A single numerical response variable, Y. Multiple numerical explanatory variables, X1, X2, …, Xk.

2 Multiple Regression

3 Example Y, Response – Effectiveness score based on experienced teachers’ evaluations. Explanatory – Test 1, Test 2, Test 3, Test 4.

4 Test of Model Utility Is there any explanatory variable in the model that is helping to explain significant amounts of variation in the response?

5 Conclusion At least one of Test 1, Test 2, Test 3 or Test 4 is providing statistically significant information about the evaluation score. The model is useful. Maybe not the best, but useful.
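
This model-utility conclusion comes from the overall F test, which can be computed directly from R². A minimal sketch in Python; the R² of 0.45, k = 4 explanatory variables, and sample size n = 28 below are made-up illustrative numbers, not values reported on these slides:

```python
def model_utility_F(r2, n, k):
    """Overall model-utility F statistic for H0: beta_1 = ... = beta_k = 0.

    F = (R^2 / k) / ((1 - R^2) / (n - k - 1)); large F => at least one
    explanatory variable explains a significant amount of variation.
    """
    return (r2 / k) / ((1 - r2) / (n - k - 1))

# Hypothetical values: R^2 = 0.45 from k = 4 tests, n = 28 teachers.
F = model_utility_F(0.45, 28, 4)
print(round(F, 2))  # → 4.7
```

The F statistic is then compared with an F distribution with k and n − k − 1 degrees of freedom to get the P-value.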

6 Individual Slope Parameters To see what the tests of hypotheses for the individual slope parameters in a multiple regression model indicate, we need to go back to simple linear regression.

7 SLR – EVAL on Test 1 Predicted Eval = … + …*Test1. For each additional point scored on Test 1, the Evaluation score increases by … points, on average.

8 Explained Variation R² = 0.295: only 29.5% of the variation in Evaluation is explained by the linear relationship with Test 1.
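
R² comes from decomposing the total variation in the response into explained and unexplained pieces. A self-contained pure-Python sketch; the Test 1 and Evaluation scores below are made-up, not the data behind these slides:

```python
# Hypothetical data: Test 1 scores (x) and Evaluation scores (y).
x = [62, 70, 75, 80, 88, 92]
y = [3.1, 3.0, 3.6, 3.4, 4.2, 3.9]
n = len(x)

xbar = sum(x) / n
ybar = sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))

b1 = sxy / sxx           # least-squares slope
b0 = ybar - b1 * xbar    # least-squares intercept

# Total variation (TSS) vs. variation left unexplained by the line (SSE).
tss = sum((yi - ybar) ** 2 for yi in y)
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
r2 = 1 - sse / tss       # proportion of variation explained
```

With the slides' value R² = 0.295, this ratio would say the line accounts for 29.5% of the variation in Evaluation.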

9 Inference on β1 t-Ratio = 2.97, P-value = …. Reject the null hypothesis that β1 = 0, because the P-value is so small. There is a statistically significant linear relationship between EVAL and Test 1.
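
The t-ratio for the slope is the estimate divided by its standard error; equivalently, in simple linear regression, t = r·√(n−2)/√(1−r²). A sketch using the same made-up data as above (hypothetical scores, not the slides' data):

```python
import math

# Hypothetical data: Test 1 scores (x) and Evaluation scores (y).
x = [62, 70, 75, 80, 88, 92]
y = [3.1, 3.0, 3.6, 3.4, 4.2, 3.9]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
b1 = sxy / sxx
b0 = ybar - b1 * xbar

sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
se_b1 = math.sqrt(sse / (n - 2) / sxx)   # standard error of the slope
t = b1 / se_b1                           # t-ratio for H0: beta_1 = 0

# Equivalent form via R^2: t = sqrt(r2) * sqrt(n-2) / sqrt(1-r2)
tss = sum((yi - ybar) ** 2 for yi in y)
r2 = 1 - sse / tss
t_alt = math.sqrt(r2) * math.sqrt(n - 2) / math.sqrt(1 - r2)
```

The t-ratio is compared with a t distribution with n − 2 degrees of freedom to produce the P-value.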

10 Model with Test 1 If Test 1 is the only explanatory variable in the model, then the scores for Test 2, Test 3 and Test 4 are ignored by this model. What happens if we add Test 2 to the model with Test 1?

11

12 Model with Test 1, Test 2 Predicted EVAL = … + …*Test1 + …*Test2. For each additional point on Test 1, while holding Test 2 constant, the Evaluation score increases by … points, on average.

13 Model with Test 1, Test 2 Predicted EVAL = … + …*Test1 + …*Test2. For each additional point on Test 2, while holding Test 1 constant, the Evaluation score increases by … points, on average.
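
A two-predictor fit like this one can be computed from the normal equations XᵀXb = Xᵀy. A self-contained sketch with made-up Test 1, Test 2, and Evaluation values (hypothetical data, and a small generic linear solver, not the slides' software output):

```python
# Hypothetical data.
x1 = [62, 70, 75, 80, 88, 92]        # Test 1
x2 = [55, 60, 72, 65, 80, 77]        # Test 2
y  = [3.1, 3.0, 3.6, 3.4, 4.2, 3.9]  # Evaluation

def solve(A, b):
    """Solve A @ xs = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    xs = [0.0] * n
    for r in range(n - 1, -1, -1):
        xs[r] = (M[r][n] - sum(M[r][c] * xs[c] for c in range(r + 1, n))) / M[r][r]
    return xs

# Design-matrix columns [1, x1, x2]; normal equations X'X b = X'y.
cols = [[1.0] * len(y), x1, x2]
XtX = [[sum(a * b for a, b in zip(ci, cj)) for cj in cols] for ci in cols]
Xty = [sum(c * yi for c, yi in zip(ci, y)) for ci in cols]
b0, b1, b2 = solve(XtX, Xty)
# b1: change in predicted EVAL per extra Test 1 point, holding Test 2 fixed.
# b2: change in predicted EVAL per extra Test 2 point, holding Test 1 fixed.
```

The "holding the other variable constant" reading of b1 and b2 is exactly the interpretation the slide gives.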

14 Explained Variation R² = 0.367: 36.7% of the variation in Evaluation is explained by the linear relationship with Test 1 and Test 2.

15 Explained Variation R² = 0.367 – Test 1 and Test 2. R² = 0.295 – Test 1 alone. 0.367 − 0.295 = 0.072, so 7.2% of the variation in EVAL is explained by the addition of Test 2 to Test 1.
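
The extra variation explained by Test 2 is just the difference between the two R² values the slides report:

```python
r2_full = 0.367     # model with Test 1 and Test 2 (slide 14)
r2_reduced = 0.295  # model with Test 1 alone (slide 8)

delta = r2_full - r2_reduced
print(round(delta, 3))  # → 0.072: extra share of variation from adding Test 2
```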

16 Parameter Estimates – Test 2 Has Test 2 added significantly to the relationship between Test 1 and Evaluation? Note that this is different from asking if Test 2 is linearly related to Evaluation!

17 Parameter Estimates – Test 2 t-Ratio = 1.51, P-value = …. Because the P-value is not small, Test 2's addition to the model with Test 1 is not statistically significant.

18 Parameter Estimates – Test 2 Although R 2 has increased by adding Test 2, that increase could have happened just by chance. The increase is not large enough to be deemed statistically significant.
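
Whether an increase in R² is more than chance is judged by a partial F test; for a single added variable, F equals the square of that variable's t-ratio. A sketch using the slides' R² values; the sample size n = 23 is an assumption (the slides never state n, though their reported R² and t values are consistent with it):

```python
import math

# Partial F test: does Test 2 add significantly on top of Test 1?
#   F = (delta R^2 / 1) / ((1 - R^2_full) / (n - k - 1))
r2_full, r2_reduced = 0.367, 0.295  # from the slides
n, k = 23, 2                        # n is a hypothetical sample size

F = (r2_full - r2_reduced) / ((1 - r2_full) / (n - k - 1))
t = math.sqrt(F)                    # |t-ratio| for the added variable
print(round(t, 2))  # → 1.51, matching the slide's t-ratio for Test 2
```

So a modest ΔR² of 0.072 yields a t-ratio of about 1.5, which is why the increase is not deemed statistically significant.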

19 Parameter Estimates – Test 1 Does Test 1 add significantly to the relationship between Test 2 and Evaluation? Note that this is different from asking if Test 1 is linearly related to Evaluation!

20 Parameter Estimates – Test 1 t-Ratio = 2.52, P-value = …. Because the P-value is small, Test 1's addition to the model with Test 2 is statistically significant.

21 Parameter Estimates – Test 1 If we had started with a model relating Test 2 to EVAL, adding Test 1 would result in an increase in R². That increase is large enough to be deemed statistically significant.