Stat 112 Notes 8
Today:
–Chapter 4.3 (Assessing the Fit of a Regression Model)
–Chapter 4.4 (Comparing Two Regression Models)
–Chapter 4.5 (Prediction with a Multiple Regression Equation)

Gas Mileage Regression

R-Squared (Coefficient of Determination)
The coefficient of determination R² for multiple regression is defined just as for simple linear regression:
R² = (Total sum of squares − Residual sum of squares) / (Total sum of squares) = 1 − SSE/SST.
R² represents the percentage of the variation in y that is explained by the multiple regression. R² is between 0 and 1; the closer R² is to 1, the better the fit of the regression equation to the data.
For the gas mileage regression, R² is reported as RSquare in JMP's Summary of Fit table.
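The course software is JMP, but the same quantity can be computed in any statistics package. Below is a minimal sketch in Python using statsmodels; the file name cars.csv and the column names gpm1000, weight, horsepower, seating, and length are hypothetical stand-ins for the actual gas mileage data set.

```python
# Sketch: compute R^2 for a multiple regression (hypothetical data and column names).
import pandas as pd
import statsmodels.formula.api as smf

cars = pd.read_csv("cars.csv")  # hypothetical automobile data file

# Regress GPM1000 (gallons per 1000 miles) on the four explanatory variables.
fit = smf.ols("gpm1000 ~ weight + horsepower + seating + length", data=cars).fit()

# R^2 directly from the fitted model ...
print("R-squared:", fit.rsquared)

# ... and from its definition, 1 - SSE/SST.
sse = fit.ssr            # residual (error) sum of squares
sst = fit.centered_tss   # total sum of squares around the mean of y
print("R-squared from sums of squares:", 1 - sse / sst)
```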

Comparing Two Regression Models
Multiple regression model for the automobile data: E(GPM1000) = β0 + β1·weight + β2·horsepower + β3·seating + β4·length.
We can use a t test to test whether a single variable, for example seating, is useful after putting the other three variables into the model.
How do we test whether seating and/or length are useful predictors once weight and hp are taken into account, i.e., test H0: β3 = β4 = 0 versus Ha: at least one of β3, β4 is not zero?
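For reference, here is a sketch of the individual t tests in Python/statsmodels (same hypothetical data set and column names as above). Each t test assesses one coefficient at a time, which is why a separate test is needed for a joint hypothesis about two coefficients.

```python
# Sketch: individual t tests for each regression coefficient (hypothetical data/columns).
import pandas as pd
import statsmodels.formula.api as smf

cars = pd.read_csv("cars.csv")  # hypothetical automobile data file
full = smf.ols("gpm1000 ~ weight + horsepower + seating + length", data=cars).fit()

# Each row: estimate, t statistic, and two-sided p-value for H0: beta_j = 0,
# given that all of the other variables are already in the model.
print(pd.DataFrame({"estimate": full.params,
                    "t": full.tvalues,
                    "p-value": full.pvalues}))
```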

Full vs. Reduced Model
General setup for testing whether any of the variables x_{L+1},…,x_K are useful for predicting y after taking into account the variables x_1,…,x_L:
Full model: E(y) = β0 + β1x1 + … + βLxL + β_{L+1}x_{L+1} + … + βKxK
Reduced model: E(y) = β0 + β1x1 + … + βLxL
H0: β_{L+1} = … = βK = 0 (the reduced model is adequate) versus Ha: at least one of β_{L+1},…,βK is not zero.
Is the full model better than the reduced model?
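As an illustration (again with the hypothetical cars.csv data and column names), fitting the full and reduced models side by side shows the quantities the partial F test on the next slide compares: the residual sum of squares can only go down when variables are added, and the test asks whether it goes down by more than chance alone would produce.

```python
# Sketch: fit full and reduced models and compare their fits (hypothetical data/columns).
import pandas as pd
import statsmodels.formula.api as smf

cars = pd.read_csv("cars.csv")  # hypothetical automobile data file

full = smf.ols("gpm1000 ~ weight + horsepower + seating + length", data=cars).fit()
reduced = smf.ols("gpm1000 ~ weight + horsepower", data=cars).fit()

# SSE(full) is always <= SSE(reduced); the partial F test judges whether
# the drop is large enough to prefer the full model.
print("SSE (full):   ", full.ssr)
print("SSE (reduced):", reduced.ssr)
print("R^2 (full):   ", full.rsquared)
print("R^2 (reduced):", reduced.rsquared)
```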

Partial F test
Test statistic:
F = [ (SSE(reduced) − SSE(full)) / (K − L) ] / [ SSE(full) / (n − K − 1) ]
Under H0, F has an F(K−L, n−K−1) distribution. Round both degrees of freedom down when using Table B.4.
Decision rule for a test with significance level α:
–Reject H0 if F ≥ F(α; K−L, n−K−1)
–Accept H0 if F < F(α; K−L, n−K−1)
p-value = Prob( F(K−L, n−K−1) > F )
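A sketch of the partial F computation in Python (hypothetical data set and column names as before); the manual calculation follows the formula above, and statsmodels' nested-model ANOVA gives the same test.

```python
# Sketch: partial F test of H0: the extra variables add nothing (hypothetical data/columns).
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats
from statsmodels.stats.anova import anova_lm

cars = pd.read_csv("cars.csv")  # hypothetical automobile data file

full = smf.ols("gpm1000 ~ weight + horsepower + seating + length", data=cars).fit()
reduced = smf.ols("gpm1000 ~ weight + horsepower", data=cars).fit()

K, L = 4, 2            # number of explanatory variables in the full and reduced models
n = int(full.nobs)

# F = [(SSE(reduced) - SSE(full)) / (K - L)] / [SSE(full) / (n - K - 1)]
F = ((reduced.ssr - full.ssr) / (K - L)) / (full.ssr / (n - K - 1))
p_value = stats.f.sf(F, K - L, n - K - 1)
print("partial F:", F, " p-value:", p_value)

# Same test via statsmodels' comparison of nested models.
print(anova_lm(reduced, full))
```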

Automobile Example
Test whether seating and length are useful predictors once weight and hp are taken into account.
From Table B.4, F(.05; 2, 120) = 3.07 [rounding the denominator degrees of freedom down to the nearest value in the table].
Because the partial F statistic 60.59 > 3.07, we reject H0. There is evidence that seating and/or length are useful predictors once weight and hp are taken into account.
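When software is available instead of Table B.4, the critical value and p-value can be computed exactly. Here is a small check using scipy; the denominator degrees of freedom of 120 below follows the slide's rounded table lookup and is an assumption about the exact residual degrees of freedom.

```python
# Sketch: critical value and p-value for the partial F test in the automobile example.
from scipy import stats

F_obs = 60.59        # observed partial F statistic from the slide
dfn, dfd = 2, 120    # numerator df = 2; denominator df rounded down as in Table B.4

crit = stats.f.ppf(0.95, dfn, dfd)   # 5% critical value, approximately 3.07
p_value = stats.f.sf(F_obs, dfn, dfd)

print("critical value:", round(crit, 2))
print("reject H0:", F_obs > crit)
print("p-value:", p_value)
```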

Test of Usefulness of Model
Are any of the variables x1,…,xK useful for predicting y?
Multiple linear regression model: E(y) = β0 + β1x1 + … + βKxK
H0: β1 = … = βK = 0 (none of the explanatory variables are useful) versus Ha: at least one βj is not zero.

F Test of Usefulness of Model
Test statistic: F = [ Regression sum of squares / K ] / [ SSE / (n − K − 1) ] = Mean Square (Model) / Mean Square (Error)
Under H0, F has an F(K, n−K−1) distribution.
Decision rule: Reject H0 if F ≥ F(α; K, n−K−1) [see Appendix B.3-B.5].
The F test appears in JMP in the Analysis of Variance table; Prob > F is the p-value for the F test.
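In Python/statsmodels (same hypothetical data set and column names), the overall F statistic and its p-value, the analogues of the F Ratio and Prob > F in JMP's Analysis of Variance table, are available directly from the fitted model.

```python
# Sketch: overall F test of model usefulness (hypothetical data and column names).
import pandas as pd
import statsmodels.formula.api as smf

cars = pd.read_csv("cars.csv")  # hypothetical automobile data file
fit = smf.ols("gpm1000 ~ weight + horsepower + seating + length", data=cars).fit()

# F tests H0: all slope coefficients equal 0 (no variable is useful).
print("F statistic:", fit.fvalue)
print("Prob > F (p-value):", fit.f_pvalue)
```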

Test of Usefulness of Model for Gas Mileage Data

Prediction in Gas Mileage Data
The design team is planning a new car with the following characteristics: weight = 4000 lbs, horsepower = 200, seating = 5 adults, length = 200 inches. What is a 95% prediction interval for the GPM1000 of this car?

Prediction with Multiple Regression Equation
Prediction interval for an individual with explanatory variable values x1,…,xK: ŷ ± t(α/2; n−K−1) × (standard error of prediction), where ŷ = b0 + b1x1 + … + bKxK is the predicted value.
For a large number of observations (say n > 30 + 10 × number of explanatory variables), the 95% prediction interval is approximately ŷ ± 2 × RMSE.
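A sketch of the approximate interval in Python (hypothetical data set and column names); the new-car values come from the prediction question above.

```python
# Sketch: approximate 95% prediction interval, y_hat +/- 2*RMSE (hypothetical data/columns).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

cars = pd.read_csv("cars.csv")  # hypothetical automobile data file
fit = smf.ols("gpm1000 ~ weight + horsepower + seating + length", data=cars).fit()

new_car = pd.DataFrame({"weight": [4000], "horsepower": [200],
                        "seating": [5], "length": [200]})

y_hat = float(fit.predict(new_car).iloc[0])
rmse = np.sqrt(fit.mse_resid)   # root mean squared error of the regression
print("approximate 95% PI:", (y_hat - 2 * rmse, y_hat + 2 * rmse))
```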

Finding Prediction Interval in JMP
Enter a row with the values of the explanatory variables x1,…,xK for the new individual. Do not enter a y for the new individual.
Fit the model. Because the new individual does not have a y, JMP will not include the new individual when calculating the least squares fit.
Click the red triangle next to the response, then click Save Columns:
–To find ŷ, click Predicted Values. This creates a column containing ŷ for each row, including the new individual.
–To find the 95% PI, click Indiv Confid Interval. This creates columns with the lower and upper endpoints of the 95% prediction interval.
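The analogous computation in Python/statsmodels (same hypothetical data set and column names) returns both the predicted value and the exact prediction interval for the new row, mirroring JMP's Predicted Values and Indiv Confid Interval columns.

```python
# Sketch: exact prediction interval for a new observation (hypothetical data/columns).
import pandas as pd
import statsmodels.formula.api as smf

cars = pd.read_csv("cars.csv")  # hypothetical automobile data file
fit = smf.ols("gpm1000 ~ weight + horsepower + seating + length", data=cars).fit()

new_car = pd.DataFrame({"weight": [4000], "horsepower": [200],
                        "seating": [5], "length": [200]})

pred = fit.get_prediction(new_car).summary_frame(alpha=0.05)
# 'mean' is y_hat; 'obs_ci_lower' / 'obs_ci_upper' are the 95% prediction interval.
print(pred[["mean", "obs_ci_lower", "obs_ci_upper"]])
```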

Prediction in Automobile Example
The design team is planning a new car with the following characteristics: weight = 4000 lbs, horsepower = 200, seating = 5 adults, length = 200 inches.
From JMP:
–ŷ: see the saved Predicted Values column
–95% prediction interval: (33.36, 45.63)