Prediction and Prediction Intervals

Lecture 11: Prediction and Prediction Intervals
Review of Lecture 10
Prediction and Prediction Intervals
More Examples about Model Comparison

Steps for Model Comparison

H0: the Reduced Model (RM) is adequate  vs  H1: the Full Model (FM) is adequate.

Step 1: Fit the FM and record its SSE and df (from the ANOVA table) and R_sq (reported below the Coefficient Table).
Step 2: Fit the RM and record its SSE, df, and R_sq.
Step 3: Compute the F-statistic (given below).
Step 4: Conclusion: Reject H0 if F > F(r, df(SSE, FM), alpha); otherwise do not reject H0. Here r is the number of coefficients constrained (usually set to zero) in passing from the FM to the RM.
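The F-statistic of Step 3 appeared only as an image in the transcript; the standard nested-model form, consistent with the degrees of freedom used in Step 4, is

F = \frac{[SSE(RM) - SSE(FM)]/r}{SSE(FM)/df(FM)}
  = \frac{[R^2(FM) - R^2(RM)]/r}{[1 - R^2(FM)]/df(FM)},
\qquad r = df(RM) - df(FM),

which under H0 follows an F distribution with (r, df(FM)) degrees of freedom.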

Special Case: ANOVA Table (Analysis of Variance)

Source       Sum of Squares   df       Mean Square          F-test        P-value
Regression   SSR              p        MSR = SSR/p          F = MSR/MSE
Residuals    SSE              n-p-1    MSE = SSE/(n-p-1)
Total        SST              n-1
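This overall F-test is equivalent to testing H0: beta_1 = ... = beta_p = 0, i.e. that R^2 = 0 (compare the after-class questions at the end of the lecture); a standard identity, not printed on the slide, makes the link explicit:

F = \frac{MSR}{MSE} = \frac{R^2/p}{(1 - R^2)/(n - p - 1)}.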

Predictions: Recall the prediction for the SLR model:
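The formulas on this slide were images in the transcript; the standard SLR results, restated here for reference, are: for a new predictor value x_0,

\hat{y}_0 = \hat\beta_0 + \hat\beta_1 x_0,
\qquad
s.e.(\hat{y}_0 - y_0) = \hat\sigma \sqrt{1 + \frac{1}{n} + \frac{(x_0 - \bar{x})^2}{\sum_{i=1}^{n}(x_i - \bar{x})^2}},

so a (1 - alpha) prediction interval for y_0 is \hat{y}_0 \pm t_{n-2, \alpha/2} \, s.e.(\hat{y}_0 - y_0). For the mean response E(y_0) the leading 1 inside the square root is dropped, giving the narrower confidence interval.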

Prediction for the MLR Model, and its Standard Errors:
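The MLR formulas were likewise images; the standard analogues, with x_0 the vector of predictor values (including the leading 1 for the intercept) and X the design matrix, are

\hat{y}_0 = x_0^T \hat\beta,
\qquad
s.e.(\hat{y}_0 - y_0) = \hat\sigma \sqrt{1 + x_0^T (X^T X)^{-1} x_0},
\qquad
s.e.(\hat\mu_0) = \hat\sigma \sqrt{x_0^T (X^T X)^{-1} x_0},

with prediction and confidence intervals \hat{y}_0 \pm t_{n-p-1, \alpha/2} times the corresponding standard error.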

Problem 3.5 (Page 76, textbook). Table 3.11 shows the regression output, with some numbers erased, when a simple regression model relating a response variable Y to a predictor variable X1 is fitted based on 20 observations. Complete the 13 missing numbers, then compute Var(Y) and Var(X1).

ANOVA Table
Source       Sum of Squares   df   Mean Square   F-test
Regression   1848.76
Residual
Total

Coefficient Table
Variable   Coefficients   s.e.     T-test   P-value
Constant   -23.4325       12.74             .0824
X1                        .1528    8.32     <.0001

n =      R^2 =      Ra^2 =      S =      df =
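The worked solution appeared only as images in the transcript (the two slides following this one carried no recoverable text). A minimal Python sketch of the standard identities that recover the erased entries, assuming only the four numbers left in the table and a simple linear regression with an intercept:

from math import sqrt

# Entries still visible in Table 3.11; everything below is derived from them.
n = 20          # number of observations
SSR = 1848.76   # regression sum of squares
t_x1 = 8.32     # t statistic for the slope of X1
se_x1 = 0.1528  # standard error of the slope of X1

df_reg, df_res, df_tot = 1, n - 2, n - 1     # SLR degrees of freedom
F = t_x1 ** 2                                # in SLR the overall F equals t^2
MSR = SSR / df_reg
MSE = MSR / F                                # since F = MSR / MSE
SSE = df_res * MSE
SST = SSR + SSE
R2 = SSR / SST
Ra2 = 1 - (1 - R2) * df_tot / df_res         # adjusted R^2
S = sqrt(MSE)                                # residual standard error
beta1 = t_x1 * se_x1                         # slope estimate, since t = beta1 / s.e.
var_Y = SST / (n - 1)                        # sample variance of Y
Sxx = (S / se_x1) ** 2                       # from s.e.(beta1) = S / sqrt(Sxx)
var_X1 = Sxx / (n - 1)                       # sample variance of X1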

Problem 3.12 (Page 78, textbook). Table 3.14 shows the regression output of a MLR model relating the beginning salaries in dollars of employees in a given company to the following predictor variables:
Sex (X1): an indicator variable (man = 1, woman = 0)
Education (X2): years of schooling at the time of hire
Experience (X3): number of months of previous work experience
Months (X4): number of months with the company
In (a)-(b) below, specify the null and alternative hypotheses, the test used, and your conclusion using a 5% level of significance.

Table 3.14

ANOVA Table
Source       Sum of Squares   df   Mean Square   F-test
Regression   23665352         4    5916338       22.98
Residual     22657938         88   257477
Total        46323290         92

Coefficient Table
Variable     Coefficients   s.e.    T-test   P-value
Constant     3526.4         327.7   10.76    .000
Sex          722.5          117.8   6.13
Education    90.02          24.69   3.65
Experience   1.2690         .5877   2.16     .034
Months       23.406         5.201   4.50

n = 93   R^2 = .515   Ra^2 = .489   S = 507.4   df = 88

(a) Conduct the F-test for the overall fit of the regression (note F(4, 88, .05) < 2.53).
Test H0: vs H1:
Statistic: F =        df = ( , )
Conclusion: Reject H0; the overall fit is significant.

(b) Is there a positive linear relationship between Salary and Experience, after accounting for the effect of the variables Sex, Education, and Months?
Test H0: vs H1:
Statistic: T =        P-value =
Conclusion: Reject H0; the positive relationship is significant at the 5% significance level.
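As a rough numerical check of (a)-(b), here is a sketch using SciPy (the critical value and p-values below are computed, not read from the slide):

from scipy import stats

F, df1, df2 = 22.98, 4, 88
F_crit = stats.f.ppf(0.95, df1, df2)      # about 2.48, consistent with F(4,88,.05) < 2.53
p_overall = stats.f.sf(F, df1, df2)       # overall-fit p-value, far below .05

t_exp = 2.16                              # t statistic for Experience
p_one_sided = stats.t.sf(t_exp, df2)      # one-sided p for a positive slope, about .017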

(c) What salary would you forecast for a man with 12 years of Education, 10 months of Experience, and 15 months with the company?
(d) What salary would you forecast, on average, for a man with 12 years of Education, 10 months of Experience, and 15 months with the company?
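The point forecast is the same in (c) and (d); only the accompanying interval differs (a prediction interval for an individual salary versus a confidence interval for the mean salary), and either interval would also require (X^T X)^{-1}, which Table 3.14 does not report. A minimal sketch of the point forecast from the fitted coefficients:

# Fitted coefficients from Table 3.14
beta = {"Constant": 3526.4, "Sex": 722.5, "Education": 90.02,
        "Experience": 1.2690, "Months": 23.406}

# Man (Sex = 1) with 12 years of Education, 10 months of Experience, 15 Months with the company
x_new = {"Sex": 1, "Education": 12, "Experience": 10, "Months": 15}
y_hat = beta["Constant"] + sum(beta[k] * x_new[k] for k in x_new)   # about 5693 dollars

# Setting Sex = 0 (i.e. subtracting 722.5) gives the corresponding forecast for a woman, as asked in part (e).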

(e) What salary would you forecast, on average, for a woman with 12 years of Education, 10 months of Experience, and 15 months with the company?

Problem 3.13 (Page 79, textbook). Consider the regression model that generated the output in Table 3.14 to be a Full Model. Now consider the Reduced Model in which Salary is regressed on only Education. The ANOVA table obtained when fitting this model is shown in Table 3.15. Conduct a single test to compare the Full and Reduced Models. What conclusion can be drawn from the result of the test? (Use a 5% significance level.)

Table 3.15

ANOVA Table
Source       Sum of Squares   df   Mean Square   F-test
Regression   7862535          1                  18.60
Residual     38460756         91   422646
Total        46323291         92

Test H0: vs H1:
Statistic: SSE(R) =     df(R) =     SSE(F) =     df(F) =     F =     df = ( , )
Conclusion: Reject H0; the Reduced Model is not adequate, i.e. the Full Model fits significantly better.
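A sketch of the comparison, taking SSE and df for the Full Model from Table 3.14 and for the Reduced Model from Table 3.15 (SciPy is used only for the critical value):

from scipy import stats

SSE_R, df_R = 38460756, 91   # Reduced Model: Salary on Education only (Table 3.15)
SSE_F, df_F = 22657938, 88   # Full Model: all four predictors (Table 3.14)

r = df_R - df_F                                   # number of dropped coefficients (3)
F = ((SSE_R - SSE_F) / r) / (SSE_F / df_F)        # about 20.5 on (3, 88) df
F_crit = stats.f.ppf(0.95, r, df_F)               # about 2.71; F >> F_crit, so reject H0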

After-class Questions:
Why can the ANOVA table be used to test whether R_sq = 0?
Why can an F-test be used to test whether the effect of a predictor variable is significant?