
1 Lecture 4 Main Tasks Today
1. Review of Lecture 3
2. Accuracy of the LS Estimators
3. Significance Tests of the Parameters
4. Confidence Intervals
5. Prediction and Prediction Intervals

2 Review of Lecture 3
The LS estimators of $\beta_0$ and $\beta_1$:

$\hat{\beta}_1 = \dfrac{\sum_{i=1}^n (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^n (x_i - \bar{x})^2}, \qquad \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$

Properties: (1) Linearity; (2) Unbiasedness; (3) Normality.

3 Review of Lecture 3 (cont.)
(4) Best Linear Unbiased Estimators (BLUE): for any other linear unbiased estimators $\tilde{\beta}_0$ and $\tilde{\beta}_1$ of $\beta_0$ and $\beta_1$, we have

$\mathrm{Var}(\hat{\beta}_0) \le \mathrm{Var}(\tilde{\beta}_0), \qquad \mathrm{Var}(\hat{\beta}_1) \le \mathrm{Var}(\tilde{\beta}_1)$

4 Accuracy of the LS Estimates
The noise variance $\sigma^2$ is usually unknown. It has a natural estimator based on the residuals:

$\hat{\sigma}^2 = \dfrac{\mathrm{SSE}}{n-2}, \qquad \mathrm{SSE} = \sum_{i=1}^n (y_i - \hat{y}_i)^2$

SSE = Sum of Squared Errors (Residuals); n − 2 = degrees of freedom of SSE = sample size − number of coefficients. Using the estimated noise variance, we can obtain measures of the accuracy (standard errors, s.e.) of the estimators $\hat{\beta}_0$ and $\hat{\beta}_1$:

$\mathrm{s.e.}(\hat{\beta}_1) = \dfrac{\hat{\sigma}}{\sqrt{\sum_i (x_i - \bar{x})^2}}, \qquad \mathrm{s.e.}(\hat{\beta}_0) = \hat{\sigma}\sqrt{\dfrac{1}{n} + \dfrac{\bar{x}^2}{\sum_i (x_i - \bar{x})^2}}$
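These quantities can be sketched in a few lines of Python. The toy data below are hypothetical (not the lecture's computer repair data), and only the standard library is used:

```python
import math

# Hypothetical toy data, roughly linear with slope near 2
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)

xbar = sum(x) / n
ybar = sum(y) / n
Sxx = sum((xi - xbar) ** 2 for xi in x)
Sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))

# Least-squares estimates
b1 = Sxy / Sxx
b0 = ybar - b1 * xbar

# Residuals and SSE; sigma^2 is estimated by SSE / (n - 2)
residuals = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
SSE = sum(e ** 2 for e in residuals)
sigma2_hat = SSE / (n - 2)

# Standard errors of the LS estimators
se_b1 = math.sqrt(sigma2_hat / Sxx)
se_b0 = math.sqrt(sigma2_hat * (1 / n + xbar ** 2 / Sxx))
```

Here n − 2 = 3, matching the "sample size − number of coefficients" rule for the degrees of freedom of SSE.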

5 Significance Tests of the Parameters
In the SLR model, we often want to know whether
(1) Y and X are linearly uncorrelated, i.e. X has no effect on Y ($\beta_1 = 0$);
(2) the regression line passes through the origin, i.e. no intercept is needed ($\beta_0 = 0$).
A hypothesis test has two hypotheses: H0 (null) and H1 (alternative). A general test about the slope (or intercept) can be written as

$H_0: \beta_1 = \beta_1^0 \quad \text{versus} \quad H_1: \beta_1 \ne \beta_1^0$

where $\beta_1^0$ (or $\beta_0^0$) is a predetermined constant (the experts' belief). Testing (1) [or (2)] is equivalent to testing $\beta_1 = 0$ [or $\beta_0 = 0$]:
(a) $H_0: \beta_1 = 0$ versus $H_1: \beta_1 \ne 0$;  (b) $H_0: \beta_0 = 0$ versus $H_1: \beta_0 \ne 0$.

6 Test Statistics and Two-Sided Tests
To test the hypotheses, we need to construct test statistics. A test statistic can usually be constructed as

$t = \dfrac{\text{estimate} - \text{hypothesized value}}{\text{s.e. of the estimate}}$

For example, to test (1) and (2), we have the test statistics

$t_1 = \dfrac{\hat{\beta}_1 - 0}{\mathrm{s.e.}(\hat{\beta}_1)}, \qquad t_0 = \dfrac{\hat{\beta}_0 - 0}{\mathrm{s.e.}(\hat{\beta}_0)}$

which have t-distributions with n − 2 degrees of freedom (df) under H0. Let $t(n-2, \alpha/2)$ be the upper $\alpha/2$-percentile of the t-distribution with n − 2 df; these percentiles can be found in the Appendix (Table A.2, Page 339). Then the level-$\alpha$ two-sided test for (1) or (2) rejects H0 when $|t| > t(n-2, \alpha/2)$.
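As a sketch of this decision rule, the following Python computes the slope's t statistic and applies the two-sided 5% test. The toy data are hypothetical (not the lecture's example), and the critical value t(3, .025) = 3.182 is taken from a standard t table:

```python
import math

# Hypothetical toy data
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)

xbar = sum(x) / n
ybar = sum(y) / n
Sxx = sum((xi - xbar) ** 2 for xi in x)
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / Sxx
b0 = ybar - b1 * xbar

SSE = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
se_b1 = math.sqrt(SSE / (n - 2) / Sxx)

# Test statistic for H0: beta1 = 0 versus H1: beta1 != 0
t1 = (b1 - 0) / se_b1

# Two-sided 5%-level test: reject H0 when |t1| exceeds the upper
# 2.5th percentile of the t-distribution with n - 2 = 3 df
t_crit = 3.182
reject_H0 = abs(t1) > t_crit
```

With this toy data the t statistic is far beyond the critical value, so H0: beta1 = 0 is rejected.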

7 P-value of the Test
Alternatively, we can compute the p-value of the test statistic to perform the test: if the p-value < $\alpha$, the test is significant at level $\alpha$. The smaller the p-value, the more significant the test. In general, we say
1) when p-value < .05, the test is significant;
2) when p-value < .01, the test is very significant;
3) when p-value > .20, the test is not significant.
Example: for the computer repair data, to test the significance of a parameter: since 6.948 > 2.18, the test is significant at the 5% level. The p-value is less than .005, so the test is highly significant.

8 One-Sided Tests
(a) $H_0: \beta_1 \le \beta_1^0$ versus $H_1: \beta_1 > \beta_1^0$  (reject H0 when $t > t(n-2, \alpha)$)
(b) $H_0: \beta_1 \ge \beta_1^0$ versus $H_1: \beta_1 < \beta_1^0$  (reject H0 when $t < -t(n-2, \alpha)$)
(c) $H_0: \beta_0 \le \beta_0^0$ versus $H_1: \beta_0 > \beta_0^0$  (reject H0 when $t > t(n-2, \alpha)$)
(d) $H_0: \beta_0 \ge \beta_0^0$ versus $H_1: \beta_0 < \beta_0^0$  (reject H0 when $t < -t(n-2, \alpha)$)

9 Minitab Output
Results for: P027.txt
Regression Analysis: Minutes versus Units
The regression equation is Minutes = ... + ... Units

Predictor   Coef   SE Coef   T   P-value
Constant
Units

S = ...   R-Sq = 98.7%   R-Sq(adj) = 98.6%

Note that the T-values calculated here are for testing "the true parameters are zero".

10 A Test Using Correlation Coefficient
Since $\beta_1 = 0$ is equivalent to Cor(Y, X) = 0, the following two tests are equivalent to each other:
(3) $H_0: \beta_1 = 0$ versus $H_1: \beta_1 \ne 0$
(4) $H_0: \rho = 0$ versus $H_1: \rho \ne 0$
where $\rho$ denotes the true correlation between Y and X. To test (4), we can use the test statistic

$t = \dfrac{r\sqrt{n-2}}{\sqrt{1-r^2}}$

where $r = \mathrm{Cor}(Y, X)$ is the sample correlation. We can also compute the p-value to perform the test.
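This correlation test can be sketched as follows; the toy data are hypothetical (the lecture's computer repair data have r = .996, a different sample):

```python
import math

# Hypothetical toy data
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)

xbar = sum(x) / n
ybar = sum(y) / n
Sxx = sum((xi - xbar) ** 2 for xi in x)
Syy = sum((yi - ybar) ** 2 for yi in y)
Sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))

# Sample correlation coefficient
r = Sxy / math.sqrt(Sxx * Syy)

# Test statistic for H0: Cor(Y, X) = 0, with n - 2 degrees of freedom;
# algebraically this equals the t statistic for the slope
t_corr = r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)
```

Because tests (3) and (4) are equivalent, `t_corr` matches the slope's t statistic up to rounding.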

11 Example
For the computer repair data, Cor(Y, X) = .996, so the resulting t statistic is much larger than the 5%-level critical value. The correlation is therefore highly significantly different from 0; that is, X has a strong impact on Y. We reject H0 of test (4) or (3) at the 5% significance level. The corresponding p-value is far smaller still, so the significance is very high.

12 Confidence Interval
For the SLR model, since

$\dfrac{\hat{\beta}_1 - \beta_1}{\mathrm{s.e.}(\hat{\beta}_1)} \sim t_{n-2}$

we have the $(1-\alpha)$ confidence interval for $\beta_1$ (and similarly for $\beta_0$):

$\hat{\beta}_1 \pm t(n-2, \alpha/2)\,\mathrm{s.e.}(\hat{\beta}_1)$
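A minimal sketch of this interval in Python, using hypothetical toy data (not the lecture's dataset) and the tabled critical value t(3, .025) = 3.182:

```python
import math

# Hypothetical toy data
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)

xbar = sum(x) / n
ybar = sum(y) / n
Sxx = sum((xi - xbar) ** 2 for xi in x)
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / Sxx
b0 = ybar - b1 * xbar
SSE = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
se_b1 = math.sqrt(SSE / (n - 2) / Sxx)

# 95% CI for the slope: estimate +/- t(n-2, alpha/2) * s.e.
t_crit = 3.182
ci_lower = b1 - t_crit * se_b1
ci_upper = b1 + t_crit * se_b1
```

The interval is centered at the estimate, with half-width equal to the critical value times the standard error.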

13 Interpretation of CI: Example
A 95% confidence interval for $\beta_1$ means: if we repeated the sampling many times and computed the interval each time, about 95% of the resulting intervals would cover the true slope $\beta_1$. For one particular computed interval, we say we are 95% confident that it covers $\beta_1$.

14 Prediction Interval
For the SLR model, suppose we want to predict a new response $y_0$ at a given value $x = x_0$. The point prediction is the fitted value

$\hat{y}_0 = \hat{\beta}_0 + \hat{\beta}_1 x_0$

15 Prediction Interval (cont.)
The $(1-\alpha)$ prediction interval for the new observation $y_0$ at $x_0$ is

$\hat{y}_0 \pm t(n-2, \alpha/2)\,\hat{\sigma}\sqrt{1 + \dfrac{1}{n} + \dfrac{(x_0 - \bar{x})^2}{\sum_{i=1}^n (x_i - \bar{x})^2}}$
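The prediction interval can be sketched as follows; the toy data and the prediction point x0 = 6 are hypothetical, and t(3, .025) = 3.182 comes from a standard t table:

```python
import math

# Hypothetical toy data
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)

xbar = sum(x) / n
ybar = sum(y) / n
Sxx = sum((xi - xbar) ** 2 for xi in x)
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / Sxx
b0 = ybar - b1 * xbar
SSE = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
sigma_hat = math.sqrt(SSE / (n - 2))

# Predict a new observation at x0 and build the 95% prediction interval
x0 = 6.0
y0_hat = b0 + b1 * x0
se_pred = sigma_hat * math.sqrt(1 + 1 / n + (x0 - xbar) ** 2 / Sxx)
t_crit = 3.182
pi_lower = y0_hat - t_crit * se_pred
pi_upper = y0_hat + t_crit * se_pred
```

Note the leading "1" under the square root: it is the new observation's own noise, and it is why the PI is wider than the CI for the mean response.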

16 Standard Errors
The standard error for predicting a new observation $y_0$ at $x_0$ is larger than the standard error for estimating the mean response $E(y_0 \mid x_0)$. Reasons: (1) $y_0$ is a random variable (it carries its own noise term); (2) $E(y_0 \mid x_0) = \beta_0 + \beta_1 x_0$ is a constant.
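This difference can be seen numerically. The sketch below (hypothetical toy data, not the lecture's dataset) compares the two standard errors at the same point x0:

```python
import math

# Hypothetical toy data
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)

xbar = sum(x) / n
ybar = sum(y) / n
Sxx = sum((xi - xbar) ** 2 for xi in x)
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / Sxx
b0 = ybar - b1 * xbar
SSE = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
sigma_hat = math.sqrt(SSE / (n - 2))

x0 = 6.0
# s.e. for estimating the mean response E(y0 | x0) -- a constant
se_mean = sigma_hat * math.sqrt(1 / n + (x0 - xbar) ** 2 / Sxx)
# s.e. for predicting a new observation y0 -- the extra "1" under the
# root is the new observation's own noise
se_pred = sigma_hat * math.sqrt(1 + 1 / n + (x0 - xbar) ** 2 / Sxx)
```

The identity se_pred² = se_mean² + σ̂² shows exactly where the extra width of the prediction interval comes from.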

17 Prediction Interval / Prediction Limits / Forecast Intervals
Remark: these are three names for the same object: an interval intended to contain a new, not-yet-observed response.

18 Example

19 Note that the PI bands (blue) are wider than the CI bands (red)

20 Remark