...Relax... 9/21/2018 ST3131, Lecture 3 ST5213 Semester II, 2000/2001



Lecture 3: Main Tasks Today
1. Review of Lecture 2
2. Statistical inference for the LS estimators:
   a) Properties of the LS estimators
   b) Accuracy of the LS estimators
   c) Significance tests of the parameters

Review of Lecture 2
Standardization of Y: after standardization, the sample mean is 0 and the sample standard deviation is 1.
Covariance / correlation of Y and X. Properties of Cov(Y,X) / [Cor(Y,X)]:
(1) Symmetric.
(2) Scale-dependent / [scale-invariant].
(3) Takes values on the whole real line / [in [-1, 1]].
(4) Measures the direction of the linear relationship between Y and X.
(5) Not robust statistics (affected by outliers).

Review of Lecture 2 (cont.)
For the linear regression equation Y = β0 + β1·X + ε:

  Cov(Y,X) or Cor(Y,X)    Linear relationship between Y and X
  positive                positive (Y increases as X increases)
  zero                    uncorrelated (Y does not change as X changes)
  negative                negative (Y decreases as X increases)

  Cor(Y,X)                Strength of the linear relationship
  close to -1 or 1        very strong
  close to -.5 or .5      strong
  close to 0              very weak (nearly linearly uncorrelated)

Review of Lecture 2 (cont.)
[Table: observations i = 1, 2, …, n with columns for the deviations of Y and X, and a Total row.]
Note that:
  Var(Y) = sum of squared deviations of Y, divided by (n-1);
  Var(X) = sum of squared deviations of X, divided by (n-1);
  Cov(Y,X) = sum of products of the deviations of Y and X, divided by (n-1).
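The deviation-based formulas above can be sketched directly in code. The data below are made up for illustration; they are not the lecture's computer repair data.

```python
# Sample variance and covariance via deviations from the mean,
# each divided by (n - 1), as on the slide. Illustrative data only.
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)
xbar = sum(x) / n
ybar = sum(y) / n

var_x = sum((xi - xbar) ** 2 for xi in x) / (n - 1)
var_y = sum((yi - ybar) ** 2 for yi in y) / (n - 1)
cov_xy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / (n - 1)
cor_xy = cov_xy / (var_x ** 0.5 * var_y ** 0.5)   # scale-invariant, in [-1, 1]
```

Note that Cor(Y,X) lands close to 1 here because the illustrative points lie near a line with positive slope, matching the interpretation table above.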

Review of Lecture 2 (cont.)
The least squares estimators of β0 and β1 are linear combinations of the observations y1, …, yn. That is,
  β̂1 = Σ ci·yi with ci = (xi - x̄) / Σ(xj - x̄)²   (linearity)
  β̂0 = ȳ - β̂1·x̄, also of the form Σ di·yi        (linearity)
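The linearity of β̂1 can be made concrete by computing it as an explicit weighted sum of the observations. The data here are hypothetical, chosen only to show that the weights ci depend on the xi alone.

```python
# beta1_hat written as sum(c_i * y_i), with weights
# c_i = (x_i - xbar) / Sxx that do not involve y. Illustrative data.
x = [1, 2, 3, 4, 5]
y = [2.0, 4.1, 5.9, 8.2, 9.8]
n = len(x)
xbar = sum(x) / n
ybar = sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)

c = [(xi - xbar) / sxx for xi in x]                  # fixed weights, free of y
beta1_hat = sum(ci * yi for ci, yi in zip(c, y))     # linear in the y_i
beta0_hat = ybar - beta1_hat * xbar                  # also linear in the y_i
```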

Review of Lecture 2 (cont.)
Linear regression line: Ŷ = β̂0 + β̂1·X.
Fitted values: ŷi = β̂0 + β̂1·xi, the predicted values at the observed xi.
[Figure: the linear regression line for the computer repair data.]

Linear Estimator and Its Property
A linear estimator is a linear combination of the observations, i.e. T = Σ ci·yi, where the ci are any constants.
Notice: β̂0 and β̂1 are linear estimators of β0 and β1.
Property: when T = Σ ci·yi and the yi ~ N(μi, σ²) are independently normally distributed, we have
  T ~ N( Σ ci·μi, σ²·Σ ci² ).
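A quick simulation illustrates this property. The constants ci, means μi and σ below are arbitrary illustrative choices, not values from the lecture.

```python
# Monte Carlo check: if y_i ~ N(mu_i, sigma^2) independently, then
# T = sum(c_i * y_i) has mean sum(c_i * mu_i) and variance sigma^2 * sum(c_i^2).
import random

random.seed(0)
c = [0.5, -1.0, 2.0]
mu = [1.0, 2.0, 3.0]
sigma = 1.5

draws = []
for _ in range(100_000):
    t = sum(ci * random.gauss(mi, sigma) for ci, mi in zip(c, mu))
    draws.append(t)

mean_t = sum(draws) / len(draws)
var_t = sum((t - mean_t) ** 2 for t in draws) / (len(draws) - 1)

theory_mean = sum(ci * mi for ci, mi in zip(c, mu))    # 4.5
theory_var = sigma ** 2 * sum(ci ** 2 for ci in c)     # 11.8125
```

The simulated mean and variance should agree with the theoretical values up to Monte Carlo error.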

Application to SLR Models
Simple linear regression model: Y = β0 + β1·X + ε.
Assumptions:
(1) Linearity between Y and X.
(2) Normality of the measurement errors.
(3) The measurement errors are independently and identically distributed (iid).
Using the property of linear estimators, we have yi ~ N(β0 + β1·xi, σ²), independently.

Application to the Parameter Estimators of the SLR Model
Using the property of linear estimators, we can show that
  β̂1 ~ N(β1, σ²/Sxx)               (unbiased; normality)
  β̂0 ~ N(β0, σ²·(1/n + x̄²/Sxx))   (unbiased; normality)
where Sxx = Σ(xi - x̄)².

Properties of the LS Estimators
1) Linearity.
2) Normality.
3) Unbiasedness.
4) Best Linear Unbiased Estimators (BLUE): they have the smallest variances among all unbiased linear estimators. Let β̃0 and β̃1 be any other unbiased linear estimators of β0 and β1 respectively. Then
  Var(β̂0) ≤ Var(β̃0) and Var(β̂1) ≤ Var(β̃1).

Accuracy of the LS Estimates
The noise variance σ² is usually unknown. It has a natural estimator based on the residuals:
  σ̂² = SSE / (n - 2), where SSE = sum of squared errors (residuals) = Σ(yi - ŷi)²,
and n - 2 = degrees of freedom of SSE = sample size - number of coefficients.
Using the estimated noise variance, we can obtain measures of the accuracy (standard errors, s.e.) of the estimators β̂0 and β̂1:
  s.e.(β̂1) = σ̂ / √Sxx,   s.e.(β̂0) = σ̂·√(1/n + x̄²/Sxx).
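A sketch of σ̂² = SSE/(n-2) and the resulting standard errors, continuing with the same hypothetical data as earlier (not the lecture's computer repair data):

```python
# Estimate the noise variance from the residuals of an LS fit,
# then form the standard errors of the coefficient estimates.
x = [1, 2, 3, 4, 5]
y = [2.0, 4.1, 5.9, 8.2, 9.8]
n = len(x)
xbar = sum(x) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
beta1 = sum((xi - xbar) * yi for xi, yi in zip(x, y)) / sxx
beta0 = sum(y) / n - beta1 * xbar

residuals = [yi - (beta0 + beta1 * xi) for xi, yi in zip(x, y)]
sse = sum(e ** 2 for e in residuals)
sigma2_hat = sse / (n - 2)          # n - 2 df: two coefficients were fitted
se_beta1 = (sigma2_hat / sxx) ** 0.5
se_beta0 = (sigma2_hat * (1 / n + xbar ** 2 / sxx)) ** 0.5
```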

Significance Test of Parameters
In the SLR model, we often want to know whether
(1) "Y and X are linearly uncorrelated" (X has no effect on Y), i.e. β1 = 0;
(2) "the linear regression line passes through the origin" (no intercept is needed), i.e. β0 = 0.
A hypothesis test has two hypotheses: H0 (null) and H1 (alternative). A general test about the slope (or intercept) can be written as
(a) H0: β1 = β1⁰ versus H1: β1 ≠ β1⁰
(b) H0: β0 = β0⁰ versus H1: β0 ≠ β0⁰
where β1⁰ (or β0⁰) is a predetermined constant (a belief of the experts). Testing (1) [or (2)] is equivalent to testing β1 = 0 [or β0 = 0].

Test Statistics and Two-Sided Tests
To test the hypotheses, we need to construct test statistics. A test statistic can usually be constructed as
  t = (estimate - hypothesized value) / s.e.(estimate).
For example, to test (1) and (2) we have the test statistics
  t1 = β̂1 / s.e.(β̂1) and t0 = β̂0 / s.e.(β̂0),
respectively, which have t-distributions with n - 2 degrees of freedom (df). Let t(n-2, α/2) be the upper α/2-percentile of the t-distribution with n - 2 df. Then the level-α two-sided test for (1) or (2) rejects H0 when |t| > t(n-2, α/2). The percentiles t(n-2, α/2) can be found in the Appendix (Table A.2, page 339).

P-value of the Test
Alternatively, we can compute the p-value of the test statistic to perform the test: if the p-value < α, the test is significant at level α. The smaller the p-value, the more significant the test. In general, we say:
1) when the p-value < .05, the test is significant;
2) when the p-value < .01, the test is very significant;
3) when the p-value > .20, the test is not significant.
Example: For the computer repair data, since 6.948 > 2.18, the test is significant at the 5% level. The p-value is < .005, so the test is highly significant.
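The two-sided test procedure can be sketched as below. The estimate, standard error, and tabled critical value are illustrative, not the lecture's numbers; t(3, .025) = 3.182 is the usual tabled value for 3 df, quoted to three decimals.

```python
# Two-sided level-0.05 test of H0: beta1 = 0 via t = beta1_hat / s.e.(beta1_hat).
# All numbers are hypothetical (consistent with the earlier illustrative fit).
beta1_hat = 1.97
se_beta1 = 0.055
t_stat = beta1_hat / se_beta1

t_crit = 3.182                      # upper 2.5% point of t with n-2 = 3 df
reject_h0 = abs(t_stat) > t_crit    # True -> slope significantly nonzero
```

In practice the exact p-value would come from the t-distribution's tail probability (e.g. via a statistics package) rather than a table lookup.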

One-Sided Tests
(a) H0: β1 = β1⁰ versus H1: β1 > β1⁰   (reject H0 when t1 > t(n-2, α))
(b) H0: β1 = β1⁰ versus H1: β1 < β1⁰   (reject H0 when t1 < -t(n-2, α))
(c) H0: β0 = β0⁰ versus H1: β0 > β0⁰   (reject H0 when t0 > t(n-2, α))
(d) H0: β0 = β0⁰ versus H1: β0 < β0⁰   (reject H0 when t0 < -t(n-2, α))

Outputs for the Computer Repair Data Using S-Plus
[Figure: S-Plus regression output for the computer repair data.]

Results Using Minitab
Results for: P027.txt
Regression Analysis: Minutes versus Units

The regression equation is
  Minutes = 4.16 + 15.5 Units

  Predictor   Coef      SE Coef   T       P-value
  Constant    4.162     3.355     1.24    0.239
  Units       15.5088   0.5050    30.71   0.000

  S = 5.392   R-Sq = 98.7%   R-Sq(adj) = 98.6%

Note that the T-values reported here are for testing "the true parameters are zero".

A Test Using the Correlation Coefficient
Since β1 = 0 is equivalent to Cor(Y,X) = 0, the following two tests are equivalent:
(3) H0: β1 = 0 versus H1: β1 ≠ 0
(4) H0: ρ = 0 versus H1: ρ ≠ 0
where ρ denotes the true correlation between Y and X. To test (4), we can use the test statistic
  t = r·√(n - 2) / √(1 - r²), where r = Cor(Y,X).
We can also compute the p-value to perform the test.
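This statistic is straightforward to compute. Here r = .996 is the value quoted on the slides for the computer repair data; the sample size n = 14 is an assumption (it is consistent with the 12-df critical value 2.18 used earlier, but the slides do not state n directly).

```python
# Test of H0: rho = 0 using t = r * sqrt(n - 2) / sqrt(1 - r^2).
# r is from the slides; n = 14 is an assumed sample size.
import math

r, n = 0.996, 14
t_stat = r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)
significant = t_stat > 2.18     # 2.18: tabled 5%-level two-sided critical value
```

The statistic comes out far above the critical value, matching the slide's conclusion that the correlation is highly significantly different from 0.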

Example: For the computer repair data, since Cor(Y,X) = .996, we have
  t = .996·√(n - 2) / √(1 - .996²),
which is much larger than the critical value. So the correlation is highly significantly different from 0; that is, X has a high impact on Y. We reject H0 of test (4) or (3) at the 5% significance level. The corresponding p-value is much less than .0001, so the significance is very high.

Reading Assignment
Review Sections 2.6-2.9 of Chapter 2. Read Sections 2.10-2.11 of Chapter 2.
Consider these problems:
a) What are special SLR models?
b) Why do we need to study the special SLR models?
c) How do we do inference about the special SLR models?