Properties of the LS Estimates and Inference for Individual Coefficients


Lecture 8 (ST3131, 11/16/2018)

Outline:
1. Review of Lecture 7
2. Properties of the LS Estimates
3. Inference for Individual Coefficients
4. Procedure for Solving Problems
5. An Example

Review: Interpreting the Regression Equation
- p = 1 predictor: a straight line
- p = 2 predictors: a plane
- p >= 3 predictors: a hyperplane

Interpreting the Regression Coefficients
Interpretation 1 (from the view of response change): the coefficient of X_j is the expected change in Y for a one-unit increase in X_j, with all other predictors held fixed.
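Interpretation 1 can be made concrete with a small sketch: under the fitted Supervisor Performance equation shown later in this lecture, raising one predictor by a unit while holding the others fixed changes the fitted value by exactly that predictor's coefficient. (The input values of 50 below are arbitrary illustrative choices, not from the data.)

```python
# Interpretation 1: a regression coefficient is the expected change in Y
# per unit increase in that predictor, holding the other predictors fixed.
# Fitted Supervisor Performance equation from this lecture's example.

def y_hat(x1, x2, x3, x4, x5, x6):
    return 10.8 + 0.613*x1 - 0.073*x2 + 0.320*x3 + 0.082*x4 + 0.038*x5 - 0.217*x6

base = y_hat(50, 50, 50, 50, 50, 50)
bump = y_hat(51, 50, 50, 50, 50, 50)   # raise X1 by one unit, others fixed
print(round(bump - base, 3))            # the change equals the X1 coefficient
```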

Interpretation 2: The MLR coefficients and the corresponding p SLR coefficients are not the same unless all predictor variables are uncorrelated; a simple-regression slope for X_j also absorbs the effects of any correlated predictors omitted from the model.
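A quick simulation illustrates Interpretation 2. The data, coefficients, and correlation structure below are invented for illustration: with x2 correlated with x1, the SLR slope for x1 overshoots the MLR coefficient because it picks up part of x2's effect.

```python
# Sketch (simulated data) of why MLR coefficients differ from the separate
# SLR coefficients once predictors are correlated.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + 0.6 * rng.normal(size=n)       # x2 correlated with x1
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

# MLR: regress y on both predictors jointly
X = np.column_stack([np.ones(n), x1, x2])
b_mlr = np.linalg.lstsq(X, y, rcond=None)[0]

# SLR: regress y on x1 alone
X1 = np.column_stack([np.ones(n), x1])
b_slr = np.linalg.lstsq(X1, y, rcond=None)[0]

# The SLR slope absorbs part of x2's effect (roughly 2 + 3*0.8 = 4.4),
# so it is well above the MLR coefficient of about 2.
print(b_mlr[1], b_slr[1])
```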

Properties of LS Estimators
We first show that the LS estimators are linear combinations of Y_1, Y_2, ..., Y_n. First note that the numerator of the slope estimator,

  b = (n-1) Cov(X,Y) = sum_i (x_i - xbar)(y_i - ybar) = sum_i (x_i - xbar) y_i,

is a linear combination of Y_1, Y_2, ..., Y_n, with weights that depend only on the x's.
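The linearity property can be checked numerically: writing the slope as sum_i w_i y_i with weights w_i = (x_i - xbar)/Sxx reproduces the ordinary LS fit exactly. (The small data set below is made up for illustration.)

```python
# The LS slope is a linear combination of the responses: b1 = sum_i w_i*y_i,
# where the weights w_i = (x_i - xbar)/Sxx depend only on the x's.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

w = (x - x.mean()) / ((x - x.mean())**2).sum()   # weights, free of y
b1_linear = (w * y).sum()                         # linear combination of y
b1_fit = np.polyfit(x, y, 1)[0]                   # ordinary LS slope
print(b1_linear, b1_fit)                          # identical up to rounding
```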

It follows that the LS estimators have these properties:
1) Linearity
2) Unbiasedness
3) BLUE (Best Linear Unbiased Estimators): by the Gauss-Markov theorem, among all linear unbiased estimators the LS estimators have the smallest variance.

Variances of the Estimates
In particular, when p = 1 (simple linear regression), we have

  Var(beta1_hat) = sigma^2 / Sxx,    Var(beta0_hat) = sigma^2 (1/n + xbar^2 / Sxx),

where Sxx = sum_i (x_i - xbar)^2.
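The slope-variance formula can be verified by simulation: fix the design points, generate many response samples, refit the line each time, and compare the empirical variance of the slope estimates with sigma^2/Sxx. (The design, true coefficients, and noise level below are arbitrary.)

```python
# Monte Carlo check of the SLR variance formula Var(b1) = sigma^2 / Sxx.
import numpy as np

rng = np.random.default_rng(1)
x = np.arange(10.0)                     # fixed design points
Sxx = ((x - x.mean())**2).sum()
sigma = 2.0

slopes = []
for _ in range(5000):
    y = 1.0 + 0.5 * x + rng.normal(scale=sigma, size=x.size)
    slopes.append(np.polyfit(x, y, 1)[0])

theory = sigma**2 / Sxx
empirical = np.var(slopes)
print(theory, empirical)                # the two should agree closely
```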

Noise Variance Estimator and Standard Errors
The noise variance sigma^2 is estimated by sigma_hat^2 = SSE / (n - p - 1), and the standard error of each coefficient estimate is obtained by replacing sigma with sigma_hat in its variance formula.
4) Normality: when the errors are normally distributed, so are the LS estimators.

Inferences for Individual Coefficients
Hypothesis testing: to test H0: beta_j = 0, use the t-statistic t = beta_j_hat / se(beta_j_hat), which has a t distribution with n - p - 1 degrees of freedom under H0; reject at level alpha when |t| exceeds the critical value t_{n-p-1, alpha/2}.
Confidence interval: a (1 - alpha) confidence interval for beta_j is beta_j_hat +/- t_{n-p-1, alpha/2} * se(beta_j_hat).
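A sketch of both procedures, using the X1 row of the Supervisor Performance output shown later in this lecture (Coef = 0.6132, SE Coef = 0.1610, error df = 30 - 7 = 23). The critical value 2.069 is t_{23, 0.025} hardcoded from a t table rather than computed.

```python
# t-test and 95% CI for an individual coefficient, using the X1 row of the
# Supervisor Performance output (Coef = 0.6132, SE Coef = 0.1610, df = 23).
coef, se = 0.6132, 0.1610
t_stat = coef / se                      # test statistic for H0: beta1 = 0
t_crit = 2.069                          # t_{23, 0.025}, from a t table
lower, upper = coef - t_crit * se, coef + t_crit * se
print(round(t_stat, 2))                 # matches the printed T = 3.81
print(round(lower, 3), round(upper, 3)) # 95% CI excludes 0, so X1 is significant
```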

Procedure for Solving Problems
Step 1. List the conditions given in the question statement.
Step 2. Express the question as a mathematical formula.
Step 3. Plug in the known conditions, or results from other parts of the problem.
Step 4. Simplify the result.
Step 5. State your conclusion in plain language.

Example
For the Supervisor Performance Data, you are given the regression coefficient table for the MLR model, with n = 30. Answer the following questions:
(1) Find SSE and its degrees of freedom.
(2) Find the proportion of the variability in Y explained by the regression.

Regression Analysis: Y versus X1, X2, X3, X4, X5, X6

The regression equation is
Y = 10.8 + 0.613 X1 - 0.073 X2 + 0.320 X3 + 0.082 X4 + 0.038 X5 - 0.217 X6

Predictor     Coef   SE Coef      T      P
Constant     10.79     11.59   0.93  0.362
X1          0.6132    0.1610   3.81  0.001
X2         -0.0731    0.1357  -0.54  0.596
X3          0.3203    0.1685   1.90  0.070
X4          0.0817    0.2215   0.37  0.715
X5          0.0384    0.1470   0.26  0.796
X6         -0.2171    0.1782  -1.22  0.236

S = 7.068   R-Sq = 73.3%   R-Sq(adj) = 66.3%
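Questions (1) and (2) can be read off this output directly: S is the residual standard error, so SSE = S^2 * (n - p - 1), and the proportion of variability explained is R-Sq. A quick arithmetic check:

```python
# (1) and (2) from the Minitab output: S = sqrt(SSE/df), so SSE = S^2 * df
# with df = n - p - 1; the proportion explained by regression is R-Sq.
n, p = 30, 6
S = 7.068
df = n - p - 1                          # 23 error degrees of freedom
SSE = S**2 * df                         # about 1149
R_sq = 0.733                            # proportion of variability explained
print(df, round(SSE, 1), R_sq)
```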

(3) Find the total variability of Y and the variability explained by the regression.
(4) Find the effect of X1 and its estimated variance.
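These two parts also follow from the printed output: since R-Sq = 1 - SSE/SST, the total variability SST can be recovered from SSE and R-Sq, and the effect of X1 is its coefficient with estimated variance equal to the squared standard error. A sketch of the arithmetic:

```python
# (3): recover SST from R-Sq = 1 - SSE/SST, then SSR = SST - SSE.
# (4): the effect of X1 is its coefficient; its estimated variance is SE^2.
SSE = 7.068**2 * 23                     # SSE = S^2 * df, from the output
R_sq = 0.733
SST = SSE / (1 - R_sq)                  # total variability of Y, about 4303
SSR = SST - SSE                         # explained variability, about 3154
beta1_hat, se1 = 0.6132, 0.1610
var_beta1 = se1**2                      # estimated variance of beta1_hat
print(round(SST, 1), round(SSR, 1), round(var_beta1, 4))
```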

(6) Without any calculation, can you give the intercept of the fitted line? What is it?


(9e) Based on the reduced model, how much of the total variability in Y has not been explained by the regression?

After-class Questions:
1. Can you interpret the exact meaning of the unbiasedness of an LS estimate?
2. Why can we say the LS estimates are best linear unbiased estimates?