Regression through the origin


Regression through the origin: procedures for when you know the regression function must pass through the origin (0, 0).

Examples
- Circumference of circle = π × diameter
- Man hours = β1 × number of items processed
- Distance traveled = β1 × speed
- Blood alcohol content = β1 × number of drinks

No intercept model

Yi = β1 Xi + εi

where:
- β1 is the unknown slope parameter
- Xi are known constants
- εi are unknown, independent, normally distributed error terms with mean 0 and variance σ²

Example: Circumference = β1 × diameter

Diam   Circum
 6.8    21.7
10.8    33.4
 5.6    18.0
 1.9     6.4
 2.6     8.0
 4.4    12.9
 9.4    29.5
16.6    51.4
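For the no-intercept model, least squares gives the slope b1 = ΣXiYi / ΣXi². A minimal Python sketch (illustrative only, not part of the original slides; variable names are made up) checks this against the data above:

```python
# Least-squares slope for regression through the origin:
# minimizing sum((y - b1*x)^2) over b1 gives b1 = sum(x*y) / sum(x^2).
diam = [6.8, 10.8, 5.6, 1.9, 2.6, 4.4, 9.4, 16.6]
circ = [21.7, 33.4, 18.0, 6.4, 8.0, 12.9, 29.5, 51.4]

sxy = sum(x * y for x, y in zip(diam, circ))  # sum of cross-products, ~1829.3
sxx = sum(x * x for x in diam)                # sum of squared X values, ~587.89
b1 = sxy / sxx
print(round(b1, 4))  # 3.1117 -- close to pi, as circumference = pi * diameter
```

The estimate lands near π, which is what the circle example predicts the true slope to be.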

No intercept model in Minitab
1. Stat >> Regression >> Regression…
2. Specify the response and predictor.
3. Under Options…, remove the default check mark from the "Fit the intercept" box.
Note: Stat >> Regression >> Fitted line plot does not handle regression through the origin.

The regression equation is
Circum = 3.11 Diam

Predictor    Coef      SE Coef   T        P
Noconstant
Diam         3.11170   0.02011   154.73   0.000

S = 0.4876

Diam   Circum    D×C      D_sq
 6.8    21.7    147.56    46.24
10.8    33.4    360.72   116.64
 5.6    18.0    100.80    31.36
 1.9     6.4     12.16     3.61
 2.6     8.0     20.80     6.76
 4.4    12.9     56.76    19.36
 9.4    29.5    277.30    88.36
16.6    51.4    853.24   275.56
                ------   ------
                1829.3   587.89

The slope estimate is b1 = ΣXiYi / ΣXi² = 1829.3 / 587.89 = 3.1117.

S = 0.4876

Diam   Circum    RESI1       RESI1_sq
 6.8    21.7     0.540409    0.292042
10.8    33.4    -0.206409    0.042605
 5.6    18.0     0.574454    0.329998
 1.9     6.4     0.487761    0.237911
 2.6     8.0    -0.090432    0.008178
 4.4    12.9    -0.791500    0.626472
 9.4    29.5     0.249977    0.062489
16.6    51.4    -0.254296    0.064666
                ---------    --------
                 0.50996     1.6644

MSE = SSE/(n−1) = 1.6644/7 = 0.2378 is an unbiased estimator of σ², and S = √MSE = 0.4876.
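The residuals and the error variance estimate can be reproduced with a short stdlib-only Python sketch (illustrative; variable names are hypothetical). Note the divisor is n−1 rather than the usual n−2:

```python
import math

diam = [6.8, 10.8, 5.6, 1.9, 2.6, 4.4, 9.4, 16.6]
circ = [21.7, 33.4, 18.0, 6.4, 8.0, 12.9, 29.5, 51.4]
b1 = sum(x * y for x, y in zip(diam, circ)) / sum(x * x for x in diam)

resid = [y - b1 * x for x, y in zip(diam, circ)]
sse = sum(e * e for e in resid)  # ~1.6644; note the residuals need not sum to 0
n = len(diam)
mse = sse / (n - 1)              # only one parameter estimated, so n-1 error df
s = math.sqrt(mse)
print(round(s, 4))  # 0.4876, matching S in the Minitab output
```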

Summary of key points
- Residuals don't necessarily sum to 0 for regression through the origin.
- Error degrees of freedom is n−1, not n−2, since only one parameter is estimated.
- Formulas for the no-intercept model differ from those for the model with an intercept.

Analysis of Variance

Source       DF   SS       MS       F          P
Regression    1   5692.4   5692.4   23941.06   0.000
Error         7      1.7      0.2
Total         8   5694.0

Diam   Circum    RESI1       RESI1_sq   Circ_sq   FITS      FITS_sq
 6.8    21.7     0.540409    0.292042    470.89   21.1596    447.73
10.8    33.4    -0.206409    0.042605   1115.56   33.6064   1129.39
 5.6    18.0     0.574454    0.329998    324.00   17.4255    303.65
 1.9     6.4     0.487761    0.237911     40.96    5.9122     34.95
 2.6     8.0    -0.090432    0.008178     64.00    8.0904     65.46
 4.4    12.9    -0.791500    0.626472    166.41   13.6915    187.46
 9.4    29.5     0.249977    0.062489    870.25   29.2500    855.56
16.6    51.4    -0.254296    0.064666   2641.96   51.6543   2668.17
                             --------   -------             -------
                             1.6644     5694.00             5692.4

Summary of key points
- n total degrees of freedom; n−1 error degrees of freedom.
- The total sum of squares is "uncorrected for the mean": it is just the sum of the squared observed responses.
- The regression sum of squares is also uncorrected for the mean: just the sum of the squared fitted responses.
- SSTOU = SSRU + SSE
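The uncorrected decomposition SSTOU = SSRU + SSE can be verified numerically. A Python sketch under the same data (names such as `ssto_u` are made up for illustration):

```python
diam = [6.8, 10.8, 5.6, 1.9, 2.6, 4.4, 9.4, 16.6]
circ = [21.7, 33.4, 18.0, 6.4, 8.0, 12.9, 29.5, 51.4]
b1 = sum(x * y for x, y in zip(diam, circ)) / sum(x * x for x in diam)

ssto_u = sum(y * y for y in circ)         # uncorrected total SS, ~5694.0
ssr_u = sum((b1 * x) ** 2 for x in diam)  # uncorrected regression SS, ~5692.4
sse = sum((y - b1 * x) ** 2 for x, y in zip(diam, circ))  # error SS, ~1.66

# SSTO_U = SSR_U + SSE holds exactly (up to floating-point rounding),
# because the cross term sum(fit * residual) vanishes at the LS slope.
print(abs(ssto_u - (ssr_u + sse)) < 1e-6)  # True
```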

The regression equation is
Circum = 3.11 Diam

Predictor    Coef      SE Coef   T        P
Noconstant
Diam         3.11170   0.02011   154.73   0.000

S = 0.4876 ????

Many software packages, Minitab included, do not display an R² value for regression through the origin. This is because R² can be negative when you force the regression line through the origin, so it has no meaningful interpretation here.

Predicted Values for New Observations

Diam   Fit      SE Fit   95.0% CI            95.0% PI
7.00   21.782   0.141    (21.449, 22.115)    (20.581, 22.983)
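For the no-intercept model these intervals use t with n−1 = 7 degrees of freedom, with SE(fit) = S·Xh/√ΣXi² and SE(pred) = S·√(1 + Xh²/ΣXi²). A hypothetical Python sketch reproducing the row above (the t quantile t_{0.975,7} ≈ 2.365 is hard-coded to stay stdlib-only):

```python
import math

diam = [6.8, 10.8, 5.6, 1.9, 2.6, 4.4, 9.4, 16.6]
circ = [21.7, 33.4, 18.0, 6.4, 8.0, 12.9, 29.5, 51.4]
sxx = sum(x * x for x in diam)
b1 = sum(x * y for x, y in zip(diam, circ)) / sxx
n = len(diam)
s = math.sqrt(sum((y - b1 * x) ** 2 for x, y in zip(diam, circ)) / (n - 1))

xh = 7.0
fit = b1 * xh                               # ~21.782
se_fit = s * xh / math.sqrt(sxx)            # SE of the mean response, ~0.141
se_pred = s * math.sqrt(1 + xh**2 / sxx)    # SE for predicting a new observation
t = 2.365                                   # t quantile, 0.975 with 7 df
ci = (fit - t * se_fit, fit + t * se_fit)   # ~(21.449, 22.115)
pi = (fit - t * se_pred, fit + t * se_pred) # ~(20.581, 22.983)
```

As usual, the prediction interval is wider than the confidence interval because it must also cover the variability of a single new observation.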