The Simple Linear Regression Model: Estimators and Their Sampling Distributions

The Simple Linear Regression Model

Estimators in Simple Linear Regression and Their Sampling Distributions

Recall that if $y_1, y_2, y_3, \ldots, y_n$ are
1. independent, and
2. normally distributed with means $\mu_1, \mu_2, \mu_3, \ldots, \mu_n$ and standard deviations $\sigma_1, \sigma_2, \sigma_3, \ldots, \sigma_n$,
then $L = c_1 y_1 + c_2 y_2 + c_3 y_3 + \cdots + c_n y_n$ is normal with mean
$$\mu_L = c_1\mu_1 + c_2\mu_2 + c_3\mu_3 + \cdots + c_n\mu_n$$
and standard deviation
$$\sigma_L = \sqrt{c_1^2\sigma_1^2 + c_2^2\sigma_2^2 + c_3^2\sigma_3^2 + \cdots + c_n^2\sigma_n^2}.$$
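As a quick numerical check of this result (not from the original slides), here is a minimal Python sketch, assuming NumPy is available; the coefficients, means, and standard deviations are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative coefficients, means, and standard deviations.
c = np.array([1.0, -2.0, 0.5])
mu = np.array([3.0, 1.0, -4.0])
sigma = np.array([1.0, 2.0, 0.5])

# Simulate many independent draws of (y1, y2, y3) and form L = sum c_i * y_i.
y = rng.normal(mu, sigma, size=(100_000, 3))
L = y @ c

print("empirical mean:", L.mean(), " theory:", c @ mu)
print("empirical sd:  ", L.std(ddof=1), " theory:", np.sqrt(np.sum(c**2 * sigma**2)))
```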

Sampling distribution of the slope

Note: in the simple linear regression model the $y_i$ are independent, normal with mean $\mu_i = \alpha + \beta x_i$ and standard deviation $\sigma$, and the least squares estimator of the slope is
$$\hat{\beta} = \frac{S_{xy}}{S_{xx}} = \frac{\sum_{i=1}^n (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^n (x_i - \bar{x})^2}.$$
Also $\sum_{i=1}^n (x_i - \bar{x})(y_i - \bar{y}) = \sum_{i=1}^n (x_i - \bar{x})\, y_i$, since $\sum_{i=1}^n (x_i - \bar{x})\,\bar{y} = 0$.

Thus
$$\hat{\beta} = \sum_{i=1}^n c_i y_i \quad\text{where}\quad c_i = \frac{x_i - \bar{x}}{S_{xx}}.$$
Hence $\hat{\beta}$ is normal with mean $\sum_{i=1}^n c_i \mu_i = \sum_{i=1}^n c_i(\alpha + \beta x_i)$ and standard deviation $\sigma\sqrt{\sum_{i=1}^n c_i^2}$.

Thus, since $\sum_{i=1}^n c_i = 0$ and $\sum_{i=1}^n c_i x_i = 1$, the mean of $\hat{\beta}$ is
$$\alpha \sum_{i=1}^n c_i + \beta \sum_{i=1}^n c_i x_i = \beta.$$

Also
$$\sum_{i=1}^n c_i^2 = \frac{\sum_{i=1}^n (x_i - \bar{x})^2}{S_{xx}^2} = \frac{1}{S_{xx}}.$$

Hence $\hat{\beta}$ is normal with mean $\beta$ and standard deviation
$$\sigma_{\hat{\beta}} = \frac{\sigma}{\sqrt{S_{xx}}}.$$
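A minimal simulation sketch of this sampling distribution (not part of the original slides), assuming NumPy; the true parameters and design points are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
alpha_true, beta_true, sigma = 5.0, 2.0, 3.0    # arbitrary true parameters
x = np.array([1.0, 2.0, 4.0, 5.0, 7.0, 9.0])    # fixed design points
Sxx = np.sum((x - x.mean())**2)

# Repeatedly generate y = alpha + beta*x + error and refit the slope.
slopes = []
for _ in range(20_000):
    y = alpha_true + beta_true * x + rng.normal(0.0, sigma, size=x.size)
    slopes.append(np.sum((x - x.mean()) * (y - y.mean())) / Sxx)
slopes = np.array(slopes)

# The empirical mean and sd should match beta and sigma / sqrt(Sxx).
print("mean of slope estimates:", slopes.mean(), " theory:", beta_true)
print("sd of slope estimates:  ", slopes.std(ddof=1), " theory:", sigma / np.sqrt(Sxx))
```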

Sampling distribution of the intercept

The sampling distribution of the intercept of the least squares line:
$$\hat{\alpha} = \bar{y} - \hat{\beta}\,\bar{x}.$$
It can be shown that $\hat{\alpha}$ has a normal distribution with mean $\alpha$ and standard deviation
$$\sigma_{\hat{\alpha}} = \sigma\sqrt{\frac{1}{n} + \frac{\bar{x}^2}{S_{xx}}}.$$

Proof:
$$\hat{\alpha} = \bar{y} - \hat{\beta}\,\bar{x} = \sum_{i=1}^n \frac{1}{n}\, y_i - \bar{x}\sum_{i=1}^n c_i y_i = \sum_{i=1}^n d_i y_i \quad\text{where}\quad d_i = \frac{1}{n} - \bar{x}\, c_i.$$
Thus $\hat{\alpha}$ is normal with mean $\sum_{i=1}^n d_i(\alpha + \beta x_i)$ and standard deviation $\sigma\sqrt{\sum_{i=1}^n d_i^2}$.

Also $\sum_{i=1}^n d_i = 1 - \bar{x}\sum_{i=1}^n c_i = 1$ and $\sum_{i=1}^n d_i x_i = \bar{x} - \bar{x}\sum_{i=1}^n c_i x_i = 0$, so the mean of $\hat{\alpha}$ is $\alpha \cdot 1 + \beta \cdot 0 = \alpha$. Now
$$\sum_{i=1}^n d_i^2 = \sum_{i=1}^n \left(\frac{1}{n} - \bar{x}\, c_i\right)^2 = \frac{1}{n} - \frac{2\bar{x}}{n}\sum_{i=1}^n c_i + \bar{x}^2 \sum_{i=1}^n c_i^2.$$

Hence $\sum_{i=1}^n d_i^2 = \dfrac{1}{n} + \dfrac{\bar{x}^2}{S_{xx}}$, and $\hat{\alpha}$ is normal with mean $\alpha$ and standard deviation $\sigma\sqrt{\dfrac{1}{n} + \dfrac{\bar{x}^2}{S_{xx}}}$.

Summary
1. $\hat{\beta}$ is normal with mean $\beta$ and standard deviation $\dfrac{\sigma}{\sqrt{S_{xx}}}$.
2. $\hat{\alpha}$ is normal with mean $\alpha$ and standard deviation $\sigma\sqrt{\dfrac{1}{n} + \dfrac{\bar{x}^2}{S_{xx}}}$.
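A similar simulation sketch for the intercept (again not from the original slides, assuming NumPy and arbitrary illustrative parameters) compares the simulated mean and standard deviation of $\hat{\alpha}$ with the formulas in the summary above.

```python
import numpy as np

rng = np.random.default_rng(2)
alpha_true, beta_true, sigma = 5.0, 2.0, 3.0    # arbitrary true parameters
x = np.array([1.0, 2.0, 4.0, 5.0, 7.0, 9.0])
Sxx = np.sum((x - x.mean())**2)

# Repeatedly refit the least squares line and keep the intercept estimate.
intercepts = []
for _ in range(20_000):
    y = alpha_true + beta_true * x + rng.normal(0.0, sigma, size=x.size)
    b = np.sum((x - x.mean()) * (y - y.mean())) / Sxx
    intercepts.append(y.mean() - b * x.mean())
intercepts = np.array(intercepts)

print("mean of intercept estimates:", intercepts.mean(), " theory:", alpha_true)
print("sd of intercept estimates:  ", intercepts.std(ddof=1),
      " theory:", sigma * np.sqrt(1 / x.size + x.mean()**2 / Sxx))
```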

Sampling distribution of the estimate of variance

The sampling distribution of $s^2$, where
$$s^2 = \frac{1}{n-2}\sum_{i=1}^n \left(y_i - \hat{\alpha} - \hat{\beta} x_i\right)^2.$$
This estimate of $\sigma^2$ is said to be based on $n - 2$ degrees of freedom.

The sampling distribution of $s^2$: Recall that $y_1, y_2, \ldots, y_n$ are independent, normal with mean $\alpha + \beta x_i$ and standard deviation $\sigma$. Let
$$z_i = \frac{y_i - \alpha - \beta x_i}{\sigma}.$$
Then $z_1, z_2, \ldots, z_n$ are independent, normal with mean 0 and standard deviation 1, and
$$\sum_{i=1}^n z_i^2 = \frac{1}{\sigma^2}\sum_{i=1}^n \left(y_i - \alpha - \beta x_i\right)^2$$
has a $\chi^2$ distribution with $n$ degrees of freedom.

If $\alpha$ and $\beta$ are replaced by their estimators $\hat{\alpha}$ and $\hat{\beta}$, then
$$\frac{1}{\sigma^2}\sum_{i=1}^n \left(y_i - \hat{\alpha} - \hat{\beta} x_i\right)^2 = \frac{(n-2)\,s^2}{\sigma^2}$$
has a $\chi^2$ distribution with $n - 2$ degrees of freedom. Note: the mean of a $\chi^2$ distribution equals its degrees of freedom.

Thus
$$E\!\left[\frac{(n-2)\,s^2}{\sigma^2}\right] = n - 2 \quad\text{and}\quad E\!\left[s^2\right] = \sigma^2.$$
This verifies the statement made earlier that $s^2$ is an unbiased estimator of $\sigma^2$.
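The chi-square result and the unbiasedness of $s^2$ can also be checked by simulation; the sketch below (not from the original slides, assuming NumPy and arbitrary illustrative parameters) compares the mean of $s^2$ with $\sigma^2$, and the mean and variance of $(n-2)s^2/\sigma^2$ with those of a $\chi^2_{n-2}$ distribution.

```python
import numpy as np

rng = np.random.default_rng(3)
alpha_true, beta_true, sigma = 5.0, 2.0, 3.0    # arbitrary true parameters
x = np.array([1.0, 2.0, 4.0, 5.0, 7.0, 9.0])
n = x.size
Sxx = np.sum((x - x.mean())**2)

s2_values = []
for _ in range(20_000):
    y = alpha_true + beta_true * x + rng.normal(0.0, sigma, size=n)
    b = np.sum((x - x.mean()) * (y - y.mean())) / Sxx
    a = y.mean() - b * x.mean()
    resid = y - a - b * x
    s2_values.append(np.sum(resid**2) / (n - 2))
s2_values = np.array(s2_values)

scaled = (n - 2) * s2_values / sigma**2   # should behave like chi-square with n - 2 df
print("E[s^2] ~", s2_values.mean(), " (sigma^2 =", sigma**2, ")")
print("mean of (n-2)s^2/sigma^2 ~", scaled.mean(), " (df =", n - 2, ")")
print("var  of (n-2)s^2/sigma^2 ~", scaled.var(ddof=1), " (2*df =", 2 * (n - 2), ")")
```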

Summary
1. $\hat{\beta}$ is normal with mean $\beta$ and standard deviation $\dfrac{\sigma}{\sqrt{S_{xx}}}$.
2. $\hat{\alpha}$ is normal with mean $\alpha$ and standard deviation $\sigma\sqrt{\dfrac{1}{n} + \dfrac{\bar{x}^2}{S_{xx}}}$.

Recall that $\hat{\beta}$ is normal with mean $\beta$ and standard deviation $\dfrac{\sigma}{\sqrt{S_{xx}}}$. Therefore
$$z = \frac{\hat{\beta} - \beta}{\sigma/\sqrt{S_{xx}}}$$
has a standard normal distribution.

and, when $\sigma$ is replaced by its estimate $s$,
$$t = \frac{\hat{\beta} - \beta}{s/\sqrt{S_{xx}}}$$
has a t distribution with $n - 2$ degrees of freedom.

$(1 - \alpha)100\%$ Confidence Limits for slope $\beta$:
$$\hat{\beta} \pm t_{\alpha/2}\,\frac{s}{\sqrt{S_{xx}}}$$
where $t_{\alpha/2}$ is the critical value for the t-distribution with $n - 2$ degrees of freedom.

Also, $\hat{\alpha}$ is normal with mean $\alpha$ and standard deviation $\sigma\sqrt{\dfrac{1}{n} + \dfrac{\bar{x}^2}{S_{xx}}}$. Therefore
$$z = \frac{\hat{\alpha} - \alpha}{\sigma\sqrt{\dfrac{1}{n} + \dfrac{\bar{x}^2}{S_{xx}}}}$$
has a standard normal distribution.

and, when $\sigma$ is replaced by $s$,
$$t = \frac{\hat{\alpha} - \alpha}{s\sqrt{\dfrac{1}{n} + \dfrac{\bar{x}^2}{S_{xx}}}}$$
has a t distribution with $n - 2$ degrees of freedom.

$(1 - \alpha)100\%$ Confidence Limits for intercept $\alpha$:
$$\hat{\alpha} \pm t_{\alpha/2}\, s\sqrt{\frac{1}{n} + \frac{\bar{x}^2}{S_{xx}}}$$
where $t_{\alpha/2}$ is the critical value for the t-distribution with $n - 2$ degrees of freedom.
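The two interval formulas above translate directly into code. The following sketch (assuming NumPy and SciPy; the function name `slope_intercept_ci` is my own) returns both intervals from raw data.

```python
import numpy as np
from scipy import stats

def slope_intercept_ci(x, y, conf=0.95):
    """Confidence limits for the slope and intercept of a simple linear regression."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n = x.size
    Sxx = np.sum((x - x.mean()) ** 2)
    Sxy = np.sum((x - x.mean()) * (y - y.mean()))
    b = Sxy / Sxx                      # least squares slope
    a = y.mean() - b * x.mean()        # least squares intercept
    s = np.sqrt(np.sum((y - a - b * x) ** 2) / (n - 2))   # estimate of sigma, n - 2 df
    tcrit = stats.t.ppf(1 - (1 - conf) / 2, df=n - 2)
    se_b = s / np.sqrt(Sxx)
    se_a = s * np.sqrt(1 / n + x.mean() ** 2 / Sxx)
    return (b - tcrit * se_b, b + tcrit * se_b), (a - tcrit * se_a, a + tcrit * se_a)
```

For example, `slope_intercept_ci(x, y, conf=0.95)` returns the slope interval and the intercept interval as two (lower, upper) pairs.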

The following data show the per capita consumption of cigarettes per month (X) in various countries in 1930, and the death rates from lung cancer for men in 1950.

TABLE: Per capita consumption of cigarettes per month (X_i) in n = 11 countries in 1930, and the death rates, Y_i (per 100,000), from lung cancer for men in 1950.

Country (i)      X_i    Y_i
Australia         48     18
Canada            50     15
Denmark           38     17
Finland          110     35
Great Britain    110     46
Holland           49     24
Iceland           23      6
Norway            25      9
Sweden            30     11
Switzerland       51     25
USA              130     20

Fitting the Least Squares Line

First compute the following three quantities:
$$S_{xx} = \sum_{i=1}^n x_i^2 - \frac{\left(\sum_{i=1}^n x_i\right)^2}{n},\qquad S_{xy} = \sum_{i=1}^n x_i y_i - \frac{\left(\sum_{i=1}^n x_i\right)\left(\sum_{i=1}^n y_i\right)}{n},\qquad S_{yy} = \sum_{i=1}^n y_i^2 - \frac{\left(\sum_{i=1}^n y_i\right)^2}{n}.$$

Computing Estimates of the Slope and Intercept:
$$\hat{\beta} = \frac{S_{xy}}{S_{xx}}, \qquad \hat{\alpha} = \bar{y} - \hat{\beta}\,\bar{x}.$$

95% Confidence Limits for slope $\beta$:
$$\hat{\beta} \pm t_{.025}\,\frac{s}{\sqrt{S_{xx}}}$$
where $t_{.025} = 2.262$ is the critical value for the t-distribution with 9 degrees of freedom.

95% Confidence Limits for intercept $\alpha$:
$$\hat{\alpha} \pm t_{.025}\, s\sqrt{\frac{1}{n} + \frac{\bar{x}^2}{S_{xx}}}$$
where $t_{.025} = 2.262$ is the critical value for the t-distribution with 9 degrees of freedom.
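The numerical interval endpoints on the original slides did not survive transcription. The sketch below (assuming NumPy and SciPy) recomputes them from the table above; the values in the comments are recomputed approximations, not the slide's original figures.

```python
import numpy as np
from scipy import stats

# Data from the table above: per capita cigarette consumption per month (1930)
# and male lung cancer death rate per 100,000 (1950), n = 11 countries.
x = np.array([48, 50, 38, 110, 110, 49, 23, 25, 30, 51, 130], dtype=float)
y = np.array([18, 15, 17, 35, 46, 24, 6, 9, 11, 25, 20], dtype=float)

n = x.size
Sxx = np.sum((x - x.mean()) ** 2)                 # about 14322.5
Sxy = np.sum((x - x.mean()) * (y - y.mean()))     # about 3271.8
b = Sxy / Sxx                                     # slope estimate, about 0.228
a = y.mean() - b * x.mean()                       # intercept estimate, about 6.76
s = np.sqrt(np.sum((y - a - b * x) ** 2) / (n - 2))
tcrit = stats.t.ppf(0.975, df=n - 2)              # about 2.262 for 9 df

se_b = s / np.sqrt(Sxx)
se_a = s * np.sqrt(1 / n + x.mean() ** 2 / Sxx)
print("95% CI for slope:    ", (b - tcrit * se_b, b + tcrit * se_b))   # roughly (0.07, 0.39)
print("95% CI for intercept:", (a - tcrit * se_a, a + tcrit * se_a))   # roughly (-4.3, 17.9)
```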

$(1 - \alpha)100\%$ Confidence Limits for a point on the regression line, $\alpha + \beta x_0$:
[Figure: scatter plot of y against x showing the regression line $y = \alpha + \beta x$ and the point on the line at $x = x_0$.]

Let
$$\hat{y}_0 = \hat{\alpha} + \hat{\beta}\, x_0;$$
then $\hat{y}_0$ is normal with mean $\alpha + \beta x_0$ and standard deviation
$$\sigma\sqrt{\frac{1}{n} + \frac{(x_0 - \bar{x})^2}{S_{xx}}}.$$

Proof:
$$\hat{y}_0 = \hat{\alpha} + \hat{\beta}\, x_0 = \sum_{i=1}^n d_i y_i + x_0 \sum_{i=1}^n c_i y_i = \sum_{i=1}^n e_i y_i \quad\text{where}\quad e_i = d_i + x_0 c_i = \frac{1}{n} + (x_0 - \bar{x})\, c_i.$$
Note that $\sum_{i=1}^n e_i = 1$ and $\sum_{i=1}^n e_i x_i = x_0$. Thus $\hat{y}_0$ is normal with mean $\sum_{i=1}^n e_i(\alpha + \beta x_i) = \alpha + \beta x_0$ and standard deviation $\sigma\sqrt{\sum_{i=1}^n e_i^2}$.

Also
$$\sum_{i=1}^n e_i^2 = \sum_{i=1}^n \left(\frac{1}{n} + (x_0 - \bar{x})\, c_i\right)^2 = \frac{1}{n} + \frac{2(x_0 - \bar{x})}{n}\sum_{i=1}^n c_i + (x_0 - \bar{x})^2 \sum_{i=1}^n c_i^2;$$
now $\sum_{i=1}^n c_i = 0$ and $\sum_{i=1}^n c_i^2 = \dfrac{1}{S_{xx}}$.

Hence
$$\sum_{i=1}^n e_i^2 = \frac{1}{n} + \frac{(x_0 - \bar{x})^2}{S_{xx}},$$
and $\hat{y}_0$ is normal with mean $\alpha + \beta x_0$ and standard deviation $\sigma\sqrt{\dfrac{1}{n} + \dfrac{(x_0 - \bar{x})^2}{S_{xx}}}$.

$(1 - \alpha)100\%$ Confidence Limits for a point on the regression line, $\alpha + \beta x_0$:
$$\hat{\alpha} + \hat{\beta}\, x_0 \pm t_{\alpha/2}\, s\sqrt{\frac{1}{n} + \frac{(x_0 - \bar{x})^2}{S_{xx}}}$$
where $t_{\alpha/2}$ is the critical value for the t-distribution with $n - 2$ degrees of freedom.
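A sketch of this interval in code (assuming NumPy and SciPy; the function name `mean_response_ci` is my own):

```python
import numpy as np
from scipy import stats

def mean_response_ci(x, y, x0, conf=0.95):
    """Confidence limits for the point alpha + beta*x0 on the regression line."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n = x.size
    Sxx = np.sum((x - x.mean()) ** 2)
    b = np.sum((x - x.mean()) * (y - y.mean())) / Sxx
    a = y.mean() - b * x.mean()
    s = np.sqrt(np.sum((y - a - b * x) ** 2) / (n - 2))
    tcrit = stats.t.ppf(1 - (1 - conf) / 2, df=n - 2)
    half = tcrit * s * np.sqrt(1 / n + (x0 - x.mean()) ** 2 / Sxx)
    fit = a + b * x0
    return fit - half, fit + half
```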

Prediction: In the linear regression model, a new observation taken at $x = x_0$ is
$$y = \alpha + \beta x_0 + \varepsilon,$$
where $\varepsilon$ is normal with mean 0 and standard deviation $\sigma$, independent of the data used to fit the line. The error of prediction $y - (\hat{\alpha} + \hat{\beta} x_0)$ is therefore normal with mean 0 and standard deviation $\sigma\sqrt{1 + \dfrac{1}{n} + \dfrac{(x_0 - \bar{x})^2}{S_{xx}}}$.

$(1 - \alpha)100\%$ Prediction Limits for $y$ when $x = x_0$:
$$\hat{\alpha} + \hat{\beta}\, x_0 \pm t_{\alpha/2}\, s\sqrt{1 + \frac{1}{n} + \frac{(x_0 - \bar{x})^2}{S_{xx}}}$$
where $t_{\alpha/2}$ is the critical value for the t-distribution with $n - 2$ degrees of freedom.
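A corresponding sketch for the prediction limits (assuming NumPy and SciPy; `prediction_interval` is my own name); the only change from the mean-response interval is the extra 1 under the square root.

```python
import numpy as np
from scipy import stats

def prediction_interval(x, y, x0, conf=0.95):
    """Prediction limits for a new observation y taken at x = x0."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n = x.size
    Sxx = np.sum((x - x.mean()) ** 2)
    b = np.sum((x - x.mean()) * (y - y.mean())) / Sxx
    a = y.mean() - b * x.mean()
    s = np.sqrt(np.sum((y - a - b * x) ** 2) / (n - 2))
    tcrit = stats.t.ppf(1 - (1 - conf) / 2, df=n - 2)
    # Extra "1 +" under the square root compared with the mean-response interval.
    half = tcrit * s * np.sqrt(1 + 1 / n + (x0 - x.mean()) ** 2 / Sxx)
    fit = a + b * x0
    return fit - half, fit + half
```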