Stat13 Lecture 25: Regression (continued; SE, t and chi-square)


Simple linear regression model: Y = β₀ + β₁X + ε. Assumption: ε is normal with mean 0 and variance σ². The fitted line is obtained by minimizing the sum of squared residuals; that is, by finding β₀ and β₁ so that (Y₁ − β₀ − β₁X₁)² + … + (Yₙ − β₀ − β₁Xₙ)² is as small as possible. This method is called the least squares method.
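The minimization above can be sketched numerically. This is a minimal illustration assuming NumPy is available; the data values are made up, since the lecture's CaO table did not survive transcription:

```python
import numpy as np

# Hypothetical data (the lecture's actual CaO values are not reproduced here).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 2.1, 2.8, 4.2, 5.0])

# Least squares: choose b0, b1 minimizing sum((y - b0 - b1*x)**2).
# polyfit with degree 1 returns [slope, intercept].
b1, b0 = np.polyfit(x, y, 1)
residuals = y - (b0 + b1 * x)
```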

The least squares line is the same as the regression line discussed before. It follows that the estimated slope β̂₁ can be computed as r·[SD(Y)/SD(X)] = [cov(X,Y)/(SD(X)·SD(Y))]·[SD(Y)/SD(X)] = cov(X,Y)/var(X) (this is the same as the equation for β̂₁ on page 518). The intercept β₀ is estimated by putting x = 0 in the regression line, yielding the equation on page 518 (i.e., β̂₀ = Ȳ − β̂₁X̄). Therefore, there is no need to memorize the equation for the least squares line; computationally it is advantageous to use cov(X,Y)/var(X) instead of r·[SD(Y)/SD(X)].
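A quick check, on hypothetical data, that cov(X,Y)/var(X) and r·SD(Y)/SD(X) give the same slope (NumPy assumed):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # hypothetical predictor values
y = np.array([1.1, 2.1, 2.8, 4.2, 5.0])   # hypothetical responses

# Slope via cov(X, Y)/var(X); the ddof choice cancels as long as it matches.
slope = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)

# Equivalent form: r * SD(Y)/SD(X).
r = np.corrcoef(x, y)[0, 1]
slope2 = r * np.std(y, ddof=1) / np.std(x, ddof=1)

# The fitted line passes through (xbar, ybar), giving the intercept.
intercept = y.mean() - slope * x.mean()
```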

Finding residuals and estimating the variance of ε. Residuals are the differences between Y and the regression line (the fitted line). An unbiased estimate of σ² is [sum of squared residuals]/(n − 2). Why divide by (n − 2)? The degrees of freedom are n − 2 because two parameters were estimated. [Sum of squared residuals]/σ² follows a chi-square distribution with n − 2 degrees of freedom.
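A sketch of the σ² estimate on the same hypothetical data, dividing the residual sum of squares by n − 2:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # hypothetical data
y = np.array([1.1, 2.1, 2.8, 4.2, 5.0])
n = len(x)

b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()
residuals = y - (b0 + b1 * x)

# Unbiased estimate of sigma^2: divide SSR by n - 2 (two parameters estimated).
sigma2_hat = np.sum(residuals**2) / (n - 2)
```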

Hypothesis testing for the slope. The slope estimate β̂₁ is random. It follows a normal distribution with mean equal to the true β₁ and variance equal to σ²/[n·var(X)]. Because σ² is unknown, we have to estimate it from the data; the SE (standard error) of the slope estimate is equal to the square root of the above variance with σ² replaced by its estimate.
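The standard error of the slope can be computed as follows (hypothetical data; note that σ²/[n·var(X)] equals σ²/Σ(xᵢ − x̄)² when var(X) uses divisor n):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # hypothetical data
y = np.array([1.1, 2.1, 2.8, 4.2, 5.0])
n = len(x)

b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()
sigma2_hat = np.sum((y - b0 - b1 * x) ** 2) / (n - 2)

# Var(b1) = sigma^2 / sum((x - xbar)^2); plug in the estimate, take the root.
se_slope = np.sqrt(sigma2_hat / np.sum((x - x.mean()) ** 2))
```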

t-distribution. Suppose an estimate β̂ is normal with variance cσ². Suppose σ² is estimated by s², which is related to a chi-square distribution. Then (β̂ − β)/√(cs²) follows a t-distribution with degrees of freedom equal to the chi-square degrees of freedom.
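Putting the pieces together, a t-statistic for H₀: β₁ = 1 on the same hypothetical data (a sketch; the resulting statistic would be compared against a t table with n − 2 degrees of freedom):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # hypothetical data
y = np.array([1.1, 2.1, 2.8, 4.2, 5.0])
n = len(x)

b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()
s2 = np.sum((y - b0 - b1 * x) ** 2) / (n - 2)
se = np.sqrt(s2 / np.sum((x - x.mean()) ** 2))

# t = (b1 - 1)/SE follows a t-distribution with n - 2 df under H0: beta1 = 1.
t_stat = (b1 - 1.0) / se
```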

An example. Determining small quantities of calcium in the presence of magnesium is a difficult problem for analytical chemists. One method involves the use of alcohol as a solvent. The data below show the results when the method is applied to 10 mixtures with known quantities of CaO. The second column gives the amount of CaO recovered. Questions of interest: test whether the intercept is 0; test whether the slope is 1.

[Data table: X = CaO present, Y = CaO recovered, fitted value, residual — values not reproduced in this transcript]

Least Squares Estimates (numeric values lost in transcription):
Constant: estimate ( ), standard error ( )
Predictor: estimate ( ), standard error ( E-3)
Squared correlation (R Squared): ( )
Estimate of SD(ε) (Sigma hat): ( )
Number of cases: 10
Degrees of freedom: 8
Slope test: t = (estimate − 1)/SE = ( )/.1378 = 1.6547
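As a sanity check on the printed slope test, the statistic t = 1.6547 on 8 degrees of freedom can be compared with the two-sided 5% critical value of the t-distribution with 8 df, roughly 2.306 from standard t tables:

```python
# Slope test from the lecture's output: t = (b1_hat - 1)/SE = 1.6547, df = 8.
t_stat = 1.6547
t_crit = 2.306  # two-sided 5% critical value, t-distribution with 8 df (tables)

# |t| < t_crit, so H0: slope = 1 is not rejected at the 5% level.
reject = abs(t_stat) > t_crit
```

Since 1.6547 < 2.306, the data are consistent with a slope of 1 at the 5% level; the recovery method shows no significant proportional bias.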