What makes one estimator better than another? "Estimator" is the jargon term for a method of estimating.

Presentation transcript:

What makes one estimator better than another? "Estimator" is the jargon term for a method of estimating.

Estimate The estimator produces an estimate. The estimate is the number. The estimator is the method.

What makes one estimator better than another? A better estimator is more likely to be close to the true line.

How close is our regression line to the true line? To answer, we must make assumptions. Assumption 1 is right in the question above: there is a true line that we are trying to find. Assumptions are needed to assess an estimator.

To see where we’re going with the assumptions…

True line demo (spreadsheet): Yᵢ = α + βXᵢ + eᵢ
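The spreadsheet demo can be sketched in code. This is a minimal illustration, assuming numpy; the true α, β, seed, and sample size are made up for the example. We generate data from a true line plus errors, then recover the line by least squares.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 2.0, 0.5                 # true intercept and slope (illustrative values)
x = np.linspace(0, 10, 50)
e = rng.normal(0, 1, size=x.size)      # errors: mean 0, constant variance
y = alpha + beta * x + e               # Y_i = alpha + beta * X_i + e_i

# Least squares estimates of slope and intercept
b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a = y.mean() - b * x.mean()
```

With 50 points the estimates land close to the true α and β, but not exactly on them: the errors pull the fitted line away from the true line.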

Least squares demo

Assumption 2: The errors' expected value is 0. This is why we draw the regression line through the middle of the pattern of points, and it implies that the least squares estimator is unbiased. (Estimator = method.)

Bias: Unbiased means aimed at the target (bias demo). The expected value of the least squares slope is the true slope; the same holds for the intercept.
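Unbiasedness can be checked by simulation, in the spirit of the bias demo. A sketch, assuming numpy; the true line and number of replications are illustrative. We draw many samples from the same true line and average the estimated slopes.

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, beta = 2.0, 0.5
x = np.linspace(0, 10, 30)
slopes = []
for _ in range(2000):                  # many samples from the same true line
    y = alpha + beta * x + rng.normal(0, 1, x.size)
    b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    slopes.append(b)

# Unbiased: the average estimated slope is very close to the true slope
print(np.mean(slopes))
```

Any single estimate misses the true slope, but the average over many samples is on target: that is what "aimed at the target" means.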

Assumption 3: All errors have the same variance. This is why you give each point equal consideration.

Assumption 4: The errors are not correlated with each other. Correlated means there is a linear relationship that lets you predict one error once you know another. Serial correlation would be if one error helps you anticipate the direction of the next error.
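A quick way to look for serial correlation is the lag-1 correlation of the residuals: correlate each residual with the next one. A sketch, assuming numpy; the residual array here is simulated independent noise purely for illustration, so its lag-1 correlation should be near 0, as Assumption 4 requires.

```python
import numpy as np

rng = np.random.default_rng(2)
resid = rng.normal(0, 1, 200)          # illustrative: independent "residuals"

# Correlate each residual with the following one; near 0 under Assumption 4
lag1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]
```

If residuals came from serially correlated errors instead, `lag1` would be noticeably nonzero, and one error would help predict the next.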

Errors not correlated with each other: this is why you predict on the regression line, rather than above or below it.

Assumption 5: The errors are normally distributed. A normal distribution results from the accumulation of many small disturbances, like a random walk with small steps. The normal distribution demos show how tight the normal distribution is.

With normally distributed errors, least squares is best: it is unbiased, and it has the least variance (it is the most efficient) of any unbiased estimator. (Efficiency demos.) We can also do hypothesis testing.

A spreadsheet adds: the standard error of the slope coefficient, the t-statistic (coefficient ÷ its standard error), R-squared, and the standard error of the regression.
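These spreadsheet quantities can all be computed by hand from the residuals. A sketch, assuming numpy and simulated data (the true line and seed are illustrative), using the usual simple-regression formulas:

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0, 10, 40)
y = 2.0 + 0.5 * x + rng.normal(0, 1, x.size)   # illustrative data
n = x.size

Sxx = np.sum((x - x.mean()) ** 2)
b = np.sum((x - x.mean()) * (y - y.mean())) / Sxx
a = y.mean() - b * x.mean()
resid = y - (a + b * x)

s = np.sqrt(np.sum(resid ** 2) / (n - 2))      # standard error of the regression
se_b = s / np.sqrt(Sxx)                        # standard error of the slope coefficient
t = b / se_b                                   # t-statistic: coefficient / its SE
r2 = 1 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)   # R-squared
```

Here the true slope is nonzero, so the t-statistic comes out large and the hypothesis that the true slope is 0 would be rejected.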

Standard error of the coefficient: shows how near the estimated coefficient is likely to be to the true coefficient.

t: A unitless number with a known distribution, if the assumptions about the errors are true. Used here to test the hypothesis that the true slope parameter is 0.

R²: Between 0 and 1 (demo). Least squares maximizes this. The correlation coefficient r is its square root.
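The r-and-R² relationship is easy to verify numerically. A sketch, assuming numpy; the small data set is made up for the example. In simple regression, squaring the correlation coefficient gives exactly the R² of the least squares fit.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])     # illustrative data
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

r = np.corrcoef(x, y)[0, 1]                 # correlation coefficient

# R-squared from the least squares fit
b, a = np.polyfit(x, y, 1)                  # slope, intercept
resid = y - (a + b * x)
r2 = 1 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)

# In simple regression, r**2 equals R-squared
```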

Standard error of the regression, "s": it should be called the standard residual, but it isn't.

s: The root-mean-square average size of the residuals. s² is an estimate of σ².

S 2 and  2 S2S2 Sum of squares of residuals Divided by N-2 22 Expected value of sum of squares of the errors Divided by N