5. Consistency

We cannot always achieve unbiasedness of estimators.
-For example, $\hat{\sigma}$ is not an unbiased estimator of $\sigma$; it is only consistent
-Where unbiasedness cannot be achieved, consistency is the minimum requirement for an estimator
-Consistency holds under MLR.1 through MLR.4; in fact, a weaker condition suffices: u has zero mean and is uncorrelated with each of the x's

5. Intuitive Consistency

While the actual proof of consistency is complicated, it can be explained intuitively:
-Each sample of n observations produces a $\hat{\beta}_j$ with a given sampling distribution
-MLR.1 through MLR.4 make this $\hat{\beta}_j$ unbiased, with mean $\beta_j$
-If the estimator is consistent, the distribution becomes more tightly centered around $\beta_j$ as n increases
-As n tends to infinity, the distribution of $\hat{\beta}_j$ collapses to the single point $\beta_j$
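A minimal Monte Carlo sketch of this intuition (assuming NumPy; the model and all parameter values are invented for illustration): simulate $y = \beta_0 + \beta_1 x + u$ for several sample sizes and watch the spread of $\hat{\beta}_1$ shrink around $\beta_1$.

```python
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1 = 1.0, 2.0  # hypothetical true parameters

def ols_slope(n):
    """One draw of beta_1 hat from a sample of size n (MLR.1-MLR.4 hold)."""
    x = rng.normal(size=n)
    u = rng.normal(size=n)  # u independent of x, so MLR.4 holds
    y = beta0 + beta1 * x + u
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

for n in (10, 100, 1000, 10000):
    draws = np.array([ols_slope(n) for _ in range(2000)])
    print(f"n={n:>6}: mean={draws.mean():.4f}, sd={draws.std():.4f}")
# The mean stays near beta1 = 2 at every n (unbiasedness), while the sd
# shrinks toward 0: the distribution collapses around beta1 (consistency).
```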

5. Empirical Consistency

In general, if obtaining more data DOES NOT get us closer to our parameter of interest, we are using a poor (inconsistent) estimator.
-Fortunately, the same assumptions imply both unbiasedness and consistency:

Theorem 5.1 (Consistency of OLS)

Under assumptions MLR.1 through MLR.4, the OLS estimator $\hat{\beta}_j$ is consistent for $\beta_j$, for all j = 0, 1, …, k.

Theorem 5.1 Notes

While a general proof of this theorem requires matrix algebra, the single-independent-variable case can be proved from our $\hat{\beta}_1$ estimator:

$$\hat{\beta}_1 = \frac{\sum_{i=1}^{n}(x_{i1}-\bar{x}_1)\,y_i}{\sum_{i=1}^{n}(x_{i1}-\bar{x}_1)^2} = \beta_1 + \frac{n^{-1}\sum_{i=1}^{n}(x_{i1}-\bar{x}_1)\,u_i}{n^{-1}\sum_{i=1}^{n}(x_{i1}-\bar{x}_1)^2}$$

which uses the fact that $y_i = \beta_0 + \beta_1 x_{i1} + u_i$ and previously seen algebraic properties.

Theorem 5.1 Notes

Using the law of large numbers, the numerator and denominator converge in probability to the population quantities $\mathrm{Cov}(x_1,u)$ and $\mathrm{Var}(x_1)$.
-Since $\mathrm{Var}(x_1) \neq 0$ (MLR.3), we can use probability limits (Appendix C) to conclude:

$$\mathrm{plim}\,\hat{\beta}_1 = \beta_1 + \frac{\mathrm{Cov}(x_1,u)}{\mathrm{Var}(x_1)} = \beta_1, \quad \text{since } \mathrm{Cov}(x_1,u) = 0$$

-Note that MLR.4, which assumes $x_1$ and u aren't correlated, is essential to the above
-Technically, $\mathrm{Var}(x_1)$ and $\mathrm{Var}(u)$ should also be less than infinity
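A quick numerical check of this law-of-large-numbers argument (a sketch; the data-generating process and moments are hypothetical): the sample averages in the numerator and denominator settle on $\mathrm{Cov}(x_1,u) = 0$ and $\mathrm{Var}(x_1)$ as n grows.

```python
import numpy as np

rng = np.random.default_rng(1)
for n in (100, 10_000, 1_000_000):
    x1 = rng.normal(loc=0.0, scale=2.0, size=n)  # Var(x1) = 4
    u = rng.normal(size=n)                       # Cov(x1, u) = 0 under MLR.4
    num = np.mean((x1 - x1.mean()) * u)          # -> Cov(x1, u) = 0
    den = np.mean((x1 - x1.mean()) ** 2)         # -> Var(x1) = 4
    print(f"n={n:>9}: numerator={num:+.5f}, denominator={den:.4f}")
# By the law of large numbers the ratio, and hence beta_1 hat - beta_1,
# converges in probability to Cov(x1, u)/Var(x1) = 0/4 = 0.
```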

5. Correlation and Inconsistency

-If MLR.4 fails, consistency fails
-That is, correlation between u and ANY x generally causes all OLS estimators to be inconsistent
-"If the error is correlated with any of the independent variables, then OLS is biased and inconsistent"
-In the simple regression case, the INCONSISTENCY in $\hat{\beta}_1$ (or ASYMPTOTIC BIAS) is:

$$\mathrm{plim}\,\hat{\beta}_1 - \beta_1 = \frac{\mathrm{Cov}(x_1,u)}{\mathrm{Var}(x_1)}$$

5. Correlation and Inconsistency

-Since variance is always positive, the sign of the inconsistency matches the sign of the covariance
-If the covariance is small relative to the variance, the inconsistency is negligible
-However, we cannot estimate this covariance, because u is unobserved
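A sketch of this formula in action (all numbers hypothetical): build an error correlated with $x_1$ and compare the large-sample estimate against $\beta_1 + \mathrm{Cov}(x_1,u)/\mathrm{Var}(x_1)$.

```python
import numpy as np

rng = np.random.default_rng(2)
n, beta0, beta1 = 1_000_000, 1.0, 2.0  # hypothetical values
x1 = rng.normal(size=n)
u = 0.5 * x1 + rng.normal(size=n)      # Cov(x1, u) = 0.5, Var(x1) = 1
y = beta0 + beta1 * x1 + u

b1_hat = np.cov(x1, y)[0, 1] / np.var(x1, ddof=1)
print(f"beta_1 hat       = {b1_hat:.4f}")        # ~2.5, not 2
print(f"beta_1 + Cov/Var = {beta1 + 0.5:.4f}")
# The inconsistency is Cov(x1, u)/Var(x1) = 0.5: positive covariance,
# positive asymptotic bias, and no amount of data removes it.
```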

5. Correlation and Inconsistency

Consider the following true model:

$$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + v$$

where we satisfy MLR.1 through MLR.4 (v has zero mean and is uncorrelated with $x_1$ and $x_2$).
-By Theorem 5.1, our OLS estimators $\hat{\beta}_j$ are consistent
-If we omit $x_2$ and run an OLS regression, then $u = \beta_2 x_2 + v$ and

$$\mathrm{plim}\,\tilde{\beta}_1 = \beta_1 + \beta_2 \delta_1, \quad \text{where } \delta_1 = \frac{\mathrm{Cov}(x_1,x_2)}{\mathrm{Var}(x_1)}$$

5. Correlation and Inconsistency

Practically, inconsistency can be viewed the same way as bias:
-Inconsistency deals with the population covariance and variance
-Bias deals with the sample covariance and variance
-If $x_1$ and $x_2$ are uncorrelated, then $\delta_1 = 0$ and $\tilde{\beta}_1$ is consistent (though not necessarily unbiased)
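A sketch of the omitted-variable case (parameter values hypothetical): regress y on $x_1$ alone and compare the slope with $\beta_1 + \beta_2 \delta_1$, where $\delta_1 = \mathrm{Cov}(x_1,x_2)/\mathrm{Var}(x_1)$.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000
beta0, beta1, beta2 = 1.0, 2.0, 3.0    # hypothetical true model
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)     # Corr(x1, x2) > 0, delta_1 = 0.6
v = rng.normal(size=n)                 # satisfies MLR.1-MLR.4
y = beta0 + beta1 * x1 + beta2 * x2 + v

b1_tilde = np.cov(x1, y)[0, 1] / np.var(x1, ddof=1)   # short-regression slope
delta1 = np.cov(x1, x2)[0, 1] / np.var(x1, ddof=1)    # -> 0.6
print(f"beta_1 tilde            = {b1_tilde:.4f}")    # ~ 2 + 3*0.6 = 3.8
print(f"beta_1 + beta_2*delta_1 = {beta1 + beta2 * delta1:.4f}")
```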

5. Inconsistency

-The direction of the inconsistency can be determined using the same table as for bias:

                     Corr(x_1, x_2) > 0    Corr(x_1, x_2) < 0
  $\beta_2 > 0$      Positive bias         Negative bias
  $\beta_2 < 0$      Negative bias         Positive bias

5. Inconsistency Notes

If OLS is inconsistent, adding observations does not fix it
-In fact, increasing the sample size makes the problem worse
-In the k-regressor case, correlation between one x variable and u generally makes ALL coefficient estimators inconsistent
-The one exception is when $x_j$ is correlated with u but ALL other variables are uncorrelated with both $x_j$ and u
-In that case, only $\hat{\beta}_j$ is inconsistent
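A final sketch (hypothetical data-generating process) of why more data makes the problem worse: with an endogenous regressor, the sampling distribution of $\hat{\beta}_1$ tightens around the wrong value, so the estimate looks ever more precise while remaining inconsistent.

```python
import numpy as np

rng = np.random.default_rng(4)
beta1 = 2.0  # hypothetical true slope; the plim of the estimator is 2.5

def endogenous_slope(n):
    """One draw of beta_1 hat when MLR.4 fails (Cov(x, u) = 0.5)."""
    x = rng.normal(size=n)
    u = 0.5 * x + rng.normal(size=n)
    y = beta1 * x + u
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

for n in (100, 1000, 10000):
    draws = np.array([endogenous_slope(n) for _ in range(2000)])
    print(f"n={n:>6}: mean={draws.mean():.3f}, sd={draws.std():.3f}")
# The mean stays near 2.5 (= beta1 + Cov/Var) at every n while the sd
# shrinks: adding observations only concentrates beta_1 hat more tightly
# around the wrong value.
```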