Multiple Regression Analysis: OLS Asymptotics

Presentation transcript:

Multiple Regression Analysis: OLS Asymptotics

Multiple Regression Analysis: OLS Asymptotics
So far we have focused on properties of OLS that hold for any sample size.

Properties of OLS that hold for any sample/sample size:
- Expected values/unbiasedness under MLR.1 – MLR.4
- Variance formulas under MLR.1 – MLR.5
- Gauss-Markov Theorem under MLR.1 – MLR.5
- Exact sampling distributions/tests under MLR.1 – MLR.6

Properties of OLS that hold in large samples:
- Consistency under MLR.1 – MLR.4
- Asymptotic normality/tests under MLR.1 – MLR.5 (without assuming normality of the error term!)

Multiple Regression Analysis: OLS Asymptotics
Consistency

An estimator θ̂_n is consistent for a population parameter θ if

  P(|θ̂_n − θ| > ε) → 0 as n → ∞, for arbitrary ε > 0.

Alternative notation: plim θ̂_n = θ, i.e. the estimate converges in probability to the true population value.

Interpretation: Consistency means that the probability that the estimate is arbitrarily close to the true population value can be made arbitrarily high by increasing the sample size.

Consistency is a minimum requirement for sensible estimators.
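
A minimal simulation sketch of this definition (not part of the original slides; the true value, tolerance, and distribution are assumptions chosen for illustration): for a consistent estimator such as the sample mean, the empirical probability of being more than ε away from the true value shrinks toward zero as n grows.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, eps, reps = 2.0, 0.1, 1_000   # assumed true mean, tolerance epsilon, Monte Carlo replications

for n in (10, 100, 1_000, 10_000):
    # draw `reps` samples of size n and take the sample mean of each one
    estimates = rng.normal(loc=theta, scale=3.0, size=(reps, n)).mean(axis=1)
    # empirical P(|theta_hat - theta| > eps): falls toward 0 as n increases
    p_far = np.mean(np.abs(estimates - theta) > eps)
    print(f"n={n:6d}   P(|theta_hat - theta| > {eps}) ~ {p_far:.3f}")
```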

Multiple Regression Analysis: OLS Asymptotics
Theorem 5.1 (Consistency of OLS)

Under assumptions MLR.1 – MLR.4, the OLS estimators are consistent: plim β̂_j = β_j for j = 0, 1, …, k.

Special case of the simple regression model:
  β̂_1 = β_1 + [sample covariance of x_1 and u] / [sample variance of x_1]
  plim β̂_1 = β_1 + Cov(x_1, u) / Var(x_1) = β_1 if Cov(x_1, u) = 0

One can see that the slope estimate is consistent if the explanatory variable is exogenous, i.e. uncorrelated with the error term.

Assumption MLR.4': E(u) = 0 and Cov(x_j, u) = 0 for j = 1, …, k, i.e. all explanatory variables must be uncorrelated with the error term. This assumption is weaker than the zero conditional mean assumption MLR.4.
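
The simple-regression decomposition above can be checked numerically. The sketch below (a hedged illustration; the data-generating process and parameter values are invented for the example) writes the OLS slope as β_1 plus the sample covariance of x and u over the sample variance of x, and shows that this extra term vanishes as n grows when x is exogenous.

```python
import numpy as np

rng = np.random.default_rng(1)
beta0, beta1 = 1.0, 0.5   # assumed population parameters

for n in (100, 10_000, 1_000_000):
    x = rng.normal(size=n)
    u = rng.normal(size=n)          # exogenous error: Cov(x, u) = 0 in the population
    y = beta0 + beta1 * x + u

    # OLS slope in the simple regression of y on x
    b1_hat = np.cov(x, y, bias=True)[0, 1] / np.var(x)
    # decomposition: b1_hat = beta1 + sample Cov(x, u) / sample Var(x)
    noise_term = np.cov(x, u, bias=True)[0, 1] / np.var(x)
    print(f"n={n:8d}   b1_hat={b1_hat:.4f}   beta1 + cov(x,u)/var(x)={beta1 + noise_term:.4f}")
```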

Multiple Regression Analysis: OLS Asymptotics
For consistency of OLS, only the weaker assumption MLR.4' is needed.

Asymptotic analog of omitted variable bias:
  True model:         y = β_0 + β_1 x_1 + β_2 x_2 + v
  Misspecified model: y = β_0 + β_1 x_1 + u, where u = β_2 x_2 + v
  Asymptotic bias (inconsistency): plim β̃_1 = β_1 + β_2 · Cov(x_1, x_2) / Var(x_1)

There is no omitted variable bias if the omitted variable is irrelevant (β_2 = 0) or uncorrelated with the included variable (Cov(x_1, x_2) = 0).
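
A short simulation sketch of the asymptotic bias formula (illustrative only; the coefficients and the correlation between x_1 and x_2 are assumed values): when x_2 is omitted but correlated with x_1, the short-regression slope converges to β_1 + β_2 · Cov(x_1, x_2)/Var(x_1), not to β_1.

```python
import numpy as np

rng = np.random.default_rng(2)
beta1, beta2, n = 1.0, 0.8, 1_000_000   # assumed true coefficients, large sample

x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)      # x2 correlated with x1
y = 2.0 + beta1 * x1 + beta2 * x2 + rng.normal(size=n)

# short regression of y on x1 only (x2 omitted)
b1_tilde = np.cov(x1, y, bias=True)[0, 1] / np.var(x1)

# predicted probability limit: beta1 + beta2 * Cov(x1, x2) / Var(x1)
plim_b1 = beta1 + beta2 * np.cov(x1, x2, bias=True)[0, 1] / np.var(x1)
print(f"short-regression slope: {b1_tilde:.4f}   predicted plim: {plim_b1:.4f}")
```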

Multiple Regression Analysis: OLS Asymptotics
Asymptotic normality and large sample inference

In practice, the normality assumption MLR.6 is often questionable. If MLR.6 does not hold, the results of t- or F-tests may be wrong. Fortunately, F- and t-tests still work if the sample size is large enough; OLS estimates are approximately normal in large samples even without MLR.6.

Theorem 5.2 (Asymptotic normality of OLS): Under assumptions MLR.1 – MLR.5,
  √n (β̂_j − β_j) is asymptotically Normal(0, σ²/a_j²), where a_j² = plim (1/n) Σ_i r̂²_ij (the r̂_ij are the residuals from regressing x_j on the other explanatory variables)
  σ̂² is a consistent estimator of σ²
  (β̂_j − β_j) / se(β̂_j) is asymptotically Normal(0, 1)

In large samples, the standardized estimates are also normally distributed.
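
A minimal Monte Carlo sketch of Theorem 5.2 (not from the original slides; the skewed error distribution and sample sizes are assumptions chosen to make the point): even when the errors are strongly non-normal, so that MLR.6 fails, the standardized slope estimates behave approximately like Normal(0,1) draws once n is moderately large.

```python
import numpy as np

rng = np.random.default_rng(3)
beta0, beta1, reps = 1.0, 0.5, 2_000

for n in (5, 500):
    t_stats = np.empty(reps)
    for r in range(reps):
        x = rng.normal(size=n)
        u = rng.exponential(scale=1.0, size=n) - 1.0   # skewed, mean-zero errors: MLR.6 fails
        y = beta0 + beta1 * x + u

        sxx = np.sum((x - x.mean()) ** 2)
        b1 = np.sum((x - x.mean()) * (y - y.mean())) / sxx
        b0 = y.mean() - b1 * x.mean()
        resid = y - b0 - b1 * x
        sigma2_hat = np.sum(resid ** 2) / (n - 2)       # usual error-variance estimator
        t_stats[r] = (b1 - beta1) / np.sqrt(sigma2_hat / sxx)

    # under asymptotic normality, about 5% of |t| should exceed 1.96
    print(f"n={n:4d}   share with |t| > 1.96: {np.mean(np.abs(t_stats) > 1.96):.3f}")
```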

Multiple Regression Analysis: OLS Asymptotics
Practical consequences

In large samples, the t distribution is close to the Normal(0,1) distribution. As a consequence, t-tests are valid in large samples without MLR.6. The same is true for confidence intervals and F-tests. Important: MLR.1 – MLR.5 are still necessary, especially homoskedasticity.

Asymptotic analysis of the OLS sampling errors:
  Var̂(β̂_j) = σ̂² / [SST_j (1 − R_j²)] = σ̂² / [n · (SST_j/n) (1 − R_j²)]
  - σ̂² converges to σ²
  - SST_j/n converges to Var(x_j)
  - R_j² converges to a fixed number

Multiple Regression Analysis: OLS Asymptotics
Asymptotic analysis of the OLS sampling errors (cont.)

  Var̂(β̂_j) shrinks with the rate 1/n
  se(β̂_j) = σ̂ / [SST_j (1 − R_j²)]^(1/2) shrinks with the rate 1/√n

This is why large samples are better.

Example: Standard errors in a birth weight equation. If we use only the first half of the observations, the standard errors are roughly √2 ≈ 1.41 times larger than with the full sample, just as the 1/√n rate predicts.
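
A brief sketch of the 1/√n rate (the birth weight data are not reproduced here, so an invented data-generating process stands in for them): cutting the sample in half inflates the slope's standard error by roughly √2.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 20_000
x = rng.normal(size=n)
y = 3.0 + 0.4 * x + rng.normal(size=n)   # assumed simple-regression setup

def ols_se(x, y):
    """Standard error of the slope in a simple regression of y on x."""
    sxx = np.sum((x - x.mean()) ** 2)
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / sxx
    b0 = y.mean() - b1 * x.mean()
    resid = y - b0 - b1 * x
    sigma2_hat = np.sum(resid ** 2) / (len(y) - 2)
    return np.sqrt(sigma2_hat / sxx)

se_full = ols_se(x, y)
se_half = ols_se(x[: n // 2], y[: n // 2])   # use only the first half of observations
print(f"se(full)={se_full:.5f}  se(half)={se_half:.5f}  ratio={se_half / se_full:.3f}  (sqrt(2) ~ 1.414)")
```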