
3.7 Multicollinearity
The 'perfect' case is of little interest ⇒ it is easily detectable (the regressor matrix is not of full rank).
Consequences of quasi-perfect multicollinearity:
– Larger variances and standard errors
– Wider confidence intervals (less precision)
– Non-significant t-statistics
– High R² but few significant t-statistics
– Least squares very sensitive to small changes in the data
– Wrong sign on some coefficients
– Individual contribution of each regressor hard to assess
How to detect it?
– High R² with few significant t-statistics
– High correlation among explanatory variables (r > 0.8)
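The r > 0.8 screening rule above can be sketched in a few lines. This is a minimal pure-Python check of the pairwise correlation between two regressors; the data for x1 and x2 are hypothetical, chosen only to illustrate two variables that move almost together.

```python
import math

def pearson_r(x, y):
    """Sample Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical regressors that move almost together
x1 = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
x2 = [1.1, 2.0, 3.2, 3.9, 5.1, 6.0]

r = pearson_r(x1, x2)
if abs(r) > 0.8:
    print(f"r = {r:.3f}: likely multicollinearity between x1 and x2")
```

In practice one inspects the full correlation matrix of the regressors (or variance inflation factors); a single high pairwise r is only a warning sign, not proof.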

3.8 Relaxing the CLRM basic assumptions
1. Before: errors cancel out on average (exogeneity). Now: they don't, and they are correlated with the regressors.
Consequence: endogeneity ⇒ LS is biased.
Hausman test; alternative: 2SLS (instrumental variables)
2. Before: same dispersion of the errors (homoskedasticity). Now: different dispersion (heteroskedasticity).
Consequence: inefficiency of LS ⇒ larger variances/standard errors.
White test; alternative: GLS
3. Before: no autocorrelation of the errors (no serial correlation). Now: autocorrelated errors.
Consequence: inefficiency of LS ⇒ larger variances/standard errors.
Durbin-Watson test; alternative: GLS

3.8 Relaxing the CLRM basic assumptions (cont.)
4. Before: normality of the errors. Now: non-normal errors.
Consequence: hypothesis tests are NOT valid (in small samples).
Jarque-Bera test; in large samples the central limit theorem restores approximate validity.
5. Before: absence of multicollinearity. Now: multicollinearity.
Consequence: coefficients cannot be computed (perfect multicollinearity), or the difficulties listed in 3.7 arise (quasi-perfect multicollinearity).
Alternative: re-specify the model.
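The Jarque-Bera statistic from point 4 combines the skewness and kurtosis of the residuals: JB = (n/6)·(S² + (K − 3)²/4), which is approximately chi-squared with 2 degrees of freedom under normal errors, so a large value rejects normality. A minimal pure-Python sketch, applied to a small hypothetical residual series:

```python
def jarque_bera(resid):
    """Jarque-Bera normality statistic: (n/6) * (S^2 + (K-3)^2 / 4),
    where S is sample skewness and K sample kurtosis.
    Approximately chi-squared(2) under normal errors; large values reject normality."""
    n = len(resid)
    mean = sum(resid) / n
    m2 = sum((e - mean) ** 2 for e in resid) / n
    m3 = sum((e - mean) ** 3 for e in resid) / n
    m4 = sum((e - mean) ** 4 for e in resid) / n
    skew = m3 / m2 ** 1.5
    kurt = m4 / m2 ** 2
    return n / 6 * (skew ** 2 + (kurt - 3) ** 2 / 4)

# Hypothetical residuals; a symmetric series has zero skewness,
# so only the kurtosis term contributes to JB
resid = [-2.0, -1.0, 0.0, 1.0, 2.0]
print(jarque_bera(resid))
```

With n this small the chi-squared approximation is poor; the test is meant for reasonably large samples, which is also where the central limit theorem argument applies.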