MULTICOLLINEARITY: WHAT HAPPENS IF THE REGRESSORS ARE CORRELATED?

CHAPTER 10. MULTICOLLINEARITY: WHAT HAPPENS IF THE REGRESSORS ARE CORRELATED?

Assumption 10 of the CLRM: there is no multicollinearity among the regressors included in the regression model.
The questions this chapter addresses:
- What is the nature of multicollinearity?
- Is multicollinearity really a problem?
- What are its practical consequences?
- How does one detect it?
- What remedial measures can be taken to alleviate the problem of multicollinearity?

Why Does Multicollinearity Matter?
- If multicollinearity is perfect, the regression coefficients of the X variables are indeterminate and their standard errors are infinite.
- If multicollinearity is less than perfect, the regression coefficients, although determinate, have large standard errors, which means the coefficients cannot be estimated with great precision.
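
A minimal sketch of this distinction, using simulated data and NumPy (my choice of tool, not something used in the chapter): with perfect collinearity the design matrix loses full column rank, so the normal equations have no unique solution; with near-perfect collinearity the coefficients can be computed, but their standard errors are very large.

```python
# A minimal sketch, not from the chapter: simulated data only.
import numpy as np

rng = np.random.default_rng(0)
n = 100
x2 = rng.normal(size=n)

# Perfect collinearity: the third column is an exact linear function of x2.
X_perfect = np.column_stack([np.ones(n), x2, 2.0 * x2])
print(np.linalg.matrix_rank(X_perfect))   # 2 < 3 columns: X'X is singular,
                                          # so the coefficients are indeterminate

# Near-perfect collinearity: x3 equals 2*x2 plus a tiny disturbance.
x3 = 2.0 * x2 + rng.normal(scale=0.01, size=n)
X = np.column_stack([np.ones(n), x2, x3])
y = 1.0 + 0.5 * x2 + 0.5 * x3 + rng.normal(size=n)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])
se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
print(beta.round(2), se.round(2))   # the slope standard errors are very large
```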

Additional Sources of Multicollinearity
- Data collection method: sampling over a limited range of values of the regressors.
- Constraints on the model or on the population being sampled. For example, in a regression of electricity consumption on income (X2) and house size (X3), there is a physical constraint in the population: families with higher incomes generally have larger homes.
- Model specification: employing an incorrectly specified model.
- An overdetermined model, in which the number of explanatory variables exceeds the number of observations (see the short sketch below).
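
A quick illustration of the last point (simulated, not from the chapter): with more regressors than observations, the design matrix cannot have full column rank, so the problem is a degenerate case of perfect multicollinearity.

```python
# A minimal sketch, not from the chapter: more regressors than observations.
import numpy as np

rng = np.random.default_rng(1)
n, k = 10, 15                      # 10 observations, 15 explanatory variables
X = rng.normal(size=(n, k))
print(np.linalg.matrix_rank(X))    # at most 10, so the k x k matrix X'X is
                                   # singular and OLS has no unique solution
```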

Consequences of Multicollinearity
- Large variances and covariances of the OLS estimators, which make precise estimation difficult.
- Wider confidence intervals, leading to more frequent acceptance of the "zero null hypothesis" (that the true population coefficient is zero).
- Insignificant t ratios.
- A high overall R-squared.
- OLS estimators and their standard errors can be sensitive to small changes in the data.
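
The classic symptom pattern can be reproduced with simulated data; this sketch uses the statsmodels package (my choice, not something referenced in the chapter). With two nearly identical regressors, R-squared and the overall F statistic are large, yet the individual t ratios are usually insignificant and the confidence intervals are wide.

```python
# A minimal sketch, not from the chapter: two highly correlated regressors.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 30
x2 = rng.normal(size=n)
x3 = x2 + rng.normal(scale=0.05, size=n)      # corr(x2, x3) is close to 1
y = 1.0 + 1.0 * x2 + 1.0 * x3 + rng.normal(size=n)

X = sm.add_constant(np.column_stack([x2, x3]))
res = sm.OLS(y, X).fit()

print(round(res.rsquared, 2))     # high R-squared
print(round(res.fvalue, 1))       # large overall F statistic
print(res.tvalues.round(2))       # t ratios on x2, x3 are usually small here
print(res.conf_int().round(2))    # wide intervals that may straddle zero
```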

How to Detect Multicollinearity
- A high R-squared but few significant t ratios.
- High pair-wise correlations among the regressors.
- Auxiliary regressions: regress each X on the remaining regressors and examine the resulting R-squared and F values.
- Eigenvalues and the condition index.
- Tolerance and the variance inflation factor (VIF); the VIF and the condition index are sketched in code below.
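
A sketch of two of these diagnostics on simulated data. The variance_inflation_factor helper is from statsmodels, and the unit-length column scaling used for the condition index is the common convention rather than anything specified in the chapter.

```python
# A minimal sketch, not from the chapter: VIFs and the condition index.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(3)
n = 100
x2 = rng.normal(size=n)
x3 = x2 + rng.normal(scale=0.1, size=n)       # nearly collinear with x2
x4 = rng.normal(size=n)                       # unrelated regressor
X = sm.add_constant(np.column_stack([x2, x3, x4]))

# VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from the auxiliary regression of
# X_j on the other regressors; values above about 10 are a common warning sign.
vifs = [variance_inflation_factor(X, j) for j in range(1, X.shape[1])]
print([round(v, 1) for v in vifs])

# Condition index: sqrt(largest / smallest eigenvalue) of X'X after scaling
# each column to unit length; values above roughly 30 suggest a severe problem.
Xs = X / np.linalg.norm(X, axis=0)
eig = np.linalg.eigvalsh(Xs.T @ Xs)
print(round(np.sqrt(eig.max() / eig.min()), 1))
```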

Remedial Measures
- Using a priori information about the parameters.
- Combining cross-sectional and time-series data (pooling the data into a panel).
- Dropping variable(s), taking care not to introduce specification bias by omitting a relevant variable.
- Transforming the variables, for example by differencing, lagging, or taking ratios (differencing is sketched below).
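
A small sketch of the differencing transformation on simulated data (not an example from the chapter): two trending series are almost perfectly correlated in levels but far less so in first differences. The usual caveat applies: differencing changes the interpretation of the model and can induce serial correlation in the transformed error term.

```python
# A minimal sketch, not from the chapter: first-differencing trending series.
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(200)
income = 100 + 2.0 * t + rng.normal(scale=5, size=t.size)
wealth = 500 + 10.0 * t + rng.normal(scale=25, size=t.size)

print(round(np.corrcoef(income, wealth)[0, 1], 3))                    # near 1 in levels
print(round(np.corrcoef(np.diff(income), np.diff(wealth))[0, 1], 3))  # much smaller
```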