1. The independent variables do not form a linearly dependent set; that is, the explanatory variables are not perfectly correlated.
2. Homoscedasticity: the probability distributions of the error term have a constant variance for all values of the independent variables (the Xi's).

Perfect multicollinearity is a violation of assumption (1). Heteroscedasticity is a violation of assumption (2).
Multicollinearity is a problem with time series regression. Suppose we wanted to estimate the following specification using quarterly time series data:

Auto Sales_t = β0 + β1·Income_t + β2·Prices_t

where Income_t is nominal income in quarter t and Prices_t is an index of auto prices in quarter t. The data reveal a strong (positive) correlation between nominal income and car prices.
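A small simulation can make the point concrete. The sketch below generates hypothetical quarterly income and auto-price series that both trend upward over time (all numbers are invented for illustration, not taken from real data) and then checks their correlation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 40  # ten years of quarterly observations (hypothetical)

# Both series trend upward over time, as macro series typically do
income = 100 + 2.0 * np.arange(n) + rng.normal(0, 3, n)   # nominal income
prices = 50 + 1.0 * np.arange(n) + rng.normal(0, 1.5, n)  # auto price index

r = np.corrcoef(income, prices)[0, 1]
print(f"corr(Income, Prices) = {r:.3f}")  # close to 1: near-collinear regressors
```

Because both regressors share the same time trend, their sample correlation is close to one, which is exactly the near-collinearity the slide describes.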
[Figure: scatter plot of car prices against (nominal) income, showing an approximate linear relationship between the explanatory variables]
Why is multicollinearity a problem?

In the case of perfectly collinear explanatory variables, OLS does not work. In the case where there is an approximate linear relationship among the explanatory variables (the Xi's), the estimates of the coefficients are still unbiased, but you run into the following problems:

- The estimates of the coefficients have high standard errors, weakening the capacity of the equation to produce accurate forecasts.
- High standard errors mean small t-ratios and a greater likelihood that null hypotheses will not be rejected.
- Multicollinearity means that the effects of the independent variables are mingled together, making it difficult for the researcher to disentangle the separate effects of the explanatory variables on the dependent variable.
- Estimates of the coefficients (the β's) are "unstable," meaning that a comparatively small change in the data set can produce a big change in the estimate of a coefficient.
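The "mingled effects" point in the list above can be demonstrated with a minimal numpy sketch (all variable names and coefficient values here are invented for illustration). With two nearly collinear regressors, OLS can only pin down their combined effect precisely; the individual slopes are fragile:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 40
x1 = rng.normal(0, 1, n)
x2 = x1 + rng.normal(0, 0.05, n)            # nearly collinear with x1
y = 1.0 + 2.0 * x1 + 1.0 * x2 + rng.normal(0, 1, n)

def ols(X, y):
    # Least-squares coefficients via numpy's lstsq
    return np.linalg.lstsq(X, y, rcond=None)[0]

X = np.column_stack([np.ones(n), x1, x2])
b_full = ols(X, y)
b_drop = ols(X[:-1], y[:-1])                 # drop a single observation
print("slopes (full):", b_full[1:])
print("slopes (drop 1 obs):", b_drop[1:])
print("sum of slopes:", b_full[1] + b_full[2])  # the sum stays near 3
```

The individual slope estimates can swing noticeably when one observation is removed, yet their sum stays close to the true combined effect (2 + 1 = 3): the separate effects are entangled even though the joint effect is well identified.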
How do you know you have a problem with multicollinearity?

- Do the estimates have high standard errors? Are the t-ratios microscopic?
- Does the correlation matrix reveal a high correlation between explanatory variables, say 0.70 or higher?
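The correlation-matrix check is easy to mechanize. A minimal sketch (the data and the three regressor names are hypothetical) builds the matrix with `numpy.corrcoef` and flags any pair of regressors at or above the 0.70 threshold mentioned above:

```python
import numpy as np

rng = np.random.default_rng(2)
income = rng.normal(50, 10, 100)
prices = 0.8 * income + rng.normal(0, 2, 100)  # strongly related to income
other = rng.normal(0, 1, 100)                  # unrelated third regressor

R = np.corrcoef([income, prices, other])       # 3x3 correlation matrix
print(np.round(R, 2))

# Flag every pair of explanatory variables with |r| >= 0.70
flags = [(i, j) for i in range(3) for j in range(i + 1, 3)
         if abs(R[i, j]) >= 0.70]
print("suspect pairs (by index):", flags)      # the (income, prices) pair
```

Here only the income/prices pair crosses the threshold, so that pair would be the multicollinearity suspect.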
What can be done about multicollinearity?

- Increase the sample size.
- Delete one or more explanatory variables from your specification.
Heteroscedasticity sometimes shows up when we do regression analysis using cross-sectional data. Consider the following model:

Y_i = β0 + β1·X_i + e_i

where β0 + β1·X_i is the deterministic part of the equation and e_i is the error term. Recall that we assume that E(e_i) = 0.
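To see what a violation of constant variance looks like in this model, here is a hypothetical cross-sectional sketch (the income/spending setup and all numbers are invented for illustration) where the spread of e_i grows with X_i:

```python
import numpy as np

rng = np.random.default_rng(3)
income = rng.uniform(20, 120, 200)       # household income (hypothetical units)

# Heteroscedastic errors: the standard deviation of e_i grows with income,
# so Var(e_i) is not constant across observations
e = rng.normal(0, 0.05 * income)
spending = 5 + 0.1 * income + e          # spending for electronics

low = spending[income < 50]
high = spending[income >= 100]
print("spread at low incomes: ", np.std(low))
print("spread at high incomes:", np.std(high))  # dispersion rises with income
```

The dispersion of spending around the regression line is visibly larger at high incomes than at low incomes, which is the "ascending" pattern pictured in the scatter diagram below.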
[Figure: two error distributions with the same mean (zero) but different variances]
[Figure: the disturbance distributions under heteroscedasticity; the spread of the error distribution P(e) widens as X increases]
[Figure: scatter diagram of ascending heteroscedasticity, plotting spending for electronics against household income]
Why is heteroscedasticity a problem?

Heteroscedasticity does not give us biased estimates of the coefficients; however, it does make the standard errors of the estimates unreliable. That is, we will understate the standard errors. Because of this, t-tests cannot be trusted: we run the risk of rejecting a null hypothesis that should not be rejected.
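The understated-standard-errors claim can be checked by Monte Carlo. This sketch (the design, error structure, and sample sizes are invented for illustration) repeatedly simulates a regression whose error variance grows with X, then compares the true sampling spread of the slope estimate to the average of the conventional, homoscedasticity-based standard error:

```python
import numpy as np

rng = np.random.default_rng(4)
n, reps = 100, 2000
x = np.linspace(1, 10, n)
X = np.column_stack([np.ones(n), x])
XtX_inv = np.linalg.inv(X.T @ X)

slopes, conv_se = [], []
for _ in range(reps):
    e = rng.normal(0, 0.1 * x**2)        # error spread grows with x
    y = 1 + 2 * x + e
    b = XtX_inv @ X.T @ y                # OLS estimates
    resid = y - X @ b
    s2 = resid @ resid / (n - 2)         # conventional (homoscedastic) s^2
    conv_se.append(np.sqrt(s2 * XtX_inv[1, 1]))
    slopes.append(b[1])

print("true sampling sd of slope:", np.std(slopes))
print("avg conventional SE:      ", np.mean(conv_se))
```

The slope estimates average out to the true value of 2 (OLS is still unbiased), but the conventional standard error sits below the actual sampling spread of the slope, so t-ratios built from it are too large and nulls get rejected too often.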