Published by Meghan Hudson. Modified over 9 years ago.
3.7 Multicollinearity

The 'perfect' case is of no interest: it is easily detectable. Consequences of quasi-perfect (near) multicollinearity:
- Larger variances and standard errors
- Wider confidence intervals (less precision)
- Non-significant t-statistics
- High R² but few significant t-statistics
- Least squares very sensitive to small changes in the data
- Wrong sign on some coefficients
- Individual contribution of each regressor hard to assess

How to detect?
- High R² with few significant t-statistics
- High correlation among the regressors (r > 0.8)
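The r > 0.8 rule of thumb can be checked directly from the data. A minimal sketch in plain Python (the variable names and the example series are illustrative, not from the slides):

```python
import math

def pearson_r(x, y):
    """Sample Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Two nearly collinear regressors: x2 is x1 plus small noise.
x1 = [1, 2, 3, 4, 5, 6, 7, 8]
x2 = [1.1, 2.0, 3.1, 3.9, 5.2, 6.0, 7.1, 7.9]
r = pearson_r(x1, x2)
print(round(r, 3))      # well above the 0.8 rule-of-thumb threshold
```

Any pair of regressors with |r| above 0.8 is a candidate source of the symptoms listed above; in practice one would inspect the whole correlation matrix, not a single pair.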
3.8 Relaxing the CLRM basic assumptions

1. Before: errors cancel out and are unrelated to the regressors (exogeneity). Now: they don't, and they affect the dependent variable.
   Consequence: endogeneity, so least squares is biased.
   Detection: Hausman test. Alternative: two-stage least squares (2SLS, instrumental variables).
2. Before: same dispersion of the errors (homoskedasticity). Now: different dispersions (heteroskedasticity).
   Consequence: least squares is inefficient, with larger variances and standard errors.
   Detection: White test. Alternative: GLS.
3. Before: no autocorrelation of the errors (no serial correlation). Now: the errors are autocorrelated.
   Consequence: least squares is inefficient, with larger variances and standard errors.
   Detection: Durbin-Watson test. Alternative: GLS.
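The Durbin-Watson statistic mentioned in point 3 is simple enough to compute by hand: d = Σ(e_t − e_{t−1})² / Σe_t², with d ≈ 2 under no autocorrelation, d near 0 under positive autocorrelation, and d near 4 under negative autocorrelation. A minimal sketch (the residual series are made-up illustrations):

```python
def durbin_watson(resid):
    """Durbin-Watson d statistic from a residual series.
    ~2: no autocorrelation; toward 0: positive; toward 4: negative."""
    num = sum((resid[t] - resid[t - 1]) ** 2 for t in range(1, len(resid)))
    den = sum(e ** 2 for e in resid)
    return num / den

# Slowly drifting residuals: consecutive values are close together (positive autocorrelation).
pos = [1.0, 0.9, 0.8, 0.7, -0.2, -0.4, -0.6, -0.8]
# Sign-alternating residuals (negative autocorrelation).
neg = [1.0, -1.0, 1.0, -1.0, 1.0, -1.0]
print(round(durbin_watson(pos), 2))  # well below 2
print(round(durbin_watson(neg), 2))  # well above 2
```

In a real application the residuals would come from the fitted regression, and d would be compared against the Durbin-Watson tables for the given sample size and number of regressors.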
3.8 Relaxing the CLRM basic assumptions (continued)

4. Before: normality of the errors. Now: absence of normality.
   Consequence: hypothesis tests are NOT valid (in small samples).
   Detection: Jarque-Bera test. In large samples the central limit theorem restores approximate validity.
5. Before: absence of multicollinearity. Now: multicollinearity.
   Consequence: the estimates cannot be computed at all (perfect multicollinearity), or the difficulties of section 3.7 arise (quasi-perfect multicollinearity).
   Alternative: re-specify the model.
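The Jarque-Bera statistic in point 4 combines the sample skewness S and kurtosis K of the residuals: JB = (n/6)·(S² + (K − 3)²/4). Under normality it is approximately chi-square with 2 degrees of freedom, so values above the 5% critical value of about 5.99 reject normality. A minimal sketch, with an illustrative (made-up) skewed residual series:

```python
import random

def jarque_bera(resid):
    """Jarque-Bera normality statistic; approx chi-square(2) under H0."""
    n = len(resid)
    mean = sum(resid) / n
    m2 = sum((e - mean) ** 2 for e in resid) / n
    m3 = sum((e - mean) ** 3 for e in resid) / n
    m4 = sum((e - mean) ** 4 for e in resid) / n
    skew = m3 / m2 ** 1.5
    kurt = m4 / m2 ** 2
    return n / 6.0 * (skew ** 2 + (kurt - 3.0) ** 2 / 4.0)

random.seed(0)
skewed = [random.expovariate(1.0) for _ in range(500)]  # strongly non-normal
jb = jarque_bera(skewed)
print(jb > 5.99)  # far above the 5% chi-square(2) critical value: reject normality
```

For residuals that actually come from a normal sample, JB stays small most of the time, which is why a large observed value is evidence against the normality assumption.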