Fundamentals of regression analysis 2
Obid A.Khakimov
OLS Estimation: Heteroscedasticity
Under heteroscedasticity the error variance differs across observations, Var(u_i) = σ_i². If the variance of the residuals is constant (σ_i² = σ² for all i), the variance expression collapses to the original OLS variance formula.
Consequences:
The regression coefficients remain unbiased.
The usual formula for the coefficient variances is wrong.
The OLS estimator is still linear and unbiased, but no longer BLUE: it is inefficient.
The t-tests and F-tests are not valid.
Method of Generalized Least Squares
GLS transforms the original model so that the transformed errors have constant variance (for example, by dividing each observation by σ_i); OLS applied to the transformed model is then BLUE.
Heteroscedasticity: Detection
Graphical method
Park test
Goldfeld-Quandt test
White's general heteroscedasticity test
Breusch-Pagan-Godfrey test
Park test
Park suggests that the error variance is a function of the regressor: σ_i² = σ² X_i^β e^(v_i), i.e. ln σ_i² = ln σ² + β ln X_i + v_i. Since σ_i² is not known, we use the squared OLS residuals û_i² in its place and estimate ln û_i² = ln σ² + β ln X_i + v_i. If the coefficient β is statistically different from zero, heteroscedasticity is present.
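A minimal numpy sketch of the Park test described above, on simulated data where the error standard deviation grows with X (the data-generating numbers are illustrative assumptions, not from the slides):

```python
import numpy as np

def park_test(x, resid):
    """Park test: regress ln(u^2) on ln(x); a slope significantly
    different from zero suggests heteroscedasticity."""
    ly = np.log(resid ** 2)
    Z = np.column_stack([np.ones_like(x), np.log(x)])
    coef, _, _, _ = np.linalg.lstsq(Z, ly, rcond=None)
    fitted = Z @ coef
    s2 = np.sum((ly - fitted) ** 2) / (len(x) - 2)
    se = np.sqrt(s2 * np.linalg.inv(Z.T @ Z)[1, 1])
    return coef[1], coef[1] / se          # (beta_hat, t statistic)

# Simulated data: error s.d. proportional to x, so Var(u_i) = x_i^2
rng = np.random.default_rng(0)
n = 500
x = rng.uniform(1.0, 10.0, n)
y = 2.0 + 3.0 * x + rng.normal(0.0, x)
X = np.column_stack([np.ones(n), x])
b, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
beta, t = park_test(x, y - X @ b)
print(round(beta, 2), round(t, 2))
```

Since ln σ_i² = 2 ln x_i here, the estimated slope should be near 2 with a large t statistic.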
Goldfeld-Quandt Test
1. Order the observations by the values of X, from lowest to highest. Omit the c central observations and divide the remaining sample into two sub-samples of (n − c)/2 observations each.
2. Run separate regressions on the two sub-samples and obtain RSS1 and RSS2, where RSS1 comes from the sub-sample with the small X values.
3. Each RSS has (n − c)/2 − k degrees of freedom, where k is the number of estimated parameters. Compute λ = (RSS2/df) / (RSS1/df), which follows the F distribution with numerator and denominator degrees of freedom both equal to (n − c)/2 − k.
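The Goldfeld-Quandt steps above can be sketched in numpy as follows (the sample size and the 20% central-drop fraction are illustrative assumptions):

```python
import numpy as np

def rss(X, y):
    """Residual sum of squares of an OLS fit."""
    b, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ b
    return float(r @ r)

def goldfeld_quandt(x, y, drop_frac=0.2):
    """Sort by x, drop the central observations, run two separate
    regressions, and compare the tails' RSS via an F ratio."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    n = len(x)
    c = int(n * drop_frac)              # central observations omitted
    m = (n - c) // 2                    # size of each sub-sample
    X1 = np.column_stack([np.ones(m), xs[:m]])
    X2 = np.column_stack([np.ones(m), xs[-m:]])
    rss1, rss2 = rss(X1, ys[:m]), rss(X2, ys[-m:])
    df = m - 2                          # k = 2 parameters per regression
    return (rss2 / df) / (rss1 / df), df

rng = np.random.default_rng(1)
n = 400
x = rng.uniform(1.0, 10.0, n)
y = 2.0 + 3.0 * x + rng.normal(0.0, x)  # heteroscedastic errors
F, df = goldfeld_quandt(x, y)
print(round(F, 2), df)
```

With the error variance increasing in x, RSS2 should be much larger than RSS1, giving an F ratio well above 1.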
Breusch-Pagan-Godfrey Test
Step 1. Estimate the original regression model and obtain the residuals û_i.
Step 2. Obtain the ML estimate of the error variance, σ̃² = Σ û_i² / n.
Step 3. Construct the scaled squared residuals p_i = û_i² / σ̃².
Step 4. Regress p_i on the Z variables (the variables thought to drive the heteroscedasticity): p_i = α₁ + α₂Z₂ᵢ + … + vᵢ.
Step 5. Obtain Θ = ESS/2, which under the null follows the χ² distribution with m − 1 degrees of freedom, where m is the number of parameters of the Step 4 regression.
If you reject the null hypothesis of homoscedasticity, heteroscedasticity is present.
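The BPG steps can be sketched in numpy, using the original regressors as the Z variables (the simulated data are an illustrative assumption):

```python
import numpy as np

def breusch_pagan_godfrey(Z, resid):
    """BPG test: scale squared residuals by their ML variance estimate,
    regress them on Z, and take ESS/2 as the chi-square statistic."""
    n = len(resid)
    p = resid ** 2 / (np.sum(resid ** 2) / n)   # p_i = u_i^2 / sigma_tilde^2
    coef, _, _, _ = np.linalg.lstsq(Z, p, rcond=None)
    fitted = Z @ coef
    ess = np.sum((fitted - p.mean()) ** 2)      # explained sum of squares
    return ess / 2.0                            # ~ chi2(m - 1) under H0

rng = np.random.default_rng(2)
n = 500
x = rng.uniform(1.0, 10.0, n)
X = np.column_stack([np.ones(n), x])
y = 2.0 + 3.0 * x + rng.normal(0.0, x)          # heteroscedastic errors
b, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
theta = breusch_pagan_godfrey(X, y - X @ b)
print(round(theta, 1))
```

Here m = 2, so Θ is compared with the χ²(1) critical value (3.84 at the 5% level); with heteroscedasticity this strong, Θ comes out far above it.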
White's General Heteroscedasticity Test
Step 1. Estimate the original regression model and obtain the residuals û_i.
Step 2. Run the auxiliary regression of û_i² on the original regressors, their squares, and their cross products, and obtain its R². Under the null of homoscedasticity, n·R² follows the χ² distribution with degrees of freedom equal to the number of regressors in the auxiliary regression (excluding the constant).
If you reject the null hypothesis, heteroscedasticity is present.
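For a single-regressor model the auxiliary regression has only x and x² (no cross products), so the statistic is χ²(2). A numpy sketch on simulated heteroscedastic data (the numbers are illustrative assumptions):

```python
import numpy as np

def white_test(x, resid):
    """White's test with a single regressor: regress u^2 on
    [1, x, x^2]; n * R^2 ~ chi2(2) under homoscedasticity."""
    n = len(resid)
    u2 = resid ** 2
    Z = np.column_stack([np.ones(n), x, x ** 2])
    coef, _, _, _ = np.linalg.lstsq(Z, u2, rcond=None)
    fitted = Z @ coef
    r2 = 1.0 - np.sum((u2 - fitted) ** 2) / np.sum((u2 - u2.mean()) ** 2)
    return n * r2

rng = np.random.default_rng(3)
n = 500
x = rng.uniform(1.0, 10.0, n)
X = np.column_stack([np.ones(n), x])
y = 2.0 + 3.0 * x + rng.normal(0.0, x)   # error s.d. grows with x
b, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
stat = white_test(x, y - X @ b)
print(round(stat, 1))   # compare with chi2(2) 5% critical value, 5.99
```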
Remedial Measures
Weighted least squares.
White's heteroscedasticity-consistent variances and standard errors.
Transformations according to the heteroscedasticity pattern.
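When the heteroscedasticity pattern is known, weighted least squares amounts to dividing every term by σ_i. A numpy sketch under the assumed pattern σ_i = σ·x_i (the data-generating values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1000
x = rng.uniform(1.0, 10.0, n)
y = 2.0 + 3.0 * x + rng.normal(0.0, x)   # error s.d. proportional to x

# WLS under sigma_i = sigma * x_i: divide the whole equation by x_i,
# turning it into (y/x) = b1*(1/x) + b2 + (u/x), whose error u/x has
# constant variance, so plain OLS on the transformed data is BLUE.
Xw = np.column_stack([1.0 / x, np.ones(n)])
yw = y / x
coef, _, _, _ = np.linalg.lstsq(Xw, yw, rcond=None)
b1, b2 = coef            # intercept and slope of the ORIGINAL model
print(round(b1, 2), round(b2, 2))
```

The recovered coefficients should be close to the true values 2 and 3; note that after the transformation the roles of intercept and slope are swapped, which is why the coefficient on 1/x is the original intercept.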
LM score test
Partition the regressors as X = [X₁, X₂] and assume the null hypothesis is that X₂ does not enter the model. Regress each element of X₂ on all elements of X₁ and collect the residuals in a matrix R. Form the products u·R of the restricted-model residuals u with R, then run the regression of a vector of ones on u·R to obtain the LM statistic.
Autocorrelation reasons:
Inertia.
Specification bias: omitted relevant variables.
Specification bias: incorrect functional form.
Cobweb phenomenon.
Lags.
Data manipulation.
Data transformation.
Non-stationarity.
Consequences:
The regression coefficients remain unbiased.
The usual formula for the coefficient variances is wrong.
The OLS estimator is still linear and unbiased, but no longer BLUE: it is inefficient.
The t-tests and F-tests are not valid.
Detection: Breusch-Godfrey Test
Null hypothesis: there is no serial correlation up to order p. Regress the OLS residuals on the original regressors and p lagged residuals, and obtain the R² of this auxiliary regression. The test statistic (n − p)·R² follows the χ² distribution with p degrees of freedom, where n is the number of observations and p the number of residual lag variables.
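A numpy sketch of the Breusch-Godfrey test, on simulated data with AR(1) errors (ρ = 0.7 and the sample size are illustrative assumptions):

```python
import numpy as np

def breusch_godfrey(X, resid, p=1):
    """BG test: regress residuals on the original regressors plus p
    lagged residuals; (n - p) * R^2 ~ chi2(p) under no serial corr."""
    n = len(resid)
    lags = np.column_stack([np.concatenate([np.zeros(i + 1), resid[:n - i - 1]])
                            for i in range(p)])   # pre-sample lags set to 0
    Z = np.column_stack([X, lags])
    coef, _, _, _ = np.linalg.lstsq(Z, resid, rcond=None)
    fitted = Z @ coef
    r2 = 1.0 - np.sum((resid - fitted) ** 2) / np.sum((resid - resid.mean()) ** 2)
    return (n - p) * r2

# Simulate AR(1) errors: u_t = 0.7 * u_{t-1} + e_t
rng = np.random.default_rng(5)
n = 400
x = rng.normal(size=n)
e = rng.normal(size=n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = 0.7 * u[t - 1] + e[t]
y = 1.0 + 2.0 * x + u
X = np.column_stack([np.ones(n), x])
b, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
stat = breusch_godfrey(X, y - X @ b, p=1)
print(round(stat, 1))   # compare with chi2(1) 5% critical value, 3.84
```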
Generalized Least Squares
If the value of ρ is known, apply GLS directly by quasi-differencing the data (e.g. y_t − ρy_{t−1}). If the value of ρ is not known, it must first be estimated, for example by the Cochrane-Orcutt procedure.
Cochrane-Orcutt procedure:
First estimate the original regression and obtain the residuals.
After running the AR(1) regression of the residuals on their first lag, obtain the value of ρ and run the GLS regression.
Using the GLS coefficients, obtain new residuals and a new value of ρ.
Continue the process until the coefficients converge.
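The iteration above can be sketched in numpy; quasi-differencing the constant column of X turns it into (1 − ρ), so the coefficient on it is still the original intercept (the simulated data are illustrative assumptions):

```python
import numpy as np

def ols(X, y):
    b, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    return b

def cochrane_orcutt(X, y, tol=1e-6, max_iter=50):
    """Iterate: estimate rho from an AR(1) regression on the residuals,
    quasi-difference the data, re-estimate, until coefficients converge."""
    b = ols(X, y)                                 # initial OLS fit
    rho = 0.0
    for _ in range(max_iter):
        u = y - X @ b
        rho = (u[1:] @ u[:-1]) / (u[:-1] @ u[:-1])  # AR(1) slope
        ys = y[1:] - rho * y[:-1]                   # quasi-differenced data
        Xs = X[1:] - rho * X[:-1]
        b_new = ols(Xs, ys)
        if np.max(np.abs(b_new - b)) < tol:
            return b_new, rho
        b = b_new
    return b, rho

# AR(1) errors with true rho = 0.7, true slope = 2.0
rng = np.random.default_rng(6)
n = 400
x = rng.normal(size=n)
e = rng.normal(size=n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = 0.7 * u[t - 1] + e[t]
y = 1.0 + 2.0 * x + u
X = np.column_stack([np.ones(n), x])
b, rho = cochrane_orcutt(X, y)
print(round(rho, 2), round(b[1], 2))
```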
Endogeneity 1. Omission of relevant variables
What to do?
If the omitted variable is uncorrelated with the included independent variables, the OLS estimator is still BLUE.
Otherwise, use proxy variables, or use estimation methods other than OLS.
Measurement error
Measurement error: independent variable
Endogenous regressors and bias
Bias: single-equation (OLS) estimators will be biased if one or more regressors is endogenous (jointly dependent).
Consistency: consistent estimates can be obtained by Indirect Least Squares, Instrumental Variables, or Two-Stage Least Squares.
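A minimal numpy sketch of Two-Stage Least Squares on simulated data, contrasting the biased OLS slope with the IV estimate (the instrument z and all parameter values are illustrative assumptions):

```python
import numpy as np

def two_stage_least_squares(y, X, Z):
    """2SLS: stage 1 projects the regressors onto the instrument space;
    stage 2 runs OLS of y on the fitted (exogenous) regressors."""
    Pi, _, _, _ = np.linalg.lstsq(Z, X, rcond=None)
    X_hat = Z @ Pi                            # stage 1 fitted values
    b, _, _, _ = np.linalg.lstsq(X_hat, y, rcond=None)
    return b                                  # stage 2 coefficients

# Simulated endogeneity: x is correlated with the error u through v,
# while the instrument z is relevant for x but independent of u.
rng = np.random.default_rng(7)
n = 2000
z = rng.normal(size=n)
v = rng.normal(size=n)
u = 0.8 * v + rng.normal(size=n)   # structural error
x = z + v                          # endogenous regressor: corr(x, u) != 0
y = 1.0 + 2.0 * x + u              # true slope = 2.0
X = np.column_stack([np.ones(n), x])
Z = np.column_stack([np.ones(n), z])
b_ols, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
b_iv = two_stage_least_squares(y, X, Z)
print(round(b_ols[1], 2), round(b_iv[1], 2))
```

OLS is biased upward here (plim ≈ 2.4, since cov(x, u)/var(x) = 0.4), while 2SLS recovers the true slope of 2.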