
1 Econometrics I Professor William Greene Stern School of Business
Department of Economics

2 Econometrics I Part 6 – Estimating the Variance of b


4 Econometric Dire Emergency

5 Context The true variance of b|X is σ²(X′X)⁻¹. We consider how to use the sample data to estimate this matrix. The ultimate objectives are to form interval estimates for regression slopes and to test hypotheses about them. Both require estimates of the variability of the distribution of b.

6 Estimating 2 Using the residuals instead of the disturbances:
The natural estimator: ee/N as a sample surrogate for /n Imperfect observation of i, ei = i - ( - b)xi Downward bias of ee/N. We obtain the result E[ee|X] = (N-K)2

7 Expectation of ee

8 Method 1:

9 Estimating σ² The unbiased estimator is s² = e′e/(N−K).
Dividing by N−K rather than N is the "degrees of freedom correction"; with it, s² = e′e/(N−K) is unbiased for σ².

10 Method 2: Some Matrix Algebra

11 Decomposing M

12 Example: Characteristic Roots of a Correlation Matrix


14 Gasoline Data

15 X’X and its Roots

16 Var[b|X] Estimating the Covariance Matrix for b|X
The true covariance matrix is 2 (X’X)-1 The natural estimator is s2(X’X)-1 “Standard errors” of the individual coefficients are the square roots of the diagonal elements.

17 X′X, (X′X)⁻¹ and s²(X′X)⁻¹

18 Standard Regression Results
Ordinary least squares regression with LHS = G: 36 observations, 7 parameters, 29 degrees of freedom. The standard error of e is sqrt[e′e/(36 − 7)]. Coefficients, standard errors, t-ratios, P[|T|>t] and the mean of X are reported for Constant, PG (***), Y (***), TREND (**), PNC, PUC and PPT (**), where the asterisks flag significant coefficients.

19 The Variance of OLS - Sandwiches

20 Robust Covariance Estimation
Not a structural estimator of XX/n If the condition is present, the estimator estimates the true variance of the OLS estimator If the condition is not present, the estimator estimates the same matrix that (2/n)(X’X/n)-1 estimates . Heteroscedasticity Autocorrelation Common effects

21 Heteroscedasticity Robust Covariance Matrix
Robust estimation: generality. How do we estimate Var[b|X] = σ²(X′X)⁻¹X′ΩX(X′X)⁻¹ for the LS b? The distinction is between estimating σ²Ω, an n by n matrix, and estimating the K×K matrix σ²X′ΩX = σ² Σi Σj ωij xi xj′. NOTE: the White estimator and Newey-West are vitally important results for modern applied econometrics.
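The White estimator named here replaces σ²X′ΩX with Σi ei² xi xi′. A minimal sketch, assuming X and e are the design matrix and OLS residuals from a fit like the one above (the function name is illustrative):

import numpy as np

def white_vcov(X, e):
    # (X'X)^-1 [ sum_i e_i^2 x_i x_i' ] (X'X)^-1
    XtX_inv = np.linalg.inv(X.T @ X)
    meat = (X * (e ** 2)[:, None]).T @ X
    return XtX_inv @ meat @ XtX_inv

# Usage with X and e from the earlier OLS sketch:
# se_white = np.sqrt(np.diag(white_vcov(X, e)))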


23 The White Estimator

24 Groupwise Heteroscedasticity
Regression of the log of per capita gasoline use on the log of per capita income, the gasoline price, and the number of cars per capita, for 18 OECD countries over 19 years. Countries are ordered by the standard deviation of their 19 residuals; the standard deviation varies by country. The "solution" is "weighted least squares."
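One common way to implement that "weighted least squares" solution is a two-step FGLS: estimate a residual standard deviation per country from first-round OLS, then weight each observation by its inverse. A hedged sketch, with hypothetical array names rather than the OECD data:

import numpy as np

def groupwise_wls(X, y, groups):
    # Step 1: OLS residuals; Step 2: sigma_g per group; Step 3: reweight rows by 1/sigma_g and refit
    b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ b_ols
    sigma = {g: e[groups == g].std() for g in np.unique(groups)}
    w = np.array([1.0 / sigma[g] for g in groups])
    return np.linalg.lstsq(X * w[:, None], y * w, rcond=None)[0]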

25 White Estimator
OLS coefficients, standard errors, t-ratios and P[|T|>t] for Constant, LINCOMEP, LRPMG and LCARPCAP, reported first with the conventional covariance matrix and then with the White heteroscedasticity robust covariance matrix.

26 Autocorrelated Residuals logG=β1 + β2logPg + β3logY + β4logPnc + β5logPuc + ε

27 The Newey-West Estimator Robust to Autocorrelation

28 Newey-West Estimate
OLS coefficients for Constant (***), LP, LY (***), LPNC (**) and LPUC, reported first with the conventional standard errors and then with the robust Newey-West covariance matrix computed with the chosen number of lag periods.
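A minimal sketch of a Newey-West calculation with Bartlett weights and a user-chosen lag length L (the function and array names are illustrative, not the output above):

import numpy as np

def newey_west_vcov(X, e, L):
    # (X'X)^-1 [ S0 + sum_{l=1..L} w_l (G_l + G_l') ] (X'X)^-1, Bartlett weights w_l = 1 - l/(L+1)
    S = (X * (e ** 2)[:, None]).T @ X                           # S0: the lag-0 (White) term
    for l in range(1, L + 1):
        w = 1.0 - l / (L + 1.0)
        G = (X[l:] * (e[l:] * e[:-l])[:, None]).T @ X[:-l]      # sum_t e_t e_{t-l} x_t x_{t-l}'
        S += w * (G + G.T)
    XtX_inv = np.linalg.inv(X.T @ X)
    return XtX_inv @ S @ XtX_inv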

29 Panel Data Presence of omitted effects
Potential bias/inconsistency of OLS – depends on the assumptions about unobserved c. Variance of OLS is affected by autocorrelation in most cases.

30 Estimating the Sampling Variance of b
s²(X′X)⁻¹? Inappropriate because of correlation across observations (certainly) and heteroscedasticity (possibly). A "robust" covariance matrix: robust estimation (in general), the White estimator, a robust estimator for OLS.

31 Cluster Robust Estimator
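A hedged sketch of a standard cluster-robust calculation of the kind named here, without the finite-sample correction factors that software packages often apply (the function and array names are illustrative):

import numpy as np

def cluster_vcov(X, e, clusters):
    # (X'X)^-1 [ sum_g (X_g' e_g)(X_g' e_g)' ] (X'X)^-1
    K = X.shape[1]
    meat = np.zeros((K, K))
    for g in np.unique(clusters):
        s_g = X[clusters == g].T @ e[clusters == g]   # within-cluster score sum X_g' e_g
        meat += np.outer(s_g, s_g)
    XtX_inv = np.linalg.inv(X.T @ X)
    return XtX_inv @ meat @ XtX_inv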



34 Alternative OLS Variance Estimators Cluster correction increases SEs
Least squares coefficients, standard errors, b/St.Er. and P[|Z|>z] for Constant, EXP, EXPSQ, OCC, SMSA, MS, FEM, UNION and ED, reported first with the conventional standard errors and then with the cluster robust ("Robust") standard errors.

35 Bootstrapping Some assumptions underlie it, chiefly the sampling mechanism.
Method:
1. Estimate using the full sample: --> b.
2. Repeat R times: draw N observations from the sample of N, with replacement, and estimate β with b(r).
3. Estimate the variance with V = (1/R) Σr [b(r) − b][b(r) − b]′.
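A minimal sketch of steps 1 to 3 for a linear model estimated by OLS (names are illustrative; the NLOGIT version appears on the next slide):

import numpy as np

def pairs_bootstrap_vcov(X, y, R=100, seed=0):
    # Full-sample b, then R resamples of N rows with replacement; V = (1/R) sum_r (b_r - b)(b_r - b)'
    rng = np.random.default_rng(seed)
    N, K = X.shape
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    V = np.zeros((K, K))
    for _ in range(R):
        idx = rng.integers(0, N, size=N)                          # draw N observations with replacement
        b_r = np.linalg.lstsq(X[idx], y[idx], rcond=None)[0]
        d = b_r - b
        V += np.outer(d, d)
    return V / R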

36 Bootstrap Application
matr;bboot=init(3,21,0.)$      Store results here
name;x=one,y,pg$               Define X
regr;lhs=g;rhs=x$              Compute b
calc;i=0$                      Counter
Proc                           Define procedure
regr;lhs=g;rhs=x;quietly$      … Regression
matr;{i=i+1};bboot(*,i)=b$     … Store b(r)
Endproc                        Ends procedure
exec;n=20;bootstrap=b$         20 bootstrap reps
matr;list;bboot'$              Display results

37 Results of Bootstrap Procedure
Full-sample OLS results for Constant (***), Y (***) and PG (***), followed by the bootstrap results. The model is reestimated in each bootstrap replication; the means shown are the means of the bootstrap estimates, the coefficients shown are the original estimates based on the full sample, and each bootstrap sample has 36 observations. The bootstrap coefficients are labeled B001, B002 and B003.

38 Bootstrap Replications
Full sample result Bootstrapped sample results

39 Results of C&R Bootstrap Estimation

40 Bootstrap variance for a panel data estimator
Panel Bootstrap = Block Bootstrap. The data set is N groups of size Ti. A bootstrap sample is N groups of size Ti drawn with replacement.
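A hedged sketch of the block bootstrap just described: whole groups, each with its Ti rows, are drawn with replacement before re-estimating (array names are illustrative):

import numpy as np

def block_bootstrap_vcov(X, y, groups, R=100, seed=0):
    # Resample N whole groups with replacement, refit, and average outer products around the full-sample b
    rng = np.random.default_rng(seed)
    ids = np.unique(groups)
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    V = np.zeros((X.shape[1], X.shape[1]))
    for _ in range(R):
        draw = rng.choice(ids, size=ids.size, replace=True)
        rows = np.concatenate([np.flatnonzero(groups == g) for g in draw])
        b_r = np.linalg.lstsq(X[rows], y[rows], rcond=None)[0]
        d = b_r - b
        V += np.outer(d, d)
    return V / R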


42 Quantile Regression: Application of Bootstrap Estimation

43 OLS vs. Least Absolute Deviations
Least absolute deviations estimator: coefficients for Constant (***), Y (***) and PG (***), with the covariance matrix based on 50 bootstrap replications, along with the residual sum of squares, standard error of e and R-squared. Ordinary least squares regression: coefficients for the same variables, with standard errors also based on bootstrap replications, and the corresponding sum of squares, standard error of e and R-squared.

44 Quantile Regression Q(y|x,) = x,  = quantile
Estimated by linear programming Q(y|x,.50) = x, .50  median regression Median regression estimated by LAD (estimates same parameters as mean regression if symmetric conditional distribution) Why use quantile (median) regression? Semiparametric Robust to some extensions (heteroscedasticity?) Complete characterization of conditional distribution
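A hedged sketch comparing mean, median and upper-quartile fits, using the QuantReg class in statsmodels as one possible implementation (the simulated data and names are illustrative; the slides' own LAD results use a linear-programming routine with bootstrapped standard errors):

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
x = rng.normal(size=200)
y = 1.0 + 0.5 * x + rng.standard_t(df=3, size=200)   # heavy-tailed, symmetric errors
X = sm.add_constant(x)

mean_fit = sm.OLS(y, X).fit()                        # mean (OLS) regression
median_fit = sm.QuantReg(y, X).fit(q=0.5)            # median regression (LAD)
upper_fit = sm.QuantReg(y, X).fit(q=0.75)            # upper-quartile regression
print(mean_fit.params, median_fit.params, upper_fit.params)

With symmetric errors the mean and median fits estimate the same slope, as the slide notes.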

45 Estimated Variance for Quantile Regression
Asymptotic Theory Bootstrap – an ideal application


47  = .25  = .50  = .75



