Financial Econometrics Fin. 505

Chapter 10: HETEROSCEDASTICITY: WHAT HAPPENS IF THE ERROR VARIANCE IS NONCONSTANT?

I. The Nature of Homoscedasticity
One of the important assumptions of the classical linear regression model (CLRM) is that the variance of each disturbance term uᵢ, conditional on the chosen values of the explanatory variables, is some constant number equal to σ². This is the assumption of homoscedasticity, or equal (homo) spread (scedasticity), that is, equal variance.

However, the assumption of homoscedasticity may not always hold. When it does not, we have heteroscedasticity.

Example: Assume that in the two-variable model, Y represents savings and X represents income: Yᵢ = β₁ + β₂Xᵢ + uᵢ. Figures 11.1 and 11.2 show that as income increases, savings on average also increase. But in Figure 11.1 the variance of savings remains the same at all levels of income, whereas in Figure 11.2 it increases with income. In Figure 11.2 the higher-income families on average save more than the lower-income families, but there is also more variability in their savings.
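The savings example can be illustrated with simulated data. This is a sketch only: the parameter values (intercept 5, slope 0.3, and the error scale 0.05·income) are made up for illustration and do not come from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Hypothetical savings model: savings = 5 + 0.3 * income + u,
# where the disturbance's standard deviation grows with income
# (the Figure 11.2 situation).
income = rng.uniform(10, 100, n)
u = rng.normal(0, 0.05 * income)            # Var(u_i) increases with income
savings = 5.0 + 0.3 * income + u

# Compare the spread of the disturbances in the bottom and top thirds
# of the income distribution.
order = np.argsort(income)
low = u[order[: n // 3]]                    # lowest-income third
high = u[order[-(n // 3):]]                 # highest-income third
print(low.var(), high.var())                # spread is larger at high incomes
```

The point of the two printed variances is simply that the scatter around the regression line is not constant: it widens as income rises, which is exactly what heteroscedasticity means.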

II. Consequences of Heteroscedasticity
Heteroscedasticity violates one of the CLRM assumptions, and when a CLRM assumption is violated the OLS estimators may no longer be BLUE. Specifically, in the presence of heteroscedasticity the OLS estimators are not efficient (they do not achieve the smallest variance). Accordingly, the estimated standard errors of the coefficients are biased, which makes hypothesis tests (t statistics) unreliable. The OLS estimates themselves, however, remain unbiased.

III. Detecting Heteroscedasticity

1) Examining the residuals in graph form: Although the squared residuals ûᵢ² are not the same thing as the squared disturbances uᵢ², they can be used as proxies, especially if the sample size is sufficiently large. An examination of the ûᵢ² may reveal patterns such as those shown in Figure 11.9. You may plot ûᵢ² against one of the explanatory variables Xᵢ.

According to Figure 11.9: a pattern such as that shown in Figure 11.9c, for instance, suggests that the variance of the disturbance term is linearly related to the X variable. Thus, if the regression of savings on income shows a pattern like Figure 11.9c, the heteroscedastic variance may be proportional to the value of the income variable. This knowledge can help us transform the data so that, in the regression on the transformed data, the variance of the disturbance is homoscedastic.
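To sketch such a transformation (assuming, as the Figure 11.9c pattern suggests, Var(uᵢ) = σ²Xᵢ), divide the whole equation by √Xᵢ: the transformed model Yᵢ/√Xᵢ = β₁(1/√Xᵢ) + β₂√Xᵢ + uᵢ/√Xᵢ has a disturbance uᵢ/√Xᵢ with constant variance σ². A minimal illustration on simulated data follows; the data-generating values are made up for this sketch.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Hypothetical data with Var(u_i) proportional to x_i (the Figure 11.9c case).
x = rng.uniform(1.0, 20.0, n)
y = 2.0 + 0.5 * x + rng.normal(0, np.sqrt(x))   # sd of u_i is sqrt(x_i)

def spread_ratio(y, X, x):
    """OLS residual variance in the top third of x over the bottom third."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    u = y - X @ beta
    order = np.argsort(x)
    m = len(x) // 3
    return u[order[-m:]].var() / u[order[:m]].var()

# Untransformed regression: residual spread grows with x.
before = spread_ratio(y, np.column_stack([np.ones(n), x]), x)

# Transformed regression: divide y and both regressors by sqrt(x).
w = np.sqrt(x)
after = spread_ratio(y / w, np.column_stack([1.0 / w, w]), x)

print(before, after)   # 'after' should be much closer to 1 than 'before'
```

The ratio near 1 after the transformation reflects that the transformed disturbance uᵢ/√Xᵢ has the same variance at every level of Xᵢ.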

2) The Breusch-Pagan test: This is the most common test for heteroscedasticity. It begins by allowing the heteroscedastic variance to be a function of one or more of the independent variables, and it is usually applied by assuming that the heteroscedasticity may be a linear function of all the independent variables in the model.

To illustrate this test, consider the k-variable linear regression model:
Yᵢ = β₁ + β₂X₂ᵢ + ··· + βₖXₖᵢ + uᵢ
Assume that the error variance σᵢ² is described as
σᵢ² = f(α₁ + α₂X₂ᵢ + ··· + αₖXₖᵢ)
Specifically, assume that
σᵢ² = α₁ + α₂X₂ᵢ + ··· + αₖXₖᵢ
That is, σᵢ² is a linear function of the X's. If α₂ = α₃ = ··· = αₖ = 0, then σᵢ² = α₁, which is a constant.

Step (1): Estimate Yᵢ = β₁ + β₂X₂ᵢ + ··· + βₖXₖᵢ + uᵢ by OLS and obtain the residuals û₁, û₂, ..., ûₙ.
Step (2): Obtain the error variance σ̃² = Σûᵢ²/n.
Step (3): Construct the variables pᵢ defined as pᵢ = ûᵢ²/σ̃², that is, each squared residual divided by σ̃².

Step (4): Regress pᵢ on the X's:
pᵢ = α₁ + α₂X₂ᵢ + ··· + αₖXₖᵢ + vᵢ
where vᵢ is the residual term of this regression.
Step (5): Obtain the ESS (explained sum of squares) from the previous equation and define Θ = (1/2)ESS. Assuming the uᵢ are normally distributed, one can show that if there is homoscedasticity and the sample size n increases indefinitely, then Θ ~asy χ²(k−1),

that is, Θ asymptotically follows the chi-square distribution with (k − 1) degrees of freedom. Therefore, if the computed Θ (= χ²) exceeds the critical χ² value at the chosen level of significance, one rejects the hypothesis of homoscedasticity; otherwise one does not reject it.
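The five steps above can be sketched in code. This is a minimal illustration on simulated data (not the textbook's dataset), using ordinary least squares via `numpy.linalg.lstsq`:

```python
import numpy as np

def breusch_pagan_theta(y, X):
    """Breusch-Pagan statistic Theta = ESS / 2, following Steps (1)-(5).
    X is the n-by-k design matrix with a first column of ones."""
    n = len(y)
    # Step (1): OLS of y on X; obtain the residuals u-hat.
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    u = y - X @ beta
    # Step (2): sigma-tilde^2 = sum(u-hat^2) / n.
    sigma2 = (u ** 2).sum() / n
    # Step (3): p_i = u-hat_i^2 / sigma-tilde^2.
    p = u ** 2 / sigma2
    # Step (4): regress p on the same X's.
    alpha, *_ = np.linalg.lstsq(X, p, rcond=None)
    p_hat = X @ alpha
    # Step (5): ESS of the auxiliary regression; Theta = ESS / 2,
    # to be compared with the chi-square(k - 1) critical value.
    ess = ((p_hat - p.mean()) ** 2).sum()
    return ess / 2

rng = np.random.default_rng(2)
n = 500
x = rng.uniform(1.0, 10.0, n)
X = np.column_stack([np.ones(n), x])

y_homo = 1.0 + 2.0 * x + rng.normal(0, 1, n)    # constant error variance
y_hetero = 1.0 + 2.0 * x + rng.normal(0, x)     # error sd grows with x

theta_homo = breusch_pagan_theta(y_homo, X)
theta_hetero = breusch_pagan_theta(y_hetero, X)
print(theta_homo, theta_hetero)   # typically only the second exceeds 3.84,
                                  # the 5% chi-square value with 1 df
```

Because the design matrix here has k = 2 columns, Θ is referred to the chi-square distribution with k − 1 = 1 degree of freedom, exactly as in the worked example below.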

Example of the Breusch-Pagan test: Regressing Y on X, we obtain the following.
Step (1): Ŷᵢ = 9.2903 + 0.6378Xᵢ, se = (5.2314) (0.0286), RSS = 2361.153, R² = 0.9466, n = 30.
Step (2): Calculate σ̃² = Σûᵢ²/30 = 2361.153/30 = 78.7051.
Step (3): Divide the squared residuals ûᵢ² obtained from the regression in Step (1) by 78.7051 to construct the variable pᵢ.

Step (4): Assuming that the pᵢ are linearly related to Xᵢ, we obtain the regression p̂ᵢ = −0.7426 + 0.0101Xᵢ, se = (0.7529) (0.0041), ESS = 10.4280, R² = 0.18.
Step (5): Θ = (1/2)ESS = 5.2140.

Under the assumptions of the BPG test, Θ asymptotically follows the chi-square distribution with 1 df (here k − 1 = 1). From the chi-square table we find that, for 1 df, the 5 percent critical χ² value is 3.8414 and the 1 percent critical χ² value is 6.6349. Thus, the observed chi-square value of 5.2140 is significant at the 5 percent level but not at the 1 percent level of significance.
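The decision in this example reduces to comparing Θ with the two critical values quoted in the text:

```python
theta = 5.2140                           # Theta = (1/2) ESS from Step (5)
crit_5pct, crit_1pct = 3.8414, 6.6349    # chi-square(1 df) critical values

reject_at_5pct = theta > crit_5pct       # reject homoscedasticity at 5%
reject_at_1pct = theta > crit_1pct       # but not at 1%
print(reject_at_5pct, reject_at_1pct)
```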

Important Remarks: Keep in mind that, strictly speaking, the BPG test is an asymptotic (large-sample) test, and in the present example 30 observations may not constitute a large sample. In small samples the test is also sensitive to the assumption that the disturbances uᵢ are normally distributed. A further weakness of the BP test is that it assumes the heteroskedasticity is a linear function of the independent variables.

Failing to find evidence of heteroskedasticity with the BP test therefore does not rule out a nonlinear relationship between the independent variable(s) and the error variance. Additionally, the BP test is not useful for determining how to correct or adjust the model for heteroskedasticity.

As we have seen, heteroscedasticity does not destroy the unbiasedness and consistency of the OLS estimators, but the estimators are no longer efficient, not even asymptotically (i.e., in large samples).