Problems in Regression Analysis (Spring 02): Heteroscedasticity and Serial Correlation

Slide 1: Problems in Regression Analysis
- Heteroscedasticity: violation of the assumption that the variance of the errors is constant; typically a problem with cross-sectional data.
- Serial correlation: violation of the assumption that the error terms are uncorrelated; typically a problem with time-series data.

Slide 2: Heteroscedasticity
The OLS model assumes homoscedasticity, i.e., that the variance of the errors is constant. In some regressions, especially in cross-sectional studies, this assumption may be violated. When heteroscedasticity is present, OLS estimation puts more weight on the observations with large error variances than on those with small error variances. The OLS estimates remain unbiased, but they are inefficient: their variance is larger than the minimum attainable variance.

Slide 3: Tests of Heteroscedasticity
- Lagrange multiplier tests
- Goldfeld-Quandt test
- White's test
(A code sketch of White's test follows.)
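The course runs these tests in SPSS/EViews; purely as an illustration, the minimal sketch below applies White's test in Python with statsmodels to invented income/consumption data (the variable names and numbers are hypothetical, not the course data set).

```python
# Illustrative only: White's test for heteroscedasticity with statsmodels.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_white

rng = np.random.default_rng(0)
income = rng.uniform(10, 100, size=30)                          # hypothetical regressor X
consumption = 5 + 0.8 * income + rng.normal(0, 0.5 * income)    # error variance grows with X

X = sm.add_constant(income)
ols_fit = sm.OLS(consumption, X).fit()

# White's test regresses the squared residuals on the regressors and their squares.
lm_stat, lm_pvalue, f_stat, f_pvalue = het_white(ols_fit.resid, X)
print(f"White LM = {lm_stat:.2f}, p-value = {lm_pvalue:.4f}")
```

A low p-value is evidence against homoscedasticity.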

Slide 4: Goldfeld-Quandt Test
- Order the data by the magnitude of the independent variable X that is thought to be related to the error variance.
- Omit the middle d observations (d might be about 1/5 of the total sample size).
- Fit two separate regressions: one for the low values of X, another for the high values.
- Calculate the error sums of squares ESS1 and ESS2.
- Calculate the test statistic F = ESS2 / ESS1, which under homoscedasticity follows an F distribution with the residual degrees of freedom of the two sub-sample regressions.
(A code sketch of these steps follows.)
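A minimal sketch of these steps for a two-variable model, written in Python/statsmodels as an illustration (the slides use SPSS/EViews; the helper function name and the 1/5 drop fraction are just the slide's rule of thumb).

```python
# Sketch of the Goldfeld-Quandt steps from this slide (illustrative only).
import numpy as np
import statsmodels.api as sm
from scipy import stats

def goldfeld_quandt(y, x, drop_frac=0.2):
    """Order by x, drop the middle observations, and compare the two ESS values."""
    order = np.argsort(x)
    y, x = y[order], x[order]
    n = len(y)
    d = int(n * drop_frac)                  # middle observations to omit (~1/5 of n)
    split = (n - d) // 2                    # observations in each tail
    lo, hi = slice(0, split), slice(n - split, n)

    fit_lo = sm.OLS(y[lo], sm.add_constant(x[lo])).fit()
    fit_hi = sm.OLS(y[hi], sm.add_constant(x[hi])).fit()
    ess_1, ess_2 = fit_lo.ssr, fit_hi.ssr   # error (residual) sums of squares
    df = split - 2                          # n* - k for each sub-sample regression
    F = ess_2 / ess_1                       # large F suggests variance rising with x
    p_value = 1 - stats.f.cdf(F, df, df)
    return F, p_value

# Example call on hypothetical income/consumption arrays:
# F, p = goldfeld_quandt(consumption, income)
```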

Slide 5: Problem
Salvatore: data on income and consumption.

Slide 6: Problem (continued)

Slide 7: Problem
Regression on the whole sample, and separate regressions on the first twelve and last twelve observations (estimates shown on the slide).

Slide 8: To Correct for Heteroscedasticity
To correct for heteroscedasticity of the form Var(ε_i) = C·X_i², where C is a nonzero constant, transform the variables by dividing through by the problematic variable. In the two-variable case, Y_i/X_i = β1(1/X_i) + β2 + ε_i/X_i. The transformed error term is now homoscedastic, since Var(ε_i/X_i) = C·X_i²/X_i² = C.
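A sketch of this transformation in Python/statsmodels, under the assumed form Var(ε_i) = C·X_i²; the income/consumption numbers below are invented for illustration, and the weighted least squares call is simply the same correction expressed in one step.

```python
# Illustrative only: correcting Var(e_i) = C * X_i^2 by dividing through by X_i,
# which is equivalent to weighted least squares with weights 1/X_i^2.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
income = rng.uniform(10, 100, size=30)
consumption = 5 + 0.8 * income + rng.normal(0, 0.5 * income)

# Transformed regression: Y_i/X_i = b2 + b1*(1/X_i) + e_i/X_i
y_star = consumption / income
X_star = sm.add_constant(1.0 / income)      # the constant here estimates b2 (the original slope)
transformed = sm.OLS(y_star, X_star).fit()

# The same correction expressed as WLS on the untransformed variables
wls = sm.WLS(consumption, sm.add_constant(income), weights=1.0 / income**2).fit()
print(transformed.params, wls.params)
```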

Slide 9: Problem (continued)

Slide 10: Serial Correlation
This is the problem that arises in OLS estimation when the errors are not independent: the error term in one period is correlated with error terms in previous periods. If ε_t is correlated with ε_{t-1}, we say there is first-order serial correlation. Serial correlation may be positive, E(ε_t ε_{t-1}) > 0, or negative, E(ε_t ε_{t-1}) < 0.

Slide 11: Serial Correlation
If serial correlation is present, the OLS estimates are still unbiased and consistent, but the standard errors are biased, leading to incorrect statistical tests and biased confidence intervals. With positive serial correlation, the standard errors of the β-hats are biased downward, leading to inflated t statistics. With negative serial correlation, the standard errors of the β-hats are biased upward, leading to deflated t statistics.

Slide 12: Durbin-Watson Statistic
Decision regions for the Durbin-Watson statistic d:
- 0 to d_L: positive serial correlation
- d_L to d_U: inconclusive
- d_U to 4 − d_U: no serial correlation
- 4 − d_U to 4 − d_L: inconclusive
- 4 − d_L to 4: negative serial correlation
(A code sketch of the statistic follows.)
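As an illustration, the sketch below computes d directly from its definition, d = Σ(e_t − e_{t-1})² / Σ e_t², and checks it against statsmodels' durbin_watson. The profits/sales series is simulated with AR(1) errors, not Data 9-4.

```python
# Illustrative only: Durbin-Watson statistic on an OLS fit with AR(1) disturbances.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(1)
n = 40
sales = np.linspace(100, 400, n)
e = np.zeros(n)
for t in range(1, n):                                  # AR(1) errors with rho = 0.6
    e[t] = 0.6 * e[t - 1] + rng.normal(0, 5)
profits = 10 + 0.08 * sales + e

fit = sm.OLS(profits, sm.add_constant(sales)).fit()
resid = fit.resid
d_manual = np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)
print(d_manual, durbin_watson(resid))                  # d near 2 suggests no serial correlation
```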

Slide 13: Problem
Data 9-4 shows corporate profits and sales, in billions of dollars, for the U.S. manufacturing sector beginning in 1974. Estimate the equation Profits = β1 + β2·Sales + ε and test for first-order serial correlation.

Slide 14: Problem
OLS estimate of Profits as a function of Sales (output shown on the slide).

Slide 15: Problem
Test for serial correlation (SPSS output shown on the slide).

Slide 16: Correcting for Serial Correlation
We assume first-order serial correlation, ε_t = ρ·ε_{t-1} + u_t, where u_t is normally distributed with zero mean and constant variance. Follow a Durbin procedure.

Slide 17: Correcting for Serial Correlation
Substituting the AR(1) error into the original model and quasi-differencing gives Y_t − ρ·Y_{t-1} = β1(1 − ρ) + β2(X_t − ρ·X_{t-1}) + u_t.

Slide 18: Correcting for Serial Correlation
Move the lagged dependent variable term to the right-hand side, Y_t = β1(1 − ρ) + ρ·Y_{t-1} + β2·X_t − ρβ2·X_{t-1} + u_t, and estimate the equation using OLS. The estimated coefficient on the lagged dependent variable is the estimate of ρ.

Slide 19: Correcting for Serial Correlation
Create new independent and dependent variables by the following process: Y*_t = Y_t − ρ-hat·Y_{t-1} and X*_t = X_t − ρ-hat·X_{t-1}. Then estimate the equation Y*_t = β1* + β2·X*_t + u_t.

Slide 20: Correcting for Serial Correlation
The estimates of the slope coefficients from the transformed equation are the same as in the original equation, but corrected for serial correlation. The constant of the regression on the transformed variables is β1* = β1(1 − ρ), so the original intercept is recovered as β1 = β1*/(1 − ρ-hat). A code sketch of the full two-step procedure follows.
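A minimal sketch of slides 16-20 in Python/statsmodels, assuming the two-variable profits/sales model; the data are simulated rather than the course data set, and statsmodels' GLSAR offers a related iterative alternative to this two-step approach.

```python
# Illustrative only: two-step Durbin procedure for first-order serial correlation.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 40
sales = np.linspace(100, 400, n)
e = np.zeros(n)
for t in range(1, n):                                  # AR(1) disturbances
    e[t] = 0.6 * e[t - 1] + rng.normal(0, 5)
profits = 10 + 0.08 * sales + e

# Step 1: regress y_t on y_{t-1}, x_t, x_{t-1}; the coefficient on y_{t-1} estimates rho.
step1_X = sm.add_constant(np.column_stack([profits[:-1], sales[1:], sales[:-1]]))
rho_hat = sm.OLS(profits[1:], step1_X).fit().params[1]

# Step 2: quasi-difference (starred variables) and re-estimate by OLS.
y_star = profits[1:] - rho_hat * profits[:-1]
x_star = sales[1:] - rho_hat * sales[:-1]
step2 = sm.OLS(y_star, sm.add_constant(x_star)).fit()

b2_hat = step2.params[1]                               # slope, corrected for serial correlation
b1_hat = step2.params[0] / (1 - rho_hat)               # original intercept = b1*/(1 - rho-hat)
print(rho_hat, b1_hat, b2_hat)
```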

Slide 21: Problem
Begin by regressing Profits on Profits lagged one period, Sales, and Sales lagged one period. The estimated coefficient on the lagged dependent variable is the estimate of ρ.

Slide 22: Problem
ρ-hat = 0.49.

Slide 23: Problem
Then generate the transformed (starred) variables and run the regression on them: Profits* = β1* + β2·Sales* (estimated coefficients shown on the slide). Converting the intercept back gives Profits = β1 + β2·Sales, now with no serial correlation.