Regression Analysis.

Introduction
Derive the α and β coefficients
Assess the use of the t-statistic
Discuss the importance of the Gauss-Markov assumptions
Describe the problems associated with autocorrelation, how to detect it and possible remedies
Introduce the problem of heteroskedasticity

Values and Fitted Values

Deriving the α and β The aim of a least squares regression is to minimize the sum of the squared residuals (e), the vertical distances between the observed values and the fitted regression line.

The Constant The intercept is estimated as α̂ = ȳ − β̂x̄.

The Slope Coefficient (β) The slope is estimated as β̂ = Σ(xt − x̄)(yt − ȳ) / Σ(xt − x̄)².
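The two estimators above can be computed directly from the data. A minimal pure-Python sketch (the data are made up for illustration, chosen to lie close to y = 2x):

```python
# OLS estimates for y_t = alpha + beta*x_t + e_t:
#   beta_hat  = sum((x - xbar)(y - ybar)) / sum((x - xbar)^2)
#   alpha_hat = ybar - beta_hat * xbar

def ols(x, y):
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    beta = sxy / sxx
    alpha = ybar - beta * xbar
    return alpha, beta

x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
alpha, beta = ols(x, y)   # beta close to 2, alpha close to 0
```

This is the same calculation a regression package performs for the two-variable model; the closed-form solution exists because the minimisation problem has a single minimum.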

T-test When conducting a t-test, we can use either a one- or two-tailed test, depending on the hypothesis. We usually use a two-tailed test, in which case our alternative hypothesis is that the coefficient does not equal 0; in a one-tailed test we would stipulate whether it was greater than or less than 0. Thus the critical value for a two-tailed test at the 5% level of significance is the same as the critical value for a one-tailed test at the 2.5% level of significance.

T-test We can also test whether our coefficient equals 1.
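Testing H0: β = 1 uses the same t-ratio as testing β = 0, just with 1 subtracted in the numerator. A sketch with hypothetical data (the standard error formula is the usual one for the two-variable model, s² = Σe²/(n−2)):

```python
import math

# t-test of H0: beta = b0 against a two-sided alternative.
# se(beta_hat) = s / sqrt(sum((x - xbar)^2)),  s^2 = sum(e^2) / (n - 2)

def t_stat(x, y, b0):
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    beta = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    alpha = ybar - beta * xbar
    resid = [yi - alpha - beta * xi for xi, yi in zip(x, y)]
    s2 = sum(e ** 2 for e in resid) / (n - 2)
    se = math.sqrt(s2 / sxx)
    return (beta - b0) / se

x = [1, 2, 3, 4, 5, 6]
y = [1.2, 2.1, 2.9, 4.2, 5.1, 5.9]
t = t_stat(x, y, 1.0)
# |t| is about 1.06 here, below the 5% two-tailed critical value of
# 2.776 for n - 2 = 4 degrees of freedom, so we fail to reject beta = 1.
```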

Gauss-Markov Assumptions There are four assumptions relating to the error term: the expected value of the error term is zero; the error terms are not correlated with each other; the error term has a constant variance; and the error term and the explanatory variable are not correlated.

Gauss-Markov assumptions More formally we can write them as:
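The formulas themselves did not survive the transcript; the standard statements of the four assumptions, in the notation used later in these slides, are:

```latex
\begin{aligned}
&\text{1. } E(u_t) = 0 \\
&\text{2. } \operatorname{Cov}(u_t, u_s) = 0 \quad \text{for all } t \neq s \\
&\text{3. } \operatorname{Var}(u_t) = \sigma^2 \\
&\text{4. } \operatorname{Cov}(u_t, x_t) = 0
\end{aligned}
```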

Additional Assumptions There are a number of additional assumptions, such as normality of the error term and n (the number of observations) exceeding k (the number of parameters). If the Gauss-Markov assumptions hold, we say the estimator is BLUE.

BLUE Best, or minimum variance among linear unbiased estimators; Linear, or a linear function of the observations; Unbiased, or accurate on average over a large number of samples; Estimator.

Consequences of BLUE If the estimator is not BLUE, there are serious implications for the regression; in particular, we cannot rely on the t-tests. In this case we need to find a remedy for the problem.

Autocorrelation Autocorrelation occurs when the second Gauss-Markov assumption fails. It is often caused by an omitted variable. In the presence of autocorrelation the estimator is no longer Best, although it is still unbiased; therefore the estimator is not BLUE.

Durbin-Watson Test This tests for first-order autocorrelation only, in which case the autocorrelation follows the first-order autoregressive process.
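The first-order autoregressive process referred to here, and the DW statistic computed from the OLS residuals, are (reconstructed, since the slide's formulas were lost in transcription):

```latex
u_t = \rho u_{t-1} + \varepsilon_t, \qquad
DW = \frac{\sum_{t=2}^{T} (\hat{e}_t - \hat{e}_{t-1})^2}{\sum_{t=1}^{T} \hat{e}_t^{\,2}} \approx 2(1 - \hat{\rho})
```

The approximation 2(1 − ρ̂) explains the decision regions on the next slide: ρ̂ near +1 pushes DW towards 0, ρ̂ near −1 pushes it towards 4, and ρ̂ = 0 gives DW ≈ 2.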

Durbin-Watson Test - decision framework
0 to dl: positive autocorrelation
dl to du: zone of indecision
du to 4-du: no autocorrelation
4-du to 4-dl: zone of indecision
4-dl to 4: negative autocorrelation

DW Statistic The DW test statistic lies between 0 and 4. If it lies below the dl point, we have positive autocorrelation; if it lies between du and 4-du, we have no autocorrelation; and if it lies above 4-dl, we have negative autocorrelation. The dl and du values can be found in the DW d-statistic tables (at the back of most textbooks).
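Given the residuals from a fitted regression, the statistic is a one-line ratio. A sketch with hypothetical residuals (a smoothly declining sequence, i.e. strong positive autocorrelation):

```python
# Durbin-Watson statistic from a list of regression residuals:
#   d = sum((e_t - e_{t-1})^2) / sum(e_t^2), which lies between 0 and 4.

def durbin_watson(resid):
    num = sum((resid[t] - resid[t - 1]) ** 2 for t in range(1, len(resid)))
    den = sum(e ** 2 for e in resid)
    return num / den

# Strongly positively autocorrelated residuals push d towards 0:
e = [1.0, 0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3]
d = durbin_watson(e)   # well below 2 here
```

In practice d would then be compared with the tabulated dl and du bounds for the given sample size and number of regressors.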

Lagrange Multiplier (LM) Statistic Tests for higher-order autocorrelation. The test involves estimating the model and obtaining the residuals, then running a second regression of the residuals on lags of themselves and the explanatory variable (the number of lags depends on the order of the autocorrelation being tested for, e.g. two lags for second order).

LM Test The test statistic is the number of observations multiplied by the R-squared statistic from this second regression. It follows a chi-squared distribution with degrees of freedom equal to the order of autocorrelation tested for (2 in this case). The null hypothesis is no autocorrelation; if the test statistic exceeds the critical value, we reject the null and conclude that there is autocorrelation.
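The two steps above (auxiliary regression, then n·R²) can be sketched in pure Python. This is a simplified version of the Breusch-Godfrey LM test for second-order autocorrelation; the data and the small normal-equations solver are illustrative, not a production implementation:

```python
# Auxiliary regression: e_t on a constant, x_t, e_{t-1}, e_{t-2};
# LM = n * R^2, chi-squared(2) under the null of no autocorrelation.

def solve(a, b):
    """Solve the k x k system a @ coeffs = b by Gauss-Jordan elimination."""
    k = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(k):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [v - f * w for v, w in zip(m[r], m[col])]
    return [m[i][k] / m[i][i] for i in range(k)]

def lm_stat(x, resid):
    # Build the auxiliary regressors, dropping the first two observations
    # (the lags are undefined there).
    rows = [[1.0, x[t], resid[t - 1], resid[t - 2]] for t in range(2, len(resid))]
    y = resid[2:]
    k = 4
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * yt for r, yt in zip(rows, y)) for i in range(k)]
    b = solve(xtx, xty)
    fit = [sum(bi * ri for bi, ri in zip(b, r)) for r in rows]
    ybar = sum(y) / len(y)
    sst = sum((yt - ybar) ** 2 for yt in y)
    ssr = sum((yt - ft) ** 2 for yt, ft in zip(y, fit))
    r2 = 1 - ssr / sst
    return len(y) * r2

x = [float(i) for i in range(1, 11)]
e = [0.5, -0.4, 0.3, -0.35, 0.45, -0.3, 0.25, -0.2, 0.35, -0.25]
lm = lm_stat(x, e)   # compare with 5.99, the 5% chi-squared(2) critical value
```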

Remedies for Autocorrelation There are two main remedies: the Cochrane-Orcutt iterative process, and an unrestricted version of the same process.

Heteroskedasticity This occurs when the variance of the error term is not constant. Again the estimator is not BLUE: although it is still unbiased, it is no longer Best. It often occurs when the values of the variables vary substantially across observations, e.g. GDP in Cuba and the USA.
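A simple way to see this in residuals, and the idea behind the Goldfeld-Quandt test (which the slide does not name, so this is a supplementary illustration): order the observations by the size of the explanatory variable, split the residuals into two halves, and compare their variances. The residuals below are made up, with spread growing across the sample:

```python
# Compare residual variances between the low-x and high-x halves of the
# sample; a ratio far from 1 suggests heteroskedasticity.

def variance_ratio(resid):
    half = len(resid) // 2
    low, high = resid[:half], resid[-half:]

    def var0(e):
        # variance around zero, since OLS residuals have mean zero
        return sum(v ** 2 for v in e) / len(e)

    return var0(high) / var0(low)

e = [0.1, -0.2, 0.1, -0.1, 0.9, -1.1, 1.3, -0.8]   # spread grows with x
ratio = variance_ratio(e)   # well above 1 here
```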

Conclusion The residual or error term is the difference between the fitted value and the actual value of the dependent variable. There are four Gauss-Markov assumptions, which must be satisfied if the estimator is to be BLUE. Autocorrelation is a serious problem and needs to be remedied. The DW statistic can be used to test for the presence of first-order autocorrelation, and the LM statistic for higher-order autocorrelation.