
1 Autoregressive models Another useful model is the autoregressive model. Frequently, we find that the values of a series of financial data at particular points in time are highly correlated with the values that precede and succeed them.

2 Models with a lagged variable The dependent variable is a function of itself at a previous moment or period of time. Creating an autoregressive model generates a new predictor variable by using the Y variable lagged one or more periods.

3 The most common form of the equation is linear:

y_t = b_0 + b_1 y_{t-1} + b_2 y_{t-2} + ... + b_p y_{t-p} + e_t

where: y_t – the value of the dependent variable at time t, y_{t-i} (i = 1, 2, ..., p) – the value of the dependent variable at time t-i, b_0, b_i (i = 1, ..., p) – the regression coefficients, p – the order of the autoregression, e_t – the disturbance term.
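The linear form above can be fitted by ordinary least squares once the lagged columns are built. A minimal sketch in Python (numpy only; the series and the order p are made-up example values):

```python
import numpy as np

def fit_ar(y, p):
    """Fit y_t = b0 + b1*y_{t-1} + ... + bp*y_{t-p} + e_t by ordinary least squares."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    # Row for time t holds [1, y_{t-1}, ..., y_{t-p}], for t = p .. n-1
    X = np.column_stack([np.ones(n - p)] + [y[p - i : n - i] for i in range(1, p + 1)])
    target = y[p:]
    coeffs, *_ = np.linalg.lstsq(X, target, rcond=None)
    return coeffs  # [b0, b1, ..., bp]

# Made-up example series
y = [3.0, 3.2, 3.1, 3.5, 3.6, 3.4, 3.8, 3.9, 3.7, 4.1]
b = fit_ar(y, p=2)
print(b)  # three coefficients: b0, b1, b2
```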

4

5 A first-order autoregressive model is concerned only with the correlation between consecutive values in a series. A second-order autoregressive model considers the correlation between consecutive values as well as the correlation between values two periods apart.
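The lag-1 and lag-2 correlations that these two models rely on can be estimated directly from the series. A minimal sketch (numpy only; the example series is made-up):

```python
import numpy as np

def lag_autocorr(y, k):
    """Sample correlation between y_t and y_{t-k} (a simple pairwise estimate)."""
    y = np.asarray(y, dtype=float)
    return np.corrcoef(y[k:], y[:-k])[0, 1]

# Made-up example: an upward-drifting series is strongly correlated with its lags
y = [1.0, 1.3, 1.1, 1.6, 1.5, 1.9, 2.0, 2.2, 2.1, 2.5]
r1 = lag_autocorr(y, 1)  # first order: consecutive values
r2 = lag_autocorr(y, 2)  # second order: values two periods apart
print(r1, r2)
```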

6 The selection of an appropriate autoregressive model is not an easy task. Once a model is selected and the OLS method is used to obtain estimates of the parameters, the next step is to eliminate those parameters that do not contribute significantly.

7 H0: b_p = 0 (the highest-order parameter does not contribute to the prediction of y_t) H1: b_p ≠ 0 (the highest-order parameter is significant)

8 Using an alpha (α) level of significance, the decision rule is to reject H0 if t > t_{α/2} or if t < -t_{α/2}, and not to reject H0 if -t_{α/2} ≤ t ≤ t_{α/2}, where t = b_p / S_{b_p} and the critical value is taken with n - 2p - 1 degrees of freedom.
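The test statistic behind this decision rule can be computed from the OLS fit itself. A sketch (numpy only; the data are made-up, and `highest_order_t_stat` is a hypothetical helper name, not from the text):

```python
import numpy as np

def highest_order_t_stat(y, p):
    """Estimate an AR(p) model by OLS and return (b_p, t) for testing H0: b_p = 0.
    Residual degrees of freedom are n - 2p - 1, as in the text."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    X = np.column_stack([np.ones(n - p)] + [y[p - i : n - i] for i in range(1, p + 1)])
    target = y[p:]
    coeffs, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ coeffs
    df = (n - p) - (p + 1)              # observations used minus parameters = n - 2p - 1
    s2 = resid @ resid / df             # residual variance estimate
    se_bp = np.sqrt(s2 * np.linalg.inv(X.T @ X)[-1, -1])
    return coeffs[-1], coeffs[-1] / se_bp

# Made-up series; compare |t| against the tabled critical value t_{alpha/2}
y = [3.0, 3.2, 3.1, 3.5, 3.6, 3.4, 3.8, 3.9, 3.7, 4.1, 4.0, 4.3]
bp, t = highest_order_t_stat(y, p=2)
print(bp, t)
```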

9 Some helpful information:

10 If the null hypothesis is NOT rejected, we may conclude that the selected model contains too many estimated parameters. The highest-order term would then be deleted, and a new autoregressive model would be obtained through least-squares regression. A test of the hypothesis that the "new" highest-order term is 0 would then be repeated.

11 This testing and modeling procedure continues until we reject H0. When this occurs, we know that the highest-order parameter is significant and we are ready to use the model.
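The repeat-until-significant procedure of slides 10 and 11 can be sketched as a loop (numpy only; the cutoff `t_crit=2.0` is a rough stand-in for the tabled critical value and is an assumption for illustration):

```python
import numpy as np

def select_ar_order(y, p_start, t_crit=2.0):
    """Drop the highest-order term until it tests significant.
    t_crit stands in for the table value t_{alpha/2}; 2.0 is a rough default."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    for p in range(p_start, 0, -1):
        X = np.column_stack([np.ones(n - p)] + [y[p - i : n - i] for i in range(1, p + 1)])
        target = y[p:]
        coeffs, *_ = np.linalg.lstsq(X, target, rcond=None)
        resid = target - X @ coeffs
        df = (n - p) - (p + 1)                      # n - 2p - 1, as in the text
        se = np.sqrt(resid @ resid / df * np.linalg.inv(X.T @ X)[-1, -1])
        if abs(coeffs[-1] / se) > t_crit:           # highest-order term significant: stop
            return p, coeffs
    return 0, None                                  # no autoregressive term tested significant

# Made-up series: start at p = 3 and step down
y = [3.0, 3.2, 3.1, 3.5, 3.6, 3.4, 3.8, 3.9, 3.7, 4.1, 4.0, 4.3, 4.5, 4.4]
p, coeffs = select_ar_order(y, p_start=3)
print(p)
```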

12 Example 1

13

14

15

16

17

18 We have to estimate the parameters of the first-order autoregressive model: y_t = b_0 + b_1 y_{t-1} + e_t, and then check whether b_1 is statistically significant.
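A worked sketch of this first-order fit and its t statistic (numpy only; the series below is made-up, not the example's actual data):

```python
import numpy as np

# Made-up series standing in for the example data
y = np.array([10.2, 10.8, 11.1, 11.9, 12.3, 12.8, 13.5, 13.9, 14.6, 15.1])

# Regress y_t on y_{t-1} with an intercept: y_t = b0 + b1*y_{t-1} + e_t
X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
target = y[1:]
(b0, b1), *_ = np.linalg.lstsq(X, target, rcond=None)

# t statistic for H0: b1 = 0, with df = n - 2p - 1 = n - 3 for p = 1
coef = np.array([b0, b1])
resid = target - X @ coef
df = len(target) - 2
se_b1 = np.sqrt(resid @ resid / df * np.linalg.inv(X.T @ X)[1, 1])
t = b1 / se_b1
print(b0, b1, t)
```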

19

20 Example 2

21

22

23

24 Autoregressive Modeling
- Used for forecasting
- Takes advantage of autocorrelation
  - 1st order: correlation between consecutive values
  - 2nd order: correlation between values 2 periods apart
- Autoregressive model of pth order: y_t = b_0 + b_1 y_{t-1} + ... + b_p y_{t-p} + e_t, where e_t is the random error
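Once the coefficients are estimated, forecasting the next value is a direct plug-in of the most recent p observations. A sketch (numpy only; the coefficients here are illustrative hand-picked values, not estimates):

```python
import numpy as np

def ar_forecast_next(y, coeffs):
    """One-step-ahead forecast from AR coefficients [b0, b1, ..., bp]:
    y_hat_{n+1} = b0 + b1*y_n + ... + bp*y_{n-p+1}."""
    p = len(coeffs) - 1
    recent = y[-1 : -p - 1 : -1]          # y_n, y_{n-1}, ..., y_{n-p+1}
    return coeffs[0] + float(np.dot(coeffs[1:], recent))

# Made-up example with hand-picked AR(2) coefficients
y = np.array([4.0, 4.2, 4.1, 4.5, 4.6])
coeffs = np.array([0.5, 0.6, 0.3])        # b0, b1, b2 (illustrative values)
print(ar_forecast_next(y, coeffs))        # 0.5 + 0.6*4.6 + 0.3*4.5
```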

25 Autoregressive Modeling Steps
1. Choose p, the order of the model.
2. Form a series of "lag predictor" variables y_{t-1}, y_{t-2}, ..., y_{t-p}.
3. Use Excel to run a regression model using all p lagged variables.
4. Test the significance of b_p:
   - If the null hypothesis is rejected, this model is selected.
   - If the null hypothesis is not rejected, decrease p by 1 and repeat the calculations.