Charles University FSV UK STAKAN III Institute of Economic Studies, Faculty of Social Sciences.


Charles University FSV UK STAKAN III Institute of Economic Studies, Faculty of Social Sciences
Jan Ámos Víšek: Econometrics, Tuesday – Third Lecture (summer term)

Plan of the whole year
Regression models for various situations
● Division according to the character of data (with respect to time):
 * Cross-sectional data (winter term)
 * Panel data (summer term)

Plan of the whole year
Regression models for various situations
● Division according to the character of variables:
 * Continuous response and nearly arbitrary explanatory variables (winter and part of summer term)
 * Qualitative and limited response and nearly arbitrary explanatory variables (summer term)

Plan of the whole year
Regression models for various situations
● Division according to contamination of data:
 * Classical methods, neglecting contamination (winter and most of summer term)
 * Robust methods (three lectures in summer term)

Schedule of today's talk
● The Generalized Least Squares
● Modeling time series by AR(p) and MA(q)
 * Stationarity, Dickey-Fuller tests of unit roots
 * Invertibility
 * Moments and covariance matrices

The Generalized Least Squares
Let us assume that var(ε) = σ²Σ with Σ ≠ I regular, i.e. homoscedasticity is broken. Since Σ is regular and symmetric, there is a regular matrix P such that Σ⁻¹ = P′P. Put Y* = PY, X* = PX and ε* = Pε, i.e. multiply the basic model Y = Xβ + ε from the left by P.

The Generalized Least Squares continued
For ε* = Pε we have var(ε*) = P var(ε) P′ = σ² P Σ P′ = σ² I, i.e. we have reached homoscedasticity. Then the OLS estimator in the transformed model is
 (X*′X*)⁻¹ X*′Y* = (X′P′PX)⁻¹ X′P′PY.
Recalling that Σ⁻¹ = P′P, this equals
 β_GLS = (X′Σ⁻¹X)⁻¹ X′Σ⁻¹Y,
the Generalized Least Squares estimator.
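As a minimal numerical sketch of this transformation (made-up data; here Σ is taken, purely for illustration, as a known diagonal matrix of weights):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Hypothetical heteroscedastic model: var(eps_i) = sigma^2 * w_i with known weights w_i
X = np.column_stack([np.ones(n), rng.normal(size=n)])
w = np.linspace(0.5, 3.0, n)                  # known diagonal of Sigma
Sigma = np.diag(w)
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(scale=np.sqrt(w))

# P with Sigma^{-1} = P'P; for a diagonal Sigma simply P = diag(1/sqrt(w))
P = np.diag(1.0 / np.sqrt(w))
X_star, y_star = P @ X, P @ y                 # transformed (homoscedastic) model

# OLS on the transformed model coincides with the GLS formula on the original one
beta_ols_star = np.linalg.solve(X_star.T @ X_star, X_star.T @ y_star)
Si = np.linalg.inv(Sigma)
beta_gls = np.linalg.solve(X.T @ Si @ X, X.T @ Si @ y)
print(beta_ols_star, beta_gls)                # the two vectors agree
```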

The Generalized Least Squares continued
What is the problem with application of β_GLS? Σ contains n(n+1)/2 unknown elements (n(n−1)/2 above the diagonal and n on the diagonal) which cannot be estimated, due to the fact that we have at hand only n observations! But sometimes we know the structure of Σ, and moreover it can be determined by a few parameters!!
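For instance, when the disturbances follow an AR(1) scheme, Σ is determined by the single parameter ρ. A sketch of the corresponding feasible GLS fit, using GLSAR from statsmodels (all data and coefficients below are made up for illustration):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
X = sm.add_constant(rng.normal(size=n))       # intercept + one regressor

# Disturbances generated by an AR(1) scheme: the whole Sigma depends on one parameter rho
rho_true = 0.8
e = np.zeros(n)
for t in range(1, n):
    e[t] = rho_true * e[t - 1] + rng.standard_normal()
y = X @ np.array([1.0, 2.0]) + e

# Feasible GLS: estimate rho from OLS residuals, transform, re-estimate (GLSAR iterates this)
model = sm.GLSAR(y, X, rho=1)                 # rho=1 requests an AR(1) structure with unknown coefficient
results = model.iterative_fit(maxiter=5)
print(results.params, model.rho)              # estimated beta and the estimated rho
```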

Modeling time series by stochastic models – Box-Jenkins methodology
Recommended reading:
Box, G. E. P., G. M. Jenkins: Time Series Analysis, Forecasting and Control. Holden Day, San Francisco.
Judge, G. G., W. E. Griffiths, R. C. Hill, H. Lütkepohl, T. C. Lee: The Theory and Practice of Econometrics. J. Wiley and Sons, New York.
Cipra, T.: Analýza časových řad s aplikacemi v ekonomii. SNTL/ALFA, Praha.
Brockwell, P. J., R. A. Davis: Time Series: Theory and Methods. Springer Verlag, New York.

Modeling time series by AR(p) and MA(q)
Let {ε_t} be a sequence of i.i.d. r.v.'s with zero mean and with variance equal to σ². Then put
 y_t = φ_1 y_{t−1} + φ_2 y_{t−2} + … + φ_p y_{t−p} + ε_t .  (1)
The sequence of r.v.'s given by (1) is called autoregressive process of order p and denoted by AR(p). Put also
 y_t = ε_t + θ_1 ε_{t−1} + θ_2 ε_{t−2} + … + θ_q ε_{t−q} .  (2)
The sequence of r.v.'s given by (2) is called moving-average process of order q and denoted MA(q). Finally, put
 y_t = φ_1 y_{t−1} + … + φ_p y_{t−p} + ε_t + θ_1 ε_{t−1} + … + θ_q ε_{t−q} .  (3)

Modeling time series by AR(p) and MA(q) continued
The sequence of r.v.'s given by (3) is called autoregressive moving-average process (of order (p,q)) and denoted by ARMA(p,q). If the process of first differences Δy_t = y_t − y_{t−1} is ARMA(p,q), then the original process y_t is called the integrated autoregressive moving-average process (of order (p,1,q)) and denoted by ARIMA(p,1,q).

Modeling time series by AR(p) and MA(q) continued
Put, more generally, Δ^h y_t for the h-th difference of y_t. If the process Δ^h y_t is ARMA(p,q), then the original process is called the integrated autoregressive moving-average process of order (p,h,q) and denoted by ARIMA(p,h,q).
Assumption: {ε_t} is a sequence of i.i.d. r.v.'s with E ε_t = 0 and var ε_t = σ².
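A brief simulation sketch of these definitions (coefficients chosen arbitrarily), using ArmaProcess from statsmodels; note that it expects the lag-polynomial convention with the AR coefficients negated:

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

rng = np.random.default_rng(2)

# ARMA(1,1): y_t = 0.6 y_{t-1} + eps_t + 0.4 eps_{t-1}
# Lag-polynomial convention: ar = [1, -phi_1, ...], ma = [1, theta_1, ...]
arma = ArmaProcess(ar=[1, -0.6], ma=[1, 0.4])
y = arma.generate_sample(nsample=500, distrvs=rng.standard_normal)

# An ARIMA(p,1,q) path can be obtained by cumulating an ARMA(p,q) path
y_integrated = np.cumsum(y)

print(arma.isstationary, arma.isinvertible)   # True, True for these coefficients
```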

Modeling time series by AR(p) and MA(q) continued
A very first question is, of course, how far the autoregressive processes can be expressed as moving-average ones and vice versa. For simplicity, consider AR(1):
 y_t = ρ y_{t−1} + ε_t = ρ(ρ y_{t−2} + ε_{t−1}) + ε_t = … = Σ_{j=0}^{∞} ρ^j ε_{t−j} .
So we may say that MA(∞) is "dual" to AR(1). (Notice that the "dual" description is much more complicated.)

Modeling time series by AR(p) and MA(q) continued
It immediately gives two results. Firstly (moments of y_t):
 E y_t = 0 and var y_t = σ² Σ_{j=0}^{∞} ρ^{2j} = σ² / (1 − ρ²) for |ρ| < 1.
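A quick simulation check of these two moments for AR(1) (illustrative values ρ = 0.7, σ = 1):

```python
import numpy as np

rng = np.random.default_rng(3)
rho, sigma, T = 0.7, 1.0, 200_000

# Simulate AR(1) with a long burn-in so the sample is approximately stationary
eps = rng.normal(scale=sigma, size=T + 1_000)
y = np.zeros_like(eps)
for t in range(1, len(eps)):
    y[t] = rho * y[t - 1] + eps[t]
y = y[1_000:]

print(y.mean())                               # close to 0
print(y.var(), sigma**2 / (1 - rho**2))       # empirical vs. theoretical variance
```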

Modeling time series by AR(p) and MA(q) continued
Secondly (conditions of stationarity for y_t). Let's recall stationarity.
DEFINITION The sequence of r.v.'s {y_t} is called (strictly) stationary if for any k, any indices t_1, …, t_k and any shift h the joint distribution of (y_{t_1}, …, y_{t_k}) is the same as the joint distribution of (y_{t_1+h}, …, y_{t_k+h}); alternatively, all finite-dimensional distributions are invariant to shifts in time (this definition is easier to understand).

Modeling time series by AR(p) and MA(q) continued
DEFINITION The sequence of r.v.'s {y_t} is called (weakly) stationary if E y_t = μ and cov(y_t, y_{t+h}) = γ(h) < ∞ do not depend on t (this definition is usually employed).
Of course, our sequence is not infinite on both sides, hence the definition is to be applied in a bit modified way.
Remark Assuming the sequence to be stationary of course requires some modification of the definitions.

Modeling time series by AR(p) and MA(q) continued
Returning to y_t = Σ_{j=0}^{∞} ρ^j ε_{t−j}, we immediately observe that only for |ρ| < 1 the variance is finite and hence it makes any sense to speak about some distribution. Now, return to AR(1), consider any k-tuple of indices t_1 < t_2 < … < t_k and the corresponding k-tuple of r.v.'s (y_{t_1}, …, y_{t_k}), and find (it is sufficient to do it in mind) the structure of r.v.'s ε_t which generated it.

Modeling time series by AR(p) and MA(q) continued
Finally, do the same for (y_{t_1+j}, …, y_{t_k+j}) and find that both structures of r.v.'s are the same but shifted by j. Since the ε_t's are i.i.d., the d.f.'s of both k-tuples, (y_{t_1}, …, y_{t_k}) and (y_{t_1+j}, …, y_{t_k+j}), are the same (for any fixed j). So, if |ρ| < 1, the sequence {y_t} is stationary.

Modeling time series by AR(p) and MA(q) continued
Let us look for a general condition for stationarity. We may take, as an analogy to y_t = ρ y_{t−1} + ε_t, the polynomial (in z) 1 − ρz. Then, if |ρ| < 1, the solution of the equation 1 − ρz = 0 has to be in absolute value larger than 1. In other words, if the solution of 1 − ρz = 0 is larger than 1 in absolute value, the y_t's are stationary.

Modeling time series by AR(p) and MA(q) continued
Similarly (and alternatively), the solution of the equation z − ρ = 0, which can be viewed as an analogy to y_t − ρ y_{t−1} = ε_t, has to be in absolute value less than 1. So again, if the solution of z − ρ = 0 is less than 1 in absolute value, the y_t's are stationary.

Modeling time series by AR(p) and MA(q) continued
For a general AR(p) we conclude, in analogy with AR(1), that all roots of the polynomial
 1 − φ_1 z − φ_2 z² − … − φ_p z^p  (4)
have to be in absolute value (notice that they are generally complex numbers) larger than 1.

Modeling time series by AR(p) and MA(q) continued
Again alternatively, all roots of the polynomial
 z^p − φ_1 z^{p−1} − … − φ_{p−1} z − φ_p  (5)
have to be in absolute value less than 1. The conditions (4) and (5) are called "conditions of stationarity" (of course, they are equivalent).
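Both root conditions can be checked numerically, e.g. with numpy.roots; a sketch for an illustrative AR(2) with arbitrarily chosen coefficients:

```python
import numpy as np

phi = [0.5, 0.3]                              # illustrative AR(2) coefficients

# Condition (4): roots of 1 - phi_1 z - ... - phi_p z^p outside the unit circle
# (np.roots expects coefficients from the highest power down)
roots4 = np.roots([-c for c in reversed(phi)] + [1.0])
print(roots4, np.all(np.abs(roots4) > 1))     # True -> stationary

# Condition (5), equivalently: roots of z^p - phi_1 z^{p-1} - ... - phi_p inside the unit circle
roots5 = np.roots([1.0] + [-c for c in phi])
print(roots5, np.all(np.abs(roots5) < 1))     # True -> the same conclusion
```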

Modeling time series by AR(p) and MA(q) continued
We do not have at hand the φ's but "only" their estimates, so that we solve the estimated polynomial and obtain estimated roots instead of the true ones. But even if an estimated root is in absolute value larger than 1, the true root can be equal to 1. Hence we have to test whether it exceeds 1 statistically significantly. The test is known as the "test on unit roots". The best known is the Dickey-Fuller test.

Modeling time series by AR(p) and MA(q) continued
Dickey-Fuller test – for AR(1) y_t = ρ y_{t−1} + ε_t, rewrite the model as Δy_t = (ρ − 1) y_{t−1} + ε_t and apply the t-test of significance that ρ − 1 = 0, i.e. that ρ = 1. Since the regressor y_{t−1} is not independent of the disturbances that generated it, we cannot use the "classical" Student test. D. A. Dickey and W. A. Fuller (1979) made a Monte Carlo study and tabulated the critical values. An alternative, the Augmented Dickey-Fuller test – for AR(1) augmented by lagged differences, Δy_t = (ρ − 1) y_{t−1} + Σ_j γ_j Δy_{t−j} + ε_t, and a test of significance whether ρ = 1.
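In practice the (augmented) Dickey-Fuller test is available, e.g., as adfuller in statsmodels; a small sketch on simulated data, comparing a stationary AR(1) with a random walk:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(4)
T = 500
eps = rng.standard_normal(T)

stationary = np.zeros(T)                      # AR(1) with rho = 0.5
walk = np.zeros(T)                            # random walk, i.e. rho = 1 (unit root)
for t in range(1, T):
    stationary[t] = 0.5 * stationary[t - 1] + eps[t]
    walk[t] = walk[t - 1] + eps[t]

for name, series in [("AR(1), rho=0.5", stationary), ("random walk", walk)]:
    stat, pvalue, *rest = adfuller(series)
    print(name, round(stat, 2), round(pvalue, 3))
# A small p-value rejects the unit root; the random walk typically yields a large p-value.
```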

Modeling time series by AR(p) and MA(q) continued
We already know that for AR(1) E y_t = 0 and var y_t = σ²/(1 − ρ²), and for the covariances cov(y_t, y_{t+h}) = ρ^h σ²/(1 − ρ²). That is why we define (frequently) σ_y² = σ²/(1 − ρ²) and γ_h = ρ^h σ_y². Moreover, corr(y_t, y_{t+h}) = ρ^h.

Modeling time series by AR(p) and MA(q) continued. So the covariance matrix is given as.

Modeling time series by AR(p) and MA(q) continued
and the inverse as
 Σ⁻¹ = (1/σ²) · B ,
where B is the tridiagonal matrix with 1 in the first and the last diagonal positions, 1 + ρ² in the remaining diagonal positions, −ρ on the two neighbouring off-diagonals, and zeros elsewhere. (We shall need it later.)
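A small numerical check of this covariance matrix and of its claimed tridiagonal inverse (illustrative values of ρ, σ and T):

```python
import numpy as np

rho, sigma, T = 0.6, 1.0, 6
idx = np.arange(T)

# Sigma[i, j] = sigma^2 * rho^|i-j| / (1 - rho^2)
Sigma = sigma**2 * rho ** np.abs(idx[:, None] - idx[None, :]) / (1 - rho**2)

# Claimed inverse: (1/sigma^2) times a tridiagonal matrix B
B = np.zeros((T, T))
np.fill_diagonal(B, 1 + rho**2)
B[0, 0] = B[-1, -1] = 1.0                     # corner elements are 1
B[idx[:-1], idx[1:]] = -rho                   # super-diagonal
B[idx[1:], idx[:-1]] = -rho                   # sub-diagonal
Sigma_inv = B / sigma**2

print(np.allclose(Sigma @ Sigma_inv, np.eye(T)))   # True
```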

Modeling time series by AR(p) and MA(q) continued
It is easy to verify that the matrix given on the previous slide is really the inverse of Σ. For k ≠ j we have, for the product of the k-th row of Σ and the (transposed) j-th row of Σ⁻¹ (taking an interior row, 1 < j < T),
 (1/(1 − ρ²)) · ( ρ^{|k−j+1|}·(−ρ) + ρ^{|k−j|}·(1 + ρ²) + ρ^{|k−j−1|}·(−ρ) ) = 0 .
Similarly for the first and the last rows.

Modeling time series by AR(p) and MA(q) continued
For k = j, i.e. for the product of the k-th row of Σ and the (transposed) k-th row of Σ⁻¹, we have
 (1/(1 − ρ²)) · ( ρ·(−ρ) + 1·(1 + ρ²) + ρ·(−ρ) ) = (1 − ρ²)/(1 − ρ²) = 1 .
Along similar lines we verify that Σ⁻¹Σ = I as well.

Modeling time series by AR(p) and MA(q) continued
Let us move to MA(1): y_t = ε_t + θ ε_{t−1}. Then ε_t = y_t − θ ε_{t−1}, but ε_{t−1} = y_{t−1} − θ ε_{t−2}, and hence ε_t = y_t − θ y_{t−1} + θ² ε_{t−2}, etc. So
 ε_t = Σ_{j=0}^{∞} (−θ)^j y_{t−j} , i.e. y_t = − Σ_{j=1}^{∞} (−θ)^j y_{t−j} + ε_t .
(Notice that the "dual" description is again much more complicated.)

Modeling time series by AR(p) and MA(q) continued
As an analogy (or counterpart?) to the condition of stationarity for AR(p), there is a condition of invertibility of MA(q), which reads: all roots of the polynomial 1 + θ_1 z + θ_2 z² + … + θ_q z^q have to be outside the unit circle. The condition has the following sense.
DEFINITION Let L be the operator of the back-shift, i.e. for any y_t we have L y_t = y_{t−1}. (The letter "L" comes from the "lagged" value of y_t.)
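A quick numerical check of this root condition, in the same spirit as the stationarity check above (illustrative MA(2) coefficients):

```python
import numpy as np

theta = [0.5, 0.3]                            # illustrative MA(2) coefficients

# Roots of 1 + theta_1 z + theta_2 z^2 (highest power first for np.roots)
roots = np.roots(list(reversed(theta)) + [1.0])
print(roots, np.all(np.abs(roots) > 1))      # True -> this MA(2) is invertible
```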

Modeling time series by AR(p) and MA(q) continued
We shall use the operator L (rather formally) in the following way. Returning to the MA(1) and changing the sign of θ (but only for this moment of explanation of the condition of invertibility), we have
 y_t = ε_t − θ ε_{t−1} = (1 − θL) ε_t and then ε_t = (1 − θL)⁻¹ y_t .
Assuming now that |θ| < 1, we can write (1 − θL)⁻¹ as the sum of the geometric series, namely
 (1 − θL)⁻¹ = Σ_{j=0}^{∞} θ^j L^j , i.e. ε_t = Σ_{j=0}^{∞} θ^j y_{t−j} ,

Modeling time series by AR(p) and MA(q) continued
and finally y_t = ε_t − Σ_{j=1}^{∞} θ^j y_{t−j}. During the derivation of the result we have needed |θ| < 1, i.e. the solution of 1 − θz = 0 has to be larger than 1 in absolute value. Unlike for AR(p), for MA(q) we can easily (without any "dual" representation) evaluate moments and the covariance matrix. Clearly,
 E y_t = 0 and var y_t = σ² (1 + θ_1² + … + θ_q²).
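A simulation sketch of the inversion for MA(1) in its original parametrization y_t = ε_t + θ ε_{t−1}: the truncated AR(∞) weights (−θ)^j recover ε_t very accurately when |θ| < 1 (illustrative values):

```python
import numpy as np

rng = np.random.default_rng(5)
theta, T = 0.5, 2_000

# MA(1): y_t = eps_t + theta * eps_{t-1}
eps = rng.standard_normal(T)
y = eps.copy()
y[1:] += theta * eps[:-1]

# Invert: eps_t ~ sum_{j>=0} (-theta)^j y_{t-j}, truncated after J terms
J = 30
t = T - 1
eps_hat = sum((-theta) ** j * y[t - j] for j in range(J))
print(eps_hat, eps[t])                        # practically identical, the weights die out fast
```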

Modeling time series by AR(p) and MA(q) continued
In a similar way (assume 0 < h ≤ q)
 cov(y_t, y_{t+h}) = E [ (ε_t + θ_1 ε_{t−1} + … + θ_q ε_{t−q}) (ε_{t+h} + θ_1 ε_{t+h−1} + … + θ_q ε_{t+h−q}) ]
  = σ² (θ_h + θ_1 θ_{h+1} + … + θ_{q−h} θ_q)

Modeling time series by AR(p) and MA(q) continued
for 0 < h ≤ q, and cov(y_t, y_{t+h}) = 0 otherwise (for h > q). Specifying it for MA(1): var y_t = σ²(1 + θ²), cov(y_t, y_{t+1}) = σ²θ and cov(y_t, y_{t+h}) = 0 for h > 1.
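A simulation check of these MA(1) moments (illustrative θ and σ):

```python
import numpy as np

rng = np.random.default_rng(6)
theta, sigma, T = 0.4, 1.0, 500_000

eps = rng.normal(scale=sigma, size=T)
y = eps.copy()
y[1:] += theta * eps[:-1]                     # MA(1): y_t = eps_t + theta * eps_{t-1}

def acov(x, h):
    """Empirical autocovariance at lag h (zero mean assumed here)."""
    return np.mean(x[:-h] * x[h:]) if h > 0 else np.mean(x * x)

print(acov(y, 0), sigma**2 * (1 + theta**2))  # variance
print(acov(y, 1), sigma**2 * theta)           # lag-1 autocovariance
print(acov(y, 2), 0.0)                        # zero beyond lag q = 1
```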

Modeling time series by AR(p) and MA(q) continued
There are at least two or three problems:
Firstly Why do we study both AR(p) and MA(q), when we can convert AR(p) onto MA(∞) and vice versa, i.e. MA(q) onto AR(∞)?
Secondly How to recognize that there is some dependence in the series?
Thirdly Which type of dependency took place? How large is p or q?
We'll answer them successively in the next lecture.

What is to be learnt from this lecture for the exam? All that you need is on the preceding slides:
● The Generalized Least Squares
● AR(p), MA(q), ARMA(p,q), ARIMA(p,h,q)
● Stationarity
 - conditions for stationarity,
 - Dickey-Fuller test – for AR(1),
 - augmented Dickey-Fuller test – for AR(1)
● Moments and covariance matrices
● Invertibility