R. Werner, Solar Terrestrial Influences Institute - BAS. Time Series Analysis by Means of Inferential Statistical Methods.


Inferential statistical analysis of the time series. Now: measures of the significance of extrapolated trends and of causal relations between two variables. Cross-section analysis: Y is a realization of a stochastic process; for example, the errors must have a specified probability distribution. Time series analysis: prognosis for y_{t+1}; the influence of exogenous parameters can be investigated on this basis.

A model that describes probability structures is called a stochastic process. The model includes assumptions about the mechanisms generating the observed time series. A general assumption is (weak) stationarity. 4a) Autocovariances with a lag greater than k are assumed to be zero → moving-average models. 4b) Autocovariances of higher order can be calculated from autocovariances of lower order → autoregressive models.

[Figure: wavelet transformation of the MMNR*100 hab data]

Autoregressive (AR) models of order p: z_t = φ_1 z_{t-1} + φ_2 z_{t-2} + … + φ_p z_{t-p} + a_t, where a_t is the error term (white noise).
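The AR recursion above can be simulated directly. A minimal NumPy sketch (not part of the original slides; the coefficients are the quasi-cyclic AR(2) example used later in the lecture, the noise variance and burn-in length are illustrative choices):

```python
import numpy as np

# Simulate an AR(2) process z_t = phi1*z_{t-1} + phi2*z_{t-2} + a_t
# with white-noise shocks a_t ~ N(0, 1).
rng = np.random.default_rng(0)
phi1, phi2 = 1.7, -0.95
n, burn = 500, 200              # burn-in discards the influence of start values
a = rng.standard_normal(n + burn)
z = np.zeros(n + burn)
for t in range(2, n + burn):
    z[t] = phi1 * z[t-1] + phi2 * z[t-2] + a[t]
z = z[burn:]
print(z.mean(), z.std())
```

Plotting z shows the quasi-cyclic oscillations that this parameter pair produces.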

AR(1) process

Theoretical autocorrelation function (ACF). For an AR(1) process the ACF decays geometrically: ρ_k = φ_1^k.
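The geometric decay ρ_k = φ_1^k can be checked against the sample ACF of a simulated series. A small sketch (φ_1 = 0.8 and the series length are illustrative choices, not from the slides):

```python
import numpy as np

def sample_acf(x, k):
    """Sample autocorrelation at lag k."""
    x = x - x.mean()
    return np.sum(x[k:] * x[:-k]) / np.sum(x * x) if k > 0 else 1.0

# AR(1) with phi1 = 0.8: theoretical ACF is rho_k = phi1**k.
rng = np.random.default_rng(1)
phi1, n = 0.8, 20000
z = np.zeros(n)
for t in range(1, n):
    z[t] = phi1 * z[t-1] + rng.standard_normal()
for k in range(1, 4):
    print(k, phi1**k, round(sample_acf(z, k), 3))
```

For a long series the sample values track 0.8, 0.64, 0.512 closely.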

Yule-Walker equations: ρ_k = φ_1 ρ_{k-1} + φ_2 ρ_{k-2} + … + φ_p ρ_{k-p}, k = 1, 2, …
AR(1): ρ_1 = φ_1.
AR(2): ρ_1 = φ_1 + φ_2 ρ_1, ρ_2 = φ_1 ρ_1 + φ_2.
AR(p): the first p equations form a linear system for φ_1, …, φ_p.
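The Yule-Walker equations for an AR(2) process are a 2x2 linear system in φ_1, φ_2 given ρ_1, ρ_2, so the coefficients can be recovered from the autocorrelations. A sketch using the lecture's example values (the matrix setup is the standard Yule-Walker system, not code from the slides):

```python
import numpy as np

# Recover AR(2) coefficients from the first two autocorrelations via the
# Yule-Walker equations:  rho_1 = phi1 + phi2*rho_1,  rho_2 = phi1*rho_1 + phi2.
phi1, phi2 = 1.7, -0.95                       # true values (slide example)
# theoretical autocorrelations implied by the Yule-Walker equations:
rho1 = phi1 / (1.0 - phi2)
rho2 = phi1 * rho1 + phi2
# solve the 2x2 linear system R * phi = r for the coefficients:
R = np.array([[1.0, rho1], [rho1, 1.0]])
r = np.array([rho1, rho2])
phi_hat = np.linalg.solve(R, r)
print(phi_hat)                                # recovers [1.7, -0.95]
```

In practice ρ_1, ρ_2 are replaced by their sample estimates, which yields the Yule-Walker parameter estimates.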

[Figure: AR(2) stationarity region in the (φ_1, φ_2) plane, φ_1 ∈ (-2, 2), φ_2 ∈ (-1, 1)] Conditions of stationarity: φ_1 + φ_2 < 1, φ_2 - φ_1 < 1, |φ_2| < 1. In the area under the parabola φ_1^2 + 4φ_2 = 0 (complex characteristic roots) the AR(2) model describes a quasi-cyclic process.

[Figure: simulated realizations with starting values z(1) = 1, z(2) = 2 versus time]

Model identification tool: the partial autocorrelation function (PACF), as known from cross-section statistics. Stepwise calculation of the coefficients from the Yule-Walker equations:
k=1: φ_11 = ρ_1
k=2: φ_22 = (ρ_2 - ρ_1^2) / (1 - ρ_1^2)
The theoretical PACF of an AR(p) process has values different from zero only for k = 1, 2, …, p!
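The cut-off property can be seen directly from the two stepwise formulas: for an AR(1) process ρ_2 = ρ_1^2, so φ_22 vanishes. A minimal sketch (the helper function name is mine, not from the slides):

```python
import numpy as np

def pacf_first_two(rho1, rho2):
    """Stepwise PACF values from the Yule-Walker equations:
    k=1: phi_11 = rho_1
    k=2: phi_22 = (rho_2 - rho_1**2) / (1 - rho_1**2)
    """
    return rho1, (rho2 - rho1**2) / (1.0 - rho1**2)

# For an AR(1) process rho_k = phi1**k, so the PACF must cut off after lag 1.
phi1 = 0.8
p11, p22 = pacf_first_two(phi1, phi1**2)
print(p11, p22)    # 0.8 0.0
```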

[Figure: theoretical autocorrelation function (ACF) and partial autocorrelation function (PACF) for an AR(2) process with φ_1 = 1.7, φ_2 = -0.95]

Yule 1927

Parameter estimation and residuals. [Table: estimated coefficients φ_1, φ_2 and constant c, Yule (1927) compared with this work; numerical values not recovered]

[Figures: distribution of the residuals; autocorrelation function of the residuals]

Moving-average (MA) models. AR models describe processes as a function of past z values; however, as was shown for the AR(2) process z_t = 1.7 z_{t-1} - 0.95 z_{t-2} + a_t, the process is forced by the noise a_t (with a theoretically infinite influence of the shocks). The idea now is, as for the AR process, to describe the process with a minimal number of parameters by a finite series of a_t with time lags: z_t = a_t - θ_1 a_{t-1} - … - θ_q a_{t-q}.

Autocorrelation for an MA(1) process: ρ_1 = -θ_1 / (1 + θ_1^2), ρ_k = 0 for k > 1. For an MA(2) process: ρ_1 = -θ_1(1 - θ_2) / (1 + θ_1^2 + θ_2^2), ρ_2 = -θ_2 / (1 + θ_1^2 + θ_2^2), ρ_k = 0 for k > 2.
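The MA(1) cut-off after lag 1 is easy to verify by simulation. A sketch (θ_1 = 0.6 and the series length are illustrative choices):

```python
import numpy as np

# MA(1): z_t = a_t - theta1*a_{t-1}.  Theoretical ACF:
#   rho_1 = -theta1 / (1 + theta1**2),  rho_k = 0 for k > 1.
rng = np.random.default_rng(2)
theta1, n = 0.6, 50000
a = rng.standard_normal(n + 1)
z = a[1:] - theta1 * a[:-1]

def sample_acf(x, k):
    x = x - x.mean()
    return np.sum(x[k:] * x[:-k]) / np.sum(x * x)

rho1_theory = -theta1 / (1.0 + theta1**2)
print(rho1_theory, round(sample_acf(z, 1), 3), round(sample_acf(z, 2), 3))
```

The sample ACF at lag 1 is close to -0.441, and at lag 2 it is close to zero, which is exactly the identification signature of an MA(1) process.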

What does the PACF look like? [Figure: MA(2) invertibility region in the (θ_1, θ_2) plane] Invertibility condition: θ_1 + θ_2 < 1, θ_2 - θ_1 < 1, |θ_2| < 1. In the area under the parabola the MA(2) model describes a quasi-cyclic process. The empirical ACF is a tool for identifying the MA order.

Invertibility condition: for an MA(1) process we have |θ_1| < 1, so that the process can be rewritten as a convergent AR series.

The MA(1) process can be represented by an AR(∞) process. In general, an MA(q) process can be represented by an AR(∞) process, and an AR(p) process can be represented by an MA(∞) process. Box-Jenkins principle (parsimony): models with the minimum number of parameters have to be used.
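The AR(∞) representation of an invertible MA(1) has weights -θ_1^j, so a truncated AR sum over past z values recovers the shock a_t almost exactly. A sketch under these standard assumptions (the truncation length J is an illustrative choice):

```python
import numpy as np

# Invertibility: for |theta1| < 1 the MA(1)  z_t = a_t - theta1*a_{t-1}
# has the AR(infinity) form  z_t = -sum_{j>=1} theta1**j * z_{t-j} + a_t.
# Equivalently, a_t = sum_{j>=0} theta1**j * z_{t-j}; truncating the sum
# recovers the shock up to an error of order theta1**(J+1).
rng = np.random.default_rng(3)
theta1, n, J = 0.6, 2000, 40
a = rng.standard_normal(n + 1)
z = a[1:] - theta1 * a[:-1]

t = 1000                      # recover the shock at an interior time point
a_hat = z[t] + sum(theta1**j * z[t - j] for j in range(1, J + 1))
print(a_hat, a[t + 1])        # nearly identical
```

For |θ_1| close to 1 the weights die out slowly and many lags are needed, which is why the invertibility condition matters in practice.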

Other models:
ARMA: mixed model of AR and MA terms
ARIMA: autoregressive integrated moving-average model; it uses differences of the time series
SARIMA: seasonal ARIMA model with a constant seasonal pattern
VARMA: vector ARMA
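The "I" in ARIMA refers to differencing. A minimal sketch of how a first difference removes a trend so that an ARMA model can be fitted to the differenced series (the linear trend and slope are illustrative assumptions, not from the slides):

```python
import numpy as np

# Differencing for ARIMA: the first difference w_t = z_t - z_{t-1}
# removes a linear trend from a non-stationary series.
rng = np.random.default_rng(4)
n = 300
trend = 0.5 * np.arange(n)                 # deterministic linear trend
z = trend + rng.standard_normal(n)         # non-stationary series
w = np.diff(z)                             # first difference
print(round(w.mean(), 2))                  # close to the trend slope 0.5
```

The differenced series fluctuates around the constant slope and can then be modelled as a stationary ARMA process.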

Forecast with an AR(1) model: the l-step forecast from origin t is ẑ_t(l) = φ_1^l z_t, which decays geometrically toward the process mean.
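The geometric decay of the AR(1) forecast can be written out in a few lines (φ_1 = 0.8 and the forecast origin z_t = 2.5 are illustrative values):

```python
# AR(1) l-step forecast from origin t:  z_hat_t(l) = phi1**l * z_t,
# which decays geometrically toward the process mean (here 0).
phi1, z_t = 0.8, 2.5
forecasts = [phi1**l * z_t for l in range(1, 6)]
print([round(f, 3) for f in forecasts])    # geometric decay toward 0
```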

MA(1): it can be shown that ẑ_t(1) = -θ_1 a_t and ẑ_t(l) = 0 for l > 1 (for a zero-mean process). Beyond lag q the forecast is just the mean, so MA models are not useful for prognosis.

[Figure: forecast of the sunspot number (SSN) by an AR(9) model]

Dynamical regression. Ordinary linear regression: Y_i = α + β X_i + ε_i; X_i may be transformed. α and β can be optimally estimated by ordinary least squares (OLS) under the assumptions that Y_i is normally distributed, that X_i need not be stochastic (it can, for example, be fixed), and:
1. E(ε_i) = 0
2. ε_i is not autocorrelated: Cov(ε_i, ε_j) = 0 for i ≠ j
3. ε_i is normally distributed
4. equilibrium conditions

For time series this can formally be written (i → t): Y_t = α + β X_t + ε_t. The assumption of equilibrium is not necessary. However, in time series the error term is often autocorrelated:
- the estimates are not efficient (they do not have the minimal variance);
- autocorrelations of X_t can be transferred to ε; autocorrelations of ε produce deviations of σ_ε from the true value, and this in turn implies a wrong value of σ_β.
(γ: autocorrelation of the residuals; λ: autocorrelation of the predictors.)

Simple lag model (models dynamical in X): Y_t = α + β X_{t-k} + ε_t. Distributed lag model: the influence is distributed over k lags, Y_t = α + β_0 X_t + β_1 X_{t-1} + … + β_k X_{t-k} + ε_t, for example k = 2. Because the lagged X_t are usually strongly correlated, the statistical interpretation of the individual β does not make sense.

Therefore more restrictions are needed. For the model with β_k = β_0 δ^k (0 < δ < 1) the influence decreases exponentially with k. Then the model has only three parameters: α, β_0, δ.

How to determine the parameters? Koyck transformation: lagging the model by one step, multiplying by δ and subtracting gives Y_t = α(1 - δ) + β_0 X_t + δ Y_{t-1} + u_t, where u_t = ε_t - δ ε_{t-1}. Using OLS, δ and β_0 can be estimated, and after this the β_k = β_0 δ^k.
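The Koyck-transformed model can be estimated by a plain least-squares regression on X_t and the lagged Y_{t-1}. A sketch under simplifying assumptions (iid regression error instead of the moving-average u_t, and illustrative parameter values):

```python
import numpy as np

# Koyck transformation: with beta_k = beta0*delta**k the distributed-lag
# model collapses to  Y_t = alpha*(1-delta) + beta0*X_t + delta*Y_{t-1} + u_t.
# OLS on (1, X_t, Y_{t-1}) gives delta and beta0; then beta_k = beta0*delta**k.
rng = np.random.default_rng(5)
alpha, beta0, delta, n = 1.0, 2.0, 0.6, 5000
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = alpha*(1-delta) + beta0*x[t] + delta*y[t-1] + 0.1*rng.standard_normal()

A = np.column_stack([np.ones(n-1), x[1:], y[:-1]])   # regressors [1, X_t, Y_{t-1}]
coef, *_ = np.linalg.lstsq(A, y[1:], rcond=None)
print(coef.round(2))    # approx [alpha*(1-delta), beta0, delta] = [0.4, 2.0, 0.6]
```

With the true moving-average error u_t = ε_t - δε_{t-1}, OLS with a lagged dependent variable is biased, which is one reason the adaptive expectation and partial adjustment models of the next slide are treated separately.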

Similar models:
- adaptive expectation model
- partial adjustment model
- models with two or more input variables

Model with autocorrelated error term. Remember that ε_t in linear regression has to be N(0, σ). Here ε_t is an AR(1) process: ε_t = ρ ε_{t-1} + u_t. Estimation of the regression coefficients by the Cochrane-Orcutt method: 1. Estimate α and β by OLS, calculate the residuals e_t, and estimate the autocorrelation coefficient ρ from them.

2. Set up the new regression equation Y_t - ρ Y_{t-1} = α(1 - ρ) + β(X_t - ρ X_{t-1}) + u_t, where u_t is white noise, and re-estimate the coefficients by OLS. Note: to test whether ε_t is autocorrelated, the Durbin-Watson test can be applied.
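The two Cochrane-Orcutt steps can be sketched in NumPy (all parameter values are illustrative; a single pass is shown, while in practice the steps are iterated until ρ̂ converges):

```python
import numpy as np

# Cochrane-Orcutt sketch: (1) OLS of Y on X, (2) estimate rho from the
# residuals e_t, (3) OLS on the rho-differenced variables.
rng = np.random.default_rng(6)
n, alpha, beta, rho = 4000, 1.0, 0.5, 0.7
x = rng.standard_normal(n)
eps = np.zeros(n)
for t in range(1, n):                        # AR(1) error term
    eps[t] = rho*eps[t-1] + 0.2*rng.standard_normal()
y = alpha + beta*x + eps

# step 1: OLS and residuals
A = np.column_stack([np.ones(n), x])
b, *_ = np.linalg.lstsq(A, y, rcond=None)
e = y - A @ b
# step 2: autocorrelation coefficient of the residuals
rho_hat = np.sum(e[1:]*e[:-1]) / np.sum(e[:-1]**2)
# step 3: regression on the transformed variables
ys = y[1:] - rho_hat*y[:-1]
xs = x[1:] - rho_hat*x[:-1]
As = np.column_stack([np.ones(n-1), xs])
bs, *_ = np.linalg.lstsq(As, ys, rcond=None)
alpha_hat = bs[0] / (1.0 - rho_hat)          # intercept of step 3 is alpha*(1-rho)
print(round(rho_hat, 2), round(alpha_hat, 2), round(bs[1], 2))
```

The transformed regression recovers α, β with valid standard errors because its error term u_t is white noise.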

[Figure: residuals with fitted linear trend e = … · t; numerical coefficients not recovered]

[Figure: autocorrelation function of the detrended residuals]

[Figure: partial autocorrelation function of the detrended residuals]

Acknowledgement. I want to thank the Ministry of Education and Science for supporting this work under contract DVU01/0120.