ARMA models 2012 International Finance CYCU


Lecture 2: ARMA models

White noise? You may think of colors or sounds, but remember the statistical definition!

White noise
Def: {εt} is a white-noise process if each value in the series has zero mean, constant variance, and no autocorrelation.
In the statistical sense:
E(εt) = 0, for all t
var(εt) = σ², for all t
cov(εt, εt-k) = cov(εt-j, εt-k-j) = 0, for all j and all k ≠ 0
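These three properties can be checked numerically. A minimal sketch in Python (numpy assumed; the seed and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
eps = rng.normal(loc=0.0, scale=1.0, size=100_000)  # white-noise draws, sigma^2 = 1

# zero mean and constant variance
print(round(eps.mean(), 3))     # close to 0
print(round(eps.var(), 3))      # close to 1

# lag-1 sample autocorrelation: close to 0 for white noise
acf1 = np.corrcoef(eps[:-1], eps[1:])[0, 1]
print(round(acf1, 3))
```

With a large sample, the sample mean, variance, and lag-1 autocorrelation land close to their theoretical values 0, σ², and 0.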

White noise
εt ~ iid(0, σ²)
iid: independently and identically distributed
White noise is the basic statistical "random" ingredient of a time-series model.

The AR(1) model (with w.n.)
yt = a0 + a1 yt-1 + εt
Solution by iteration (backward substitution):
yt-1 = a0 + a1 yt-2 + εt-1
yt-2 = a0 + a1 yt-3 + εt-2
...
y1 = a0 + a1 y0 + ε1
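The recursion translates directly into code. A simulation sketch (Python; the coefficients a0 = 1, a1 = 0.5 are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
a0, a1 = 1.0, 0.5                 # illustrative coefficients, |a1| < 1
T = 200
eps = rng.normal(size=T)

y = np.empty(T)
y[0] = 0.0                        # initial condition y0
for t in range(1, T):
    y[t] = a0 + a1 * y[t - 1] + eps[t]   # the AR(1) recursion

# unrolling one step reproduces the backward substitution above:
# y[2] = a0 + a1*(a0 + a1*y[0] + eps[1]) + eps[2]
check = a0 + a1 * (a0 + a1 * y[0] + eps[1]) + eps[2]
print(np.isclose(y[2], check))    # True
```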

General form of AR(1)
Substituting repeatedly gives
yt = a0 (1 + a1 + a1² + ... + a1^(t-1)) + a1^t y0 + Σ_{i=0}^{t-1} a1^i εt-i
Taking E(.) of both sides of the eq.:
E(yt) = a0 Σ_{i=0}^{t-1} a1^i + a1^t y0

Compare AR(1) models
Mathematical (deterministic) AR(1): yt = a0 + a1 yt-1
"True" AR(1) in time series: yt = a0 + a1 yt-1 + εt

Infinite population {yt}
If yt is an infinite DGP (t → ∞) and |a1| < 1, then a1^t y0 → 0 and the geometric sum converges, so
E(yt) = a0 / (1 - a1)
Why? Because Σ_{i=0}^{∞} a1^i = 1/(1 - a1) when |a1| < 1.
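Numerically, the sample mean of a long simulated AR(1) path with |a1| < 1 settles near a0/(1 - a1). A sketch (Python; coefficients are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
a0, a1, T = 1.0, 0.5, 200_000
eps = rng.normal(size=T)

y = np.empty(T)
y[0] = a0 / (1 - a1)              # start at the theoretical mean
for t in range(1, T):
    y[t] = a0 + a1 * y[t - 1] + eps[t]

print(round(y.mean(), 2))         # close to a0 / (1 - a1) = 2.0
```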

Stationarity in TS
In strong form: f(y|t) is the distribution function at time t; f(.) is strongly stationary if f(y|t) = f(y|t-j) for all j.
In weak form: constant mean, constant variance, constant autocorrelation.

Weak stationarity in TS
Also called "covariance stationarity". Three key features: constant mean, constant variance, constant autocovariance.
In the statistical sense, if {yt} is weakly stationary:
E(yt) = μ (a constant), for all t
var(yt) = σ² (a constant), for all t
cov(yt, yt-k) = cov(yt-j, yt-k-j) = γk (a constant depending only on k), for all t, j, k

AR(p) models
General form: yt = a0 + a1 yt-1 + a2 yt-2 + ... + ap yt-p + εt, where εt ~ w.n.
For example, AR(2): yt = a0 + a1 yt-1 + a2 yt-2 + εt
EX. Please write down the AR(5) model.

The AR(5) model
yt = a0 + a1 yt-1 + a2 yt-2 + a3 yt-3 + a4 yt-4 + a5 yt-5 + εt

Stationarity restrictions for ARMA(p,q)
Enders, pp. 60-61.
Sufficient condition: Σ_{i=1}^{p} |ai| < 1
Necessary condition: Σ_{i=1}^{p} ai < 1
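These conditions, together with the requirement that the roots of the AR characteristic polynomial lie outside the unit circle, can be checked mechanically. A sketch (Python; the AR(2) coefficients a1 = 0.5, a2 = 0.3 are illustrative):

```python
import numpy as np

a = [0.5, 0.3]                         # illustrative AR coefficients a1, a2

print(sum(abs(c) for c in a) < 1)      # sufficient condition: sum |ai| < 1
print(sum(a) < 1)                      # necessary condition: sum ai < 1

# characteristic polynomial 1 - a1 z - a2 z^2: stationarity requires
# every root to satisfy |z| > 1 (np.roots wants the highest degree first)
roots = np.roots([-a[1], -a[0], 1.0])
print(all(abs(r) > 1 for r in roots))  # True: both roots lie outside the unit circle
```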

MA(q) models
MA: moving average
General form: yt = εt + β1 εt-1 + β2 εt-2 + ... + βq εt-q, where εt ~ w.n.

MA(q) models
MA(1): yt = εt + β1 εt-1
Ex. Write down the MA(2) model...

The MA(2) model
Make sure you can write down MA(2) as
yt = εt + β1 εt-1 + β2 εt-2
Ex. Write down the MA(5) model...
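A hallmark of an MA(q) process yt = εt + β1 εt-1 + ... + βq εt-q is that its autocorrelation cuts off after lag q. A simulation sketch for MA(2) (Python; the β values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
b1, b2 = 0.6, 0.3                 # illustrative MA coefficients
T = 200_000
eps = rng.normal(size=T + 2)

# y_t = eps_t + b1*eps_{t-1} + b2*eps_{t-2}, built by shifting the noise
y = eps[2:] + b1 * eps[1:-1] + b2 * eps[:-2]

def acf(x, k):
    """Sample autocorrelation at lag k."""
    return np.corrcoef(x[:-k], x[k:])[0, 1]

# lags 1 and 2 are nonzero; lag 3 is (approximately) zero
print(round(acf(y, 1), 2), round(acf(y, 2), 2), round(acf(y, 3), 2))
```

The theoretical values here are ρ1 = (β1 + β1β2)/(1 + β1² + β2²) ≈ 0.54, ρ2 = β2/(1 + β1² + β2²) ≈ 0.21, and ρk = 0 for k > 2.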

The MA(5) model
yt = εt + β1 εt-1 + β2 εt-2 + β3 εt-3 + β4 εt-4 + β5 εt-5
(Note: the lags are on the error terms εt, not on yt; lagged y terms would make it an AR model.)

ARMA(p,q) models
ARMA = AR + MA, i.e. the general form:
yt = a0 + a1 yt-1 + ... + ap yt-p + εt + β1 εt-1 + ... + βq εt-q

Ex. ARMA(1,2)
Please write down ARMA(1,2):
yt = a0 + a1 yt-1 + εt + β1 εt-1 + β2 εt-2
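Writing out ARMA(1,2), yt = a0 + a1 yt-1 + εt + β1 εt-1 + β2 εt-2, translates directly into a simulation loop. A sketch (Python; all coefficients are illustrative, with |a1| < 1 so the process is stationary):

```python
import numpy as np

rng = np.random.default_rng(4)
a0, a1 = 1.0, 0.5                 # AR part (|a1| < 1 for stationarity)
b1, b2 = 0.4, 0.2                 # MA part
T = 20_000

eps = rng.normal(size=T)
y = np.zeros(T)
for t in range(2, T):
    # ARMA(1,2): y_t = a0 + a1*y_{t-1} + eps_t + b1*eps_{t-1} + b2*eps_{t-2}
    y[t] = a0 + a1 * y[t - 1] + eps[t] + b1 * eps[t - 1] + b2 * eps[t - 2]

# the MA terms have mean zero, so E(y) = a0 / (1 - a1) = 2.0
print(round(y[100:].mean(), 1))   # close to 2.0
```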