Linear Filters

Let $\{(x_t, y_t) : t \in T\}$ denote a bivariate time series with zero mean.

Suppose that the time series $\{y_t : t \in T\}$ is constructed as follows:

$$y_t = \sum_{s=-\infty}^{\infty} a_s x_{t-s}, \quad t \in T.$$

The time series $\{y_t : t \in T\}$ is then said to be constructed from $\{x_t : t \in T\}$ by means of a linear filter with weights $\{a_s\}$.

The autocovariance function of the filtered series is

$$\sigma_{yy}(h) = \sum_{s}\sum_{r} a_s a_r\, \sigma_{xx}(h + s - r).$$

Thus the spectral density of the time series $\{y_t : t \in T\}$ is

$$f_{yy}(\lambda) = \left|A(e^{-i\lambda})\right|^2 f_{xx}(\lambda), \quad \text{where } A(e^{-i\lambda}) = \sum_{s} a_s e^{-is\lambda}.$$

Comment A: $A(e^{-i\lambda}) = \sum_s a_s e^{-is\lambda}$ is called the transfer function of the linear filter; $\left|A(e^{-i\lambda})\right|$ is called the gain of the filter, while $\arg A(e^{-i\lambda})$ is called the phase shift of the filter.

Also, the cross-covariance function of the bivariate series is

$$\sigma_{xy}(h) = \operatorname{Cov}(x_t, y_{t+h}) = \sum_{s} a_s\, \sigma_{xx}(h - s).$$

Thus the cross spectrum of the bivariate time series is

$$f_{xy}(\lambda) = A(e^{-i\lambda})\, f_{xx}(\lambda).$$

Definition:

$$K^2_{xy}(\lambda) = \frac{\left|f_{xy}(\lambda)\right|^2}{f_{xx}(\lambda)\, f_{yy}(\lambda)} = \text{the squared coherency function.}$$

Note: $0 \le K^2_{xy}(\lambda) \le 1$ for all $\lambda$.

Comment B: $K^2_{xy}(\lambda) = 1$ if $\{y_t : t \in T\}$ is constructed from $\{x_t : t \in T\}$ by means of a linear filter, since then $\left|f_{xy}(\lambda)\right|^2 = \left|A(e^{-i\lambda})\right|^2 f_{xx}(\lambda)^2 = f_{yy}(\lambda)\, f_{xx}(\lambda)$.

Linear Filters with additive noise at the output

Let $\{(x_t, y_t) : t \in T\}$ denote a bivariate time series with zero mean, where $t = \ldots, -2, -1, 0, 1, 2, \ldots$. Suppose that the time series $\{y_t : t \in T\}$ is constructed as follows:

$$y_t = \sum_{s} a_s x_{t-s} + v_t,$$

where the noise $\{v_t : t \in T\}$ is independent of the series $\{x_t : t \in T\}$ (and may be white noise).


The autocovariance function of the filtered series with added noise is

$$\sigma_{yy}(h) = \sum_{s}\sum_{r} a_s a_r\, \sigma_{xx}(h + s - r) + \sigma_{vv}(h).$$

Continuing, the spectral density of the time series $\{y_t : t \in T\}$ is thus

$$f_{yy}(\lambda) = \left|A(e^{-i\lambda})\right|^2 f_{xx}(\lambda) + f_{vv}(\lambda).$$

Also, since the noise is independent of $\{x_t\}$, the cross-covariance function is unchanged:

$$\sigma_{xy}(h) = \sum_{s} a_s\, \sigma_{xx}(h - s).$$

Thus the cross spectrum of the bivariate time series is

$$f_{xy}(\lambda) = A(e^{-i\lambda})\, f_{xx}(\lambda).$$

Thus the squared coherency function is

$$K^2_{xy}(\lambda) = \frac{\left|f_{xy}(\lambda)\right|^2}{f_{xx}(\lambda)\, f_{yy}(\lambda)} = \frac{1}{1 + \mathrm{NSR}(\lambda)},$$

where $\mathrm{NSR}(\lambda) = f_{vv}(\lambda)\big/\left(\left|A(e^{-i\lambda})\right|^2 f_{xx}(\lambda)\right)$ is the noise-to-signal ratio at frequency $\lambda$.
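The step from the spectra to this expression is short; a worked derivation (just algebra using the two formulas above, no extra assumptions) is:

$$K^2_{xy}(\lambda)
= \frac{\left|A(e^{-i\lambda})\right|^2 f_{xx}(\lambda)^2}
       {f_{xx}(\lambda)\left[\left|A(e^{-i\lambda})\right|^2 f_{xx}(\lambda) + f_{vv}(\lambda)\right]}
= \frac{\left|A(e^{-i\lambda})\right|^2 f_{xx}(\lambda)}
       {\left|A(e^{-i\lambda})\right|^2 f_{xx}(\lambda) + f_{vv}(\lambda)}
= \frac{1}{1 + \mathrm{NSR}(\lambda)}.$$

So the coherency falls below 1 exactly in proportion to the noise-to-signal ratio at each frequency.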

Box-Jenkins Parametric Modelling of a Linear Filter

Consider the linear filter of the time series $\{X_t : t \in T\}$:

$$Y_t = \sum_{s=0}^{\infty} a_s X_{t-s} = A(B)\, X_t,$$

where $B$ is the backshift operator ($B X_t = X_{t-1}$) and $A(B) = \sum_{s=0}^{\infty} a_s B^s$ = the transfer function of the filter.

$\{a_t : t \in T\}$ is called the impulse response function of the filter, since if $X_0 = 1$ and $X_t = 0$ for $t \ne 0$, then $Y_t = a_t$ for $t \in T$.

[Diagram: $X_t$ → Linear Filter → $a_t$]

Also,

$$\nabla Y_t = Y_t - Y_{t-1} = A(B)\,X_t - A(B)\,X_{t-1} = A(B)\,\nabla X_t.$$

Note: $\nabla = I - B$ denotes the differencing operator.

Hence $\{\nabla Y_t\}$ and $\{\nabla X_t\}$ are related by the same linear filter.

Definition: the linear filter is said to be stable if $A(B) = \sum_{s=0}^{\infty} a_s B^s$ converges for all $|B| \le 1$.

Discrete Dynamic Models:

Many physical systems whose output is represented by $Y(t)$ are modelled by a linear differential equation relating $Y(t)$ and its derivatives to the forcing function $X(t)$ and its derivatives.

If $X$ and $Y$ are measured at discrete times, this differential equation can be replaced by a difference equation in which derivatives are replaced by differences, where $\nabla = I - B$ denotes the differencing operator.

This difference equation can in turn be represented with the operator $B$:

$$\delta(B)\, Y_t = \omega(B)\, X_{t-b} \quad \text{or} \quad Y_t = \delta^{-1}(B)\,\omega(B)\, X_{t-b},$$

where $\delta(B) = 1 - \delta_1 B - \delta_2 B^2 - \cdots - \delta_r B^r$ and $\omega(B) = \omega_0 + \omega_1 B + \cdots + \omega_s B^s$.

This equation can also be written in the form of a linear filter as

$$Y_t = A(B)\, X_t, \quad A(B) = \delta^{-1}(B)\,\omega(B)\, B^{\,b}.$$

Stability: it can easily be shown that this filter is stable if the roots of $\delta(x) = 0$ lie outside the unit circle.

Determining the Impulse Response function from the Parameters of the Filter:

Now

$$\delta(B)\, A(B) = \omega(B)\, B^{\,b},$$

or

$$(1 - \delta_1 B - \cdots - \delta_r B^r)(a_0 + a_1 B + a_2 B^2 + \cdots) = (\omega_0 + \omega_1 B + \cdots + \omega_s B^s)\, B^{\,b}.$$

Hence:

Equating coefficients of powers of $B$ results in the following conclusions:

$a_j = 0$ for $j < b$;

$a_j - \delta_1 a_{j-1} - \delta_2 a_{j-2} - \cdots - \delta_r a_{j-r} = \omega_{j-b}$, or $a_j = \delta_1 a_{j-1} + \delta_2 a_{j-2} + \cdots + \delta_r a_{j-r} + \omega_{j-b}$, for $b \le j \le b+s$;

$a_j - \delta_1 a_{j-1} - \delta_2 a_{j-2} - \cdots - \delta_r a_{j-r} = 0$, or $a_j = \delta_1 a_{j-1} + \delta_2 a_{j-2} + \cdots + \delta_r a_{j-r}$, for $j > b+s$.

Thus the coefficients of the transfer function, $a_0, a_1, a_2, \ldots$, satisfy the following properties:
1) $b$ zeroes: $a_0 = a_1 = \cdots = a_{b-1} = 0$.
2) No fixed pattern for the next $s - r + 1$ values: $a_b, a_{b+1}, a_{b+2}, \ldots, a_{b+s-r}$.
3) The remaining values $a_{b+s-r+1}, a_{b+s-r+2}, a_{b+s-r+3}, \ldots$ follow the pattern of an $r$th order difference equation: $a_j = \delta_1 a_{j-1} + \delta_2 a_{j-2} + \cdots + \delta_r a_{j-r}$.

Example: $r = 1$, $s = 2$, $b = 3$, $\delta_1 = \delta$.

$a_0 = a_1 = a_2 = 0$
$a_3 = \delta a_2 + \omega_0 = \omega_0$
$a_4 = \delta a_3 + \omega_1 = \delta\omega_0 + \omega_1$
$a_5 = \delta a_4 + \omega_2 = \delta[\delta\omega_0 + \omega_1] + \omega_2 = \delta^2\omega_0 + \delta\omega_1 + \omega_2$
$a_j = \delta a_{j-1}$ for $j \ge 6$.
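As a quick check of this recursion, here is a minimal SAS data step that generates the impulse response weights for this example. It is a sketch, not part of the original slides: the numerical values of delta, w0, w1 and w2 are hypothetical and only illustrate the pattern ($b = 3$ leading zeroes, then geometric decay for $j \ge 6$).

data impulse;
   /* hypothetical parameter values for the example r=1, s=2, b=3 */
   delta = 0.5;  w0 = 1.0;  w1 = 0.6;  w2 = 0.3;
   array a{0:10} a0-a10;
   do j = 0 to 10;
      if j = 0 then a{j} = 0;            /* start of the recursion           */
      else a{j} = delta * a{j-1};        /* a_j = delta * a_(j-1)            */
      if j = 3 then a{j} = a{j} + w0;    /* add omega_0 at lag b = 3         */
      if j = 4 then a{j} = a{j} + w1;    /* add omega_1 at lag b + 1         */
      if j = 5 then a{j} = a{j} + w2;    /* add omega_2 at lag b + s = 5     */
   end;
   output;
run;

proc print data=impulse noobs;
   var a0-a10;
run;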

Transfer function (impulse response) $\{a_t\}$ [figure: plot of the weights $a_t$ against $t$]

Identification of the Box-Jenkins Transfer Model with r=2

Recall that the solution to the second-order difference equation $a_j = \delta_1 a_{j-1} + \delta_2 a_{j-2}$ follows one of the following patterns:
1) A mixture of exponentials, if the roots of $1 - \delta_1 x - \delta_2 x^2 = 0$ are real.
2) A damped cosine wave, if the roots of $1 - \delta_1 x - \delta_2 x^2 = 0$ are complex.
These are the patterns of the impulse response function one looks for when identifying $b$, $r$ and $s$.
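Spelled out (this is the standard solution of a second-order linear difference equation, not something stated on the slide): if $G_1$ and $G_2$ are the reciprocals of the roots of $1 - \delta_1 x - \delta_2 x^2 = 0$, then

$$a_j =
\begin{cases}
A_1 G_1^{\,j} + A_2 G_2^{\,j}, & G_1, G_2 \text{ real and distinct (mixture of exponentials)},\\[4pt]
D\, d^{\,j}\cos(2\pi f j + F), & G_1, G_2 \text{ complex conjugates } d\,e^{\pm i 2\pi f} \text{ (damped cosine wave)},
\end{cases}$$

with the constants $A_1, A_2$ (or $D, F$) determined by the starting values of the recursion.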

Estimation of the Impulse Response Function $a_j$ (without pre-whitening)

Suppose that $\{Y_t : t \in T\}$ and $\{X_t : t \in T\}$ are weakly stationary time series satisfying the following equation:

$$Y_t = \sum_{s=0}^{\infty} a_s X_{t-s} + N_t.$$

Also assume that $\{N_t : t \in T\}$ is a weakly stationary "noise" time series, uncorrelated with $\{X_t : t \in T\}$. Then

$$\sigma_{XY}(h) = \operatorname{Cov}(X_t, Y_{t+h}) = \sum_{s=0}^{\infty} a_s\, \sigma_{XX}(h - s).$$

Suppose that $a_s = 0$ for $s > M$. Then $a_0, a_1, \ldots, a_M$ can be found by solving the following system of equations:

$$\sigma_{XY}(h) = \sum_{s=0}^{M} a_s\, \sigma_{XX}(h - s), \quad h = 0, 1, \ldots, M.$$

If the cross-covariance function $\sigma_{XY}(h)$ and the autocovariance function $\sigma_{XX}(h)$ are unknown, they can be replaced by their sample estimates $C_{XY}(h)$ and $C_{XX}(h)$, yielding estimates $\hat a_0, \hat a_1, \ldots, \hat a_M$ of the impulse response function.

In matrix notation this set of linear equations can be written:

$$\begin{bmatrix} \sigma_{XY}(0) \\ \sigma_{XY}(1) \\ \vdots \\ \sigma_{XY}(M) \end{bmatrix}
=
\begin{bmatrix}
\sigma_{XX}(0) & \sigma_{XX}(1) & \cdots & \sigma_{XX}(M) \\
\sigma_{XX}(1) & \sigma_{XX}(0) & \cdots & \sigma_{XX}(M-1) \\
\vdots & \vdots & & \vdots \\
\sigma_{XX}(M) & \sigma_{XX}(M-1) & \cdots & \sigma_{XX}(0)
\end{bmatrix}
\begin{bmatrix} a_0 \\ a_1 \\ \vdots \\ a_M \end{bmatrix}.$$

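A small PROC IML sketch of this computation (not part of the original slides; the covariance values are made-up numbers and $M = 3$ is chosen only for illustration):

proc iml;
   /* hypothetical sample autocovariances Cxx(0),...,Cxx(3)
      and cross-covariances Cxy(0),...,Cxy(3)               */
   Cxx = {10, 6, 3, 1};
   Cxy = { 4, 5, 3, 2};
   M = nrow(Cxx) - 1;
   G = j(M+1, M+1, 0);
   do h = 0 to M;
      do s = 0 to M;
         G[h+1, s+1] = Cxx[abs(h-s) + 1];   /* Cxx(h-s) = Cxx(|h-s|) */
      end;
   end;
   a_hat = solve(G, Cxy);                   /* estimates a0,...,aM   */
   print a_hat;
quit;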

Estimation of the Impulse Response Function $a_j$ (with pre-whitening)

Suppose that $\{Y_t : t \in T\}$ and $\{X_t : t \in T\}$ are weakly stationary time series satisfying the following equation:

$$Y_t = \sum_{s=0}^{\infty} a_s X_{t-s} + N_t.$$

Also assume that $\{N_t : t \in T\}$ is a weakly stationary "noise" time series, uncorrelated with $\{X_t : t \in T\}$.

In addition, assume that the weakly stationary time series $\{X_t : t \in T\}$ has been identified as an ARMA(p,q) series, estimated, and found to satisfy the following equation:

$$\phi_x(B)\, X_t = \theta_x(B)\, u_t,$$

where $\{u_t : t \in T\}$ is a white noise time series. Then

$$\left[\theta_x(B)\right]^{-1} \phi_x(B)\, X_t = u_t$$

transforms the time series $\{X_t : t \in T\}$ into the white noise time series $\{u_t : t \in T\}$.

This process is called pre-whitening the input series. Applying this transformation to the output series $\{Y_t : t \in T\}$ yields:

or

$$\beta_t = \sum_{s=0}^{\infty} a_s u_{t-s} + \tilde N_t,$$

where $\beta_t = \left[\theta_x(B)\right]^{-1} \phi_x(B)\, Y_t$ and $\tilde N_t = \left[\theta_x(B)\right]^{-1} \phi_x(B)\, N_t$.

In this case the equations for the impulse response function $a_0, a_1, \ldots, a_M$ become (assuming that $a_s = 0$ for $s > M$):

$$\sigma_{u\beta}(h) = \sum_{s=0}^{M} a_s\, \sigma_{uu}(h - s), \quad h = 0, 1, \ldots, M.$$
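This is why pre-whitening is worth the effort, although the slide does not spell it out: because $\{u_t\}$ is white noise, $\sigma_{uu}(h - s) = 0$ unless $s = h$, so the system decouples and each weight can be estimated directly:

$$a_h = \frac{\sigma_{u\beta}(h)}{\sigma_u^2} \quad\Longrightarrow\quad \hat a_h = \frac{C_{u\beta}(h)}{C_{uu}(0)}, \quad h = 0, 1, \ldots, M,$$

with no matrix inversion required.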

Summary: Identification and Estimation of the Box-Jenkins Transfer Model

To identify the model we need to determine $b$, $r$ and $s$. The first steps are to:
1. Compute the sample autocovariance function $C_{XX}(h)$ and the sample cross-covariance function $C_{XY}(h)$.
2. Estimate the impulse response function $\hat a_s$ using the equations given earlier (from $C_{XX}(h)$ and $C_{XY}(h)$, or from the pre-whitened series).

[Figure: the impulse response function $\{a_t\}$, showing $b$ initial zeroes, then $s - r + 1$ values with no fixed pattern, then the pattern of an $r$th order difference equation.]

3. Determine the values of $b$, $r$ and $s$ from the pattern of the impulse response function.

4. Determine preliminary estimates of the Box-Jenkins transfer function parameters using:
   i. for $j > b+s$: $a_j = \delta_1 a_{j-1} + \delta_2 a_{j-2} + \cdots + \delta_r a_{j-r}$ (to estimate the $\delta$'s);
   ii. for $b \le j \le b+s$: $a_j = \delta_1 a_{j-1} + \delta_2 a_{j-2} + \cdots + \delta_r a_{j-r} + \omega_{j-b}$ (to estimate the $\omega$'s).
5. Determine preliminary estimates of the ARMA parameters of the input time series $\{x_t\}$.

6. Determine preliminary estimates of the ARIMA parameters of the noise time series $\{N_t\}$.

Maximum Likelihood estimation of the parameters of the Box-Jenkins Transfer function model

The Box-Jenkins model is written

$$Y_t = \delta^{-1}(B)\,\omega(B)\, X_{t-b} + N_t.$$

The parameters of the model are $\delta_1, \ldots, \delta_r$, $\omega_0, \omega_1, \ldots, \omega_s$ and $b$. In addition there are:
1. the ARMA parameters of the input series $\{x_t\}$;
2. the ARIMA parameters of the noise series $\{N_t\}$.

The model for the noise series $\{N_t\}$ can be written

$$\phi_N(B)\, N_t = \theta_N(B)\, e_t$$

(possibly after differencing $N_t$), where $\{e_t\}$ is white noise.

Given starting values for $\{y_t\}$, $\{x_t\}$ and $\{e_t\}$, and values of the parameters of the transfer function model and the noise model, we can calculate successively the noise values and the innovations $e_t$. The maximum likelihood estimates are the parameter values that minimize the sum of squares $\sum_t e_t^2$.
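One way to organize that successive calculation, using the notation introduced above (the exact recursion on the slide is not shown, so this is a sketch):

$$N_t = y_t - \delta^{-1}(B)\,\omega(B)\, x_{t-b}, \qquad e_t = \left[\theta_N(B)\right]^{-1} \phi_N(B)\, N_t,$$

computed for $t = 1, 2, \ldots, n$ from the starting values, after which $\sum_t e_t^2$ is minimized numerically over all the parameters.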

Fitting a transfer function model. Example: monthly sales ($Y$) and monthly advertising expenditures ($X$).

The Data

Using SAS (available in the Arts computer lab)

The Start up window for SAS

To import data - Choose File -> Import data

The following window appears

Browse for the file to be imported

Identify the file in SAS

The next screen (not important) click Finish

The finishing screen

You can now run an analysis by typing code into the Editor window or by selecting the analysis from the menu. To fit a transfer function model we need to identify the model:
– Determine the order of differencing needed to achieve stationarity.
– Determine the values of $b$, $r$ and $s$.

To determine the degree of differencing we look at the ACFs and PACFs for various orders of differencing.

To produce the ACF and PACF, type the following commands into the Editor window and press the Run button.
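The commands themselves are not legible in this transcript; the following is a typical PROC ARIMA call of the kind described, assuming the imported data set is named sales with variables y (sales) and x (advertising expenditures):

proc arima data=sales;
   /* ACF and PACF of the undifferenced and first-differenced output series */
   identify var=y nlag=24;
   identify var=y(1) nlag=24;
run;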

To identify the transfer function model we need to estimate the impulse response function using the equations given earlier. For this we need the sample autocovariance function of the input series and the sample cross-covariance function of the input with the output.

To produce the cross-correlation function, type the following commands into the Editor window.
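Again the original commands are not shown; a typical call, with the same assumed data set and variable names as above (the crosscorr= option adds the cross-correlations of y with x; first differencing of both series is shown only as an example):

proc arima data=sales;
   identify var=x(1) nlag=24;                    /* ACF of the input series        */
   identify var=y(1) crosscorr=(x(1)) nlag=24;   /* cross-correlations of y with x */
run;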

The impulse response function can then be determined using some other package (e.g. Excel). From its pattern: $b = 4$, $r = 1$, $s = 1$.

To estimate the transfer function model, type the following commands into the Editor window.

To estimate the model

$$y_t = \delta^{-1}(B)\,\omega(B)\, x_{t-b} + N_t,$$

use input=( b $ (ω-lags) / (δ-lags) x ) in the ESTIMATE statement in SAS.
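Put together, a plausible version of the full estimation run (the input-series model orders and the noise-model orders are placeholders chosen for illustration; with $b = 4$, $r = 1$ and $s = 1$ the transfer function part becomes input=( 4 $ (1)/(1) x )):

proc arima data=sales;
   /* model and pre-whiten the input series */
   identify var=x(1);
   estimate p=1 q=1;

   /* cross-correlations (pre-whitened) and the transfer function model */
   identify var=y(1) crosscorr=(x(1));
   estimate input=( 4 $ (1)/(1) x ) p=1 q=1 method=ml;

   forecast lead=12;
run;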

The Output