Introduction to Time Series Analysis

Introduction to Time Series Analysis
Gloria González-Rivera (University of California, Riverside) and Jesús Gonzalo (U. Carlos III de Madrid)
Copyright (© MTS-2002GG): You are free to use and modify these slides for educational purposes, but if you improve this material, please send us your new version.

Brief Review of Probability
Sample Space: Ω, the set of possible outcomes of some random experiment.
Outcome: ω, a single element of the Sample Space.
Event: A, a subset of the Sample Space.
Field: F, the collection of Events we will be considering.
Random Variable: Z, a function from the Sample Space Ω to a State Space S.
State Space: S, a space containing the possible values of a random variable; common choices are the integers, the reals R, k-vectors R^k, the complex numbers C, the positive reals R+, etc.
Probability: P, a set function on F obeying the three axioms that you must know very well.
Distribution: the measure on S defined by μ_Z(B) = P(Z ∈ B), where B ranges over the Borel sets (intervals, etc.).

Brief Review (cont)
Random Vectors: Z = (Z_1, Z_2, ..., Z_n) is an n-dimensional random vector if its components Z_1, ..., Z_n are one-dimensional real-valued random variables. If we interpret t = 1, ..., n as equidistant instants of time, Z_t can stand for the outcome of an experiment at time t. Such a time series may, for example, consist of Toyota share prices Z_t on n succeeding days. The new aspect, compared to a one-dimensional random variable, is that we can now talk about the dependence structure of the random vector.
Distribution function F_Z of Z: it is the collection of the probabilities F_Z(z_1, ..., z_n) = P(Z_1 ≤ z_1, ..., Z_n ≤ z_n).

Stochastic Processes
We suppose that the exchange rate €/$ at every fixed instant t between 5 p.m. and 6 p.m. this afternoon is random. Therefore we can interpret it as a realization Z_t(ω) of the random variable Z_t, and so we observe Z_t(ω) for 17 ≤ t ≤ 18 (time measured in hours). In order to make a guess at 6 p.m. about the exchange rate Z_19(ω) at 7 p.m., it is reasonable to look at the whole evolution of Z_t(ω) between 5 p.m. and 6 p.m. A mathematical model describing this evolution is called a stochastic process.

Stochastic Processes (cont)
A stochastic process is a collection of time-indexed random variables Z_t(ω) defined on some sample space Ω. Note that:
(1) For a fixed t, Z_t(·) is just a random variable.
(2) For a fixed ω, Z_·(ω) is a realization or sample function.
Changing the time index, we can generate several random variables Z_{t1}, Z_{t2}, ..., Z_{tn}, ... [Add graph to generate intuition], from which a realization is z_{t1}, z_{t2}, ..., z_{tn}, ...
This collection of random variables {Z_t(ω), t ∈ T} is called a STOCHASTIC PROCESS. A realization of the stochastic process is called a TIME SERIES.

Examples of stochastic processes
E1: Let the index set be T = {1, 2, 3} and let the space of outcomes Ω be the possible outcomes associated with tossing one die: Ω = {1, 2, 3, 4, 5, 6}. Define Z(t, ω) = t + [value on die]² · t. Therefore, for a particular ω, say ω = 3, the realization or path would be (10, 20, 30).
Q1: Draw all the different realizations (six) of this stochastic process (see the sketch below).
Q2: Think of an economically relevant variable as a stochastic process and write down an example similar to E1 with it. Specify very clearly the sample space and the "rule" that generates the stochastic process.
E2: A Brownian motion B = (B_t, t ∈ [0, ∞)):
It starts at zero: B_0 = 0.
It has stationary, independent increments.
For every t > 0, B_t has a normal N(0, t) distribution.
It has continuous sample paths: "no jumps".
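As a quick sanity check of E1, here is a minimal Python sketch (the variable names are ours, not from the slides) that enumerates all six realizations of the process:

```python
# E1: Z(t, w) = t + (value on die)^2 * t, for t in {1, 2, 3} and die value w in {1, ..., 6}.
for w in range(1, 7):
    path = [t + w**2 * t for t in (1, 2, 3)]
    print(f"w = {w}: realization {path}")
# For w = 3 this prints [10, 20, 30], matching the path given in E1.
```

Each value of w picks one realization (a path over t = 1, 2, 3); fixing t instead and letting w vary gives a random variable, exactly as described in the previous slide.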

Distribution of a Stochastic Process
In analogy to random variables and random vectors, we want to introduce non-random characteristics of a stochastic process, such as its distribution, expectation, etc., and describe its dependence structure. This is a task much more complicated than the description of a random vector. Indeed, a non-trivial stochastic process Z = (Z_t, t ∈ T) with infinite index set T is an infinite-dimensional object; it can be understood as the infinite collection of the random variables Z_t, t ∈ T. Since the values of Z are functions on T, the distribution of Z should be defined on subsets of a certain "function space", i.e. P(Z ∈ A), A ∈ F, where F is a collection of suitable subsets of this space of functions. This approach is possible, but requires advanced mathematics, so we will try something simpler.
The finite-dimensional distributions (fidis) of the stochastic process Z are the distributions of the finite-dimensional vectors (Z_{t1}, ..., Z_{tn}), for all possible choices of times t_1, ..., t_n ∈ T and every n ≥ 1.

Stationarity
Consider the joint probability distribution of the collection of random variables (Z_{t1}, Z_{t2}, ..., Z_{tn}): F(z_{t1}, z_{t2}, ..., z_{tn}) = P(Z_{t1} ≤ z_{t1}, ..., Z_{tn} ≤ z_{tn}).
1st-order stationary process if F(z_{t1}) = F(z_{t1+k}) for any t_1 and k.
2nd-order stationary process if F(z_{t1}, z_{t2}) = F(z_{t1+k}, z_{t2+k}) for any t_1, t_2 and k.
n-order stationary process if F(z_{t1}, ..., z_{tn}) = F(z_{t1+k}, ..., z_{tn+k}) for any t_1, ..., t_n and k.
Definition. A process is strongly (strictly) stationary if it is an n-order stationary process for any n.

Moments
Mean function: μ_t = E(Z_t).
Variance function: σ_t² = Var(Z_t) = E[(Z_t - μ_t)²].
Autocovariance function: γ(t_1, t_2) = Cov(Z_{t1}, Z_{t2}) = E[(Z_{t1} - μ_{t1})(Z_{t2} - μ_{t2})].
Autocorrelation function: ρ(t_1, t_2) = γ(t_1, t_2) / (σ_{t1} σ_{t2}).

Moments (cont)
For a strictly stationary process: μ_t = μ and σ_t² = σ² for all t, because F(z_t) = F(z_{t+k}), provided that E|Z_t| < ∞ and E(Z_t²) < ∞.
Moreover, since F(z_{t1}, z_{t2}) = F(z_{t1+k}, z_{t2+k}), taking k = -t_1 gives γ(t_1, t_2) = γ(0, t_2 - t_1): the correlation between any two random variables depends only on the time difference.

Weak Stationarity
A process is said to be n-order weakly stationary if all its joint moments up to order n exist and are time invariant.
Covariance stationary process (2nd-order weakly stationary):
constant mean: E(Z_t) = μ
constant variance: Var(Z_t) = σ²
covariance that depends only on the time difference between the two random variables: Cov(Z_t, Z_{t+k}) = γ_k
[Write graphs showing a process with time-variant mean, and a process with time-variant variance]

Autocovariance and Autocorrelation Functions
For a covariance stationary process:
Autocovariance function: γ_k = Cov(Z_t, Z_{t+k}) = E[(Z_t - μ)(Z_{t+k} - μ)].
Autocorrelation function: ρ_k = γ_k / γ_0 = Corr(Z_t, Z_{t+k}).
[Make a picture of the autocorrelogram: the plot of ρ_k against the lag k; see the sketch below]
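A minimal numpy sketch of the sample autocorrelogram (the function name and the division by T, a common convention for the sample autocovariance, are our choices, not from the slides):

```python
import numpy as np

def sample_acf(z, max_lag):
    """Sample autocorrelations rho_k = gamma_k / gamma_0 for k = 0, ..., max_lag."""
    z = np.asarray(z, dtype=float)
    zc = z - z.mean()                      # center the series
    T = len(z)
    gamma = np.array([np.sum(zc[: T - k] * zc[k:]) / T for k in range(max_lag + 1)])
    return gamma / gamma[0]

# Example: the autocorrelogram of simulated Gaussian white noise is close to zero at all lags k >= 1.
rng = np.random.default_rng(0)
print(sample_acf(rng.standard_normal(500), max_lag=5))
```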

Properties of the autocorrelation function
ρ_0 = 1.
|ρ_k| ≤ 1 for all k.
Symmetry: γ_k = γ_{-k} and ρ_k = ρ_{-k}, so the autocorrelogram is usually plotted only for k ≥ 0.
The autocovariance (and autocorrelation) matrix of (Z_{t1}, ..., Z_{tn}) is positive semidefinite.

Partial Autocorrelation Function (conditional correlation)
This function gives the correlation between two random variables that are k periods apart once the in-between linear dependence (between t and t + k) has been removed.
Motivation: think about a regression model (without loss of generality, assume that E(Z_t) = 0):
Z_{t+k} = φ_{k1} Z_{t+k-1} + φ_{k2} Z_{t+k-2} + ... + φ_{kk} Z_t + e_{t+k}.
The last coefficient, φ_{kk}, is the partial autocorrelation of order k. Multiplying both sides by Z_{t+k-j} and taking expectations gives
γ_j = φ_{k1} γ_{j-1} + φ_{k2} γ_{j-2} + ... + φ_{kk} γ_{j-k}.

Dividing by the variance of the process, γ_0:
ρ_j = φ_{k1} ρ_{j-1} + φ_{k2} ρ_{j-2} + ... + φ_{kk} ρ_{j-k}, for j = 1, ..., k.
These are the Yule-Walker equations. Solving them for k = 1, 2, 3, ... and keeping φ_{kk} each time gives the partial autocorrelation function (see the sketch below).
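A small numpy sketch (function name ours) that solves the Yule-Walker system at each order k and reads off φ_kk, i.e. the partial autocorrelation, from a given autocorrelation function:

```python
import numpy as np

def pacf_from_acf(rho):
    """Partial autocorrelations phi_kk from the Yule-Walker equations:
    for each k, solve R phi = r, where R[i, j] = rho_|i-j| and r = (rho_1, ..., rho_k);
    phi_kk is the last element of the solution."""
    rho = np.asarray(rho, dtype=float)
    pacf = []
    for k in range(1, len(rho)):
        R = np.array([[rho[abs(i - j)] for j in range(k)] for i in range(k)])
        phi = np.linalg.solve(R, rho[1:k + 1])
        pacf.append(phi[-1])
    return np.array(pacf)

# Check with the theoretical ACF of an AR(1) with coefficient 0.7:
# the PACF should be 0.7 at lag 1 and (numerically) zero afterwards.
print(pacf_from_acf(0.7 ** np.arange(6)))
```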

Examples of stochastic processes
E4: Z_t = Y_t if t is even and Z_t = Y_{t+1} if t is odd, where Y_t is a stationary time series. Is Z_t weakly stationary?
E5: Define the process S_t = X_1 + ... + X_t, where the X_i are iid (0, σ²). Show that for h > 0, Cov(S_{t+h}, S_t) = t σ², and therefore S_t is not weakly stationary (see the short derivation below).
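For E5, the claim follows from a one-line covariance computation (a sketch; it only uses Cov(X_i, X_j) = σ² when i = j and 0 otherwise):

```latex
\operatorname{Cov}(S_{t+h}, S_t)
  = \operatorname{Cov}\!\Big(\sum_{i=1}^{t+h} X_i,\; \sum_{j=1}^{t} X_j\Big)
  = \sum_{i=1}^{t+h}\sum_{j=1}^{t} \operatorname{Cov}(X_i, X_j)
  = \sum_{j=1}^{t} \sigma^2
  = t\,\sigma^2 ,
```

which depends on t, so S_t cannot be covariance stationary.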

Examples of stochastic processes (cont)
E6: White Noise Process. A sequence of uncorrelated random variables with zero mean and constant variance σ² is called a white noise process.
[Figure: autocorrelogram of a white noise process: ρ_0 = 1 and ρ_k = 0 for lags k = 1, 2, 3, 4, ...]

Dependence: Ergodicity
See Reading 1, from Leo Breiman (1969), "Probability and Stochastic Processes: With a View Toward Applications".
We want to allow as much dependence as the Law of Large Numbers (LLN) will let us. Stationarity is not enough, as the following example shows:
E7: Let {U_t} be a sequence of iid random variables uniformly distributed on [0, 1], and let Z be N(0, 1), independent of {U_t}. Define Y_t = Z + U_t. Then Y_t is stationary (why?), but the time average (1/T) Σ_{t=1}^T Y_t converges to Z + 1/2, not to the ensemble mean E(Y_t) = 1/2 (see the simulation sketch below).
The problem is that there is too much dependence in the sequence {Y_t}: in fact, the correlation between Y_1 and Y_t is positive for every value of t.
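A short simulation sketch of E7 (the seed and the sample size are arbitrary choices of ours):

```python
import numpy as np

# E7: Y_t = Z + U_t, where Z ~ N(0, 1) is drawn once and shared by the whole path,
# and the U_t are iid Uniform[0, 1].  The time average converges to Z + 1/2, a random
# limit, rather than to the ensemble mean E(Y_t) = 1/2: stationary but not ergodic.
rng = np.random.default_rng(0)
T = 100_000
Z = rng.standard_normal()               # one draw for the entire realization
Y = Z + rng.uniform(0.0, 1.0, size=T)
print("time average of Y_t :", Y.mean())
print("Z + 1/2             :", Z + 0.5)
print("ensemble mean E(Y_t):", 0.5)
```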

Ergodicity for the mean
Objective: estimate the mean of the process, μ = E(Z_t).
Need to distinguish between:
1. Ensemble average: the average across N different realizations of the process at the same time t, (1/N) Σ_{i=1}^N Z_t^(i).
2. Time average: the average over time within a single realization, z̄_T = (1/T) Σ_{t=1}^T Z_t.
Which estimator is the more appropriate? The ensemble average is the natural one, but in practice it is impossible to calculate because we usually observe only one realization of the process. Under which circumstances can we use the time average instead? Is the time average an unbiased and consistent estimator of the mean?

Ergodicity for the mean (cont)
Reminder. Sufficient conditions for consistency of an estimator: asymptotic unbiasedness and a variance that goes to zero.
1. The time average is asymptotically unbiased: for a covariance stationary process, E(z̄_T) = (1/T) Σ_{t=1}^T E(Z_t) = μ.
2. The time average is consistent for the mean if, in addition, Var(z̄_T) → 0 as T → ∞.

Ergodicity for the mean (cont)
A covariance-stationary process is ergodic for the mean if the time average converges in probability to the ensemble mean: z̄_T = (1/T) Σ_{t=1}^T Z_t → μ as T → ∞.
A sufficient condition for ergodicity for the mean is that the autocovariances are absolutely summable: Σ_{k=0}^∞ |γ_k| < ∞ (see the sketch of the argument below).
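A sketch of the standard argument for why absolute summability is enough (it simply bounds the variance of the time average):

```latex
\operatorname{Var}(\bar{Z}_T)
  = \frac{1}{T^{2}} \sum_{t=1}^{T} \sum_{s=1}^{T} \gamma_{|t-s|}
  = \frac{1}{T} \sum_{k=-(T-1)}^{T-1} \Big(1 - \frac{|k|}{T}\Big) \gamma_{|k|}
  \;\le\; \frac{1}{T} \sum_{k=-\infty}^{\infty} \big|\gamma_{|k|}\big|
  \;\xrightarrow[T \to \infty]{}\; 0 ,
```

so E(z̄_T) = μ together with Var(z̄_T) → 0 gives consistency, i.e. ergodicity for the mean.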

Ergodicity for second moments
A sufficient condition to ensure ergodicity for second moments is that the autocovariances are absolutely summable: Σ_{k=0}^∞ |γ_k| < ∞.
Ergodicity under Gaussianity
If {Z_t} is a stationary Gaussian process, absolute summability of the autocovariances is sufficient to ensure ergodicity for all moments.

Where are We?
The Prediction Problem as a motivating problem: predict Z_{t+1} given some information set I_t available at time t. The conditional expectation E(Z_{t+1} | I_t) can be modeled in a parametric way or in a non-parametric way; in this course we choose the former. Parametric models can be linear or non-linear; again, we choose the former. Summarizing: the models we are going to study and use in this course are parametric and linear.

Some Problems
P1: Let {Z_t} be a sequence of uncorrelated real-valued variables with zero means and unit variances, and define the "moving average" Y_t = a_0 Z_t + a_1 Z_{t-1} + ... + a_r Z_{t-r} for constants a_0, a_1, ..., a_r. Show that Y is weakly stationary and find its autocovariance function.
P2: Show that a Gaussian process is strongly stationary if and only if it is weakly stationary.
P3: Let X be a stationary Gaussian process with zero mean, unit variance, and autocovariance function c. Find the autocovariance functions of the process.

Appendix: Transformations
Goal: to obtain a more manageable process.
Log transformation: reduces certain types of heteroskedasticity. If we assume m_t = E(X_t) and V(X_t) = k · m_t², the delta method shows that the variance of the log is roughly constant, V(log X_t) ≈ k (see the sketch below).
Differencing: eliminates the trend (but is not very informative about the nature of the trend).
Differencing + Log ≈ relative change.
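The delta-method step behind the first claim, and the link between the differenced log and the relative change (a sketch based on first-order Taylor expansions):

```latex
\log X_t \approx \log m_t + \frac{X_t - m_t}{m_t}
\;\;\Rightarrow\;\;
\operatorname{Var}(\log X_t) \approx \frac{\operatorname{Var}(X_t)}{m_t^{2}}
  = \frac{k\, m_t^{2}}{m_t^{2}} = k ,
\qquad
\Delta \log X_t
  = \log\!\Big(1 + \frac{X_t - X_{t-1}}{X_{t-1}}\Big)
  \approx \frac{X_t - X_{t-1}}{X_{t-1}} .
```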