STAT 497 APPLIED TIME SERIES ANALYSIS

INTRODUCTION

DEFINITION A stochastic process is a collection of random variables, that is, a process that develops in time according to probabilistic laws. The theory of stochastic processes gives us a formal way to look at time series variables.

DEFINITION For a fixed t, Y(ω, t) is a random variable. For a given ω, Y(ω, t), viewed as a function of t, is called a sample function or a realization. The index t belongs to an index set, and ω belongs to the sample space.

DEFINITION A time series is a realization, or sample function, from a certain stochastic process. A time series is a set of observations generated sequentially in time. Therefore, the observations are dependent on each other, which means that we do NOT have a random sample. We assume that observations are equally spaced in time. We also assume that closer observations tend to have stronger dependence.

DEFINITION A discrete time series is one in which the set T0 of times at which observations are made is a discrete set. A continuous time series is obtained when observations are recorded continuously over some time interval.

EXAMPLES Data in business, economics, engineering, the environment, medicine, earth sciences, and other areas of scientific investigation are often collected in the form of time series: hourly temperature readings, daily stock prices, weekly traffic volume, annual growth rates, seasonal ice cream consumption, electrical signals.

EXAMPLES (figures in the original slides: plots of example time series, including sunspot numbers)

OBJECTIVES OF TIME SERIES ANALYSIS
- Understanding the dynamic or time-dependent structure of the observations of a single series (univariate analysis)
- Forecasting future observations
- Ascertaining the leading, lagging, and feedback relationships among several series (multivariate analysis)

STEPS IN TIME SERIES ANALYSIS
1. Model Identification
   - Time series plot of the series
   - Check for the existence of a trend or seasonality
   - Check for sharp changes in behavior
   - Check for possible outliers
   - Remove the trend and the seasonal component to get stationary residuals
2. Estimation
   - MME (method of moments estimation)
   - MLE (maximum likelihood estimation)
3. Diagnostic Checking
   - Normality of the error terms
   - Independence of the error terms
   - Constant error variance (homoscedasticity)
4. Forecasting
   - Exponential smoothing methods
   - Minimum MSE forecasting
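The MME step above can be illustrated with a minimal sketch. Assuming an AR(1) model Yt = φYt−1 + et (a hypothetical choice; the slide does not name a specific model), equating the population lag-1 autocorrelation (which equals φ) to its sample counterpart gives the moment estimator φ̂ = r1:

```python
import random

def lag1_autocorr(y):
    """Sample lag-1 autocorrelation r1 of a series."""
    n = len(y)
    mean = sum(y) / n
    c0 = sum((v - mean) ** 2 for v in y) / n  # sample gamma_0
    c1 = sum((y[t] - mean) * (y[t - 1] - mean) for t in range(1, n)) / n
    return c1 / c0

# Simulate an AR(1) path with true phi = 0.6, then estimate phi by MME.
random.seed(42)
phi = 0.6
y = [0.0]
for _ in range(5000):
    y.append(phi * y[-1] + random.gauss(0, 1))

phi_hat = lag1_autocorr(y)
print(round(phi_hat, 2))  # should be close to the true phi = 0.6
```

For an AR(1) this moment estimator coincides with the Yule–Walker estimator; MLE would instead maximize the Gaussian likelihood of the whole sample.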

CHARACTERISTICS OF A SERIES For a time series {Yt}:
THE MEAN FUNCTION: μt = E(Yt), the expected value of the process at time t.
THE VARIANCE FUNCTION: σt² = Var(Yt) = E[(Yt − μt)²], which exists iff E(Yt²) < ∞.

CHARACTERISTICS OF A SERIES
THE AUTOCOVARIANCE FUNCTION: γ(t, s) = Cov(Yt, Ys) = E[(Yt − μt)(Ys − μs)], the covariance between the value at time t and the value at time s of the stochastic process Yt.
THE AUTOCORRELATION FUNCTION: ρ(t, s) = γ(t, s) / √(Var(Yt)·Var(Ys)), the correlation of the series with itself.
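The sample versions of these two functions can be sketched as plain Python helpers (hypothetical code, not part of the notes), using the divide-by-n convention:

```python
def sample_autocov(y, k):
    """Sample autocovariance at lag k: mean of (y_t - ybar)(y_{t+k} - ybar)."""
    n = len(y)
    ybar = sum(y) / n
    return sum((y[t] - ybar) * (y[t + k] - ybar) for t in range(n - k)) / n

def sample_autocorr(y, k):
    """Sample autocorrelation r_k = gamma_k_hat / gamma_0_hat."""
    return sample_autocov(y, k) / sample_autocov(y, 0)

y = [1.0, 2.0, 3.0, 4.0, 5.0]
print(sample_autocorr(y, 0))  # 1.0 by construction at lag 0
print(sample_autocorr(y, 1))  # 0.4 for this short increasing series
```

At lag 0 the autocorrelation is always 1, since γ(t, t) = Var(Yt).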

EXAMPLE Moving average process: Let et ~ i.i.d.(0, 1), and Xt = et + 0.5et−1.

EXAMPLE RANDOM WALK: Let e1, e2, … be a sequence of i.i.d. rvs with mean 0 and variance σe². The observed time series is obtained as Yt = e1 + e2 + … + et, i.e., Yt = Yt−1 + et.
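Since Yt is a sum of t independent terms, Var(Yt) = t·σe², so the variance grows with t, and the random walk cannot be stationary. A small ensemble simulation (a sketch with σe² = 1, not from the slides) makes this visible:

```python
import random

# Simulate many random-walk paths and estimate Var(Y_t) at t = 10 and t = 100.
# Theory predicts Var(Y_t) = t * sigma_e^2 = t for unit-variance noise.
random.seed(1)
REPS = 4000
var_at = {10: [], 100: []}
for _ in range(REPS):
    s = 0.0
    for t in range(1, 101):
        s += random.gauss(0, 1)  # Y_t = Y_{t-1} + e_t
        if t in var_at:
            var_at[t].append(s)

def var(v):
    m = sum(v) / len(v)
    return sum((x - m) ** 2 for x in v) / len(v)

v10, v100 = var(var_at[10]), var(var_at[100])
print(round(v10), round(v100))  # roughly 10 and 100
```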

A RULE ON THE COVARIANCE If c1, c2, …, cm and d1, d2, …, dn are constants and t1, t2, …, tm and s1, s2, …, sn are time points, then
Cov(Σi ci Yti, Σj dj Ysj) = Σi Σj ci dj Cov(Yti, Ysj).
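This bilinearity rule holds exactly for sample covariances as well, so it can be checked numerically. A sketch with toy data (the series values and coefficients below are made up for illustration):

```python
def cov(a, b):
    """Sample covariance (divide-by-n convention)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / n

# Toy 'observations' of the series at three time points.
Y = {1: [1.0, 2.0, 4.0], 2: [0.0, 1.0, 3.0], 3: [2.0, 2.0, 5.0]}
c, t_idx = [2.0, -1.0], [1, 2]   # coefficients and time points for the left sum
d, s_idx = [3.0, 0.5], [2, 3]    # coefficients and time points for the right sum

# Left-hand side: covariance of the two linear combinations, computed directly.
lhs_a = [c[0] * Y[1][k] + c[1] * Y[2][k] for k in range(3)]
lhs_b = [d[0] * Y[2][k] + d[1] * Y[3][k] for k in range(3)]
lhs = cov(lhs_a, lhs_b)

# Right-hand side: the double sum of pairwise covariances from the rule.
rhs = sum(ci * dj * cov(Y[ti], Y[sj])
          for ci, ti in zip(c, t_idx)
          for dj, sj in zip(d, s_idx))
print(abs(lhs - rhs) < 1e-9)  # True: both sides agree
```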

JOINT PDF OF A TIME SERIES Remember that a random variable Y is characterized by its cumulative distribution function FY(y) = P(Y ≤ y) and, in the continuous case, by its probability density function fY(y) = dFY(y)/dy.

JOINT PDF OF A TIME SERIES For the observed time series, say we have two time points, t and s. The marginal pdfs: ft(yt) and fs(ys). The joint pdf: ft,s(yt, ys).

JOINT PDF OF A TIME SERIES Since we have only one observation for each r.v. Yt, inference is too complicated if the distributions (or moments) change for all t (i.e., change over time). So, we need a simplification.

JOINT PDF OF A TIME SERIES To be able to identify the structure of the series, we need the joint pdf of Y1, Y2,…, Yn. However, we have only one sample. That is, one observation from each random variable. Therefore, it is very difficult to identify the joint distribution. Hence, we need an assumption to simplify our problem. This simplifying assumption is known as STATIONARITY.

STATIONARITY The most vital and common assumption in time series analysis. The basic idea of stationarity is that the probability laws governing the process do not change with time. The process is in statistical equilibrium.

TYPES OF STATIONARITY STRICT (STRONG OR COMPLETE) STATIONARY PROCESS: Consider a finite set of r.v.s Yt1, Yt2, …, Ytn from a stochastic process. The n-dimensional distribution function is defined by FYt1,…,Ytn(y1, …, yn) = P(Yt1 ≤ y1, …, Ytn ≤ yn), where yi, i = 1, 2, …, n are any real numbers.

STRONG STATIONARITY A process is said to be first order stationary in distribution if its one-dimensional distribution function is time-invariant, i.e., FYt1(y1) = FYt1+k(y1) for any t1 and k; second order stationary in distribution if FYt1,Yt2(y1, y2) = FYt1+k,Yt2+k(y1, y2) for any t1, t2 and k; and n-th order stationary in distribution if FYt1,…,Ytn(y1, …, yn) = FYt1+k,…,Ytn+k(y1, …, yn) for any t1, …, tn and k.

n-th order stationarity in distribution (for all n) = strong stationarity. Shifting the time origin by an amount k has no effect on the joint distribution, which must therefore depend only on the time intervals between t1, t2, …, tn, not on absolute time t.

STRONG STATIONARITY So, for a strong stationary process:
i) E(Yt) = μ: the expected value of the series is constant over time, not a function of time.
ii) Var(Yt) = σ²: the variance of the series is constant over time (homoscedastic).
iii) γ(t, s) = Cov(Yt, Ys) is not constant and does not depend on absolute time, but only on the time interval |t − s|, which we call the "lag", k.
iv) ρ(t, s) = Corr(Yt, Ys) likewise depends only on the lag k.

STRONG STATIONARITY (figure: sample paths of Yt with the same mean and variance σ² at every time t; the dependence between values is affected only by the time lag k)

STRONG STATIONARITY v) Setting t = t − k and s = t, γ(t − k, t) = Cov(Yt−k, Yt) = γk, a function of the lag k only. It is usually impossible to verify a distribution, particularly a joint distribution function, from an observed time series. So, we use a weaker sense of stationarity.

WEAK STATIONARITY WEAK (COVARIANCE) STATIONARITY, OR STATIONARITY IN THE WIDE SENSE: A time series is said to be covariance stationary if its first and second order moments are unaffected by a change of time origin. That is, it has constant mean and variance, with covariance and correlation being functions of the time difference only.

WEAK STATIONARITY From now on, when we say "stationary", we mean weak stationarity.

EXAMPLE Consider a time series {Yt} where Yt = et and et ~ i.i.d.(0, σ²). Is the process stationary?

EXAMPLE MOVING AVERAGE: Suppose that {Yt} is constructed as a moving average of the noise terms {et}, where et ~ i.i.d.(0, σ²). Is the process {Yt} stationary?

EXAMPLE RANDOM WALK: Yt = Yt−1 + et, where et ~ i.i.d.(0, σ²). Is the process {Yt} stationary?

EXAMPLE Suppose that the time series has the form shown, where a and b are constants and {et} is a weakly stationary process with mean 0 and autocovariance function γk. Is {Yt} stationary?

EXAMPLE where et ~ i.i.d.(0, σ²). Is the process {Yt} stationary?

STRONG VERSUS WEAK STATIONARITY Strict stationarity means that the joint distribution depends only on the 'difference' h, not on the times (t1, …, tk). Finite variance is not assumed in the definition of strong stationarity; therefore, strict stationarity does not necessarily imply weak stationarity. For example, an i.i.d. Cauchy process is strictly stationary but not weakly stationary. A nonlinear function of a strictly stationary variable is still strictly stationary, but this is not true for weak stationarity; for example, the square of a covariance stationary process may not have finite variance. Weak stationarity usually does not imply strict stationarity, as higher moments of the process may depend on time t.

STRONG VERSUS WEAK STATIONARITY If the process {Xt} is a Gaussian time series, which means that the distribution functions of {Xt} are all multivariate normal, weak stationarity also implies strict stationarity. This is because a multivariate normal distribution is fully characterized by its first two moments.

STRONG VERSUS WEAK STATIONARITY For example, white noise is stationary but may not be strictly stationary, whereas Gaussian white noise is strictly stationary. Also, general white noise only implies uncorrelatedness, while Gaussian white noise also implies independence, because for a Gaussian process uncorrelatedness implies independence. Therefore, a Gaussian white noise is just i.i.d. N(0, σ²).
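Uncorrelatedness of Gaussian white noise is easy to see in a simulation: for an i.i.d. N(0, 1) sample, all sample autocorrelations at nonzero lags should fall close to zero, roughly within ±2/√n. A minimal sketch (illustrative, not from the slides):

```python
import random

# Simulate Gaussian white noise and compute sample autocorrelations r_k.
random.seed(7)
n = 5000
e = [random.gauss(0, 1) for _ in range(n)]
mean = sum(e) / n
c0 = sum((v - mean) ** 2 for v in e) / n

def r(k):
    """Sample autocorrelation of the simulated noise at lag k."""
    return sum((e[t] - mean) * (e[t - k] - mean) for t in range(k, n)) / n / c0

print([round(r(k), 3) for k in range(1, 6)])  # all entries near zero
print(max(abs(r(k)) for k in range(1, 6)) < 0.1)  # True: no visible dependence
```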

STATIONARITY AND NONSTATIONARITY Stationary and nonstationary processes are very different in their properties, and they require different inference procedures. We will discuss this in detail throughout this course.