
1 Power 2 Econ 240C

2 Lab 1 Retrospective Exercise: –GDP_CAN = a + b*GDP_CAN(-1) + e –GDP_FRA = a + b*GDP_FRA(-1) + e

6 Data in Excel (table columns: Year, GDP_CAN, GDP_CAN(-1), C_CAN; values not recovered)

7 Stacking So for stacking, the data start with 1951.

8 Data in Excel (table columns: year, GDP_CAN, GDP_CAN(-1), C_CAN; values not recovered)

9 Stacking So the dependent variable starts with gdp_can(1951) and goes through gdp_can(1992). Then the next value in the stack is gdp_fra(1951), and the data continue, ending with gdp_fra(1992). The independent variable for Canada starts with gdp_can(1950) and goes through gdp_can(1991). Then the rest of the stack is 42 zeros.

10 Stacking The independent variable for France starts with a stack of 42 zeros. Then the next observation is gdp_fra(1950), the following is gdp_fra(1951), etc., ending with gdp_fra(1991). The constant stack for Canada is 42 ones followed by 42 zeros. The constant stack for France is 42 zeros followed by 42 ones.
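The stacking scheme described on these two slides can be sketched in NumPy. This is an illustrative sketch only: the GDP series below are random stand-ins for the real annual data (1950-1992), not the actual Canadian and French figures.

```python
# Hypothetical sketch of the stacked design matrix described above.
# gdp_can and gdp_fra are made-up stand-ins for 43 annual values, 1950-1992.
import numpy as np

rng = np.random.default_rng(0)
gdp_can = rng.normal(size=43)
gdp_fra = rng.normal(size=43)

n = 42  # observations per country, 1951-1992
y = np.concatenate([gdp_can[1:], gdp_fra[1:]])       # stacked dependent variable
x_can = np.concatenate([gdp_can[:-1], np.zeros(n)])  # lagged CAN values, then 42 zeros
x_fra = np.concatenate([np.zeros(n), gdp_fra[:-1]])  # 42 zeros, then lagged FRA values
c_can = np.concatenate([np.ones(n), np.zeros(n)])    # Canada constant: 42 ones, 42 zeros
c_fra = np.concatenate([np.zeros(n), np.ones(n)])    # France constant: 42 zeros, 42 ones

X = np.column_stack([c_can, c_fra, x_can, x_fra])
print(y.shape, X.shape)  # (84,) (84, 4)
```

Regressing y on X then estimates the country-specific constants and slopes in one pooled equation, which is the point of the stacking exercise.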

13 Outline Time Series Concepts –Inertial models –Conceptual time series components –Simulation and synthesis –Simulated white noise, wn(t) –Spreadsheet, trace, and histogram of wn(t) –Independence of wn(t)

14 Univariate Time Series Concepts Inertial models: Predicting the future from own past behavior –Example: trend models –Other example: autoregressive moving average (ARMA) models –Assumption: underlying structure and forces have not changed

15 Conceptual time series components model Time series = trend + seasonal + cycle + random Example: linear trend model –Y(t) = a + b*t + e(t) Example: linear trend with seasonal –Y(t) = a + b*t + c1*Q1(t) + c2*Q2(t) + c3*Q3(t) + e(t)
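The trend-plus-seasonal model can be fit by OLS with quarterly dummy variables. A minimal sketch on simulated quarterly data (the series and its coefficients are invented for illustration, not taken from the lecture):

```python
# Illustrative fit of Y(t) = a + b*t + c1*Q1(t) + c2*Q2(t) + c3*Q3(t) + e(t)
# on simulated quarterly data, using plain NumPy least squares.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(80)                        # 20 years of quarterly observations
q = t % 4                                # quarter index 0..3
seas = np.array([0.0, 2.0, -1.0, 0.5])   # true seasonal effects (quarter 0 is base)
y = 10 + 0.3 * t + seas[q] + rng.normal(scale=0.5, size=80)

# Design matrix: constant, trend, dummies for quarters 1-3
X = np.column_stack([np.ones(80), t, q == 1, q == 2, q == 3]).astype(float)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef.round(2))  # roughly [10, 0.3, 2, -1, 0.5]
```

The dummy for the base quarter is omitted so the design matrix is not collinear with the constant; the estimated dummies measure seasonal effects relative to that base quarter.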

16 How to model the cycle? We have learned how to model: –Trend: linear and exponential –Seasonality: dummy variables –Error: e.g. autoregressive How do you model the cyclical component?

17 Cyclical time series behavior Many economic time series vary with the business cycle Model the cycle using ARMA models That is what the first half of 240C is all about

18 Simulation and Synthesis Build ARMA models from white noise in a process called synthesis The idea is to start with a time series of simple structure, white noise, and build ARMA models by transforming it

19 Simulated white noise Generate a sequence of values drawn from a normal distribution with mean zero and variance one, i.e. N(0, 1) In EViews: genr wn = nrnd
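The EViews one-liner above has a direct NumPy equivalent; a sketch:

```python
# Simulate white noise: independent draws from N(0, 1).
import numpy as np

rng = np.random.default_rng(42)
wn = rng.standard_normal(1000)   # 1000 draws from N(0, 1)
print(wn.mean(), wn.std())       # both should be close to 0 and 1
```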

20 The first ten values of simulated white noise, wn(t) (table: draw number = time index, value; values not recovered)

21 Trace (plot) of first 100 values of wn(t) No obvious time dependence, i.e. stationary: not trended, not seasonal

22 Histogram and Statistics, 1000 Obs.

23 Independence We know each drawn value is from the same distribution, i.e. N(0,1) We know every value drawn is independent of all other values So wn(t) should be iid: independent and identically distributed

24 Independence: conceptual Suppose the mean series, m(t), of white noise is zero, i.e. E wn(t) = m(t) = 0 This is a reasonable supposition because every generated value has expectation zero, since it is drawn from N(0,1) Then E[wn(t)*wn(t-1)] = 0, i.e. each value is independent of the previous, or lagged, value

25 Independence: conceptual In general, cov[wn(t), wn(t-u)], where wn(t-u) is lagged u periods from t, is defined as cov[wn(t), wn(t-u)] = E{[wn(t) – E wn(t)]*[wn(t-u) – E wn(t-u)]} = E[wn(t)*wn(t-u)], since E wn(t) = 0 This is called the autocovariance, i.e. the covariance of white noise with lagged values of itself

26 Independence: conceptual For every value of the lag except zero, the autocovariance function of white noise is zero, by independence At lag zero, the autocovariance of white noise is just its variance, equal to one: cov[wn(t), wn(t)] = E[wn(t)*wn(t)] = 1

27 Independence: conceptual The autocovariance function can be standardized, i.e. made free of units or scale, by dividing by the variance to obtain the autocorrelation function, symbolized for wn(t) by ρ_wn,wn(u) = cov[wn(t), wn(t-u)]/Var wn(t) In general, the autocorrelation function for a time series depends both on time, t, and lag, u. However, for a stationary time series it depends only on the lag.
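Estimating ρ(u) from a sample is a short computation. A minimal sketch for simulated white noise (the helper name sample_acf is ours, not an EViews or NumPy function):

```python
# Sample autocorrelation function: rho(u) = cov[x(t), x(t-u)] / Var x(t),
# estimated from a single series by summing lagged cross-products.
import numpy as np

def sample_acf(x, max_lag):
    x = x - x.mean()
    denom = np.dot(x, x)  # sum of squared deviations (variance * n)
    return np.array([np.dot(x[u:], x[:len(x) - u]) / denom
                     for u in range(max_lag + 1)])

rng = np.random.default_rng(7)
wn = rng.standard_normal(1000)
acf = sample_acf(wn, 5)
print(acf.round(2))  # 1.0 at lag 0, near zero at all other lags
```

For white noise the estimated values at nonzero lags are not exactly zero, but for 1000 observations they should lie within roughly ±2/√1000 ≈ ±0.06 of zero.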

28 Theoretical Autocorrelation Function: White Noise

29 What use is the autocorrelation function in practice? Estimated Autocorrelations in EViews

31 Observations of White Noise

32 Analysis Breaking down the structure of an observed time series, i.e. modeling it Example: weekly closing price of gold, Handy & Harman, $ per ounce

33 PRICE OF GOLD (table: Date, Week, Price; weekly closes beginning 4/16/04, weeks 0-5; only the final price, $385.30, survived extraction)

36 Price of gold does not look like white noise

37 What now? How about week-to-week changes in the price of gold? In EViews: genr dgold = gold – gold(-1)

42 Changes in the price of gold If changes in the price of gold are not significantly different from white noise, then we have a use for our white noise model: dgold(t) = c + wn(t) Ignore the constant for the moment What sort of time series is the price of gold?

43 The price of gold dgold(t) = gold(t) – gold(t-1) = wn(t), i.e. gold(t) = gold(t-1) + wn(t) Lag by one: dgold(t-1) = gold(t-1) – gold(t-2) = wn(t-1), i.e. gold(t-1) = gold(t-2) + wn(t-1), so gold(t) = wn(t) + wn(t-1) + gold(t-2)

44 The price of gold Keep lagging and substituting, and gold(t) = wn(t) + wn(t-1) + wn(t-2) + …. i.e. the price of gold is the current shock, wn(t), plus last week’s shock, wn(t-1), plus the shock from the week before that, wn(t-2) etc. These shocks are also called innovations

45 The price of gold This time series for gold, i.e. the sum of current and previous shocks, is called a random walk, rw(t) So rw(t) = wn(t) + wn(t-1) + wn(t-2) + … Lagging by one: rw(t-1) = wn(t-1) + wn(t-2) + wn(t-3) + … So drw(t) = rw(t) – rw(t-1) = wn(t)

46 The first difference of a random walk The first difference of a random walk is white noise

47 Random walk plus trend If the price of gold is trend plus a random walk: gold(t) = a + b*t + rw(t), it is said to be a random walk with drift Lagging by one, gold(t-1) = a + b*(t-1) + rw(t-1) And subtracting, dgold(t) = b + drw(t), i.e. dgold(t) = constant + white noise
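The drift result on this slide can be checked numerically. A sketch with a simulated series standing in for the gold price (the trend values a = 100 and b = 0.5 are invented for illustration):

```python
# Differencing trend + random walk: dgold(t) = b + wn(t),
# i.e. a constant (the drift) plus white noise.
import numpy as np

rng = np.random.default_rng(3)
n = 500
wn = rng.standard_normal(n)
rw = np.cumsum(wn)            # random walk: running sum of shocks
t = np.arange(n)
gold = 100 + 0.5 * t + rw     # hypothetical "gold": a + b*t + rw(t)

dgold = np.diff(gold)         # b + drw(t) = 0.5 + wn(t)
print(dgold.mean())           # close to the drift b = 0.5
```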

48 The time series is too short for the constant to be significant

49 Simulated random walk In EViews: sample 1 1, genr rw = wn; then, over the rest of the sample, genr rw = rw(-1) + wn
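The same two-step recursion can be rendered in NumPy; a sketch, also showing that np.cumsum collapses the loop into one call:

```python
# rw(1) = wn(1), then rw(t) = rw(t-1) + wn(t) -- the recursion above.
import numpy as np

rng = np.random.default_rng(11)
wn = rng.standard_normal(200)

rw = np.empty_like(wn)
rw[0] = wn[0]
for t in range(1, len(wn)):
    rw[t] = rw[t - 1] + wn[t]

assert np.allclose(rw, np.cumsum(wn))    # recursion equals the cumulative sum
assert np.allclose(np.diff(rw), wn[1:])  # first difference recovers white noise
```

The second assertion is the earlier slide's point in code: the first difference of a random walk is white noise.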

50 Simulated random walk (table columns: time, white noise, random walk; values not recovered)

54 Random walk Is a random walk evolutionary or stationary?

55 Random walk Mean function for a random walk, m(t) m(t) = E[rw(t)] = E[wn(t) + wn(t-1) + …] = E wn(t) + E wn(t-1) + … = 0

56 Variance of an infinite rw(t) Var[rw(t)] = E[rw(t)*rw(t)] = E{[wn(t) + wn(t-1) + wn(t-2) + …]*[wn(t) + wn(t-1) + wn(t-2) + …]} The cross terms vanish by independence, leaving Var rw(t) = σ² + σ² + σ² + … = ∞ So the variance of an infinitely long random walk is not bounded, but infinite, and a random walk can go wandering off.
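For a finite walk the sum σ² + σ² + … truncates at t terms, so the variance grows linearly with t. A sketch that checks this empirically by simulating many independent walks and taking the cross-section variance at each date:

```python
# Var[rw(t)] grows like t * sigma^2 (here sigma^2 = 1): estimate it from
# the cross-section of 10,000 independent simulated random walks.
import numpy as np

rng = np.random.default_rng(5)
walks = np.cumsum(rng.standard_normal((10000, 100)), axis=1)  # 10000 walks, 100 steps
var_t = walks.var(axis=0)
print(var_t[[0, 49, 99]].round(1))  # roughly 1, 50, 100
```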

57 Random walk model The price of gold is bounded below by zero and is not likely to go wandering off to infinity either, so the random walk model is an approximation for the price of gold.

58 Question What does the autocovariance function of an infinite random walk look like plotted against lag? (plot: γ_rw,rw against lag, starting at lag 0)

59 Recall that the autocorrelation function for a finite sample of a simulated random walk decays slowly

60 Summary We are now familiar with two time series, white noise and random walks We have looked at their theoretical autocorrelation functions, or are in the process of doing so We have simulated samples of both and looked at their empirically estimated autocorrelation functions, benchmarks for identification