Properties of the estimates of the parameters of ARMA models.

AR(1) models: comparison of Yule-Walker (YW), Least Squares (LS) and Maximum Likelihood (ML)

a = 0.5, N = 20, 50 simulations
YW: average 0.44, st.dev.
LS: average 0.467, st.dev.
ML: average 0.463, st.dev. 0.19

a = 0.5, N = 100, 50 simulations
YW: average 0.494, st.dev.
LS: average 0.498, st.dev.
ML: average 0.498, st.dev. 0.85

a = 0.5, N = 500, 50 simulations
YW: average 0.495, st.dev.
LS: average 0.496, st.dev.
ML: average 0.496, st.dev. 0.04

a = 0.5, N = 1000, 50 simulations
YW: average 0.499, st.dev.
LS: average 0.499, st.dev.
ML: average 0.499, st.dev.

a = 0.95, N = 20, 50 simulations
YW: average 0.837, st.dev.
LS: average 0.877, st.dev.
ML: average 0.882, st.dev.

a = 0.95, N = 100, 50 simulations
YW: average 0.918, st.dev.
LS: average 0.929, st.dev.
ML: average 0.928, st.dev.

a = 0.95, N = 500, 50 simulations
YW: average 0.942, st.dev.
LS: average 0.944, st.dev.
ML: average 0.945, st.dev.

a = 0.95, N = 1000, 50 simulations
YW: average 0.948, st.dev.
LS: average 0.949, st.dev.
ML: average 0.949, st.dev.

a = -0.95, N = 20, 50 simulations
YW: average , st.dev.
LS: average , st.dev.
ML: average , st.dev.

a = -0.95, N = 100, 50 simulations
YW: average , st.dev.
LS: average , st.dev.
ML: average , st.dev.

a = -0.95, N = 500, 50 simulations
YW: average , st.dev.
LS: average , st.dev.
ML: average , st.dev.

a = -0.95, N = 1000, 50 simulations
YW: average , st.dev.
LS: average , st.dev.
ML: average , st.dev.
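The Monte-Carlo design above can be reproduced with a short numpy sketch (all function names are mine, not from the slides; the ML column is omitted because, for Gaussian noise and conditioning on the first observation, conditional ML of the AR(1) coefficient coincides with least squares):

```python
import numpy as np

def simulate_ar1(a, n, rng, burn=100):
    """x[t] = a*x[t-1] + e[t] with e ~ N(0,1); burn-in discards start-up transients."""
    e = rng.standard_normal(n + burn)
    x = np.zeros(n + burn)
    for t in range(1, n + burn):
        x[t] = a * x[t - 1] + e[t]
    return x[burn:]

def yule_walker_ar1(x):
    """Yule-Walker: a_hat = sample lag-1 autocovariance / sample variance."""
    x = x - x.mean()
    return np.dot(x[1:], x[:-1]) / np.dot(x, x)

def least_squares_ar1(x):
    """Least squares: regress x[t] on x[t-1] (no intercept after demeaning)."""
    x = x - x.mean()
    return np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])

rng = np.random.default_rng(0)
for n in (20, 100, 500, 1000):
    yw = [yule_walker_ar1(simulate_ar1(0.5, n, rng)) for _ in range(50)]
    ls = [least_squares_ar1(simulate_ar1(0.5, n, rng)) for _ in range(50)]
    print(f"N={n:5d}  YW {np.mean(yw):.3f} ({np.std(yw):.3f})  "
          f"LS {np.mean(ls):.3f} ({np.std(ls):.3f})")
```

The YW estimate divides by the full sample variance rather than the variance of the lagged values, which keeps it inside (-1, 1) but makes it more biased toward zero in small samples; that matches the pattern in the tables (0.44 vs 0.467 at N = 20 for a = 0.5, and 0.837 vs 0.877 for a = 0.95).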

AR(2) models: Yule-Walker, Least Squares and Maximum Likelihood for different N

N=20

a1 = -1.8, a2 = 0.9, N = 20: Yule-Walker

a1 = -1.8, a2 = 0.9, N = 20: Least Squares

a1 = -1.8, a2 = 0.9, N = 20: Maximum Likelihood

a1 = 0.05, a2 = -0.9, N = 20: Yule-Walker

a1 = 0.05, a2 = -0.9, N = 20: Least Squares

a1 = 0.05, a2 = -0.9, N = 20: Maximum Likelihood

N=100

a1 = -1.8, a2 = 0.9, N = 100: Yule-Walker

a1 = -1.8, a2 = 0.9, N = 100: Least Squares

a1 = -1.8, a2 = 0.9, N = 100: Maximum Likelihood

N=1000

a1 = 0.05, a2 = -0.9, N = 1000: Yule-Walker

a1 = 0.05, a2 = -0.9, N = 1000: Least Squares

a1 = 0.05, a2 = -0.9, N = 1000: Maximum Likelihood

AR(2) models: Maximum Likelihood for different combinations of a1, a2

a1 = -1, a2 = 0.5, N = 20

a1 = -1, a2 = 0.5, N = 100

a1 = -1, a2 = 0.5, N = 1000

a1 = 1.3, a2 = 0.8, N = 20

a1 = 1.3, a2 = 0.8, N = 100

a1 = 1.3, a2 = 0.8, N = 1000
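The AR(2) experiments can be sketched the same way. One caveat on signs: with the recursion x[t] = p1*x[t-1] + p2*x[t-2] + e[t], the slides' (a1, a2) = (-1.8, 0.9) is only stationary as (p1, p2) = (1.8, -0.9), so the slides appear to use the operator convention x[t] + a1*x[t-1] + a2*x[t-2] = e[t]. A minimal Yule-Walker sketch under that assumption (names are mine):

```python
import numpy as np

def simulate_ar2(p1, p2, n, rng, burn=200):
    """x[t] = p1*x[t-1] + p2*x[t-2] + e[t] with e ~ N(0,1)."""
    e = rng.standard_normal(n + burn)
    x = np.zeros(n + burn)
    for t in range(2, n + burn):
        x[t] = p1 * x[t - 1] + p2 * x[t - 2] + e[t]
    return x[burn:]

def yule_walker_ar2(x):
    """Solve the 2x2 Yule-Walker system R @ p = r built from sample autocorrelations."""
    x = x - x.mean()
    r1 = np.dot(x[1:], x[:-1]) / np.dot(x, x)
    r2 = np.dot(x[2:], x[:-2]) / np.dot(x, x)
    R = np.array([[1.0, r1], [r1, 1.0]])
    return np.linalg.solve(R, np.array([r1, r2]))

rng = np.random.default_rng(0)
x = simulate_ar2(1.8, -0.9, 1000, rng)
print(yule_walker_ar2(x))  # estimates of (p1, p2)
```

The pair (1.8, -0.9) has complex characteristic roots of modulus about 0.95, i.e. a strong cycle close to the stationarity boundary, which is where the differences between YW, LS and ML are most visible at small N.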

MA(1) models: Conditional Likelihood for different b and N

b = 0.9

b = 0.6

b = -0.4

b = -0.9

b = -1 (not invertible, still stationary)

Here the true model is MA(2) with ρ1 about 0.7, fitted by an MA(1). The estimated b is, on average, about 0.75, corresponding to ρ1 = b/(1 + b²) = 0.48; an MA(1) cannot reproduce ρ1 = 0.7, since b/(1 + b²) never exceeds 0.5.
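For an MA(1), x[t] = u[t] + b*u[t-1], conditional likelihood with Gaussian noise amounts to minimizing the conditional sum of squares obtained by backing out the innovations from the start-up value u[0] = 0. A grid-search sketch (names are mine):

```python
import numpy as np

def simulate_ma1(b, n, rng):
    """x[t] = u[t] + b*u[t-1] with u ~ N(0,1)."""
    u = rng.standard_normal(n + 1)
    return u[1:] + b * u[:-1]

def conditional_ss(b, x):
    """Back out innovations u[t] = x[t] - b*u[t-1] with u[0] = 0; sum of squares."""
    u, s = 0.0, 0.0
    for xt in x:
        u = xt - b * u
        s += u * u
    return s

def fit_ma1(x):
    """Conditional ML for Gaussian noise = minimizer of the conditional SS over b."""
    grid = np.linspace(-0.99, 0.99, 199)
    return grid[int(np.argmin([conditional_ss(b, x) for b in grid]))]

rng = np.random.default_rng(0)
x = simulate_ma1(0.6, 500, rng)
print(fit_ma1(x))
```

The grid is restricted to |b| < 1: for |b| >= 1 the model is non-invertible, and the recursion u[t] = x[t] - b*u[t-1] no longer damps the error made by setting u[0] = 0, which is why the b = -1 case on the slides behaves differently.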

ARMA(1,1) models: Conditional Likelihood for different a, b and N

a = 0.8, b = 0.75

N=20

N=50

N=100

a = -0.7, b = -0.65

N=20

N=50

N=100

a = 0.8, b = (practically a white noise)

N=20

N=50

N=100
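The same conditional-sum-of-squares idea extends to ARMA(1,1), x[t] = a*x[t-1] + u[t] + b*u[t-1]: condition on x[0] and u[0] = 0, back out the innovations, and minimize over (a, b). A coarse grid-search sketch (names are mine):

```python
import numpy as np

def simulate_arma11(a, b, n, rng, burn=100):
    """x[t] = a*x[t-1] + u[t] + b*u[t-1] with u ~ N(0,1)."""
    u = rng.standard_normal(n + burn)
    x = np.zeros(n + burn)
    for t in range(1, n + burn):
        x[t] = a * x[t - 1] + u[t] + b * u[t - 1]
    return x[burn:]

def css_arma11(a, b, x):
    """Conditional sum of squares given x[0] and u[0] = 0."""
    u, s = 0.0, 0.0
    for t in range(1, len(x)):
        u = x[t] - a * x[t - 1] - b * u
        s += u * u
    return s

def fit_arma11(x, step=0.05):
    """Minimize the conditional SS over a coarse (a, b) grid."""
    grid = np.arange(-0.95, 0.96, step)
    return min(((css_arma11(a, b, x), a, b)
                for a in grid for b in grid))[1:]

rng = np.random.default_rng(0)
a_hat, b_hat = fit_arma11(simulate_arma11(0.8, 0.75, 500, rng))
print(a_hat, b_hat)
```

When a is close to -b, the AR and MA factors (1 - aB) and (1 + bB) nearly cancel and the process is practically white noise, so (a, b) are barely identified individually: the conditional SS is then nearly flat along a whole ridge of (a, b) pairs, which is the point of the last set of slides.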