STAT 497 LECTURE NOTES 8 ESTIMATION

ESTIMATION After specifying the order of a stationary ARMA process, we need to estimate the parameters. We will assume (for now) that: 1. The model order (p and q) is known, and 2. The data has zero mean. If (2) is not a reasonable assumption, we can subtract the sample mean $\bar{Y}$, fit a zero-mean ARMA model to $X_t = Y_t - \bar{Y}$, and then use $X_t + \bar{Y}$ as the model for $Y_t$.
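In R this amounts to demeaning before fitting. A minimal sketch, assuming `y` holds the observed series (the AR(1) order and the horizon `h` are purely illustrative):

# demean, fit a zero-mean model, and keep the mean for later use
ybar <- mean(y)
fit  <- arima(y - ybar, order = c(1, 0, 0), include.mean = FALSE)
# forecasts of Y_t are then ybar + predict(fit, n.ahead = h)$pred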

ESTIMATION The estimation methods we will cover: Method of Moments Estimation (MME), Ordinary Least Squares (OLS) Estimation, Maximum Likelihood Estimation (MLE), and Least Squares Estimation (Conditional and Unconditional).

THE METHOD OF MOMENTS ESTIMATION It is also known as Yule-Walker estimation. It is easy but not efficient, and it works well only for AR models when n is large. BASIC IDEA: equate sample moments to the corresponding population moments and solve the resulting equations to obtain estimators of the unknown parameters.

THE METHOD OF MOMENT ESTIMATION Let n is the variance/covariance matrix of X with the given parameter values. Yule-Walker for AR(p): Regress Xt onto Xt−1, . . ., Xt−p. Durbin-Levinson algorithm with  replaced by . Yule-Walker for ARMA(p,q): Method of moments. Not efficient.

THE YULE-WALKER ESTIMATION For a stationary (causal) AR(p) process, multiplying $X_t = \phi_1 X_{t-1} + \cdots + \phi_p X_{t-p} + a_t$ by $X_{t-j}$ and taking expectations gives the Yule-Walker equations
$\gamma(j) = \phi_1 \gamma(j-1) + \cdots + \phi_p \gamma(j-p), \quad j = 1, \ldots, p,$
$\sigma_a^2 = \gamma(0) - \phi_1 \gamma(1) - \cdots - \phi_p \gamma(p).$

THE YULE-WALKER ESTIMATION To find the Yule-Walker estimators, we replace the theoretical moments by their sample counterparts and solve
$\hat{R}_p \hat{\phi} = \hat{\rho}_p, \qquad \hat{\sigma}_a^2 = \hat{\gamma}(0)\,(1 - \hat{\phi}' \hat{\rho}_p),$
where $\hat{\rho}_p = (r_1, \ldots, r_p)'$ and $\hat{R}_p$ is the matrix of sample autocorrelations. These are the same equations used in forecasting, so we can solve them with the Durbin-Levinson algorithm.
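Base R's ar() implements exactly this. A minimal sketch on simulated data (the AR(2) coefficients 0.5 and 0.3 are illustrative, not from the notes):

set.seed(497)
x <- arima.sim(model = list(ar = c(0.5, 0.3)), n = 300)
# method = "yule-walker" solves the YW equations via the
# Durbin-Levinson recursion applied to the sample ACF
fit.yw <- ar(x, order.max = 2, aic = FALSE, method = "yule-walker")
fit.yw$ar        # Yule-Walker estimates of phi_1, phi_2
fit.yw$var.pred  # corresponding estimate of sigma_a^2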

THE YULE-WALKER ESTIMATION If $\{X_t\}$ is a causal AR(p) process and $n$ is large, then $\hat{\phi} = (\hat{\phi}_1, \ldots, \hat{\phi}_p)'$ is approximately $N(\phi, \; n^{-1} \sigma_a^2 \Gamma_p^{-1})$, and for lags $m > p$ the sample PACF satisfies $\hat{\alpha}(m) \approx N(0, 1/n)$. Hence, we can use the sample PACF to test for the AR order, and we can calculate approximate confidence intervals for the parameters.

THE YULE-WALKER ESTIMATION If $X_t$ is an AR(p) process and $n$ is large, an approximate $100(1-\alpha)\%$ confidence interval for $\phi_j$ is
$\hat{\phi}_j \pm z_{\alpha/2} \, n^{-1/2} \left( \hat{\sigma}_a^2 \, [\hat{\Gamma}_p^{-1}]_{jj} \right)^{1/2}.$
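Continuing the sketch above, ar() returns the asymptotic covariance matrix needed for these intervals:

# approximate 95% confidence intervals for the AR coefficients
se <- sqrt(diag(fit.yw$asy.var.coef))
cbind(lower = fit.yw$ar - 1.96 * se,
      upper = fit.yw$ar + 1.96 * se)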

THE YULE-WALKER ESTIMATION Example: AR(1). Find the MME of $\phi$. It is known that $\rho_1 = \phi$.

THE YULE-WALKER ESTIMATION So, the MME of $\phi$ is $\hat{\phi} = r_1$, the lag-1 sample autocorrelation. Also, $\sigma_a^2$ is unknown. Therefore, using the variance of the process, we can obtain the MME of $\sigma_a^2$.

THE YULE-WALKER ESTIMATION Since $\gamma_0 = \sigma_a^2 / (1 - \phi^2)$ for an AR(1) process,
$\hat{\sigma}_a^2 = (1 - \hat{\phi}^2)\, \hat{\gamma}_0 = (1 - r_1^2)\, \hat{\gamma}_0.$
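The same computation by hand, on a simulated AR(1) series (phi = 0.7 is illustrative):

set.seed(1)
x1 <- arima.sim(model = list(ar = 0.7), n = 200)
r1 <- acf(x1, plot = FALSE)$acf[2]                        # lag-1 sample autocorrelation
g0 <- var(as.numeric(x1)) * (length(x1) - 1) / length(x1) # sample variance with divisor n
phi.hat    <- r1                                          # MME of phi
sigma2.hat <- (1 - phi.hat^2) * g0                        # MME of sigma_a^2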

THE YULE-WALKER ESTIMATION Example: AR(2). Find the MME of all unknown parameters. Using the Yule-Walker equations,
$\rho_1 = \phi_1 + \phi_2 \rho_1, \qquad \rho_2 = \phi_1 \rho_1 + \phi_2.$

THE YULE-WALKER ESTIMATION So, equating the population autocorrelations to the sample autocorrelations and solving for $\phi_1$ and $\phi_2$ gives
$\hat{\phi}_1 = \frac{r_1 (1 - r_2)}{1 - r_1^2}, \qquad \hat{\phi}_2 = \frac{r_2 - r_1^2}{1 - r_1^2}.$

THE YULE-WALKER ESTIMATION Using these, we obtain the MME of $\phi_1$ and $\phi_2$. To obtain the MME of $\sigma_a^2$, use the process variance formula $\gamma_0 = \sigma_a^2 / (1 - \phi_1 \rho_1 - \phi_2 \rho_2)$, which gives $\hat{\sigma}_a^2 = \hat{\gamma}_0 (1 - \hat{\phi}_1 r_1 - \hat{\phi}_2 r_2)$.
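The same solution in matrix form, reusing the simulated AR(2) series x from the earlier sketch:

r <- acf(x, plot = FALSE)$acf[2:3]          # r1, r2
R <- matrix(c(1, r[1], r[1], 1), 2, 2)      # sample autocorrelation matrix
phi.hat <- solve(R, r)                      # (phi1.hat, phi2.hat)
g0 <- var(as.numeric(x)) * (length(x) - 1) / length(x)
sigma2.hat <- g0 * (1 - sum(phi.hat * r))   # MME of sigma_a^2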

THE YULE-WALKER ESTIMATION Summary so far: for AR(1), $\hat{\phi} = r_1$ and $\hat{\sigma}_a^2 = (1 - r_1^2)\hat{\gamma}_0$; for AR(2), $\hat{\phi}_1$ and $\hat{\phi}_2$ are as above and $\hat{\sigma}_a^2 = \hat{\gamma}_0 (1 - \hat{\phi}_1 r_1 - \hat{\phi}_2 r_2)$. Next, consider the MA(1) process $Y_t = a_t - \theta a_{t-1}$.

THE YULE-WALKER ESTIMATION Again using the autocorrelation of the series at lag 1, $\rho_1 = -\theta / (1 + \theta^2)$. Setting $r_1 = -\theta / (1 + \theta^2)$ yields the quadratic $r_1 \theta^2 + \theta + r_1 = 0$ with roots
$\hat{\theta} = \frac{-1 \pm \sqrt{1 - 4 r_1^2}}{2 r_1}.$
Choose the root satisfying the invertibility condition $|\hat{\theta}| < 1$.

THE YULE-WALKER ESTIMATION For real roots we need $1 - 4 r_1^2 \ge 0$, i.e., $|r_1| \le 0.5$. If $|r_1| = 0.5$, there is a unique real root, but it is non-invertible ($|\hat{\theta}| = 1$). If $|r_1| > 0.5$, no real root exists and the MME fails. If $|r_1| < 0.5$, there are two real roots and exactly one of them satisfies invertibility.
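A sketch of this computation (theta = 0.5 is illustrative; note that arima.sim uses the opposite sign convention for MA coefficients, so ma = -0.5 corresponds to Y_t = a_t - 0.5 a_{t-1} here):

set.seed(2)
x.ma <- arima.sim(model = list(ma = -0.5), n = 300)
r1 <- acf(x.ma, plot = FALSE)$acf[2]
if (abs(r1) < 0.5) {
  roots <- (-1 + c(-1, 1) * sqrt(1 - 4 * r1^2)) / (2 * r1)
  theta.hat <- roots[abs(roots) < 1]    # keep the invertible root
} else {
  theta.hat <- NA                       # MME fails
}
theta.hat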

THE YULE-WALKER ESTIMATION This example shows that the MMEs for MA and ARMA models are complicated. More generally, regardless of whether the model is AR, MA, or ARMA, MMEs are sensitive to rounding errors. They are usually used only to provide initial values for a more efficient nonlinear estimation method. Moment estimators are not recommended for final estimation results and should not be used if the process is close to being nonstationary or noninvertible.

THE MAXIMUM LIKELIHOOD ESTIMATION Assume that $a_t \sim \text{iid } N(0, \sigma_a^2)$. By this assumption we can use the joint pdf of $(a_1, \ldots, a_n)$ instead of the joint pdf of $(Y_1, \ldots, Y_n)$, which cannot be written as a product of marginal pdfs because of the dependence between time series observations.

MLE METHOD For the general stationary ARMA(p,q) model
$Y_t = \phi_1 Y_{t-1} + \cdots + \phi_p Y_{t-p} + a_t - \theta_1 a_{t-1} - \cdots - \theta_q a_{t-q},$
or equivalently
$a_t = Y_t - \phi_1 Y_{t-1} - \cdots - \phi_p Y_{t-p} + \theta_1 a_{t-1} + \cdots + \theta_q a_{t-q}.$

MLE The joint pdf of $(a_1, a_2, \ldots, a_n)$ is given by
$f(a_1, \ldots, a_n) = (2\pi \sigma_a^2)^{-n/2} \exp\!\left( -\frac{1}{2\sigma_a^2} \sum_{t=1}^{n} a_t^2 \right).$
Let $Y = (Y_1, \ldots, Y_n)'$ and assume that the initial conditions $Y_* = (Y_{1-p}, \ldots, Y_0)'$ and $a_* = (a_{1-q}, \ldots, a_0)'$ are known.

MLE The conditional log-likelihood function is given by
$\ln L_*(\phi, \theta, \sigma_a^2) = -\frac{n}{2} \ln(2\pi \sigma_a^2) - \frac{S_*(\phi, \theta)}{2\sigma_a^2},$
where $S_*(\phi, \theta) = \sum_t a_t^2(\phi, \theta \mid Y_*, a_*, Y)$ is the conditional sum of squares. Initial conditions: in practice we set $Y_* = \bar{Y}$ (or 0 for a zero-mean series) and $a_* = 0$, or we start the sum at $t = p + 1$ with $a_p = a_{p-1} = \cdots = a_{p+1-q} = 0$.

MLE Then, we can find the estimators of $\phi = (\phi_1, \ldots, \phi_p)'$, $\theta = (\theta_1, \ldots, \theta_q)'$ and $\sigma_a^2$ such that the conditional likelihood function is maximized. Usually, numerical nonlinear optimization techniques are required. After obtaining all the estimators,
$\hat{\sigma}_a^2 = \frac{S_*(\hat{\phi}, \hat{\theta})}{\text{d.f.}},$
where d.f. = (number of terms used in the SS) $-$ (number of parameters) $= (n - p) - (p + q + 1) = n - (2p + q + 1)$.
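R's arima() offers both variants: method = "CSS" maximizes this conditional likelihood, while method = "ML" maximizes the exact one. A sketch reusing the simulated AR(1) series x1 from the earlier example:

fit.css <- arima(x1, order = c(1, 0, 0), include.mean = FALSE, method = "CSS")
fit.ml  <- arima(x1, order = c(1, 0, 0), include.mean = FALSE, method = "ML")
cbind(CSS = coef(fit.css), ML = coef(fit.ml))  # nearly identical for large n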

MLE Example: AR(1). Consider $Y_t = \phi Y_{t-1} + a_t$ with $a_t \sim \text{iid } N(0, \sigma_a^2)$ and $|\phi| < 1$, so that $Y_1 \sim N(0, \sigma_a^2 / (1 - \phi^2))$.

MLE We obtain the joint pdf of $(Y_1, \ldots, Y_n)$ from that of $(Y_1, a_2, \ldots, a_n)$ via the transformation $Y_t = \phi Y_{t-1} + a_t$, $t = 2, \ldots, n$. The Jacobian of this transformation is 1 because the transformation matrix is lower triangular with ones on the diagonal.

MLE Then, the likelihood function can be written as
$L(\phi, \sigma_a^2) = (2\pi \sigma_a^2)^{-n/2} (1 - \phi^2)^{1/2} \exp\!\left\{ -\frac{S(\phi)}{2\sigma_a^2} \right\},$
where $S(\phi) = (1 - \phi^2) Y_1^2 + \sum_{t=2}^{n} (Y_t - \phi Y_{t-1})^2$.

MLE Hence, the log-likelihood function is
$\ln L(\phi, \sigma_a^2) = -\frac{n}{2} \ln(2\pi \sigma_a^2) + \frac{1}{2} \ln(1 - \phi^2) - \frac{S(\phi)}{2\sigma_a^2}.$

MLE Here, S*() is the conditional sum of squares and S() is the unconditional sum of squares. To find the value of  where the likelihood function is maximized, Then,

MLE If we neglect the term $\frac{1}{2}\ln(1 - \phi^2)$, maximizing the likelihood over $\phi$ amounts to minimizing $S(\phi)$, i.e., the MLE coincides with the unconditional LSE. If we neglect both $\frac{1}{2}\ln(1 - \phi^2)$ and the term $(1 - \phi^2) Y_1^2$ in $S(\phi)$, then the MLE reduces to the conditional LSE, which minimizes $S_*(\phi)$.
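This is easy to check numerically. A minimal sketch that concentrates sigma_a^2 out of the AR(1) log-likelihood and maximizes the resulting profile over phi (again using the simulated series x1):

profile.ll <- function(phi, y) {
  n <- length(y)
  S <- (1 - phi^2) * y[1]^2 + sum((y[-1] - phi * y[-n])^2)  # unconditional SS
  0.5 * log(1 - phi^2) - (n / 2) * log(S / n)               # profile log-likelihood
}
opt <- optimize(profile.ll, interval = c(-0.99, 0.99),
                y = as.numeric(x1), maximum = TRUE)
opt$maximum    # exact MLE of phi; compare with coef(fit.ml)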

MLE The MLE is asymptotically unbiased, efficient, consistent, and sufficient for large sample sizes, but the joint pdf can be difficult to work with.

CONDITIONAL LEAST SQUARES ESTIMATION The conditional LSE minimizes the conditional sum of squares. For a zero-mean AR(1) process, $S_*(\phi) = \sum_{t=2}^{n} (Y_t - \phi Y_{t-1})^2$, which gives
$\hat{\phi} = \frac{\sum_{t=2}^{n} Y_t Y_{t-1}}{\sum_{t=2}^{n} Y_{t-1}^2}.$

CONDITIONAL LSE If the process mean is different from zero, write $Y_t - \mu = \phi (Y_{t-1} - \mu) + a_t$ and minimize $S_*(\phi, \mu) = \sum_{t=2}^{n} \left[ (Y_t - \mu) - \phi (Y_{t-1} - \mu) \right]^2$ over both $\phi$ and $\mu$.
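For an AR(1), conditional least squares with a mean is just OLS of Y_t on Y_{t-1} with an intercept. A sketch (the shift of 10 is arbitrary, only to create a nonzero mean):

y   <- as.numeric(x1) + 10                # AR(1) series with mean about 10
ols <- lm(y[-1] ~ y[-length(y)])          # regress Y_t on Y_{t-1}
phi.cls <- coef(ols)[2]
mu.cls  <- coef(ols)[1] / (1 - phi.cls)   # mean recovered from the intercept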

CONDITIONAL LSE Example: MA(1), $Y_t = a_t - \theta a_{t-1}$. The LS problem is non-linear in the parameter: the innovations must be computed recursively as $a_t = Y_t + \theta a_{t-1}$, so $S_*(\theta) = \sum_t a_t^2$ cannot be minimized analytically. Numerical nonlinear optimization methods such as Newton-Raphson or Gauss-Newton are required. A similar problem arises in the ARMA case.
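A minimal sketch of this conditional sum of squares, with a_0 set to 0 and a one-dimensional search standing in for Newton-Raphson (reusing the simulated series x.ma):

css.ma1 <- function(theta, y) {
  a <- numeric(length(y))
  a[1] <- y[1]                              # a_0 = 0, so a_1 = Y_1
  for (t in 2:length(y)) a[t] <- y[t] + theta * a[t - 1]
  sum(a^2)                                  # conditional sum of squares S*(theta)
}
optimize(css.ma1, interval = c(-0.99, 0.99), y = as.numeric(x.ma))$minimum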

UNCONDITIONAL LSE The unconditional LSE minimizes the unconditional sum of squares $S(\phi, \theta)$. This is nonlinear in the parameters, so we again need nonlinear optimization techniques.

BACKCASTING METHOD Obtain the backward form of the ARMA(p,q) model. Instead of forecasting, backcast the past values of $Y_t$ and $a_t$ for $t \le 0$. Then obtain the unconditional log-likelihood function and, from it, the estimators.

EXAMPLE Suppose a time series contains only 2 observations, $Y_1$ and $Y_2$ (not realistic), from an AR(1) process. Find the MLE of $\phi$ and $\sigma_a^2$.

EXAMPLE US Quarterly Beer Production from 1975 to 1997 > par(mfrow=c(1,3)) > plot(beer) > acf(as.vector(beer),lag.max=36) > pacf(as.vector(beer),lag.max=36)

EXAMPLE (contd.) > library(uroot)
Warning message: package 'uroot' was built under R version 2.13.0
> HEGY.test(wts = beer, itsd = c(1, 1, c(1:3)), regvar = 0, selectlags = list(mode = "bic", Pmax = 12))
Null hypothesis: Unit root.
Alternative hypothesis: Stationarity.
HEGY statistics:
          Stat. p-value
tpi_1    -3.339   0.085
tpi_2    -5.944   0.010
Fpi_3:4  13.238   0.010
> CH.test(beer)
Canova & Hansen test
Null hypothesis: Stationarity.
Alternative hypothesis: Unit root.
L-statistic: 0.817
Critical values:
 0.10  0.05 0.025  0.01
0.846  1.01  1.16  1.35

EXAMPLE (contd.) > plot(diff(beer),ylab='First Difference of Beer Production',xlab='Time') > acf(as.vector(diff(beer)),lag.max=36) > pacf(as.vector(diff(beer)),lag.max=36)

EXAMPLE (contd.) > HEGY.test(wts = diff(beer), itsd = c(1, 1, c(1:3)), regvar = 0, selectlags = list(mode = "bic", Pmax = 12))
HEGY test
Null hypothesis: Unit root.
Alternative hypothesis: Stationarity.
HEGY statistics:
          Stat. p-value
tpi_1    -6.067    0.01
tpi_2    -1.503    0.10
Fpi_3:4   9.091    0.01
Fpi_2:4   7.136      NA
Fpi_1:4  26.145      NA

EXAMPLE (contd.) > fit1 = arima(beer, order = c(3,1,0), seasonal = list(order = c(2,0,0), period = 4))
> fit1
Call: arima(x = beer, order = c(3, 1, 0), seasonal = list(order = c(2, 0, 0), period = 4))
Coefficients:
          ar1      ar2      ar3    sar1    sar2
      -0.7380  -0.6939  -0.2299  0.2903  0.6694
s.e.   0.1056   0.1206   0.1206  0.0882  0.0841
sigma^2 estimated as 1.79: log likelihood = -161.55, aic = 335.1
> fit2 = arima(beer, order = c(3,1,0), seasonal = list(order = c(3,0,0), period = 4))
> fit2
Call: arima(x = beer, order = c(3, 1, 0), seasonal = list(order = c(3, 0, 0), period = 4))
Coefficients:
          ar1      ar2      ar3    sar1    sar2    sar3
      -0.8161  -0.8035  -0.3529  0.0444  0.5798  0.3387
s.e.   0.1065   0.1188   0.1219  0.1205  0.0872  0.1210
sigma^2 estimated as 1.646: log likelihood = -158.01, aic = 330.01