Computational Finance II: Time Series, Guest Lecture II. K. Ensor.

Review: ACF and PACF; autocorrelation when estimating the mean. The first difference removes linear nonstationarity; the lag-s (seasonal) difference removes seasonal nonstationarity.
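As a minimal sketch of the differencing operations above, the toy series below (a hypothetical trend-plus-seasonal example, not the lecture's data) shows a first difference removing a linear trend and a lag-4 difference removing a period-4 seasonal pattern:

```python
import numpy as np

# Hypothetical series: linear trend + period-4 seasonal pattern + small noise.
rng = np.random.default_rng(0)
n = 120
t = np.arange(n)
x = 0.5 * t + 3.0 * np.sin(2 * np.pi * t / 4) + rng.normal(0, 0.1, n)

d1 = np.diff(x)        # first difference (1 - B)x(t): removes the linear trend
d4 = x[4:] - x[:-4]    # lag-4 (seasonal) difference: removes the period-4 pattern

# After first differencing, the mean is roughly the trend slope (0.5).
# After the seasonal difference, the sinusoid cancels exactly and only the
# constant trend contribution (0.5 * 4 = 2.0) plus noise remains.
```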

Identifying periodic patterns. Spectral decomposition of a realization of length n: any finite sequence of numbers can be represented as a sum of (up to n/2) sine and cosine functions of different periodicities and amplitudes. Examining the amplitudes (or squared amplitudes) reveals the dominant periodicities in the data.
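The decomposition above can be sketched with a raw (unsmoothed) periodogram computed by FFT. The series here is a hypothetical sinusoid-plus-noise example chosen so the dominant period is known to be 8 observations:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 256
t = np.arange(n)
# Hypothetical series with a dominant period of 8 observations plus noise.
x = 2.0 * np.cos(2 * np.pi * t / 8) + rng.normal(0, 1.0, n)

# Raw periodogram: squared Fourier amplitudes at the n/2 Fourier
# frequencies j/n (cycles per observation), j = 0, 1, ..., n/2.
fx = np.fft.rfft(x - x.mean())
freqs = np.fft.rfftfreq(n)
pgram = np.abs(fx) ** 2 / n

dominant = freqs[np.argmax(pgram[1:]) + 1]   # skip the zero frequency
print(1 / dominant)                          # dominant period: 8.0
```

Smoothing the periodogram (as in the Johnson and Johnson slide that follows) averages neighboring frequencies to reduce the variance of these raw ordinates.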

Smoothed Periodogram for Johnson and Johnson Series

Regression with Autocorrelated Errors. S-Plus least-squares output for the Johnson and Johnson series (numeric values lost in transcription): N = 84, F-statistic on 2 and 81 df, p-value = 0; coefficient table (coef, std.err, t.stat, p.value) with rows Int, jjt, and cjj.

Residuals from Linear plus periodic regression

How to proceed? The residuals from the regression fit exhibit dependence across time lags. Identify a time series model for the residuals, then refit the regression and the time series model jointly via MLE.
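The joint refit is done by MLE in practice (e.g., arima.mle in S-Plus, mentioned in the summary slide). As a minimal self-contained sketch of the same idea, assuming AR(1) residuals and simulated data, here is the classical Cochrane-Orcutt iteration, which alternates between estimating the residual autocorrelation and re-estimating the regression on quasi-differenced data:

```python
import numpy as np

# Simulated illustration: y(t) = 1 + 2*x(t) + e(t), e(t) = 0.7*e(t-1) + a(t).
rng = np.random.default_rng(2)
n = 500
x = rng.normal(size=n)
e = np.zeros(n)
for i in range(1, n):
    e[i] = 0.7 * e[i - 1] + rng.normal(0, 0.5)   # AR(1) errors, phi = 0.7
y = 1.0 + 2.0 * x + e

X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]       # initial OLS fit
phi = 0.0
for _ in range(25):
    resid = y - X @ beta
    # lag-1 autoregression coefficient of the current residuals
    phi = np.dot(resid[1:], resid[:-1]) / np.dot(resid[:-1], resid[:-1])
    # quasi-difference both sides and re-estimate the regression;
    # differencing the intercept column keeps beta on the original scale
    ys = y[1:] - phi * y[:-1]
    Xs = X[1:] - phi * X[:-1]
    beta_new = np.linalg.lstsq(Xs, ys, rcond=None)[0]
    if np.allclose(beta_new, beta, atol=1e-10):
        break
    beta = beta_new
# beta recovers (1.0, 2.0) approximately, and phi recovers 0.7
```

This is an approximation to the full MLE fit, not a replacement for it; with MA or mixed ARMA residuals, a joint MLE routine is the right tool.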

Regression with Autocorrelated Errors. How can we study the relationship between two or more time series? Consider a regression model for the term structure of interest rates, r3(t) = β0 + β1 r1(t) + e(t), where the error e(t) is itself a time series. Let's look at the relationship between two U.S. weekly interest rate series, measured in percentages: r1(t) = the 1-year Treasury constant maturity rate, and r3(t) = the 3-year Treasury constant maturity rate, from 1/5/1962 to 9/10/1999.

Scatterplots of the two series at the same time points, and of the change in each series: the two level series are highly correlated.

Try this model? r3(t) = β0 + β1 r1(t) + e(t). S-Plus summary of the least squares fit (some numeric values lost in transcription): Residual Standard Error = 0.538, N = 1967, F-statistic on 1 and 1965 df, p-value = 0; coefficient table (coef, std.err, t.stat, p.value) with rows Inter and X.

Behavior of the Residuals. The residuals are nonstationary. Thus there does not appear to be a long-term equilibrium between the two interest rates.
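The nonstationary-residual symptom is easy to reproduce. In this hypothetical sketch (simulated data, not the Treasury series), regressing the level of one random walk on another leaves residuals whose lag-1 sample autocorrelation is near 1, i.e., the residuals wander like a random walk themselves:

```python
import numpy as np

# Two simulated random-walk "level" series with no equilibrium relationship.
rng = np.random.default_rng(3)
n = 2000
r1 = np.cumsum(rng.normal(size=n))
r3 = np.cumsum(rng.normal(size=n))

# OLS of one level on the other
X = np.column_stack([np.ones(n), r1])
beta = np.linalg.lstsq(X, r3, rcond=None)[0]
resid = r3 - X @ beta

# Lag-1 sample autocorrelation of the residuals: near 1, the signature
# of nonstationary residuals (no cointegration between the levels).
resid_c = resid - resid.mean()
acf1 = np.dot(resid_c[1:], resid_c[:-1]) / np.dot(resid_c, resid_c)
```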

From a regression perspective, the assumptions of our regression model are violated. Let's consider the change series of the interest rates: c1(t) = (1 − B) r1(t) and c3(t) = (1 − B) r3(t). Now regress c3 on c1.
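A minimal simulated analogue of this step (hypothetical data, with a built-in slope of 0.8 between the levels): difference both series, then fit the change regression by least squares. Note that differencing the stationary part of the error induces serial correlation in e(t), which is why the residuals of the change regression still need checking:

```python
import numpy as np

# Simulated analogue: r1 is a random walk and r3 = 0.8*r1 + stationary noise,
# so the change series are linearly related with slope 0.8.
rng = np.random.default_rng(4)
n = 2000
r1 = np.cumsum(rng.normal(0, 0.1, n))
r3 = 0.8 * r1 + rng.normal(0, 0.05, n)

c1 = np.diff(r1)   # c1(t) = (1 - B) r1(t)
c3 = np.diff(r3)   # c3(t) = (1 - B) r3(t)

# No-intercept least squares slope of c3 on c1; recovers roughly 0.8.
slope = np.dot(c1, c3) / np.dot(c1, c1)
```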

Regression results (some numeric values lost in transcription): Residual Standard Error, Multiple R-Square, N = 1966, F-statistic on 1 and 1964 df, p-value = 0; coefficient table (coef, std.err, t.stat, p.value) with rows Intercept and X.

Looking at the Residuals. There is a small amount of autocorrelation, violating our regression assumptions.

The correct model: c3(t) = β1 c1(t) + e(t), with e(t) = a(t) + θ1 a(t−1), i.e., MA(1) errors. (The parameter estimates and standard errors were lost in transcription.) R-squared = 85.4%.
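The MA(1) error structure above is the kind identified from the residual ACF: an MA(1) process has autocorrelation θ1/(1 + θ1²) at lag 1 and zero beyond. A simulated check (θ1 = 0.5 is an arbitrary illustrative value, not the lecture's estimate):

```python
import numpy as np

# Simulated MA(1) process e(t) = a(t) + theta*a(t-1); its sample ACF
# should be near theta/(1 + theta^2) = 0.4 at lag 1 and near 0 at lag 2,
# the cut-off signature used to identify an MA(1) residual model.
rng = np.random.default_rng(5)
theta = 0.5
a = rng.normal(size=50001)
e = a[1:] + theta * a[:-1]

def acf(x, lag):
    xc = x - x.mean()
    return np.dot(xc[lag:], xc[:-lag]) / np.dot(xc, xc)

acf1 = acf(e, 1)   # near 0.4
acf2 = acf(e, 2)   # near 0
```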

Regression + MA diagnostics

Summary – Regression with autocorrelated errors: (1) fit the regression model; (2) check the residuals for the presence of autocorrelation; (3) if autocorrelation is present, identify the nature of the autocorrelation and simultaneously fit (via MLE) the regression parameters and the time series parameters (in Splus, use arima.mle).