Bootstrap in Finance. Esther Ruiz and Maria Rosa Nieto (A. Rodríguez, J. Romo and L. Pascual), Department of Statistics, UNIVERSIDAD CARLOS III DE MADRID. Workshop Modelling and Numerical Techniques in Quantitative Finance, A Coruña, 15 October 2009.

• Motivation
• Bootstrapping to obtain prediction densities of future returns and volatilities
  - GARCH
  - Stochastic volatility
• Bootstrapping to measure risk: VaR and Expected Shortfall
• Conclusions

1. Motivation
• High-frequency time series of returns are characterized by volatility clustering:
  - excess kurtosis;
  - significant autocorrelations of absolute returns (the returns are not independent).

• Financial models that rely on inference about the dynamic behaviour of returns and/or on predictions of moments of the density of future returns are inadequate if they assume independent and/or Gaussian observations.
• Bootstrap methods are attractive in this context because they do not assume any particular distribution.

• However, bootstrap procedures cannot be based on resampling directly from the observed returns, as returns are not independent. There are two alternatives:
  - assume a parametric specification of the dependence and bootstrap from the corresponding residuals;
  - generalize the bootstrap procedures to cope with dependent observations.
• Li and Maddala (1996) and Berkowitz and Kilian (2000) show that the parametric approach could be preferable in many applications.

• Consequently, we focus on two of the most popular and simplest models for representing the dynamic properties of financial returns: GARCH(1,1) and ARSV(1).
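For reference, the standard formulations of these two models (notation assumed here, possibly differing from the original slides, and used in the remainder of these notes) are

$$\text{GARCH(1,1):}\quad y_t=\sigma_t\varepsilon_t,\qquad \sigma_t^2=\omega+\alpha y_{t-1}^2+\beta\sigma_{t-1}^2,$$

$$\text{ARSV(1):}\quad y_t=\sigma_*\,\varepsilon_t\exp(h_t/2),\qquad h_t=\phi h_{t-1}+\eta_t,$$

where $\varepsilon_t$ is an i.i.d. sequence with zero mean and unit variance and $\eta_t$ is Gaussian white noise independent of $\varepsilon_t$.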

• There are two main related areas of application of bootstrap methods:
  - Inference: obtaining the sampling distribution of a particular estimator or of a test statistic, for example for the autoregressive dynamics in the conditional mean and variance, unit roots in the mean, fractional integration in volatility, and inference for trading rules: Ruiz and Pascual (2002, JES).
  - Prediction: prediction of future densities of returns and volatilities: VaR and ES.

• Bootstrap methods make it possible to obtain prediction densities (intervals) of future returns without distributional assumptions on the prediction errors, while incorporating the parameter uncertainty:
  - Thombs and Schucany (1990, JASA): backward representation.
  - Cao et al. (1997, CSSC): conditional on estimated parameters.
  - Pascual et al. (2004, JTSA): incorporates parameter uncertainty without the backward representation.

Example: AR(1). For the AR(1) model $y_t=\phi y_{t-1}+a_t$, the minimum MSE predictor of $y_{T+k}$ based on the information available at time $T$ is given by its conditional mean, $E(y_{T+k}\mid y_T,y_{T-1},\ldots)=\phi^k y_T$. In practice, the parameters are substituted by consistent estimates; therefore, the predictions are given by the same expression with $\phi$ replaced by its estimate $\hat{\phi}$, i.e. $\hat{y}_{T+k}=\hat{\phi}^k y_T$. Predictions are made conditional on the available data.

Thombs and Schucany (1990) build the bootstrap forecasts from bootstrap replicates of the standardized residuals together with bootstrap estimates of the parameters, where the parameter estimates are obtained from bootstrap replicates of the series generated with the backward representation, $y_t=\phi y_{t+1}+v_t$, so that the last observed values can be kept fixed. Should we fix $y_T$ when bootstrapping the parameters?

Pascual et al. (2004, JTSA) propose a bootstrap procedure to obtain prediction intervals in ARIMA models that does not require the backward representation. This procedure is therefore simpler and more general, as it can cope with models for which a backward representation does not exist, such as GARCH.
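As an illustration of this forward, conditional bootstrap, here is a minimal sketch for a zero-mean AR(1) (Python; not the authors' code — the OLS estimation and the function name ar1_bootstrap_intervals are our own choices):

```python
# Sketch of bootstrap prediction intervals for a zero-mean AR(1) in the spirit of
# Pascual et al. (2004): resample residuals, re-estimate phi on each bootstrap
# series, then simulate future paths conditional on the observed y_T.
import numpy as np

def ar1_bootstrap_intervals(y, k=5, B=999, alpha=0.05, seed=None):
    rng = np.random.default_rng(seed)
    y = np.asarray(y, dtype=float)
    T = len(y)
    phi_hat = (y[1:] @ y[:-1]) / (y[:-1] @ y[:-1])   # OLS estimate of phi
    resid = y[1:] - phi_hat * y[:-1]
    resid = resid - resid.mean()                     # centred residuals

    paths = np.empty((B, k))
    for b in range(B):
        # 1) bootstrap replicate of the series (forward recursion, no backward rep.)
        e_star = rng.choice(resid, size=T)
        y_star = np.empty(T)
        y_star[0] = y[0]
        for t in range(1, T):
            y_star[t] = phi_hat * y_star[t - 1] + e_star[t]
        # 2) re-estimate phi to incorporate parameter uncertainty
        phi_star = (y_star[1:] @ y_star[:-1]) / (y_star[:-1] @ y_star[:-1])
        # 3) simulate k steps ahead conditional on the ORIGINAL last observation
        y_prev = y[-1]
        for h in range(k):
            y_prev = phi_star * y_prev + rng.choice(resid)
            paths[b, h] = y_prev

    lower = np.percentile(paths, 100 * alpha / 2, axis=0)
    upper = np.percentile(paths, 100 * (1 - alpha / 2), axis=0)
    return lower, upper
```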

2. Bootstrap forecasts of future returns and volatilities
We consider the prediction of future returns and volatilities generated by GARCH and ARSV(1) models:
2.1 GARCH
2.2 ARSV
Both models yield prediction intervals which are narrow in quiet times and wide in volatile periods.

2.1 GARCH(1,1): Pascual et al. (2005, CSDA)
Consider again the GARCH(1,1) model introduced above. Assuming conditional Normality of the returns:
  - one-step-ahead prediction errors are Normal;
  - prediction errors for two or more steps ahead are not Normal;
  - one-step-ahead volatilities only have associated parameter uncertainty;
  - volatilities more than one step ahead also have uncertainty about future errors.
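To see where these properties come from, a short sketch of the argument with the GARCH(1,1) notation assumed above: the one-step-ahead conditional variance,

$$\sigma_{T+1}^2=\omega+\alpha y_T^2+\beta\sigma_T^2,$$

is a function of information available at time $T$, so its only source of uncertainty is the estimation of the parameters, whereas

$$\sigma_{T+2}^2=\omega+\alpha\sigma_{T+1}^2\varepsilon_{T+1}^2+\beta\sigma_{T+1}^2$$

depends on the future error $\varepsilon_{T+1}$; consequently $y_{T+2}=\sigma_{T+2}\varepsilon_{T+2}$ is a nonlinear function of two Gaussian errors and is no longer Normal.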

Bootstrap procedure
• Estimate the parameters and obtain the standardized residuals.
• Obtain bootstrap replicates of the series and estimate the parameters on each replicate.
• Obtain bootstrap forecasts of future returns and volatilities, using the bootstrap estimates of the parameters with the original observations (conditional). A sketch of these steps is given below.
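A minimal sketch of these three steps (Python; not the authors' code). The helper estimate_garch, assumed to return QML estimates (omega, alpha, beta), is hypothetical and stands in for any GARCH estimation routine:

```python
# Conditional GARCH(1,1) bootstrap: resample standardized residuals, rebuild and
# re-estimate bootstrap series, then forecast conditional on the original data.
import numpy as np

def garch_filter(y, omega, alpha, beta):
    """Run the GARCH(1,1) variance recursion and return the conditional variances."""
    y = np.asarray(y, dtype=float)
    s2 = np.empty(len(y))
    s2[0] = y.var()                           # a common initialisation choice
    for t in range(1, len(y)):
        s2[t] = omega + alpha * y[t - 1] ** 2 + beta * s2[t - 1]
    return s2

def bootstrap_garch_forecasts(y, estimate_garch, k=10, B=999, seed=None):
    rng = np.random.default_rng(seed)
    y = np.asarray(y, dtype=float)
    # Step 1: estimate parameters and standardized residuals
    omega, alpha, beta = estimate_garch(y)    # hypothetical QML helper
    s2 = garch_filter(y, omega, alpha, beta)
    eps = y / np.sqrt(s2)

    ret_paths = np.empty((B, k))              # bootstrap future returns
    var_paths = np.empty((B, k))              # bootstrap future conditional variances
    for b in range(B):
        # Step 2: bootstrap replicate of the series and re-estimation
        e_star = rng.choice(eps, size=len(y))
        y_star, s2_star = np.empty(len(y)), np.empty(len(y))
        y_star[0], s2_star[0] = y[0], s2[0]
        for t in range(1, len(y)):
            s2_star[t] = omega + alpha * y_star[t - 1] ** 2 + beta * s2_star[t - 1]
            y_star[t] = np.sqrt(s2_star[t]) * e_star[t]
        omega_b, alpha_b, beta_b = estimate_garch(y_star)
        # Step 3: forecasts conditional on the ORIGINAL observations
        s2_prev = garch_filter(y, omega_b, alpha_b, beta_b)[-1]
        y_prev = y[-1]
        for h in range(k):
            s2_next = omega_b + alpha_b * y_prev ** 2 + beta_b * s2_prev
            y_next = np.sqrt(s2_next) * rng.choice(eps)
            var_paths[b, h], ret_paths[b, h] = s2_next, y_next
            s2_prev, y_prev = s2_next, y_next
    return ret_paths, var_paths
```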

2.2 ARSV(1) models
The ARSV(1) model can be linearized by taking logs of the squared returns. Bootstrap methods for unobserved component models are much less developed. The previous procedures cannot be implemented due to the presence of several disturbances. In this context, the interest lies not only in constructing densities of future values of the observed variables but also of the unobserved components.
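With the ARSV(1) notation assumed above (again, possibly differing from the original slides), taking logs of the squared returns gives the linear state-space form

$$\log y_t^2=\mu+h_t+\xi_t,\qquad h_t=\phi h_{t-1}+\eta_t,$$

where $\mu=\log\sigma_*^2+E(\log\varepsilon_t^2)$ and $\xi_t=\log\varepsilon_t^2-E(\log\varepsilon_t^2)$; the presence of the two disturbances $\xi_t$ and $\eta_t$ is what prevents a direct application of the residual-based procedures used for GARCH.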

• The Kalman filter provides:
  - one-step-ahead predictions of the series together with their MSE;
  - (updated and smoothed) estimates of the latent components and their MSE.
• Bootstrap procedures can be implemented to obtain:
  - densities of the estimates of the parameters;
  - prediction densities of future observations: Wall and Stoffer (2002, JTSA), Rodríguez and Ruiz (2009, JTSA);
  - prediction densities of the underlying unobserved components: Pfeffermann and Tiller (2005, JTSA), Rodríguez and Ruiz (2009, manuscript).

• Rodríguez and Ruiz (2009, JTSA) propose a bootstrap procedure to obtain prediction intervals for future observations in unobserved component models that incorporates the parameter uncertainty without using the backward representation.

The proposed procedure consists of the following steps:
1) Estimate the parameters by QML and obtain the standardized innovations.
2) Obtain a sequence of bootstrap replicates of the standardized innovations.
3) Obtain a bootstrap replicate of the series using the innovation form (IF) with the estimated parameters, and re-estimate the parameters from this replicate.

4) Obtain the conditional bootstrap predictions (conditional on the observed series).

• However, as we mentioned before, when modelling volatility, the objective is not only to predict the density of future returns but also to predict future volatilities. Therefore, we need to obtain prediction intervals for the unobserved components.
• At the moment, Rodríguez and Ruiz (2009, manuscript) propose a procedure to obtain the MSE of the unobserved components.

Consider, for example, the random walk plus noise model, $y_t=\mu_t+\varepsilon_t$, $\mu_t=\mu_{t-1}+\eta_t$. In this case, the prediction intervals for the underlying level are obtained under a Normality assumption and with estimated parameters, i.e. as the filtered estimate of $\mu_t$ plus/minus a Normal quantile times the square root of its estimated MSE.
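A minimal sketch of the Kalman filter for this model (Python; the notation and the diffuse-type initialisation are our own choices):

```python
# Kalman filter for the random walk plus noise (local level) model
#   y_t = mu_t + eps_t,  mu_t = mu_{t-1} + eta_t,
# returning the filtered level and its MSE; q = var(eta)/var(eps) is the
# signal-to-noise ratio mentioned in the figure below.
import numpy as np

def local_level_filter(y, var_eps, var_eta):
    y = np.asarray(y, dtype=float)
    T = len(y)
    m = np.empty(T)                        # filtered level E(mu_t | y_1..y_t)
    P = np.empty(T)                        # its MSE
    m_pred, P_pred = y[0], 1e7 * var_eps   # crude diffuse-type initialisation
    for t in range(T):
        F = P_pred + var_eps               # prediction-error variance
        K = P_pred / F                     # Kalman gain
        m[t] = m_pred + K * (y[t] - m_pred)
        P[t] = P_pred * (1 - K)
        m_pred, P_pred = m[t], P[t] + var_eta   # one-step-ahead prediction
    return m, P

# Under Normality and with known (or plugged-in estimated) parameters, a 95%
# interval for the level is m[t] +/- 1.96 * sqrt(P[t]); with estimated parameters
# this ignores parameter uncertainty, which is what the procedures below address.
```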

Random walk with q = 0.5: estimates of the level and 95% confidence intervals, in red with estimated parameters and in black with known parameters.

• Our procedure is based on the decomposition of the MSE proposed by Hamilton (1986), which separates the uncertainty of the filter from the uncertainty due to parameter estimation.
• He proposes to generate replicates of the parameters from their asymptotic distribution and then to estimate the MSE by averaging over these replicates; the filter is always run with the original observations.
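One common way of writing Hamilton's (1986) estimator (the notation and the exact form are assumed here and may differ from the original slides): with parameter draws $\theta^{(j)}$, $j=1,\dots,B$, from the asymptotic distribution of the QML estimator, and with the filter run on the original observations for each draw,

$$\widehat{\mathrm{MSE}}(\hat{\mu}_{t|t})=\frac{1}{B}\sum_{j=1}^{B}P_{t|t}(\theta^{(j)})+\frac{1}{B}\sum_{j=1}^{B}\big(\hat{\mu}_{t|t}(\theta^{(j)})-\bar{\mu}_{t|t}\big)^{2},$$

where $P_{t|t}(\theta^{(j)})$ is the filter MSE for draw $\theta^{(j)}$ and $\bar{\mu}_{t|t}$ is the average of the filtered estimates across draws.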

• We propose a non-parametric bootstrap in which the bootstrap replicates of the series are obtained from the innovation form after resampling from the innovations.

3. VaR and ES
In the context of financial risk management, one of the central issues of density forecasting is to track certain aspects of the densities such as, for example, the VaR and the ES. Consider the GARCH(1,1) model; in this context, the VaR and the ES are given by the corresponding quantile and tail expectation of the distribution of future returns.
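For concreteness, with the GARCH(1,1) notation assumed earlier and a 1% level, a standard way of writing these quantities (the exact expressions on the original slides may differ) is

$$\mathrm{VaR}_{T+1}=\sigma_{T+1}\,q_{0.01},\qquad \mathrm{ES}_{T+1}=\sigma_{T+1}\,E\!\left(\varepsilon_t\mid\varepsilon_t\le q_{0.01}\right),$$

where $q_{0.01}$ is the 1% quantile of the distribution of the standardized errors $\varepsilon_t$ and $\sigma_{T+1}$ is the one-step-ahead conditional standard deviation.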

In practice, even assuming that the model is known, both the parameters and the distribution of the errors are unknown, so their estimates have to be used instead. Bootstrap procedures have been proposed to obtain point estimates of the VaR by computing the corresponding quantile of the bootstrap distribution of returns (Ruiz and Pascual, 2004, JES).

Bootstrap procedures can also be implemented to obtain estimates of the VaR and the ES together with their MSE. Christoffersen and Gonçalves (2005, J Risk) propose to compute bootstrap replicates of the VaR by combining the bootstrap conditional standard deviation with a quantile estimate obtained from the residuals of each bootstrap replicate of the series.
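Schematically, with the notation assumed above, each such bootstrap replicate of the VaR takes the form

$$\mathrm{VaR}^{*}_{T+1}=\sigma^{*}_{T+1}\,\hat{q}^{\,*}_{0.01},$$

where $\sigma^{*}_{T+1}$ is the bootstrap one-step-ahead conditional standard deviation and $\hat{q}^{\,*}_{0.01}$ is a quantile estimate based on the residuals of that bootstrap replicate.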

Instead of using the residuals obtained in each of the bootstrap replicates of the original series, Nieto and Ruiz (2009, manuscript) propose to estimate $q_{0.01}$ by a second bootstrap step. For each bootstrap replicate of the series of returns, we obtain n random draws from the empirical distribution of the original standardized residuals. Then the constant $q_{0.01}$ can be estimated by any of the three alternative estimators described before. In this way, we avoid the estimation error involved in the bootstrap residuals.
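A sketch of this second bootstrap step for one replicate (Python; the empirical 1% quantile is used here as one of the possible estimators of $q_{0.01}$):

```python
# Draw n values from the empirical distribution of the ORIGINAL standardized
# residuals (rather than the residuals of the bootstrap series) and estimate
# q_{0.01} from these draws.
import numpy as np

def q01_second_bootstrap(eps_original, n, seed=None):
    rng = np.random.default_rng(seed)
    draws = rng.choice(np.asarray(eps_original, dtype=float), size=n, replace=True)
    return np.quantile(draws, 0.01)        # empirical 1% quantile of the draws
```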

Monte Carlo experiments: coverages

95% VaR (columns: d, T, Normal, FHS, CF, B-FHS, B-CF)
N  250  85%  88.1%  85.8%  93.6%  90.3%
   %  88.5%  92.7%  87.3%
S   %  89.1%  82.8%  93.3%  87.6%
   %  87.7%  88.8%  95.2%  90.1%

95% ES (columns: d, T, Normal, FHS, CF, B-FHS, B-CF)
N   %  75.8%  92%  79.3%  90.8%
   %  81.1%  86%  82.1%  88.6%
S   %  67.8%  97.2%  72.8%  99.4%
   %  72.9%  99.2%  81.7%  97.9%

Conclusions and further research
• There are few analytical results on the statistical properties of bootstrap procedures when applied to heteroscedastic time series.
• Further improvements are needed in:
  - bootstrap estimation of quantiles and expectations to compute the VaR and ES;
  - construction of prediction intervals for unobserved components (stochastic volatility).
• Multivariate extensions: Engsted and Tanggaard (2001, JEF).