1
Bootstrap in Finance. Esther Ruiz and Maria Rosa Nieto (with A. Rodríguez, J. Romo and L. Pascual), Department of Statistics, UNIVERSIDAD CARLOS III DE MADRID. Workshop on Modelling and Numerical Techniques in Quantitative Finance, A Coruña, 15 October 2009.
2
Outline
- Motivation
- Bootstrapping to obtain prediction densities of future returns and volatilities: GARCH, stochastic volatility
- Bootstrapping to measure risk: VaR and Expected Shortfall
- Conclusions
3
1. Motivation
High frequency time series of returns are characterized by volatility clustering:
- Excess kurtosis
- Significant autocorrelations of absolute returns (observations are not independent)
5
Financial models that rely on inference about the dynamic behaviour of returns, or on predictions of moments of the density of future returns, are inadequate when they assume independent and/or Gaussian observations. Bootstrap methods are attractive in this context because they do not assume any particular distribution.
6
However, bootstrap procedures cannot be based on resampling directly from observed returns, as these are not independent. There are two alternatives:
- Assume a parametric specification of the dependence and bootstrap from the corresponding residuals.
- Generalize the bootstrap procedures to cope with dependent observations.
Li and Maddala (1996) and Berkowitz and Kilian (2000) show that the parametric approach can be preferable in many applications.
7
Consequently, we focus on two of the most popular and simplest models used to represent the dynamic properties of financial returns:
- GARCH(1,1)
- ARSV(1)
9
There are two main, related areas of application of bootstrap methods:
- Inference: obtaining the sampling distribution of an estimator or of a test statistic, for example for the autoregressive dynamics in the conditional mean and variance, unit roots in the mean, fractional integration in volatility, or inference for trading rules: Ruiz and Pascual (2002, JES).
- Prediction: prediction of future densities of returns and volatilities: VaR and ES.
10
Bootstrap methods make it possible to obtain prediction densities (intervals) of future returns without distributional assumptions on the prediction errors, while incorporating parameter uncertainty:
- Thombs and Schucany (1990, JASA): backward representation.
- Cao et al. (1997, CSSC): conditional on the estimated parameters.
- Pascual et al. (2004, JTSA): incorporate parameter uncertainty without the backward representation.
11
Example: AR(1). Consider the model $y_t = \phi y_{t-1} + a_t$. The minimum MSE predictor of $y_{T+k}$ based on the information available at time $T$ is its conditional mean, $E(y_{T+k}\mid y_T,\dots,y_1) = \phi^{k} y_T$. In practice, the parameters are substituted by consistent estimates, so the predictions are given by $\hat{y}_{T+k} = \hat{\phi}^{k} y_T$. Predictions are made conditional on the available data.
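For completeness, a standard derivation of this predictor and of its prediction-error variance for the zero-mean AR(1) (textbook algebra, not taken from the slides):

y_{T+k} = \phi^{k} y_T + \sum_{j=0}^{k-1}\phi^{j} a_{T+k-j},
\qquad
E(y_{T+k}\mid y_T,\dots,y_1)=\phi^{k}y_T,
\qquad
\mathrm{Var}\big(y_{T+k}-\phi^{k}y_T\big)=\sigma_a^{2}\,\frac{1-\phi^{2k}}{1-\phi^{2}}.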
12
Thombs and Schucany (1990): bootstrap predictions are generated forward from the last observation, $y^{*}_{T+k} = \hat{\phi}^{*} y^{*}_{T+k-1} + a^{*}_{T+k}$ with $y^{*}_{T} = y_T$, where $a^{*}_{T+k}$ are bootstrap replicates of the standardized residuals and $\hat{\phi}^{*}$ is obtained from bootstrap replicates of the series generated with the backward representation $y^{*}_{t} = \hat{\phi}\, y^{*}_{t+1} + e^{*}_{t}$, which keeps the last observation fixed at its observed value. Should we fix $y_T$ when bootstrapping the parameters?
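A minimal sketch of this backward-representation bootstrap for an AR(1), for illustration only; the function names and the OLS estimator of $\phi$ below are my own choices, not taken from the slides:

```python
import numpy as np

def fit_ar1(y):
    """OLS estimate of phi in the zero-mean AR(1) y_t = phi*y_{t-1} + a_t."""
    return np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)

def ts_bootstrap_ar1(y, k=1, B=999, rng=None):
    """Thombs & Schucany (1990)-style bootstrap prediction density for y_{T+k} (sketch)."""
    rng = np.random.default_rng(rng)
    T = len(y)
    phi_hat = fit_ar1(y)
    resid = y[1:] - phi_hat * y[:-1]           # forward residuals
    resid = resid - resid.mean()
    back_resid = y[:-1] - phi_hat * y[1:]      # residuals of the backward representation
    back_resid = back_resid - back_resid.mean()
    preds = np.empty(B)
    for b in range(B):
        # 1) backward replicate of the series, keeping the last observation fixed
        y_star = np.empty(T)
        y_star[-1] = y[-1]
        e_star = rng.choice(back_resid, size=T - 1, replace=True)
        for t in range(T - 2, -1, -1):
            y_star[t] = phi_hat * y_star[t + 1] + e_star[t]
        # 2) re-estimate the parameter on the bootstrap series
        phi_star = fit_ar1(y_star)
        # 3) forward prediction conditional on the observed y_T
        y_fut = y[-1]
        for _ in range(k):
            y_fut = phi_star * y_fut + rng.choice(resid)
        preds[b] = y_fut
    return preds   # e.g. np.percentile(preds, [2.5, 97.5]) gives a 95% interval
```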
13
Pascual et al. (2004, JTSA) propose a bootstrap procedure to obtain prediction intervals in ARIMA models that does not require the backward representation. This procedure is therefore simpler and more general, as it can cope with models for which a backward representation does not exist, for example GARCH.
15
2. Bootstrap forecast of future returns and volatilities
We consider the prediction of future returns and volatilities generated by GARCH and ARSV(1) models:
2.1 GARCH
2.2 ARSV
Both models yield prediction intervals that are narrow in quiet times and wide in volatile periods.
16
2.1 GARCH(1,1): Pascual et al. (2005, CSDA)
Consider again the GARCH(1,1) model $y_t = \sigma_t \varepsilon_t$, $\sigma_t^{2} = \omega + \alpha y_{t-1}^{2} + \beta \sigma_{t-1}^{2}$, with $\varepsilon_t$ iid(0,1). Therefore, the one-step-ahead variance $\sigma_{T+1}^{2}$ is known given the information up to time $T$, while volatilities further ahead depend on future errors. Assuming conditional Normality of returns:
- one-step-ahead prediction errors are Normal;
- prediction errors two or more steps ahead are not Normal;
- one-step-ahead volatilities are only subject to parameter uncertainty;
- volatilities more than one step ahead are also subject to uncertainty about future errors.
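For completeness, the one-step and multi-step volatilities can be written as (standard GARCH(1,1) algebra, added here for reference):

\sigma^{2}_{T+1}=\omega+\alpha y_{T}^{2}+\beta\sigma^{2}_{T},
\qquad
\sigma^{2}_{T+k}=\omega+\big(\alpha\varepsilon^{2}_{T+k-1}+\beta\big)\sigma^{2}_{T+k-1},\quad k>1,

so $\sigma^{2}_{T+1}$ is a function of the data up to $T$, while $\sigma^{2}_{T+k}$, $k>1$, depends on the future shocks $\varepsilon_{T+1},\dots,\varepsilon_{T+k-1}$; consequently $y_{T+k}=\sigma_{T+k}\varepsilon_{T+k}$ is a mixture of Normals rather than Normal, even under conditional Normality.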
17
Bootstrap procedure:
1) Estimate the parameters and obtain the standardized residuals.
2) Obtain bootstrap replicates of the series and re-estimate the parameters.
3) Obtain bootstrap forecasts of future returns and volatilities, using the bootstrap estimates of the parameters together with the original observations (conditional); see the sketch below.
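A minimal sketch of this conditional GARCH(1,1) bootstrap in the spirit of Pascual et al. (2005). It assumes a generic estimator `fit_garch11(y)` returning (omega, alpha, beta); that function and all names below are illustrative assumptions, not the authors' code:

```python
import numpy as np

def garch11_bootstrap_forecast(y, fit_garch11, k=1, B=999, rng=None):
    """Bootstrap prediction densities of y_{T+k} and sigma^2_{T+k} for a GARCH(1,1),
    conditional on the observed series (sketch, not the authors' implementation)."""
    rng = np.random.default_rng(rng)
    T = len(y)
    omega, alpha, beta = fit_garch11(y)

    # In-sample conditional variances and standardized residuals with the original estimates
    sig2 = np.empty(T)
    sig2[0] = np.var(y)
    for t in range(1, T):
        sig2[t] = omega + alpha * y[t - 1] ** 2 + beta * sig2[t - 1]
    eps = y / np.sqrt(sig2)
    eps = eps - eps.mean()

    y_fore, sig2_fore = np.empty(B), np.empty(B)
    for b in range(B):
        # 1) bootstrap replicate of the series and re-estimated parameters
        e_star = rng.choice(eps, size=T, replace=True)
        y_star = np.empty(T)
        s2 = np.var(y)
        for t in range(T):
            if t > 0:
                s2 = omega + alpha * y_star[t - 1] ** 2 + beta * s2
            y_star[t] = np.sqrt(s2) * e_star[t]
        om_s, al_s, be_s = fit_garch11(y_star)

        # 2) forecasts conditional on the ORIGINAL observations, with bootstrap parameters
        s2 = np.var(y)
        for t in range(1, T):
            s2 = om_s + al_s * y[t - 1] ** 2 + be_s * s2
        y_last = y[-1]
        for _ in range(k):
            s2 = om_s + al_s * y_last ** 2 + be_s * s2
            y_last = np.sqrt(s2) * rng.choice(eps)
        y_fore[b], sig2_fore[b] = y_last, s2
    return y_fore, sig2_fore
```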
20
2.2 ARSV(1) models
The ARSV(1) model can be linearized by taking logs of the squared returns. Bootstrap methods for unobserved component models are much less developed: the previous procedures cannot be implemented because of the presence of several disturbances. In this context, the interest is not only in constructing densities of future values of the observed variables but also of the unobserved components.
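For reference, the ARSV(1) model and its usual linearization (standard notation, chosen by me rather than copied from the slides):

y_t = \sigma_{*}\,\varepsilon_t\,\exp(h_t/2),\qquad h_t=\phi h_{t-1}+\eta_t,\qquad \varepsilon_t\sim \mathrm{iid}(0,1),\ \eta_t\sim \mathrm{NID}(0,\sigma^{2}_{\eta}),

and, taking logs of the squared returns,

\log y_t^{2}=\log\sigma_{*}^{2}+h_t+\log\varepsilon_t^{2},

a linear but non-Gaussian state-space model with two disturbances, to which the Kalman filter can be applied as a QML device.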
21
The Kalman filter provides:
- one-step-ahead (as well as updated and smoothed) predictions of the series together with their MSEs;
- estimates of the latent components and their MSEs.
Bootstrap procedures can be implemented to obtain:
- densities of the parameter estimates;
- prediction densities of future observations: Wall and Stoffer (2002, JTSA), Rodríguez and Ruiz (2009, JTSA);
- prediction densities of the underlying unobserved components: Pfeffermann and Tiller (2005, JTSA), Rodríguez and Ruiz (2009, manuscript).
22
Rodríguez and Ruiz (2009, JTSA) propose a bootstrap procedure to obtain prediction intervals of future observations in unobserved component models that incorporates parameter uncertainty without using the backward representation.
23
The proposed procedure consists of the following steps:
1) Estimate the parameters by QML and obtain the standardized innovations.
2) Obtain a sequence of bootstrap replicates of the standardized innovations.
3) Obtain a bootstrap replicate of the series using the innovation form (IF) with the estimated parameters, and re-estimate the parameters on this replicate to obtain bootstrap estimates.
24
4) Obtain the bootstrap predictions conditional on the observed series, using the bootstrap parameter estimates; a sketch for the local level model follows.
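As an illustration of steps 3) and 4), the innovation form of the random walk plus noise model discussed below, written in generic textbook notation (my own, not necessarily the authors' exact formulation):

y_t = \hat{\mu}_{t\mid t-1} + v_t, \qquad \hat{\mu}_{t+1\mid t} = \hat{\mu}_{t\mid t-1} + K_t v_t,

where $v_t$ is the one-step-ahead prediction error (innovation) with variance $F_t$ and $K_t$ is the Kalman gain. A bootstrap replicate of the series is then built as

y^{*}_t = \hat{\mu}^{*}_{t\mid t-1} + \sqrt{F_t}\, v^{*}_t, \qquad \hat{\mu}^{*}_{t+1\mid t} = \hat{\mu}^{*}_{t\mid t-1} + K_t \sqrt{F_t}\, v^{*}_t,

with $v^{*}_t$ drawn from the empirical distribution of the standardized innovations.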
26
However, as mentioned before, when modelling volatility the objective is not only to predict the density of future returns but also to predict future volatilities. Therefore, we need prediction intervals for the unobserved components. At the moment, Rodríguez and Ruiz (2009, manuscript) propose a procedure to obtain the MSE of the unobserved components.
27
Consider, for example, the random walk plus noise model $y_t = \mu_t + \varepsilon_t$, $\mu_t = \mu_{t-1} + \eta_t$, with signal-to-noise ratio $q = \sigma^{2}_{\eta}/\sigma^{2}_{\varepsilon}$. In this case, the prediction intervals for the level are usually given by $\hat{\mu}_{t\mid T} \pm z_{\alpha/2}\sqrt{\widehat{\mathrm{MSE}}(\hat{\mu}_{t\mid T})}$, which relies on a Normality assumption and treats the estimated parameters as if they were known.
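A minimal Kalman filter for this local level model, producing the filtered level and the naive 95% interval that ignores parameter uncertainty; a sketch assuming $\sigma^{2}_{\varepsilon}$ and $q$ have already been estimated, with variable names of my own choosing:

```python
import numpy as np

def local_level_filter(y, var_eps, q):
    """Kalman filter for y_t = mu_t + eps_t, mu_t = mu_{t-1} + eta_t, q = var_eta/var_eps.
    Returns filtered levels a_t = E(mu_t | y_1..y_t) and their MSEs p_t."""
    var_eta = q * var_eps
    n = len(y)
    a, p = np.empty(n), np.empty(n)
    a_pred, p_pred = y[0], 1e7          # (near-)diffuse initialization of the level
    for t in range(n):
        f = p_pred + var_eps            # innovation variance
        k = p_pred / f                  # Kalman gain
        v = y[t] - a_pred               # innovation
        a[t] = a_pred + k * v           # updated level
        p[t] = p_pred * (1 - k)         # updated MSE
        a_pred = a[t]                   # one-step-ahead prediction for t+1
        p_pred = p[t] + var_eta
    return a, p

# Naive 95% interval for the level, under Normality and known parameters:
# a_t +/- 1.96 * sqrt(p_t)
```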
28
Figure: random walk plus noise with q = 0.5. Estimates of the level and 95% confidence intervals: in red, with estimated parameters; in black, with known parameters.
29
Our procedure is based on a decomposition of the MSE proposed by Hamilton (1986). He proposes to generate replicates of the parameters from their asymptotic distribution and then to estimate the MSE by averaging over those replicates, with the filter run on the original observations.
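In generic notation (my own, and only a sketch of Hamilton's (1986) idea), the decomposition separates filter uncertainty from parameter uncertainty,

\mathrm{MSE}(\hat{\mu}_{t\mid T}) \;\approx\; E_{\theta}\!\left[P_{t\mid T}(\theta)\right] \;+\; E_{\theta}\!\left[\big(\hat{\mu}_{t\mid T}(\theta)-\hat{\mu}_{t\mid T}(\hat{\theta})\big)^{2}\right],

and is estimated by drawing $\theta^{(1)},\dots,\theta^{(B)}$ from the asymptotic distribution of the QML estimator $\hat{\theta}$, running the filter on the original observations with each draw, and computing

\widehat{\mathrm{MSE}} \;=\; \frac{1}{B}\sum_{i=1}^{B}\left[P_{t\mid T}\big(\theta^{(i)}\big)+\big(\hat{\mu}_{t\mid T}(\theta^{(i)})-\hat{\mu}_{t\mid T}(\hat{\theta})\big)^{2}\right].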
30
We propose a non-parametric bootstrap in which the bootstrap replicates of the series are obtained from the innovation form after resampling from the innovations.
33
3. VaR and ES
In the context of financial risk management, one of the central issues in density forecasting is to track certain aspects of the densities, for example the VaR and the ES. Consider the GARCH(1,1) model $y_t = \sigma_t \varepsilon_t$, $\sigma_t^{2} = \omega + \alpha y_{t-1}^{2} + \beta \sigma_{t-1}^{2}$. In this context, the one-step-ahead VaR and ES at level $\alpha$ are given by $VaR^{\alpha}_{T+1} = \sigma_{T+1}\, q_{\alpha}$ and $ES^{\alpha}_{T+1} = \sigma_{T+1}\, E(\varepsilon_t \mid \varepsilon_t \le q_{\alpha})$, where $q_{\alpha}$ is the $\alpha$-quantile of the distribution of $\varepsilon_t$.
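For example, under conditional Normality of the errors, $q_{0.01}=-2.326$, so $VaR^{0.01}_{T+1}=-2.326\,\sigma_{T+1}$ and $ES^{0.01}_{T+1}=-\sigma_{T+1}\,\phi(2.326)/0.01\approx-2.665\,\sigma_{T+1}$, where $\phi(\cdot)$ is the standard Normal density.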
34
In practice, even when the model is assumed known, both the parameters and the distribution of the errors are unknown, so we work with estimates of the parameters, of $\sigma_{T+1}$ and of the error quantile. Bootstrap procedures have been proposed to obtain point estimates of the VaR by computing the corresponding quantile of the bootstrap distribution of returns (Ruiz and Pascual, 2004, JES); see the sketch below.
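Given bootstrap replicates of the one-step-ahead return (for instance the `y_fore` array from the GARCH sketch above), point estimates of VaR and ES can be read off as a quantile and a tail mean; a minimal illustration with names of my own choosing:

```python
import numpy as np

def var_es_from_bootstrap(y_fore, alpha=0.01):
    """Point estimates of VaR and ES from a bootstrap sample of future returns."""
    var_hat = np.quantile(y_fore, alpha)            # alpha-quantile of the bootstrap returns
    es_hat = y_fore[y_fore <= var_hat].mean()       # mean of the returns below the VaR
    return var_hat, es_hat
```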
36
Bootstrap procedures can also be implemented to obtain estimates of VaR and ES together with their MSEs. Christoffersen and Gonçalves (2005, J Risk) propose to compute bootstrap replicates of the VaR as $VaR^{*}_{T+1} = \hat{\sigma}^{*}_{T+1}\,\hat{q}^{*}_{0.01}$, where $\hat{\sigma}^{*}_{T+1}$ is the bootstrap forecast of the volatility and $\hat{q}^{*}_{0.01}$ is the quantile estimated from the standardized residuals of each bootstrap replicate.
37
Instead of using the residuals obtained in each of the bootstrap replicates of the original series, Nieto and Ruiz (2009, manuscript) propose to estimate $q_{0.01}$ in a second bootstrap step: for each bootstrap replicate of the series of returns, we obtain n random draws from the empirical distribution of the original standardized residuals, and the constant $q_{0.01}$ is then estimated by any of the three alternative estimators described before. In this way, we avoid the estimation error in the residuals of each replicate.
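A rough sketch of this second bootstrap step for the empirical-quantile (FHS-type) estimator of $q_{0.01}$; this is only my reading of the slide, not the authors' code, and `eps` is assumed to hold the original standardized residuals:

```python
import numpy as np

def q01_second_bootstrap(eps, n, rng=None, alpha=0.01):
    """Estimate the alpha-quantile of the errors from n fresh draws of the ORIGINAL
    standardized residuals (second bootstrap step), rather than from the residuals
    of a particular bootstrap replicate of the series."""
    rng = np.random.default_rng(rng)
    draws = rng.choice(eps, size=n, replace=True)   # n draws from the empirical distribution
    return np.quantile(draws, alpha)                # FHS-type (empirical) quantile estimator
```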
38
Monte Carlo experiments: coverages (d: error distribution; T: sample size)

95% VaR
 d     T     Normal   FHS     CF      B-FHS   B-CF
 N     250   85%      88.1%   85.8%   93.6%   90.3%
 N     500   79.6%    90.3%   88.5%   92.7%   87.3%
 S-8   250   84.9%    89.1%   82.8%   93.3%   87.6%
 S-8   500   76.8%    87.7%   88.8%   95.2%   90.1%

95% ES
 d     T     Normal   FHS     CF      B-FHS   B-CF
 N     250   88.5%    75.8%   92%     79.3%   90.8%
 N     500   79.6%    81.1%   86%     82.1%   88.6%
 S-8   250   84.9%    67.8%   97.2%   72.8%   99.4%
 S-8   500   76.8%    72.9%   99.2%   81.7%   97.9%
40
Conclusions and further research
- There are few analytical results on the statistical properties of bootstrap procedures applied to heteroscedastic time series.
- Further improvements are needed in:
  - bootstrap estimation of the quantiles and tail expectations used to compute the VaR and ES;
  - construction of prediction intervals for unobserved components (stochastic volatility).
- Multivariate extensions: Engsted and Tanggaard (2001, JEF).