1
A comparison of established and newly developed benchmarking methods Jennifer Davies, Duncan Elliott (ONS), Homesh Sayal, John Aston (University of Cambridge)
2
Overview
- What is benchmarking?
- Methods
- Comparison of benchmarking methods
- Conclusions
- Further work
3
What is benchmarking and why is it needed?
- Two series from different sources measure the same phenomenon at different frequencies, e.g. the total of the monthly turnover series does not equal the annual turnover figure
- Benchmarking constrains the high frequency series to the low frequency series
- Low frequency series – higher quality in terms of levels
- High frequency series – higher quality in terms of short term movements
- Used extensively in official statistics
4
What is benchmarking?
5
Notation
- Path series – the high frequency series
- Benchmark series – the low frequency series
- Benchmarked series – the path series constrained to the benchmark series
- Constraints: sum, average, point in time (a sketch of these in equations follows below)
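To make the constraints concrete, a sketch in equations (the symbols are mine, not from the slides): write p_t for the path series, b_T for the benchmark series, \hat{y}_t for the benchmarked series, and m_T for the number of high frequency periods falling in low frequency period T.

Sum:            \sum_{t \in T} \hat{y}_t = b_T
Average:        \frac{1}{m_T} \sum_{t \in T} \hat{y}_t = b_T
Point in time:  \hat{y}_{t_0} = b_T \text{ for one chosen period } t_0 \in T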
6
Methods - Denton
- Minimises the difference between growth rates in the observed path and the benchmarked series (one common formulation is sketched below)
- Used at the ONS to constrain the annual totals of seasonally adjusted series to the corresponding non-seasonally adjusted annual totals
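As a hedged sketch of one widely used formulation (the proportional first-difference form of the modified Denton method; the slide does not say which variant ONS uses), the benchmarked series \hat{y} solves

\min_{\hat{y}} \; \sum_{t=2}^{n} \left( \frac{\hat{y}_t}{p_t} - \frac{\hat{y}_{t-1}}{p_{t-1}} \right)^2
\quad \text{subject to} \quad \sum_{t \in T} \hat{y}_t = b_T \text{ for every benchmark period } T,

so the benchmarked series preserves the period-to-period movements of the observed path as closely as the constraints allow.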
7
Methods – Cholette-Dagum
- Regression-based approach (a sketch of the model follows below)
- Assumes the observed path series is the 'truth' plus bias and variance
- Assumes the observed benchmark is the 'truth' plus variance (the variance is often assumed to be 0 – binding benchmarking)
- Denton is a sub-case
- Used in the production of the National Accounts
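A sketch of the regression model behind this approach, in the notation above and with a constant bias term as one simple case (the slide does not give the exact specification):

p_t = \theta_t + \mu + e_t, \qquad e_t \sim (0, \sigma_e^2)
b_T = \sum_{t \in T} \theta_t + \varepsilon_T, \qquad \varepsilon_T \sim (0, \sigma_\varepsilon^2)

where \theta_t is the unobserved 'true' series and \mu is the bias of the path series. Setting \sigma_\varepsilon^2 = 0 gives binding benchmarking, and particular choices of the error structure for e_t recover the Denton solution as a sub-case.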
8
Methods - Wavelets
- A new method for benchmarking using wavelets
- Developed by Homesh Sayal, John Aston (University of Cambridge), Duncan Elliott (ONS) and Hernando Ombao (University of California at Irvine)
9
Methods - Wavelets
10
Wavelet benchmarking
- Decompose the high and low frequency series using the same wavelet basis (with additional wavelets in the high frequency series)
- Swap the low frequency wavelet coefficients in the path series with those from the benchmark series to get the benchmarked series (a toy sketch of this step follows below)
- Further step – thresholding – removes wavelet coefficients that are considered noise
- Smoothing is based on a variance estimated from a dynamic linear model (this differs from the model used by Sayal et al, 2014)
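A minimal toy sketch of the 'swap the low frequency wavelets' step in Python, assuming the PyWavelets package, a Haar basis, a dyadic aggregation factor of four (e.g. quarterly to annual) and a binding sum constraint; the thresholding and dynamic-linear-model smoothing steps of the full method are not shown, and the function and variable names are illustrative only.

import numpy as np
import pywt  # PyWavelets, assumed to be available

def haar_swap_benchmark(path, benchmark, agg=4):
    """Toy wavelet benchmarking: replace the coarse (low frequency) Haar
    coefficients of the path series with coefficients implied by the
    benchmark totals, then invert the transform.
    path      : high frequency series whose length is a multiple of agg
    benchmark : one low frequency total per block of agg path values
    agg       : aggregation factor, a power of two in this sketch"""
    level = int(np.log2(agg))  # Haar levels spanned by one benchmark period
    coeffs = pywt.wavedec(path, 'haar', mode='periodization', level=level)
    # With orthonormal Haar and periodization, each coarse approximation
    # coefficient equals (sum of its block) / 2**(level/2), so replacing these
    # coefficients with benchmark / 2**(level/2) enforces the sum constraint.
    coeffs[0] = np.asarray(benchmark, dtype=float) / 2 ** (level / 2)
    return pywt.waverec(coeffs, 'haar', mode='periodization')

# Example: a quarterly path constrained to annual totals
rng = np.random.default_rng(0)
path = 100 + np.cumsum(rng.normal(size=16))        # four years of quarterly data
benchmark = path.reshape(4, 4).sum(axis=1) * 1.02  # annual benchmarks, 2% higher
benchmarked = haar_swap_benchmark(path, benchmark)
print(benchmarked.reshape(4, 4).sum(axis=1))       # equals the benchmarks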
11
Comparison
- Index of Production (IoP)
- Two simulation studies:
  1. Using IoP as the true path and simulating noise
  2. Looking at the effect of outliers and level shifts on benchmarking
12
Comparison 1 – simulating noise
- 6241 simulated series
- True path known; noise simulated (irregular components from IoP) to get the observed path
- Diagnostics (a sketch in code follows this list):
  - Did it work? Did it meet the benchmark constraints?
  - MSE between the benchmarked series and 1. the observed path and 2. the true path
  - Percentage of growth rates with the same sign as the observed series
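A sketch of how the diagnostics could be computed for one simulated series, in Python with NumPy; the function name and the exact definition of the growth-rate measure are my assumptions, since the slide only names the diagnostics.

import numpy as np

def benchmark_diagnostics(benchmarked, observed_path, true_path):
    """MSE of the benchmarked series against the true and observed paths, and
    the percentage of period-on-period growth rates whose sign matches the
    growth rates of the true and observed paths."""
    benchmarked, observed_path, true_path = map(np.asarray, (benchmarked, observed_path, true_path))

    def growth(x):
        return np.diff(x) / x[:-1]          # period-on-period growth rates

    def pct_same_sign(a, b):
        return 100 * np.mean(np.sign(a) == np.sign(b))

    return {
        "MSE true path": np.mean((benchmarked - true_path) ** 2),
        "MSE obs. path": np.mean((benchmarked - observed_path) ** 2),
        "Growth true path": pct_same_sign(growth(benchmarked), growth(true_path)),
        "Growth obs. path": pct_same_sign(growth(benchmarked), growth(observed_path)),
    }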
13
Comparison 1 – simulating noise
- All series were run using all methods
- All series met the constraints using all methods

Measure            Denton   Cholette-Dagum   Elementary wavelets   Threshold wavelets
MSE true path      9.92     9.88             9.67                  7.15
MSE obs. path      0.47     0.45             0.40                  4.14
Growth true path   76.5%    76.5%            76.5%                 52.1%
Growth obs. path   99.5%    99.5%            99.7%                 59.7%
14
Comparison 1 – simulating noise
16
Comparison 2 – adding outliers and level shifts to the path series
- Use the IoP series as the true path
- Outlier – reduce the January 2006 observation by 5% – use as the observed path
- Level shift – reduce the series prior to May 2005 by 4% – use as the observed path (sketches of constructing both observed paths follow below)
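A sketch of constructing the two perturbed observed paths in Python, assuming the IoP series is a monthly pandas Series with a DatetimeIndex; the percentages and dates come from the slide, the function names are illustrative.

import pandas as pd

def add_outlier(iop: pd.Series) -> pd.Series:
    """Observed path with an outlier: the January 2006 value reduced by 5%."""
    path = iop.copy()
    jan_2006 = (path.index.year == 2006) & (path.index.month == 1)
    path[jan_2006] *= 0.95
    return path

def add_level_shift(iop: pd.Series) -> pd.Series:
    """Observed path with a level shift: values before May 2005 reduced by 4%."""
    path = iop.copy()
    path[path.index < "2005-05-01"] *= 0.96
    return path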
17
Comparison – series with outlier
18
Comparison – series with level shift
19
Comparison - adding outliers and level shifts to the path series
- Further work: adding noise into the path series

Measure                       Denton   Cholette-Dagum   Elementary wavelet   Threshold wavelet
MSE true path – outlier       0.13     0.13             0.13                 1.79
MSE true path – level shift   0.13     0.16             0.24                 2.00
20
Conclusions
- The elementary wavelet method performs slightly better than Denton and Cholette-Dagum in the simulations
- The threshold wavelet method smooths the series, so it gets close to the 'truth' but not to the observed path, and because of the smoothing it does not reproduce the observed growth rates
- Outliers and level shifts:
  - The elementary wavelet method is affected more by level shifts (before the break) and less affected by outliers
  - All methods would be improved by adjusting for additive outliers and level shifts
21
Further work
- Further work on outliers and level shifts
- Revisions analysis – incorporate wavelet forecasting
- Practical problems of implementing benchmarking, e.g. additional constraints
22
Thanks!