1
World Meteorological Organization
Working together in weather, climate and water
Enhanced User and Forecaster Oriented TAF Quality Assessment
CAeM-XIV TECO, 6 February 2010
Queenie CC Lam
2
TAF Quality Assessment
Needs, from the aspects of:
– quality management
– user focus
– use of meteorological information for decision making in air navigation
Issues:
– mismatches between the criteria for inclusion of change groups and amendment (ICAO Annex 3, Appendix 5) and those for an accurate forecast (ICAO Annex 3, Attachment B)
– shortfalls in the existing TAF performance measure in ICAO Annex 3, Attachment B
3
TAF Accuracy... Who Cares? Why?
Stakeholders
– Users
  Pilots and airlines: pre-flight planning and in-flight re-planning
  Air traffic control: better management of air traffic flow and control
– Forecasters
  better understanding of strengths and weaknesses
  continuous professional development (learn from mistakes, remove bias)
– Developers
  development of forecasting techniques
  validation of changes to the underpinning model or nowcasting system, to improve overall forecast performance
– Management
  identification of areas for improvement and resource allocation
  evidence of forecast quality for users, increasing their confidence in using the forecasts for better decision making
4
TAF Verification
Quality management
– forecast performance as a key quality objective
– basis for continuous improvement of service
ICAO Annex 3
– operationally desirable accuracy (Attachment B)
5
Mismatches
Remark: the number of change and probability groups should be kept to a minimum and should not normally exceed five groups.
6
TAF Accuracy (% of cases within range, or % of accurate forecasts: the existing performance measure in ICAO Annex 3, Attachment B)
Merits
– simple and easy to understand
– convenient for setting a globally applicable level of desirable accuracy
Shortfalls
– may not reflect skill properly: at most airports, the target percentage of accurate forecasts could be met even by never forecasting significant weather
– fails to delineate performance in forecasting high-operational-impact events
– needs special treatment to cater for change groups
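The first shortfall can be illustrated with a back-of-the-envelope calculation (all numbers below are invented for illustration): where significant weather is rare, never forecasting it still yields a very high percentage of accurate forecasts.

```python
# Hypothetical example: low visibility occurs in only 2% of forecast hours.
# A "forecaster" who never forecasts low visibility scores 98% accuracy,
# yet detects none of the events that matter operationally.
n_hours = 10_000
event_hours = 200                      # low-visibility hours (2% climatology)

# "Never forecast the event": every non-event hour is a correct rejection,
# every event hour is a miss.
hits, misses = 0, event_hours
false_alarms, correct_rejections = 0, n_hours - event_hours

percent_correct = (hits + correct_rejections) / n_hours * 100
pod = hits / (hits + misses)           # probability of detection

print(f"percent correct = {percent_correct:.1f}%")   # high accuracy...
print(f"POD             = {pod:.2f}")                # ...but zero detection
```

This is why the percentage of accurate forecasts alone says little about skill in forecasting rare, high-impact events.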
7
Way Forward to Address the Issues
– align the change group criteria with the desirable forecast accuracy
– enhance the existing TAF verification system with additional performance metrics
8
Proposal for an Enhanced TAF Verification System
Objective
– reflect forecast skill (with respect to a no-skill method such as climatology or persistence)
– generate useful verification information for use by different stakeholders
Approach
– supplement with additional performance metrics, e.g. using the contingency table method, to reflect forecast skill, especially in forecasting high-operational-impact events; take climatology and the frequency of events into account
– adapt to local forecasting practice; consult users, e.g. local ATC
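Measuring skill "with respect to a no-skill method" can be sketched as follows (the data are invented; persistence is used as the reference, i.e. forecasting that the next hour's category equals the current observed category):

```python
# Minimal sketch of a skill score against a persistence reference.
obs      = [0, 0, 1, 1, 1, 0, 0, 0, 1, 0]   # hourly observed category (1 = event)
forecast = [0, 1, 1, 1, 0, 0, 0, 0, 1, 0]   # TAF-derived hourly category

# Persistence reference: each hour is forecast as the previous hour's
# observation (so the first hour has no reference and is skipped).
persist = obs[:-1]
acc_fcst = sum(f == o for f, o in zip(forecast[1:], obs[1:])) / len(persist)
acc_ref  = sum(p == o for p, o in zip(persist, obs[1:])) / len(persist)

# Generic skill score: fraction of the reference's error removed by the
# forecast (1 = perfect, 0 = no better than persistence, < 0 = worse).
skill = (acc_fcst - acc_ref) / (1 - acc_ref)
print(acc_fcst, acc_ref, skill)
```

A climatological reference can be substituted the same way, by replacing `persist` with the most frequent category at that airport.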
9
Use of Contingency Tables
Delineate performance in forecasting high-operational-impact events:
– low visibility
– low cloud ceiling
– moderate/heavy precipitation, thunderstorms
Categorization based on change group criteria
10
An Example: Austro Control's Use of the High/Low Contingency Table Method
(Reference: Mahringer, G., 2008: Terminal aerodrome forecast verification in Austro Control, using time windows and ranges of forecast conditions. Meteorol. Appl., 15, 113-123)
Decompose the TAF into hourly forecasts
– a change group covers a range of forecast conditions
Two contingency tables for each weather element
– H-table: HIGHEST forecast vs HIGHEST observed condition in each hour
– L-table: LOWEST forecast vs LOWEST observed condition in each hour
Source of observational data
– METAR/SPECI
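The H/L-table idea can be sketched in a few lines (categories and data below are invented for illustration, not taken from Austro Control): each hour gets a (lowest, highest) forecast category, because a change group covers a range of conditions, and METAR/SPECI observations likewise give a (lowest, highest) observed category per hour.

```python
# Illustrative sketch of building the two contingency tables.
from collections import Counter

# Hypothetical visibility categories:
#   0: < 600 m, 1: 600-1500 m, 2: 1500-5000 m, 3: > 5000 m
# Each tuple is (lowest, highest) category for one hour.
fcst_ranges = [(3, 3), (1, 3), (0, 1), (0, 0), (2, 3)]   # hourly forecast range
obs_ranges  = [(3, 3), (2, 3), (0, 2), (0, 1), (3, 3)]   # hourly observed range

# L-table counts (lowest forecast, lowest observed) pairs;
# H-table counts (highest forecast, highest observed) pairs.
l_table = Counter((f[0], o[0]) for f, o in zip(fcst_ranges, obs_ranges))
h_table = Counter((f[1], o[1]) for f, o in zip(fcst_ranges, obs_ranges))

print("L-table:", dict(l_table))
print("H-table:", dict(h_table))
```

Counting ranges this way avoids penalizing a change group for legitimately covering more than one condition within its validity period.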
11
11 H/L-Contingency Table Method for Verification of Visibility Forecast
12
Performance Metrics
Generated from two-dimensional contingency tables
13
Examples of Performance Metrics
Reference: Jolliffe, I.T. and Stephenson, D.B. (Eds.), 2003: Forecast Verification: A Practitioner's Guide in Atmospheric Science. John Wiley & Sons, Chichester, UK.
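For a 2x2 (event / no event) table, a few of the metrics discussed in Jolliffe and Stephenson can be computed as below (the counts are invented for illustration):

```python
# Minimal sketch of common 2x2 contingency-table metrics.
def contingency_scores(hits, false_alarms, misses, correct_rejections):
    n = hits + false_alarms + misses + correct_rejections
    pc   = (hits + correct_rejections) / n           # percent correct
    pod  = hits / (hits + misses)                    # probability of detection
    far  = false_alarms / (hits + false_alarms)      # false alarm ratio
    pofd = false_alarms / (false_alarms + correct_rejections)
    pss  = pod - pofd                                # Peirce skill score
    return {"PC": pc, "POD": pod, "FAR": far, "PSS": pss}

scores = contingency_scores(hits=30, false_alarms=20, misses=10,
                            correct_rejections=940)
print(scores)
```

Note how PC stays near 1 for a rare event while POD and PSS expose how much of the event is actually captured, which is exactly the "reflect forecast skill" objective above.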
14
14 Result Presentation for Users (1)
15
Result Presentation for Users (2)
Forecast value
– costs of airline operations: weather-related delays; safety aspects
– planning ahead reduces costs
– the economic value of the forecasts can be estimated if the cost of precautionary measures and the loss due to the weather event are known
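The last point is the standard cost-loss framing, which can be sketched as follows (all numbers invented): the user pays cost C to take precautions whenever the event is forecast, or suffers loss L when the event occurs unprotected.

```python
# Sketch of relative economic value under a simple cost-loss model.
def expense(hits, false_alarms, misses, correct_rejections, C, L):
    """Total expense when the user acts on the forecast."""
    return (hits + false_alarms) * C + misses * L

h, fa, m, cr = 30, 20, 10, 940       # contingency-table counts
C, L = 1.0, 10.0                     # protection cost vs unprotected loss
n = h + fa + m + cr

e_fcst    = expense(h, fa, m, cr, C, L)
e_always  = n * C                    # always protect
e_never   = (h + m) * L              # never protect
e_ref     = min(e_always, e_never)   # best climatological strategy
e_perfect = (h + m) * C              # protect exactly when the event occurs

# Relative value: 1 = perfect forecast, 0 = no better than climatology.
value = (e_ref - e_fcst) / (e_ref - e_perfect)
print(f"value = {value:.2f}")
```

This kind of figure speaks directly to airlines and ATC, since it expresses verification results in the units they plan in.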
16
Result Presentation for Forecasters
FTOS31 LOWM 281100
TAF LOWL 281130Z 2812/0118 27006KT 9999 SCT012 SCT025
BECMG 2819/2822 VRB02KT 0500 BCFG FEW012
BECMG 2822/2824 0200 FG VV001
BECMG 0107/0110 09005KT 3000 BR OVC005
TEMPO 1018 6000 SCT010=
17
18
Principles of TAF Verification
(1) From the user perspective
– e.g. requiring accuracy within a certain range; forecasts of multiple states of the atmosphere within a single time period
(2) Normalization of verification scores
– the % of accurate forecasts is heavily influenced by the climatology of the location and the frequency of changes in atmospheric conditions
– normalization enables comparison between airports
(3) Alignment between the criteria for inclusion of change groups and amendment and those for accurate forecasts