Slide 1 Progress in global tropical cyclone forecasting at ECMWF. Martin Miller. (Nargis, 0600 UTC 2 May)
Slide 2 Outline: Introduction; General performance improvements (objective scores, systematic errors); TC forecast skill (ECMWF: tracks, intensity and genesis; other centres); Forecast system improvements (physical parametrizations, observation quality control, resolution); Extended-range forecasting (monthly, seasonal); Conclusions.
Slide 3 Historic Evolution of Skill
Slide 4 Global extra-tropics
Slide 5
Slide 6
Slide 7 Large-scale, e.g. MJO
Slide 8 ECMWF Systematic Error: D+3 DJF. Panels: Z500, stream function 200 hPa, velocity potential 200 hPa; periods 1986-1989, 1996-1999, 2006-2009.
Slide 9 ECMWF Systematic Error: D+3 JJA. Panels: Z500, stream function 200 hPa, velocity potential 200 hPa; periods 1986-1989, 1996-1999, 2006-2009.
Slide 10 Systematic Error: D+10 DJF. Panels: Z500, stream function 200 hPa, velocity potential 200 hPa; periods 1986-1989, 1996-1999, 2006-2009.
Slide 11 Systematic Error: D+10 JJA. Panels: Z500, stream function 200 hPa, velocity potential 200 hPa; periods 1986-1989, 1996-1999, 2006-2009.
Slide 12 Ike and Gustav: forecast from 27 Aug 00Z at t+132 (VT: 1 Sept) and the analysis (VT: 1 Sept).
Slide 13
Slide 14
Slide 15
Slide 16 Forecast system development. The current level of success in TC forecasting at ECMWF is due to many important changes to the forecast system over the years. Changes of particular note include: implementation of 4D-Var; increases in horizontal and vertical model resolution; improved model physics; the availability and assimilation of large volumes of satellite data, together with dropsondes released in and around TCs; EPS stochastic physics and targeted perturbations in the vicinity of TCs; and a wide range of improvements to the seasonal forecasting systems involving both the atmosphere and the ocean.
Slide 17 Tropical cyclones. Verification of TC predictions from the operational deterministic forecast for 12-month periods ending on 14 July. The latest period, 15 July 2008 to 14 July 2009, is shown in red. Mean error in core pressure (left) and position (right).
Slide 18 Overall performance of TC forecasts. Deterministic medium-range forecasts (day 0-5) for reported TCs, 2003-2009: direct position error (DPE) and core pressure error (CPE), with the 70% confidence level for the mean computed as a function of lead time. There is a general trend to lower values since 2006; a significant reduction in CPE is found in 2008 and confirmed in 2009. Physics changes in Nov 2007 may explain the positive impact in reducing the intensity and position errors.
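The direct position error above is just the great-circle distance between the forecast and best-track cyclone centres. As a minimal sketch (the haversine form and the sample coordinates are illustrative assumptions, not the operational verification code):

```python
import math

EARTH_RADIUS_KM = 6371.0

def direct_position_error(lat_fc, lon_fc, lat_obs, lon_obs):
    """Great-circle distance (km) between the forecast and observed TC centres,
    computed with the haversine formula."""
    phi1, phi2 = math.radians(lat_fc), math.radians(lat_obs)
    dphi = math.radians(lat_obs - lat_fc)
    dlmb = math.radians(lon_obs - lon_fc)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2.0 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# Illustrative values only: forecast centre vs. best-track centre.
print(round(direct_position_error(21.5, 90.0, 20.8, 91.2), 1), "km")
```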
Slide 19 Tropical cyclones. Verification of tropical cyclone predictions from the operational deterministic forecast for 12-month periods ending on 14 July. Along-track error: mean error in the direction of travel of the cyclone (negative values indicate a slow bias). Cross-track error: mean error at right angles to the direction of travel.
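A common way to obtain along-track and cross-track components is to project the forecast position error onto the observed direction of travel and its perpendicular. The sketch below is an illustrative assumption (local flat-Earth projection, hypothetical helper at_ct_error), not the exact procedure behind these statistics; it reproduces the convention that a negative along-track error means a slow bias.

```python
import math

def at_ct_error(obs_prev, obs_now, fc_now):
    """Split the error of fc_now relative to obs_now into along-track (AT) and
    cross-track (CT) components in km. Points are (lat, lon) in degrees; a local
    flat-Earth approximation is used. AT < 0 means the forecast lags the storm."""
    km_per_deg = 111.2                      # approximate km per degree of latitude
    lat0 = math.radians(obs_now[0])

    def to_xy(p):
        return ((p[1] - obs_now[1]) * km_per_deg * math.cos(lat0),
                (p[0] - obs_now[0]) * km_per_deg)

    px, py = to_xy(obs_prev)
    tx, ty = -px, -py                       # observed motion: previous -> current
    norm = math.hypot(tx, ty) or 1.0
    tx, ty = tx / norm, ty / norm           # unit vector along the track

    ex, ey = to_xy(fc_now)                  # forecast error vector
    at = ex * tx + ey * ty                  # along-track component
    ct = -ex * ty + ey * tx                 # cross-track component (left of motion positive)
    return at, ct

# Illustrative example: storm moving north, forecast placed behind and to the right.
print(at_ct_error((10.0, 130.0), (12.0, 130.0), (11.5, 130.4)))
```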
Slide 20
Slide 21 Nargis, 0440 UTC 1 May
Slide 22 D+5 from 23 April 00Z (T799); D+9 from 23 April 00Z; EPS probabilities from 23 April 00Z at D+5. Black dots are the first officially reported positions of the cyclone (~28 April).
Slide 23 August-September 2008 Atlantic basin tropical storm genesis. Increasingly, ECMWF forecasts pick up tropical cyclones before they are officially reported; recent Atlantic hurricanes were predicted 5-7 days before they were observed.
TS/Hurricane | 1st observed date | Forecast detection
Fay | 16-Aug | D+6 (run: 10-Aug)
Gustav | 25-Aug | D+5 (run: 20-Aug)
Hanna | 28-Aug | D+5 (run: 23-Aug)
Ike | 02-Sep | D+7 (run: 26-Aug)
Josephine | 03-Sep | D+5 (run: 29-Aug)
Slide 24 Seamless Approach to Understanding Model Error. Panels: precipitation climatology JJA (GPCP); model climate error JJA (32R3); analysis increments U925 (JJA 2008); process tendencies U925 (JJA 2008) for dynamics, convection and vertical diffusion. For the Indian monsoon, the momentum tendency is a residual of the sum of large terms. Units shown: mm day-1, m s-1 day-1, and 0.1 m s-1 per 12 h.
Slide 25 WGNE TC Verification. TC tracks for the 2008 season: Northern Hemisphere [2008/01/01 to 2008/12/31], Southern Hemisphere [2007/09/01 to 2008/08/31]. Number of TCs [best-track data provider]: 22 western North Pacific [RSMC Tokyo]; 17 eastern North Pacific (including Central Pacific) [RSMC Miami, Honolulu]; 16 North Atlantic [RSMC Miami]; 4 north Indian Ocean [RSMC New Delhi]; 12 south Indian Ocean [RSMC La Reunion]; 4 around Australia [RSMC Nadi and 4 TCWCs]. Daisuke Hotta and Takuya Komori (NPD/JMA).
Slide 26 (a) Verification of the western North Pacific (WNP) domain: position error. 22 TCs in 2008.
Slide 27 (a) WNP domain detection rate: detection rate vs. position error map (FT+72); the direction of better performance is marked on the plot.
Slide 28 (a) WNP domain AT-CT bias map (FT+72). Models: JMA, ECMWF, UKMO, CMC, DWD, NCEP, NRL, Meteo France. Scatter diagram of TC positions at the 72-hour forecast. Red: before recurvature; green: during recurvature; blue: after recurvature. The y-axis shows position errors in the along-track (AT) direction and the x-axis in the cross-track (CT) direction; unit: km. ECMWF has a small bias at all stages. CMC, DWD, NCEP, NRL and Meteo France have a slow bias in the 'before recurvature' stage.
Slide 29 (a) WNP domain central pressure scatter diagram (FT+72). Models: JMA, ECMWF, UKMO, CMC, DWD, NCEP, NRL, Meteo France. Scatter diagram of central pressure at the 72-hour forecast; the y-axis shows the forecast central pressure and the x-axis the analysed value (unit: hPa). The predicted central pressure of most models is higher than the analysis. JMA and ECMWF show some relationship between analysis and forecast; for the other centres the relationship is very weak.
Slide 30 Time series of 2-day and 4-day forecasts from JMA, ECMWF, UKMO and the three-centre ensemble in the WNP domain.
Slide 31 (b) Verification of the North Atlantic (NAT) domain: position error. 16 TCs in 2008.
Slide 32 (b) NAT domain detection rate: detection rate vs. position error map (FT+72); the direction of better performance is marked on the plot.
Slide 33 Characteristics by Northern Hemisphere domain: the position error of each model is compared by domain on the next slide.
Slide 34
Slide 35 (d) Verification of the south Indian Ocean (SIO) domain: position error. 12 TCs in 2008.
Slide 36 Position error (2007 season); detection rate (2008 season).
Slide 37 (d) SIO domain detection rate: detection rate vs. position error map (FT+72); the direction of better performance is marked on the plot.
Slide 38 Verification of TC genesis forecasts in the WNP domain based on TIGGE data. N.B. the colouring (= lead time) differs from the other figures.
Center | Model resolution | Verified data resolution
JMA (deterministic) | TL959L60 | 0.25°
JMA | TL319L60 | 0.5625°
ECMWF (deterministic) | TL799L91 | 0.5625°
ECMWF | TL399L62 | 0.5625°
UKMO | 0.833°×1.25° L38 | 0.5625°
NCEP | T126L28 | 0.5625°
CMC | 0.9° L28 | 0.5625°
CMA | T213L31 | 0.5625°
KMA | T213L40 | 0.5625°
CPTEC | T126L28 | 0.5625°
Comparison between the ECMWF EPS and deterministic systems, and the JMA EPS and deterministic systems, suggests that horizontal resolution may not be so important for genesis.
Slide 39 Tropical Forecast Biases. Precipitation against GPCP for different cycles (31r1, 32r2, 32r3), from 15-year and 5-month integrations for 1990-2005.
Slide 40 Introduction of the Huber norm. The observation cost is quadratic (Gaussian) for small normalised departures and grows only linearly for large ones: ρ(x) = x²/2 if |x| ≤ k, and ρ(x) = k|x| − k²/2 if |x| > k. Curves compared on the slide: Gaussian, Huber, Gaussian + flat.
Slide 41 The Huber norm and robust estimation. Normalised fit of a PDF to 18 months of conventional data (Feb 2006 - Sep 2007): best Gaussian fit versus best Huber-norm fit.
Slide 42 Comparing optimal observation weights: Huber norm (red) vs. Gaussian + flat (blue). More weight in the middle of the distribution; more weight on the edges of the distribution, i.e. more influence of data with large departures. Weights shown: 0-25%.
Slide 43 Changes in the IFS. Huber norm parameters for: SYNOP, METAR, DRIBU (surface pressure, 10 m wind); TEMP, AIREP (temperature, wind); PILOT (wind). Relaxation of the first-guess check: relaxed first-guess checks when Huber VarQC is done, with relaxation out to ~20 sigma. Retuning of the observation error: smaller observation errors for Huber VarQC.
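As an illustration of the relaxed first-guess check listed above, the sketch below applies a generic background check whose rejection threshold is loosened to ~20 sigma when Huber VarQC is active, so that robust down-weighting rather than outright rejection handles large departures. The 5-sigma default and the function name are assumptions for illustration only:

```python
def passes_first_guess_check(departure, sigma_o, sigma_b, huber_varqc=False):
    """Generic first-guess (background) check sketch: reject an observation whose
    departure from the first guess exceeds a multiple of the expected spread.
    With Huber VarQC active the check is relaxed (here to ~20 sigma) so that
    robust weighting, not rejection, deals with outliers."""
    expected_spread = (sigma_o**2 + sigma_b**2) ** 0.5
    limit = 20.0 if huber_varqc else 5.0   # illustrative multipliers only
    return abs(departure) <= limit * expected_spread

# Example: a 12 hPa surface-pressure departure with sigma_o = sigma_b = 1 hPa.
print(passes_first_guess_check(12.0, 1.0, 1.0, huber_varqc=False))  # rejected
print(passes_first_guess_check(12.0, 1.0, 1.0, huber_varqc=True))   # kept for VarQC
```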
Slide 44 Implementation of the next resolution upgrade. Configuration: deterministic model T1279L91 (~16 km); 4D-Var outer loop at T1279L91 with inner loops at T159/T255/T255; EPS resolution T639 to day 10 and T319 thereafter; wave model at 25 km with 36 directions. Implementation planned for January 2010.
Slide 45 Hurricane Bill, observed pressure ~944 hPa. Operational system (35r2): 980 hPa. With improved QC, Huber norm (35r3): 961 hPa. High-resolution system, T1279 etc. (36r1): 945 hPa.
Slide 46 Typhoon Melor
Slide 47 ECMWF model simulations (T1279, ~15 km) compared with Satellite Observations (Met 9)
Slide 48 The MJO and tropical cyclones in the monthly forecast system. What is the state of play regarding the model's MJO? What is the model's TC climatology like? How does the MJO influence the model's TCs? 15-member ensemble forecasts starting on the 15th of each month from 1989 to 2008; 46-day integrations with cycle 32R3; T399 uncoupled until day 10 and T255 coupled after day 10. (Frederic Vitart, submitted to GRL)
Slide 49 MJO propagation: time spent in each phase of the MJO, ERA-I (ERA-Interim) vs. the model.
Slide 50 Amplitude of the MJO
Slide 51 Tropical cyclone genesis climatology 1989-2008: observations vs. model, for the NDJFMA and JASON seasons.
Slide 52 Tropical cyclone density climatology 1989-2008: observations vs. model, for the NDJFMA and JASON seasons.
Slide 53 MJO composite (ASO) of tropical storm density anomaly for MJO phases 2+3, 4+5, 6+7 and 8+1: model vs. observations.
Slide 54 MJO composite (ASO) of tropical storm density anomaly, phases 6+7 minus phases 2+3: observations vs. model.
Slide 55 Example of Tropical Storm Seasonal Forecast
Slide 56 Example of Tropical Storm Seasonal Forecast
Slide 57 Interannual variability of Atlantic tropical storm ACE. Forecasts issued in June for the period July-December, 1990-2009. Correlation: 0.72; RMS error: 0.40.
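For reference, accumulated cyclone energy (ACE) is conventionally the sum of squared 6-hourly maximum sustained winds (in knots) over all records at tropical-storm strength or above, scaled by 10^-4. A short sketch under that standard convention; the wind history is invented for illustration, not the data behind this figure:

```python
def accumulated_cyclone_energy(max_winds_kt):
    """ACE (10^4 kt^2): sum of squared 6-hourly maximum sustained winds
    for records at or above tropical-storm strength (>= 34 kt)."""
    return sum(v * v for v in max_winds_kt if v >= 34.0) * 1e-4

# Illustrative 6-hourly wind history (kt) for a single storm.
storm = [25, 30, 35, 45, 60, 75, 70, 55, 40, 30]
print(f"ACE = {accumulated_cyclone_energy(storm):.3f} x 10^4 kt^2")
```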
Slide 58 Interannual variability of the frequency of typhoons. Forecasts issued in April for the period May-October, 1990-2009. Correlation: 0.66; RMS error: 3.29.
Slide 59 Closing remarks. Steady improvements in skill in traditional measures such as 500 hPa geopotential (~1 day per decade for deterministic forecasts and ~1 day per 7 years for ensemble systems). Forecasts of weather parameters such as precipitation and cloud are improving at ~1 day per 7 years. The skill of tropical cyclone forecasting is improving at a similar rate, with emerging skill in intensity and genesis; ECMWF is generally the best available, especially in the most recent years. Advanced assimilation methods enable better use of storm-related data. Progressive increases in resolution and improved physics are improving the realism of structures and intensity, and hence tracking. Links to MJO activity provide some sub-seasonal skill in forecasting TC statistics. Seasonal forecasting shows (surprisingly?) good skill in multi-basin statistics.
Slide 60 Extreme forecast index (EFI). Used as an "alarm bell" to alert forecasters to possible extreme events; the forecaster can then look in more detail (probabilities, EPSgrams). The EFI indicates places where the EPS distribution is shifted towards the extreme of the climate distribution: it measures the distance between the EPS cumulative distribution and the model climate distribution, ranging from -1 (all members break the model climate minimum records) to +1 (all members beyond the model climate maximum records). EFI parameters: temperature, precipitation, wind speed, wind gusts.
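One commonly quoted formulation of the EFI integral is EFI = (2/pi) * ∫0^1 (p − F_f(p)) / sqrt(p(1 − p)) dp, where F_f(p) is the fraction of EPS members falling below the model-climate quantile at probability p; it reaches exactly -1 and +1 in the two record-breaking cases described above. The sketch below evaluates it numerically on synthetic data and is a simplified illustration, not the operational ECMWF implementation:

```python
import numpy as np

def extreme_forecast_index(ens_values, climate_values, n_quantiles=99):
    """EFI sketch: (2/pi) * integral over p of (p - F_f(p)) / sqrt(p(1-p)),
    where F_f(p) is the fraction of ensemble members below the model-climate
    p-quantile. Result lies in roughly -1 .. +1."""
    ens = np.asarray(ens_values, dtype=float)
    clim = np.asarray(climate_values, dtype=float)
    # Interior quantile levels (midpoints) avoid the endpoint singularities.
    p = (np.arange(n_quantiles) + 0.5) / n_quantiles
    clim_q = np.quantile(clim, p)
    f_f = np.array([(ens < q).mean() for q in clim_q])
    integrand = (p - f_f) / np.sqrt(p * (1.0 - p))
    return (2.0 / np.pi) * integrand.mean()   # midpoint-rule integral over (0, 1)

# Illustrative only: model climate sample vs. an EPS shifted towards heavy precipitation.
rng = np.random.default_rng(0)
climate = rng.gamma(shape=2.0, scale=5.0, size=5000)   # model climate sample
eps = rng.gamma(shape=2.0, scale=9.0, size=51)         # 51-member EPS, wetter than climate
print(round(extreme_forecast_index(eps, climate), 2))
```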
Slide 61 EFI verification. Verification of the Extreme Forecast Index (EFI) for precipitation (left) and 10 m wind (right) over Europe. Trend in annual performance (ROC area) from 2004 to 2009 for day 2 (blue) and day 5 (red). An extreme event is taken as an observation exceeding the 95th percentile of the station climate. Hit rates and false alarm rates are calculated for the EFI exceeding different thresholds.