Forecast Verification Research


Forecast Verification Research
Barbara Brown, NCAR, with thanks to Beth Ebert and Laurie Wilson
S2S Workshop, 5-7 Feb 2013, Met Office

Verification working group members
- Beth Ebert (BOM, Australia)
- Laurie Wilson (CMC, Canada)
- Barb Brown (NCAR, USA)
- Barbara Casati (Ouranos, Canada)
- Caio Coelho (CPTEC, Brazil)
- Anna Ghelli (ECMWF, UK)
- Martin Göber (DWD, Germany)
- Simon Mason (IRI, USA)
- Marion Mittermaier (Met Office, UK)
- Pertti Nurmi (FMI, Finland)
- Joel Stein (Météo-France)
- Yuejian Zhu (NCEP, USA)

Aims
The verification component of WWRP, in collaboration with WGNE, WCRP and CBS ("Joint" between WWRP and WGNE):
- Develop and promote new verification methods
- Provide training on verification methodologies
- Ensure forecast verification is relevant to users
- Encourage sharing of observational data
- Promote the importance of verification as a vital part of experiments
- Promote collaboration among verification scientists, model developers and forecast providers

Relationships / collaboration
WGCM, WGNE, TIGGE, SDS-WAS, HyMeX, Polar Prediction, SWFDP, YOTC, Subseasonal to Seasonal Prediction, CG-FV, WGSIP, SRNWP, COST-731

FDPs and RDPs
- Sydney 2000 FDP
- Beijing 2008 FDP/RDP
- SNOW-V10 RDP
- FROST-14 FDP/RDP
- MAP D-PHASE
- Other FDPs: Lake Victoria, Typhoon Landfall FDP, Severe Weather FDP
We intend to establish collaboration with SERA on verification of tropical cyclone forecasts and other high-impact weather warnings.

SNOW-V10
- Nowcast and regional model verification at observation sites
- User-oriented verification: tuned to the decision thresholds of VANOC, covering the whole Olympic period; a relatively high concentration of data was available for that period
- Model-oriented verification: model forecasts verified in parallel, January to August 2010
Status: significant effort to process and quality-control observations; multiple observations at some sites allow observation error to be estimated.

[Figures: wind speed verification (model-oriented); visibility verification (user-oriented)]

FROST-14
User-focused verification:
- Threshold-based, as in SNOW-V10
- Timing of events: onset, duration, cessation
- Real-time verification
- Road weather forecasts?
Model-focused verification:
- Neighborhood verification of high-resolution NWP
- Spatial verification of ensembles
- Accounting for observation uncertainty
Notes: Anatoly Muravyev and Evgeny Atlaskin came to the Verification Methods Workshop in December and will be working on the FROST-14 verification.

Promotion of best practice
Recommended methods for evaluating cloud and related parameters: the cloud document is just out! Originally requested by WGNE, it has been in the works for some time. It contains recommendations for standard verification of cloud amount and related variables, such as cloud base height and the vertical profile of cloud amount, using both point-based and spatial observations (satellite, cloud radar, etc.).

Promotion of best practice
Verification of tropical cyclone forecasts. Contents:
- Introduction
- Observations and analyses
- Forecasts
- Current practice in TC verification: deterministic forecasts
- Current verification practice: probabilistic forecasts and ensembles
- Verification of monthly and seasonal tropical cyclone forecasts
- Experimental verification methods
- Comparing forecasts
- Presentation of verification results
Notes: The JWGFVR is also preparing a document describing methods for verifying tropical cyclone forecasts, in support of GIFS-TIGGE and the WMO Typhoon Landfall FDP. It will include standard methods for assessing track and intensity forecasts, probabilistic and ensemble forecast verification, and a review of recent developments in this field. In addition to track and intensity, we also recommend methodologies for TC-related hazards: wind, heavy precipitation and storm surge.

Verification of deterministic TC forecasts

Beyond track and intensity…
[Figures: track error distribution; TC genesis; wind speed; precipitation (MODE spatial method)]
Notes: Most tropical cyclone verification (at least operationally) focuses on only two variables: track location and intensity. Since a great deal of the damage associated with tropical storms is related to other factors, this seems overly limiting. Some additional important variables:
- Storm structure and size
- Precipitation
- Storm surge
- Landfall time, position, and intensity
- Consistency
- Uncertainty
- Information to help forecasters (e.g., steering flow)
- Others?
Verification should be tailored to help forecasters with their high-pressure job and multiple sources of guidance information. A sketch of the basic track-error computation follows below.
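As a concrete illustration of the most basic metric above, here is a minimal sketch (not from the talk; all positions invented) of computing great-circle track error between forecast and best-track TC positions with the haversine formula:

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def track_error_km(lat_fc, lon_fc, lat_ob, lon_ob):
    """Great-circle distance (km) between forecast and observed TC centres.
    Inputs are in degrees; NumPy arrays are accepted."""
    phi1, phi2 = np.radians(lat_fc), np.radians(lat_ob)
    dphi = np.radians(lat_ob - lat_fc)
    dlam = np.radians(lon_ob - lon_fc)
    a = np.sin(dphi / 2) ** 2 + np.cos(phi1) * np.cos(phi2) * np.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * np.arcsin(np.sqrt(a))

# Invented 24-h forecast positions vs best track
fc_lat, fc_lon = np.array([18.2, 19.0]), np.array([128.5, 127.1])
ob_lat, ob_lon = np.array([18.0, 19.4]), np.array([128.0, 126.5])
errors = track_error_km(fc_lat, fc_lon, ob_lat, ob_lon)
print(errors.mean())  # mean track error (km) over the sample
```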

Promotion of best practice
Verification of forecasts from mesoscale models (early DRAFT):
- Purposes of verification
- Choices to be made: surface and/or upper-air verification? Point-wise and/or spatial verification?
Proposal for a 2nd Spatial Verification Intercomparison Project, in collaboration with Short-Range NWP (SRNWP).

Spatial Verification Method Intercomparison Project
International comparison of many new spatial verification methods.
Phase 1 (precipitation) completed:
- Methods applied by researchers to the same datasets (precipitation; perturbed cases; idealized cases)
- Subjective forecast evaluations
- Weather and Forecasting special collection, 2009-2010
Phase 2 in planning stage:
- Complex terrain
- MAP D-PHASE / COPS dataset
- Wind and precipitation, timing errors
A sketch of one neighbourhood method follows below.
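One widely used neighbourhood method of the kind compared in the project is the Fractions Skill Score (Roberts & Lean 2008). A minimal sketch, assuming uniform square neighbourhoods and zero-padded edges; the precipitation fields are synthetic:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def fss(forecast, observed, threshold, window):
    """FSS for one threshold and one square neighbourhood size (grid points)."""
    fbin = (forecast >= threshold).astype(float)
    obin = (observed >= threshold).astype(float)
    # Fraction of threshold exceedances within each neighbourhood
    pf = uniform_filter(fbin, size=window, mode="constant")
    po = uniform_filter(obin, size=window, mode="constant")
    mse = np.mean((pf - po) ** 2)
    mse_ref = np.mean(pf ** 2) + np.mean(po ** 2)
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan

rng = np.random.default_rng(0)
fcst = rng.gamma(2.0, 2.0, size=(100, 100))  # synthetic precipitation fields
obs = rng.gamma(2.0, 2.0, size=(100, 100))
print(fss(fcst, obs, threshold=5.0, window=11))  # 1 = perfect, 0 = no overlap
```

FSS is typically computed over a range of window sizes to find the smallest scale at which the forecast has useful skill.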

Outreach and training
- Verification workshops and tutorials: on-site and travelling, e.g. for the SWFDP (East Africa)
- EUMETCAL training modules
- Verification web page and sharing of tools: http://www.cawcr.gov.au/projects/verification/

5th International Verification Methods Workshop, Melbourne 2011
Tutorial: 32 students from 23 countries; lectures and exercises (students took the tools home); group projects presented at the workshop.
Workshop: ~120 participants. Topics:
- Ensembles and probabilistic forecasts
- Seasonal and climate
- Aviation verification
- User-oriented verification
- Diagnostic methods and tools
- Tropical cyclones and high impact weather
- Weather warning verification
- Uncertainty
Special issue of Meteorological Applications in early 2013. Thanks for WWRP's support!
Notes: We had some trouble with participants getting their visas on time; some countries missed out (Ethiopia) and China came late. We could use advice/help from WMO on this.

Seamless verification
Seamless forecasts are consistent across space/time scales: produced by a single modelling system or blended, and likely probabilistic / ensemble.
[Figure: schematic arranging nowcasts, very short range, NWP, sub-seasonal, seasonal, decadal prediction and climate change by forecast aggregation time (minutes to decades) and spatial scale (local point, regional, global)]
- Which scales / phenomena are predictable?
- Different user requirements at different scales (timing, location, …)

"Seamless verification" – consistent across space/time scales Modelling perspective – is my model doing the right thing? Process approaches LES-style verification of NWP runs (first few hours) T-AMIP style verification of coupled / climate runs (first few days) Single column model Statistical approaches Spatial and temporal spectra Spread-skill Marginal distributions (histograms, etc.) Seamless verification It was not clear to the group how to define seamless verification, and the WG had a lively discussion on this topic. One possible interpretation is consistent verification across a range of scales by for example applying the same verification scores to all forecasts being verified to allow comparison. This would entail greater time and space aggregation as longer forecast ranges are verified. Averaging could be applied to the EPS medium range and monthly time range, as these two forecast ranges have an overlapping period. Similarly the concept of seamless verification could be applied to the EPS medium range forecast and seasonal forecast. For example, verification scores could be calculated using tercile exceedance and the ERA Interim could be used as the reference system. Verification across scales could involve conversion of forecast types, for example, from precipitation amounts (weather scales) to terciles (climate scales). A probabilistic framework would likely be the best approach to connect weather and climate scales. Perkins et al., J.Clim. 2007

"Seamless verification" – consistent across space/time scales User perspective – can I use this forecast to help me make a better decision? Neighborhood approaches - spatial and temporal scales with useful skill Generalized discrimination score (Mason & Weigel, MWR 2009) consistent treatment of binary, multi-category, continuous, probabilistic forecasts Calibration - accounting for space-time dependence of bias and accuracy? Conditional verification based on larger scale regime Extreme Forecast Index (EFI) approach for extremes JWGFVR activity Proposal for research in verifying forecasts in weather-climate interface Assessment component of UK INTEGRATE project Models may be seamless – but user needs are not! Nowcasting users can have very different needs for products than short-range forecasting users (more localized in space and time; wider range of products which are not standard in SR NWP and may be difficult to produce with an NWP model; some products routinely measured, others not; …) Temporal/spatial resolution go together. On small spatial /temporal scales modelling/verification should be inherently probabilistic. The predictability of phenomena generally decreases (greatly) from short to very short time/spatial scales. How to assess/show such limits to predictability in verification? Need to distinguish “normal” and “extreme” weather? Nowcasting more than SR forecasting is interested not just in intensities of phenomena, but also in exact timing/duration and location. Insight in errors of timing/location is needed. Different demands on observations, possibly not to be met with the same data sources? From Marion: We have two work packages kicking off this FY (i.e. now or soon). I am co-chair of the assessment group for INTEGRATE which is our 3-year programme for improving our global modelling capability. The INTEGRATE project follows on from the CAPTIVATE project. INTEGRATE project pages are hosted on the collaboration server. A password is needed (as UM partners you have access to these pages). The broad aim of INTEGRATE is to pull through model developments from components of the physical earth system (Atmosphere, Oceans, Land, Sea-Ice and Land-Ice, and Aerosols) and integrate them into a fully coupled global prediction system, for use across weather and climate timescales. The project attempts to begin the process of integrating coupled atmosphere-ocean (COA) forecast data into a conventional weather forecast verification framework, and consider the forecast skill of surface weather parameters in the existing operational seasonal COA system, GloSea4 and 5, over the first 2 weeks of the forecast. Within that I am focusing more on applying weather-type verification tools on global, longer time scales, monthly to seasonal. A part of this is a comparison of atmosphere-only (AO) and coupled ocean-atmosphere (COA) forecasts for the first 15 days (initially). Both are approaching the idea of seamless forecasting, i.e. can we used COA models to do NWP-type forecasts for the first 15 days, and seamless verification, i.e. finding some common ground in the way we can compare longer simulations and short-range NWP.

Questions
- What should be the role of the JWGFVR in S2S? Defining protocols? Metrics? Guidance on methods? Participation in activities? Linking forecasting and applications?
- What should be the interaction with other WMO verification activities, e.g. the Standardized Verification System for Long-range Forecasts (SVS-LRF) and the WGNE/WGCM Climate Metrics Panel?
- How do metrics need to change for S2S? How do we cope with small sample sizes?
- Is a common set of metrics required for S2S?

Database comments
- The database should be designed to allow easy access for applications and verification.
- Observations will be needed for evaluations and applications. Will these (or links to these) be included in the database? Lack of observations can be a big challenge and a detriment to use of the database.
- Access to data: for applications and verification, users often will not want a whole field or set of fields; they may also want to examine time series of forecasts at points (see the sketch below). Data formats and access can limit uses.
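A minimal sketch of the point time-series access pattern mentioned above, using xarray on a synthetic in-memory stand-in for the archive (the variable name tp and all coordinates are invented):

```python
import numpy as np
import pandas as pd
import xarray as xr

# Synthetic stand-in for a gridded forecast archive
times = pd.date_range("2013-01-01", periods=30)
ds = xr.Dataset(
    {"tp": (("time", "lat", "lon"), np.random.rand(30, 19, 36))},
    coords={"time": times,
            "lat": np.linspace(-45, 45, 19),
            "lon": np.linspace(0, 350, 36)},
)

# The access pattern a verification user needs: a time series at one point,
# without downloading whole fields
point = ds["tp"].sel(lat=-1.29, lon=36.82, method="nearest")
print(point.to_series().head())
```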

Opportunities!
New challenges:
- Methods for evaluating extremes
- Sorting out some of the thorny problems (small sample sizes, limited observations, etc.; see the bootstrap sketch below)
- Defining meaningful metrics associated with research questions
- Making a useful connection between forecast performance and forecast usefulness/value
- Application areas (e.g., precipitation onset in Africa)
A new research area:
- Using spatial methods for evaluation of S2S forecast patterns
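A minimal sketch of one standard way to attach uncertainty to a score under small sample sizes: a simple bootstrap confidence interval on invented per-case scores. Real S2S series are autocorrelated, so a block bootstrap would usually be preferred; this only illustrates the resampling idea.

```python
import numpy as np

def bootstrap_ci(scores, n_boot=10000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for the mean of per-case scores."""
    rng = np.random.default_rng(seed)
    n = len(scores)
    boots = np.array([
        rng.choice(scores, size=n, replace=True).mean() for _ in range(n_boot)
    ])
    return np.quantile(boots, [alpha / 2, 1 - alpha / 2])

case_scores = np.array([0.12, 0.31, 0.05, 0.44, 0.20, 0.09, 0.38, 0.15])
lo, hi = bootstrap_ci(case_scores)
print(f"mean = {case_scores.mean():.2f}, 95% CI = [{lo:.2f}, {hi:.2f}]")
```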

Thank you