EQC for Seasonal Forecasts


C3S_51 Lot 3: EQC for Seasonal Forecasts
Francisco Doblas-Reyes (1,2)
1 Barcelona Supercomputing Center (BSC)
2 Institució Catalana de Recerca i Estudis Avançats (ICREA)

Challenges of the EQC for seasonal forecasts
- Adaptation: provide information for all kinds of services.
- Consistency: build trust by ensuring a high degree of coherence across products.
- Innovation: transfer recent developments from research to operations.
- Efficiency: information should respond to users' queries with as short a delay as possible.
- Targets: define data, products, verification, guidance, etc.

C3S_51 Lot 3: What
C3S_51 Lot 3 aims to develop a strategy for the evaluation and quality control (EQC) of the multi-model seasonal forecasts provided by the Copernicus Climate Change Service (C3S), responding to the needs identified among a wide range of stakeholders.
Some important elements:
- Users are consulted to identify needs in terms of products and EQC information.
- Forecast data are quality controlled and forecast products (not forecast systems) are verified.
- A quality assurance template is required to summarise the information.
- A framework is designed that serves as a reference to the community.

C3S_51 Lot 3: The strategy
The strategy covers:
- Quality assurance
- Assessment of user needs
- Technical assessment and KPIs
- Scientific assessment
- Data sources, standards and conventions
- Forecast quality prototype

C3S_51 Lot 3: Who

Partner             Nature           Role
BSC-CNS (ES)        Main contractor  Coordination, data inventory and EQC framework and prototype
Univ. Leeds (UK)    Subcontractor    Assess user requirements
MeteoSwiss (CH)     Subcontractor    Scientific quality assessment and gap analysis
Predictia (ES)      Subcontractor    CDS requirements and development of the prototype
Univ. Exeter (UK)   Subcontractor    Expert statistical advice
IFCA-CSIC (ES)      Subcontractor    Downscaling

C3S_51 Lot 3: The users
[Figure panels: "Assessing forecast quality" and "Preferences for forecast quality format"]
Beware of user fatigue.
M. Soares (Univ. of Leeds)

C3S_51 Lot 3: Standards and metadata
- All necessary metadata have been defined for the seasonal forecasts using all available standards (CHFP, SPECS).
- Similar standards could not be found for the reference observations.
- But beware: additional checks are still necessary (data checker); a minimal example of such a check follows below.
[Cartoon: xkcd.com]
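Illustrative only: a minimal sketch of the kind of automated check such a data checker might run, using xarray. The attribute and coordinate names in REQUIRED_ATTRS and REQUIRED_COORDS are assumptions for the example, not the agreed C3S common data model.

```python
# Sketch of an automated metadata check for seasonal forecast files.
# The required attribute/coordinate lists below are assumed, not the
# actual C3S common data model enforced by the project's data checker.
import sys
import xarray as xr

REQUIRED_ATTRS = ["institution", "source", "forecast_reference_time"]   # assumed
REQUIRED_COORDS = ["time", "latitude", "longitude", "realization"]      # assumed

def check_file(path: str) -> list:
    """Return a list of human-readable problems found in one NetCDF file."""
    problems = []
    ds = xr.open_dataset(path)
    for attr in REQUIRED_ATTRS:
        if attr not in ds.attrs:
            problems.append(f"missing global attribute: {attr}")
    for coord in REQUIRED_COORDS:
        if coord not in ds.coords:
            problems.append(f"missing coordinate: {coord}")
    return problems

if __name__ == "__main__":
    for path in sys.argv[1:]:
        for problem in check_file(path):
            print(f"{path}: {problem}")
```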

C3S_51 Lot 3: Scientific assessment

Forecast quality assessment
- Verification procedure for existing forecast systems based on scoring rules (e.g. RPS for multi-category probabilities, CRPS for ensembles); a minimal sketch of both scores follows after this slide.
- Products, and not data, are verified; forecast products are scrutinised.
- Impact of lagged ensembles and adequate product generation.

Calibration
- All bias correction and recalibration methods effectively remove bias.
- The added value of sophisticated methods (e.g. EMOS) is small to non-existent due to the limited hindcast length (and low skill).

Multi-model combination
- No forecast system consistently outperforms the others.
- Multi-model combination is beneficial.
- Avoid the temptation of identifying inadequate data sources to e.g. discard "bad" forecast systems.

These points address questions raised by users that lead to new products.

In addition:
- Calibration is more important than combination for ensemble forecasts.
- Construction of a 'best performing' lagged ensemble: the two most recent initialisations appear to be enough for short lead times.
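For concreteness, a minimal numpy sketch of the two scoring rules named above: the empirical CRPS for a single ensemble forecast and the RPS for a multi-category probability forecast. This is an illustration only, not the project's verification code, which relies on dedicated packages such as SpecsVerification and easyVerification.

```python
# Illustrative implementations of the scoring rules mentioned on this slide.
import numpy as np

def crps_ensemble(ens, obs):
    """Empirical CRPS of one ensemble forecast `ens` (1-D array of members)
    for a scalar observation `obs`: E|X - y| - 0.5 * E|X - X'|."""
    ens = np.asarray(ens, dtype=float)
    term1 = np.mean(np.abs(ens - obs))
    term2 = 0.5 * np.mean(np.abs(ens[:, None] - ens[None, :]))
    return term1 - term2

def rps(probs, obs_category):
    """Ranked probability score for one multi-category probability forecast.
    `probs` are the forecast probabilities per category (summing to 1) and
    `obs_category` is the index of the observed category."""
    probs = np.asarray(probs, dtype=float)
    obs = np.zeros_like(probs)
    obs[obs_category] = 1.0
    return np.sum((np.cumsum(probs) - np.cumsum(obs)) ** 2)

# Example: a 25-member ensemble and a tercile probability forecast
rng = np.random.default_rng(0)
print(crps_ensemble(rng.normal(size=25), 0.3))
print(rps([0.2, 0.3, 0.5], obs_category=2))
```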

C3S_51 Lot 3: Forecast quality assessment and calibration
[Map: skill of JJA temperature from ECMWF SEAS5 + recalibration*; colour scale from "better than a constant climatological forecast" to "worse"]
* CRPS of JJA near-surface temperature from ECMWF SEAS5 initialized in May, calibrated with the climate-conserving recalibration (CCR) and verified against ERA-Interim for 1993-2014.
S. Hemri (MeteoSwiss)
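The map compares the recalibrated forecast with a constant climatological forecast, which amounts to a skill score such as CRPSS = 1 - CRPS_fcst / CRPS_clim computed at each grid point. Below is a hedged sketch of that comparison for one grid point, assuming a leave-one-out climatological ensemble as reference; the actual MeteoSwiss assessment may differ in detail.

```python
# Sketch of the skill score behind the map: CRPSS = 1 - CRPS_fcst / CRPS_clim,
# using the leave-one-out climatology of the observations as reference
# (assumed setup for illustration only).
import numpy as np

def crps_ensemble(ens, obs):
    # Same empirical estimator as in the earlier scoring-rule sketch.
    ens = np.asarray(ens, dtype=float)
    return np.mean(np.abs(ens - obs)) - 0.5 * np.mean(np.abs(ens[:, None] - ens[None, :]))

def crpss(fcst, obs):
    """`fcst`: (years, members) hindcast ensemble at one grid point,
    `obs`: (years,) verifying observations. Positive values mean the
    forecast beats the climatological reference."""
    years = len(obs)
    crps_f = np.mean([crps_ensemble(fcst[t], obs[t]) for t in range(years)])
    # Climatological "ensemble": the observations of all other years.
    crps_c = np.mean([crps_ensemble(np.delete(obs, t), obs[t]) for t in range(years)])
    return 1.0 - crps_f / crps_c

rng = np.random.default_rng(1)
obs = rng.normal(size=22)                                  # e.g. 22 years, 1993-2014
fcst = obs[:, None] + rng.normal(scale=1.5, size=(22, 25)) # synthetic 25-member hindcast
print(crpss(fcst, obs))
```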

C3S_51 Lot 3: Forecast quality assessment and multi-model
[Map: skill of JJA temperature from the C3S multi-model*; colour scale from "better than a constant climatological forecast" to "worse"]
* CRPS of JJA near-surface temperature from ECMWF SEAS5, Meteo France System 5 and MetOffice GloSea5 initialized in May, combined with CCR + weighted (RMSE) averaging of the forecast PDFs, and verified against ERA-Interim for 1993-2014.
S. Hemri (MeteoSwiss)
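One possible reading of "weighted (RMSE) averaging of the forecast PDFs" is to weight each system by the inverse RMSE of its hindcast ensemble mean and then pool the members with those weights. The sketch below illustrates that reading only; it is not the MeteoSwiss implementation, and the function names (inverse_rmse_weights, combine) are hypothetical.

```python
# Hypothetical illustration of inverse-RMSE weighting and pooling of
# single-system ensembles into a multi-model forecast distribution.
import numpy as np

def inverse_rmse_weights(hindcasts, obs):
    """`hindcasts`: list of (years, members) arrays, one per system;
    `obs`: (years,) observations. Returns weights summing to 1."""
    rmse = np.array([np.sqrt(np.mean((h.mean(axis=1) - obs) ** 2)) for h in hindcasts])
    w = 1.0 / rmse
    return w / w.sum()

def combine(forecasts, weights, n_draws=1000, seed=0):
    """Draw from the weighted mixture of the single-system member distributions."""
    rng = np.random.default_rng(seed)
    systems = rng.choice(len(forecasts), size=n_draws, p=weights)
    return np.array([rng.choice(forecasts[s]) for s in systems])

# Usage with synthetic data: weights from the hindcast, pooling of this season's members.
rng = np.random.default_rng(2)
obs = rng.normal(size=22)
hindcasts = [obs[:, None] + rng.normal(scale=s, size=(22, 25)) for s in (1.0, 1.5, 2.0)]
weights = inverse_rmse_weights(hindcasts, obs)
today = [rng.normal(size=25) for _ in hindcasts]   # current forecast members per system
pooled = combine(today, weights)
```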

C3S_51 Lot 3: Timeliness and provenance
Computing performance and product provenance are key.
Forecast verification (ROC area) with performance analysis and workflow definition using SpecsVerification and easyVerification.
J. Bedia, D. San Martín (Predictia)
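SpecsVerification and easyVerification are R packages; as an illustration of the metric itself, here is a small Python sketch of the ROC area via the Mann-Whitney rank statistic for a probability forecast of a binary event (e.g. "temperature in the upper tercile"). It is not the project's workflow code.

```python
# Illustrative ROC-area computation for a probability forecast of a binary event.
import numpy as np
from scipy.stats import rankdata

def roc_area(prob, event):
    """ROC area via the Mann-Whitney rank statistic (ties handled by mid-ranks).
    `prob`: forecast probabilities, `event`: 0/1 observed occurrence."""
    prob = np.asarray(prob, dtype=float)
    event = np.asarray(event, dtype=int)
    ranks = rankdata(prob)
    n_pos = event.sum()
    n_neg = len(event) - n_pos
    return (ranks[event == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Example: upper-tercile probabilities vs. whether the event occurred
print(roc_area([0.2, 0.7, 0.4, 0.9, 0.1, 0.6], [0, 1, 0, 1, 0, 1]))   # 1.0 here
```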

C3S_51 Lot 3: Timeliness and provenance
Computing performance and product provenance are key.
- Users need to generate products from the forecasts.
- Users require bias adjustment (a sketch follows below).
- Computational cost estimates are fundamental for user satisfaction.
R. Manzanas (IFCA-CSIC)
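A minimal sketch of the simplest bias adjustment users ask for: a leave-one-out mean bias correction per grid point and lead time. The C3S tools may apply more elaborate methods, so treat this as an illustration of the idea only.

```python
# Leave-one-out mean bias correction of a hindcast against a reference.
import numpy as np

def mean_bias_correct(hindcast, obs):
    """`hindcast`: (years, members) for one grid point and lead time,
    `obs`: (years,) reference values. Returns the bias-corrected hindcast,
    estimating the bias without the target year."""
    years = hindcast.shape[0]
    corrected = np.empty_like(hindcast, dtype=float)
    for t in range(years):
        others = np.delete(np.arange(years), t)
        bias = hindcast[others].mean() - obs[others].mean()
        corrected[t] = hindcast[t] - bias
    return corrected
```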

C3S_51 Lot 3: Timeliness and provenance
Prepare products for reproducibility, traceability, reuse and context.
- METACLIP ensures the reproducibility of objects (NetCDF files, images) with a human- and machine-readable solution (http://metaclip.org).
- It uses an RDF-based semantic metadata model that builds its vocabularies on existing initiatives (e.g. VALUE, PROV) and stores the information in JSON files.
- It is the "de facto" IPCC WGI provenance standard for AR6.
[Figure: visualisation of provenance information]
J. Bedia, D. San Martín (Predictia)
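Not the METACLIP vocabulary itself: the sketch below is only a PROV-inspired illustration of the kind of machine-readable record (inputs, processing step, software, agent) that can be stored in JSON alongside a product so it can be reproduced later. All field values, including the package version, are invented for the example.

```python
# PROV-inspired, purely illustrative provenance record for one product.
import json
from datetime import datetime, timezone

provenance = {
    "prov:entity": "crpss_tas_jja_seas5_ccr.png",           # the product (hypothetical name)
    "prov:wasDerivedFrom": [
        "ECMWF SEAS5 hindcast, tas, May starts, 1993-2014",
        "ERA-Interim tas, JJA means, 1993-2014",
    ],
    "prov:wasGeneratedBy": {
        "step": "climate-conserving recalibration (CCR) + CRPSS vs climatology",
        "software": {"SpecsVerification": "x.y"},            # version placeholder
        "endedAtTime": datetime.now(timezone.utc).isoformat(),
    },
    "prov:wasAttributedTo": "C3S_51 Lot 3",
}

with open("provenance.json", "w") as fh:
    json.dump(provenance, fh, indent=2)
```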

C3S_51 Lot 3: Where
Quality assurance template and reports
The quality information of the climate forecast datasets should be entered into a quality assurance template to produce reports that are made available along with the datasets in the CDS. Reports include dataset documentation, quality flags and an independent assessment.

C3S_51 Lot 3
The contract has:
- documented a metadata standard to feed the common data model and created a provenance solution for products
- provided a comprehensive forecast quality assessment of products
- illustrated the benefits of bias adjustment and multi-model solutions
- demonstrated that the scalability and computational efficiency of the solutions should be taken into account to ensure timeliness
- identified relevant user requirements

Relevant gaps: 1) guidance and context for forecast products, 2) metadata, vocabularies and provenance governance, 3) observational uncertainty, 4) criteria to identify relevant forecast products …