CMEMS R&D KICK-OFF MEETING


High-resolution model Verification Evaluation (HiVE)
Jan Maksymczuk & Marion Mittermaier

The kick-off meetings will be 40 minutes long at most, with the following agenda:
- Context and general information from Mercator Ocean: 10'
- Presentation of the project by the PI: 20'
- Discussion and questions: 10'

The presentation of the project should cover its context, objectives, team, organisation, impact and relevance for CMEMS, as well as the planned interaction with the TACs and MFCs, so that the R&D activities can lead to an evolution of CMEMS operational products. PIs are asked to send the slides of their presentation the day before the kick-off meeting to ease its organisation. After the kick-off meetings, a version of each presentation will be requested for sharing on a restricted-access intranet (open to the project PIs and the leaders of the CMEMS production centres), so that the production-centre leaders can see what will be done as part of the CMEMS-funded R&D projects.

06/04/18 CMEMS R&D Kick-Off Teleconference

Objectives
- To understand the accuracy of CMEMS products at specific observing locations using neighbourhood methods and ensemble techniques.
- To understand the skill of CMEMS products in forecasting events or features of interest in space and time.

Context: Work Package A
Understand the accuracy of CMEMS products at specific observing locations using neighbourhood methods and ensemble techniques.
Several CMEMS products covering the same locations, from both global and regional TACs, are available to users, but they are not necessarily assessed in a similar way. Work Package A proposes to trial a verification system (HiRA, the High Resolution Assessment framework) to compare these models equitably with regard to their accuracy and predictive skill.
Work Package A will help inform CMEMS model developers on the basic accuracy and skill of ocean forecasts at a given location when comparing lower-resolution models with their next-generation evolutions (for example, AMM7 and AMM15). The outcomes would be of particular relevance to the regional MFCs and the user community (e.g. NOOS).

Context: Work Package B
Understand the skill of CMEMS products in forecasting events or features of interest in space and time.
Work Package B will trial an object-based spatial method called MODE (Method for Object-based Diagnostic Evaluation) to evaluate the evolution of events in both the forecast and observation fields. MODE aims to evaluate forecast quality in a manner similar to a user making a subjective assessment: object-based verification methods were developed to provide an objective link to the way forecasts are used subjectively, i.e. focusing on features or events of interest.
The outcomes from Work Package B will have applications in monitoring feature evolution (for example, eddies or algal blooms), and could be extended to global model assessments. Strong linkages to other CMEMS MFCs (particularly IBI) will enhance the pull-through to operational improvements across CMEMS systems, as will interactions with the NOOS community.

Model Domains to be Assessed
Assessments will be made using products from the CMEMS catalogue:
- NWS AMM7 (1/10°)
- IBI (1/36°)
- NWS AMM15 (1.5 km)

Spatial sampling
Spatial verification methods compare a single observation to a forecast neighbourhood around the observation location, which yields a probabilistic framework. The neighbourhood is used to create a distribution, NOT for upscaling or smoothing. This represents a fundamental departure from traditional verification, where the emphasis is on extracting the nearest grid point (or using bilinear interpolation) to obtain a matched forecast-observation pair.
[Figure: traditional matching (interpolate/nearest grid point at the observation) versus a forecast neighbourhood around the observation location.]
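The neighbourhood idea can be sketched as follows. This is a minimal illustration (the function name and the toy SST field are invented for this example, not taken from the HiRA code base): the fraction of grid points in a neighbourhood around the observation location that exceed a threshold becomes a pseudo-probability, with no upscaling or smoothing of the field itself.

```python
import numpy as np

def neighbourhood_probability(field, j, i, half_width, threshold):
    """Fraction of grid points in a square neighbourhood around (j, i)
    that exceed `threshold` -- a pseudo-probability to verify against
    the single observation at that location."""
    nbhd = field[max(j - half_width, 0):j + half_width + 1,
                 max(i - half_width, 0):i + half_width + 1]
    return float(np.mean(nbhd > threshold))

# Toy 5x5 SST field (degC): probability of exceeding 12 degC
# in a 3x3 neighbourhood centred on the observation point (2, 2).
sst = np.array([[11, 11, 12, 13, 13],
                [11, 12, 13, 13, 14],
                [12, 12, 13, 14, 14],
                [12, 13, 13, 14, 15],
                [13, 13, 14, 15, 15]], dtype=float)
p = neighbourhood_probability(sst, 2, 2, 1, 12.0)  # 7 of 9 points exceed
```

The resulting probability can then be scored against the binary observed outcome with the probabilistic metrics discussed on the next slide.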

Model-specific v user-relevant
Probabilistic forecast / binary observation.
A modeller is interested in overall performance, i.e. the whole pdf, e.g. via the Brier score
    BS = (1/N) Σ_i (p_i − o_i)²
where p_i is the forecast probability and o_i the observed occurrence (0 or 1); perfect score = 0.
Users are interested in specific thresholds, e.g. defining a decision point or hazard. Over K discrete threshold categories, the ranked probability score is
    RPS = Σ_{k=1}^{K} (CDF_fcst(k) − CDF_obs(k))²
(perfect score = 0), and its continuous counterpart is
    CRPS = ∫ (CDF_fcst(x) − CDF_obs(x))² dx
(perfect score = 0).
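The two discrete scores above can be computed directly; a minimal sketch (function names are illustrative, not part of HiRA):

```python
import numpy as np

def brier_score(p, o):
    """Mean squared difference between forecast probabilities p and
    binary outcomes o (0/1); 0 is a perfect score."""
    p, o = np.asarray(p, float), np.asarray(o, float)
    return float(np.mean((p - o) ** 2))

def rps(cdf_fcst, cdf_obs):
    """Ranked probability score over K discrete threshold categories:
    sum of squared CDF differences; 0 is a perfect score."""
    return float(np.sum((np.asarray(cdf_fcst, float)
                         - np.asarray(cdf_obs, float)) ** 2))

# Three forecast/observation pairs for the Brier score.
bs = brier_score([0.9, 0.1, 0.7], [1, 0, 1])  # (0.01+0.01+0.09)/3

# K = 3 threshold categories: observed event falls in the 2nd category.
score = rps([0.2, 0.6, 1.0], [0.0, 1.0, 1.0])  # 0.04 + 0.16 + 0.0
```

The observed CDF is a step function (0 below the observed category, 1 from it onwards), so sharper forecasts whose CDF hugs that step score closer to 0.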

MODE – Method for Object-based Diagnostic Evaluation (Davis et al., MWR, 2006)
The focus is on spatial properties, especially spatial biases.
Two tuning parameters: the convolution radius and the threshold. Highly configurable.
Object attributes: centroid difference, angle difference, area ratio, etc.
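The convolve-threshold-label sequence that MODE applies can be sketched in pure NumPy. This is a stand-in for illustration only, not the MODE implementation: a square box filter approximates MODE's circular convolution, and objects are 4-connected components of the thresholded field.

```python
import numpy as np
from collections import deque

def smooth(field, radius):
    """Box-average convolution of `radius` grid points (a square
    stand-in for MODE's circular convolution filter)."""
    out = np.zeros(field.shape, dtype=float)
    for j in range(field.shape[0]):
        for i in range(field.shape[1]):
            win = field[max(j - radius, 0):j + radius + 1,
                        max(i - radius, 0):i + radius + 1]
            out[j, i] = win.mean()
    return out

def find_objects(field, radius, threshold):
    """MODE's two tuning parameters in action: convolve, threshold,
    then label 4-connected objects. Returns (area, centroid) pairs,
    from which attributes such as centroid difference or area ratio
    between forecast and observed objects follow directly."""
    mask = smooth(field, radius) >= threshold
    seen = np.zeros(mask.shape, bool)
    objects = []
    for j, i in zip(*np.nonzero(mask)):
        if seen[j, i]:
            continue
        cells, queue = [], deque([(int(j), int(i))])
        seen[j, i] = True
        while queue:                       # flood fill one object
            y, x = queue.popleft()
            cells.append((y, x))
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    queue.append((ny, nx))
        ys, xs = zip(*cells)
        objects.append((len(cells), (float(np.mean(ys)), float(np.mean(xs)))))
    return objects

# Toy field: a single 3x3 feature in a 7x7 domain.
field = np.zeros((7, 7))
field[2:5, 2:5] = 1.0
objs = find_objects(field, radius=0, threshold=0.5)  # one object, area 9
```

Comparing the attribute lists produced for the forecast and observation fields (centroid differences, area ratios, and so on) is the essence of the object matching that MODE then performs.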

MODE and MODE-TD examples: cloud breaks and jet cores.
[Figure panels: Model A, Analysis, Model B.]
Mittermaier and Bullock 2013; Mittermaier et al. 2016.

Project Team
Jan Maksymczuk will co-lead the project and deliver on the development and evaluation tasks relating to the inter-comparison of models in Work Package A. He will act as overall project manager, author the project reports and presentations, and be the main point of contact for Mercator Ocean.
Marion Mittermaier will co-lead and provide verification expertise, as well as assistance with the use of both HiRA and MODE. She will guide the research team in the analysis and assessment of results, and ensure that any conclusions are scientifically robust.
Ric Crocker, Rachel North & Christine Pequignet will contribute to the delivery of the scientific aspects of the proposed research relating to testing and evaluating both HiRA and MODE.
Andrew Ryan will deliver the technical aspects of the proposed research and facilitate longer-term implementation of the tools within operational systems.

Project Organisation I: Work Package A
Task A.1 (3.4w): Identify and prepare suitable CMEMS observation and forecast datasets for use in HiRA assessments.
Task A.2 (6w): Technical development work on the HiRA code base to adapt the framework for use with ocean datasets, in particular to enable NetCDF functionality. Conduct testing and code review of the developments to HiRA.
Task A.3 (5.6w): Conduct evaluations of HiRA with the current AMM7 model over a 12-month period, with an emphasis on SST and currents, then intercompare with the overlapping IBI model. Assess the results and identify possible NWS sub-regions for more detailed assessment.
Task A.4 (5w): Repeat the evaluations of Task A.3 with the inclusion of the AMM15 model. Assess the outputs and provide recommendations on the suitability of HiRA for evaluating ocean model forecast accuracy.

Project Organisation II: Work Package B
Task B.1 (2w): Identify appropriate CMEMS gridded remote-sensing datasets for case study evaluations.
Task B.2: Technical development work to install and test the latest versions of MODE.
Task B.3 (4w): Data and forecast preparation to enable ocean datasets to be compatible with MODE.
Task B.4 (10.4w): Experimental case studies to allow tuning of the thresholds in MODE for an optimal analysis of ocean datasets, followed by final case study evaluations of the models. Assess the outputs and provide recommendations on the suitability of MODE for evaluating the skill of CMEMS products in forecasting events or features of interest in space and time.

Project Milestones
Milestone B.1 (Jun 2018): Technical development to install and test MODE completed.
Milestone A.1 (Oct 2018): Technical development and testing of HiRA completed and ready for use.
Milestone B.2 (Dec 2018): Data preparation for MODE completed.
Milestone A.2 (Mar 2019): Initial studies of HiRA with the AMM7 and IBI models completed.
Milestone B.3 (Dec 2019): Case studies using MODE with AMM7, IBI and AMM15 completed.
Milestone A.3 (Jan 2020): Studies of HiRA with the AMM7, IBI and AMM15 models completed.

Project Deliverables
Deliverable Q.1 (30/06/2018): Quarterly Report 1.
Deliverable Q.2 (30/09/2018): Quarterly Report 2.
Deliverable Q.3 (31/12/2018): Quarterly Report 3.
Deliverable Q.4 (31/03/2019): Mid-term Project Report.
Deliverable Q.5 (30/06/2019): Quarterly Report 4.
Deliverable Q.6 (30/09/2019): Quarterly Report 5.
Deliverable Q.7 (31/12/2019): Quarterly Report 6.
Deliverable A.1 (31/01/2020): An assessment report on the applicability of HiRA to ocean forecasts, with a recommendation as to whether it should be adopted as part of routine CMEMS Product Quality activities.
Deliverable B.1 (29/02/2020): A report evaluating the applicability of MODE for feature-based assessment of ocean forecasts, with a recommendation as to whether it should be adopted as part of routine CMEMS Product Quality activities.
Deliverable Q.8 (31/03/2020): Final Project Report.

R&D leading to evolution of CMEMS operational products
Specific benefits to the CMEMS community and stakeholders resulting from the proposed work include:
- a framework that will assess km-scale models appropriately
- metrics that can overcome the double-penalty effect
- metrics that can assess both deterministic and ensemble forecasts equitably
- a framework that can measure improvements in forecast skill over time
- tools that can measure the skill of CMEMS products in forecasting events or features of interest in space and time
The focus for transferring the project R&D results to the CMEMS operational centres will be on demonstrating the relevance of the project outcomes for improving the quality assessment of forecast products. For example, the evaluation activity and the associated recommendations will focus on understanding which datasets, neighbourhoods and metrics are suited to the variables considered, together with an evaluation of some of the available assessment tools. CMEMS MFCs will then be in a position to implement these recommendations.

Interaction with TACs/MFCs
The focus will be on demonstrating improved assessments of skill within CMEMS products:
- The evaluation activity and associated recommendations will focus on understanding forecast skill and how it could be used to inform product users in their decision making.
- Project capability and scientific results will be regularly and openly shared within CMEMS, supported by Mercator Ocean activities.
- Sharing of results and facilitating uptake by CMEMS will be promoted through the linkages of the project team and colleagues with the international ocean community, e.g. through NOWMAPS and the NOOS consortium, and through participation in the CMEMS Product Quality Working Group.

Evaluation activities: how do we demonstrate benefits?
The outcomes from Objective A will help inform ongoing CMEMS regional model development by providing additional information on forecast accuracy when comparing lower-resolution models with their next-generation evolutions (for example, AMM7 and AMM15). Strong linkages to other CMEMS MFCs (particularly IBI) will enhance the pull-through to operational improvements across CMEMS systems.
The outcomes from Objective B will have applications relevant to CMEMS downstream users in monitoring feature evolution (for example, eddies, or the onset of algal blooms), and could be extended to global model assessments.

Questions and discussion