
1 CMEMS R&D KICK-OFF MEETING
High-resolution model Verification Evaluation (HiVE)
Jan Maksymczuk & Marion Mittermaier

The kick-off meetings will be 40 minutes long at most, with the following agenda:
- Context and general information by Mercator Ocean: 10'
- Presentation of the project by the PI: 20'
- Discussion and questions: 10'
The presentation of your project should include the context, objectives, team, organization, impact and relevance of the project for CMEMS, as well as how you plan to interact with TACs and MFCs so that your R&D activities can lead to an evolution of CMEMS operational products. Please also send me the slides of your presentation the day before the kick-off meeting to ease the organization of the meeting. Moreover, after the kick-off meetings I will ask you for a version of the presentation that can be shared on an intranet with restricted access (PIs of the projects and leaders of CMEMS production centres), so that the leaders of the CMEMS production centres can see what will be done as part of the CMEMS-funded R&D projects.

2 Objectives
- To understand the accuracy of CMEMS products at specific observing locations using neighbourhood methods and ensemble techniques
- To understand the skill of CMEMS products in forecasting events or features of interest in space and time

3 Context: Work Package A
Understand the accuracy of CMEMS products at specific observing locations using neighbourhood methods and ensemble techniques.
Several CMEMS products covering the same locations, from both global and regional TACs, are available to users but are not necessarily assessed in a comparable way. Work Package A proposes to trial a verification system (HiRA, the High Resolution Assessment framework) to compare these models equitably with regard to their accuracy and predictive skill.
Work Package A will help to inform CMEMS model developers about the basic accuracy and skill of ocean forecasts at a given location when comparing lower-resolution models with their next-generation evolutions (for example, AMM7 and AMM15). Outcomes from Work Package A will be of particular relevance to regional MFCs and the user community (e.g. NOOS).

4 Context: Work Package B
Understand the skill of CMEMS products in forecasting events or features of interest in space and time.
Work Package B will trial an object-based spatial method called MODE (Method for Object-based Diagnostic Evaluation) to evaluate the evolution of events in both forecast and observation fields. MODE aims to evaluate forecast quality in a manner similar to a user making a subjective assessment: object-based verification methods were developed to provide an objective link to the way forecasts are used subjectively, i.e. focusing on features or events of interest.
Outcomes from Work Package B will have applications when monitoring feature evolution (for example, eddies or algal blooms), and could be extended to global model assessments. Strong linkages to other CMEMS MFCs (particularly IBI) will enhance the pull-through to operational improvements across CMEMS systems, as will interactions with the NOOS community.

5 Model Domains to be Assessed
Assessments will be made using products from the CMEMS catalogue:
- NWS - AMM7 (1/10°)
- IBI (1/36°)
- NWS - AMM15 (1.5 km)

6 Spatial sampling
Make use of spatial verification methods which compare a single observation to a forecast neighbourhood around the observation location, i.e. a probabilistic framework. This represents a fundamental departure from traditional verification, where the emphasis is on extracting the nearest grid point or using bilinear interpolation to obtain a matched forecast-observation pair.
[Figure: traditional matching (interpolate/nearest grid point at the observation) versus a forecast neighbourhood around the observation, used to create a distribution, NOT upscaling/smoothing.]
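To make the neighbourhood idea concrete, the sketch below (illustrative only, not the HiRA code; the regular lat/lon grid, array layout and function name are assumptions for the example) gathers the forecast values surrounding an observation site into a pseudo-ensemble instead of a single interpolated value.

```python
import numpy as np

def neighbourhood_values(field, lats, lons, ob_lat, ob_lon, half_width=1):
    """Return forecast values in a (2*half_width+1)^2 neighbourhood centred on
    the grid point nearest to an observation location.

    The values are treated as a pseudo-ensemble, i.e. an empirical distribution
    of equally likely forecast outcomes at the observation site, rather than
    being averaged (no upscaling/smoothing)."""
    # Nearest grid point to the observation (regular 1-D lat/lon axes assumed)
    j = int(np.argmin(np.abs(lats - ob_lat)))
    i = int(np.argmin(np.abs(lons - ob_lon)))

    # Clip the window to the model domain
    j0, j1 = max(j - half_width, 0), min(j + half_width + 1, field.shape[0])
    i0, i1 = max(i - half_width, 0), min(i + half_width + 1, field.shape[1])

    return field[j0:j1, i0:i1].ravel()

# Example: a 3x3 neighbourhood of SST values around a mooring location
# (sst is a 2-D forecast field; the nine values form the forecast distribution)
# sst_nbhd = neighbourhood_values(sst, lats, lons, ob_lat=55.0, ob_lon=-3.0)
```

The returned values can then be scored probabilistically against the single observation, as on the next slide.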

7 Model-specific v user-relevant
Probabilistic forecast / binary observation. The modeller is interested in the overall performance of the forecast pdf; users are interested in specific thresholds, e.g. defining a decision point or hazard.
Brier score: $\mathrm{BS} = \frac{1}{N}\sum_{i=1}^{N}(p_i - o_i)^2$, where $p_i$ is the forecast probability and $o_i$ the observed occurrence (0 or 1). Perfect score = 0.
Continuous ranked probability score: $\mathrm{CRPS} = \int \left[\mathrm{CDF}_{\mathrm{fcst}}(x) - \mathrm{CDF}_{\mathrm{obs}}(x)\right]^2 dx$, where $\mathrm{CDF}_{\mathrm{obs}}$ steps from 0 to 1 at the observed value. Perfect score = 0.
Ranked probability score: $\mathrm{RPS} = \sum_{k=1}^{K}\left(\mathrm{CDF}_{\mathrm{fcst},k} - \mathrm{CDF}_{\mathrm{obs},k}\right)^2$, where $K$ is the number of discrete threshold categories. Perfect score = 0.
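For reference, here is a minimal sketch of how these scores can be computed for a neighbourhood pseudo-ensemble and a single observation; the function names (brier_score, crps_ensemble, rps) are illustrative and this is not the operational HiRA code.

```python
import numpy as np

def brier_score(p, o):
    """Brier score for forecast probabilities p and binary outcomes o (0/1)."""
    p, o = np.asarray(p, float), np.asarray(o, float)
    return np.mean((p - o) ** 2)

def crps_ensemble(members, ob):
    """CRPS of an ensemble (or neighbourhood pseudo-ensemble) against one
    observation, using the kernel form CRPS = E|X - y| - 0.5 E|X - X'|."""
    x = np.asarray(members, float)
    return np.mean(np.abs(x - ob)) - 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))

def rps(members, ob, thresholds):
    """Ranked probability score over K discrete threshold categories:
    sum over k of (CDF_fcst(k) - CDF_obs(k))^2."""
    x = np.asarray(members, float)
    cdf_f = np.array([np.mean(x <= t) for t in thresholds])
    cdf_o = np.array([float(ob <= t) for t in thresholds])
    return np.sum((cdf_f - cdf_o) ** 2)
```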

8 Focus is on spatial properties, especially the spatial biases
MODE – Method for Object-based Diagnostic Evaluation (Davis et al., MWR, 2006)
- Two parameters: convolution radius and threshold
- Highly configurable
- Object attributes: centroid difference, angle difference, area ratio, etc.
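To illustrate the role of the two parameters, the sketch below (a toy example, not the MET/MODE implementation; the function names are assumptions) convolves a field with a circular kernel of a given radius, thresholds the smoothed field to define objects, and computes two simple attributes between a forecast object and an observed object.

```python
import numpy as np
from scipy import ndimage

def identify_objects(field, radius, threshold):
    """MODE-style object identification: convolve the field with a circular
    kernel of the given radius (in grid points), then threshold the smoothed
    field and label the connected regions that remain."""
    # Circular convolution kernel, normalised so smoothing preserves magnitude
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    kernel = (x ** 2 + y ** 2 <= radius ** 2).astype(float)
    kernel /= kernel.sum()

    smoothed = ndimage.convolve(field, kernel, mode="nearest")
    labels, n_objects = ndimage.label(smoothed >= threshold)
    return labels, n_objects

def object_attributes(labels_fcst, labels_obs):
    """Attributes between the first forecast and first observed object
    (assumes both fields contain at least one labelled object):
    centroid difference (in grid points) and area ratio."""
    c_f = np.array(ndimage.center_of_mass(labels_fcst == 1))
    c_o = np.array(ndimage.center_of_mass(labels_obs == 1))
    area_f = float(np.sum(labels_fcst == 1))
    area_o = float(np.sum(labels_obs == 1))
    return {"centroid_diff": float(np.linalg.norm(c_f - c_o)),
            "area_ratio": min(area_f, area_o) / max(area_f, area_o)}
```

In MODE itself such attributes feed a fuzzy-logic matching and merging step; the sketch only shows how the convolution radius and threshold control what counts as an object.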

9 MODE and MODE-TD examples: cloud breaks and jet cores
[Figure: example MODE/MODE-TD objects for Model A, the analysis, and Model B (Mittermaier and Bullock 2013; Mittermaier et al. 2016).]

10 Project Team
- Jan Maksymczuk will co-lead and deliver on the development and evaluation tasks relating to the inter-comparison of models as part of Work Package A. He will act as overall project manager, provide the authorship of project reports and presentations, and act as the main point of contact with Mercator Ocean.
- Marion Mittermaier will co-lead and provide verification expertise to the project, as well as assistance with the usage of both HiRA and MODE. She will guide the research team in the analysis and assessment of results, and ensure that any conclusions are scientifically robust.
- Ric Crocker, Rachel North & Christine Pequignet will contribute to the delivery of the scientific aspects of the proposed research relating to testing and evaluating both HiRA and MODE.
- Andrew Ryan will deliver on the technical aspects of the proposed research, and facilitate longer-term implementation of the tools within operational systems.

11 Project Organisation I
Work Package A (effort in weeks):
- Task A.1 (3.4w): Identify and prepare suitable CMEMS observation and forecast datasets for use in HiRA assessments.
- Task A.2 (6w): Technical development work on the HiRA code base to adapt the framework for use with ocean datasets, in particular to enable NetCDF functionality. Conduct testing and code review of developments to HiRA.
- Task A.3 (5.6w): Conduct evaluations of HiRA with the current AMM7 model over a 12-month period, with an emphasis on SST and currents, and then intercompare with the overlapping IBI model. Assess results and identify possible NWS sub-regions for more detailed assessment.
- Task A.4 (5w): Repeat the evaluations of Task A.3 with the inclusion of the AMM15 model. Assess outputs and provide recommendations on the suitability of HiRA for evaluating ocean model forecast accuracy.

12 Project Organisation II
Work Package B (effort in weeks):
- Task B.1 (2w): Identify appropriate CMEMS gridded remote datasets for case study evaluations.
- Task B.2: Technical development work to install and test the latest versions of MODE.
- Task B.3 (4w): Data and forecast preparation to enable ocean datasets to be compatible with MODE.
- Task B.4 (10.4w): Experimental case studies to allow tuning of the thresholds in MODE to gain an optimal analysis of ocean datasets, followed by final case study evaluations of the models. Assess outputs and provide recommendations on the suitability of MODE for evaluating the skill of CMEMS products in forecasting events or features of interest in space and time.

13 Project Milestones
- Milestone B.1 (Jun 2018): Technical development to install and test MODE completed
- Milestone A.1 (Oct 2018): Technical development and testing of HiRA completed and ready for use
- Milestone B.2 (Dec 2018): Data preparation for MODE completed
- Milestone A.2 (Mar 2019): Initial studies of HiRA with the AMM7 and IBI models completed
- Milestone B.3 (Dec 2019): Case studies using MODE with AMM7, IBI and AMM15 completed
- Milestone A.3 (Jan 2020): Studies of HiRA with the AMM7, IBI and AMM15 models completed

14 Project Deliverables
- Deliverable Q.1: Quarterly Report 1, due 30/06/2018
- Deliverable Q.2: Quarterly Report 2, due 30/09/2018
- Deliverable Q.3: Quarterly Report 3, due 31/12/2018
- Deliverable Q.4: Mid-term Project Report, due 31/03/2019
- Deliverable Q.5: Quarterly Report 4, due 30/06/2019
- Deliverable Q.6: Quarterly Report 5, due 30/09/2019
- Deliverable Q.7: Quarterly Report 6, due 31/12/2019
- Deliverable A.1: An assessment report on the applicability of HiRA to ocean forecasts, making a recommendation as to whether it should be adopted as part of routine CMEMS Product Quality activities, due 31/01/2020
- Deliverable B.1: A report evaluating the applicability of MODE for feature-based assessment of ocean forecasts, making a recommendation as to whether it should be adopted as part of routine CMEMS Product Quality activities, due 29/02/2020
- Deliverable Q.8: Final Project Report, due 31/03/2020

15 R&D leading to evolution of CMEMS operational products
Specific benefits to the CMEMS community and stakeholders resulting from the proposed work include:
- a framework that will assess km-scale models appropriately
- metrics that can overcome the double-penalty effect
- metrics that can assess both deterministic and ensemble forecasts equitably
- a framework that can measure improvements in forecast skill over time
- tools that can measure the skill of CMEMS products in forecasting events or features of interest in space and time
The focus for facilitating transfer of the project R&D results to CMEMS operational centres will be to demonstrate the relevance of the proposed project outcomes in improving the quality assessment of forecast products. For example, the evaluation activity and the associated recommendations in this project will focus on understanding which datasets, neighbourhoods and metrics are suited to the variables considered, as well as an evaluation of some of the available assessment tools that could be used. CMEMS MFCs will then be in a position to implement these recommendations.

16 Interaction with TACs/MFCs
The focus will be to demonstrate the improved assessments of skill within CMEMS products; e.g. the evaluation activity and associated recommendations will focus on understanding forecast skill and how it could be used to inform product users in their decision making.
Project capability and scientific results will be regularly and openly shared with others within CMEMS, supported by Mercator Ocean activities.
Sharing of results and facilitating uptake by CMEMS will be promoted through the linkages of the project team and colleagues with the international ocean community, e.g. through NOWMAPS and the NOOS consortium, and through participation in the CMEMS Product Quality Working Group.

17 Evaluation activities
How do we demonstrate benefits?
The outcomes from Objective A will help inform ongoing CMEMS regional model development by providing additional information on forecast accuracy when comparing lower-resolution models with their next-generation evolutions (for example, AMM7 and AMM15). Strong linkages to other CMEMS MFCs (particularly IBI) will enhance the pull-through to operational improvements across CMEMS systems.
The outcomes from Objective B will have applications relevant to CMEMS downstream users when monitoring feature evolution (for example, eddies or the onset of algal blooms), and could be extended to global model assessments.

18 Questions and discussion

