Agenda Item 7.1: Aggregation Rules


1 Agenda Item 7.1: Aggregation Rules
2nd Workshop of TT-WDQMS – ECMWF

2 Outcome of Meetings
A meeting at WMO on 6 November 2017 and a discussion during the Project Steering Meeting of OSCAR/Surface addressed the strategic roles of WDQMS, OSCAR/Surface and the Country Profile Database (CPDB). It was considered important:
to clarify the purpose and scope of each application (for WDQMS not only for the Pilot, but also for a potential final solution);
to define the users to be addressed by each of the applications;
to clarify the flow of information between the systems, in order to avoid redundancy and guarantee consistency.
There was additional discussion, at a more general level, about the need to agree on the most useful granularity (or aggregation) of information in the three components, considering that they should not duplicate information unnecessarily and should also address the different needs of different users (the WMO SG vs. the network manager of an NMHS).

3 WDQMS, OSCAR and CPDB in the context of monitoring (strategic level)
Primary purpose:
- WDQMS: describe how well WIGOS is providing the required observations, and report findings and incidents (quality monitoring and incident management of global/national assets).
- OSCAR/Surface: understanding observational data, gap analysis, support for WDQMS and the CPDB, use as a national DB (technical documentation/analysis of global/national assets).
- CPDB: provide a national perspective on the capacities of Members, contact information, country KPIs and institutional context (high-level documentation/analysis of global/national assets).
Users:
- WDQMS: Members, Regional WIGOS Centres, users of data.
- OSCAR/Surface: network planners/managers at global/regional/national levels, data users (scientists, e.g. climatologists), international organizations.
- CPDB: SG, donors, WMO staff, WMO regional offices, international organizations.
Variable scope:
- WDQMS: individual observed variables (P, T, U, more if available).
- OSCAR/Surface: individual observed variables (P, T, U, radiation, chemical composition, hydrology, …).
- CPDB: overall KPI of observing systems.
Temporal aggregation:
- WDQMS: 6-hourly, or else the highest resolution available.
- OSCAR/Surface: monthly (actual as well as historical).
- CPDB: monthly or yearly?
Spatial aggregation:
- WDQMS, OSCAR/Surface: individual location of observation (station level).
- CPDB: country level.
KPI = key performance indicator

4 WDQMS, OSCAR and CPDB in the context of monitoring (strategic level)
Imports / Displays: declared status, schedules (present); «real» status (weekly); contacts, organizations; declared status (monthly); «real» status (monthly); KPIs on metadata completeness; KPIs on Focal Point activity.
Exports: «real» status (daily); declared status, schedules (present, past); KPIs on metadata completeness; Focal Point activity; KPIs for GOS, GAW, GCW, …; contacts, organizations (?).
Access: restricted to Focal Points / public.
Agreement: monthly aggregated values for each calendar month and each variable in OSCAR/Surface (not only actual but also historical).

5 Purpose & Users of WDQMS

6 Guidance on Quality Monitoring, Evaluation and Incident Management Procedures for Regional WIGOS Centres (RWC)
The WIGOS Evaluation Function takes the quality-monitoring outputs from all contributing WIGOS Monitoring Centres and generates routine daily performance reports based on at least two performance indicators, which are at the moment:
a comparison with the status of WIGOS as described in WIS/OSCAR;
trends in network performance over a suitable period (for GOS elements, monthly rolling averages are proposed).
→ The new element "Schedule of International Exchange" in the WIGOS Metadata Standard shall also be added to OSCAR.

7 Aggregation within WDQMS: Daily aggregation across Monitoring centers
So far: each monitoring centre (ECMWF, JMA, NCEP, EUCOS, …) publishes its own monitoring results, e.g. 5 out of 8, 7 out of 8, or 6 out of 8 expected messages received.
To be done: aggregation into 1 value per variable, station and day.
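The step from per-centre results to one daily value could be sketched as follows. This is a minimal illustration, not the agreed WDQMS rule: the centre names are taken from the slide, the counts are made up, and the use of the median is only one plausible choice.

```python
from statistics import median

def aggregate_daily(centre_counts, expected):
    """Collapse per-centre received-message counts into one daily
    availability ratio per variable and station.  Taking the median
    across centres is an assumption; the actual aggregation rule is
    still to be agreed."""
    return median(centre_counts.values()) / expected

# Hypothetical daily counts from three WIGOS Monitoring Centres:
ratio = aggregate_daily({"ECMWF": 5, "JMA": 7, "NCEP": 6}, expected=8)
```

Here the median of 5, 7 and 6 is 6, giving a daily availability of 6 out of 8.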

8 Daily aggregations WDQMS
Challenge: the results of the different WIGOS Monitoring Centres might differ. How should this be dealt with, if an RWC shall only initiate an Incident Management process when the majority of WIGOS Monitoring Centres show similar results? This concerns not only data availability, but also timeliness and accuracy. An aggregated value is needed for OSCAR/Surface and probably also for the CPDB → user requirements not really clear.
Rules:
Data availability: possibly already implemented by the WDQMS tool, and if so, how? Proposal: a value is considered to be reported if at least one of the monitoring centres has received it.
Timeliness and accuracy: ?? Average over all centres? Is the information comparable?
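The two decision rules on this slide (report if at least one centre received the value; open an incident only if the majority of centres agree the data are missing) can be sketched like this. Function names and the "strict majority" reading of the rule are assumptions for illustration.

```python
def reported_at_least_one(centre_flags):
    """Slide 8 proposal: a value counts as reported if at least one
    WIGOS Monitoring Centre received it."""
    return any(centre_flags.values())

def majority_missing(centre_flags):
    """Hypothetical incident trigger: more than half of the centres
    agree the data are missing."""
    missing = sum(1 for received in centre_flags.values() if not received)
    return missing > len(centre_flags) / 2

# Made-up example: two of three centres did not receive the value.
flags = {"ECMWF": False, "JMA": True, "NCEP": False}
```

With these flags the value still counts as reported, but an incident could nevertheless be opened, which illustrates why the two rules need to be reconciled.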

9 Outputs of WDQMS
Data availability: the total number of meteorological bulletins (TAC/BUFR) received during a defined period (e.g. 24 hours), compared to the required number of bulletins as determined by the observing schedule. → The new element "Schedule of International Exchange" in the WIGOS Metadata Standard shall also be added to OSCAR.
Timeliness: the delay between the nominal observation time of a particular observation message type, issued at the site of a Member, and the reception time of this message, received via GTS, at the user's database.
Accuracy: combining trueness and precision as outlined in ISO 5725, mainly derived from "observation minus background" (O−B) NWP results from global NWP centres for particular parameters such as air pressure, air temperature, wind and relative humidity observations.
The web tool should allow the statistics to be filtered, for example by country, or to display only those stations which exceed targets on data availability, timeliness or measurement uncertainty (bias, standard deviation, mean vector difference, root-mean-square vector difference). → Requirements not very clear yet (in particular for the CPDB).
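The first two outputs are simple ratios and differences, which can be made concrete in a short sketch. The function names and example times are invented; only the definitions follow the slide.

```python
from datetime import datetime, timedelta

def availability(received_bulletins, required_bulletins):
    """Data availability: bulletins received in a defined period
    (e.g. 24 h) over the number required by the observing schedule."""
    return received_bulletins / required_bulletins

def timeliness(nominal_obs_time, reception_time):
    """Timeliness: delay between the nominal observation time and the
    reception time of the message at the user's database via GTS."""
    return reception_time - nominal_obs_time

# Made-up example: 6 of 8 scheduled bulletins, received 40 min late.
ratio = availability(6, 8)
delay = timeliness(datetime(2017, 11, 6, 12, 0),
                   datetime(2017, 11, 6, 12, 40))
```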

10 Purpose & Users of OSCAR/Surface

11 Aggregation for OSCAR/Surface
Request by ICG-WIGOS (3.1.4): ICG-WIGOS requested that "quantitative monitoring information from the WDQMS become part of the station report in OSCAR/Surface", which will provide valuable additional information.
What do OSCAR/Surface users really need (Task 9)? → Different attempts to find out, but still not really clear.
Output of the meetings (6/11/17 and 20/11/17): OSCAR/Surface wants an aggregated value per observation for each calendar month, not only actual but also historical.
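Producing one value per variable and calendar month from daily results could look like the sketch below. Averaging the daily ratios is an assumption; the slide only fixes the granularity (one aggregated value per calendar month), not the formula.

```python
from collections import defaultdict

def monthly_aggregate(daily_ratios):
    """Aggregate daily availability ratios into one value per variable
    and calendar month.  Input: iterable of (ISO date, variable, ratio)
    tuples.  The mean is an assumed aggregation rule."""
    sums = defaultdict(lambda: [0.0, 0])
    for date, variable, ratio in daily_ratios:
        key = (date[:7], variable)  # "YYYY-MM"
        sums[key][0] += ratio
        sums[key][1] += 1
    return {key: total / count for key, (total, count) in sums.items()}

# Made-up daily values for one variable:
monthly = monthly_aggregate([
    ("2017-11-01", "pressure", 1.0),
    ("2017-11-02", "pressure", 0.5),
])
```

Keeping the month key explicit also makes it straightforward to retain historical months alongside the actual one, as OSCAR/Surface requests.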

12 Aggregation per variable: Display?
In discussions on how and where to implement this information, it emerged that:
It is not clear whether this aggregated value has to be calculated for each observation in relation to the programme to which it contributes.
The user might apply different criteria to check the required availability.
Display of reporting status: do we use categories, and if so → related to the thresholds of requirements (which would have to be set specifically depending on priority, region, etc.)?

13 Generic vs specific information
Another question: where to put the information?
Observations → not a specific setting, but the observation in general, or
Deployments → the specific setting of how the observation is carried out.

14 What information is needed in OSCAR/Surface?
Is only a performance status per variable required, or is there any need for a «station»-wise point of view? → Is aggregation across variables necessary? → If no guidance is given, won't people do it anyway?
Possible categories: regularly reporting / occasionally reporting / not reporting.

15 Possible Solutions
Challenges:
We still don't know the concrete requirements for how this information is going to be used within OSCAR/Surface.
We could decide that a guideline on how to aggregate across variables is not needed, but people might nonetheless start to do it → a guideline on how to do it would be helpful.
Challenge of categorisation (e.g. «reporting» = % of reported values) → the applied categorisation can differ from country to country depending on the criteria used. Relation to thresholds of requirements?
Proposals:
Aggregation across a «station»: define an order of variables to be used to determine the performance status, instead of implementing a complex aggregation algorithm.
User needs not definitively specified: create a table (see next slide) to deliver the basis, so that the required values can be calculated within OSCAR/Surface itself.
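The proposal to use an order of variables instead of a complex aggregation algorithm could be sketched as follows. The priority order and the category labels are assumptions taken loosely from the preceding slides.

```python
def station_status(variable_status,
                   priority=("pressure", "temperature", "humidity")):
    """Slide 15 proposal: the station-level performance status is the
    status of the highest-priority variable the station observes,
    rather than a complex aggregate.  The priority order here is an
    assumption."""
    for variable in priority:
        if variable in variable_status:
            return variable_status[variable]
    return "not reporting"

# Made-up station that does not observe pressure:
status = station_status({"temperature": "occasionally reporting",
                         "humidity": "regularly reporting"})
```

Because pressure is absent, the next variable in the order (temperature) determines the station status.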

16 Data delivered to OSCAR/Surface
Proposal: create a table for each variable, with one row per day, containing:
Month of the year | Facility | Variable | Date or day of month | # expected values* | # reported values (e.g. 8 expected; 6 or 7 reported)
* # expected values as actually used in the monitoring systems (dynamic metadata); results created by the monitoring system.
Delivery: once a month.
Analytics in OSCAR/Surface (guidelines should be provided by WDQMS): total availability, monthly variations.
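A hypothetical shape for this monthly table, together with the "total availability" analytic that OSCAR/Surface could compute from it, might look like this. Field names and the two sample rows are invented; only the columns follow the slide.

```python
# One row per day and variable, as delivered once a month (assumed shape).
rows = [
    {"day": 1, "variable": "pressure", "expected": 8, "reported": 6},
    {"day": 2, "variable": "pressure", "expected": 8, "reported": 7},
]

def total_availability(rows):
    """Total monthly availability: sum of reported values over sum of
    expected values across all days in the table."""
    reported = sum(row["reported"] for row in rows)
    expected = sum(row["expected"] for row in rows)
    return reported / expected
```

Keeping expected and reported counts separate per day lets OSCAR/Surface also derive monthly variations, not just the overall ratio.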

17 Purpose & Users of Country Profile DB

18 Aggregations for CPDB
Opinions differ as to whether it makes sense for the CPDB to show the list of stations of each country, or rather to display summary statistics and a KPI of the observing capability of a country, with a link to OSCAR to show the list of stations and the KPIs for each individual one.
The exact needs still have to be clarified → who knows them? WDQMS should not have to worry about this, but should provide the necessary basic information (analogous to OSCAR/Surface) and give guidance on how to calculate e.g. yearly values or KPIs.
Proposal: aggregations calculated based on the same information given to OSCAR/Surface.
This discussion needs to be continued and agreement found, such that all three components can flourish side by side, each with its specific raison d'être, while the information provided remains consistent.
