Indicator structure and common elements for information flow
WGDIKE, June, Brussels
Supporting Document: DIKE_
Prepared by: Neil Holdsworth (ICES) for ETC/ICM support to EEA
Conditions for WISE-Marine
- DO NOT need to manage all the data and information centrally
- DO need to leverage as much as possible from the RSCs
- DO need to have structured access to, or information relating to, the data and information used in the MSFD assessments
What is structured access?
[Diagram: data supply feeding an assessment that produces an indicator result, layered across DATA, INFORMATION and SERVICES]
What is structured access? What we observe
[Figure: ICES strategy on data handling/databases]
What is structured access? Reality – more like this
Structured access: professionalising knowledge and data products
- Roles/responsibilities
- Operators
- Methodologies
- Timetables: agreed deliveries, services/products
- Access rights
- Etc.
How does this look for MSFD?
Based on, but not identical to, the Marine Information Schema (RSC support project, 2015 – in development)
Indicator assessment structure
Indicator structure at regional level

Organization | Development status | Presentation format | Structure
EEA | Currently used for core set indicators (CSI) | Online | Follows a strict metadata template adapted from the European SDMX Metadata Structure (ESMS)
HELCOM | Currently used for some existing assessments (e.g. eutrophication) and available for use in the roof report | – | Follows a document-based structure based on an agreed HELCOM set of information
OSPAR | In draft and awaiting contracting party agreement; available for use in the roof report | Planned to be online | Follows a document- and metadata-based structure developed within OSPAR
UNEP/MAP | Not known | – | –
BSC | – | – | –
Categorisation of indicator elements

Category | Description
Access and use | Explicit information on access and usage rights related to the data products, indicator publication etc., i.e. data policy, copyright restrictions
Assessment findings | Key messages, assessment results, trend analysis and conclusions, presented primarily as text
Assessment methods | Methodologies, aggregation methods, indicator specifications, references to other relevant methods
Assessment purpose | Purpose of the assessment, rationale of the approach, targets and policy context
Contact and ownership | Contact details for the indicator assessment, including authorship and organisational contact points
Data inputs and outputs | Data sources, assessment datasets, assessment results, snapshots etc.
Geographical scope | Assessment units, other geographical information, countries
Labelling and classification | Identification and classification systems, such as INSPIRE, DPSIR, MSFD criteria
Quality aspects | Explicit information on the quality of the assessment, including data and methods, i.e. uncertainty, gaps in coverage etc.
Temporal scope | Time range of the assessment, usually expressed as a year range
Version control | Publishing dates, references to previous indicator versions, URIs etc.
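The eleven categories above can be treated as a controlled vocabulary against which any indicator record is checked for coverage. A minimal sketch, assuming nothing beyond the table itself (the key names and the helper `missing_categories` are illustrative, not part of any agreed schema):

```python
# The eleven element categories from the table above, keyed by short name.
# Descriptions are abbreviated; this vocabulary is illustrative only.
CATEGORIES = {
    "access_and_use": "Access and usage rights for data products and publication",
    "assessment_findings": "Key messages, results, trends and conclusions",
    "assessment_methods": "Methodologies, aggregation methods, indicator specifications",
    "assessment_purpose": "Purpose, rationale, targets and policy context",
    "contact_and_ownership": "Authorship and organisational contact points",
    "data_inputs_and_outputs": "Data sources, assessment datasets and results",
    "geographical_scope": "Assessment units, countries, other geography",
    "labelling_and_classification": "INSPIRE, DPSIR, MSFD criteria",
    "quality_aspects": "Uncertainty, gaps in coverage, data and method quality",
    "temporal_scope": "Time range of the assessment, usually a year range",
    "version_control": "Publishing dates, previous versions, URIs",
}

def missing_categories(record: dict) -> set:
    """Return the categories an indicator record does not yet cover."""
    return set(CATEGORIES) - set(record)
```

A record covering only its temporal scope would report the other ten categories as gaps; a record with all eleven keys reports none.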
Indicator structure: field counts per category (columns: EEA fields, HELCOM fields, OSPAR fields)

Access and use: 1
Assessment findings: 4, 3, 8
Assessment methods: 5, 7
Assessment purpose: 6
Contact and ownership: 2
Data inputs and outputs
Geographical scope
Labelling and classification: 10
Quality aspects
Temporal scope
Version control
Grand total: EEA 44, HELCOM 31, OSPAR 42
Notable features from indicator structures
- (EEA) Explicit reference to the indicator units, e.g. concentration expressed as micrograms per litre
- (HELCOM, OSPAR) The indicator makes explicit reference to both an assessment dataset and assessment data sources
- (HELCOM) The indicator structure has extensive and explicit assessment methods and quality aspects components
- (HELCOM, OSPAR) Explicit references to MSFD criteria
- (OSPAR) Explicitly states data and indicator access and use rights
Possible MSFD improvements?
EEA
- No specific reference to geographical assessment units; however, the map products do depict results by MSFD sub-regions, and the metadata structure has a field for 'contributing countries'
- No explicit reference to an 'assessment dataset'
- No specific MSFD utilisation labels, except for the DPSIR label type

OSPAR
- Clearer information provision on the data product relating to an assessment result would be useful
- The field 'linkage' would benefit from a more structured approach as a collection of
- More explicit references to quality aspects of the assessment, monitoring and data aggregation methods

HELCOM
- No explicit placeholder for temporal scope, i.e. the year range of the assessment
- No direct reference to which countries the indicator assessment applies to (i.e. a list of countries), although this might be inferred from the map of assessment units/data results
- Although an authorship citation is provided, there is no explicit reference to the access and use rights for the assessment dataset or map products
Proposal for common structured elements
Access and Use (recommendation: Required)
Explicit information on access and usage rights related to the data products, indicator publication etc., i.e. data policy, copyright restrictions.
- Conditions applying to access and use, i.e. copyright, data policy (text or URL)

Assessment Findings
Key messages, assessment results (text and graphic form), trend analysis and conclusions, presented primarily as text.
- Key assessment – longer description of assessment results by assessment units (text)
- Key messages – short descriptions of trends (text)
- Results and status – textual description of assessment results, could include graphics (text and figures)
- Trend – textual description of assessment trend, could include graphics (recommendation: Optional)

Assessment Methods
Methodologies, aggregation methods, indicator specifications, references to other relevant methods.
- Indicator definition – short description of the indicator aimed at a general audience
- Methodology for indicator calculation – text and tabular information on the process of aggregation and selection etc.
- Methodology for monitoring – short textual description of monitoring requirements/method
- Indicator units – units used for the indicator
- GES: concept and target-setting method – text describing the concept used and the target method

Assessment Purpose
Purpose of the assessment, rationale of the approach, targets and policy context.
- Indicator purpose – justification for indicator selection and relevance
- Policy relevance – text relating the indicator to policy
- Relevant publications (policy, scientific etc.) – citable URLs to policy documents related to the indicator
- Policy targets – description of the policy target

Contact and Ownership
Contact details for the indicator assessment, including authorship and organisational contact points.
- Authors – list of authors
- Citation – full citation
- Point of contact – organisational contact

Continued
Proposal for common structured elements (continued)
Data Inputs and Outputs (recommendation: Required)
Data sources, assessment datasets, assessment results (tabular and dataset), snapshots etc.
- Data sources – underlying datasets (text and/or URL)
- Assessment dataset – snapshot dataset derived from the underlying data (URL)
- Assessment result – summary results dataset/table/figure (file or web service)
- Assessment result, map – GIS version of the assessment result, i.e. shapefile or WFS (recommendation: Optional)

Geographical Scope
Assessment units, other geographical information, countries.
- Assessment unit – nested assessment unit (if available) (text)
- Countries – countries that the indicator covers (text)
- Other geographical unit – alternate source of geographical reference for the indicator, i.e. ICES areas (text or URL)

Labelling and Classification
Identification and classification systems, such as INSPIRE, DPSIR, MSFD criteria.
- DPSIR – assessment framework linkage
- MSFD criteria – criteria coding as listed in Annex III tables 1 and 2
- Indicator title – full title of the indicator as published
- INSPIRE topics – keyword topics

Quality Aspects
Explicit information on the quality of the assessment, including data and methods, i.e. uncertainty, gaps in coverage etc.
- Data confidence – adequateness of spatial and temporal coverage, quality of data collection etc.
- Indicator methodology confidence – knowledge gaps and uncertainty in methods/knowledge
- GES confidence – confidence of the target, descriptive text

Temporal Scope
Time range of the assessment, usually expressed as a year range.
- Temporal coverage – assessment period expressed as year start to year end (date range)

Version Control
Publishing dates, references to previous indicator versions, URIs etc.
- Last modified date – date of last modification (date time)
- Published date – publish date of the indicator (date time)
- Unique reference – citable reference unique to the resource, i.e. URI, DOI
- Version linkage – link to other versions of the assessment

Sum of required fields: 21
Sum of optional fields: 15
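The required/optional split in the proposal lends itself to a simple validation step before an indicator record is accepted into WISE-Marine. A minimal sketch in Python; the field subset below is illustrative only (the full proposal has 21 required and 15 optional fields, not the five shown), and `SCHEMA` and `validate` are hypothetical names, not part of any agreed specification:

```python
# Illustrative subset of the proposed common elements; the "required" flag
# mirrors the Recommendation column of the proposal, abbreviated here.
SCHEMA = {
    "conditions_of_access_and_use": {"content": "text or URL", "required": True},
    "data_sources": {"content": "text and/or URL", "required": True},
    "temporal_coverage": {"content": "date range", "required": True},
    "assessment_result_map": {"content": "shapefile or WFS", "required": False},
    "trend": {"content": "text and figures", "required": False},
}

def validate(record: dict) -> list:
    """Return the names of required fields missing from an indicator record."""
    return [name for name, spec in SCHEMA.items()
            if spec["required"] and name not in record]
```

An empty record fails on all three required fields; a record that supplies them passes even with every optional field absent, which is the behaviour the Required/Optional recommendation implies.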