1 Group of Four co-ordination activities: Streamlining of environmental indicators (item 6.1)
Christian Heidorn, Unit E – Environment Statistics and Accounts
DIMESA 2008, 17 June 2008, Copenhagen

2 Environmental Indicators
Go4 – Technical Arrangement: Background
EUROSTAT:
… is establishing environmental Data Centres (DCs) on:
- Natural Resources
- Products (IPP)
- Waste
… and is responsible for:
+ Streamlining of Environmental Indicators
+ Coordination of data quality issues

3 Background (2) Technical Arrangement signed by Directors General:
"Eurostat will take the lead on a joint EEA/ESTAT/ENV inventory of the various indicator sets and the streamlining exercise. DG ENV and JRC will contribute to this work, which needs to take full account of the specific needs of different users".

4 Background (3)
Various sets of environmental indicators are maintained by EU bodies: Eurostat – EEA – JRC – DG Environment
… and by international organisations: OECD – UN (UNSD, UNEP …)
The problem:
► Indicators have the same name but show different variables
► Indicators have different names but show the same thing
► There are many, many indicators out there: DG Environment speaks of an "Indicator Jungle"

5 Objectives – based on previous work
Get an overview of the major environmental indicators
Improve indicator production by streamlining:
- the indicators themselves
- the underlying data flows and reporting obligations
- metadata and definitions
- the application of quality standards
Clarify "responsibilities" and improve coordination among the indicator "owners"

6 Objectives (2) In practice, streamlining means:
- Eliminating obsolete and redundant indicators
- Providing insight into the purpose and inter-relationships of indicators, and into naming conventions
- Providing relevant background information for the indicator audiences
- Better organisation of the indicator production process

7 Study project: Streamlining of environmental indicators
Project start: January 2008; end: 19 March 2008
Tasks:
- Task A: Preparation of a joint inventory
- Task B: Identification of streamlining potential
- Task C: Start the process of streamlining
Workshop of 15 April 2008: Presentation of results and planning of further work

8 The inventory: a list of environmental indicators from 11 sets:
1. SI (Structural Indicators, Eurostat)
2. SDI (Sustainable Development Indicators, Eurostat)
3. CSI (Core Set of Indicators, EEA)
4. SEBI 2010 (Streamlining European 2010 Biodiversity Indicators, EEA)
5. EERM (Indicators of environmental integration of the energy sector, EEA)
6. AEI (Agri-Environmental Indicators; ex-IRENA, Indicator Reporting on the Integration of Environmental Concerns into Agriculture Policy; EEA/Eurostat/DG AGRI/DG ENV/JRC)
7. TERM (Transport and Environment Reporting Mechanism, EEA)
8. EPI (Environmental Pressure Indicators, Eurostat/DG ENV)
9. "ISD" (Indicators of Sustainable Development, UNCSD)
10./11. KEI / CEI (Key and Core environmental indicators, OECD)

9 The inventory (2) 435 indicators in total
- Eliminate non-purely environmental indicators: – 66
- Eliminate "non-streamlineable" indicators: – 60

10 The potential: 309 "streamlineable" indicators
- Put in order, set a framework: 48 "clusters"
- 9 further regroupings made: 29 "main clusters", discussed with the Steering Committee

11 The potential (2) Results from indicator assessment:
- 47 "masters"
- 171 with "high" potential
- 42 with "limited" potential
- 8 "unknown" (insufficient information)
- 89 with "no" potential
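The figures quoted on slides 9–11 can be tallied with simple arithmetic. The sketch below is purely illustrative and not part of the project; the labels and counts are taken from the slides, which do not state how the assessment categories relate to the 309 total.

```python
# Illustrative tally of the figures quoted on slides 9-11.
total_inventory = 435
removed = {
    "non-purely environmental": 66,
    "non-streamlineable": 60,
}
streamlineable = total_inventory - sum(removed.values())
print(streamlineable)  # 435 - 66 - 60 = 309

# Assessment results as reported on slide 11, reproduced as-is.
assessment = {
    "masters": 47,
    "high potential": 171,
    "limited potential": 42,
    "unknown (insufficient information)": 8,
    "no potential": 89,
}
for category, count in assessment.items():
    print(f"{category}: {count}")
```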

12 Starting the process
Set of general guidelines for practical implementation:
- Streamlining options and recommendations for further work (6 slides)
- Distribution of responsibilities within the Go4 and their data centres (2 more slides)

13 Streamlining options
Where more or less the same indicators are being created but from different data sources, the project recommends choosing the same data source for these indicators.
… but this is not always possible (some sources are deliberately chosen), and indicators using more recent or better data sources may be preferred.

14 Streamlining options (2)
Where the same data sources are being used, a division of tasks for the calculation of the indicators could be made between data centres.
This may create a situation where organisations Y and Z depend on the indicators produced by organisation X, which introduces criteria for the production of the indicators (see the sketch below), such as:
- Quality criteria
- Timeliness criteria
- Commitment and competence
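A minimal sketch of such a producer/consumer arrangement is shown below. It is not part of the project; all names, fields and thresholds (IndicatorDelivery, the 0.9 quality floor, the 12-month delay limit) are hypothetical and only illustrate how quality, timeliness and commitment criteria could be checked.

```python
from dataclasses import dataclass

@dataclass
class IndicatorDelivery:
    producer: str          # data centre that calculates the indicator
    consumers: list[str]   # organisations depending on the result
    quality_score: float   # e.g. share of validated country data, 0..1 (hypothetical measure)
    delay_months: int      # months between reference period and delivery
    committed: bool        # producer has confirmed commitment and competence

def meets_criteria(d: IndicatorDelivery,
                   min_quality: float = 0.9,
                   max_delay_months: int = 12) -> bool:
    """Check the quality, timeliness and commitment criteria named on the slide."""
    return (d.quality_score >= min_quality
            and d.delay_months <= max_delay_months
            and d.committed)

# Example: organisation X produces an indicator used by organisations Y and Z.
delivery = IndicatorDelivery(producer="Organisation X",
                             consumers=["Organisation Y", "Organisation Z"],
                             quality_score=0.95, delay_months=10, committed=True)
print(meets_criteria(delivery))  # True
```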

15 Streamlining options (3)
More difficulties can be expected in the streamlining process for indicators composed from several data sources.
The hardest cases will be indicators built on data sources that do not fall under the competence of any of the data centres.

16 Streamlining options (4) More recommendations
- Develop a naming convention and/or a unique codification system;
- Apply identical metadata to indicators with an identical code;
- Develop specific metadata for the presentation of the indicator;
- For ease of use, quality and transparency, make metadata and flow-charts for all indicators accessible through ONE source;
- A flow-chart approach, i.e. a generalised and generic chain of events (e.g. raw data collection; data management (data improvement) and processing (modelling, weighting, aggregating); indicator creation; assessment; presentation), could be elaborated to provide all the elements that need to be present (see the sketch below).
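A schematic sketch of how a unique codification, shared metadata and the generic flow-chart stages could fit together is given below. It is purely illustrative and not the project's design; the code "WST-MUN-001", the metadata fields and the registry are all hypothetical.

```python
from dataclasses import dataclass
from enum import Enum

class Stage(Enum):
    """Generic chain of events from the flow-chart approach on this slide."""
    RAW_DATA_COLLECTION = 1
    DATA_MANAGEMENT = 2      # data improvement
    PROCESSING = 3           # modelling, weighting, aggregating
    INDICATOR_CREATION = 4
    ASSESSMENT = 5
    PRESENTATION = 6

@dataclass(frozen=True)
class IndicatorMetadata:
    definition: str
    unit: str
    data_source: str

# One shared registry: indicators carrying the same code share identical metadata.
METADATA_REGISTRY: dict[str, IndicatorMetadata] = {}

def register(code: str, meta: IndicatorMetadata) -> None:
    """Register metadata once per unique indicator code; reject conflicting entries."""
    existing = METADATA_REGISTRY.get(code)
    if existing is not None and existing != meta:
        raise ValueError(f"Code {code} is already registered with different metadata")
    METADATA_REGISTRY[code] = meta

# Hypothetical code for a municipal waste indicator.
register("WST-MUN-001",
         IndicatorMetadata(definition="Municipal waste generated per capita",
                           unit="kg per capita",
                           data_source="Eurostat municipal waste statistics"))
print(METADATA_REGISTRY["WST-MUN-001"].unit)   # kg per capita
print(list(Stage))                             # the six generic stages
```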

17 Streamlining options (5) More recommendations, cont.
- Set up a permanent indicator quality team within (and between) the indicator producers, with sufficient allocated time and a permanent budget.
- Procedures for making changes to existing indicators need to be elaborated and ratified. Quality management requires that envisaged decisions are discussed by the indicator quality team(s), and that decisions made and implemented are documented properly in the 'revision history log';
- Procedures need to be set up to deal with quality assurance and transparency issues beyond the scope of the data centres (third-party data quality, documentation, decision making, etc.);
- Given the very high number of potentially streamlineable indicators, the follow-up work should be prioritised.

18 Responsibilities Discussion on ownership

19 Responsibilities (2) Discussion on ownership: example AEI.21 Soil erosion

20 Outcome of workshop, agreed steps
- The project has delivered insight into the main EU environmental indicator sets (covering > 300 indicators)
- The methodology was accepted but needs to be fine-tuned
- Priority setting is required; set up a matrix for planning the work
- Design a coding/naming system for better communication
- Each "cluster" will be allocated to a Data Centre for organisation of further work
- The flow-chart approach is to be further investigated

21 Outcome of workshop, agreed steps (2)
- The composition of the Steering Committee will remain; results will be made available to EIONET (NFP meetings) and the ESS (DIMESA)
- Commitment by all Go4 partners is crucial and was confirmed; Eurostat remains committed to leading the process
- Sufficient allocation of human resources and budget is crucial as well
- The objective of establishing an "Indicator Clearing House" for existing and "to-be-developed" indicators will be pursued
- There are some "low-hanging fruits" (e.g. municipal waste), but a lot is still to be done …

22 Further reading
Final report and workshop minutes are on CIRCA:
For further questions:
Some illustrative examples?

23 Municipal waste: Eurostat >>> SDI

24 Municipal waste: Eurostat

25 Municipal waste: Eurostat >>> SI

26

27 Municipal waste: EEA >>> CSI

28

29 Municipal waste: OECD >>> KEI

30

31 Municipal waste: UN >>> EI

