1 'Lessons learned in designing effective assessments'
European Environment Agency (EEA)
Anita Künitzer
http://www.eea.eu.int
2 The EEA's Mission
... is to deliver timely, targeted, relevant and reliable information to policy-makers and the public for the development and implementation of sound environmental policies in the European Union and other EEA member countries.
Domingo Jiménez-Beltrán, Executive Director, EEA
3 EEA membership (map)
– EEA member countries: EU 15 member states + Iceland, Liechtenstein, Norway
– EEA candidate countries
– Stability Pact countries
– TACIS countries
4 EEA major reports
– State of the environment and outlook report: 'Environment in the European Union at the turn of the century' (1999; and 2004, 2009, ...) = the EEA tool for strategic environmental planning
– The regular indicator report: 'EEA Environmental signals' (2000; and 2001, 2002, ...) = the EEA tool for performance review
5 Production process of an assessment
– Identification of users (politicians, scientists, school children, ...)
– Policy issues to be addressed: what should the assessment achieve?
– Process of the assessment
– Launch: at which policy event?
6 Designing effective assessments: the role of participation, science and governance, and focus
Workshop co-organised by the European Environment Agency and the Global Environmental Assessment Project, 2001
7 Integrated Environmental Assessment
An interdisciplinary process of structuring knowledge elements from various scientific disciplines in such a manner that all relevant aspects of a complex societal problem are considered in their mutual coherence, for the benefit of (sustainable) decision-making.
8 The Policy Cycle
policy preparation → policy formulation → policy execution → policy evaluation → (back to) policy preparation
9 Frameworks for scientific assessment processes that inform policy makers
1. Example: IPCC (Intergovernmental Panel on Climate Change)
– Involves only expert scientists in defined disciplines, no political stakeholders
– Produces lengthy reports
2. Example: CLRTAP (Convention on Long-Range Transboundary Air Pollution)
– Less clear science-policy distinction
– Few formal reports
10 Effective assessments
What counts as effective?
– Cost-effectiveness
– Improvements in the natural environment
– Fulfilment of political objectives
Attributes of effective assessments:
– Credibility
– Salience
– Legitimacy
11 Credibility
Lack of credibility:
– The assessment is based on shoddy methods
– The assessment ignores important empirical evidence
– The assessment draws inappropriate conclusions from the data
Gaining credibility:
– Through the process by which the information is created (example: data obtained under good laboratory practice)
– Through the credentials or other characteristics of the producers of the assessment (example: assessment done by well-known, highly regarded scientists)
12 Salience (relevance)
Lack of salience:
– The report produced is never referred to and never heard from again
– The assessment addresses questions whose answers do not interest the user
Gaining salience:
– The assessment addresses the particular concerns of a user
– The user is aware of the assessment
– The user considers the assessment relevant to current policy
13 Legitimacy
A measure of the political acceptability or perceived fairness of an assessment to a user.
Lack of legitimacy:
– In 'global' assessments, inputs from less powerful countries are not included or their interests are ignored
Gaining legitimacy:
– Users' and participants' interests, concerns, views and perspectives have been taken into account
– The assessment process has been a fair one
14 Assessment design (1)
1. Historical context of the assessment
– Characteristics of the issue area
– Position of the issue on the political agenda
2. Characteristics of the intended user
– Interest in the issue and/or the assessment
– Capacity to understand the results
– Openness to different sources of advice
15 Assessment design (2)
3. Assessment characteristics
– Participation: who is involved in the assessment process?
– Science and governance: how are assessments conducted with respect to the interactions between scientific experts and policy makers?
– Focus: how broadly (multidisciplinary) or narrowly (technically) focussed should the assessment be? How consensus-based should it be?
16 Conceptual framework for considering effective assessments (diagram)
Ultimate determinants → proximate pathways → assessment effectiveness:
– Historical context (issue characteristics) → linkage, attention cycle
– User characteristics → concern, capacity, openness
– Assessment characteristics → science/governance, participation, focus
These pathways feed salience, credibility and legitimacy, which together determine effectiveness.
17 Participation: critical issues
– The capacity of partners, clients and/or users to participate in the assessment (travel costs, administrative capacity, time for the assessment itself).
– Are scientists participating in their individual capacity (good for scientific credibility), or are they accountable to governments?
– Encourage participation by the stakeholders for whom the assessment is designed (NGOs, the policy-making community, country representatives) to make them interested in the final report.
– The process of participation may matter more than the content: inclusion as an author, or attendance at a meeting, increases the legitimacy of an assessment.
– A broad review by several international organisations of an assessment done by a few scientists can increase its legitimacy.
18 Science and governance: critical issues
– Assessments in issue areas that are scientifically controversial should be undertaken by institutions accountable to the scientific community, to minimise credibility concerns.
– While scientists prefer credible assessments, politicians prefer salient ones; assessments in more mature scientific areas might therefore better be undertaken by organisations more focussed on policy needs.
– Including policy recommendations in a scientific assessment can be dangerous. Here, assessments by 'boundary organisations' that are accountable to both science and policy may be the solution.
19 Focus: critical issues
– Successful assessments avoid addressing controversial issues.
– Broadly focussed assessments include more relevant factors, reach a larger audience and might be more relevant to decision makers.
– Most assessments to date have been too simple, excluding too many factors and causal chains. Assessments should be kept comprehensive despite all the interactions involved.
– Periodic separate thematic assessments could be produced instead of one big comprehensive assessment.
20 Use MDIAR to analyse the information provision process (see the sketch below)
MDIAR stands for:
M: Monitoring
D: Data
I: Information
A: Assessment
R: Reporting
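The chain can be made concrete in code. The sketch below is purely illustrative, assuming one wants to find the first missing MDIAR stage in an information flow; the stage names come from the slide, while the `first_gap` helper is hypothetical, not an EEA tool.

```python
# Illustrative only: model the MDIAR chain and locate where an
# information flow currently stops.
MDIAR = ["Monitoring", "Data", "Information", "Assessment", "Reporting"]

def first_gap(completed):
    """Return the first MDIAR stage not yet achieved, or None if complete."""
    for stage in MDIAR:
        if stage not in completed:
            return stage
    return None

# Example: monitoring networks exist and data are compiled,
# but the data have not yet been turned into information:
print(first_gap({"Monitoring", "Data"}))  # -> Information
```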
21 Steps in indicator-based reporting (see the sketch below)
1. Agree on the 'story' (define the environment-sector model; DPSIR)
2. List the (most important) policy questions (and identify policy levers)
3. Identify indicators that come close to answering these ('ideal' and 'actual'; define new ones!)
4. Compile the data
5. Assess
6. Draw conclusions; modify, adapt, update, iterate!
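As a rough illustration of steps 2-6, here is a minimal sketch in Python; the `Indicator` class, the trend rule and the data values are all hypothetical placeholders, not an EEA data model.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    name: str
    policy_question: str  # step 2: the policy question it helps answer
    ideal: bool           # step 3: 'ideal' indicator or currently available 'actual'
    data: list = field(default_factory=list)  # step 4: compiled time series

def assess(indicator):
    """Step 5: a deliberately crude trend assessment."""
    if len(indicator.data) < 2:
        return "insufficient data"
    return "improving" if indicator.data[-1] < indicator.data[0] else "not improving"

# Steps 1-3: the 'story' and the policy question drive the indicator choice.
nox = Indicator(
    name="Emissions of ozone precursors, EU15",
    policy_question="Are ozone-precursor emissions on track to meet the target?",
    ideal=False,
    data=[100.0, 92.0, 85.0],  # placeholder index values
)
print(assess(nox))  # step 6: draw conclusions, then iterate with updated data
```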
22 The DPSIR framework (diagram; see the sketch below)
– Drivers: e.g. transport and industry
– Pressures: e.g. polluting emissions
– State: e.g. air, water and soil quality
– Impact: e.g. ill health, biodiversity loss, economic damage
– Responses: e.g. clean production, public transport, regulations, taxes, information, etc.
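A minimal sketch of DPSIR as a data structure, reusing the slide's own examples; the mapping below is illustrative, not an EEA classification scheme.

```python
from enum import Enum

class DPSIR(Enum):
    DRIVER = "Drivers"
    PRESSURE = "Pressures"
    STATE = "State"
    IMPACT = "Impact"
    RESPONSE = "Responses"

# The slide's examples, tagged with their DPSIR category:
EXAMPLES = {
    "transport and industry": DPSIR.DRIVER,
    "polluting emissions": DPSIR.PRESSURE,
    "air, water and soil quality": DPSIR.STATE,
    "ill health, biodiversity loss, economic damage": DPSIR.IMPACT,
    "clean production, public transport, regulations, taxes": DPSIR.RESPONSE,
}

for example, category in EXAMPLES.items():
    print(f"{category.value:>9}: {example}")
```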
23 A performance indicator (chart; see the sketch below)
Emissions of ozone precursors, EU15, against the target
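A performance indicator of this kind compares observed values with a quantified target. The sketch below shows one common way to express this, distance to target; the function is generic and the numbers are placeholders, not the actual EU15 ozone-precursor series.

```python
def distance_to_target(base, current, target):
    """Fraction of the required reduction achieved so far (1.0 = target met)."""
    required = base - target
    achieved = base - current
    return achieved / required if required else float("nan")

# Placeholder index values (base year = 100), not the slide's real data:
base_year, latest, target = 100.0, 82.0, 55.0
print(f"{distance_to_target(base_year, latest, target):.0%} of the required cut achieved")
```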
24 The link between indicators and the policy process: distinguishing the differences and improving relevance
1. Indicators linked to quantitative targets
2. Indicators linked to stated objectives
3. Indicators linked to policy intentions or public expectations
25 What are scenarios?
Scenarios are archetypal descriptions of alternative images of the future, created from mental maps or models that reflect different perspectives on past, present and future developments.
26 Measuring is Not Knowing: the marine environment and the precautionary principle
'The enormous number of papers in the marine environment means that huge amounts of data are available, but ... we have reached a sort of plateau in ... the understanding of what the information is telling us .... We ... seem not to be able to do very much about it or with it. This is what led to the precautionary principle, after all – we do not know whether, in our studied ecosystem, a loss of diversity would matter, and it might.'
Marine Pollution Bulletin, Vol. 34, No. 9, pp. 680-681, 1997
27 The precautionary principle in assessments
– Levels of proof: assessments for public policy-making need lower levels of proof than normal good science
– Multidisciplinary approaches: improve the quality of an assessment by considering aspects of the problem from different perspectives
– Early warnings: successful prevention of environmental impacts, and of the associated costs, requires early warnings
28 Levels of proof: some illustrations
– Beyond all reasonable doubt
– Reasonable certainty
– Balance of probabilities/evidence
– Strong possibility
– Scientific suspicion of risk
– Negligible/insignificant
29 Organisation through Interest Groups
On each of the 33 servers across Europe
30 CIRCLE Library Service
31 Data flows in EIONET (diagram)
– National layer → European layer: EEA warehouse
– Information retrieval system: data access and visualisation
– E2RC: Reports and Reference Centre concept