Copernicus Institute — Knowledge Quality Assessment: rethinking uncertainty, challenges for science and society. Jeroen van der Sluijs, Copernicus Institute.


Copernicus Institute Knowledge Quality Assessment: rethinking uncertainty, challenges for science and society. Jeroen van der Sluijs, Copernicus Institute for Sustainable Development and Innovation, Utrecht University

Complex environmental risks

Typical characteristics (Funtowicz & Ravetz):
– decisions need to be made before conclusive scientific evidence is available
– decision stakes are high: the potential error costs of wrong decisions can be huge
– values are in dispute
– the knowledge base is a mixture of knowledge and ignorance: large (partly irreducible) uncertainties, knowledge gaps, and imperfect understanding
– assessment is dominated by models, scenarios, and assumptions
– many (hidden) value loadings in problem frames, indicators, and assumptions

Coping with uncertainty is essential.

Problematic definitions of uncertainty

Example 1: Walker et al.: "We adopt a general definition of uncertainty as being any departure from the unachievable ideal of complete determinism."

It might make more sense to talk about unreality or uncomplexity: any departure from, or reduction of, the inherent complexity of systems outside the controlled environment of the laboratory.

Definition from the HarmoniCa uncertainty guidance document

A person is uncertain if s/he lacks confidence about the specific outcomes of an event or action. Reasons for this lack of confidence might include a judgement of the information as incomplete, blurred, inaccurate or potentially false, or might reflect intrinsic limits to the deterministic predictability of complex systems or of stochastic processes. Similarly, a person is certain if s/he is confident about the outcome of an event. It is possible that a person feels certain but has misjudged the situation (i.e. s/he is wrong).

The definition above defines uncertainty as a property (state of confidence) of the decision maker. Alternatively, uncertainty can be defined as a property (state of perfection) of the total body of knowledge or information that is available at the moment of judgement. Uncertainty is then seen as an expression of the various forms of imperfection of the available information, and depends on the state of the art of scientific knowledge on the problem at the moment the decision needs to be made (assuming that the decision maker has access to the state-of-the-art knowledge).

Challenges

– Increase society's capacity to manage and surmount the uncertainties surrounding knowledge production and use when designing and implementing precautionary (or should I say: responsible) policies and sustainable development
– A new epistemology that does not see uncertainty as deviation from a deterministic ideal, nor as imperfect knowledge, nor as low quality
– The need for a new (multidimensional) definition of uncertainty (maybe even a new word)

Insights on uncertainty

– Omitting uncertainty management can lead to scandals, crises and loss of trust in science and institutions
– More research tends not to reduce uncertainty: it usually reveals unforeseen complexities and meets irreducible uncertainty (intrinsic or practical)
– High quality ≠ low uncertainty: quality relates to fitness for function (robustness, precautionary principle)
– In many complex problems the unquantifiable uncertainties dominate the quantifiable uncertainty
– A shift in focus is needed, from reducing uncertainty towards systematic ways to explicitly cope with uncertainty and quality -> knowledge quality assessment

Uncertainty as a "monster"

A monster is a phenomenon that fits, at the same moment, into two categories that were considered to be mutually exclusive (Smits, 2002; Douglas, 1966).

Cultural categories that we thought to be mutually exclusive and that now tend to get increasingly mixed up: knowledge – ignorance; objective – subjective; facts – values; prediction – speculation; science – policy.

Responses to monsters

Different degrees of tolerance towards the abnormal:
– monster-exorcism (expulsion)
– monster-adaptation (transformation)
– monster-embracement (acceptance)
– monster-assimilation (rethinking)

Footnote: compare Lakatos (1976) on preserving mathematical models against apparent refutations: surrender (throw the model away and start all over again), monster-barring, monster-adjustment, and lemma incorporation.

Monster-exorcism

Uncertainty causes discomfort: reduce uncertainties! Strong belief in "objective science": "the puzzle can be solved".

Example: "We are confident that the uncertainties can be reduced by further research" (IPCC, 1990).

But…

For each head that science chops off the uncertainty monster, several new monster heads tend to pop up (unforeseen complexities).

In 1994 the IGBP dropped the objective to reduce uncertainty: "full predictability of the earth system is almost certainly unattainable".

Copernicus Institute Knowledge Quality Assessment "We cannot be certain that this can be achieved easily and we do know it will take time. Since a fundamentally chaotic climate system is predictable only to a certain degree, our research achievements will always remain uncertain. Exploring the significance and characteristics of this uncertainty is a fundamental challenge to the scientific community." (Bolin, 1994) Former chairman IPCC on objective to reduce uncertainties:

Monster adaptation

Fit the uncertainty monster back into the categories: purification.
– Quantify uncertainty: subjective probability and Bayesian belief
– Tendency to build system models based on "objective science" and to externalise the subjective parts and uncertainties into scenarios and storylines
– Boundary work
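The "quantify uncertainty" response can be sketched with a minimal Bayesian update (a hypothetical example, not from the slides): a subjective prior belief about a hypothesis is combined with a noisy observation to give a posterior degree of belief, which is exactly how monster adaptation folds uncertainty back into a single number.

```python
# Minimal Bayesian updating sketch (all numbers are illustrative assumptions):
# prior belief that an emission threshold is exceeded, updated after a
# monitoring alarm with known hit and false-alarm rates.

def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Posterior P(H|E) from prior P(H) and the two conditional likelihoods."""
    numer = likelihood_if_true * prior
    denom = numer + likelihood_if_false * (1.0 - prior)
    return numer / denom

prior = 0.5  # subjective prior: exceedance judged as likely as not
posterior = bayes_update(prior,
                         likelihood_if_true=0.9,   # P(alarm | exceeded)
                         likelihood_if_false=0.2)  # P(alarm | not exceeded)
print(f"posterior after one alarm: {posterior:.3f}")  # 0.45/0.55 = 0.818
```

Note how the subjective prior never disappears: the posterior is as much a statement about the analyst's starting beliefs as about the data, which is the crack through which the monster returns.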

IPCC, 10 years after "we are confident that the uncertainties can be reduced…"

Monster adaptation meets its limits

Different models fed with the same scenarios produce very different results.

"Integrated Assessment Modeling of Global Climate Change: Transparent Rational Tool for Policy Making or Opaque Screen Hiding Value-laden Assumptions?" (Steve Schneider)

Monster embracement

Uncertainty is welcomed as an appreciated property of life:
– fascination with the unfathomable complexity of our living planet, Gaia
– room for spirituality and wonder as a counterweight to the engineering worldview of "managing the biosphere"
– plea for a humble science; holism; inclusive science

Or: uncertainty is welcomed because it fits well into other political agendas (strategic):
– denial of the reality of environmental risks by emphasizing all those uncertainties
– manufacturing uncertainty

Monster assimilation

Rethink the categories by which the knowledge base is judged. Create a place for monsters in the science–policy interface.

Post-normal science; reflexive science; complex systems research.

Uncertainty has multiple dimensions

– Technical (inexactness)
– Methodological (unreliability)
– Societal (limited social robustness)
– Epistemological (ignorance)

Inexactness

Intrinsic uncertainty:
– variability / heterogeneity
Technical limitations:
– resolution error
– aggregation error
– unclear definitions
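The quantifiable part of inexactness, variability in inputs, is the dimension that standard uncertainty propagation handles well. A sketch under assumed numbers (the toy model and all distributions are illustrative, not from the guidance): Monte Carlo propagation of parameter variability through a simple emissions calculation.

```python
# Monte Carlo propagation of input variability (technical inexactness) through
# a toy model E = activity * emission_factor. All distributions are assumed
# for illustration only.
import random

random.seed(42)  # fixed seed for reproducible draws

def sample_emission():
    activity = random.gauss(100.0, 10.0)        # variability in activity data
    emission_factor = random.uniform(0.8, 1.2)  # plausible range for the factor
    return activity * emission_factor

draws = sorted(sample_emission() for _ in range(10_000))
mean = sum(draws) / len(draws)
p05, p95 = draws[500], draws[9500]  # empirical 5th and 95th percentiles
print(f"mean ~ {mean:.1f}, 90% interval ~ [{p05:.1f}, {p95:.1f}]")
```

The point of the typology is what such a calculation leaves out: it captures only this technical dimension, while methodological unreliability, limited social robustness and ignorance do not show up in the interval at all.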

Unreliability

Methodological limitations. Limited internal strength in:
– use of proxies
– empirical basis
– methodological rigour
– validation
Bias in knowledge production:
– motivational bias (interests, incentives)
– disciplinary bias
– cultural bias
– choice of (modelling) approach
– subjective judgement
Future scope

Limited social robustness

Limited external strength in:
– bias / value-ladenness
– insufficient exploration of rival problem framings
– management of dissent
– extended peer acceptance / stakeholder involvement
– transparency
– access and availability
– intelligibility
Strategic/selective knowledge use

Ignorance

Epistemological limitations: limited theoretical understanding.
System indeterminacy:
– open-endedness
– chaotic behavior
– intrinsic unknowability
Active ignorance:
– model fixes for reasons understood
– limited domains of applicability of functional relations
– numerical error
– surprise A
Passive ignorance:
– bugs (software error, hardware error, typos)
– model fixes for reasons not understood
– surprise B


RIVM/MNP Uncertainty Guidance

Systematic reflection on uncertainty and quality in:
– problem framing
– involvement of stakeholders
– selection of indicators
– appraisal of the knowledge base
– mapping and assessment of relevant uncertainties
– reporting of uncertainty information

[Diagram: the layered structure of the RIVM/MNP Uncertainty Guidance. A Mini-Checklist (reminder list that invokes reflection) serves as the portal to the Quickscan Questionnaire and the Quickscan Hints & Actions List (advice, hints and implications, including advice on quantitative and qualitative tools for uncertainty assessment), which in turn point to the Detailed Guidance and the Tool Catalogue for Uncertainty Assessment. All are available as downloads.]


"Wisdom is to know that you do not know" (Socrates)