Open policy practice: open science-based assessments for decision-makers. Jouni Tuomisto, National Institute for Health and Welfare, Kuopio, Finland.

Outline
– Information flow in policy support
– Examples of open assessments
– Shared understanding
– Six principles of open policy practice
– Structure of information objects
– Lessons learned from open policy practice

Outline
1. We could start with a brief intro on why Opasnet: what made you want to build the platform, and what were your initial expectations when you first started the project?
2. If you think it is important, we could also take some time to talk about the process of its design and what worked and what didn’t.
3. Then we could continue with more on the features, insights, and data of the website, or on the open risk assessment method of research and sharing scientific knowledge.
4. Finally, we could talk about how Opasnet has been used for social good and your vision for the future of the platform. “How have you proven that your tool works in practice to solve a clear, identified problem?”

Information flow in current decision support [diagram; elements: original data, scientific publications, CBA, impact assessment etc., scientific review, report, practical knowledge and lobbying; actors: researchers, expert, civil servant, stakeholders, decision maker]

Open policy practice [diagram; elements: open original data, scientific analysis, other scientific literature, open assessment, report, practical knowledge and lobbying; actors: researchers, expert, civil servant, stakeholders, decision maker]

What are open assessment and Opasnet?
Open assessment
– How can scientific information and value judgements be organised for informing societal decision making in a situation where open participation is allowed?
– [Previous names: open risk assessment, pyrkilo]
Opasnet
– What is a web workspace that contains all functionalities needed when performing open assessments, based on open source software only?

Why Opasnet? Need for a systematic flow of, and place for, relevant information:
– Scientific data and interpretations
– Valuations and discussions
– Decision options and objectives
– Models and scenarios

Tendering process for pneumococcal vaccine
Need to buy a new vaccine for the Finnish vaccination program (ca. doses per year). What should be the decision criteria? This question was answered by using an open hearing in Opasnet.
– Epidemiological model about health impacts of vaccines.
– Cost-effectiveness model including price and health costs (a sketch follows below).
– Online discussion forum about valuations and assumptions.
Best outcome: no outrage!
– Reasons: Specific question, moderation? Drug companies were active, anti-vaccine groups were not. Little outside researcher involvement.
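A minimal sketch, in R (the language later used for modelling in Opasnet), of how such a cost-effectiveness comparison could be set up. All vaccine names, prices, dose counts, and case numbers are hypothetical placeholders, not figures from the actual tender.

```r
# Hypothetical cost-effectiveness comparison of two vaccine offers.
# All figures below are illustrative placeholders, not the actual tender data.
vaccines <- data.frame(
  name            = c("Vaccine A", "Vaccine B"),  # hypothetical offers
  price_per_dose  = c(30, 25),                    # EUR per dose (assumed)
  doses_per_year  = c(250000, 250000),            # programme size (assumed)
  cases_prevented = c(480, 430),                  # epidemiological model output (assumed)
  treatment_cost_per_case = 5000                  # EUR of avoided treatment per case (assumed)
)

vaccines$purchase_cost  <- vaccines$price_per_dose * vaccines$doses_per_year
vaccines$health_savings <- vaccines$cases_prevented * vaccines$treatment_cost_per_case
vaccines$net_cost       <- vaccines$purchase_cost - vaccines$health_savings

# Rank the offers by net cost per case prevented (lower is better)
vaccines$net_cost_per_case <- vaccines$net_cost / vaccines$cases_prevented
vaccines[order(vaccines$net_cost_per_case), c("name", "net_cost_per_case")]
```

In the real assessment the inputs came from the epidemiological and cost models and from the open hearing, not from fixed numbers like these.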

Other projects and assessments in Opasnet (1)
– Climate change policies and health in Kuopio, Finland
– Future overview reports of Finnish ministries (Transport and Logistics; Health; Environment)
– Evaluation and summary of several climate policy reports, strategies, and programs of the city of Helsinki
– Health and ecological risks of mining (guidance and models)
– Water guide for assessing health risks of raw water contamination

Other projects and assessments in Opasnet (2)
– Environmental protection law: http://fi.opasnet.org/fi/Ymp%C3%A4rist%C3%B6nsuojelulaki
– Urban planning in Rauma: http://fi.opasnet.org/fi/Rauman_sataman_laajennuksen_vaikutus_terveyteen
– Assessment of fine particles from the Port of Rauma: http://fi.opasnet.org/fi/Pienhiukkasp%C3%A4%C3%A4st%C3%B6t_Raumalla
– Compensation for work-based driving: http://fi.opasnet.org/fi/Kilometrikorvaus
– Use of Puijo forest area: http://fi.opasnet.org/fi/Puijon_metsien_k%C3%A4ytt%C3%B6suunnitelman_p%C3%A4%C3%A4t%C3%B6ksenteko

Main findings from Pohjola et al 2012: In environmental health assessments there are tendencies towards:
a) increased engagement between assessors, decision makers, and stakeholders
b) more pragmatic, problem-oriented framing of assessments
c) integration of multiple benefits and risks from multiple domains
d) explicit consideration of values, alongside scientific facts, in assessment

Shared understanding: graph [figure; Pohjola MV et al., Food and Chemical Toxicology]

Shared understanding: definition
There is shared understanding about a topic within a group if everyone is able to explain what thoughts and lines of reasoning there are about the topic.
– There is no need to know all thoughts at the individual level.
– There is no need to agree on things (just to agree on what the disagreements are about).

Principles of open policy practice
– Intentionality
– Causality
– Shared information objects
– Criticism
– Openness
– Reuse

Six principles of open policy practice
– Intentionality: All that is done aims to offer the decision maker a better understanding of the outcomes of the decision.
– Shared information objects: All information is shared using a systematic structure and a common workspace where all participants can work.
– Causality: The focus is on understanding the causal relations between the decision options and the intended outcomes.
– Criticism: All information presented can be criticised based on its relevance and accordance with observations.
– Reuse: All information is produced in a format that can easily be used for other purposes by other people.
– Openness: All work and all information is openly available to anyone interested. Participation is free. If there are exceptions, these must be publicly justified.

An example of an open assessment: Health impact of radon in Europe

An example of a variable in a model

An example of a statement and the resolution of a discussion: Is Pandemrix a safe vaccine?

Application of soRvi in Opasnet

Results from soRvi

Problems perceived about open participation
1. It is unclear who decides about the content.
2. Expertise is not given proper weight.
3. Strong lobbying groups will hijack the process.
4. Random people are too uneducated to contribute meaningfully.
5. The discussion disperses and does not focus.
6. Those who are now in a favourable position in the assessment or decision-making business don’t want to change things.
7. The existing practices, tools, and software are perceived as good enough.
8. There is not enough staff to keep this running.
9. People don’t participate: not seen as useful, no time, no skills.
10. People want to hide what they know (and publish it in a scientific journal).

Problems observed about open participation
1. People want to hide what they know (and publish it in a scientific journal).
2. People don’t participate: not seen as useful, no time, no skills.
3. The existing practices, tools, and software are perceived as good enough.
4. There is not enough staff to keep this running.
5. Those who are now in a favourable position in the assessment or decision-making business don’t want to change things.
6. The discussion disperses and does not focus.
7. It is unclear who decides about the content.
8. Expertise is not given proper weight.
9. Strong lobbying groups will hijack the process.
10. Random people are too uneducated to contribute meaningfully.

Main rules in open assessment (1)
Each main topic should have its own page.
– Sub-topics are moved to their own pages as necessary.
Each topic has the same structure:
– Question (a research question passing the clairvoyant test)
– Answer (a collection of hypotheses as answers to the question)
– Rationale (evidence and arguments to support, attack, and falsify hypotheses and arguments)
ALL topics are open to discussion at all times by anyone.
– Including things like ”what is open assessment”.
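As a rough illustration, the question / answer / rationale structure of a topic page could be represented as a nested data object. The sketch below is in R with made-up field names and content; it is not the actual Opasnet data model.

```r
# Sketch of an open assessment information object with the three-part
# structure described above (question, answer, rationale).
# Field names and contents are illustrative, not the Opasnet schema.
info_object <- list(
  question  = "What is the health impact of residential radon in Europe?",
  answer    = list(
    hypotheses = c(
      "Residential radon causes a substantial share of lung cancer cases",
      "The impact is negligible at current exposure levels"
    )
  ),
  rationale = list(
    evidence  = c("epidemiological studies", "exposure measurements"),
    arguments = c("attack: exposure misclassification", "defence: consistent dose-response")
  )
)

str(info_object)  # inspect the nested question/answer/rationale structure
```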

Main rules in open assessment (2)
Discussions are organised around a statement. A statement is either about facts (what is?) or moral values (what should be?).
All statements are valid unless they are invalidated, i.e. attacked with a valid argument [sword]. The main types of attacks are to show that the statement is
– irrelevant in its context,
– illogical, or
– inconsistent with observations or expressed values.
Statements can have defending arguments [shield].
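A minimal sketch of the validity rule described above: a statement stands unless it is attacked by an argument that is itself valid, and a defending argument [shield] is modelled here simply as an attack on the attacking argument. The function and field names are my own illustration (assuming an acyclic argument tree), not Opasnet code.

```r
# A statement is valid unless at least one attacking argument is itself valid.
# Each node is a list with an id and the ids of the items it attacks.
# Assumes the argument structure is a tree (no circular attacks).
is_valid <- function(id, nodes) {
  attackers <- Filter(function(n) id %in% n$attacks, nodes)
  !any(vapply(attackers, function(a) is_valid(a$id, nodes), logical(1)))
}

nodes <- list(
  list(id = "S1", attacks = character(0)),  # main statement
  list(id = "A1", attacks = "S1"),          # attack on S1 [sword]
  list(id = "D1", attacks = "A1")           # defence of S1: invalidates the attack [shield]
)

is_valid("S1", nodes)  # TRUE: the only attack (A1) is itself invalidated by D1
```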

Main rules in open assessment (3)
– Uncertainties are expressed as subjective probabilities.
– A priori, the opinions of each person are given equal weight.
– A priori, all conflicting statements are considered equally likely.
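One simple way to read the equal-weight rule is as a linear opinion pool over participants' subjective probabilities; the sketch below illustrates that reading and is not a method prescribed in the slides.

```r
# Equal-weight (linear) pooling of subjective probabilities for one statement.
pool_equal <- function(probs) {
  mean(probs)  # a priori, each participant's probability gets weight 1/n
}

# Three participants' subjective probabilities that the statement is true
pool_equal(c(0.2, 0.5, 0.8))  # 0.5

# With no other information, two mutually exclusive conflicting statements
# are a priori treated as equally likely:
rep(1 / 2, 2)  # 0.5 0.5
```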

Future promises and challenges
Technically, Opasnet works surprisingly well. Personally, I am able to do almost all my work in Opasnet. Many people see open participation and expert + decision-maker collaboration as a promising approach.
However, there are many reasons for resistance:
– Open practices are a threat to expert authority.
– People don’t want to show intermediate work.
– Old tools are considered better for each specific task.
→ Now is the time for a new, ambitious collaboration and community for open online modelers/assessors.
→ The course DARM starts at UEF on Jan 13, 2015!

Conclusions
We could do most of our scientific work online using shared information systems and web workspaces (such as Opasnet). These tools exist and are functional. The work would be quicker and better.
There are major obstacles to new practices:
– Lack of awareness.
– Lack of practical knowledge to use the tools.
– Current practices and incentives are against sharing.
– Reluctance to change things.
Join Decision Analysis and Risk Management, 8 Jan – 14 Feb 2013!

History briefly: borrowing and combining ideas
– 1996: EU Parliament visit: ”Information does not flow!”
– 1997: Idea of Internet-based assessments
– 2000: Decision analysis (Harvard University)
– 2005: Wikipedia, wiki approach
– 2006: Opasnet wiki launched
– 2006: Wikinomics, mass collaboration, wisdom of crowds
– 2006: Argumentation rules (Amsterdam University)
– 2007: Open assessment
– 2009: Wiki government
– 2011: Online wiki modeling using R
– 2012: MongoDB database
– 2013: Open policy practice (guidance for making decisions)
– 2014: Several policy projects without research funding

Framework for TEKAISU method

Open risk management: overview [figure; elements: QRA, public health data] Mikko V Pohjola and Jouni T Tuomisto, Environmental Health 2011, 10:58.

How Opasnet helps in assessments: https://docs.google.com/drawings/d/1f1s1drjo8qMJ-vWR3BQgsfRbH2DO0E43Xb01eRddWcg/edit?hl=en_GB&authkey=CN_oqbYK&pli=1