Introduction to assessment performance Mikko Pohjola, THL

Contents
– Concepts
– General framework
– Common perspectives (& examples): quality assurance/quality control, uncertainty, model performance
– Properties of good assessment
– Summary

Discussion example: swine flu vaccination
Because of urgency, the swine flu vaccine was bought in Finland without thorough testing. When narcolepsy cases were later identified, the decision made without testing was seen as a major mistake. Was it a mistake?
– How should we evaluate the situation to find an answer?
– How did the decision-maker assess the situation?
– How should she have assessed the situation?

Swine flu example: issues in performance?
What are the critical issues in the assessment performance? Possibilities include, e.g.:
– The assessment truthfully estimates the total health impact of swine flu.
– The assessment truthfully estimates the health impact of a vaccination campaign.
– Only tested vaccines are assessed.
– The assessment does not underestimate potential side effects of the vaccine, whether tested or not.
– Something else; what?
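To make the discussion concrete, the following sketch shows one way the decision could be framed as an expected-impact comparison. All figures (population, attack rate, severe-case risk, vaccine efficacy, side-effect risk) are purely illustrative assumptions, not estimates from the actual swine flu assessment.

```python
# Illustrative expected-impact comparison for the vaccination decision.
# All figures below are made-up placeholders, not real swine flu data.

population = 5_000_000      # assumed population at risk
attack_rate = 0.10          # assumed infection probability without vaccination
severe_case_risk = 0.001    # assumed probability of severe outcome per infection
vaccine_efficacy = 0.8      # assumed reduction in infection risk
side_effect_risk = 1e-5     # assumed probability of a serious side effect per vaccinee

# Expected severe outcomes without a vaccination campaign
no_vacc = population * attack_rate * severe_case_risk

# Expected severe outcomes with a campaign: remaining infections plus side effects
with_vacc = (population * attack_rate * (1 - vaccine_efficacy) * severe_case_risk
             + population * side_effect_risk)

print(f"Expected severe outcomes, no vaccination:   {no_vacc:,.0f}")
print(f"Expected severe outcomes, with vaccination: {with_vacc:,.0f}")
# The comparison is only as good as the side_effect_risk estimate,
# which is exactly the parameter that was poorly known without testing.
```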

Swine flu example: follow-up as a part of assessment performance?
– What methods are there to identify whether something unexpected starts to happen after the decision?
– Should these methods be considered already in the assessment, before the decision?
– How can this be done?
– Does this improve the assessment performance?

Concepts & rationale
Some basic concepts:
– Performance = goodness!
– Assessment, management
– Model
– Process (making/using), product
– Output, outcome
– Assessor, decision/policy maker, stakeholder
– Participant, user

Concepts & rationale
Why is evaluation of assessment performance important?
– Efficient use of resources?
– Value of work done?
– Importance/meaning of information?
– Implications of information?
– Actual impacts of information?
– …
…because the funder, customer, user, boss, peer, stakeholder etc. wants or needs to know!

Roles and interests
– Experts: data quality, analysis procedure, coherence, comprehensiveness, …
– Funders: relevance, efficiency, timeliness, importance, …
– Users (DM): understandability, reliability (of source), acceptance, practicality, …
– Interested parties (SH): same as DM, but from a different perspective

General RA/RM framework
Process, product, use

Common perspectives & examples
– Quality assurance/quality control: focus on the assessment process; an "engineering" perspective
– Uncertainty: focus on the assessment output; a scientist's perspective?
– Model performance: focus on modelling and the model; combines the QA/QC and uncertainty perspectives; a modeller's perspective

Quality assurance/quality control
Principle: a good process guarantees good outputs/outcomes!
Question: how should an assessment process be conducted?
Examples:
– Ten iterative steps by Jakeman et al. (2006)
– IDEA framework (Briggs, 2008)
– (Over)appreciation of randomized controlled trials (RCTs)

Ten iterative steps in development and evaluation of environmental models
Jakeman et al.: Ten iterative steps in development and evaluation of environmental models. Environmental Modelling & Software, Issue 5, May 2006.

IDEA framework (INTARESE)
Briggs: A framework for integrated environmental health impact assessment of systemic risks. Environmental Health 2008, 7:61.

Uncertainty
Principle: performance is an intrinsic property of an information product!
Question: how good is the answer provided by the assessment?

Uncertainty
Examples:
– Statistical uncertainty analysis: mean, variance, confidence limits, distributions, … (cf. D. Lindley: Philosophy of Statistics, 2000); a minimal sketch follows below
– Sources of uncertainty: e.g. model, parameter & scenario uncertainty (as applied e.g. by the U.S. EPA)
– Extensive approaches: e.g. inclusion of qualitative aspects and sources of uncertainty, as in NUSAP
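As an illustration of the first bullet, the following sketch summarizes the mean, variance, and confidence limits of an assessment output from a Monte Carlo sample. The toy model and its distribution parameters are arbitrary placeholders, not values from any real assessment.

```python
# Minimal statistical uncertainty summary of an assessment output.
# The toy model and its distribution parameters are arbitrary placeholders.
import numpy as np

rng = np.random.default_rng(seed=1)
exposure = rng.lognormal(mean=0.0, sigma=0.5, size=10_000)  # uncertain input
slope = rng.normal(loc=0.02, scale=0.005, size=10_000)      # uncertain dose-response slope
impact = exposure * slope                                    # output of a toy model

mean = impact.mean()
variance = impact.var(ddof=1)
ci_low, ci_high = np.percentile(impact, [2.5, 97.5])         # 95 % interval

print(f"mean = {mean:.4f}, variance = {variance:.6f}")
print(f"95 % confidence limits: [{ci_low:.4f}, {ci_high:.4f}]")
```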

NUSAP
– N: numeral
– U: unit
– S: spread
– A: assessment (qualitative judgment)
– P: pedigree (historical path leading to the result)
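To make the scheme concrete, here is a minimal sketch of how a NUSAP-qualified quantity could be represented in code. The field names, pedigree criteria, and example values are illustrative assumptions, not part of the NUSAP method itself.

```python
# Minimal representation of a NUSAP-qualified quantity.
# Field names, pedigree criteria, and example values are illustrative only.
from dataclasses import dataclass

@dataclass
class NusapQuantity:
    numeral: float     # N: the number itself
    unit: str          # U: its unit
    spread: str        # S: quantitative spread, e.g. a 95 % range
    assessment: str    # A: qualitative judgment of reliability
    pedigree: dict     # P: scores describing how the number was produced

emission_factor = NusapQuantity(
    numeral=12.4,
    unit="g/kg fuel",
    spread="+/- 30 %",
    assessment="fair",
    pedigree={"proxy": 3, "empirical basis": 2, "method": 3, "validation": 1},
)

print(emission_factor)
```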

NUSAP – pedigree
Jeroen van der Sluijs: NUSAP – some examples. Presentation.

Model performance
Principle: the model is the essence of the assessment!
Question: how good is the model?
Examples:
– Verification, validation (reliability, usability, …); see the sketch below
– Outcome-oriented approach by Matthews et al. (2011)
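As one narrow, concrete reading of validation, the following sketch compares model predictions with observations using mean bias and root-mean-square error. The data arrays and the acceptance threshold are placeholder assumptions, not a general criterion.

```python
# Minimal validation check: compare model predictions with observations.
# The values and the acceptance threshold are placeholders.
import numpy as np

observed = np.array([3.1, 4.0, 5.2, 6.8, 7.5])    # measured values
predicted = np.array([2.8, 4.4, 5.0, 7.1, 8.3])   # model output for the same cases

bias = np.mean(predicted - observed)               # systematic over/underestimation
rmse = np.sqrt(np.mean((predicted - observed) ** 2))

print(f"mean bias = {bias:+.2f}, RMSE = {rmse:.2f}")
if rmse < 1.0:  # assumed accuracy target for this illustration
    print("Model meets the (assumed) accuracy target.")
else:
    print("Model does not meet the (assumed) accuracy target.")
```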

Outcome-oriented modelling approach
Matthews et al.: Raising the bar? – The challenges of evaluating the outcomes of environmental modelling and software. Environmental Modelling & Software, March 2011.

Summary of common perspectives
– Assessment process and product are addressed in many ways
– Use of results is mostly not considered: the link between outputs and outcomes (cf. Matthews et al. 2011)
– Evaluation is often a separate process: expert processes of making assessments and using their results, and expert processes of evaluating performance
– Alternative perspectives?

Properties of good assessment

– Ex post (after assessment) evaluation
– Ex ante (before/during assessment) evaluation: guidance of design and execution
– Links process and output with use, thereby also linking them to outcomes

Example: what makes a good hammer?

– How is the hammer made? By whom?
– What properties does the hammer have?
– What do you want to do with the hammer?
– How does the hammer help you do it?

Summary
– Consideration of (intended) use is essential: consider process and product in light of use
– Consider the instrumental value of information
– Cf. absolute value (a common science view)
– Cf. ad hoc solutions (a common practice view)
– Contextuality, situatedness, practicality, …
– In policy support, information is a tool (a means to an end); a model is a tool for producing information
– How does this relate to the previous lectures about DA and the DA study plan exercise?