National Public Health Institute (KTL), Finland, www.ktl.fi
Open risk assessment
Lecture 7: Evaluating assessment performance
Mikko Pohjola, KTL, Finland

Lecture contents
- Properties of good assessments
- Relations between properties and parts of assessments
- Assessment performance
- Evaluation of assessment performance
- Group work: evaluation of performance

Properties of good assessments

Properties of good assessments
- Quality of content: how good a description of the given part of reality is this?
- Applicability: how well does the information within the description transfer to its use process (decision making)?
- Efficiency = f(quality of content, applicability | purpose) / effort expenditure: how good an output, given the effort, is or was produced?

Quality of content
- Informativeness: how exact is the description? Quantitatively: how tight is the spread, how narrow is the distribution?
- Calibration: how close to reality is the description? Quantitatively: how far is the central estimate from the "true" value?
- Relevance: how well does the description answer the specific question?
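The two quantitative criteria above can be turned into simple summary numbers. A minimal sketch, assuming an assessment result is available as a sample of estimates; the metric choices (standard deviation for spread, mean for the central estimate) are illustrative assumptions, not prescribed by the lecture:

```python
import statistics

def informativeness(samples):
    """Spread of the estimate distribution: a tighter spread
    (smaller standard deviation) means a more informative description."""
    return statistics.stdev(samples)

def calibration(samples, true_value):
    """Distance of the central estimate (here: the mean) from the
    'true' value; a smaller distance means better calibration."""
    return abs(statistics.mean(samples) - true_value)

# Two hypothetical assessments of the same quantity (true value 10.0):
narrow_but_biased = [14.0, 14.5, 15.0, 15.5, 16.0]  # informative, poorly calibrated
wide_but_centred = [4.0, 7.0, 10.0, 13.0, 16.0]     # well calibrated, uninformative

print(informativeness(narrow_but_biased), calibration(narrow_but_biased, 10.0))
print(informativeness(wide_but_centred), calibration(wide_but_centred, 10.0))
```

The example makes the trade-off explicit: one description scores well on informativeness but badly on calibration, the other the reverse.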

Applicability
- Usability: how well is the information understood by its intended users? Language, terms, representation and illustration, clarity, …
- Availability: how well do the intended users get access to the information according to their needs? Who receives the information, and when, where and how, in relation to the identified information need?

Applicability (continued)
- Acceptability: how well are the information, and the conclusions based on it, accepted (taken into use, internalized) by the intended users and by stakeholders?
  - Acceptability of premises: the value judgments and assumptions behind the assessment, e.g. the severity weight or cost of a specific disease, the use of DALYs and/or money as aggregate measures, …
  - Acceptability of process: the methods, means and materials used in making the assessment, e.g. data and data sources, openness (transparency), scientific reliability of methods (and assessors), …

"Effectiveness"
- Quality of content and applicability are together called effectiveness.
- How much effect does (or can) the assessment have regarding its intended use? E.g. how great an impact does it have on the decision making where it is used?
- As the use process (decision making) is considered to lie outside the sphere of assessment, effectiveness is thought of as "potential for effect".

Efficiency
- Intra-assessment efficiency: how much effectiveness (quality of content and applicability) per unit of spent effort is or was achieved in a particular assessment?
- Inter-assessment efficiency: how much effectiveness (quality of content and applicability) per unit of spent effort is or was achieved in a series of assessments?
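The two notions above can each be written as a single ratio. A minimal sketch, assuming effectiveness has already been reduced to a numeric score and effort to a single number (e.g. person-days); the lecture does not prescribe how to do either:

```python
def intra_assessment_efficiency(effectiveness_score, effort):
    # Effectiveness achieved per unit of effort in one particular assessment.
    return effectiveness_score / effort

def inter_assessment_efficiency(series):
    # Efficiency over a series of assessments: total effectiveness
    # achieved divided by total effort spent. 'series' is a list of
    # (effectiveness_score, effort) pairs.
    total_effectiveness = sum(e for e, _ in series)
    total_effort = sum(c for _, c in series)
    return total_effectiveness / total_effort

# Three hypothetical assessments in a series:
series = [(0.8, 4.0), (0.6, 2.0), (0.9, 6.0)]
print(intra_assessment_efficiency(0.8, 4.0))
print(inter_assessment_efficiency(series))
```

Note that inter-assessment efficiency is an effort-weighted aggregate, not the average of the per-assessment ratios: a series can contain one efficient and one wasteful assessment and still score moderately overall.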

Relations
- Quality of content -> product
- Applicability -> product, process and use
- Efficiency -> assessment process

Relations
- Quality of content
  - Informativeness, calibration -> assessment/variable result
  - Relevance -> assessment/variable scope
- Applicability
  - Usability -> presentation of content
  - Availability -> assessment process – use process interaction
  - Acceptability of premises -> scientific context, policy context
  - Acceptability of process -> assessment process, assessment/variable definition
- Effectiveness -> assessment process

Assessment performance
- Performance = f(quality of content, applicability, efficiency | purpose)
- The purpose must always be defined and explicated!
  - General purpose = to describe reality (no deliberate distortion of results)
  - Instrumental purpose = to answer a specific need (the specific assessment question must be defined)

Evaluation of assessment performance
- Evaluate the product and the process in terms of the properties, against the purpose.
- Quantitative evaluation where possible: informativeness, calibration, efficiency (effort expenditure).
- Qualitative or semi-quantitative evaluation for the rest; no good semi-quantitative evaluation methods are yet known to be available.

Evaluation of assessment performance
- A priori: applying the performance principles while making an assessment, i.e. continuous evaluation of assessment performance:
  - Definition of purpose and scope
  - Identification and explication of assumptions
  - Information-processing means and methods
  - Process management, e.g. openness, communication, tools
  - Formulation and presentation of outputs
  - …

Evaluation of assessment performance
- A posteriori: evaluating the performance of an existing, already completed assessment.
  - Same principles as a priori, but as an afterward check of performance.
  - Evaluating only a posteriori is always too late.
- The evaluation part of the group work case is an a posteriori evaluation exercise.

Group work: evaluation of performance
- Evaluate the identification and explication of the assessment purpose:
  - Intended use process
  - Intended users
  - Important stakeholders
  - Policy context
- If the identification is incomplete, consider how the purpose should or could have been identified and explicated.

Group work: evaluation of performance
- Evaluate quality of content, on both the assessment and the variable level: scope, definition, result.
- Evaluation can be done against both:
  - the purpose identified and explicated in the assessment, and
  - the purpose suggested by the evaluator.

Group work: evaluation of performance
- Evaluate applicability against the purpose:
  - "Packaging" of the information
  - Assessment – use interaction, during and after the assessment
  - Identification and explication of assumptions
  - Explication and choice of the methods, means and materials used
- Again, evaluate against both the "real" and the suggested purpose as seen necessary.
- Also compare the original format and the new format.

Group work: evaluation of performance
- Evaluate efficiency; in practice: effort expenditure (person time, money).
- Consider the efforts in both the original assessment and the conversion work.

Group work: evaluation of performance
- Compile an overall evaluation of performance:
  - Performance = f(quality of content, applicability, efficiency | purpose)
  - Since efficiency = f(quality of content, applicability | purpose) / effort expenditure, this reduces to: performance = f(quality of content, applicability | purpose) / effort expenditure
- Consider possible improvements.
- Does ORA appear useful in improving assessment performance?
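The substitution above can be sketched numerically. This is an illustrative toy calculation only: the functional form (a simple product of scores on a 0..1 scale) and all the numbers are assumptions, not values given in the lecture:

```python
def effectiveness(quality_of_content, applicability):
    # Illustrative functional form: a weak link in either dimension
    # limits overall effectiveness (simple product of 0..1 scores).
    return quality_of_content * applicability

def performance(quality_of_content, applicability, effort):
    # Performance = f(quality of content, applicability | purpose) / effort expenditure
    return effectiveness(quality_of_content, applicability) / effort

# Hypothetical scores: a thorough assessment vs. a quicker,
# rougher one made with half the effort.
thorough = performance(quality_of_content=0.9, applicability=0.8, effort=10.0)
quick = performance(quality_of_content=0.6, applicability=0.6, effort=5.0)
print(thorough, quick)  # 0.072 vs 0.072: equal performance despite different effort
```

With these (made-up) numbers the two assessments perform equally well: the quicker one trades effectiveness for effort at the same rate, which is exactly the trade-off the performance formula is meant to expose.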