IPDET 2015: WHAT IS EVALUATION?
Robert Picciotto, King's College London
"There are no facts, only interpretations" (Friedrich Nietzsche)


Setting the stage
– Evaluation: a fledgling profession
– Defining evaluation in terms of evaluator capabilities:
  – Dispositions
  – Knowledge
  – Practice
– Evaluation is on the move
– IPDET has shaped its evolution

Conflicting definitions of impact evaluation
– DAC: "the positive and negative, primary and secondary long-term effects produced by an intervention, directly or indirectly, intended or unintended" (sustainability)
– 3ie: measuring through experiments the difference that an intervention makes with respect to an indicator of interest (attribution)
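The 3ie notion of attribution can be made concrete with a minimal simulation (a sketch; all numbers are hypothetical, not drawn from any real programme): when units are randomly assigned, the difference in mean outcomes between treated and control groups estimates the effect the intervention "makes" on the indicator of interest.

```python
import random

random.seed(1)

TRUE_EFFECT = 2.0  # hypothetical effect of the intervention

# Draw 10,000 baseline outcomes, then randomize half into treatment.
baseline = [random.gauss(10, 3) for _ in range(10_000)]
random.shuffle(baseline)
treated = [y + TRUE_EFFECT for y in baseline[:5_000]]
control = baseline[5_000:]

# Under random assignment, the simple difference in means is an
# unbiased estimate of the effect on the indicator of interest.
estimate = sum(treated) / len(treated) - sum(control) / len(control)
print(round(estimate, 2))  # close to TRUE_EFFECT
```

The point of the sketch is the design, not the arithmetic: it is randomization, not the subtraction, that licenses reading the difference as attribution.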

The 3ie definition is popular…
– It claims scientific validity and promises definitive, quantitative judgments
– It champions microeconomics at a time when macroeconomics has lost its lustre
– It gives austerity-oriented policy makers a convenient tool for closing down programmes that 'do not work'
– The academic establishment privileges it!

…but on its own it is not evaluation…
Advantages:
– Attributing results to social interventions is a legitimate question (Does it 'work'?)
– Randomization takes care of selection bias
Disadvantages:
– RCTs do not tell us whether the intervention was worth pursuing, how well it performed, or whether it is replicable (why/what/how?)
– They blur accountability (who?)

…since it is neither necessary nor sufficient… nor easy to do!
– It may be redundant… or not feasible
– Validity may be threatened by unobserved factors
– The experiment may affect behaviour
– RCTs are often unethical or illegal
– Valid statistical treatment is demanding (e.g. sample size)
…and finally… simpler/cheaper methods do exist!
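The point about demanding statistical treatment can be illustrated with a standard two-arm power calculation (a sketch using the usual normal-approximation formula; the effect size and error rates below are illustrative assumptions): detecting even a modest standardized effect of 0.2 at 5% significance and 80% power requires roughly 400 participants per arm.

```python
import math
from statistics import NormalDist

def n_per_arm(effect, sd, alpha=0.05, power=0.80):
    """Approximate sample size per arm for a two-sided,
    two-sample comparison of means (normal approximation)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # critical value, ~1.96
    z_b = NormalDist().inv_cdf(power)          # power term, ~0.84
    return math.ceil(2 * ((z_a + z_b) * sd / effect) ** 2)

print(n_per_arm(effect=0.2, sd=1.0))  # -> 393 per arm
```

Halving the detectable effect quadruples the required sample, which is one reason RCTs of small or diffuse interventions are so costly.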

What then is evaluation?
There are many ways to describe what we do (or should do), but the following concise definition, crafted by Michael Scriven, has gained broad-based acceptance in the evaluation community:
"Evaluation is the process of determining the merit, worth and value of something, or the product of that process"
It is well worth pondering!

The merit dimension
– Merit is intrinsic to the intervention. It is about doing things right, and it focuses on efficacy (theory of action)
– It verifies compliance with standards or norms, or attests to the achievement of intended goals
– It makes evaluation akin to process auditing
– It is one facet of evaluation… but again, on its own, it is not evaluation

The worth dimension
– Worth is about doing the right things. It is extrinsic along two dimensions:
  – relevance from the perspective of stakeholders and society
  – linkages between outcomes and impacts on society (theory of change)
– Aggregating preferences in the public interest is tricky: only ethics can solve the (Arrow) impossibility theorem

Value completes the trilogy
– The value criterion aims at an overall judgment of effectiveness (relevance, efficacy, efficiency, sustainability, impact)
– 'Doing good' as well as 'doing right'
– Value assessment ensures that intrinsic merit criteria are selected to measure the extrinsic worth of the intervention from a public interest perspective (the domain of moral philosophy)

How do current evaluation models measure up to the definition?
– The utilization-focused model stresses merit: it is consultancy, not evaluation
– The empowerment model stresses worth: facilitation more than evaluation
– Knowledge-oriented models are value free and akin to social research
– Social justice-oriented models are new, untested and neglected

The future of evaluation?
– Given alarming inequality and unsustainable environmental trends, will progressive evaluation be the next evaluation wave?
– Progressive evaluation would require transparency, broad-based participation and fulsome democratic debate, and resist the forces that threaten our independence
– It would be grounded in ethics and come to terms with the impossibility of reducing the social world to simplistic propositions

You are the future of evaluation: the world needs you!
Living up to the merit/worth/value challenge is extraordinarily ambitious:
– Professionalize
– Speak truth to power
– Accept the solitude of independence
The future is risky and uncertain, but the struggle itself is enough…
"The best way to predict the future is to create it." (Peter Drucker)

THANK YOU FOR YOUR ATTENTION!