The added value of evaluation


The added value of evaluation
Julian Barr, UK Evaluation Society & Itad Ltd
British Academy and Government Social Research Network
Evaluation: impact, challenges and complexity, 12th July 2017

Frame
- Evaluation as a field of practice
- Evaluation: quality and rigour
- The value of evaluation
- Challenges in evaluation

Evaluation as a field of practice
“Evaluation determines the merit, worth, or value of things. …” (Scriven, 1991)
- Worth is complex, personal and political
- Evaluation is inherently political
“Evaluation is an objective process of understanding how a policy or other intervention was implemented, what effects it had, for whom, how and why.” (Magenta Book, 2011)
- Considers a range of outcomes and impacts (not just achievement of the positive, planned ones)
- Explores causality through a range of methods
- Concerned equally with how impacts are achieved and how much impact is achieved
- Pays attention to power dynamics and to socio-economic winners and losers

Evaluation as a field of practice
Characteristics of a distinct field:
- Development of evaluation theories and approaches
- Emergence of evaluation associations and societies, nationally and internationally (102 listed)
- Global coherence: IOCE, UN Year of Evaluation
- Journals and conferences devoted entirely to evaluation
- Academic positions in evaluation
A profession:
- Evaluators come from many different primary disciplines and professions
- Generation of standards, ethics of practice and capabilities specific to the field
- Specific training in evaluation methods and approaches
Multi-discipline / trans-discipline:
- Evaluation draws on many disciplines’ concepts and methods, but adapts them to the social and political practice of evaluation.

Evaluation as a field of practice
Evaluation and research: a spectrum of views
Different in purpose:
- Evaluation informs decision making in policy and practice
- Research contributes to knowledge in specific subjects
Much shared methodological territory:
- Both are empirically based
- Some consider evaluation an applied social science, one that draws evaluative conclusions about the delivery and effectiveness of interventions with social and economic objectives
- Others are clear that it is not simply the application of social science methods to solve social problems

Evaluation quality and rigour
What type of evaluation is best?
- One size does not fit all
- Evaluation is pluralistic: a range of designs, methods and approaches can be appropriate
- The validity of an evaluation needs to be related to its purpose

Evaluation quality and rigour
Many ways to establish quality and rigour
- Trustworthy evaluation needs designs and methods that are unbiased, precise, conceptually sound, and repeatable
- There is strong advocacy for quantitative (quasi-)experimental designs and a focus on counterfactual causality, e.g. use of the Maryland Scale (counterfactual strength, from 1: before/after or with/without, to 5: randomised controlled trial)
- But quality and rigour are not methodologically dependent, or absolute: ‘right rigour’
- Non-experimental methods can be rigorous
- Mixed methods may confer added rigour
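To make the counterfactual point concrete, here is a minimal illustrative sketch (not from the presentation) using simulated data: a simple before/after estimate (Maryland level 1) absorbs a background trend, while a difference-in-differences estimate against a comparison group largely removes it. All variable names and numbers are invented for illustration.

```python
import numpy as np

# Purely illustrative simulation: outcomes for a treated and a comparison group,
# before and after an intervention. A background trend of +2 affects everyone;
# the true programme effect is +5 for the treated group only.
rng = np.random.default_rng(42)
n = 500
baseline = rng.normal(50, 10, size=(2, n))   # row 0: treated, row 1: comparison
trend, true_effect = 2.0, 5.0
endline_treated = baseline[0] + trend + true_effect + rng.normal(0, 2, n)
endline_comparison = baseline[1] + trend + rng.normal(0, 2, n)

# Maryland level 1: simple before/after on the treated group,
# which conflates the background trend with the programme effect
before_after = endline_treated.mean() - baseline[0].mean()

# Stronger counterfactual: difference-in-differences using the comparison group
did = (endline_treated.mean() - baseline[0].mean()) - \
      (endline_comparison.mean() - baseline[1].mean())

print(f"before/after estimate: {before_after:.2f}  (true effect is {true_effect})")
print(f"difference-in-differences estimate: {did:.2f}")
```

In this simulation the before/after figure overstates the true effect by roughly the size of the background trend, which is the weakness the higher Maryland levels are designed to guard against.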

Evaluation quality and rigour
Causal inference: the counterfactual to counterfactuals
- Generative frameworks: theory-based approaches that test and confirm causal processes; Theories of Change, Contribution Analysis, Process Tracing, Realist Evaluation
- Comparative frameworks: case-based approaches that compare causal factors across and within cases; QCA, meta-ethnography
- Participatory frameworks: validation by participants of the effect caused by an intervention
- Complexity: theory-based approaches, developmental evaluation, systems thinking, modelling, Bayesian analysis

Realist evaluation draws on a generative notion of causation, involving an iterative process of theory building, testing and refinement that allows causal statements about attribution to be made. Evaluation findings should demonstrate what worked, for whom, how, and in what circumstances.

The need for small n approaches arises when data are available for only one or a few units of assignment, so that experiments or quasi-experiments testing for statistical differences in outcomes between treatment and comparison groups are not possible. For large n analyses, experiments provide a powerful tool for attributing cause and effect. The basis for experimental causal inference stems from the manipulation of one (or more) putative causal variables and the subsequent comparison of observed outcomes for a group receiving the intervention (the treatment group) with those for a control group that is similar in all respects except that it has not received the intervention (Duflo et al., 2008; White, 2011).

The small n approaches outlined above draw on a different basis for causal inference. They set out to explain social phenomena by examining the underlying processes or mechanisms that lie between cause and effect. Whereas experimental approaches infer causality by identifying the outcomes resulting from manipulated causes, a mechanism-based approach searches for the causes of observed outcomes. Theory-based and case-based approaches are especially suited to unpicking ‘causal packages’ (how causal factors combine) and to establishing what the contribution of an intervention might be; however, such approaches are not good at estimating the quantity or extent of a contribution. Their overarching aim is to build a credible case demonstrating a causal relationship between an intervention and observed outcomes. Mohr (1999) suggests that a medical diagnosis, or a detective investigating a case, is a good analogy for the process of elimination and accumulation of evidence by which causal conclusions can be reached. Multiple causal hypotheses are investigated and critically assessed. Evidence is built up to demonstrate the different connections in the causal chain, with the ultimate goal of providing sufficient proof to demonstrate a plausible association, as in Contribution Analysis, or to substantiate a causal claim “beyond reasonable doubt”.
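As a toy illustration of the ‘accumulation of evidence’ logic described above for small n, theory-based approaches, the sketch below applies simple Bayesian updating to a single causal claim as successive pieces of process evidence are observed. The prior, the likelihoods and the evidence descriptions are all hypothetical assumptions, not figures from Mohr, Contribution Analysis guidance or any cited study.

```python
# Illustrative Bayesian updating for a small-n causal claim (e.g. in process
# tracing or contribution analysis). Each piece of evidence is described by the
# probability of observing it if the claim is true vs. if it is false.
# All values below are hypothetical.

def update(prior: float, p_if_true: float, p_if_false: float) -> float:
    """Return the posterior probability of the claim after one piece of evidence."""
    numerator = p_if_true * prior
    return numerator / (numerator + p_if_false * (1.0 - prior))

evidence = [
    ("stakeholders report the mechanism operated as theorised", 0.8, 0.3),
    ("timing of the outcome matches the intervention rollout",  0.9, 0.4),
    ("no plausible rival explanation found",                    0.7, 0.2),
]

posterior = 0.5  # neutral prior on the contribution claim
for description, p_true, p_false in evidence:
    posterior = update(posterior, p_true, p_false)
    print(f"{description}: P(claim) = {posterior:.2f}")
```

On these invented inputs the claim moves from an even bet towards “beyond reasonable doubt” as consistent evidence accumulates, while any one piece of evidence on its own would be far from conclusive.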

The value of evaluation
Not just based on rigour
Cost : benefit of evaluation: maximise the marginal value of new knowledge
Benefit is greatest where:
- uncertainty is high
- the evidence levers high numbers (people benefitted, £s budgeted)
- uptake pathways are clear (comms, audience, opportunity)
Proportionate cost
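A hypothetical back-of-the-envelope sketch of this cost:benefit logic: the expected value of an evaluation grows with decision uncertainty, the scale of spend the evidence could lever, and the clarity of the uptake pathway. The formula and every figure below are illustrative assumptions, not a method from the presentation.

```python
# Hypothetical expected-value heuristic for commissioning an evaluation.
budget_at_stake = 50_000_000   # £ programme spend the findings could influence
p_decision_changes = 0.25      # uncertainty: chance the evidence alters the decision
expected_improvement = 0.10    # assumed gain in value-for-money if it does
p_uptake = 0.6                 # clear comms, audience and opportunity for use
evaluation_cost = 400_000      # £ cost of the evaluation itself

expected_benefit = budget_at_stake * p_decision_changes * expected_improvement * p_uptake
print(f"expected benefit ~ £{expected_benefit:,.0f} vs cost £{evaluation_cost:,}")
print(f"benefit:cost ratio ~ {expected_benefit / evaluation_cost:.1f}")
```

On these invented numbers the evaluation roughly pays for itself twice over; halving the budget at stake or the uptake probability would tip the balance, which is the ‘proportionate cost’ point on the slide.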

Challenges in Evaluation
- Rigour across methods and approaches: an open debate on quality in evaluation; a more evaluation-oriented version of the Maryland Scale; method wars or pax methodologica?
- More complex evaluands
- Funding for evaluation is patchy (pros and cons)
- Maximising the value of evaluation: communication and uptake

Resources
- Alternative approaches to causality: the DFID-funded ‘Stern report’ on methods for impact assessment [https://www.oecd.org/derec/50399683.pdf]
- White & Phillips, 3ie, ‘Attribution in small n impact evaluations’ [http://www.3ieimpact.org/media/filer_public/2012/06/29/working_paper_15.pdf]