The need-to-knows when commissioning evaluations Rick Davies, Stockholm, 27 June, 2016.



What type of questions should I pose if a consultant proposes the use of QCA?
- What about the evaluation questions that are not about causal attribution?
- What will you say if anyone asks about the average net effects of the project?
- What will you say if people ask how generalizable the findings are outside this population?
- Have you got enough cases?
- Do you have at least a loose Theory of Change (ToC) that will guide the choice of conditions?
- Have you got sufficient diversity of cases, given the range of causal conditions that you want to explore?
- If diversity is limited, how will you deal with this?
- Have you got cases with and without the outcome of interest?
- Are you likely to have many cases or conditions with missing data?
- What other methods will you use in association with QCA?
- Have you/your team got both sectoral and (QCA) technical skills?
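The "sufficient diversity" question above can be made concrete by comparing the configurations of conditions actually observed across cases with the set of logically possible configurations. The sketch below assumes crisp-set (1/0) coding; the case names and codings are invented for illustration.

```python
from itertools import product

# Hypothetical crisp-set data: each case coded 1/0 on three causal
# conditions (names and values are illustrative, not from the slides).
cases = {
    "case_a": (1, 1, 0),
    "case_b": (1, 0, 0),
    "case_c": (1, 1, 0),
    "case_d": (0, 1, 1),
    "case_e": (1, 0, 1),
}

n_conditions = 3
possible = 2 ** n_conditions        # 8 logically possible configurations
observed = set(cases.values())      # distinct configurations actually seen

print(f"{len(observed)} of {possible} configurations observed")

# Unobserved rows ("logical remainders") mark limited diversity:
# QCA can only speak about them via simplifying assumptions.
remainders = set(product([0, 1], repeat=n_conditions)) - observed
print(sorted(remainders))
```

With many conditions the possible configurations grow as 2^k, so limited diversity is the norm rather than the exception; listing the remainders makes explicit what the analysis cannot test directly.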

What questions can be answered with QCA?
- Questions about complex forms of causality: what works in what contexts
- Strong evidence about configurations, but less so about the causal mechanisms behind them (where more supporting within-case analysis is needed)
- Causes of an effect, as well as effects of a cause
NOT:
- Scale of effect
- Average effect

When would it be advisable to ask for a QCA in a ToR, and what should I then ask for?
- When there is a "loose" theory of change
- When there is diversity of contexts and interventions
- Where programs are trying to be adaptive / participatory / decentralized
- Possibly: where there are some hypotheses that could be tested, but other alternative explanations also need exploring (via algorithmic search)

How do I quality assure a QCA proposal and a QCA evaluation?
- Raw data must be available
- The coding protocol must be available
- Sensitivity analysis, if dichotomization of data is involved, especially of outcomes
- Key measures of solutions should be available: consistency and coverage (raw and unique)
- Which cases and conditions were rejected, and why
- If intermediate solutions are reported, what assumptions were made
- A Venn diagram?
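The consistency and coverage measures listed above have simple definitions in the crisp-set case, which a commissioner can verify directly against the raw data. A minimal sketch, using invented case codings for a single configuration X and outcome Y:

```python
# Hypothetical crisp-set data: each case coded for whether the
# configuration of conditions (X) and the outcome (Y) are present.
cases = [
    # (configuration_present, outcome_present)
    (1, 1), (1, 1), (1, 0), (0, 1), (0, 0), (1, 1),
]

x_and_y = sum(1 for x, y in cases if x and y)  # cases with both X and Y
x_total = sum(x for x, _ in cases)             # cases with X
y_total = sum(y for _, y in cases)             # cases with Y

# Consistency: how reliably X is followed by Y (subset relation).
consistency = x_and_y / x_total
# Coverage: how much of the outcome Y this configuration accounts for.
coverage = x_and_y / y_total

print(round(consistency, 2), round(coverage, 2))
```

A solution with high consistency but low coverage explains only a small share of the outcome cases; asking for both figures (and, for multi-path solutions, unique coverage per path) is what makes a QCA report checkable.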