Making Contribution Claims
IPDET 2011, Ottawa
John Mayne, PhD
Advisor on Public Sector Performance; Adjunct Professor, University of Victoria

John Mayne, Advisor on Public Sector Performance

The context
- An intervention is expected to contribute to certain desired results.
- The desired results have been observed to occur.
- No single factor likely 'caused' the results; several players are involved.
- Alternative approaches (such as RCTs or quasi-experiments) are not available or feasible.
- But there is still a need to say something useful about the contribution the intervention is making: are you making a difference?

Theory-based Evaluations
- Growing acceptance of the need for theory-based approaches:
  - to better design interventions
  - to understand what works, where and when
- Numerous approaches, including:
  - realist evaluation
  - theory of change approaches

A results chain
- Activities (how the program carries out its work). Examples: negotiating, consulting, inspecting, drafting legislation.
- Outputs (goods and services produced by the program). Examples: checks delivered, advice given, people processed, information provided, reports produced.
- Immediate outcomes (the first-level effects of the outputs). Examples: actions taken by the recipients, or behaviour changes.
- Intermediate outcomes (the benefits and changes resulting from the outputs). Examples: satisfied users, jobs found, equitable treatment, illegal entries stopped, better decisions made.
- End outcomes (impacts) (the final or long-term consequences). Examples: environment improved, stronger economy, safer streets, energy saved.
External factors influence every stage of the chain.

Results chain links
The same results chain, now with the question asked of each link: why will these outcomes come about? External factors again act on every link.

Theories of change
- A results chain with embedded assumptions, risks and other explanatory factors identified.
- An explanation of what has to happen for the results chain to work.
Example: anti-smoking campaign leading to a reduction in smoking
- Assumptions: target is reached, message is heard, message is convincing, no other major influences at work.
- Risks: target not reached, poor message, peer pressure to smoke very strong.
- Other explanatory factors: reduction due to trend pressure or price increases.

A Generic Theory of Change
The causal chain runs: activities and outputs, then reach and reaction, then changes in knowledge, attitudes, skills, opportunities and incentives, then behaviour changes, then end results. Unintended effects can arise at every stage, and external influences act throughout. For each link, the theory of change spells out:
- Reach and reaction. Assumptions: how, and to what extent, are the intervention's outputs expected to reach people? What has to happen? What contextual factors influence these processes? Risks: risks to the link not occurring.
- Changes in knowledge, attitudes, skills, opportunities and incentives. Assumptions: how does the intervention expect to enhance knowledge, attitudes, skills, opportunities and/or incentives? What has to happen? What factors influence these processes? Risks: risks to the link not occurring. Other explanatory factors: other interventions; self-learning.
- Behaviour changes. Assumptions: how are changes in knowledge, attitudes, skills, opportunities and/or incentives expected to change behaviour? What has to happen? What factors influence these processes? Risks: risks to the link not occurring. Other explanatory factors: peer or trend pressure; other interventions.
- End results. Assumptions: how are behavioural changes in the target population expected to influence the desired end result? What has to happen? What factors influence these processes? Risks: risks to the link not occurring. Other explanatory factors: socio-economic factors.
- External influences. Assumptions: how do external factors influence the realization of the intervention's ToC? Risks: risks to the links in the ToC not occurring as expected. Other explanatory factors: socio-economic factors; other interventions.
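The structure just described can be made concrete as data. The following is a minimal, hypothetical sketch in Python: each link in a theory of change carries its expected result, assumptions, risks and other explanatory factors, and the assumptions can be turned mechanically into the evidence questions an evaluator must answer. All class and method names are illustrative, not part of any published tool.

```python
from dataclasses import dataclass, field

@dataclass
class Link:
    """One link in a theory of change, per the generic ToC above."""
    result: str                                             # expected result at this step
    assumptions: list = field(default_factory=list)         # what has to happen
    risks: list = field(default_factory=list)               # threats to the link occurring
    other_factors: list = field(default_factory=list)       # rival explanations to assess

@dataclass
class TheoryOfChange:
    intervention: str
    links: list = field(default_factory=list)

    def questions(self):
        """Turn every assumption into a question needing evidence."""
        return [f"Is it true that: {a}?"
                for link in self.links for a in link.assumptions]

# The anti-smoking example from the earlier slide, expressed in this form.
toc = TheoryOfChange(
    intervention="Anti-smoking campaign",
    links=[Link(
        result="Reduction in smoking",
        assumptions=["target is reached", "message is heard", "message is convincing"],
        risks=["target not reached", "poor message", "strong peer pressure to smoke"],
        other_factors=["existing downward trend", "price increases"],
    )],
)

for q in toc.questions():
    print(q)
```

Writing the links down this explicitly is the point: each assumption becomes a testable claim, and each "other factor" becomes a rival explanation the analysis must address.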

Addressing causality
- "The only way to deal with causality is to use a counterfactual." NOT TRUE.
- The philosophy of science discusses several alternative perspectives on causality:
  - successionist (Hume, and Mill's Methods of Agreement and Difference)
  - generative (mechanistic, or process, causality)

Addressing causality: the gold standard debate (RCTs et al.)
- An intense debate is underway, especially in development impact evaluation.
- In concept, RCTs may be great. In practice, RCTs have problems and often limited applicability.
- Then what do we do?

Causal questions
1. Has the intervention caused the result? (What would have happened without the intervention?)
2. Has the intervention made a difference? (What contribution has the intervention made?)
3. Why has the result occurred? (What role did the intervention play?)

Mechanistic causation
- Process or generative causality: tracing the links in the theory between events.
- The alternative to successionist (counterfactual) approaches, i.e. variation causality.
- Everyday causality: the auto mechanic, air-crash investigation, forensic work, doctors.

Contribution analysis: the theory
- There is a theory behind the intervention, with expected results.
- The activities of the intervention were implemented as planned.
- The intervention theory is supported by evidence: the sequence of results is being realized, and the assumptions are holding.
- Other influencing factors have been assessed and accounted for.

The contribution claim
- Therefore, it is reasonable to conclude that the intervention is making a difference: it is contributing to (influencing) the desired results.
- This takes a mechanistic approach to causality: understanding and confirming the causal mechanisms at work in an intervention.

Contribution analysis: the practice
1. Set out the attribution problem.
2. Critically develop the expected theory of change.
3. Gather the existing evidence.
4. Assess the contribution story.
5. Seek out additional evidence.
6. Revise and strengthen the contribution story.
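The six steps are iterative: steps 4 to 6 repeat until the contribution story is credible enough for the decision at hand. A minimal sketch of that loop, with deliberately simplified stand-ins for each analytic step (real contribution analysis is qualitative judgement, not code; all names here are hypothetical):

```python
def assess(story):
    """Step 4: return the ToC assumptions not yet backed by evidence."""
    return [a for a in story["assumptions"] if a not in story["evidence"]]

def contribution_analysis(assumptions, initial_evidence, find_evidence, max_rounds=3):
    # Steps 1-3 are represented by the inputs: the attribution problem and
    # theory of change yield the assumptions; existing evidence is gathered.
    story = {"assumptions": list(assumptions), "evidence": list(initial_evidence)}
    for _ in range(max_rounds):
        gaps = assess(story)                      # step 4: where is the story weak?
        if not gaps:
            return story, True                    # claim is reasonably supported
        story["evidence"] += find_evidence(gaps)  # step 5: seek additional evidence
        # step 6: the strengthened story feeds the next assessment round
    return story, False                           # still weak after the rounds allowed

# Usage: suppose each round turns up evidence for one more assumption.
assumptions = ["reach", "reaction", "behaviour change"]
story, ok = contribution_analysis(assumptions, ["reach"], lambda gaps: gaps[:1])
```

The design point the sketch captures is that the contribution story is never "proved" in one pass; credibility is built up by repeatedly assessing weaknesses and targeting new evidence at them.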

Developments in CA
- EES Prague Conference
- CA Forum website: group.org/-Forum-.html
- Upcoming special issue of the journal Evaluation on CA
- DFID work on alternative methods

Main messages
- Results chains et al. should not be seen as theories of change.
- Counterfactuals are neither necessary nor sufficient for 'proving' causality.
- Key impact evaluation questions should be: why has the result occurred? What has been the intervention's contribution?
- Contribution analysis (and related approaches) produces contribution claims.

Some references
- Mayne, J. (2011). Addressing Cause and Effect in Simple and Complex Settings through Contribution Analysis. In R. Schwartz, K. Forss, and M. Marra (Eds.), Evaluating the Complex. Transaction Publishers.
- Mayne, J. (2008). Contribution Analysis: An Approach to Exploring Cause and Effect. ILAC Brief 16. Available at ilac.org/files/publications/briefs/ILAC_Brief16_Contribution_Analysis.pdf
- Mayne, J. (2001). Addressing Attribution through Contribution Analysis: Using Performance Measures Sensibly. Canadian Journal of Program Evaluation, 16(1). See also bvg.gc.ca/domino/other.nsf/html/99dp1_e.html
- Funnell, S. and P. Rogers (2011). Purposeful Program Theory. Jossey-Bass.