Evaluation – Principles and methods. Matt Barnard, Head of Evaluation, NSPCC

Purpose of evaluation
Definition: "Examine how a policy or intervention was designed and carried out and with what results." (Magenta Book)
Asks objective questions:
– What were the impacts?
– How was it delivered?
– What were the barriers and facilitators?
– Did it deliver value for money?
Aims to provide:
– A 'scientific' basis for policy making

Evaluation design process
[Flow diagram linking: intervention design, logic model, intervention implementation planning, evaluation design, intervention implementation, evaluation implementation, and evaluation findings]

Logic models
Characteristics:
– Mechanisms, not processes
– Key steps, not every step
– Explanatory, not descriptive
– Reflects theoretical assumptions
Benefits:
– Sense check
– Identifies realistic outcomes
– Facilitates evaluation design

Strength of design matrix
Low power (small numbers/effect size):
– Weak design (poor or no counterfactual): unlikely to detect a difference, low confidence in attribution
– Strong design (realistic counterfactual): unlikely to detect a difference, high confidence in attribution
High power (large numbers/effect size):
– Weak design (poor or no counterfactual): likely to detect a difference, low confidence in attribution
– Strong design (realistic counterfactual): likely to detect a difference, high confidence in attribution
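To make the power dimension of the matrix concrete, here is a minimal sketch assuming a simple two-arm trial analysed with an independent-samples t-test; the effect sizes and group sizes are illustrative choices, not NSPCC figures, and statsmodels is used purely for convenience.

```python
# Minimal sketch: how sample size and effect size drive statistical power.
# Assumes a two-arm comparison analysed with an independent t-test; the
# numbers below are illustrative, not drawn from any NSPCC evaluation.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

for effect_size in (0.2, 0.5):          # small vs medium standardised effect
    for n_per_group in (30, 300):       # small vs large trial arms
        power = analysis.power(effect_size=effect_size,
                               nobs1=n_per_group,
                               alpha=0.05,
                               ratio=1.0)   # equal allocation to both arms
        print(f"effect size {effect_size}, n per arm {n_per_group}: "
              f"power = {power:.2f}")
```

With 30 participants per arm and a small effect, power is only around 0.1 (the "unlikely to detect a difference" row of the matrix); with 300 per arm and a medium effect it is effectively 1.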

Strength – Evaluation design (strongest to weakest):
– Randomized controlled trial
– Quasi-experimental design
– Before and after measures
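As a hypothetical illustration of why the hierarchy runs this way, the toy simulation below (all values invented) builds in a secular trend, so outcomes improve even without the intervention: a before-and-after comparison absorbs that trend into its estimate, while a randomized control group recovers something close to the true effect.

```python
# Toy simulation (invented numbers, not an NSPCC analysis): outcomes improve
# for everyone over time, so a before-and-after design overstates the effect,
# while a randomized comparison recovers something close to the true effect.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
true_effect = 2.0      # assumed effect of the intervention
secular_trend = 3.0    # improvement that would have happened anyway

baseline = rng.normal(50, 10, size=n)
treated = rng.random(n) < 0.5                      # individual randomization
follow_up = (baseline + secular_trend
             + true_effect * treated + rng.normal(0, 5, size=n))

before_after = follow_up[treated].mean() - baseline[treated].mean()
rct_estimate = follow_up[treated].mean() - follow_up[~treated].mean()

print(f"Before-and-after estimate: {before_after:.2f}")  # ~5.0 (trend + effect)
print(f"RCT estimate:              {rct_estimate:.2f}")  # ~2.0 (true effect)
```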

Types of design
RCT:
– Individual randomization
– Cluster randomization/roll-out
– BAU/waiting list/alternative services comparison
Quasi-experimental designs:
– Matched area/groups
– Matched individual
– Interrupted time series
– Regression discontinuity
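The difference between the first two RCT options can be shown in a few lines. This is a hypothetical sketch with made-up participant and site identifiers: individual randomization allocates each person independently, whereas cluster randomization allocates whole sites, which suits interventions delivered site-wide (and has knock-on implications for sample size and analysis).

```python
# Hypothetical sketch of individual vs cluster randomization, using made-up
# participant and site identifiers. The arms mirror the slide above:
# intervention vs business-as-usual (BAU)/waiting list.
import random

random.seed(42)

participants = [f"P{i:02d}" for i in range(1, 13)]
sites = {"Site A": participants[:4],
         "Site B": participants[4:8],
         "Site C": participants[8:]}

# Individual randomization: each participant gets an independent allocation.
individual_arms = {p: random.choice(["intervention", "BAU/waiting list"])
                   for p in participants}

# Cluster randomization: the site is the unit of allocation, so everyone
# at a site receives the same arm (typical when delivery is site-wide).
site_arms = {site: random.choice(["intervention", "BAU/waiting list"])
             for site in sites}
cluster_arms = {p: site_arms[site]
                for site, people in sites.items() for p in people}

print("Individual allocation:", individual_arms)
print("Cluster allocation:   ", cluster_arms)
```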

Factors influencing methodology
Intervention stage of development:
– Early exploration
– Defined and established but not proven
– Transferability potential
Costs and benefits:
– Resources
– Timescales

Key principles
Clarity about the key question:
– Avoid 'default' questions
Methods matched to the question:
– Ensure the methods can answer the question asked
Claims matched to the evidence:
– Avoid over-claiming
Have a coherent story to tell.