Narrowing the evaluation gap

Presentation transcript:

Narrowing the evaluation gap
Session 2.A. Not just window dressing – how IPE can be used to understand effects from impact evaluations
2018 EEF Evaluators' Conference
#EEFeval18 @EducEndowFound

Using the IPE to better understand the effects from impact evaluations: by re-thinking the design and use of logic models
Dr Bronwen Maxwell

Some potential contributions of an IPE based on theory-based evaluation principles:
- Understand programme success or failure: distinguish between programme, implementation and methodological deficiency; strengthen learning by exploring variation among different levels of implementation and different contexts
- Understand causation: unpack the 'black box'; understand causal logic and capture causal mechanisms; address the limitations of experimental designs (why and how?)
- Develop middle range theory: "a description of how an intervention leads to change that lies between the minor but necessary working hypotheses … and the all-inclusive systematic efforts to develop a unified theory"

"[EEF] Evaluators should develop an intervention logic model or theory of change in partnership with the delivery team to inform the evaluation [...] It is important to know not just if an intervention 'works' in terms of producing desired outcomes, but also if it works in the manner theorised." (Humphrey et al., 2015, p. 9)

Potential contributions of logic models to an IPE:
- Understand programme success or failure: articulate the thinking behind how the programme is expected to work so that it can be tested; gather data on implementation and contextual variation
- Understand causation: articulate the causal logic; provide a framework for exploring causal logic and mechanisms, including independent but inter-related mechanisms
- Develop middle range theory: explore the plausibility of the underpinning middle range theory

Some issues with the use of logic models in EEF evaluations – explanation of causality and complexity:
- Programme theory and implementation theory are not distinguished
- Implementation theory is overemphasised
- Causal mechanisms are subsumed into arrows (see the sketch after this list)
- Causal mechanisms are poorly evidenced
- There may be alternative and/or recursive causal paths
- There may be independent but inter-related causal paths
- Context and concurrent programmes are ignored

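To make the 'arrows' critique concrete: if a logic model is held as data rather than as a static diagram, each causal link can carry an explicit, testable mechanism instead of an unlabelled arrow, and alternative or recursive paths become visible. Below is a minimal sketch in Python; the class names and the example links are hypothetical illustrations, not drawn from any actual EEF programme.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Link:
    """A causal link whose mechanism is stated explicitly, not hidden in an arrow."""
    source: str                 # an input, activity or intermediate outcome
    target: str                 # the element it is hypothesised to change
    mechanism: str              # the theorised causal process, so it can be evidenced
    evidence: str = "untested"  # e.g. "untested", "qualitative", "mediation analysis"

@dataclass
class LogicModel:
    links: list = field(default_factory=list)

    def add(self, source, target, mechanism, evidence="untested"):
        self.links.append(Link(source, target, mechanism, evidence))

    def paths_into(self, target):
        """All hypothesised routes into an element, including alternative paths."""
        return [link for link in self.links if link.target == target]

# Hypothetical example: two alternative paths into the same intermediate
# outcome, plus a recursive (feedback) link -- exactly the structures that a
# flat boxes-and-arrows diagram tends to hide.
lm = LogicModel()
lm.add("CPD sessions", "teacher knowledge", "modelling of strategies")
lm.add("peer network", "teacher knowledge", "social learning")          # alternative path
lm.add("teacher knowledge", "pupil outcomes", "improved instruction")
lm.add("pupil outcomes", "teacher knowledge", "feedback from results")  # recursive path

for link in lm.paths_into("teacher knowledge"):
    print(f"{link.source} -> teacher knowledge via '{link.mechanism}' [{link.evidence}]")
```

Holding the model this way would let an IPE attach an evidence rating to each mechanism as data come in, rather than treating the diagram as fixed.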

Some issues with the use of logic models in EEF evaluations – context:
- Context is oversimplified as a set of features
- Context is dynamic and may be agentic
- Contextual factors are relational
- Change is immanent rather than external
- Contextual factors are socio-historically and culturally located

Example of issues of causation, complexity and context: EEF Booktrust Summer Active Evaluation

Moving towards a better understanding of effects: an evidence-informed logic model is developed using research evidence on:
- the limitations of logic models in relation to causation, complexity and context
- the initiative being evaluated
- the likely causal and implementation processes, in context and taking account of complexity
- evidence relating to the connections between elements in implementation paths, and to the causal processes that theorise how these connections occur and lead to outcomes

Moving towards an evidence-based logic model
Building the model:
- The knowledge mobilisation literature was crucial for identifying three core causal mechanisms, detailing the implementation processes, and recognising the complexity and non-linearity of knowledge 'transfer'
- Contextual factors were drawn from the literature on research use
- An iterative combination of research evidence and conversations created the underpinning hypothesis
Benefits:
- Directed data gathering and analysis in ways that may not have been apparent otherwise
- Led to the conclusion that effective advocacy-based scale-up has three key and inter-related system components, each with a different set of enabling characteristics

Improving the impact of teaching assistants: EEF scale-up campaign

Moving towards an evidence-based logic model
Building the model – multi-strand complexity in the causal and implementation processes:
- Effective professional learning leading to teacher change
- Effective pedagogical strategies leading to pupil outcomes
- Professional learning and communities improving teacher retention
Benefits:
- Supported the design of tools and analysis for a pilot evaluation
- Provided evidence of the path from inputs to outputs, supporting the plausibility of the model
- Identified crucial programme components and ways of strengthening the hypothesised links between inputs and outcomes

Evaluation of the RETAIN programme: CPD for early career KS1 teachers

Key components of an evidence-informed logic model

Evaluating implementation and mechanisms
Chris Bonell, Professor of Public Health Sociology, London School of Hygiene and Tropical Medicine

Process evaluation examines two kinds of processes:
- Processes of implementation – the actions providers and clients enact as part of delivery, e.g. training teachers in restorative practice; teaching a social and emotional skills curriculum; participating in a school-wide action group
- Mechanisms of impact – the mechanisms that interventions trigger which generate outcomes, e.g. developing new skills; increasing commitment to school; developing new relationships

Processes of implementation:
- Quantitative metrics: fidelity/quality; reach; dose delivered/received; acceptability – measured via structured observations/videos, logbooks, surveys etc. (see the sketch after this list)
- Qualitative accounts – gathered via semi-structured observations, interviews, focus groups etc.
- Frameworks, e.g. Carl May's normalisation process theory: coherence, cognitive participation, collective action, reflexive monitoring
- Develop and test hypotheses about how context affects implementation, e.g. baseline capacity and culture
- Rationale: assess whether failure of implementation or failure of theory explains a null result; refine implementation; assess the potential transportability of the intervention
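As one way of operationalising those quantitative metrics, the sketch below computes dose delivered, dose received and mean fidelity from a session-level delivery log using pandas. The data, column names and number of planned sessions are all hypothetical, invented for illustration.

```python
import pandas as pd

# Hypothetical session-level delivery log: one row per school per session.
log = pd.DataFrame({
    "school":          ["A", "A", "A", "B", "B", "C"],
    "session":         [1, 2, 3, 1, 2, 1],
    "delivered":       [1, 1, 1, 1, 0, 1],       # did the session actually run?
    "pupils_present":  [24, 22, 25, 18, 0, 30],
    "pupils_enrolled": [26, 26, 26, 20, 20, 32],
    "fidelity_score":  [4, 5, 4, 3, None, 2],    # observer rating, 1-5
})

PLANNED_SESSIONS = 3  # from the (assumed) programme specification

per_school = log.groupby("school").agg(
    sessions_run=("delivered", "sum"),
    mean_fidelity=("fidelity_score", "mean"),
)
# Dose delivered: share of planned sessions that actually ran.
per_school["dose_delivered_pct"] = 100 * per_school["sessions_run"] / PLANNED_SESSIONS
# Dose received: share of enrolled pupil-sessions actually attended. (True
# 'reach' -- pupils attending at least once -- would need pupil-level data.)
per_school["dose_received_pct"] = (
    100 * log.groupby("school")["pupils_present"].sum()
        / log.groupby("school")["pupils_enrolled"].sum()
)
print(per_school.round(1))
```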

Mechanisms of action:
- Realist evaluation asks: what works, for whom, how and under what conditions?
- Develop, refine and test 'context-mechanism-outcome configurations', e.g. engaging students in school to reduce risk behaviours will work better in schools with low baseline engagement and a supportive SLT
- Draw hypotheses from formal or informal theory
- Refine them with qualitative research
- Test them with statistical analyses of mediation (what proximal outcomes explain distal outcomes?) and moderation (what baseline variables account for variation in impacts?) – see the sketch after this list
- Rationale: further assessment of the potential transportability of the intervention; better theory

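Those mediation and moderation tests can be expressed as simple regressions. The sketch below uses statsmodels on simulated data with hypothetical variable names; a real trial analysis would also need clustered standard errors and a formal mediation estimator (e.g. bootstrapped indirect effects), so treat this purely as a sketch of the logic.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 400

# Simulated trial: treatment -> engagement (proximal) -> attainment (distal),
# with the treatment effect moderated by baseline engagement.
df = pd.DataFrame({"treat": rng.integers(0, 2, n),
                   "baseline": rng.normal(0, 1, n)})
df["engagement"] = 0.5 * df["treat"] + rng.normal(0, 1, n)   # hypothesised mediator
df["attainment"] = (0.4 * df["engagement"]
                    - 0.3 * df["treat"] * df["baseline"]     # bigger impact where baseline is low
                    + rng.normal(0, 1, n))

# Mediation, product-of-coefficients logic:
# a = effect of treatment on the mediator; b = effect of mediator on the outcome.
a = smf.ols("engagement ~ treat", df).fit().params["treat"]
b = smf.ols("attainment ~ engagement + treat", df).fit().params["engagement"]
print(f"indirect effect via engagement (a*b) = {a * b:.3f}")

# Moderation: does the treatment effect vary with a baseline variable?
mod = smf.ols("attainment ~ treat * baseline", df).fit()
print(mod.params[["treat", "treat:baseline"]])
```

A non-zero treat:baseline coefficient would support a context-mechanism-outcome hypothesis of the kind above, e.g. stronger effects in schools with low baseline engagement.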

QUESTIONS FOR SESSION 2.A ON IMPLEMENTATION AND PROCESS EVALUATION METHODS
- How well do current IPE approaches support our understanding of impact results, particularly when there are null effects?
- How can IPE be designed to enhance impact evaluations, particularly within QED and RCT designs? What specific models or frameworks are most useful? Are there any approaches, models or frameworks that you think are underexploited?
- Thinking particularly from a design perspective, what elements or qualities make for good IPE? Conversely, what makes for poor-quality IPE?
#EEFeval18 @EducEndowFound