Conducting Efficacy Trials


Conducting Efficacy Trials
Dr Sarah Miller | EEF Evaluators' Conference | June 2016
www.qub.ac.uk/cesi

Characteristics of an efficacy trial
An efficacy trial tests an intervention under 'ideal' conditions, often implemented with input from the programme developer. It answers the question 'can it work?'
- Usually an earlier pilot trial has already indicated that the intervention is feasible to implement in a classroom setting (there may also be some preliminary evidence of effects).
- In fact, a good pilot trial is key to a good efficacy trial.

Theory of change
In an efficacy trial:
- There is a greater focus on understanding the theory of change and the underlying causal mechanism(s).
- The intervention should be underpinned by a conceptually feasible and evidence-informed theory of change.
- Consequently, relevant outcomes are selected on the basis of this logic model (a sketch of what this can look like in practice follows below).
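As a rough illustration of that last point, a logic model can be written down explicitly and the outcome measures read off from it. The sketch below is hypothetical: the intervention, mechanisms, and measures are all invented for illustration, not taken from the talk.

```python
# Hypothetical logic model for an invented reading intervention,
# encoded as inputs -> mechanisms -> outcomes with candidate measures.
logic_model = {
    "inputs": ["teacher training", "weekly small-group sessions"],
    "mechanisms": {
        "improved decoding skills": "phonics assessment",
        "increased reading motivation": "pupil attitude survey",
    },
    "primary_outcome": ("reading comprehension", "standardised reading test"),
}

# The outcomes to measure follow directly from the model rather than
# being chosen after the fact.
outcome, measure = logic_model["primary_outcome"]
print(f"Primary outcome: {outcome} -> measured by {measure}")
for mechanism, instrument in logic_model["mechanisms"].items():
    print(f"Intermediate outcome: {mechanism} -> measured by {instrument}")
```

Writing the model down in this form makes it harder to add or drop outcome measures without also revising the claimed causal mechanism.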

Internal validity
An efficacy trial will have high internal validity (at this stage external validity is less crucial), that is:
- Sufficient power to detect the estimated effect (see the sketch after this list).
- Robust (and tested) allocation procedures are employed.
- Valid and reliable instruments are used to measure outcomes (as you would plan to in the effectiveness trial).
- Data collection methods are independent, tried and tested.
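A minimal sketch of the kind of power check the first bullet implies, assuming a two-arm, school-randomised trial. The effect size (0.2), intra-cluster correlation (0.15), and cluster size (25) are illustrative assumptions, not values from the talk.

```python
# Illustrative power check for a two-arm cluster-randomised trial.
# Requires statsmodels (pip install statsmodels).
from statsmodels.stats.power import TTestIndPower

effect_size = 0.20        # assumed minimum detectable effect (Cohen's d)
alpha = 0.05              # two-sided significance level
power = 0.80              # target power
icc = 0.15                # assumed intra-cluster correlation within schools
pupils_per_school = 25    # assumed pupils tested per school

# Pupils per arm if pupils were randomised individually
n_individual = TTestIndPower().solve_power(
    effect_size=effect_size, alpha=alpha, power=power
)

# Inflate by the design effect, 1 + (m - 1) * ICC, to account for
# the clustering of pupils within randomised schools.
design_effect = 1 + (pupils_per_school - 1) * icc
n_per_arm = n_individual * design_effect
schools_per_arm = n_per_arm / pupils_per_school

print(f"Pupils per arm (individual randomisation): {n_individual:.0f}")
print(f"Design effect: {design_effect:.2f}")
print(f"Pupils per arm (school randomisation): {n_per_arm:.0f}")
print(f"Schools per arm: {schools_per_arm:.1f}")
```

The design effect grows quickly with cluster size and ICC, which is why school-randomised trials typically need far more pupils than individually randomised ones to detect the same effect.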

A sufficiently in-depth process evaluation
The process evaluation should be sufficiently in-depth to allow you to:
- Qualitatively test the proposed theory of change.
- Ensure that all key agents are included in the process evaluation, using a range of methods where possible, e.g. teacher surveys, parent interviews, pupil focus groups.
- Identify issues related to scaling up the intervention, e.g. any challenges associated with implementation and delivery.
- Understand and monitor inevitable adaptation in practice.

A sufficiently in-depth process evaluation
Reflect and learn lessons where possible, e.g.:
- Attrition (at school and pupil level): avoid it where possible, but aim to fully understand it if and when it happens so that it can be avoided or minimised in an effectiveness trial (see the sketch below).
- Efficacy trials also provide an opportunity to better understand recruitment and retention: why schools are not keen to participate, or why they drop out post-randomisation.
- Regular and efficient communication between developers and evaluators is important.
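As a simple illustration of monitoring attrition at both levels, here is a sketch with entirely hypothetical counts; in practice these would come from the trial's tracking records, and differential attrition between arms would be a particular threat to internal validity worth flagging.

```python
# Hypothetical counts for monitoring attrition at school and pupil level.
randomised = {"intervention": {"schools": 40, "pupils": 1000},
              "control":      {"schools": 40, "pupils": 1000}}
analysed   = {"intervention": {"schools": 36, "pupils": 810},
              "control":      {"schools": 38, "pupils": 905}}

for arm in randomised:
    for level in ("schools", "pupils"):
        lost = randomised[arm][level] - analysed[arm][level]
        rate = lost / randomised[arm][level]
        print(f"{arm:>12} {level}: {lost} lost ({rate:.1%})")
```

Tracking these rates by arm throughout the trial, rather than only at analysis, makes it possible to investigate the reasons for drop-out while the affected schools can still be contacted.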