Moving towards evidence-based policy

Nic Spaull, Stellenbosch University & Allan Gray Orbis Foundation Endowment
nicspaull@gmail.com | www.nicspaull.com
IEFG 2018, Cape Town, 28 Feb 2018

Overview
- A few starting assumptions
- Some terminology
- Scale, evidence and context
- A cheat sheet for grant-making bodies interested in interventions and evaluations
- A framework for thinking about evidence
- How things actually work in the real world

Some starting assumptions

"The best use of private money is influencing how public money is spent."

- R351 billion: 2018 SA government expenditure on education
- R4,5bn (1,2%): Trialogue estimate of ALL education CSI in SA
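A quick sanity check on those two figures (the numbers are from the slide; the division is mine):

```latex
\frac{R\,4{,}5\,\text{bn}}{R\,351\,\text{bn}} \approx 0{,}013 = 1{,}3\%
```

This is close to the slide's 1,2%, which presumably reflects the unrounded Trialogue and budget figures. Either way, all private education CSI amounts to roughly one percent of what government spends on education in a single year.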

Another starting assumption

When we talk about evidence & evaluation we are typically talking about either (1) research, or (2) an intervention. NOT: advocacy, coalition-building, politics.

PROBLEM
- What is the problem we are trying to alleviate?
- Why has this problem remained a problem? (i.e. recognizing that the status quo is an equilibrium of forces and not just a random shitty outcome)

AIM
- What are we hoping to achieve or change with funding XYZ?

THEORY OF CHANGE
- What are ALL our assumptions about (1) the problem and (2) the way the intervention addresses the problem?

MEASUREMENT & EVALUATION
- What will we know and be able to say after the evaluation?
- How will we know if it worked? ('impact' - see the sketch below this list)
- How will we know where & when it works? ('external validity')
- How long/how much does it take for it to work? ('dosage')
- How sure will we be of the results? (evaluation methodology)
- How defensible are the results of the evaluation? ('independence')
- What instrument are we using to measure? Is it legit/validated/used elsewhere?
- How comparable are these figures to others looking at the "same" thing?

COST
- What is our cost per learner/school/teacher per year?
- What in-kind costs (particularly human expertise/networks) have we not included?
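To make 'impact' and 'how sure will we be of the results?' concrete, here is a minimal sketch (my own illustration, not from the deck) of the simplest evaluation estimate: the difference in mean outcomes between a treatment and a comparison group, with an approximate 95% confidence interval.

```python
# Minimal illustration (not from the deck): estimate 'impact' as a
# difference in mean outcomes, with an approximate 95% confidence
# interval as one crude answer to "how sure will we be of the results?".
import math

def impact_estimate(treatment: list[float], control: list[float]):
    """Difference in means with an approximate 95% CI (normal approximation)."""
    mt = sum(treatment) / len(treatment)
    mc = sum(control) / len(control)
    # Sample variances (n - 1 denominator).
    vt = sum((x - mt) ** 2 for x in treatment) / (len(treatment) - 1)
    vc = sum((x - mc) ** 2 for x in control) / (len(control) - 1)
    se = math.sqrt(vt / len(treatment) + vc / len(control))
    diff = mt - mc
    return diff, (diff - 1.96 * se, diff + 1.96 * se)

# Hypothetical test scores from treated and comparison schools.
effect, ci = impact_estimate([52, 58, 61, 55, 60], [50, 49, 54, 51, 48])
print(f"estimated impact: {effect:.1f} points (95% CI {ci[0]:.1f} to {ci[1]:.1f})")
```

In practice an evaluation would use a proper design (randomisation, clustering, covariate adjustment), but every "did it work?" claim ultimately reduces to an estimate like this plus an honest statement of its uncertainty.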

Cheat sheet: interventions

Proof of concept
- One site, multi-year, intensive, often costly per child
- Aim: putting the idea on the map
- Evidence: smorgasbord of measures
- Main challenge: does it actually work?
- FYI: heavily dependent on individual champions (one funder, one leader, one academic)

Multi-site intervention
- 5-50 sites
- Aim: to show that the 'model' works & tweak the model
- Evidence: few core measures
- Main challenge: maintaining fidelity across 5-50 sites & causality of evidence
- FYI: the one funder/founder may still be driving

Causal evidence
- 250+ sites
- Aim: the 'model' works in multiple different contexts
- Main challenge: maintaining fidelity & causality & heterogeneous effects
- FYI: HR & technical expertise becoming a binding constraint

Scale-up & government implementation
- 500+ sites
- Aim: the 'model' should become national policy
- Main challenge: maintaining fidelity when integrating into government
- FYI: politics, fidelity, external validity, unintended consequences
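Purely as an illustration of how the cheat sheet's stages line up with scale (the site bands are from the slide; the function and exact cut-offs are my own simplification, since the slide's ranges deliberately leave gaps):

```python
# Illustrative only: map number of sites to the cheat-sheet stage.
# The cut-offs are my simplification; the slide's bands (1 site,
# 5-50, 250+, 500+) deliberately leave gaps between stages.

def intervention_stage(n_sites: int) -> str:
    """Return the cheat-sheet stage suggested by an intervention's scale."""
    if n_sites <= 1:
        return "Proof of concept"
    if n_sites <= 50:
        return "Multi-site intervention"
    if n_sites < 500:
        return "Causal evidence"
    return "Scale-up & government implementation"

for n in (1, 25, 250, 1000):
    print(f"{n:>4} sites -> {intervention_stage(n)}")
```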

Context and scale

[Chart: Context (y-axis) plotted against Scale (x-axis: 5, 25, 250, 1000 sites), with regions labelled 1-4]

The Goldilocks Zone: a framework for thinking about evidence

[Chart: Rigor of the Evidence (y-axis: 1 Proof of concept, 2 Suggestive, 3 Confirming, 4 Causal, 5 Conclusive/Consensus; reflecting measurement rigor and independence) plotted against Scale (x-axis: 5, 10, 25, 75, 250, 1500, 5000+ sites). The chart marks a Goldilocks zone for evidence and a Goldilocks zone for IDEAL policy, with A = small but rigorous and B = large but weak evidence]
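A minimal sketch of the framework's logic, assuming illustrative cut-offs that the slide does not specify: policy should only follow once an intervention has both rigorous evidence and meaningful scale.

```python
# Hypothetical sketch of the Goldilocks logic: policy should follow only
# when an intervention has BOTH rigorous evidence and meaningful scale.
# The numeric cut-offs are assumptions; the slide does not specify them.

def goldilocks_zone(rigor: int, n_sites: int) -> str:
    """rigor follows the slide's y-axis: 1 = proof of concept,
    2 = suggestive, 3 = confirming, 4 = causal, 5 = conclusive."""
    if rigor >= 4 and n_sites >= 250:
        return "Goldilocks zone: ready to inform policy"
    if rigor >= 4:
        return "A: small but rigorous - now scale it"
    if n_sites >= 250:
        return "B: large but weak evidence - now evaluate it rigorously"
    return "Keep building both evidence and scale"

print(goldilocks_zone(4, 25))    # small but rigorous
print(goldilocks_zone(2, 1500))  # large but weak evidence
print(goldilocks_zone(5, 1500))  # ready to inform policy
```

Points A (small but rigorous) and B (large but weak evidence) are the two failure modes the chart warns against.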

How do things actually work?

Is evidence the binding constraint to improved policy on paper and improved implementation in practice? Probably not:

"There is massive evidence that governments do not implement many many many projects/proposals/programs that are cost effective and do spend budget on items known to be not cost effective. The model of a benign social welfare planner hampered by lack of rigorous evidence on effectiveness whose behavior an RCT will change is complete wack nonsense." (Pritchett, 2018)

- Primacy of politics: budget cycles (the Feb/March splurge in SA); election cycles (new party/president/minister → new policy)
- Role of culture, values and tradition: the exact same evidence in the US and Denmark will not have the same impact on policy (think about studies on (1) for-profit education, (2) competition/choice)
- Role of social networks: often policies are implemented or not implemented based on the Minister's WhatsApp contacts or who's in her calendar
- Role of unpredictable factors: the Minister makes a trip overseas and loves the school(s) she sees. Changes policy.

The Goldilocks Zone: a framework for thinking about evidence

Where are we on interventions about…
- Early grade reading?
- ECD?
- Teacher development?
- Teaching at the right level?

[Chart: the same Rigor of the Evidence vs Scale axes as above, with example interventions plotted at points A, B, C and D and the low-rigor, large-scale region labelled "Sloppy!"]

[Chart: Politics (y-axis) plotted against Evidence (x-axis), with points A, B and C]