Better Decision-Making with RCTs: A Guide for Financial Service Providers

Overview
- Who we are
- Experimentation with Randomized Controlled Trials
  – What is an RCT
  – Choosing metrics
  – Logistics
  – When are RCTs useful / when are they not
RCT Toolkit, November 5, 2015

Introductions
- Julia Brown, Program Manager
- Lucia Goin, Research Associate
- Nora Gregory, Research Associate
- Kim Smith, Senior Research Associate
- Jonathan Zinman (Dartmouth), Scientific Director
- Dean Karlan (Yale), Scientific Director

Where We Work
51 countries, supported by 18 country programs

US Finance Initiative
26 projects (8 RCTs, 18 pilots), 22 partner organizations

THE PROBLEM
- Limited evidence on what works best
- Limited use of available evidence
→ Ineffective policies and programs
- HiPPO thinking (the "Highest Paid Person's Opinion")
- "But we've always done it that way!"

THE SOLUTION
MORE EVIDENCE, LESS POVERTY
- Design & evaluate potential solutions to poverty problems
- Mobilize & support decision-makers to use evidence
→ Better policies & programs

What is Impact?
The impact of current practices, and of new products/services (features, marketing):
- On your bottom line
- On your clients
- On your mission

What is Impact?
[Chart: savings balances over time, with a vertical line marking the program start]

What is Impact?
Correlation is not causation: US per-capita cheese consumption tracks the number of people who died by becoming entangled in their bedsheets, yet neither causes the other.

[Chart: average income over time from program start, comparing those who received the intervention with those who did not]

The Counterfactual
The counterfactual is the state of the world that program participants would have experienced in the absence of the program (i.e., had they not participated).
Problem: the counterfactual cannot be observed.

The Counterfactual
Solution: we need to "mimic" or construct the counterfactual.

Randomized Controlled Trials
[Diagram: the study population is randomly assigned to a treatment group and a COMPARISON group]

Impact is defined as a comparison between:
1. the average outcome for people who were in the program, and
2. the average outcome for the comparison group

Average savings balance for those who received text messages: $550
− Average savings balance for those in the comparison group: $325
= Impact of text messages on savings: $225
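The slide's arithmetic is simply a difference in group means. A minimal sketch, using made-up balances chosen to reproduce the slide's numbers:

```python
# Hypothetical savings balances (invented for illustration)
treatment = [600, 450, 700, 500, 500]   # clients who received text messages
comparison = [300, 350, 325, 300, 350]  # clients who did not

def mean(values):
    return sum(values) / len(values)

# Impact = average outcome in the program group minus average in the comparison group
impact = mean(treatment) - mean(comparison)
print(f"Impact of text messages on savings: ${impact:.0f}")
```

In a real evaluation this point estimate would come with a standard error and confidence interval, not just a single number.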

Selecting Outcomes & Metrics
Outcome – what do you care about?
- Improved savings behavior
Metric – how is it measured?
- Number of savings deposits per month
- Average size of each savings deposit
- Average daily balance of the savings account

Selecting Metrics
Outputs: account openings, message receipt, web visits
Outcomes: savings, repayment, income

Selecting Data
- Administrative data: demographic data, transaction data
- Credit report data
- Survey data: overall financial health

Selecting Metrics
- Collect data that your organization will actually use
- Set thresholds for success and failure before you start

Logistics – How many people do you need?
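Behind the "how many people" question is a standard power calculation. A rough sketch of the textbook formula for a two-arm trial with equal groups (the function name and example numbers are my own, assuming a two-sided 5% significance level and 80% power):

```python
import math

def sample_size_per_arm(mde, sd, z_alpha=1.96, z_power=0.84):
    """Approximate sample size per arm to detect a difference in means.

    mde: minimum detectable effect (in the outcome's units, e.g. dollars)
    sd:  standard deviation of the outcome
    z_alpha: z-value for a two-sided 5% test; z_power: z-value for 80% power
    """
    n = 2 * ((z_alpha + z_power) ** 2) * (sd ** 2) / (mde ** 2)
    return math.ceil(n)

# e.g. to detect a $50 change in savings balances when balances have sd = $300
n = sample_size_per_arm(mde=50, sd=300)
```

The key intuition: the required sample grows with the square of sd/mde, so detecting small effects on noisy outcomes takes very large samples.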

Logistics – Randomization

Example: Super Savers CD
1) Identify target population
2) Randomize into three arms:
   a. Control
   b. Counseling
   c. Savings
3) Collect outcome data
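Step 2 can be done in a few lines of code. A sketch (client IDs, arm names, and seed are hypothetical) that shuffles the target population and deals it round-robin into the three arms, giving near-equal group sizes:

```python
import random

def randomize(client_ids, arms=("control", "counseling", "savings"), seed=42):
    """Randomly assign each client ID to one study arm, reproducibly."""
    rng = random.Random(seed)   # fixed seed makes the assignment auditable
    ids = list(client_ids)
    rng.shuffle(ids)            # put clients in random order...
    return {cid: arms[i % len(arms)] for i, cid in enumerate(ids)}  # ...then deal round-robin

assignment = randomize(range(9))
```

In practice you randomize once, save the assignment list, and never re-randomize, so every client's arm is fixed before outcome data are collected.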

When NOT to Randomize
- Sample size too small
- Limited staff capacity
- Intervention not ready to be tested
- No feasible way of randomizing
- No way of measuring outcomes
- Organization not prepared to respond to results

When Is the Right Time to Randomize?
- When you have a good sense of needs, program theory, and process
- When there is an important, specific, and testable question you want or need answered
- When you have the time, expertise, and money to do it right
- When the sample size is appropriate
- When results can be used to inform programming and policy

The Role for Researchers
- Sort through trickier RCT designs
- Outsource some of the work
- Objectivity and credibility
- Creativity and experience

A Toolkit for RCTs

Thank You!