Who is the subject in a CRT (cluster randomized trial)?
Rachel Glennerster, Executive Director, J-PAL
Department of Economics, MIT
PRIM&R, November 8, 2013

Overview
Example: an education program in India
Whose consent do we need?
What is research and what is practice?
Some suggested criteria

Improving learning in India
Pratham is a large Indian education NGO
– "Every child in school… and learning well"
Successful urban program; new program developed for rural areas
Developed a tool to test learning; community members generated village scorecards
Facilitated village meetings where information on ways to improve education was shared (e.g., about the Village Education Committee, VEC)
Trained volunteers to run after-school reading camps

Research design
Randomized at the village level:
– 65 villages received information on how to improve education
– 65 received information and scorecards
– 65 received information, scorecards, and reading camps
– 85 served as the comparison group
Data collected on:
– Children's learning levels
– Teacher absenteeism
– Parents' preferences and actions in promoting education
– Village Education Committee members' knowledge and actions
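To make the cluster-level assignment concrete, here is a minimal sketch in Python of randomizing 280 villages into the four arms above (65/65/65/85). The village identifiers, the fixed seed, and plain shuffling without stratification are illustrative assumptions; the study's actual randomization procedure is not described in these slides.

```python
import random

# Illustrative sketch of village-level (cluster) randomization into the
# four arms described above. Village IDs, the seed, and plain shuffling
# (no stratification) are assumptions for illustration only.

ARMS = [
    ("information only", 65),
    ("information + scorecards", 65),
    ("information + scorecards + reading camps", 65),
    ("comparison", 85),
]

def assign_villages(village_ids, arms, seed=2013):
    """Shuffle village IDs and slice them into arms of the given sizes."""
    total = sum(size for _, size in arms)
    if len(village_ids) != total:
        raise ValueError(f"expected {total} villages, got {len(village_ids)}")
    rng = random.Random(seed)
    shuffled = list(village_ids)
    rng.shuffle(shuffled)
    assignment, start = {}, 0
    for arm, size in arms:
        for vid in shuffled[start:start + size]:
            assignment[vid] = arm
        start += size
    return assignment

villages = [f"village_{i:03d}" for i in range(280)]  # 3 * 65 + 85 = 280
arm_of = assign_villages(villages, ARMS)
print(sum(arm == "comparison" for arm in arm_of.values()))  # 85
```

Randomizing at the village level means everyone in a treated village is exposed to the intervention, which is exactly why the question of who counts as a subject is harder in a CRT than in an individually randomized trial.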

Who is the subject of the research?
All households? Children? Teachers? Village leaders? Those interviewed?
All are impacted by the intervention, but are they all impacted by the research?
– Indirect effects of any action can be far-reaching
Informed consent for what?
– To be interviewed?
– To have data collected on them? (e.g., teacher absenteeism)
– To allow the intervention to go ahead? (i.e., veto power)

What is research and what is practice?
Practice: Pratham is regulated as an NGO in India
– It has the right and ability to implement its program without the informed consent of everyone in the village
– e.g., it can provide information about villagers' rights without teacher or village-leader consent
– Pratham worked closely with researchers to design the program (drawing on their knowledge of what works); does that make it research?
Research: systematic study leading to general lessons
– Bad studies aren't regulated while good ones are?
– Changes made to implementation in order to evaluate (here: none)
– Data collection, storage, and analysis (informed consent needed)

Possible criteria for regulating CRTs
Would the intervention have happened anyway? What change is due to the evaluation?
– Subjects are those impacted by changes due to the evaluation
Is participation in the intervention voluntary?
– Most programs are voluntary, which acts as a form of informed consent
– Involuntary programs warrant more careful assessment
Some deference to local standards and regulations
– e.g., a community's right to collect information on absent teachers
Level of risk and benefit
– e.g., the risk of children being beaten vs. the benefit of improving education
– But there is a benefit to studying a potentially risky program if it is going ahead anyway
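Read together, these criteria amount to a rough triage rule for reviewers. Purely as an illustration (the field names and the simple flag-based rule are assumptions, not part of the talk), the checklist could be encoded as:

```python
from dataclasses import dataclass

# Illustrative encoding of the suggested criteria as a review checklist.
# The fields and the simple decision rule are assumptions; real review
# weighs these considerations holistically rather than mechanically.

@dataclass
class CRTReview:
    intervention_would_happen_anyway: bool  # or is the change due to the evaluation?
    participation_voluntary: bool           # voluntary uptake as a form of consent
    meets_local_standards: bool             # deference to local regulations
    risk_outweighs_benefit: bool            # level of risk vs. benefit

def needs_fuller_review(r: CRTReview) -> bool:
    """Flag a CRT for closer scrutiny when the evaluation itself adds
    involuntary exposure or net risk beyond the program as such."""
    return (not r.intervention_would_happen_anyway
            or not r.participation_voluntary
            or not r.meets_local_standards
            or r.risk_outweighs_benefit)

# The Pratham example, under the talk's framing: the program was going
# ahead anyway and participation was voluntary, though data collection
# (the research component) still requires consent from those interviewed.
print(needs_fuller_review(CRTReview(True, True, True, False)))  # False
```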