PAI786: Urban Policy Class 2: Evaluating Social Programs.


Urban Policy: Evaluating Social Programs Class Outline ▫Positive vs. normative analysis ▫The role of program evaluation ▫Basic principles of program evaluation ▫Program evaluation and decision making

Urban Policy: Evaluating Social Programs Positive vs. Normative Analysis ▫Positive analysis: How do people behave?  How are prices and quantities determined in a particular market?  What is the impact of a particular government program on people’s behavior?  If the necessary data are available, positive statements can be tested.

Urban Policy: Evaluating Social Programs Positive vs. Normative Analysis ▫Normative analysis: What is a good outcome?  What are the appropriate objectives for government intervention in a given market?  Which objectives are the most important?  Normative statements cannot be tested, but they certainly can be debated!

Urban Policy: Evaluating Social Programs The Role of Program Evaluation ▫Government programs change behavior. ▫You cannot determine whether the outcome of a government program meets your own objectives without determining how it changes behavior. ▫Program evaluation is necessary to identify the programs that best meet your own objectives!

Urban Policy: Evaluating Social Programs What is the basic problem facing someone wanting to evaluate any public program? ▫What you want to know is how outcomes in one place differ with and without the program. ▫What you observe is either (a) what the world is like before and after the program or (b) what one place is like with the program and another without it. ▫Thus, you cannot be sure that the effects you observe are not due to non-program differences over time or across places.

Urban Policy: Evaluating Social Programs What is the basic problem facing someone wanting to evaluate any public program? ▫Another way to put this is that the great challenge of any program evaluation is to identify the “counterfactual”, that is, to identify what would have happened if the program had not been implemented. ▫The counterfactual cannot be observed directly, so all evaluation methods are attempts to estimate the counterfactual in an unbiased way.
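The counterfactual idea can be made concrete with a small simulated sketch (all numbers here are hypothetical, chosen only for illustration). In the simulation we know both potential outcomes for every person, something no real evaluator ever observes:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical potential outcomes for each person:
# y0 = outcome without the program, y1 = outcome with it.
y0 = rng.normal(50, 10, n)
y1 = y0 + 5                      # assume a true program effect of +5

# In reality, each person is observed in only one of the two states.
treated = rng.random(n) < 0.5
observed = np.where(treated, y1, y0)

# The true average effect uses both potential outcomes...
true_effect = (y1 - y0).mean()   # exactly 5 by construction
# ...while an evaluator must estimate it from observed data alone.
estimate = observed[treated].mean() - observed[~treated].mean()
print(f"true: {true_effect:.2f}  estimated: {estimate:.2f}")
```

Because treatment in this sketch is assigned at random, the estimate lands close to the true effect; the later slides are about what goes wrong when it is not.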

Urban Policy: Evaluating Social Programs Example (from Hollister) ▫Consider a program that provides training and counseling to improve participants’ employability. ▫Suppose a high share of previously unemployed participants become employed after leaving the program. ▫Does this evidence indicate that the program is effective?

Urban Policy: Evaluating Social Programs Example (continued) ▫Answer: No! ▫People tend to move from unemployment to employment over time, and programs tend to select people who are unemployed.  So the increase in employment may reflect the natural process of moving to employment, not program impact.  This is called regression to the mean. ▫Local labor market conditions might have improved at the time the program was implemented.  So the increase in employment might reflect factors other than the impact of the program.  This is called omitted variable bias.
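A hypothetical simulation makes the first point vivid: even a program with zero effect looks impressive if it enrolls only the currently unemployed, because many of them would have found jobs anyway. The 40% job-finding rate below is made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# The program enrolls only people who are unemployed at entry,
# so measured employment among participants starts at 0%.
# Assume 40% of unemployed people find jobs on their own each period.
finds_job_naturally = rng.random(n) < 0.40

employed_after = finds_job_naturally   # the program itself does nothing

print(f"participant employment: 0% before, {employed_after.mean():.0%} after")
```

The 40-point "gain" is entirely regression to the mean plus selection; a control group of similar unemployed non-participants would show the same rise.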

Urban Policy: Evaluating Social Programs The two ways to estimate program impacts are ▫random assignment ▫statistical control. ▫Random assignment ensures that differences across time and place are not correlated with program participation. ▫Statistical controls can account for observable differences across place or time and for certain kinds of unobservable factors.
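The contrast between the two approaches can be sketched in a stylized simulation (every coefficient below is invented for illustration): self-selection biases a raw comparison, random assignment removes the bias, and a regression control recovers the effect when the confounder is observed:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000

# Hypothetical setting: more-educated people earn more AND are more
# likely to enroll, so a raw comparison confounds the two.
education = rng.normal(12, 2, n)
true_effect = 2.0

# Self-selected participation, correlated with education:
enroll_prob = 1 / (1 + np.exp(-(education - 12)))
self_selected = rng.random(n) < enroll_prob
earnings_ss = 10 + 1.5 * education + true_effect * self_selected + rng.normal(0, 3, n)
naive = earnings_ss[self_selected].mean() - earnings_ss[~self_selected].mean()

# Random assignment breaks the education-treatment link:
randomized = rng.random(n) < 0.5
earnings_ra = 10 + 1.5 * education + true_effect * randomized + rng.normal(0, 3, n)
experimental = earnings_ra[randomized].mean() - earnings_ra[~randomized].mean()

# Statistical control: regress earnings on treatment and education.
X = np.column_stack([np.ones(n), self_selected, education])
beta, *_ = np.linalg.lstsq(X, earnings_ss, rcond=None)
print(f"naive: {naive:.2f}  experimental: {experimental:.2f}  adjusted: {beta[1]:.2f}")
```

Note that the regression adjustment works here only because the single confounder (education) is observed; an unobserved confounder would leave the adjusted estimate biased too.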

Urban Policy: Evaluating Social Programs Random assignment is the preferred method in most cases. ▫It provides results that are intuitively compelling and scientifically sound. ▫If you want to know a program’s impacts, become an advocate for evaluation using random assignment!

Urban Policy: Evaluating Social Programs Random assignment can be applied at many different scales. ▫Some evaluations randomly assign treatment to individuals. ▫Others randomly assign treatment to organizations (firms, schools, etc.) ▫Still others randomly assign treatment to communities.
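A minimal sketch of what randomizing at a larger scale means in practice (the school counts and sizes are hypothetical): when whole schools are assigned, every student inherits their school's treatment status:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical cluster randomization: a coin flip per school,
# not per student, decides treatment status.
n_schools, students_per_school = 40, 25
school_treated = rng.random(n_schools) < 0.5
student_school = np.repeat(np.arange(n_schools), students_per_school)
student_treated = school_treated[student_school]

# All students within a school share one status.
first_school = student_treated[student_school == 0]
print(first_school.all() or not first_school.any())  # prints True
```

The analysis must then account for clustering, since students in the same school are not independent observations.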

Urban Policy: Evaluating Social Programs Random assignment has been used to study: ▫Welfare-to-work programs ▫Unemployment insurance ▫Job training ▫Income maintenance ▫Housing assistance ▫Electricity pricing ▫Education ▫Early childhood development ▫Criminal justice policy ▫Child health and nutrition

Urban Policy: Evaluating Social Programs Random assignment is not always feasible. A huge literature indicates that the best statistical studies: ▫must have extensive data to ensure that differences aren’t due to unobservable factors. ▫must have comparable experimental and control groups based on observable factors.

Urban Policy: Evaluating Social Programs Comparable Control Groups ▫Recent advances in evaluation methodology show that comparable control groups are essential, because the impact of a program may depend on the traits of the recipients. ▫This leads to matching techniques, which focus treatment-control comparisons on groups that are comparable on all observable traits. ▫Matching cannot solve the often-encountered problem that treatment and control groups may differ on unobserved traits, a problem that arises in many statistical studies.
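A sketch of nearest-neighbor matching on a single observable trait (all data below are simulated): each treated unit is compared with the control unit whose baseline score is closest, which removes the bias from the baseline gap between the groups:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical data: treated people have higher baseline scores,
# so a raw comparison mixes the program effect with that gap.
n_t, n_c = 200, 1000
x_t = rng.normal(60, 8, n_t)                  # treated baseline scores
x_c = rng.normal(50, 8, n_c)                  # control baseline scores
y_t = 0.8 * x_t + 5 + rng.normal(0, 4, n_t)   # true effect = +5
y_c = 0.8 * x_c + rng.normal(0, 4, n_c)

raw = y_t.mean() - y_c.mean()

# Nearest-neighbor matching on the observable baseline score:
matches = np.abs(x_t[:, None] - x_c[None, :]).argmin(axis=1)
matched = (y_t - y_c[matches]).mean()
print(f"raw difference: {raw:.1f}   matched estimate: {matched:.1f}")
```

Matching on the baseline score works here because the score is observed; as the slide notes, no amount of matching fixes differences on traits that were never measured.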

Urban Policy: Evaluating Social Programs For the case of community economic development programs, Hollister discusses several evaluations that do not use random assignment. ▫This discussion gives you a sense of what to look for in statistical studies. ▫You may want to return to it when we discuss community economic development!

Urban Policy: Evaluating Social Programs Formal evaluations of programs or management reforms are often not available. Thus, it is appropriate for you (when you become public officials!) to use your own judgment: ▫to select programs and reforms that appear to have worked in other places ▫to design new programs and reforms

Urban Policy: Evaluating Social Programs Evaluations of intermediate results can also be helpful. Here is the logic-model figure in Hollister: Your Planned Work: (1) Resources/Inputs → (2) Activities; Your Intended Results: (3) Outputs → (4) Outcomes → (5) Impact

Urban Policy: Evaluating Social Programs But evaluation should always be in the back of your mind. ▫Search for evaluations of the programs or reforms you are interested in. ▫Make an honest judgment about the quality of existing evaluations. ▫Informally apply basic evaluation principles to programs and reforms you are considering. ▫Implement formal evaluations whenever possible!

Urban Policy: Evaluating Social Programs Informal Evaluations ▫Informal evaluations can be very helpful. ▫Learn about the market in which the program will operate, that is, the economic and social factors that influence the behavior of market participants. ▫Think about how various government programs change the incentives of people in this market. ▫Use your understanding from other cases to make an educated guess about the impact of these changes in incentives on behavior, and hence on your objectives.

Urban Policy: Evaluating Social Programs The Punchline ▫You may undermine your own objectives if you don’t take program evaluation seriously. ▫Look for (and advocate!) high-quality program evaluation studies. ▫When these studies are not available, make your best judgment about the relevant positive analysis, drawing on the best evidence you can find.