III. Practical Considerations in preparing a CIE

Planning ahead
- Evaluation plan (all evaluations): evaluation process, activities, timing, budget
- Evaluation scheme (CIE): detailed plan, tailored to the specific circumstances
- Which questions do I need to answer for a CIE evaluation scheme?

1. Selecting interventions
- Strategic issues: justify funds, reform process, learning
- 7 factors related to the intervention(s): amenable to CIE?
- Data: types of data, data protection, data availability

Amenable to CIE
- Discrete, distinctive and homogeneous interventions?
- Identifiable causal mechanism?
- Interference with other instruments?
- Clear target group?
- Sufficient size of treatment group?
- Meaningful control group?
- Anticipation of system-wide changes?

Data availability
- Types of data: treatment and control group records, result records, contextual data/control variables
- Data availability: microdata available? National data on individual careers to compare ESF participants with a control group?
- Data protection: anonymized data; possibility to de-anonymize data to follow individual careers?
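The data-protection point can be made concrete with a short sketch. This is a hypothetical illustration (the key and identifier format are invented, not part of the slides): a keyed hash gives each participant a stable pseudonym, so monitoring records and career data can be linked without exposing identities, while only the key holder retains the possibility to re-identify individuals.

```python
import hashlib
import hmac

# Hypothetical secret key; in practice held only by the body authorized
# to re-identify individuals (e.g. the statistical data provider).
SECRET_KEY = b"example-key-do-not-use-in-production"

def pseudonym(person_id: str) -> str:
    """Return a stable pseudonym for a person identifier.

    The same input always maps to the same pseudonym, so datasets can be
    merged on it; without the key the mapping cannot be reversed.
    """
    return hmac.new(SECRET_KEY, person_id.encode(), hashlib.sha256).hexdigest()[:16]

print(pseudonym("ESF-0001") == pseudonym("ESF-0001"))  # stable linkage
print(pseudonym("ESF-0001") == pseudonym("ESF-0002"))  # distinct persons differ
```

A keyed hash (rather than a plain hash) is what preserves the "possibility to de-anonymize": whoever holds the key can reproduce the mapping on demand.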

2. Evaluation scheme
- A. Description: objectives, strategy, intervention logic
- B. Resources
- C. Timing
- D. Identify target group
- E. Identify control group
- F. Data issues
- G. Reporting and dissemination

A. Description
- Objectives: purpose, stakeholders, use of the results, specific questions
- Logic framework: input and activities, output, results, impacts
- Questions: which results count as success? Qualitative results? Relevant subgroups? Contextual factors? When do impacts materialize?
- Identify the results that require measurement, link them to data sources, and promote understanding of how the intervention operates

B: Human resources
Knowledge and collaboration:
- Staff in MA/IB: CIE methods, data collection and management, guiding external contractors
- External contractors: understanding of contexts, familiarity with data sources, data handling and statistical methods
- Statistical data providers

B: Financial resources
- Budget for CIE: internal costs, external costs, relation to other evaluations
- Costs depend on: novelty of the intervention, data collection/handling

C: Timing
Crucial dates and constraints:
- When do I need my results?
- When do I get reasonable results?
- How does CIE fit with other evaluation work?
- When are data available and ready to be used?

C: When to evaluate?
[Timeline diagram: during training, no employment due to the training itself; afterwards job search, then a new job, higher productivity, higher wage; effects materialize over time]

D: Identify the "treated group"
[Diagram: target group vs. treatment]
- Anticipating treatment: drop out of the target group
- Do not participate in treatment
- Drop-outs of treatment
- Participants
Ask WHY for each of these groups; "intention to treat" refers to the whole target group.

D: Identify the treated group
Data sources (persons/enterprises):
- Enumerated or sample?
- ESF monitoring or survey?
- Which characteristics are provided?
- Is the individual entity identifiable?
- Is consent required by the individual entity?
- Possibility to follow up with a survey?

E: Identify the control group
Analytical aim: obtain unbiased estimates in a non-randomized design
- Control units excluded from participation for reasons unrelated to results
- Matching through: location, time, eligibility, choice/awareness
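As a toy illustration of matching on the dimensions listed above (all field names and records are hypothetical), one can stratify exactly on location and time, then take the nearest neighbour on a continuous covariate within each stratum:

```python
def match_controls(treated, pool, exact=("region", "year"), nearest="age"):
    """Nearest-neighbour matching within exact strata, without replacement.

    For each treated unit, keep only pool units that agree on all `exact`
    keys, then pick the one closest on the `nearest` covariate.
    """
    used = set()
    matches = {}
    for t in treated:
        candidates = [c for c in pool
                      if c["id"] not in used
                      and all(c[k] == t[k] for k in exact)]
        if candidates:
            best = min(candidates, key=lambda c: abs(c[nearest] - t[nearest]))
            used.add(best["id"])
            matches[t["id"]] = best["id"]
    return matches

treated = [{"id": "T1", "region": "North", "year": 2013, "age": 34}]
pool = [
    {"id": "C1", "region": "North", "year": 2013, "age": 52},
    {"id": "C2", "region": "North", "year": 2013, "age": 31},
    {"id": "C3", "region": "South", "year": 2013, "age": 34},
]
print(match_controls(treated, pool))  # {'T1': 'C2'}
```

Real evaluations typically use propensity-score matching over many covariates; the sketch only shows the principle that controls must be comparable on the listed dimensions.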

E: Identify the control group
Policy-related: What is the alternative to the intervention?
- No treatment? Other treatment?
- Can I find an appropriate definition of net effects?
Practical considerations: data availability

F: Data requirements
Collect, collate, and document micro-data for an analytical data set covering:
- Counterfactual design: eligible target population, individual-level identifier, contextual data; collect baseline data; random allocation into treatment group and control group (treatment and control group records)
- Intervention to be evaluated vs. treatment as usual or no treatment
- Counterfactual results: result records for treated and control group

F: Constraints in data issues
- Is the sample size big enough? Consider the minimum detectable effect size at different sample sizes.
- Other constraints: independence and objectivity in measuring outcomes? Sufficient information on the control group to explain differences in results?
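The minimum detectable effect has a closed form for a two-group comparison of means. A minimal sketch using only the standard library, assuming the conventional 5% significance level and 80% power:

```python
from statistics import NormalDist

def mde(n_treat, n_control, sd, alpha=0.05, power=0.80):
    """Minimum detectable effect for a two-sample difference in means.

    MDE = (z_{1-alpha/2} + z_{power}) * sd * sqrt(1/n_t + 1/n_c)
    """
    z = NormalDist().inv_cdf
    return (z(1 - alpha / 2) + z(power)) * sd * (1 / n_treat + 1 / n_control) ** 0.5

# Quadrupling the sample size halves the detectable effect:
for n in (200, 800, 3200):
    print(n, round(mde(n, n, sd=1.0), 3))
```

Running this table before committing to a design shows whether the planned treatment group is large enough to detect an effect of the size the intervention can plausibly produce.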

F: Checklist for data issues
- Which sources? Who will check potential sources?
- Are sources consistent? Can data be merged?
- How to access them? Legal barriers?
- Can individuals be identified? How will data be stored and transferred safely?
- Which IT infrastructure is required?
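"Can data be merged?" can be checked mechanically once a shared identifier exists. A stdlib-only sketch (record layout and field names are hypothetical) that joins monitoring records to result records and reports what fails to match, as a basic consistency check between sources:

```python
def merge_records(monitoring, results, key="pid"):
    """Join monitoring records with result records on `key`.

    Returns the merged records plus the identifiers that could not be
    matched in the results source.
    """
    results_by_key = {r[key]: r for r in results}
    merged, unmatched = [], []
    for rec in monitoring:
        match = results_by_key.get(rec[key])
        if match is None:
            unmatched.append(rec[key])
        else:
            merged.append({**rec, **match})
    return merged, unmatched

monitoring = [{"pid": "A", "treated": True}, {"pid": "B", "treated": False}]
results = [{"pid": "A", "employed": True}]
merged, unmatched = merge_records(monitoring, results)
print(merged)     # merged record for 'A'
print(unmatched)  # ['B']
```

A high unmatched rate is an early warning that the sources are inconsistent, or that the identifier is not as shared as assumed.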

G: Reporting and presentation
- Written evaluation report(s)
- Verbal presentation
- Technical report
All evaluations need to be made public!

IV. Windup
Thinking beyond individual CIEs: there is more to evaluation than finding the "credible counterfactual".
- Infrastructure is critical
- Collaboration is essential