Role of Training in Program Evaluation: Evidence from the Peace Corps Projects
Shahid Umar, Ph.D. Candidate
Rockefeller College of Public Affairs, University at Albany, State University of New York

Outline
– Background
– Research questions
– Method of study
– Results
– Recommendations
– Conclusion

Background
– Most evaluation capacity building (ECB) efforts include training as an ECB strategy.
– Using case study methods, a large number of studies have concluded that training was successful as an ECB strategy.

Background
– The effectiveness of training as an ECB strategy has not been assessed using data from multiple sites (projects) in an international context.
– The present study fills this gap using data from 184 Peace Corps projects operating in 72 countries (posts).

Background: The Peace Corps (PC)
– A U.S. government agency.
– Unlike other international development agencies, it sends volunteers to developing countries.
– The PC provides a few weeks of training to both volunteers and staff.
– Training includes a monitoring and evaluation component, which takes no more than a couple of days.

Research questions
– To what extent does training induce the collection of baseline data and the use of these data for evaluation purposes?
– Do the Peace Corps posts have sufficient capacity to undertake impact evaluation?

Method of study
– The data come from the 2012 Project Status Report (PSR) survey of the Peace Corps.
– Open-ended questions were asked about:
  – Baseline data collection
  – Baseline data use
  – Data collection tools
  – Baseline data storage
  – Training
  – Support needed from headquarters
  – Other comments

Method of study
– Overall, qualitative responses were coded to generate frequency and proportion statistics.
– Comments were analyzed qualitatively and incorporated into the results.
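To make the coding and tabulation step concrete, here is a minimal sketch, using hypothetical coded responses rather than the actual PSR data, of how yes/no codes translate into the frequency and proportion statistics reported in the next slide:

```python
from collections import Counter

# Hypothetical example: each project's open-ended answer about baseline data
# collection has been coded as "yes" or "no" (counts mirror the reported results).
coded_responses = ["yes"] * 157 + ["no"] * 27  # N = 184 projects

counts = Counter(coded_responses)
n = len(coded_responses)
for answer in ("yes", "no"):
    # Proportion followed by the raw frequency, as in the results table below.
    print(f"{answer}: {counts[answer] / n:.2f} ({counts[answer]})")
# yes: 0.85 (157)
# no: 0.15 (27)
```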

Results: Univariate (N=184; unit of analysis: project)

Variable                   Yes          No
Baseline data collected    0.85 (157)   0.15 (27)
Data collection tools      0.81 (149)   0.19 (35)
Baseline data stored       0.76 (139)   0.24 (45)
Baseline data used         0.75 (138)   0.25 (46)
Training given             0.72 (132)   0.28 (52)
HQ support needed          0.81 (149)   0.19 (35)

Probability of using or not using baseline data, by whether training was given (N=184; unit of analysis: project)

Training given:      baseline data used (p=.92), baseline data not used (p=.08)
Training not given:  baseline data used (p=.31), baseline data not used (p=.69)

Conditional probability of using baseline data, by whether training was given (N=184; unit of analysis: project)

Training given:
  Baseline data collected (p=.99) → used (p=.93), not used (p=.07)
  Baseline data not collected (p=.01)
Training not given:
  Baseline data collected (p=.50) → used (p=.62), not used (p=.38)
  Baseline data not collected (p=.50)
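As a consistency check on the two preceding slides, the branch probabilities in this tree multiply back to the marginal use rates reported above, since P(used | training) = P(collected | training) × P(used | collected, training). A minimal arithmetic sketch:

```python
# Marginal use rate = P(collected | training status) * P(used | collected, training status)
p_used_with_training = 0.99 * 0.93       # ≈ 0.92, matching the previous slide
p_used_without_training = 0.50 * 0.62    # = 0.31, matching the previous slide

print(round(p_used_with_training, 2))     # 0.92
print(round(p_used_without_training, 2))  # 0.31
```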

Results
– Projects receiving training are more likely to collect and use baseline data.
– As expected, there is a strong positive association between staff training (as an ECB strategy) and professional evaluation practices.

Does the Peace Corps have evaluation capacity (EC)? (N=184; unit of analysis: project)

Variable                   No
Baseline data collected    0.15 (27)
Data collection tools      0.19 (35)
Baseline data stored       0.24 (45)
Baseline data used         0.25 (46)
Training                   0.28 (52)

Given these numbers, the Peace Corps does not seem to have reasonable EC and is not ready to undertake impact evaluation.

Recommendations for posts (country level)
– Training should be used for ECB alongside other strategies such as technical support and ICTs.
– Improvement efforts should be directed at the post (country) level, not the project level.
– Greater monitoring and evaluation expertise should be deployed to the PC posts.

Recommendations for the PC headquarters
– Provide leadership support.
– Best practices, already present at some posts, should be shared more widely.
– Standardized practices should be promoted across all posts (e.g., data collection tools, data storage).
– Build a centralized data management system:
  – Data should be collected using web surveys.
  – Quantitative responses should be sought in surveys.

Conclusion
– There is some good news for the Peace Corps: 85% of projects collect baseline data and 75% use these data.
– There is also a need for more substantial efforts to prepare for and undertake impact evaluation.
– The study provides empirical evidence that training is necessary but perhaps not sufficient for ECB.