A methodology for evaluating education and training activities: A CASE STUDY IN ETHIOPIA, 20th October 2015

2 HOW CAN WE EVALUATE EDUCATION AND TRAINING ACTIVITIES?
- Have capacities of government and NGOs increased?
- Has health and well-being improved?
- Has water and sanitation improved?

3 OBJECTIVES AND STUDY PHASES
OBJECTIVE: To develop a methodology for evaluating the effectiveness of WASH training programs, and to test and refine this methodology in a field setting.
STUDY PHASES:
1. Review methods for measuring and reporting results of education and training in WASH
2. Investigate metrics for evaluating education and training
3. Develop an evaluation methodology for education and training, and pilot it in Nepal and Peru
4. Modify the methodology, and pilot the new methodology in Ethiopia

4 EVALUATION METHODOLOGY
Steps in the methodology:
1. Develop a theory of change for the education program
2. Develop indicators based on the Kirkpatrick model
3. Develop data collection tools
4. Collect data
5. Analyze and interpret data
6. Feed results back to the planning process
This methodology was piloted in an evaluation of a WASH training program for health workers in Ethiopia.

5 STEP 1: THEORY OF CHANGE FOR THE EDUCATION PROGRAM
2-day WASH awareness training delivered to health workers → health workers learn knowledge, attitudes and skills → health workers are more effective at spreading WASH messages in communities → community members learn about WASH options → community members practice better WASH

6 STEP 2: INDICATORS BASED ON THE KIRKPATRICK MODEL
Kirkpatrick stage and examples of indicators:
- Reaction: how relevant the training was; how useful the tools used in the training were
- Learning: key knowledge, skills and attitudes related to WASH that Health Extension Workers (HEWs) had learned
- Behaviour: changes to the teaching methods of HEWs in communities as a result of the workshop
- Results: changes in the community WASH situation where the trained HEWs are working
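The level-to-indicator mapping above can be organized as data so that interview questions are generated per Kirkpatrick level. This is a minimal illustrative sketch; the question wording and names (`INDICATORS`, `interview_protocol`) are hypothetical, not taken from the study's instruments.

```python
# Hypothetical sketch: Kirkpatrick levels mapped to example indicator
# questions, flattened into an ordered interview protocol.
INDICATORS = {
    "Reaction": [
        "How relevant was the training?",
        "How useful were the tools used in the training?",
    ],
    "Learning": [
        "What key WASH knowledge, skills and attitudes were learned?",
    ],
    "Behaviour": [
        "How have your teaching methods in communities changed since the workshop?",
    ],
    "Results": [
        "How has the community WASH situation changed?",
    ],
}

def interview_protocol(indicators):
    """Flatten the level-to-indicator map into an ordered question list."""
    return [(level, q) for level, qs in indicators.items() for q in qs]

for level, question in interview_protocol(INDICATORS):
    print(f"[{level}] {question}")
```

Keeping the indicators in one structure makes it easy to check that every Kirkpatrick level is covered before fieldwork begins.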

7 STEPS 3–6
3. Develop data collection tools
- Interview protocols: health workers, community members
4. Collect data
- 40 interviews in total, over 1 week
5. Analyze and interpret data
- Created graphs, developed findings and recommendations
6. Feed results back to the planning process
- Used the findings to make improvements to the education program
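The analysis step above boils down to tallying coded interview responses by Kirkpatrick level before graphing. A minimal sketch, assuming a simple (level, code) representation; the response codes here are invented for illustration, not actual study data.

```python
from collections import Counter

# Hypothetical sketch: count coded interview responses per Kirkpatrick
# level to produce the descriptive totals behind the graphs.
coded_responses = [
    ("Reaction", "workshop met expectations"),
    ("Reaction", "training tools were effective"),
    ("Learning", "knows critical handwashing times"),
    ("Learning", "unsure about HWTS topics"),
    ("Behaviour", "uses new teaching methods"),
]

counts_by_level = Counter(level for level, _ in coded_responses)
for level in ("Reaction", "Learning", "Behaviour", "Results"):
    print(f"{level}: {counts_by_level.get(level, 0)} coded responses")
```

Levels with no coded responses (here, "Results") show up as zero, flagging gaps in the data before interpretation starts.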

8 FINDINGS AND RECOMMENDATIONS
- Reaction: the workshop met expectations and was relevant; workshop tools (pictures, posters, games) were effective; materials need to be translated into regional dialects
- Learning: participants learned disease transmission and blocking, and the critical handwashing times; refresher training is needed on HWTS (household water treatment and safe storage) topics; tippy-tap construction skills need improvement
- Behaviour: health workers used new methods to teach WASH in communities; health workers need to prioritize WASH topics
- Results: community members had basic knowledge of WASH; latrine ownership and use was high; more emphasis on HWTS is needed at community level

9 STRENGTHS AND WEAKNESSES OF THE EVALUATION METHODOLOGY
Strengths:
- The theory of change was useful in determining what to measure
- Easy to communicate and understand
- Efficient to apply
- Provided useful information for improvement
Weaknesses:
- No baseline for this case
- Analysis of qualitative data can be subject to interpretive bias

10 NEXT STEPS
- Further develop evaluation support materials for clients and partners to implement similar evaluations
- Develop a modified "light" methodology, appropriate for organizations where staff have less time to dedicate to the evaluation
Any questions?