1
Monitoring and Evaluating Interventions: A Workshop
Chris Duclos, PhD
JSI Research & Training Institute
2
Today:
Discuss M&E purposes
Learn about the evaluation cycle & components
Understand evaluation designs
Prepare for an evaluation
Discuss ethical considerations
Prevention M&E: how it differs
Discuss challenges in prevention M&E
3
Why All the Fuss?
4
Exercises
Organize into groups
Pick pocket exercise
Identify purposes
5
To know how well you're doing, you must have some place you're trying to get to. "If you don't know where you're going, you'll end up somewhere else."
6
Why Evaluate (Purposes)?
Ensure program effectiveness and appropriateness
Demonstrate accountability
Contribute to the HIV/AIDS knowledge base
Improve program operations and service delivery
7
Evaluation
Evaluation is the systematic collection of information about a program in order to enable stakeholders to better understand the program, to improve program effectiveness, and/or to make decisions about future programming.
8
Critical Evaluation Questions
What do you want your project to accomplish?
How will you know if you have accomplished your goals?
What activities will your project undertake to accomplish your goals?
What factors might help or hinder your ability to accomplish your goals?
What will you want to tell others who are interested in your project?
9
Program Planning Process
Plan → Implement → Evaluate → Improve (and back to Plan)
10
Essential Steps to Evaluation (FHI, IMPACT, USAID manual)
1. Identify program goals and objectives
2. Define the scope of the evaluation
3. Define evaluation questions & indicators
4. Define methods
5. Design instruments and tools
6. Carry out the evaluation
7. Analyze data and write a report
8. Disseminate and use data
11
Components of Project-Level Evaluation
Four general components of comprehensive program evaluation:
Formative evaluation: How do we make the program better?
Process evaluation: How was the program implemented?
Outcome evaluation: Did the program meet its objectives?
Impact evaluation: Was the ultimate goal of the program achieved?
12
Application to Your Program
Identify program goals.
For each goal: identify process objectives and outcome objectives.
For each objective: identify indicators, identify the data source, plan data collection, and plan data analysis.
A sketch of one way to organize such a plan follows.
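One way to keep the goal, objective, indicator, data source, collection, and analysis pieces tied together is a simple structured record. The sketch below is purely illustrative (the class names, field names, and example entries are hypothetical, drawn loosely from the seniors-and-condom-use example later in this workshop), assuming Python is the tool at hand:

```python
# Illustrative sketch only: one way to record an evaluation plan as structured data.
# The class names, fields, and example entries are hypothetical, not workshop material.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Objective:
    description: str          # a SMART objective tied to the goal
    kind: str                 # "process" or "outcome"
    indicator: str            # what will be measured
    data_source: str          # where the data come from
    collection_method: str    # how and when data are collected
    analysis: str             # planned analysis

@dataclass
class EvaluationPlan:
    goal: str
    objectives: list[Objective] = field(default_factory=list)

plan = EvaluationPlan(
    goal="Increase sexually active single seniors' knowledge and use of condoms",
    objectives=[
        Objective(
            description="Deliver outreach and education sessions at senior centers",
            kind="process",
            indicator="Number of sessions delivered and attendance",
            data_source="Program activity log",
            collection_method="Staff entry after each session",
            analysis="Counts compared against the planned number of sessions",
        ),
        Objective(
            description="Increase reported condom use at follow-up",
            kind="outcome",
            indicator="Percent of participants reporting condom use",
            data_source="Pre/post participant survey",
            collection_method="Survey at intake and at follow-up",
            analysis="Pre/post comparison of proportions",
        ),
    ],
)

# Print a one-line summary of each objective in the plan
for obj in plan.objectives:
    print(f"[{obj.kind}] {obj.description} -> {obj.indicator} ({obj.data_source})")
```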
13
Program Goals and Objectives
Well-developed goals and objectives are critical to evaluation.
Objectives are specific steps that contribute to a goal; there are often several objectives per goal.
Good objectives are SMART:
S – specific
M – measurable
A – attainable
R – realistic
T – time-bound
14
Program Outcome Model
15
Every program has goals, objectives, and activities; every program evaluation should have impact indicators (for goals), outcome indicators (for objectives), and process indicators (for activities).
16
Process Evaluation
Every program has goals, objectives, and activities; every evaluation should have impact, outcome, and process indicators. This section covers process indicators, which track activities.
17
What is Process Evaluation?
Process evaluation addresses how, and how well, the program is functioning.
It can help to:
Create a better learning environment
Show accountability to the funder
Reflect the target populations
Track services
18
Process Evaluation con’t Key questions in process evaluation: Who is served? What activities or services are provided? Where, when, and how long is the program? What are the critical activities? How do these activities connect to the goals & intended outcomes? What aspects of the implementing process are facilitating success or acting as stumbling blocks?
19
Process Evaluation con’t Identify how an outcome is produced Identify strengths & weaknesses of a program Create detailed description of the program
20
Process Evaluation Objectives
Improve current activities
Provide support for sustainability
Provide insight into why certain goals are or are not being accomplished
Help leaders make decisions
21
Methods of Data Collection
Quantitative vs. qualitative
Surveys, interviews, activity databases, etc.
Observation
22
Exercise – Draft a Process Evaluation Plan
In your initial groups, organize by experience
Use the least experienced member's program
Articulate one goal with objectives
Come up with two process evaluation questions
Define measurable process outcomes, indicators, and a data collection method
23
Exercise:
Goal: ________
Objectives | Outcome | Indicator | Data Source | Collection Method | Data Analysis
1. | | | | |
24
Outcome Evaluation
Every program has goals, objectives, and activities; every evaluation should have impact, outcome, and process indicators. This section covers outcome indicators, which track objectives.
25
OUTCOMES are what a program is accountable for.
26
Outcome Evaluation
Outcome evaluation measures the extent to which a program produces its intended improvements.
It examines effectiveness, goal attainment, and unintended outcomes.
In simple terms: "What's different as a result of your efforts?"
27
Outcome Evaluation con’t Key questions in outcome evaluation: To what degree did the desired change(s) occur? Outcomes can be immediate, intermediate or longer-term Outcomes can be measured at the patient, provider, organization, or system level.
28
Outcomes
Outcomes are changes in: behavior, skills, knowledge, attitudes, condition, or status.
Outcomes should be:
Related to the business of the project
REALISTIC and ATTAINABLE
RELEVANT to the project
Within the program's sphere of influence
29
Outcomes
An outcome is logical and reasonable if it is reasonable to believe it can be accomplished within the timeframe the program has, based on:
The program's previous experience
Context
Resources
30
An Example of Outcome Evaluation
GOAL: Increase sexually active single seniors' knowledge and use of condoms
POSSIBLE EVALUATION QUESTIONS:
Have seniors increased their knowledge about the use of condoms?
Have seniors increased their use of condoms?
How do we know that the outreach and education activities are responsible for the changes?
31
Outcome Evaluation Design
Single-group designs: one participant group measured pre-program and post-program.
Experimental designs: random assignment to a participant group and a comparison group, each measured pre-program and post-program.
A rough analysis sketch for these designs follows.
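As a rough illustration of how the pre-program/post-program comparison in a single-group design might be analyzed, the sketch below uses invented scores and assumes NumPy and SciPy are available; it shows one common choice, not the only valid analysis:

```python
# Illustrative sketch with invented data: a single-group pre/post design
# analyzed with a paired t-test on the same participants' scores.
import numpy as np
from scipy import stats

pre = np.array([52, 60, 45, 70, 58, 63, 49, 55])    # scores before the program
post = np.array([61, 66, 50, 74, 65, 70, 48, 60])   # scores after the program

t_stat, p_value = stats.ttest_rel(post, pre)
print(f"Mean change: {np.mean(post - pre):.1f} points, p = {p_value:.3f}")

# In an experimental design with random assignment, the participant group's
# change scores would instead be compared with the comparison group's change
# scores, e.g. with stats.ttest_ind.
```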
32
Outcome Evaluation Design (cont'd)
Quasi-experimental designs: nonrandom assignment; the comparison group usually does not receive the program.
Posttest-only designs: the weakest option, but better than nothing; compare results to local and/or national data (see the sketch below).
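For a posttest-only design, one minimal analysis is to compare the group's posttest results against a published local or national figure. The sketch below uses invented scores and a hypothetical benchmark value:

```python
# Illustrative sketch with invented data: a posttest-only design compared
# against a hypothetical local/national benchmark using a one-sample t-test.
import numpy as np
from scipy import stats

posttest_scores = np.array([72, 68, 75, 80, 66, 71, 77, 69, 74, 73])
benchmark = 65  # hypothetical published reference value

t_stat, p_value = stats.ttest_1samp(posttest_scores, benchmark)
print(f"Sample mean {posttest_scores.mean():.1f} vs benchmark {benchmark}, p = {p_value:.3f}")
```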
33
Newer Methods: Rolling Group Designs
All groups receive the intervention, but at different times, and all provide pretest and posttest data; there are no wait-list controls.
Groups that receive the intervention later serve as a control group until they, too, get the intervention.
A pretest taken prior to the intervention is conceptualized as if it were a posttest of a "real" control group.
A rough sketch of this comparison follows.
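A hypothetical sketch of the rolling-group comparison described above (all numbers invented): Group B has not yet received the intervention when Group A finishes, so Group B's pretest, collected at the same time point, stands in for a control-group posttest.

```python
# Illustrative sketch with invented data for a rolling group design:
# compare Group A after the intervention with Group B's pretest taken at
# the same calendar time, before Group B receives the intervention.
import numpy as np
from scipy import stats

group_a_post = np.array([70, 74, 69, 78, 72, 75])  # Group A, post-intervention
group_b_pre = np.array([60, 63, 58, 66, 61, 64])   # Group B, pre-intervention

t_stat, p_value = stats.ttest_ind(group_a_post, group_b_pre)
diff = group_a_post.mean() - group_b_pre.mean()
print(f"A(post) vs B(pre): difference = {diff:.1f}, p = {p_value:.3f}")

# Group B later receives the intervention and contributes its own pre/post pair.
```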
34
Newer Methods: Internal Referencing Strategy
A pretest-posttest single-group design in which content covered by the intervention and related content not covered by it are both assessed at pretest and posttest.
The untaught (non-presented) material acts as a control for the presented material.
The nonintervention material should be conceptually related to, but distinct from, the intervention material.
Effectiveness is inferred when improvement is seen from pretest to posttest on the intervention material, but little or no change is seen on the nonintervention material.
A rough sketch of this comparison follows.
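A hypothetical sketch of the internal referencing comparison (all scores invented): each participant's pre-to-post change on intervention-covered items is contrasted with their change on related but uncovered items.

```python
# Illustrative sketch with invented data for the internal referencing strategy:
# the same participants are scored on covered and uncovered item sets at
# pretest and posttest; the uncovered items serve as the internal control.
import numpy as np
from scipy import stats

covered_pre = np.array([40, 45, 38, 50, 42, 47, 39, 44])
covered_post = np.array([55, 58, 49, 62, 54, 60, 47, 56])
uncovered_pre = np.array([41, 44, 39, 49, 43, 46, 40, 45])
uncovered_post = np.array([42, 45, 40, 50, 42, 47, 41, 46])

change_covered = covered_post - covered_pre
change_uncovered = uncovered_post - uncovered_pre

# Effectiveness is suggested when the covered-item change clearly exceeds
# the uncovered-item change within the same participants.
t_stat, p_value = stats.ttest_rel(change_covered, change_uncovered)
print(f"Covered change {change_covered.mean():.1f} vs "
      f"uncovered change {change_uncovered.mean():.1f}, p = {p_value:.3f}")
```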
35
Data Collection
Begins before the program starts (needs assessments)
Think about what kind of data would answer the question
Think about the best method for collecting the data, and how often
Collect only what you need
Nothing's simple
36
Digging Through People's Files
Agency/program records
Meeting records or minutes
Other program records (e.g., schools)
Public records (e.g., police or court)
Pros: cheap and fast!
Cons: you have to get permission; the data may be biased because they were collected for another reason.
37
Custom Data Collection
Allows you to collect what you need, how you want
Develop your own instruments or use standardized ones
Typically occurs through surveys, in-person interviews, or focus groups
Pros: can be fairly cheap; lets you ask questions the way you want.
Cons: some people lie; interviewer bias; interviewee bias.
IMPORTANCE OF TESTING OR PILOTING
38
Things to Remember
Collect only data that you will use & that are relevant
Involve all staff involved in data collection in up-front question formation
Revise collection strategies based on initial analyses: what's working? what's still missing?
Base changes to existing tracking/data collection strategies on what is learned from the evaluation
39
Exercise:
Goal: ________
Objectives | Outcome | Indicator | Data Source | Collection Method | Data Analysis
1. | | | | |
40
Impact Evaluation
Every program has goals, objectives, and activities; every evaluation should have impact, outcome, and process indicators. This section covers impact indicators, which track goals.
41
Impact Evaluation
"Impact" is sometimes used to mean "outcome."
Impact is perhaps better defined as a longer-term outcome, such as improved patient outcomes.
In global M&E, impact is often measured as the incidence or prevalence of disease.
42
A note about impact
Most program evaluations focus on measuring the process and outcomes of a program.
Measuring impact requires significant resources that most programs don't have.
It is also difficult to link the more immediate effects of a program to broad, often community-level, impacts.
43
Ethical Considerations
At a minimum, all evaluation projects must ensure that they are fully in line with ethical research standards.
Issues: help or benefit to others, do no harm, act fairly, respect others.
Consider risks & benefits: disruptions to participants' lives, emotional consequences, safety concerns, social harm.
44
Ethical Considerations
Keep evaluation procedures as brief & convenient as possible to minimize burden
Do not ask emotionally troubling questions unless absolutely necessary
Provide incentives
Provide informed consent
Protect confidentiality
Ensure safety
Obey HIPAA requirements
Obtain IRB review where applicable
45
American Evaluation Association Principles
Systematic inquiry
Competence
Integrity/honesty
Respect for people
Responsibilities for general and public welfare
46
Prevention Evaluation Differences
The very nature of prevention poses unique challenges.
The key difference in evaluating prevention programs is that you are trying to determine what did NOT occur: measuring reduction or delayed onset before it happens.
Comparison groups therefore become essential (see the sketch below).
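Because a prevention program is judged on events that did not occur, the analysis typically compares event counts between an intervention group and a comparison group. The sketch below uses invented counts and a simple chi-square test as one possible approach:

```python
# Illustrative sketch with invented counts: new cases in an intervention
# community versus a comparable comparison community, tested with a
# chi-square test on the 2x2 table.
import numpy as np
from scipy.stats import chi2_contingency

#                  new cases   no new cases
table = np.array([[12,         488],    # intervention community (n = 500)
                  [25,         475]])   # comparison community (n = 500)

chi2, p_value, dof, expected = chi2_contingency(table)
rates = table[:, 0] / table.sum(axis=1)
print(f"Incidence {rates[0]:.1%} (intervention) vs {rates[1]:.1%} (comparison), p = {p_value:.3f}")
```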
47
Challenges in Evaluating Prevention Programs
Timeframe
Measurement
Results
Statistical significance
Accountability
Competition
48
Resources
Kellogg Foundation: www.wkkf.org
American Evaluation Association
CDC MMWR, September 17, 1999, Vol. 48, No. RR-11: "Framework for Program Evaluation in Public Health"
CSAP's Prevention Pathways: Online Courses
49
Hope This Helped
Contact me:
Christine Duclos, PhD, MPH
JSI Research & Training Institute, Inc.
1860 Blake St. #320, Denver, CO 80202
303-262-4318