Monitoring and Evaluating Interventions: A Workshop Chris Duclos, PhD. JSI Research & Training Institute
Today: Discuss M&E purposes; learn about the evaluation cycle and its components; understand evaluation designs; prepare for an evaluation; discuss ethical considerations; see how prevention M&E differs; discuss challenges in prevention M&E
Why All the Fuss?
Exercises Organize into groups Pick pocket exercise Identify purposes
To know how well you’re doing…you must have some place you’re trying to get to. “If you don’t know where you’re going, you’ll end up somewhere else.”
Why Evaluate (Purposes)? Ensure program effectiveness and appropriateness Demonstrate accountability Contribute to HIV/AIDS knowledge base Improve program operations and service delivery
Evaluation Evaluation is the systematic collection of information about a program in order to enable stakeholders: to better understand the program, to improve program effectiveness, and/or to make decisions about future programming.
Critical Evaluation Questions What do you want your project to accomplish? How will you know if you have accomplished your goals? What activities will your project undertake to accomplish your goals? What factors might help or hinder your ability to accomplish your goals? What will you want to tell others who are interested in your project?
Program Planning Process (a continuous cycle): PLAN → IMPLEMENT → EVALUATE → IMPROVE → back to PLAN
Essential Steps to Evaluation (FHI, IMPACT, USAID manual): 1. Identify program goals and objectives. 2. Define the scope of the evaluation. 3. Define evaluation questions and indicators. 4. Define methods. 5. Design instruments and tools. 6. Carry out the evaluation. 7. Analyze data and write a report. 8. Disseminate and use data.
Components of Project Level Evaluation 4 general components to comprehensive program evaluation: Formative evaluation: How do we make the program better? Process evaluation: How was the program implemented? Outcome evaluation: Did the program meet its objectives? Impact evaluation: Was the ultimate goal of the program achieved?
Application to Your Program: Identify Program Goals For each goal: Identify Process Objectives Identify Outcome Objectives For each objective: Identify Indicators Identify Data Source Plan Data Collection Plan Data Analysis
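To make this mapping concrete, here is a minimal sketch in Python of one goal laid out as an evaluation plan; the program, objectives, indicators, and field names are all hypothetical, invented for illustration only.

    # Hypothetical sketch only: one goal mapped to objectives, indicators,
    # data sources, collection methods, and analysis plans.
    evaluation_plan = {
        "goal": "Increase condom use among sexually active single seniors",
        "objectives": [
            {
                "type": "process",
                "objective": "Hold 12 outreach education sessions in year one",
                "indicator": "Number of sessions held",
                "data_source": "Program activity log",
                "collection_method": "Staff log each session as it is delivered",
                "analysis": "Count sessions per quarter",
            },
            {
                "type": "outcome",
                "objective": "Increase participants' condom knowledge scores by 20%",
                "indicator": "Mean score on a knowledge questionnaire",
                "data_source": "Pre/post participant surveys",
                "collection_method": "Survey at intake and at program exit",
                "analysis": "Compare mean pretest and posttest scores",
            },
        ],
    }

    # Print a quick summary of the plan.
    print("Goal:", evaluation_plan["goal"])
    for obj in evaluation_plan["objectives"]:
        print(f"- [{obj['type']}] {obj['objective']}")
        print(f"    indicator: {obj['indicator']} | source: {obj['data_source']}")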
Program Goals and Objectives Well-developed goals and objectives are critical to evaluation. Objectives are specific steps that contribute to a goal; there are often several objectives per goal. Good objectives are SMART: S – specific, M – measurable, A – attainable, R – realistic, T – time-bound
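As an informal way to apply the SMART checklist, a draft objective can be recorded along with a judgment about each element; the sketch below is purely illustrative (the objective text and the yes/no judgments are invented).

    # Illustrative only: checking a draft objective against the SMART elements.
    smart_elements = ["specific", "measurable", "attainable", "realistic", "time-bound"]

    draft_objective = {
        "text": "By December 2010, increase participants' condom knowledge "
                "scores by 20% over baseline.",
        "specific": True,      # names the population and the change sought
        "measurable": True,    # a 20% increase on a knowledge score
        "attainable": True,    # judged feasible given program resources
        "realistic": True,     # within the program's sphere of influence
        "time-bound": True,    # has a deadline (December 2010)
    }

    missing = [element for element in smart_elements if not draft_objective.get(element)]
    print("SMART elements still missing:", missing if missing else "none")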
Program Outcome Model
Every program has goals, objectives, and activities; every program evaluation should have process indicators, outcome indicators, and impact indicators.
Process Evaluation: every program has goals, objectives, and activities; every program evaluation should have process indicators, outcome indicators, and impact indicators.
What is Process Evaluation? Process evaluation: Addresses how, and how well, the program is functioning It can help to… Create a better learning environment Show accountability to funder Reflect the target populations Track services
Process Evaluation (cont'd) Key questions in process evaluation: Who is served? What activities or services are provided? Where, when, and for how long is the program delivered? What are the critical activities? How do these activities connect to the goals and intended outcomes? What aspects of the implementation process are facilitating success or acting as stumbling blocks?
Process Evaluation (cont'd) Identify how an outcome is produced. Identify strengths and weaknesses of a program. Create a detailed description of the program.
Process Evaluation Objectives Improve current activities Provide support for sustainability Provide insight into why certain goals are or are not being accomplished Help leaders make decisions
Methods of Data Collection: Quantitative vs. qualitative. Surveys, interviews, activity databases, etc. Observation.
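For instance, process indicators such as sessions held and clients reached can be tallied directly from an activity database; the sketch below assumes hypothetical service records and field names.

    # Hypothetical sketch: tallying process indicators from activity records.
    from collections import Counter

    # Each record is one service contact logged by program staff (invented data).
    service_records = [
        {"month": "2010-01", "activity": "outreach session", "clients": 14},
        {"month": "2010-01", "activity": "condom distribution", "clients": 30},
        {"month": "2010-02", "activity": "outreach session", "clients": 9},
        {"month": "2010-02", "activity": "outreach session", "clients": 17},
    ]

    # Process indicator 1: outreach sessions held per month.
    sessions_per_month = Counter(
        record["month"] for record in service_records
        if record["activity"] == "outreach session"
    )

    # Process indicator 2: clients reached per month, across all activities.
    clients_per_month = Counter()
    for record in service_records:
        clients_per_month[record["month"]] += record["clients"]

    print("Outreach sessions per month:", dict(sessions_per_month))
    print("Clients reached per month:  ", dict(clients_per_month))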
Exercise – Draft Process Evaluation Plan Initial groups, organize by experience Use least experienced member’s program Articulate one goal with objectives Come up with two process evaluation questions Define measurable process outcomes, indicators, and data collection method
Exercise worksheet. Goal: ____. Columns: Objectives | Outcome | Indicator | Data Source | Collection Method | Data Analysis (one row per objective, beginning with row 1).
Outcome Evaluation: every program has goals, objectives, and activities; every program evaluation should have process indicators, outcome indicators, and impact indicators.
OUTCOMES…. are what a program is accountable for…
Outcome Evaluation Outcome evaluation: Measures the extent to which a program produces its intended improvements Examines effectiveness, goal attainment and unintended outcomes In simple terms, “What’s different as a result of your efforts?”
Outcome Evaluation (cont'd) Key questions in outcome evaluation: To what degree did the desired change(s) occur? Outcomes can be immediate, intermediate, or longer-term. Outcomes can be measured at the patient, provider, organization, or system level.
What Are Outcomes? Outcomes are changes in: behavior, skills, knowledge, attitudes, condition, or status. Outcomes should be: related to the business of the project, REALISTIC and ATTAINABLE, RELEVANT to the project, and within the program's sphere of influence.
Outcomes are logical and reasonable if it is reasonable to believe that the outcome can be accomplished within the timeframe the program has, based on: the program's previous experience, its context, and its resources.
An Example of Outcome Evaluation GOAL: Increase sexually active single seniors’ knowledge and use of condoms POSSIBLE EVALUATION QUESTIONS: Have seniors increased their knowledge about the use of condoms? Have seniors increased their use of condoms? How do we know that the outreach and education activities are responsible for the changes?
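Staying with the seniors example, a single-group pretest/posttest comparison of knowledge scores might look like the sketch below; the scores are invented for illustration, and the closing note echoes the third evaluation question above.

    # Hypothetical pre/post knowledge scores (0-10 scale) for the same participants.
    from statistics import mean

    pre_scores = [4, 5, 3, 6, 4, 5, 2, 6]
    post_scores = [7, 6, 5, 8, 6, 7, 5, 8]

    change = [post - pre for pre, post in zip(pre_scores, post_scores)]
    print(f"Mean pretest score:  {mean(pre_scores):.1f}")
    print(f"Mean posttest score: {mean(post_scores):.1f}")
    print(f"Mean change:         {mean(change):.1f}")

    # Caution: without a comparison group, this design cannot by itself show
    # that the outreach and education activities caused the change.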
Outcome Evaluation Design: Single-group designs: pre-program and post-program measurement of the participant group. Experimental designs: random assignment to a participant group and a comparison group.
Outcome Evaluation Design (cont'd) Quasi-experimental designs: nonrandom assignment; the comparison group usually does not receive the program. Posttest-only designs: the weakest option, but better than nothing; compare results to local and/or national data.
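To illustrate what a comparison group adds, the sketch below contrasts change scores for a hypothetical program group and comparison group; the numbers are invented, and in practice a statistical test and attention to how the comparison group was formed (random vs. nonrandom) would also be needed.

    # Hypothetical change scores (posttest minus pretest) for two groups.
    from statistics import mean

    program_change = [3, 2, 2, 4, 1, 3, 2]      # received the program
    comparison_change = [1, 0, 1, 0, 2, 0, 1]   # did not receive the program

    difference = mean(program_change) - mean(comparison_change)
    print(f"Mean change, program group:    {mean(program_change):.2f}")
    print(f"Mean change, comparison group: {mean(comparison_change):.2f}")
    print(f"Crude estimate of program effect: {difference:.2f}")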
Newer Methods: Rolling Group Designs. All groups receive the intervention, but at different times, and all provide pretest and posttest data; no wait-list controls are needed. Groups that receive the intervention later serve as a control group until they, too, get the intervention. The pretest collected just prior to a group's intervention is conceptualized as if it were a posttest of a "real" control group.
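A small sketch of the rolling group idea, with invented data: Group B starts later, so its pretest (collected at the same calendar time as Group A's posttest) stands in for a control-group measurement.

    # Hypothetical rolling-group data: Group A starts in January, Group B in March.
    from statistics import mean

    group_a = {"pretest_jan": [4, 5, 3, 6], "posttest_mar": [7, 6, 6, 8]}
    group_b = {"pretest_mar": [4, 4, 5, 3], "posttest_may": [6, 7, 7, 6]}

    # Group B has not yet been served in March, so its March pretest is treated
    # like the posttest of a "real" control group for comparison with Group A.
    print("Group A posttest (March), mean:", mean(group_a["posttest_mar"]))
    print("Group B pretest  (March), mean:", mean(group_b["pretest_mar"]))

    # Later, Group B also receives the intervention and contributes its own
    # pretest/posttest change (May vs. March).
    print("Group B change after intervention:",
          mean(group_b["posttest_may"]) - mean(group_b["pretest_mar"]))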
Newer Methods: Internal Referencing Strategy. A pretest-posttest single-group design in which content-relevant material covered in the intervention and related material not covered are both assessed at pretest and posttest. The untaught (nonintervention) material acts as a control for the presented material. The nonintervention material should be conceptually related to, but distinct from, the intervention material. Effectiveness is inferred when improvement is seen from pretest to posttest on the intervention material, but little or no change is seen on the nonintervention material.
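A sketch of the internal referencing comparison, with invented item scores: change on items covered by the intervention is set against change on related items that were not covered.

    # Hypothetical pre/post scores on two item sets for the same participants.
    from statistics import mean

    covered_pre = [3, 4, 2, 5, 3]        # items taught in the intervention
    covered_post = [6, 7, 5, 8, 6]
    not_covered_pre = [3, 4, 3, 4, 3]    # related items NOT taught (internal reference)
    not_covered_post = [3, 5, 3, 4, 4]

    covered_change = mean(covered_post) - mean(covered_pre)
    not_covered_change = mean(not_covered_post) - mean(not_covered_pre)

    print(f"Change on intervention items:     {covered_change:.1f}")
    print(f"Change on non-intervention items: {not_covered_change:.1f}")

    # Effectiveness is inferred when the intervention items improve clearly
    # while the non-intervention items change little or not at all.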
Data Collection Begins before program starts – needs assessments Think about what kind of data would answer the question Think about best method in collecting the data and how often Collect only what you need Nothing’s simple
Digging Through People’s Files Agency/program records. Meeting records or minutes. Other program records (e.g., schools). Public records (e.g., police or court). Pros: cheap and fast! Cons: you have to get permission; the data may be biased because they were collected for another reason.
Custom Data Collection Allows you to collect what you need, how you want. Develop your own instruments or use standardized ones. Typically occurs through surveys, in-person interviews, or focus groups. Pros: can be fairly cheap; lets you ask questions the way you want. Cons: some people lie; interviewer bias; interviewee bias. IMPORTANCE OF TESTING OR PILOTING.
Things to Remember Collect only data that you will use & are relevant Involve all staff involved in the data collection phase in up-front question formation Revise collection strategies based on initial analyses – what’s working?, what’s still missing? Base changes to existing tracking/data collection strategies on what is learned from evaluation
Exercise worksheet. Goal: ____. Columns: Objectives | Outcome | Indicator | Data Source | Collection Method | Data Analysis (one row per objective, beginning with row 1).
Impact Evaluation: every program has goals, objectives, and activities; every program evaluation should have process indicators, outcome indicators, and impact indicators.
Impact Evaluation Impact is sometimes used to mean “outcome.” Impact is perhaps better defined as a longer-term outcome, such as improved patient outcomes; in global M&E, the incidence or prevalence of disease.
A note about impact… Most program evaluations focus on measuring the process and outcomes of a program Measuring impact requires significant resources that most programs don’t have It’s also difficult to link the more immediate effects of a program to broad, often community level, impacts
Ethical Considerations At a minimum, all evaluation projects must ensure that they are fully in line with ethical research standards. Issues: help or benefit others, do no harm, act fairly, respect others. Consideration of risks and benefits: disruption of participants’ lives, emotional consequences, safety concerns, social harm.
Ethical Considerations Keep evaluation procedures as brief and convenient as possible to minimize burden. Do not ask emotionally troubling questions unless absolutely necessary. Provide incentives. Obtain informed consent. Protect confidentiality. Ensure safety. Comply with HIPAA requirements. Obtain IRB review where applicable.
American Evaluation Association Principles Systematic inquiry Competence Integrity/Honesty Respect for people Responsibilities for general and public welfare
Prevention Evaluation Differences The very nature of prevention poses unique challenges. The key difference in evaluating prevention programs is that you are trying to determine what DID NOT occur: measuring a reduction in, or delayed onset of, a problem before it happens. Comparison groups then become essential.
Challenges in Evaluating Prevention Programs Timeframe Measurement Results Statistical Significance Accountability Competition
Resources Kellogg Foundation. American Evaluation Association. CDC MMWR Weekly Report, September 17, 1999, Vol. 48, No. RR-11, “Framework for Program Evaluation in Public Health.” CSAP’s Prevention Pathways: Online Courses.
Hope This Helped Contact me: Christine Duclos, PhD, MPH JSI Research & Training Institute, Inc Blake St. #320 Denver, CO