Evaluation in the Field: Putting Concepts into Action
Janet Myers, PhD, MPH
Richard Vezina, MPH
CAPS HIV Prevention Conference, April 21, 2006
Overview
- Warm-up
- Where does Evaluation fit?
- Approaches to Evaluation
- Examples
- Q&A: Evaluating Your Programs
Warm-Up
Who here is…
- An evaluator?
- A service provider?
- An administrator?
4 Questions
1. What are the benefits of evaluating your programs?
2. What are the challenges to evaluating your programs?
3. What needs (besides $) do you have in order to plan and conduct evaluation?
4. What resources do you have or use for evaluation?
Where does Evaluation fit?
[Diagram: Program & Evaluation PLANNING through IMPLEMENTATION — Mission → Goals → Objectives → Activities → Outcomes → Impact. Process evaluation focuses on activities during implementation; outcome evaluation focuses on outcomes and impact.]
Mission
- Provides the vision
- How this work makes a difference in the world
- Broadest scope
Goals (Ross & Mico, 1980; McKenzie & Smeltzer, 2001)
- "A future event toward which a committed endeavor is directed"
- Simple & concise
- Two basic components: who will be affected, and what will change as a result of the program
Objectives
- Specific steps that contribute to a goal; often several objectives per goal
- Good objectives are SMART:
  S – specific
  M – measurable
  A – attainable
  R – realistic
  T – time-bound
Good Objectives Show… (McKenzie & Smeltzer, 2001)
- What will change: the outcome that will be achieved
- When it will change: the conditions under which the outcome will be observed
- How much change: the criteria for deciding whether the outcome has been achieved
- Who will change: the target population
Activities
- Internal: administrative, etc.
- External: the services you provide to clients
- Based on your goals and objectives
Outcomes
- Changes that occur in the people being served by your program
- Attribution: to the best extent possible, show that the change is a result of your program (but note: causality is difficult to establish)
- Standards are typically different for evaluation than for research
- To assess change, you need at least two time points (pre- and post-) and/or a comparison group, as in the sketch below
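To make the two-time-point idea concrete, here is a minimal sketch of a pre/post comparison in Python. The client scores, variable names, and the choice of a paired t-test are illustrative assumptions, not part of the original presentation.

```python
# Minimal pre/post outcome comparison on hypothetical data.
# A paired t-test is one common way to ask whether scores changed
# between the two time points for the same set of clients.
from scipy import stats

# Hypothetical knowledge scores for the same 8 clients,
# measured before (pre) and after (post) the program.
pre = [52, 60, 48, 55, 63, 50, 58, 61]
post = [64, 66, 55, 70, 68, 57, 65, 72]

mean_change = sum(b - a for a, b in zip(pre, post)) / len(pre)
t_stat, p_value = stats.ttest_rel(post, pre)

print(f"Mean change: {mean_change:.1f} points")
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")
```

Even a clear pre/post change does not settle attribution on its own; a comparison group strengthens the case that the change came from the program rather than from outside factors.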
Impact
- The scope of the program's effects, the duration of its outcomes, and the extent of its influence on the broader context (for example, HIV incidence)
- Attribution: causality can be harder to show because the effects are more diffuse
- Usually broad and long-term
- Typically not within the scope of program evaluation
Approaches to Evaluation
Why do we evaluate?
- To determine if objectives are being met
- To improve the quality of the program
- To decide how to change content
- To identify the effects of the program
Process vs. Outcome Evaluation
Process:
- Demographics (who's being trained?)
- Reaction to content ("smile sheets")
- Service units delivered
Outcome:
- Changes in knowledge, attitudes, and beliefs
- Changes in behavior
- Impact on patients/clients
Process Evaluation can help us…
- Create a better learning environment
- Improve presentation skills
- Show accountability
- Reflect the target populations
- Track service units
Outcome Evaluation can help us…
- Show the program's effects
- Allow for comparisons over time
- Provide specific guide points for improving programs
- Show accountability
Planning Your Evaluation (1)
- Figure out your questions: what will the evaluation be used for?
- Consider your resources: staffing, time, materials, $$$
- Choose methods:
  - Quantitative: surveys, pre/post tests, etc.
  - Qualitative: interviews, focus groups, etc.
Planning Your Evaluation (2)
- Select indicator(s) that are relevant, measurable, and improvable
- Instrument/tool development: don't reinvent the wheel!
- Analysis: get answers to your questions (a small sketch follows below)
- Reporting: formal & informal
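As one illustration of the analysis step, the sketch below tallies service units and computes a simple process indicator from hypothetical session records. The record layout, field names, and the 4.0 satisfaction threshold are all assumptions made for the example, not recommendations from the presenters.

```python
# Tally service units and compute one simple process indicator
# from hypothetical training-session records (illustrative data).
from collections import Counter

sessions = [
    {"site": "Clinic A", "attendees": 12, "rating": 4.5},
    {"site": "Clinic A", "attendees": 9,  "rating": 3.8},
    {"site": "Clinic B", "attendees": 15, "rating": 4.2},
    {"site": "Clinic B", "attendees": 7,  "rating": 4.9},
]

# Service units delivered, broken out by site.
units = Counter()
for s in sessions:
    units[s["site"]] += s["attendees"]

# Indicator: share of sessions rated 4.0 or higher on "smile sheets".
satisfied = sum(1 for s in sessions if s["rating"] >= 4.0)
pct_satisfied = 100 * satisfied / len(sessions)

print(dict(units))  # {'Clinic A': 21, 'Clinic B': 22}
print(f"{pct_satisfied:.0f}% of sessions rated 4.0 or higher")
```

The same tally can feed both informal check-ins and a formal report, which keeps the analysis and reporting steps tied to the indicators chosen up front.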
Examples