Practical Evaluation: CDC's Framework for Program Evaluation in Public Health
Thomas J. Chapel, MA, MBA
Centers for Disease Control and Prevention, Office of Workforce and Career Development
TChapel@cdc.gov, 404-498-6073
Why We Evaluate…
"...The gods condemned Sisyphus to endlessly roll a rock up a hill, whence it would return each time to its starting place. They thought, with some reason, there was no punishment more severe than eternally futile labor..."
(Albert Camus, The Myth of Sisyphus)
Objectives
- CDC Framework steps and standards
- The central importance of the early steps
- How the Framework approach helps ensure use and avoid roadblocks
Defining Evaluation
- Evaluation is the systematic investigation of the merit, worth, or significance of any "object" (Michael Scriven)
- A program is any organized public health action or activity implemented to achieve some result
These must be integrated: the Continuous Quality Improvement (CQI) cycle
- Planning: What do we do, and how do we do it? What actions will best reach our goals and objectives?
- Performance measurement: How are we doing?
- Evaluation: Why are we doing well or poorly?
Enter the CDC Evaluation Framework
- Good evaluation = use of findings
- Focus is situation-specific
- Early steps are key to the best focus
The Four Standards
There is no one "right" evaluation. Instead, the best choice at each step is the option that maximizes:
- Utility: Who needs the information from this evaluation, and what information do they need?
- Feasibility: How much money, time, and effort can we put into this?
- Propriety: Who needs to be involved for the evaluation to be ethical?
- Accuracy: What design will lead to accurate information?
Good results from Steps 4-6 are more likely because we did a good job on Steps 1-3!
You Don't Ever Need a Logic Model, BUT You Always Need a Program Description
Don't jump into planning or eval without clarity on:
- The big "need" your program is to address
- The key target group(s) who need to take action
- The kinds of actions they need to take (your intended outcomes or objectives)
- The activities needed to meet those outcomes
- The "causal" relationships between activities and outcomes
Logic Models and Program Description
Logic models: graphic depictions of the relationship between your program's activities and its intended effects.
Step 2: Describing the Program: The Complete Logic Model
[Diagram: Inputs → Activities → Outputs → Short-term, Intermediate, and Long-term Effects/Outcomes, framed by Context, Assumptions, and Stage of Development]
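Because a logic model is, at bottom, a structured inventory of components and the links between them, it can help to sketch one as a data structure while drafting a program description. Below is a minimal sketch in Python; the LogicModel class, the example program, and every entry in it are hypothetical, invented here for illustration.

```python
# A minimal sketch of the logic model components named above.
# The example program and all entries are hypothetical.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    inputs: list[str]               # resource "platform" for the program
    activities: list[str]           # what the program and its staff actually do
    outputs: list[str]              # tangible products of activities
    short_term_outcomes: list[str]  # earliest changes expected in the target group
    intermediate_outcomes: list[str]
    long_term_outcomes: list[str]
    context: list[str] = field(default_factory=list)      # moderators that help or hinder
    assumptions: list[str] = field(default_factory=list)
    stage_of_development: str = "planning"

# Hypothetical immunization-outreach program:
outreach = LogicModel(
    inputs=["funding", "trained outreach staff"],
    activities=["hold community education sessions"],
    outputs=["sessions held", "families reached"],
    short_term_outcomes=["increased awareness of the vaccination schedule"],
    intermediate_outcomes=["more clinic visits scheduled"],
    long_term_outcomes=["higher immunization coverage"],
    context=["clinic capacity", "community trust"],
    assumptions=["families can reach a clinic"],
    stage_of_development="early implementation",
)
```

Writing the model down this way forces the same clarity the program description demands: every outcome has to trace back to an activity that plausibly produces it.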
Activities: what the program and its staff actually do.
Effects/Outcomes: the results of activities. Who or what will change?
Linking Planning, Evaluation, and Performance Measurement
- Goals correspond to LT Outcomes or Impacts, tracked by Impact Measures
- Objectives correspond to ST or MT Outcomes, tracked by Outcome Measures
- Actions/Tactics correspond to Activities, tracked by Process, Progress, and Implementation Measures
- Across all levels, such measures are also called Key Performance Indicators or Success Factors
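The three rows above are three vocabularies for the same causal chain. As a minimal sketch (assuming the column alignment reconstructed above, which varies across organizations), the crosswalk can be kept as a simple lookup table so plans, logic models, and measure sets stay in sync; TERM_CROSSWALK is a hypothetical name:

```python
# Crosswalk between planning, logic model (eval), and performance
# measurement (PM) vocabularies, following the alignment above.
TERM_CROSSWALK = {
    # plan term        (logic model term,         PM term)
    "Goals":           ("LT outcomes or impacts", "impact measures"),
    "Objectives":      ("ST or MT outcomes",      "outcome measures"),
    "Actions/Tactics": ("activities",             "process/progress/implementation measures"),
}

for plan_term, (eval_term, pm_term) in TERM_CROSSWALK.items():
    print(f"{plan_term:15s} <-> {eval_term:25s} <-> {pm_term}")
```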
Inputs: the resource "platform" for the program. Outputs: the tangible products of activities.
Context (moderators): contextual factors that will facilitate or hinder getting our outcomes.
Forgetting Intermediate Outcomes
Good evaluation and performance measurement require a broad(er) focus. Not just: Did it work? But also: Is it working?
- How many tomatoes did I get?
- Are planting, watering, and weeding taking place?
- Are there nematodes on the plants?
- Have the blossoms "set"?
Stakeholders: Who Is…
- Affected by the program?
- Involved in program operations?
- An intended user of evaluation findings?
Lessons Learned: Stakeholder Engagement
“All the animals in the barnyard are equal…only some are more equal than others.” George Orwell, Animal Farm
Which Matter Most? Who is:
- Affected by the program?
- Involved in program operations?
- An intended user of evaluation findings?
Of these, whom do we most need in order to:
- Enhance credibility?
- Implement program changes?
- Advocate for changes?
- Fund, authorize, or expand the program?
Evaluation Can Be About Anything
- Evaluation can focus on any or all parts of the logic model
- Evaluation questions can pertain to:
  - Boxes: Did this component occur as expected?
  - Arrows: What was the relationship between components?
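One way to make the box/arrow distinction concrete is to key each evaluation question to the logic model component (box) or pair of components (arrow) it interrogates. A minimal sketch, paraphrasing the question types on the slides that follow; the dictionary layout and names are illustrative, not a CDC artifact:

```python
# Evaluation questions keyed to logic model parts.
# "Box" questions ask whether a single component occurred as expected;
# "arrow" questions ask about the link between two components.
BOX_QUESTIONS = {
    "inputs":     "Did we get the inputs we needed and were promised?",
    "activities": "Were activities implemented as intended?",
    "outputs":    "How much was produced, and who received it?",
    "outcomes":   "Which outcomes occurred, and how much?",
}

ARROW_QUESTIONS = {
    ("inputs", "activities"):   "(How) was implementation quality related to inputs?",  # efficiency
    ("activities", "outcomes"): "Did outcomes occur because of our activities?",        # causal attribution
}
```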
Process Evaluation: Were activities and outputs implemented as intended? How much? Who received them? Did we get the inputs we needed and were promised?
Outcome Evaluation: Which outcomes occurred, and how much of each occurred?
Efficiency Evaluation: (How) was implementation quality related to inputs?
Causal Attribution: Did outcomes occur because of our activities and outputs?
The "Utility" Standard
- Purpose: Toward what end is the measurement being conducted?
- User: Who wants the information, and what are they interested in?
- Use: How will they use the information?
The "Feasibility" Standard: acts as a "reality check"
- Stage of development: How long has the program been in existence?
- Program intensity: How intense is the program? How much impact is it reasonable to expect?
- Resources: How much time, money, and expertise are available?
Utility: (Some) Potential Uses
- Show accountability
- Test/improve program effectiveness
- Test/improve program implementation
- "Continuous" program improvement
- Increase the knowledge base
- Other…
Appropriate Rigor
"Don't make the 'perfect' the enemy of the 'good'." (Dr. Bill Foege, former CDC Director)
Measuring the Right Thing…
"…Sometimes, what counts can't be counted. And what can be counted doesn't count…" (Albert Einstein)
You Get What You Measure…
"…In Poland in the 1970s, furniture factories were rewarded based on pounds of product shipped. As a result, today Poles have the world's heaviest furniture…" (New York Times, 3/4/99)
Helpful Publications: www.cdc.gov/eval
Helpful Resources
- Introduction to Program Evaluation for Public Health Programs: A Self-Study Guide: http://www.cdc.gov/eval/whatsnew.htm
- Logic model sites:
  - Innovation Network: http://www.innonet.org/
  - W.K. Kellogg Foundation Evaluation Resources: http://www.wkkf.org/programming/overview.aspx?CID=281
  - University of Wisconsin-Extension: http://www.uwex.edu/ces/lmcourse/
- Texts:
  - Rogers et al. Program Theory in Evaluation. New Directions Series, Jossey-Bass, Fall 2000.
  - Chen, H. Theory-Driven Evaluations. Sage, 1990.
Community Tool Box: http://ctb.ku.edu