1
Fear and Loathing of Evaluation: Avoiding 7 Common “Potholes”
Thomas J. Chapel, MA, MBA
Chief Evaluation Officer, Centers for Disease Control and Prevention
(404) 639-2116 | tchapel@cdc.gov
May 2016
Office of the Director
2
Fear and Loathing of Evaluation: Avoiding 7 Common Potholes
By: Thomas J. Chapel, MA, MBA
Chief Evaluation Officer, CDC
TChapel@cdc.gov
3
Why We Evaluate… “... The gods condemned Sisyphus to endlessly roll a rock up a hill, whence it would return each time to its starting place. They thought, with some reason…
4
Why We Evaluate… … there was no punishment more severe than eternally futile labor....” The Myth of Sisyphus
5
Objectives
Program evaluation and typical “potholes” on the road to good evaluation
CDC’s Evaluation Framework
How key Framework steps circumvent potholes and ensure the strongest program evaluation
6
Today’s Focus Top Potholes on the Road to Good Evaluation
7
Pothole #7 Not understanding where evaluation “fits in” …
8
Defining Evaluation
Evaluation is the systematic investigation of the merit, worth, or significance of any “object.” (Michael Scriven)
A program is any organized public health action/activity implemented to achieve some result.
9
Integrating Processes to Achieve Continuous Quality Improvement
The Continuous Quality Improvement (CQI) cycle asks three questions in turn:
Planning—What actions will best reach our goals and objectives?
Performance measurement—How are we doing?
Evaluation—Why are we doing well or poorly?
10
“Research seeks to prove, evaluation seeks to improve…” M.Q. Patton
11
Pothole #6 Ignoring “process use” insights …
12
Process Use
When influence/program improvement comes not from the findings of an evaluation, but from insights gleaned during the tasks involved in doing an evaluation.
Process Use in Theory, Research, and Practice: New Directions for Evaluation, No. 116, 2008
13
Pothole #5 Making the “perfect” the enemy of the “good”
14
Every Little Bit Helps… “…The biggest mistake is doing nothing because you can only do a little…” Anonymous
15
Pothole #4 Evaluating only what you can “measure”…
16
Measuring the Right Thing… “…Sometimes, what counts can’t be counted. And what can be counted doesn’t count….” Albert Einstein
17
You Get What You Measure… “…In Poland in the 1970s, furniture factories were rewarded based on pounds of product shipped. As a result, today Poles have the world’s heaviest furniture…” (New York Times, 3/4/99)
18
Pothole #3 Neglecting intermediate outcomes….
19
Forgetting Intermediate Outcomes
20
Good evaluation broadens our focus:
Not just: Did it work? (How many tomatoes did I get?)
But also: Is it working? (Are planting, watering, and weeding taking place? Are there nematodes on the plants? Have the blossoms “set”?)
21
Finding Intermediate Outcomes
What is the ultimate outcome I’m seeking?
Who (besides me) needs to take action to achieve it?
What action do they need to take?
22
Pothole #2 Confusing attribution and contribution…
23
…“to put a man on the moon”.
24
“Networked” Interventions
[Diagram: multiple agencies (A through D), each running one or more programs, whose outputs feed shared short-term and long-term outcomes and, ultimately, a single system outcome.]
25
Pothole #1 Not asking: “Who (else) cares…..”
26
Framework for Program Evaluation
27
Enter the CDC Evaluation Framework
Good M&E = use of findings
28
Enter the CDC Evaluation Framework
Good M&E = use of findings
Focus is situation-specific
29
Enter the CDC Evaluation Framework
Good M&E = use of findings
Focus is situation-specific
Early steps key to best focus
30
Framework for Program Evaluation
The 4 Evaluation Standards help focus efforts at each step
31
The Four Standards
There is no one “right” evaluation. Instead, the best choice at each step is the option that maximizes:
Utility: Who needs the info from this evaluation, and what info do they need?
Feasibility: How much money, time, and effort can we put into this?
Propriety: Who needs to be involved for the evaluation to be ethical?
Accuracy: What design will lead to accurate information?
32
Pothole #1 Not asking: “Who (else) cares…..”
33
Enter the CDC Evaluation Framework
Stakeholders and users are key focus
34
Potholes #2 and #3 Neglecting intermediate outcomes…. Confusing attribution and contribution…
35
Enter the CDC Evaluation Framework
Program description guides evaluation and planning
36
You Don’t Ever Need a Logic Model, BUT You Always Need a Program Description
The big “need” your program is to address
The key target group(s) who need to take action
The kinds of actions they need to take (your intended outcomes or objectives)
Activities needed to meet those outcomes
Resources to do the program
Contextual factors that help or hurt
37
And here is our logic model….
38
Step 2: Describing the Program: Complete Logic Model
[Diagram: Inputs → Activities → Outputs → Short-term, Intermediate, and Long-term Effects/Outcomes, framed by Context, Assumptions, and Stage of Development.]
39
Global Logic Model: Childhood Lead Poisoning Program
If we do… (early activities): outreach; screening; identification of kids with elevated blood lead levels (EBLL); case management of EBLL kids
And we do… (later activities): refer EBLL kids for medical treatment; train family in in-home techniques; assess environment of EBLL child; refer environment for clean-up
Then… (early outcomes): EBLL kids get medical treatment; family performs in-home techniques; lead source identified; environment gets cleaned up; lead source removed
And then… (later outcomes): EBLL reduced; developmental slide stopped; quality of life improves
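For readers who find a structured form easier to scan than slide prose, here is one minimal sketch of the if/then chain above as a plain data structure. The language (Python), the class name LogicModel, and its field names are illustrative assumptions for this write-up, not part of the CDC Framework or the lead poisoning program itself.

```python
# Illustrative sketch only: encoding the slide's if/then chain as data.
# "LogicModel" and its field names are hypothetical, not CDC terminology.
from dataclasses import dataclass

@dataclass
class LogicModel:
    early_activities: list[str]   # "If we do..."
    later_activities: list[str]   # "And we do..."
    early_outcomes: list[str]     # "Then..."
    later_outcomes: list[str]     # "And then..."

lead_poisoning = LogicModel(
    early_activities=[
        "Outreach", "Screening",
        "Identify kids with elevated blood lead levels (EBLL)",
        "Case management of EBLL kids",
    ],
    later_activities=[
        "Refer EBLL kids for medical treatment",
        "Train family in in-home techniques",
        "Assess environment of EBLL child",
        "Refer environment for clean-up",
    ],
    early_outcomes=[
        "EBLL kids get medical treatment",
        "Family performs in-home techniques",
        "Lead source identified", "Environment cleaned up",
        "Lead source removed",
    ],
    later_outcomes=[
        "EBLL reduced", "Developmental slide stopped",
        "Quality of life improves",
    ],
)

# Walking the chain stage by stage makes the intermediate outcomes
# explicit -- exactly the step Pothole #3 warns against skipping.
for stage, items in vars(lead_poisoning).items():
    print(stage, "->", "; ".join(items))
```

Writing the chain down this way forces each link to be named, so a gap between activities and long-term outcomes is visible rather than assumed.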
40
Basic CPPW Logic Model
41
Potholes #4 and #5 Evaluating only what you can “measure”… Making the “perfect” the enemy of the “good”
42
Enter the CDC Evaluation Framework
The 4 Evaluation Standards match “eval focus” and evidence to the situation
43
Potholes #6 and #7 Not understanding where evaluation “fits in” … Ignoring “process use” insights …
45
Enter the CDC Evaluation Framework
Good M&E = use of findings
46
Integrating Processes to Achieve Continuous Quality Improvement
The Continuous Quality Improvement (CQI) cycle asks three questions in turn:
Planning—What actions will best reach our goals and objectives?
Performance measurement—How are we doing?
Evaluation—Why are we doing well or poorly?
47
Reducing Fear and Practical Program Evaluation: Life Post-Session
48
Helpful Publications @ www.cdc.gov/eval
49
Helpful Resources
Intro to Program Evaluation for PH Programs—A Self-Study Guide: http://www.cdc.gov/eval/whatsnew.htm
Logic Model Sites
Innovation Network: http://www.innonet.org/
W.K. Kellogg Foundation Evaluation Resources: http://www.wkkf.org/programming/overview.aspx?CID=281
University of Wisconsin-Extension: http://www.uwex.edu/ces/lmcourse/
Texts
Rogers et al. Program Theory in Evaluation. New Directions Series: Jossey-Bass, Fall 2000.
Chen, H. Theory-Driven Evaluations. Sage, 1990.
50
Community Tool Box: http://ctb.ku.edu
51
Questions?
For more information, please contact:
Centers for Disease Control and Prevention
1600 Clifton Road NE, Atlanta, GA 30333
Telephone: 1-800-CDC-INFO (232-4636) / TTY: 1-888-232-6348
E-mail: cdcinfo@cdc.gov
Web: http://www.cdc.gov
The findings and conclusions in this report are those of the authors and do not necessarily represent the official position of the Centers for Disease Control and Prevention.
Office of the Director