Fear and Loathing of Evaluation: Avoiding 7 Common “Potholes”
Thomas J. Chapel, MA, MBA
Chief Evaluation Officer, Centers for Disease Control and Prevention
Office of the Director
May 2016
Why We Evaluate…
“…The gods condemned Sisyphus to endlessly roll a rock up a hill, whence it would return each time to its starting place. They thought, with some reason, there was no punishment more severe than eternally futile labor…”
Albert Camus, The Myth of Sisyphus
Objectives
- Program evaluation and typical “potholes” on the road to good evaluation
- CDC’s Evaluation Framework
- How key Framework steps circumvent potholes and ensure the strongest program evaluation
Today’s Focus
Top Potholes on the Road to Good Evaluation
Pothole #7 Not understanding where evaluation “fits in” …
Defining Evaluation
Evaluation is the systematic investigation of the merit, worth, or significance of any “object.” (Michael Scriven)
A program is any organized public health action or activity implemented to achieve some result.
Integrating Processes to Achieve Continuous Quality Improvement
The Continuous Quality Improvement (CQI) cycle:
- Planning: What do we do? How do we do it? What actions will best reach our goals and objectives?
- Performance measurement: How are we doing?
- Evaluation: Why are we doing well or poorly?
“Research seeks to prove, evaluation seeks to improve…” (M.Q. Patton)
Pothole #6 Ignoring “process use” insights …
Process Use
When influence or program improvement comes not from the findings of an evaluation, but from insights gleaned during the tasks involved in doing the evaluation.
(Process Use in Theory, Research, and Practice. New Directions for Evaluation, No. 116, 2008)
Pothole #5 Making the “perfect” the enemy of the “good”
Every Little Bit Helps… “…The biggest mistake is doing nothing because you can only do a little…” Anonymous
Pothole #4 Evaluating only what you can “measure”…
Measuring the Right Thing… “…Sometimes, what counts can’t be counted. And what can be counted doesn’t count….” Albert Einstein
You Get What You Measure… “…In Poland in the 1970s, furniture factories were rewarded based on pounds of product shipped. As a result, today Poles have the world’s heaviest furniture…” (New York Times, 3/4/99)
Pothole #3 Neglecting intermediate outcomes….
Forgetting Intermediate Outcomes
Good evaluation broadens our focus:
- Not just: Did it work? (How many tomatoes did I get?)
- But also: Is it working? (Are planting, watering, and weeding taking place? Are there nematodes on the plants? Have the blossoms “set”?)
Finding Intermediate Outcomes
- What is the ultimate outcome I’m seeking?
- Who (besides me) needs to take action to achieve it?
- What action do they need to take?
Pothole #2 Confusing attribution and contribution…
…“to put a man on the moon”.
“Networked” Interventions
[Diagram: Agencies A through D each run their own programs (A-1…A-n, B-1, C-1…C-n, D-1…D-n); their outputs feed shared short-term and long-term outcomes, which roll up to a single system outcome.]
Pothole #1 Not asking: “Who (else) cares…”
Framework for Program Evaluation
Enter the CDC Evaluation Framework
- Good M&E = use of findings
- Focus is situation-specific
- Early steps key to best focus
Framework for Program Evaluation
The 4 Evaluation Standards help focus efforts at each step
The Four Standards
There is no one “right” evaluation. Instead, the best choice at each step is the option that maximizes:
- Utility: Who needs the information from this evaluation, and what information do they need?
- Feasibility: How much money, time, and effort can we put into this?
- Propriety: Who needs to be involved for the evaluation to be ethical?
- Accuracy: What design will lead to accurate information?
Pothole #1 Not asking: “Who (else) cares…”
Enter the CDC Evaluation Framework
Stakeholders and users are the key focus
Potholes #2 and #3
Confusing attribution and contribution…
Neglecting intermediate outcomes…
Enter the CDC Evaluation Framework
Program description guides evaluation and planning
You Don’t Ever Need a Logic Model, BUT You Always Need a Program Description
- The big “need” your program is to address
- The key target group(s) who need to take action
- The kinds of actions they need to take (your intended outcomes or objectives)
- Activities needed to meet those outcomes
- Resources to do the program
- Contextual factors that help or hurt
And here is our logic model….
Step 2: Describing the Program: The Complete Logic Model
Components: Inputs → Activities → Outputs → Short-term Effects/Outcomes → Intermediate Effects/Outcomes → Long-term Effects/Outcomes
Surrounding the model: Context, Assumptions, Stage of Development
Global Logic Model: Childhood Lead Poisoning Program
If we do… (early activities): Outreach; Screening; Identification of children with elevated blood lead levels (EBLL); Case management of EBLL kids
And we do… (later activities): Refer EBLL kids for medical treatment; Train family in in-home techniques; Assess environment of EBLL child; Refer environment for clean-up
Then… (early outcomes): EBLL kids get medical treatment; Family performs in-home techniques; Lead source identified; Environment gets cleaned up; Lead source removed
And then… (later outcomes): EBLL reduced; Developmental slide stopped; Quality of life improves
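A logic model like this can also live outside the slide. As a minimal sketch, here is the same chain captured as plain data in Python; the class and field names are illustrative assumptions, not part of any CDC tool, but walking the stages in order shows how the intermediate outcomes (Pothole #3) sit between activities and the long-term result:

```python
# Illustrative sketch only: the childhood lead poisoning logic model as
# plain data. Names are assumptions, not a CDC-specified structure.
from dataclasses import dataclass

@dataclass
class LogicModel:
    early_activities: list[str]   # "If we do..."
    later_activities: list[str]   # "And we do..."
    early_outcomes: list[str]     # "Then..."
    later_outcomes: list[str]     # "And then..."

lead_program = LogicModel(
    early_activities=[
        "Outreach",
        "Screening",
        "Identify children with elevated blood lead levels (EBLL)",
        "Case management of EBLL kids",
    ],
    later_activities=[
        "Refer EBLL kids for medical treatment",
        "Train family in in-home techniques",
        "Assess environment of EBLL child",
        "Refer environment for clean-up",
    ],
    early_outcomes=[
        "EBLL kids get medical treatment",
        "Family performs in-home techniques",
        "Lead source identified",
        "Environment cleaned up; lead source removed",
    ],
    later_outcomes=[
        "EBLL reduced",
        "Developmental slide stopped",
        "Quality of life improves",
    ],
)

# Printing stage by stage keeps the intermediate outcomes visible instead
# of jumping straight from activities to the long-term result.
for stage, items in vars(lead_program).items():
    print(stage.replace("_", " ").title())
    for item in items:
        print(f"  - {item}")
```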
Basic CPPW Logic Model
Potholes #4 and #5
Evaluating only what you can “measure”…
Making the “perfect” the enemy of the “good”
Enter the CDC Evaluation Framework
The 4 Evaluation Standards match “eval focus” and evidence to the situation
Potholes #6 and #7
Ignoring “process use” insights…
Not understanding where evaluation “fits in”…
Enter the CDC Evaluation Framework
Good M&E = use of findings
Integrating Processes to Achieve Continuous Quality Improvement
The Continuous Quality Improvement (CQI) cycle:
- Planning: What do we do? How do we do it? What actions will best reach our goals and objectives?
- Performance measurement: How are we doing?
- Evaluation: Why are we doing well or poorly?
Reducing Fear and Practicing Program Evaluation: Life Post-Session
Helpful Resources
Introduction to Program Evaluation for Public Health Programs: A Self-Study Guide (CDC)

Logic Model Sites
- Innovation Network
- W.K. Kellogg Foundation Evaluation Resources
- University of Wisconsin-Extension

Texts
- Rogers et al. Program Theory in Evaluation. New Directions series. Jossey-Bass, Fall 2000.
- Chen, H. Theory-Driven Evaluations. Sage, 1990.
Community Tool Box
Questions?

For more information, please contact:
Centers for Disease Control and Prevention
1600 Clifton Road NE, Atlanta, GA
Telephone: CDC-INFO / TTY
Web:

The findings and conclusions in this report are those of the authors and do not necessarily represent the official position of the Centers for Disease Control and Prevention.

Office of the Director