Using the CDC Evaluation Framework to Avoid Minefields on the Road to Good Evaluation
Presented to: 2002 National Asthma Conference, October 24, 2002
By: Thomas J. Chapel, MA, MBA, Office of the Director, Office of Program Planning and Evaluation, Centers for Disease Control and Prevention
Why We Evaluate…
"The gods condemned Sisyphus to endlessly roll a rock up a hill, whence it would return each time to its starting place. They thought, with some reason, that there was no punishment more severe than eternally futile labor…" – The Myth of Sisyphus (quoted in the MMWR Framework for Program Evaluation in Public Health)
Defining Evaluation
Evaluation is… the systematic investigation of the merit, worth, or significance of an object. – Michael Scriven
Program is… any organized public health action/activity.
Research vs. Program Evaluation
A continuum, not a dichotomy, but at the far ends they may differ in:
- Framework and steps
- Decision making
- Standards
- Key questions
- Design
- Data collection sources and measures
- Analysis timing and scope
- Role of values in making judgments
- Centrality of attribution as a conclusion
- Audiences for dissemination of results
The Continuum
- Efficacy… does my effort work in ideal circumstances?
- Effectiveness… does my effort work in real-world settings, and work the same way across settings?
- Implementation fidelity… is my (efficacious and effective) effort being implemented as intended?
Today's Focus
Top Minefields on the Road to Conducting Good Evaluation!
Minefield #8: Not linking planning and evaluation…
Minefield #7: Evaluating only what you can measure…
You Get What You Measure…
"In Poland in the 1970s, furniture factories were rewarded based on pounds of product shipped. As a result, today Poles have the world's heaviest furniture." (New York Times, 3/4/99)
Minefield #6: Thinking evaluatively only at the end…
When to Evaluate…
Good program evaluation shifts our focus from "Did it (my effort) work?" to "Is it (my effort) working?"
Minefield #5: Not asking who (else) cares…
Minefield #4: Neglecting intermediate outcomes…
Forgetting Intermediate Outcomes
Minefield #3: Neglecting process evaluation…
Minefield #2: Confusing attribution and contribution…
Networked Interventions
Minefield #1: Using more sticks than carrots…
Framework for Program Evaluation
Standards for Effective Evaluation
The standards do not tell you HOW TO do an evaluation; they help direct choices among options at each step. At each step, the standards ask which choice(s):
- Utility (7 standards): best serve the information needs of intended users
- Feasibility (3 standards): are most realistic, prudent, diplomatic, and frugal given the resources
- Propriety (8 standards): best meet law, ethics, and due regard for the welfare of those involved and affected
- Accuracy (12 standards): best reveal and convey technically accurate information
Broadening Our Thinking About Evaluation
- What to evaluate
- When to evaluate
- Who should be involved in evaluation
- How to evaluate
Who Should Evaluate?
Why Involve Stakeholders
Smoke out disagreements in…
- Definition of the problem
- Activities and priorities of the program
- Outcomes that equate to success
- What constitutes proof of success
Get their help with…
- Credibility of findings
- Access to key players
- Follow-up
- Dissemination of results
Using Logic Models for Evaluation
Clarity on:
- What the activities are
- What the intended effects are
- What the sequence/order of intended effects is
- Which activities are to produce which effects
Consensus with stakeholders on all of the above
Focus the evaluation design
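To make the linkage idea concrete, here is a minimal sketch (not part of the original presentation) of a logic model captured as a simple data structure, so each activity is explicitly tied to its intended effects in causal order. The activities and outcomes shown are hypothetical examples for a school-based asthma program, invented purely for illustration.

```python
# Illustrative sketch only: a logic model as a plain data structure.
# Activity names, outputs, and outcomes are hypothetical examples,
# not drawn from the presentation or any CDC program.

logic_model = {
    "train_school_nurses": {
        "outputs": ["nurses trained in asthma management"],
        "intermediate_outcomes": ["more asthma action plans on file"],
        "long_term_outcomes": ["fewer asthma-related absences"],
    },
    "distribute_home_trigger_checklists": {
        "outputs": ["checklists distributed to families"],
        "intermediate_outcomes": ["families remediate home triggers"],
        "long_term_outcomes": ["fewer asthma-related ED visits"],
    },
}

def effects_for(activity: str) -> list[str]:
    """Return every intended effect linked to one activity, in causal order."""
    entry = logic_model[activity]
    return entry["outputs"] + entry["intermediate_outcomes"] + entry["long_term_outcomes"]

if __name__ == "__main__":
    for activity in logic_model:
        print(activity, "->", effects_for(activity))
```

Even a toy structure like this makes it easy to check, with stakeholders, that every activity points to some intended effect and that no claimed outcome is left without an activity that plausibly produces it.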
Some Factors That Influence Choice of Evaluation Focus
- Users and uses – Who wants the information, and what are they interested in?
- Accountability to (other) stakeholders – For what effects are key stakeholders expecting to see results?
- Resources – Time, money, expertise
- Stage of development – How long has the program been in existence?
- Ripple effect – How far out would an intervention of this intensity reasonably be expected to have an effect?
Setting Evaluation Focus: Some Process Issues
What are the likely key challenges to implementation fidelity?
"Dropped baton" issues are key:
- A partner failed to do their part
- A client/family/patient failed to fulfill their referral
Other common challenges:
- Inadequate dosage
- Poor access
- Failure to retain participants
- Wrong match of staff and participant
Evidence Gathering: Choosing a Design
- What intervention was actually delivered?
- Were impacts and outcomes achieved?
- Was the intervention responsible for the impacts and outcomes?
Justifying Claims About Intervention Effectiveness
- Performance vs. a comparison/control group
- Time sequence
- Plausible mechanisms (or pathways toward change)
- Accounting for alternative explanations
- Similar effects observed in similar contexts
Choosing Data Collection Methods
A function of:
- Time
- Cost
- Sensitivity of the issue
- Hawthorne effect
- Ethics
- Validity
- Reliability
Maximizing Use of Results: Key Questions
- Who is the audience?
- What will be of greatest importance to them?
- How will they use the information provided?
- How much time will they be willing to spend reading and assimilating the material?
- What type of vocabulary will express the information most clearly?
Some CDC Asthma Examples
- Comprehensive School-Based Asthma Project
- Controlling Asthma in American Cities (CAAP) Project
Helpful Publications at www.cdc.gov/eval
Community Tool Box: http://ctb.ku.edu/