1
Measuring and Communicating the Impact of your Coalition and Programs
Introductions – who's from what kind of coalition?
Personal Goals – what's on your mind? What are your challenges?
Write down three things you want to evaluate in your community (programs, strategies, policies, the coalition itself).
Now let's imagine we're taking a family trip to Disney World. How do we go about doing that? What do we have to do or plan?
Brian Bumbarger, Coalition Leadership Institute, June 2010
2
If you don’t know where you’re going, any road will get you there!
The Cheshire Cat, Alice in Wonderland
3
It is not enough to be busy. So are the ants
It is not enough to be busy. So are the ants. The question is: What are we busy about? -- Henry David Thoreau
Here's another quote I like. So often, especially in prevention work, we confuse movement for progress.
4
Don’t mistake movement for progress.
They might feel the same, but just shaking your butt won't get you to Disney World. -- Brian Bumbarger
5
Select programs, policies and practices.
Create program-level outcomes. Plan to evaluate. Create a written Community Action Plan. Create community-level outcomes. Investigate potential programs, policies and practices.
Since this day of training is about community coalitions, it's appropriate for us to start at the big-picture level. This is bigger than just evaluating an individual program or policy or strategy. No matter what kind of coalition you're part of, the goal is to get to community-level, meaning population-level, improvement. A lot of the content I'm going to present today is from the CTC model, but it's applicable no matter which model you're using. So I'd like you to think about how your own coalition is approaching the issue of measuring its impact.
6
Evaluate (v.): to determine the significance, worth, or condition of, usually by careful appraisal and study.
Assess (v.): to determine the importance, size, or value of.
7
Simple pre-post comparison
[Diagram: Pre-test → Prevention Program → Post-test]
We can compare the difference between pre-test and post-test. But…
What if something else caused the change?
If the post-test isn't better, does that mean the program didn't work?
What would have happened in the absence of the program?
Are these kids representative of all kids?
8
Simple pre-post comparison
[Diagram: Pre-test → Prevention Program → Post-test]
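To make the slide's caveats concrete, here is a minimal, hypothetical sketch in Python of what a simple pre-post comparison actually computes. The scores are invented for illustration; the point is that a single group's change bundles the program's effect together with everything else that happened over the same period.

```python
# Hypothetical pre/post scores for one group of program participants
# (all numbers invented for illustration only).
pre_scores = [62, 58, 71, 65, 60, 68, 55, 70]   # e.g., a risk-behavior survey at intake
post_scores = [58, 55, 66, 64, 57, 66, 50, 65]  # the same youth after the program

def mean(values):
    return sum(values) / len(values)

change = mean(post_scores) - mean(pre_scores)
print(f"Average change, pre to post: {change:+.1f}")

# The slide's caveat: this one number mixes the program's effect with normal
# maturation, other events in the community, and measurement error. Without a
# comparison group we cannot say how much of the change the program produced.
```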
9
Randomized Controlled Trial
[Diagram: Random Selection, Randomization, Pre-test, Prevention Program, Post-test, Follow-up Data]
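By way of contrast, here is a hypothetical sketch (invented numbers again) of what randomization buys you: because the two groups are equivalent on average at the start, the control group's change stands in for what would have happened without the program, and the difference between the groups' changes is a cleaner estimate of the program's effect.

```python
# Hypothetical pre/post means for two randomly assigned groups
# (all numbers invented for illustration only).
program_pre, program_post = 64.0, 57.0   # group that received the prevention program
control_pre, control_post = 63.5, 61.5   # group that did not

program_change = program_post - program_pre   # change observed in the program group
control_change = control_post - control_pre   # change we'd expect with no program

# The program group's change beyond what the control group shows is the
# estimated program effect (a difference-in-differences).
estimated_effect = program_change - control_change
print(f"Program group change:     {program_change:+.1f}")
print(f"Control group change:     {control_change:+.1f}")
print(f"Estimated program effect: {estimated_effect:+.1f}")
```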
10
Select programs, policies and practices.
Create program-level outcomes. Plan to evaluate. Create a written Community Action Plan. Create community-level outcomes. Investigate potential programs, policies and practices.
11
Gives clear direction toward achieving community vision
Provides built-in evaluation measures and accountability
Required by many grant makers
13
Decrease in problem behaviors (10 to 15 years)
Decrease in risk factors / Increase in protective factors (3 to 10 years)
Changes in participant knowledge, attitudes, skills or behavior (1 to 4 years)
Program implementation fidelity (6 months to 2 years)
14
You have to have a clear sense of what it is you’re trying to accomplish, how you intend to accomplish it, and then how you will measure whether or not you have achieved what you set out to do.
15
So you really need a logic model at the individual program or strategy level, and a logic model at the community level. From these logic models, we can begin to identify community-level outcomes, then program and participant outcomes. From there we make decisions about what to measure, and then how to measure it.
16
The Community Board will learn how to develop outcome-focused prevention plans.
17
Participants will be able to:
Define outcome-focused planning. Write desired outcomes: behavior outcomes, risk-factor outcomes, protective-factor outcomes. Prepare for community evaluation.
18
Measuring changes
Measuring progress
Knowing what actions will take place
19
Change in what
As measured by
The baseline or starting point for comparison
By how much, by when
20
Priority risk-factor outcome to change
Indicator used to measure the outcome
Baseline data
How much change, by when
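Purely as an illustration of how these four elements fit together, here is a hypothetical sketch in Python of an outcome statement recorded as a small structured record. Every name and number below is invented; a real workgroup would substitute its own priority risk factor, survey indicator and targets.

```python
# A hypothetical risk-factor outcome statement broken into the four elements
# above. All values are invented for illustration only.
outcome_statement = {
    "outcome_to_change": "Favorable attitudes toward drug use among 8th graders",
    "indicator": "Percent of 8th graders reporting favorable attitudes on the youth survey",
    "baseline": {"year": 2010, "value_percent": 30},
    "target": {"change_to": "25 percent or lower", "by_year": 2014},
}

for element, value in outcome_statement.items():
    print(f"{element}: {value}")
```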
21
Identify targets for prevention-planning efforts.
Evaluate the impact of prevention efforts over time. Engage Key Leaders and community members in supporting prevention.
22
Know the data to be collected.
Know who is responsible for data collection. Know how outcomes will be reported. Determine what resources will be needed for future assessments.
24
Participants will learn how to develop participant and implementation outcomes in preparation for implementing and evaluating their selected programs, policies and practices.
25
Participants will be able to:
Develop participant outcomes. Develop implementation outcomes. Identify the elements of implementation.
26
• Participant outcomes measure the changes a program produces.
• Implementation outcomes measure the process by which a program produces desired changes.
27
Knowledge
Attitudes
Skills
Behavior
28
Who the program will be delivered by
When the program will be delivered, including how often and how long
Where the program will be delivered
How the program will be delivered
Number of people to be affected by the program
Who your target audience will be
30
Participants will be able to:
Identify the uses of evaluation. Explain basic evaluation concepts. Assess the need for an expert evaluator.
31
Evaluation helps you:
• monitor implementation fidelity
• monitor progress toward outcomes
• identify problems with program design or selection
• demonstrate achievements
• determine cost-effectiveness.
This is a simple but often overlooked question: why are we evaluating? It defines our audience and ultimately what kind of evaluation we do. Are we simply checking off boxes for our funder, or are we really interested in measuring our impact and improving our prevention activities?
32
What do you want to evaluate? What is the purpose of your evaluation?
Do an exercise here
33
• Community Board members
• Key Leaders
• Program implementers and site administrators
• Media
• Local interest groups
• Other community members
• Sources of funding
34
Monitor implementation fidelity.
Identify and correct implementation problems before they result in program failure.
35
Records
Activity logs
Observation
Questionnaires
36
Measure the program’s impact on participant knowledge, skills, attitudes or behavior.
Monitor progress toward the community’s vision.
37
Observation
Interviews
Questionnaires
Records
38
An expert evaluator can help:
select appropriate evaluation design
develop evaluation measures
analyze statistical information
interpret results.
39
The program requires significant technical assistance
Evaluation tools will need to be developed.
Data collection will be expensive and complex.
Complex data analysis will be required.
The Board lacks the expertise to complete the evaluation.
40
Local universities
Research and consulting firms
41
The nature of the evaluation
Who will be doing the evaluation
Resources needed for the evaluation.
BREAK HERE
42
Community Plan Implementation Training
43
To help ensure that programs, policies and practices in the Community Action Plan are implemented with fidelity.
44
Participants will be able to:
Explain the elements of successful, high-fidelity implementation. Develop implementation plans. Identify methods of monitoring implementation for fidelity.
45
High-fidelity implementation means that:
the target audience is reached
all components of the program are delivered according to the original design
the program is delivered at the intended dosage (duration, frequency).
46
Studies have found that:
the effectiveness of any program is significantly increased when the program is implemented with fidelity
some programs are only effective when implemented with a high degree of fidelity. (Blueprints, 2001)
This is a key role for the coalition – having the program implementers check back regularly and give updates on implementation.
47
Support of site administrators
Qualified implementers who support the program
Planning meetings and trainings
A detailed implementation plan (who, what, where, when and how)
Ongoing monitoring and technical assistance
48
Select implementers who:
have the credentials prescribed by the program
support the program approach
are able to reach/communicate with target audience.
49
Involve implementers in the planning process.
Provide implementers with clear information about the program. Train implementers to deliver the program. Consider a written participation agreement.
50
To educate stakeholders and build support for the program
To negotiate implementation logistics
To develop detailed implementation plans
To obtain signed participation agreements
51
Implementers should leave trainings with:
a clear understanding of the program goals, rationale, desired outcomes, components and methods
confidence that they have the skills and expertise to deliver the program effectively.
52
Include: target audience and expected number of participants
plans for recruiting participants
who the implementers are
schedule for implementing all components.
54
To identify and correct implementation problems.
To ensure that the program is implemented with fidelity to the original design. To identify and correct implementation problems. To fulfill requirements of funders. To provide “lessons learned” for future implementation efforts. To identify and celebrate early successes.
55
The target audience is not participating in great enough numbers.
The program components are not being delivered at the intended dosage. Program delivery methods are not being used.
56
Monitor: planning meetings before implementation
training of implementers
program delivery.
57
Who attended the trainings? What was covered at the trainings?
Did implementers leave trainings with the skills, confidence and motivation to deliver the program effectively?
58
How were participants recruited?
How many members of the target audience participated? Was the program delivered at the correct dosage? Did implementers use methods prescribed in the program design? Did implementers communicate effectively with the target audience?
60
Define what will be evaluated. Select an evaluation design.
Engage stakeholders. Define what will be evaluated. Select an evaluation design. Decide how you will collect and measure data. Gather, analyze and interpret data. Report your findings and use the results.
61
Use the instrument designed for the program.
If no instrument is available, seek help from an expert evaluator.
62
Consider:
commitment to the project and to the evaluation
objectivity
ability to collect complete and accurate data.
63
Determine:
who will oversee the evaluation
a central storage place for data from each program
evaluation deadlines
decision-making processes.
64
Choose an appropriate format. Write for your most important audience.
Discuss results in terms of the overall program goals. Be honest.
65
Identify individuals with:
skills in designing evaluation instruments
skills in designing and conducting evaluation studies
skills in data analysis and presentation
the ability to objectively assess outcome results and recommend possible solutions
the ability to credibly communicate results to key stakeholders.
66
Write a report detailing the results.
Celebrate success. Write a report detailing the results. Fulfill accountability requirements. Identify causes of unmet expectations. Revise and update the Community Action Plan.
67
Weak connection between the program and outcomes
Unrealistic outcomes
Failure to implement a program with fidelity
Participant-related reasons
External factors
Measurement problems
Decreases in sample size (attrition)
68
Identify changing priorities.
Celebrate success. Identify changing priorities. Select new tested, effective prevention strategies to address changing priorities. Assist resource allocation decisions. Evaluate policies and practices.
69
If you want to build a ship, don't drum up the men to gather wood, divide the work, and give orders. Instead, teach them to yearn for the vast and endless sea. -- Antoine de Saint-Exupéry
70
Thank You! Brian K. Bumbarger
Evidence-based Prevention and Intervention Support Center
Prevention Research Center, Penn State University
206 Towers Bldg., University Park, PA
(814)
71
Outcome-Based Planning Logic Model
Vision (10 – 15 years): What kind of future do you envision for your community's children? (Vision Statement)
Reduction in problem behavior ( years): What changes in ATOD use, delinquency, violence, teen pregnancy, dropout will be necessary to achieve the vision? (Problem Behaviors)
Decrease in priority risk factors (1-4 years): What changes in risk factors are needed to reduce problem behaviors? (Risk Factor Prevalence)
Increase in Protective Factors (1-4 years): What changes in protective factors are needed to reduce problem behaviors? (Protective Factor Prevalence)
Programs (6 mos. - 2 years): How will your programs & services change risk & protective factors? (Program Outcomes)