Measuring and Communicating the Impact of Your Coalition and Programs


Measuring and Communicating the Impact of Your Coalition and Programs
Introductions – who's from what kind of coalition? Personal goals – what's on your mind? What are your challenges? Write down three things you want to evaluate in your community (programs, strategies, policies, the coalition itself). Now let's imagine we're taking a family trip to Disney World. How do we go about doing that? What do we have to do or plan?
Brian Bumbarger, Coalition Leadership Institute, June 2010

If you don’t know where you’re going, any road will get you there! The Cheshire Cat, Alice in Wonderland

It is not enough to be busy. So are the ants. The question is: What are we busy about? Henry David Thoreau
Here's another quote I like. So often in prevention work especially, we confuse movement for progress.

Don’t mistake movement for progress. They might feel the same, but just shaking your butt won’t get you to Disney World. Brian Bumbarger

Create community-level outcomes. Investigate potential programs, policies and practices. Select programs, policies and practices. Create program-level outcomes. Plan to evaluate. Create a written Community Action Plan.
Since this day of training is about community coalitions, it's appropriate for us to start at the big-picture level. This is bigger than just evaluating an individual program or policy or strategy. No matter what kind of coalition you're part of, the goal is to get to community-level, meaning population-level, improvement. A lot of the content I'm going to present today is from the CTC model, but it's applicable no matter which model you're using. So I'd like you to think about how your own coalition is approaching the issue of measuring its impact.

Evaluate (v.): to determine the significance, worth, or condition of, usually by careful appraisal and study
Assess (v.): to determine the importance, size, or value of

Simple pre-post comparison: participants take a pre-test, receive the prevention program, then take a post-test.
We can compare the difference between pre-test and post-test. But…
What if something else caused the change?
If the post-test isn't better, does that mean the program didn't work?
What would have happened in the absence of the program?
Are these kids representative of all kids?
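The limitation above can be made concrete with a short sketch; every number here is hypothetical:

```python
# Hypothetical pre-test and post-test survey scores (higher = better);
# all numbers are made up for illustration.
pre_scores = [62, 55, 70, 58, 64, 61]
post_scores = [68, 60, 71, 63, 70, 66]

def mean(xs):
    return sum(xs) / len(xs)

avg_change = mean(post_scores) - mean(pre_scores)
print(f"Average change: {avg_change:+.1f} points")
# Caveat: with no comparison group there is no counterfactual, so
# maturation, other community events, or measurement drift could
# explain this change just as well as the program.
```

The arithmetic is trivial; the hard part is the inference, which this design by itself cannot support.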


Randomized Controlled Trial
[Diagram: participants are randomly selected from the population, then randomized into two groups. Both groups take a pre-test and a post-test, with follow-up data collected afterward; only the treatment group receives the prevention program.]
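A small simulation shows why the control group matters; the background trend and program effect below are made-up numbers for illustration:

```python
import random

# Hypothetical simulation: everyone improves by a background trend of 3
# points, and the program adds a true effect of 5 points.
random.seed(42)

def post_score(pre, treated):
    trend = 3                     # change everyone experiences anyway
    effect = 5 if treated else 0  # true program effect
    return pre + trend + effect + random.gauss(0, 2)

pre = [random.gauss(60, 8) for _ in range(200)]
treat_pre, control_pre = pre[:100], pre[100:]

treat_post = [post_score(p, True) for p in treat_pre]
control_post = [post_score(p, False) for p in control_pre]

def mean(xs):
    return sum(xs) / len(xs)

# A naive pre-post comparison mixes the trend with the program effect...
naive_change = mean(treat_post) - mean(treat_pre)
# ...while differencing against the control group isolates the effect.
adjusted = naive_change - (mean(control_post) - mean(control_pre))
print(f"Pre-post change in treated group: {naive_change:.1f}")
print(f"Change relative to control group: {adjusted:.1f}")
```

Randomization is what licenses the subtraction: it makes the control group a fair stand-in for what the treated group would have experienced without the program.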


Gives clear direction toward achieving community vision Provides built-in evaluation measures and accountability Required by many grant makers

Vision (10 to 15 years)
Decrease in problem behaviors (3 to 10 years)
Increase in protective factors (1 to 4 years)
Decrease in risk factors (1 to 4 years)
Changes in participant knowledge, attitudes, skills or behavior; program implementation fidelity (6 months to 2 years)

You have to have a clear sense of what it is you’re trying to accomplish, how you intend to accomplish it, and then how you will measure whether or not you have achieved what you set out to do.

So you really need a logic model at the individual program or strategy level, and a logic model at the community level. From these logic models, we can begin to identify community-level outcomes, then program and participant outcomes. From there we make decisions about what to measure, and then how to measure it.

The Community Board will learn how to develop outcome-focused prevention plans.

Participants will be able to:
• Define outcome-focused planning.
• Write desired outcomes: behavior outcomes, risk-factor outcomes, protective-factor outcomes.
• Prepare for community evaluation.

Measuring changes Measuring progress Knowing what actions will take place

Change in what As measured by The baseline or starting point for comparison By how much, by when

Priority risk-factor outcome to change Indicator used to measure the outcome Baseline data How much change, by when
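As an illustration, the four elements above might be captured and tracked like this; the outcome, indicator, and every number are hypothetical:

```python
# Hypothetical risk-factor outcome following the four-part template:
# outcome to change, indicator, baseline data, and target ("how much
# change, by when"). Every number below is made up.
outcome = {
    "outcome": "Decrease past-30-day youth alcohol use",
    "indicator": "% of 10th graders reporting use on the school survey",
    "baseline": 22.0,  # percent, baseline survey year
    "target": 17.0,    # percent, four years later
}

def progress(current, baseline, target):
    """Fraction of the planned baseline-to-target change achieved so far."""
    return (baseline - current) / (baseline - target)

# Suppose a mid-course survey shows 19.5%:
pct_done = progress(19.5, outcome["baseline"], outcome["target"])
print(f"{pct_done:.0%} of the targeted change achieved")
```

Writing the outcome down this concretely is what makes the later evaluation steps (baseline comparison, reporting progress to Key Leaders) mechanical rather than arguable.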

Identify targets for prevention-planning efforts. Evaluate the impact of prevention efforts over time. Engage Key Leaders and community members in supporting prevention.

Know the data to be collected. Know who is responsible for data collection. Know how outcomes will be reported. Determine what resources will be needed for future assessments.

Participants will learn how to develop participant and implementation outcomes in preparation for implementing and evaluating their selected programs, policies and practices.

Participants will be able to: Develop participant outcomes. Develop implementation outcomes. Identify the elements of implementation.

• Participant outcomes measure the changes a program produces. • Implementation outcomes measure the process by which a program produces desired changes.

Knowledge Attitudes Skills Behavior.

• Who the program will be delivered by
• When the program will be delivered, including how often and how long
• Where the program will be delivered
• How the program will be delivered
• Number of people to be affected by the program
• Who your target audience will be

Participants will be able to: Identify the uses of evaluation. Explain basic evaluation concepts. Assess the need for an expert evaluator.

Evaluation helps you: • monitor implementation fidelity • monitor progress toward outcomes • identify problems with program design or selection • demonstrate achievements • determine cost-effectiveness. This is a simple but often overlooked question. Why are we evaluating? It defines our audience and ultimately what kind of evaluation we do. Are we simply checking off boxes for our funder or are we really interested in measuring our impact and improving our prevention activities?

What do you want to evaluate? What is the purpose of your evaluation? Do an exercise here

• Community Board members • Key Leaders • Program implementers and site administrators • Media • Local interest groups • Other community members • Sources of funding

Monitor implementation fidelity. Identify and correct implementation problems before they result in program failure.

Records Activity logs Observation Questionnaires

Measure the program’s impact on participant knowledge, skills, attitudes or behavior. Monitor progress toward the community’s vision.

Observation Interviews Questionnaires Records

An expert evaluator can help: select an appropriate evaluation design; develop evaluation measures; analyze statistical information; interpret results.

The program requires significant technical assistance Evaluation tools will need to be developed Data collection will be expensive and complex Complex data analysis will be required The Board lacks the expertise to complete the evaluation.

Local universities Research and consulting firms

The nature of the evaluation Who will be doing the evaluation Resources needed for the evaluation. BREAK HERE

Community Plan Implementation Training 3-42

To help ensure that programs, policies and practices in the Community Action Plan are implemented with fidelity.

Participants will be able to: Explain the elements of successful, high-fidelity implementation. Develop implementation plans. Identify methods of monitoring implementation for fidelity.

High-fidelity implementation means that: the target audience is reached; all components of the program are delivered according to the original design; the program is delivered at the intended dosage (duration, frequency).
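A sketch of how those three criteria might be checked for one program cycle; the plan numbers and the 90% threshold are illustrative assumptions, not part of any curriculum:

```python
# Hypothetical fidelity check against the three criteria above: reach,
# components delivered, and dosage.
plan = {"target_reach": 60, "components": 12, "sessions": 10}
actual = {"reach": 48, "components_delivered": 12, "sessions_delivered": 8}

checks = {
    "reach": actual["reach"] / plan["target_reach"],
    "components": actual["components_delivered"] / plan["components"],
    "dosage": actual["sessions_delivered"] / plan["sessions"],
}

for name, ratio in checks.items():
    status = "OK" if ratio >= 0.9 else "NEEDS ATTENTION"  # threshold is an assumption
    print(f"{name}: {ratio:.0%} {status}")
```

Even a back-of-the-envelope check like this, run each cycle, surfaces drift early enough to correct it.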

Studies have found that: the effectiveness of any program is significantly increased when the program is implemented with fidelity; some programs are only effective when implemented with a high degree of fidelity (Blueprints, 2001). This is a key role for the coalition – having the program implementers check back regularly and give updates on implementation.

Support of site administrators Qualified implementers who support the program Planning meetings and trainings A detailed implementation plan (who, what, where, when and how) Ongoing monitoring and technical assistance

Select implementers who: have the credentials prescribed by the program; support the program approach; are able to reach/communicate with the target audience.

Involve implementers in the planning process. Provide implementers with clear information about the program. Train implementers to deliver the program. Consider a written participation agreement.

To educate stakeholders and build support for the program To negotiate implementation logistics To develop detailed implementation plans To obtain signed participation agreements

Implementers should leave trainings with: a clear understanding of the program goals, rationale, desired outcomes, components and methods; confidence that they have the skills and expertise to deliver the program effectively.

Include: target audience and expected number of participants; plans for recruiting participants; who the implementers are; a schedule for implementing all components.


To ensure that the program is implemented with fidelity to the original design. To identify and correct implementation problems. To fulfill requirements of funders. To provide “lessons learned” for future implementation efforts. To identify and celebrate early successes.

The target audience is not participating in great enough numbers. The program components are not being delivered at the intended dosage. Program delivery methods are not being used.

Monitor: planning meetings before implementation; training of implementers; program delivery.

Who attended the trainings? What was covered at the trainings? Did implementers leave trainings with the skills, confidence and motivation to deliver the program effectively?

How were participants recruited? How many members of the target audience participated? Was the program delivered at the correct dosage? Did implementers use methods prescribed in the program design? Did implementers communicate effectively with the target audience?


Engage stakeholders. Define what will be evaluated. Select an evaluation design. Decide how you will collect and measure data. Gather, analyze and interpret data. Report your findings and use the results.

Use the instrument designed for the program. If no instrument is available, seek help from an expert evaluator.

Consider: commitment to the project and to the evaluation; objectivity; ability to collect complete and accurate data.

Determine: who will oversee the evaluation; a central storage place for data from each program; evaluation deadlines; decision-making processes.

Choose an appropriate format. Write for your most important audience. Discuss results in terms of the overall program goals. Be honest.

Identify individuals with: skills in designing evaluation instruments; skills in designing and conducting evaluation studies; skills in data analysis and presentation; the ability to objectively assess outcome results and recommend possible solutions; the ability to credibly communicate results to key stakeholders.

Write a report detailing the results. Celebrate success. Fulfill accountability requirements. Identify causes of unmet expectations. Revise and update the Community Action Plan.

Weak connection between the program and outcomes Unrealistic outcomes Failure to implement a program with fidelity Participant-related reasons External factors Measurement problems Decreases in sample size (attrition)
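One of those reasons, attrition, is easy to underestimate. This small hypothetical shows how dropouts alone can manufacture an apparent gain:

```python
# Hypothetical scores illustrating attrition bias: no completer's score
# changes at all, but the two lowest scorers drop out before the
# post-test, so the completers' average rises anyway.
pre = [40, 45, 50, 55, 60, 65, 70, 75]   # all 8 participants at pre-test
post = [50, 55, 60, 65, 70, 75]          # 6 completers; 40 and 45 dropped out

def mean(xs):
    return sum(xs) / len(xs)

print(f"Pre-test mean (n={len(pre)}): {mean(pre)}")
print(f"Post-test mean (n={len(post)}): {mean(post)}")
```

This is why evaluation reports should state sample sizes at each wave, not just the means.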

Celebrate success. Identify changing priorities. Select new tested, effective prevention strategies to address changing priorities. Assist resource allocation decisions. Evaluate policies and practices.

If you want to build a ship, don't drum up the men to gather wood, divide the work, and give orders. Instead, teach them to yearn for the vast and endless sea. -- Antoine de Saint-Exupéry

Thank You!
Brian K. Bumbarger
Evidence-based Prevention and Intervention Support Center
Prevention Research Center, Penn State University
206 Towers Bldg., University Park, PA 16802
(814) 865-2618
www.prevention.psu.edu

Outcome-Based Planning Logic Model
• Vision (10 to 15 years): What kind of future do you envision for your community's children? Measured by: vision statement.
• Reduction in problem behavior (3 to 10 years): What changes in ATOD use, delinquency, violence, teen pregnancy, and dropout will be necessary to achieve the vision? Measured by: problem behaviors.
• Increase in protective factors (1 to 4 years): What changes in protective factors are needed to reduce problem behaviors? Measured by: protective factor prevalence.
• Decrease in priority risk factors (1 to 4 years): What changes in risk factors are needed to reduce problem behaviors? Measured by: risk factor prevalence.
• Programs (6 months to 2 years): How will your programs and services change risk and protective factors? Measured by: program outcomes.