Measuring Our Success
Kristen Sanderson, MPH, CHES
Program Coordinator, Safe Kids Georgia

Objectives
- Describe our quality improvement process
- Define metrics and what to measure
- Improve the data collection process
- Review challenges and successes
- Discuss program evaluation strategies
- Describe our assessment process
- Outline next steps

Our Objective
Define a set of meaningful metrics that reflect our progress in reducing the incidence and severity of child injuries and deaths in the six focus areas.

Challenges
- Measuring behavior change
- Many factors influence safety and the incidence of injury, e.g., laws, law enforcement, community development, economics, and cultural norms
- Measuring not just one program, but many programs and overall effectiveness

QUALITY IMPROVEMENT AND PROCESS EVALUATION

Monitoring vs. Evaluation
Before we could begin measuring our impact (evaluation), we needed to collect more accurate data on our activities and programs (monitoring).

Monitoring
- Monitors the implementation process
- Ongoing measurement of performance
- Regular tracking of resources, activities, and outputs
- Asks: are things working well?

Evaluation
- Assesses specific program outcomes and impact
- Asks: what is the result (e.g., behavior change)?

First Step
We analyzed our current method of coalition reporting.

Pros:
- Collected program activity information, including number of events by focus area, number of materials and equipment items distributed, and number of people reached
- High response rate

Cons:
- Annual reporting produced inaccurate, untimely data
- Some questions were unclear, leading to different interpretations and unreliable information

Conclusion: our current methods were not accurate, timely, or effective.

Our Tasks
1. Improve the quality of program data
   - Establish metrics
   - Revise reporting forms
   - Improve the data collection process
2. Assess our activity and impact
   - Quarterly and annual reports (monitoring)
   - Periodically assess program impact and identify areas for improvement

Establishing Our Metrics
Determine what data we want to collect and can collect. Questions we asked ourselves:
- What do we want to measure?
- What data do we want to track over time?
- What can we feasibly collect from our coordinators?
- What information will be useful for the coordinators?

What did we decide to measure and track? (A sample record layout is sketched after this list.)
- Injury statistics by county
- County demographics
- Zip code (location) of events
- Event details

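To make the chosen metrics concrete, here is a minimal sketch of what a single reported event record covering these fields might look like. The field names are hypothetical illustrations, not taken from the actual Safe Kids Georgia reporting form:

    from dataclasses import dataclass
    from datetime import date

    # Hypothetical layout for one reported event; field names are
    # illustrative, not the actual Safe Kids Georgia form fields.
    @dataclass
    class EventRecord:
        event_date: date          # when the event took place
        county: str               # links the event to county injury statistics
        zip_code: str             # location of the event
        focus_area: str           # one of the six injury focus areas
        people_reached: int       # audience or attendance count
        items_distributed: int    # materials or safety equipment handed out
        notes: str = ""           # free-text event details

    # Example: one car seat check event reported by a coordinator
    record = EventRecord(
        event_date=date(2012, 6, 15),
        county="Fulton",
        zip_code="30303",
        focus_area="Child Passenger Safety",
        people_reached=40,
        items_distributed=12,
    )
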
Improving Our Data Collection Process
Goal 1: Collect data as close to the time of the event or program activity as possible
- Improves the accuracy and reliability of the reported data
- Provides more timely feedback for program adjustments

Goal 2: Make data collection as easy and fast as possible
- Reduces the number of non-reporting coalitions
- Recognizes the diversity of coalition members' roles (e.g., part-time vs. full-time)

Solution: move from annual reporting to quarterly reporting.

Quarterly Report
[Screenshots of the quarterly report form, shown over two slides]

Quarterly Report - Inventory
[Screenshot of the inventory section of the form]

Challenges
Change
- A history of no repercussions for not reporting
- Estimating numbers
- No incentives or disincentives
- Varying skill levels and knowledge of Excel

Buy-in
- The benefits of reporting were never communicated
- Reporting is time intensive

Overcoming Challenges & Getting Buy-In
- Discussions at meetings and one-on-one site visits
- Technical assistance and feedback
- Follow-up: reminders and phone calls

Answering "What do I get out of it?"
- Accurate numbers for the lead agency, funders, and sponsors
- Quarterly and annual reports showcasing those numbers
- Not having to remember all activity at the end of the year
- Reporting needed to keep 501(c)(3) status
- A user-friendly report

Quarterly Report
- Pilot testing
- Full dissemination in the 2nd quarter of 2012
- Lots of feedback, many revisions!
- Group revisions
- Updated versions released only quarterly
- Final changes made for 2013
- The process took over a year

Roadblocks
Coalitions had multiple reports to complete
- CPAT
- Annual Survey
- Funder requests
- SKW grants

Technical difficulties
- Many issues with the new form required lots of revisions

Balancing multiple stakeholders' expectations
- Board committees
- Lead agencies
- Coordinators

Early Successes
- Increases in the number of coalitions reporting each quarter
- More accurate and timely data
- A standardized process

Recommendations
1. Involve coordinators and all other stakeholders throughout the process
   - What does everyone want to get out of this? What are the benefits for everyone? What are the challenges?
   - Talk with coordinators who report and who don't report: why aren't they reporting?
2. Provide training or a meeting before disseminating a new tool
   - Training and explanation beforehand will minimize inaccuracies and misinterpretation

Recommendations (continued)
3. Obtain buy-in from coordinators early on
   - Talk with them one-on-one about the benefits for them and how to help them
   - Know what other reports they already have
4. Pilot test and revise
5. Expect a slow process: patience is needed!

MEASURING OUTCOMES AND ASSESSING OUR IMPACT

Program Evaluation: Initial Focus on Two Areas
Child Passenger Safety
- Our largest program
- Has the most evidence supporting risk mitigation approaches
- Questions to answer:
  - Do we need to do more, and how much more? (unserved population)
  - Can we show a reasonable difference in impact between SK and non-SK coalitions? Behavior changes?

Poisoning
- Incidence has recently trended upward
- Least evaluated area, least implemented program
- Questions to answer: What needs to be done? What works?

Poisoning Prevention
Program development
- Develop the program based on a literature review, existing evidence-based programs, and coordinator interviews
- Develop capacity
- Provide resources to coordinators
- Create sustainability

Evaluation (a scoring sketch follows this list)
- Evaluation of instructor training
- Number of trained Poison Prevention Instructors (capacity)
- Number of poison prevention educational events (increase in education)
- Pre- and post-test forms (change in knowledge and behavioral intentions)
- 3-month follow-up (behavior change)

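As one way to summarize the pre- and post-test forms, here is a minimal sketch that scores knowledge change. It assumes each test is scored as a count of correct answers; the data format is hypothetical, not the actual evaluation form:

    # Sketch of pre/post knowledge-change scoring; the input format is
    # an assumption, not the actual Safe Kids Georgia form layout.
    def summarize_knowledge_change(results):
        # results: list of (pre_score, post_score) tuples, one per participant
        n = len(results)
        mean_pre = sum(pre for pre, _ in results) / n
        mean_post = sum(post for _, post in results) / n
        improved = sum(1 for pre, post in results if post > pre)
        return {
            "participants": n,
            "mean_pre": mean_pre,
            "mean_post": mean_post,
            "pct_improved": 100 * improved / n,
        }

    # Example: five participants from one poison prevention event
    print(summarize_knowledge_change([(6, 9), (7, 8), (5, 9), (8, 8), (4, 7)]))
    # {'participants': 5, 'mean_pre': 6.0, 'mean_post': 8.2, 'pct_improved': 80.0}
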
Child Passenger Safety
- Number of CPSTs in coalition counties (capacity)
- Training (capability)
  - Number of certification classes
  - Number of recertification classes
- Car seat checklist forms
  - Monitoring misuse over time (see the sketch after this list)

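A minimal sketch of how misuse could be tracked over time from the checklist forms, assuming each form is reduced to a reporting quarter and a flag for whether any misuse was found; this reduced input format is hypothetical:

    from collections import defaultdict

    # Sketch: quarterly car seat misuse rates from checklist forms.
    # Reducing each form to (quarter, misuse_found) is an assumption,
    # not the actual checklist layout.
    def misuse_rate_by_quarter(forms):
        checks = defaultdict(int)
        misuses = defaultdict(int)
        for quarter, misuse_found in forms:
            checks[quarter] += 1
            if misuse_found:
                misuses[quarter] += 1
        return {q: round(100 * misuses[q] / checks[q], 1) for q in sorted(checks)}

    forms = [("2012-Q2", True), ("2012-Q2", True), ("2012-Q2", False),
             ("2012-Q3", True), ("2012-Q3", False), ("2012-Q3", False)]
    print(misuse_rate_by_quarter(forms))  # {'2012-Q2': 66.7, '2012-Q3': 33.3}
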
Child Passenger Safety
Follow-up evaluation pilot project
- 3-month follow-up after car seat checks
- Demonstrate knowledge retention and behavior change

Partnership with the Georgia Department of Public Health
- Impact of the booster seat legislation (July 2011)

Challenges
- Lots of data from multiple sources
- Narrowing the information
- Linking injury data to our activities

Assessment
Injury and Death Statistics
- For all six focus areas
- Compare to the prior 2-3 years (a comparison sketch follows this slide)

Program Statistics
- Activity levels by program and by coalition
- Location of activities
- Coalition-building activities and interaction with other agencies
- Program funding and grants
- Staff resources devoted per program

Program Evaluation & Recommendations
- Program-specific outcome evaluation
- Summary of how SKG "moved the needle"
- Narrative success stories and lessons learned
- Coalition activity and assessment of effectiveness (paid vs. volunteer)
- Changes to programs and coalition building
- Additional resources needed (staff and funding)

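One simple form the year-over-year comparison could take is current-year injury counts against the average of the prior years, per focus area. All numbers below are invented for illustration:

    # Sketch: percent change in injury counts vs. the average of the
    # prior years; all figures here are made up for illustration.
    def pct_change(current, prior_years):
        baseline = sum(prior_years) / len(prior_years)
        return round(100 * (current - baseline) / baseline, 1)

    # Hypothetical counts for one focus area in one county
    print(pct_change(current=42, prior_years=[55, 50, 48]))  # -17.6
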
Next Steps
Monitor program activity
- Continue to improve the number of coalitions reporting
- Create a database for program activity data (a schema sketch follows this slide)
- Create and distribute quarterly and annual reports

Develop goals and objectives
- Track our progress
- Identify injury prevention program priorities

Outcome evaluation
- Program-specific evaluation

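The planned program activity database is not specified in the presentation, so this is only a minimal sketch of what it might look like in SQLite, mirroring the quarterly report fields described earlier; the table and column names are hypothetical:

    import sqlite3

    # Hypothetical schema for the planned program activity database,
    # mirroring the quarterly report fields; names are illustrative.
    conn = sqlite3.connect("program_activity.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS events (
            id                INTEGER PRIMARY KEY,
            coalition         TEXT NOT NULL,  -- reporting coalition
            quarter           TEXT NOT NULL,  -- e.g. '2013-Q1'
            county            TEXT NOT NULL,
            zip_code          TEXT,
            focus_area        TEXT NOT NULL,  -- one of the six focus areas
            people_reached    INTEGER,
            items_distributed INTEGER
        )
    """)
    conn.commit()

    # Quarterly report query: activity levels by focus area for one quarter
    rows = conn.execute("""
        SELECT focus_area, COUNT(*) AS events, SUM(people_reached) AS reached
        FROM events
        WHERE quarter = '2013-Q1'
        GROUP BY focus_area
    """).fetchall()
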