Measuring Our Success
Kristen Sanderson, MPH, CHES
Program Coordinator, Safe Kids Georgia

Objectives
- Describe our quality improvement process
- Define metrics and decide what to measure
- Improve the data collection process
- Review challenges and successes
- Discuss program evaluation strategies
- Describe our assessment process
- Outline next steps

Our Objective
Define a set of meaningful metrics that reflect our progress in reducing the incidence and severity of child injuries and deaths in the six focus areas.

Challenges
- Measuring behavior change
- Many factors influence safety and the incidence of injury, e.g. laws, law enforcement, community development, economics, cultural norms
- Measuring not just one program, but many programs and their overall effectiveness

QUALITY IMPROVEMENT AND PROCESS EVALUATION

Monitoring vs. Evaluation
Before we could begin measuring our impact (evaluation), we needed to collect more accurate data on our activities and programs (monitoring).
Monitoring
- Monitor the implementation process
- Ongoing measurement of performance
- Regular tracking of resources, activities, and outputs
- Are things working well?
Evaluation
- Assess specific program outcomes and impact
- What is the result? E.g. behavior change

First Step
Analyzed our current method of coalition reporting.
Pros:
- Collecting program activity information, including number of events by focus area, number of materials/equipment items distributed, and people reached
- High response rate
Cons:
- Annual reporting meant inaccurate, untimely data
- Some questions were unclear, leading to different interpretations and unreliable information
Current methods were not accurate, timely, or effective.

Our Tasks
1. Improve quality of program data
   - Establish metrics
   - Revise reporting forms
   - Improve the data collection process
2. Assess our activity and impact
   - Quarterly and annual reports (monitoring)
   - Periodically assess program impact and identify areas for improvement

Establishing Our Metrics
Determine what data we want to collect and what we can feasibly collect.
Questions we asked ourselves:
- What do we want to measure?
- What data do we want to track over time?
- What can we feasibly collect from our coordinators?
- What information will be useful for the coordinators?
What did we decide to measure and track? (see the sketch below)
- Injury statistics by county
- County demographics
- Zip code (location) of events
- Event details
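A minimal sketch of how an event-level record like the one described above could be structured. The field names and the roll-up helper are hypothetical illustrations, not Safe Kids Georgia's actual reporting schema.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class EventRecord:
    """Hypothetical event-level record; fields mirror the items listed on the slide."""
    event_date: date
    county: str
    zip_code: str            # location of the event
    focus_area: str          # e.g. "Child Passenger Safety", "Poisoning"
    materials_distributed: int
    people_reached: int


def events_by_county(records: list[EventRecord]) -> dict[str, int]:
    """Count reported events per county, e.g. for a quarterly roll-up."""
    counts: dict[str, int] = {}
    for record in records:
        counts[record.county] = counts.get(record.county, 0) + 1
    return counts
```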

Improving Our Data Collection Process
Goal 1: Collect data as close to the time of the event or program activity as possible
- Improve accuracy and reliability of the reported data
- Provide more timely feedback for program adjustments
Goal 2: Make data collection as easy and fast as possible
- Reduce the number of non-reporting coalitions
- Recognize the diversity of coalition members' roles (e.g., part-time vs. full-time)
Solution: move from annual reporting to quarterly reporting (one way to track the effect is sketched below)
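A small sketch of how the "reduce non-reporting" goal could be tracked quarter to quarter. The helper name and inputs are assumptions for illustration; the slides do not describe how Safe Kids Georgia actually computes this.

```python
def reporting_rate(reporting_coalitions: set[str], all_coalitions: set[str]) -> float:
    """Share of coalitions that submitted the quarterly report.

    Tracking this each quarter would show whether the switch from annual
    to quarterly reporting is reducing the number of non-reporting coalitions.
    """
    if not all_coalitions:
        return 0.0
    return len(reporting_coalitions & all_coalitions) / len(all_coalitions)
```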

Quarterly Report

Quarterly Report (continued)

Quarterly Report - Inventory

Challenges
Change
- History of no repercussions for not reporting
- Estimating numbers
- No incentives or disincentives
- Varying skill levels and knowledge of Excel
Buy-in
- Benefits of reporting were never communicated
- Reporting is time intensive

Overcoming Challenges & Getting Buy-In
- Discussions at meetings and one-on-one site visits
- Technical assistance and feedback
- Follow-up: reminders and phone calls
"What do I get out of it?"
- Accurate numbers for the lead agency, funders, and sponsors
- Quarterly/annual reports showcasing those numbers
- Not having to remember all activity at the end of the year
- Needed to keep 501(c)(3) status
- A user-friendly report

Quarterly Report
- Pilot testing
- Full dissemination in the 2nd quarter of 2012
- Lots of feedback, many revisions!
- Group revisions; updated versions were only released quarterly
- Final changes for 2013
- The process took over a year

Roadblocks
Coalitions had multiple reports
- CPAT Annual Survey
- Funder requests
- SKW grants
Technical difficulties
- Many issues with the new form required lots of revisions
Balancing multiple stakeholders' expectations
- Board committees
- Lead agencies
- Coordinators

Early Successes
- Increases in the number of coalitions reporting each quarter
- More accurate and timely data
- A standardized process

Recommendations
1. Involve coordinators and all other stakeholders throughout the process
   - What does everyone want to get out of this? What are the benefits for everyone? What are the challenges?
   - Talk with coordinators who do and don't report – why aren't they reporting?
2. Provide training or a meeting before dissemination of the new tool
   - Training and explanation beforehand will minimize inaccuracies and misinterpretation

Recommendations (continued)
3. Obtain buy-in from coordinators early on
   - Talk with them one-on-one about the benefits for them and how to help them
   - Know what other reports they already have to complete
4. Pilot test and revise
5. It is a slow process – patience is needed!

MEASURING OUTCOMES AND ASSESSING OUR IMPACT

Program Evaluation - Initial Focus on 2 Areas
Child Passenger Safety
- Largest program
- Has the most evidence supporting risk mitigation approaches
- Questions to answer:
  - Do we need to do more, and how much more? (unserved population)
  - Can we show a reasonable difference in impact between SK and non-SK coalitions? Behavior changes?
Poisoning
- Incidence has recently trended upward
- Least evaluated area, least implemented program
- Questions to answer: What needs to be done? What works?

Poisoning Prevention
Program development
- Develop the program based on a literature review, existing evidence-based programs, and coordinator interviews
- Develop capacity
- Provide resources to coordinators
- Create sustainability
Evaluation
- Evaluation of instructor training
- Number of trained Poison Prevention Instructors (capacity)
- Number of Poison Prevention educational events (increase in education)
- Pre- and post-test forms (change in knowledge and behavioral intentions; see the sketch below)
- 3-month follow-up (behavior change)
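A minimal sketch of the pre-/post-test arithmetic behind the "change in knowledge" measure. The scoring and function name are illustrative assumptions; the actual Safe Kids Georgia forms are not described in the slides.

```python
def mean_knowledge_change(pre_scores: list[float], post_scores: list[float]) -> float:
    """Average paired change from pre-test to post-test (positive = improvement).

    Assumes one pre-score and one post-score per participant, in matching order.
    """
    if not pre_scores or len(pre_scores) != len(post_scores):
        raise ValueError("need matched, non-empty pre/post score lists")
    changes = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return sum(changes) / len(changes)


# Example: participants averaging 6/10 before and 8/10 after gives a mean change of +2.0.
print(mean_knowledge_change([6, 5, 7], [8, 7, 9]))
```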

Child Passenger Safety
- Number of CPSTs in coalition counties (capacity)
- Training (capability)
  - Number of certification classes
  - Number of recertification classes
- Car seat checklist forms
  - Monitoring misuse over time (see the sketch below)
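A sketch of how misuse could be monitored over time from car seat checklist records. The record layout ("quarter", "misuse_found") is a hypothetical simplification; the real checklist form fields are not specified in the slides.

```python
from collections import defaultdict


def misuse_rate_by_quarter(checks: list[dict]) -> dict[str, float]:
    """Share of car seat checks with at least one misuse found, by quarter.

    `checks` is assumed to look like {"quarter": "2012-Q3", "misuse_found": True}.
    """
    totals: defaultdict[str, int] = defaultdict(int)
    misuses: defaultdict[str, int] = defaultdict(int)
    for check in checks:
        totals[check["quarter"]] += 1
        if check["misuse_found"]:
            misuses[check["quarter"]] += 1
    return {quarter: misuses[quarter] / totals[quarter] for quarter in totals}
```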

Child Passenger Safety
Follow-up evaluation
- Pilot project: 3-month follow-up after car seat checks
- Demonstrate retained knowledge and behavior change
Partnership with the Georgia Department of Public Health
- Impact of booster seat legislation (July 2011)
Challenges
- Lots of data from multiple sources
- Narrowing the information
- Linking injury data to our activities

Assessment
Injury and death statistics
- For all 6 focus areas
- Compare to the prior 2-3 years (see the sketch below)
Program statistics
- Activity levels by program and by coalition
- Location of activities
- Coalition building activities and interaction with other agencies
- Program funding & grants
- Staff resources devoted per program
Program evaluation & recommendations
- Program-specific outcome evaluation
- Summary of how SKG "moved the needle"
- Narrative success stories and lessons learned
- Coalition activity and assessment of effectiveness (paid vs. volunteer)
- Changes to programs and coalition building
- Additional resources needed (staff and $)
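A short sketch of the "compare to the prior 2-3 years" arithmetic, using made-up counts purely for illustration rather than Georgia surveillance data.

```python
def percent_change(current: int, prior_years: list[int]) -> float:
    """Percent change in an injury count versus the average of the prior 2-3 years.

    Assumes a non-empty list of prior-year counts with a nonzero average.
    """
    baseline = sum(prior_years) / len(prior_years)
    return (current - baseline) / baseline * 100


# Example: 135 injuries this year vs. 150, 160, and 155 in the prior three years
# is roughly a 13% decrease.
print(percent_change(135, [150, 160, 155]))
```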

Next Steps
- Monitor program activity
  - Continue to improve the number of coalitions reporting
  - Create a database for program activity data (a minimal sketch follows below)
  - Create and distribute quarterly and annual reports
- Develop goals and objectives
  - Track our progress
  - Identify injury prevention program priorities
- Outcome evaluation
  - Program-specific evaluation
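A minimal sketch of what a program activity database could look like, using SQLite. The table and column names are hypothetical and only mirror the quarterly report items mentioned earlier; the slides do not describe the coalition's actual database design.

```python
import sqlite3

# Hypothetical schema for storing quarterly program activity by coalition and focus area.
conn = sqlite3.connect("program_activity.db")
conn.execute(
    """
    CREATE TABLE IF NOT EXISTS program_activity (
        id INTEGER PRIMARY KEY,
        coalition TEXT NOT NULL,
        quarter TEXT NOT NULL,              -- e.g. '2013-Q1'
        focus_area TEXT NOT NULL,
        events INTEGER DEFAULT 0,
        materials_distributed INTEGER DEFAULT 0,
        people_reached INTEGER DEFAULT 0
    )
    """
)
conn.commit()
conn.close()
```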
