Evaluating the Quality and Impact of Community Benefit Programs

Presentation transcript:

Evaluating the Quality and Impact of Community Benefit Programs
September 27, 2016
Julie Trocchio, Senior Director, Community Benefit and Continuing Care, The Catholic Health Association
Kimberly Peeren, Senior Public Health Consultant, Healthy Communities Institute, a Xerox Company

Learning Objectives
- Understand the mission and legal basis for evaluating the quality and impact of community benefit programs.
- Learn how the CDC's Framework for Program Evaluation can be used to evaluate community health improvement activities.
- Become familiar with tools and resources for evaluating and reporting the impact of community health improvement programs to help meet new IRS requirements for tax-exempt hospitals.

Resource: CHA Evaluation Guide

Why Evaluate?
- To improve programs
- To assess the organization's impact
- To ensure resources are being used wisely
- To meet federal requirements

Federal Requirements
Community Health Needs Assessment (CHNA) and Implementation Strategy:
- Include the anticipated impact of activities taken in response to the CHNA
- Report impact
Source: https://www.federalregister.gov/articles/2014/12/31/2014-30525/additional-requirements-for-charitable-hospitals-community-health-needs-assessments-for-charitable

Community Benefit Cycle
Community Health Needs Assessment → Implementation / Program Planning → Implementation and Evaluation
- IDENTIFY PRIORITIES
- DESIGN EVALUATION
- EVALUATE PROGRESS
- REPORT EVALUATION FINDINGS

CDC Evaluation Framework
PLAN: 1. Engage stakeholders; 2. Describe the program; 3. Focus evaluation design
IMPLEMENT: 4. Gather credible evidence
ANALYZE AND USE FINDINGS: 5. Justify conclusions; 6. Ensure use and share lessons learned
Standards: utility, feasibility, propriety, accuracy
Source: http://www.cdc.gov/eval/framework/

Resource: CDC Guide Source: http://www.cdc.gov/eval/guide/index.htm

CDC Evaluation Framework (Source: http://www.cdc.gov/eval/framework/)

Step 1. Engage Stakeholders
- Who is involved in program operations?
- Who is served?
- Who will use findings?

Resource: HPOE Guide Source: http://www.hpoe.org/resources/hpoehretaha-guides/2846

Who Will Use the Findings?
- Decision-makers
- Program planners
- Individuals reporting to regulators
- Senior staff
- Board members
- Funders

CDC Evaluation Framework (Source: http://www.cdc.gov/eval/framework/)

Step 2. Describe the Program
- Goal
- Resources to be used
- Activities
- Anticipated outcomes

Program Mapping
- Inputs (Resources): what we invest in our programs
- Outputs (Activities + Participants): what we do and whom we reach through our programs
- Outcomes (Results): what happens as a result of our programs

Simple Logic Model
- INPUTS: family & friends; budget; time; backpacking gear & supplies
- OUTPUTS: prepare and pack; drive to destination; set up camp; backpack & explore
- OUTCOMES: learn about nature; develop backpacking skills; increase bonding among family and friends!
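To make the inputs → outputs → outcomes mapping concrete, here is a minimal sketch in Python (not part of the CHA or CDC materials) that stores the backpacking example as a plain data structure; the stage names come from the slide, while the dictionary layout and the participant entries are illustrative assumptions.

# A minimal sketch of the backpacking-trip logic model as a plain Python
# data structure; field names follow the slide, everything else is illustrative.
logic_model = {
    "inputs": ["family & friends", "budget", "time", "backpacking gear & supplies"],
    "outputs": {
        "activities": ["prepare and pack", "drive to destination",
                       "set up camp", "backpack & explore"],
        "participants": ["family members", "friends"],  # hypothetical grouping
    },
    "outcomes": ["learn about nature", "develop backpacking skills",
                 "increase bonding among family and friends"],
}

# Reading the model in order mirrors how a program description is presented.
for stage in ("inputs", "outputs", "outcomes"):
    print(stage.upper(), "->", logic_model[stage])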

Resource: UWEX Logic Models

CDC Evaluation Framework (Source: http://www.cdc.gov/eval/framework/)

Step 3. Focus Evaluation Design
- Determine evaluation purpose (program expansion/replication, funding, objectives)
- Identify what program aspects to evaluate
- Identify what questions to ask
- Identify indicators for evaluation questions
- Design the evaluation

Step 3. Focus Evaluation Design
- Determine evaluation purpose
- Identify what program aspects to evaluate (short-, medium-, and long-term outcomes; cost effectiveness, etc.)
- Identify what questions to ask
- Identify indicators for evaluation questions
- Design the evaluation

Step 3. Focus Evaluation Design
- Determine evaluation purpose
- Identify what program aspects to evaluate
- Identify what questions to ask
- Identify indicators for evaluation questions
- Design the evaluation

Process Questions
- How was the program carried out?
- Who did the program reach?
- How involved were participants in program activities?
- Were activities implemented as planned?
- How satisfied were participants?
- What else happened?

Resource: Evaluation Briefs

Impact Questions
- What changed?
- Did participants increase their knowledge or awareness?
- Did participants acquire new skills?
- Did participants change their behavior?
- Did health status improve?
- Did the program achieve its anticipated impact?

Step 3. Focus Evaluation Design
- Determine evaluation purpose
- Identify what program aspects to evaluate
- Identify what questions to ask
- Identify indicators for evaluation questions
- Design the evaluation

Evaluation Designs
- Post-test only
- Pre-test and post-test (single group)
- Pre-test and post-test with two or more groups
- Pre-test and post-test with control group

Post-Test Only: measure after the program. A simple design, but many factors can influence results and there are no other results to compare with.

Pre-Test Post-Test (one group): measure before and after the program. Can compare pre/post results of one group; must consider factors that can influence participants and results (e.g., participant selection, external events).

More than One Group: measure before and after for the program group, and before and after for a group that receives no program or a different program. Can compare different groups' results; must consider factors that can influence participants, groups, and results.

Control Group: groups with similar characteristics are measured before and after; one receives the program, the other no program or a different program. Can compare results of similar groups receiving different interventions; consider group comparability and other factors that can influence results.
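To see how these designs play out with data, here is a minimal Python sketch (not from the webinar materials) that summarizes a pre-test/post-test design with a comparison group; all scores below are invented for illustration, and a real evaluation would also weigh group comparability and external events, as noted above.

# Hedged sketch: compare average change in a program group vs. a comparison group.
def mean(values):
    return sum(values) / len(values)

# Hypothetical knowledge scores (0-100) before and after the program.
program_pre  = [52, 48, 60, 55, 50]
program_post = [70, 66, 78, 72, 69]
comparison_pre  = [53, 49, 58, 56, 51]
comparison_post = [55, 50, 60, 57, 54]

program_change    = mean(program_post) - mean(program_pre)
comparison_change = mean(comparison_post) - mean(comparison_pre)

print(f"Program group change:    {program_change:+.1f}")
print(f"Comparison group change: {comparison_change:+.1f}")
# The gap between the two changes is a rough indicator of program effect;
# it does not by itself rule out selection effects or external events.
print(f"Difference in change:    {program_change - comparison_change:+.1f}")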

Resource: Community Toolbox

CDC Evaluation Framework (Source: http://www.cdc.gov/eval/framework/)

Types of Data
- Quantitative data = numbers: counts, ratings, scores (e.g., number of participants, survey scores)
- Qualitative data = narrative: descriptions (e.g., key informant interviews, observation studies)
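As a small, hypothetical illustration (not from the slides), the Python snippet below shows how the two types differ in practice: quantitative data can be counted or averaged, while qualitative data is read and coded into themes; the attendance figures, scores, excerpt, and theme labels are all invented.

# Quantitative data: numbers you can count or score.
session_attendance  = [18, 22, 17, 25]      # hypothetical participants per session
satisfaction_scores = [4, 5, 3, 4, 5, 4]    # hypothetical 1-5 survey ratings
print("Total attendance:", sum(session_attendance))
print("Average satisfaction:", sum(satisfaction_scores) / len(satisfaction_scores))

# Qualitative data: narrative that is coded into themes rather than averaged.
interview_excerpt = "The walking group got me moving, but the park felt unsafe after dark."
coded_themes = ["increased physical activity", "neighborhood safety barrier"]  # hypothetical codes
print("Coded themes:", coded_themes)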

Types of Data
- Primary data (original data collected for your own purposes): surveys and questionnaires; focus group discussions; key informant interviews; program records and existing databases; observation
- Secondary data (data collected by someone else): County Health Rankings; state and local health departments; CMS Small Area Health Insurance Estimates; USDA Food Environment Atlas, etc.

Step 4. Gather Credible Evidence
Have a plan for:
- Questions to be answered
- Data sources for answers
- Method of collecting data
- Timing
- Who is responsible
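One way to keep such a plan organized is to lay it out row by row, one row per evaluation question. The short Python sketch below is a hypothetical example of that layout; the two rows and field names are illustrative and are not taken from the WISEWOMAN toolkit referenced on the next slide.

# Hedged sketch of a data collection plan, one row per evaluation question.
data_collection_plan = [
    {
        "question": "Did participants increase physical activity?",
        "data_source": "participant survey",
        "method": "pre- and post-program questionnaire",
        "timing": "first and last session",
        "responsible": "program coordinator",
    },
    {
        "question": "Was the program implemented as planned?",
        "data_source": "program records",
        "method": "attendance logs and session checklists",
        "timing": "ongoing, reviewed monthly",
        "responsible": "evaluation lead",
    },
]

for row in data_collection_plan:
    print(row["question"], "->", row["method"], f"({row['timing']})")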

Data Collection Plan Source: http://www.cdc.gov/wisewoman/evaluation_toolkit.htm

Resource: WISEWOMAN Toolkit

CDC Evaluation Framework (Source: http://www.cdc.gov/eval/framework/)

Step 5. Justify Conclusions
- Analyze and interpret results
- Examine patterns
- Consider factors that may have influenced results (e.g., external events)

Step 5. Justify Conclusions: Process
- Was the strategy implemented correctly?
- Did any external events intervene?
- If the program was evidence-based, was it faithful to the model?

Step 5. Justify Conclusions: Impact
- Did program participants change?
- Were the changes measurable?
- Were the changes likely because of the program?

Resources: Data Analysis
- CDC's WISEWOMAN Evaluation Toolkit
- CDC Evaluation Brief, Analyzing Qualitative Data: http://www.cdc.gov/healthyyouth/evaluation/pdf/brief19.pdf
- CDC Evaluation Brief, Analyzing Quantitative Data: http://www.cdc.gov/healthyyouth/evaluation/pdf/brief20.pdf

Look at Your Evaluation Findings
- What does the information suggest about the program?
- Was the program carried out as planned?
- Any surprising information?
- What lessons were learned?
- Positive findings? Negative findings?

CDC Evaluation Framework (Source: http://www.cdc.gov/eval/framework/)

Audiences to Share Findings With
- Program participants
- Program managers and staff
- Funders
- Hospital leaders
- Community members
- Policy makers

Federal Requirements (revisited)
Community Health Needs Assessment (CHNA) and Implementation Strategy:
- Include the anticipated impact of activities taken in response to the CHNA
- Report impact
Source: https://www.federalregister.gov/articles/2014/12/31/2014-30525/additional-requirements-for-charitable-hospitals-community-health-needs-assessments-for-charitable

Step 6. Ensure Use
- Improve the program
- Identify resources needed
- Make decisions about the program
- Replicate or expand the program
- Disseminate information to those who might be interested in adopting the program

Resource: BetterEvaluation Source: BetterEvaluation - Report and Support Use of Findings (May 2013) http://betterevaluation.org

Resource: HPOE Guide Source: http://www.hpoe.org/resources/hpoehretaha-guides/2858

Resource: CDC CHI Navigator

Resource: CHA Evaluation Guide

Questions? Thanks for participating in this webinar!