Everyday Program Evaluation
Sheena Cretella, MSPH
Program Evaluator, SC DHEC Diabetes Division

Today's Objectives

Planning Programs
- Logic Models

Evaluation
- Standards of Evaluation
- Steps to Evaluation
- Indicators
- Moderators
- Success Stories

Data
- Types
- Collection
- Management

PLAN Like Your Program Depends On It!

A logic model is like a road map. Plan your route:
- START where you are now.
- FINISH where you intend to be after the program.

Logic Model Pieces

- Inputs and Resources: What do you have?
- Activities: What do you plan to do?
- Outputs: What targets do you have for your activities?
- Outcomes: What changes resulted from your activities?

Logic Model (Example)

Resources: grant funding, volunteers, church facilities
Activities: construct walking trail, conduct sessions, create community garden
Outputs: number of miles walked, number of attendees, number of seed hills
Short-Term Outcomes: improved attitude toward physical activity, increased knowledge of healthy behaviors, horticulture skills obtained
Intermediate Outcomes: creation of an environment supporting healthy behaviors, increase in physical activity, increase in fruit and vegetable consumption
Long-Term Outcomes: healthier community; reduced risk of prediabetes and diabetes and prevention of diabetes complications
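For grantees who track their plans electronically, one option is to capture the logic model as structured data so each component can later be tied to a measure and a data source. The sketch below is a minimal, hypothetical Python example; the structure and variable names are illustrative, not part of the program plan.

```python
# A minimal, hypothetical representation of the example logic model above.
# The component lists mirror the slide; the structure itself is illustrative only.
logic_model = {
    "resources": ["grant funding", "volunteers", "church facilities"],
    "activities": ["construct walking trail", "conduct sessions", "create community garden"],
    "outputs": ["miles walked", "number of attendees", "number of seed hills"],
    "short_term_outcomes": [
        "improved attitude toward physical activity",
        "increased knowledge of healthy behaviors",
        "horticulture skills obtained",
    ],
    "intermediate_outcomes": [
        "environment supporting healthy behaviors",
        "increase in physical activity",
        "increase in fruit and vegetable consumption",
    ],
    "long_term_outcomes": [
        "healthier community",
        "reduced risk of prediabetes and diabetes",
    ],
}

# Walking the model front to back is the "plan your route" idea:
# start with what you have, end with the change you intend to see.
for component, items in logic_model.items():
    print(f"{component}: {', '.join(items)}")
```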

Program Planning

STOP: Checklist for Program Planning

- Do you and the stakeholders know what is going to be done, with whom, and to whom or what?
- Is the goal statement outcome oriented?
- Are the expectations of stakeholders known to all?
- Can the pieces of the logic model be measured?
- Are there missing pieces or “gaps” in the logic model/plan?
- Is the impact of the program important? Is there evidence?

What is Evaluation?

“Evaluation is the systematic investigation of merit, worth, or significance of any ‘object.’”
- Michael Scriven

Keep In Mind

Key questions grantees need to answer about their programs:
- What role, if any, did my program play in the results?
- What role, if any, did the moderators play?
- Were there any unintended outcomes?
- What will happen if I do not do something?

Standards of Evaluation

- Utility: Will the evaluation provide relevant information in a timely manner?
- Feasibility: Are the planned evaluation activities realistic given the time, resources, and expertise at hand?
- Propriety: Does the evaluation protect the rights of individuals and the welfare of those involved?
- Accuracy: Will the evaluation produce findings that are valid and reliable?

The Steps to Good Program Evaluation

1. Engage stakeholders
2. Describe the program
3. Focus the evaluation
4. Gather credible evidence
5. Justify conclusions
6. Use lessons learned

Moderators

- Political
- Economic
- Social
- Technological

Logic Model with Moderators

The same logic model as above, now annotated with the moderators that could affect each step.

Moderators: weather (drought, heat advisory, snow); safety (snakes, dogs); cost of gas (fewer volunteers, lower attendance); neighborhood safety; inexperienced personnel; local habitat policies.

Measures

Each activity and outcome should have a measure. A good measure is:
- Specific
- Observable
- Measurable

When choosing measures, consider the following:
- Quality of data
- Quantity of data
- Logistics
- Sources of data collection (primary vs. secondary)
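To make the idea that each activity and outcome should have a measure concrete, the sketch below pairs a few logic-model components with measures and data sources. It is a hypothetical Python illustration; the measure wording and source labels are invented for the walking-trail example, not taken from the program plan.

```python
# Hypothetical pairings of logic-model components with specific, observable,
# measurable measures and a planned data source for each.
measures = [
    {"component": "Construct walking trail (activity)",
     "measure": "Miles of trail completed by the end of the quarter",
     "source": "primary: construction log"},
    {"component": "Increase in physical activity (intermediate outcome)",
     "measure": "Self-reported minutes of walking per week",
     "source": "primary: participant survey"},
    {"component": "Healthier community (long-term outcome)",
     "measure": "County diabetes prevalence",
     "source": "secondary: existing surveillance data"},
]

# A quick completeness check: every planned activity and outcome should appear here.
for m in measures:
    print(f"{m['component']} -> {m['measure']} ({m['source']})")
```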

Data Management

Data Sources

Primary
- Collecting new data
- Data come from the people you work with
- Common forms:
  - Group discussions
  - Observation
  - Document review: logs, journals, meeting minutes, sign-in sheets, etc.
  - Surveys: telephone, personal, etc.

Secondary
- Existing data sources
- Before using secondary data, ensure they will meet the evaluation needs
- Large, ongoing surveillance systems
- Routinely collected
- Not flexible

Data Sources (continued)

Who would you interview?
- Program participants and non-participants
- Key members
- Program staff

What might you observe?
- Meetings
- Special events and activities

Which documents might you review?
- Meeting minutes
- Strategic plans
- Registration forms
- Photos

Success Story: Structure

- Title: catchy, uses an action verb
- Issue: conveys the importance of the problem, backed by evidence
- Program: describes the program and partner support
- Impact: points out the most important measured outcomes/benefits of the program

Success Story: Content and Format

Content
- Contact information
- Photos and logos
- Quotes

Format
- Brief: one page, bullets

Possible inclusions:
- Testimonials
- Promising practices
- Lessons learned
- Partner successes
- New partners

Make it your own!

[Sample success story layout: a one-page template showing the placement of the Title, Issue, Program, Impact, Photo, and Contact sections.]

Types of Data

Make sure you are choosing the right kind of data for your specific evaluation questions.

- Qualitative
- Quantitative
  - Continuous
  - Categorical (ordinal or nominal)
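If your data end up in a spreadsheet or statistics package, recording each column's type up front helps you choose the right summaries later. The sketch below assumes Python with the pandas library; the column names and values are hypothetical.

```python
import pandas as pd

# Hypothetical participant records; column names and values are illustrative only.
records = pd.DataFrame({
    "minutes_walked": [30, 45, 20],                    # quantitative, continuous
    "county": ["Richland", "Fairfield", "Lexington"],  # categorical, nominal (no natural order)
    "attitude": ["poor", "good", "excellent"],         # categorical, ordinal (ordered levels)
})

# Declare the categorical types explicitly so later analysis treats them correctly.
records["county"] = pd.Categorical(records["county"])
records["attitude"] = pd.Categorical(
    records["attitude"], categories=["poor", "fair", "good", "excellent"], ordered=True
)

print(records.dtypes)
print(records["attitude"].min())  # ordered categories can be compared; nominal ones cannot
```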

Data Management

Physical records
- De-identify records whenever possible
- Store records in a safe, locked place
- Shred personal data upon completion of the evaluation

Electronic records
- Use unique identification numbers whenever possible
- Use a password on your computer
- Erase (not just delete) personal data upon completion of the evaluation
- Back up data
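As one illustration of the unique-ID and de-identification points above, the sketch below splits a participant file into an analysis file keyed by a random ID and a separate linking file that can be stored securely and destroyed when the evaluation ends. It assumes Python; the file name "participants.csv" and the "name" column are hypothetical, and this is not a complete data-security solution.

```python
import csv
import secrets

# Minimal sketch of de-identification with unique IDs; hypothetical file and column names.
def deidentify(in_path: str, out_path: str, key_path: str) -> None:
    with open(in_path, newline="") as src, \
         open(out_path, "w", newline="") as out, \
         open(key_path, "w", newline="") as key:
        reader = csv.DictReader(src)
        fields = [f for f in reader.fieldnames if f != "name"]
        writer = csv.DictWriter(out, fieldnames=["id"] + fields)
        linker = csv.writer(key)                 # id-to-name link, stored separately and securely
        writer.writeheader()
        linker.writerow(["id", "name"])
        for row in reader:
            record_id = secrets.token_hex(4)     # random unique identifier, not derived from the name
            linker.writerow([record_id, row.pop("name")])
            writer.writerow({"id": record_id, **row})

# Example use (hypothetical paths):
# deidentify("participants.csv", "analysis_file.csv", "id_key.csv")
```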

Data Management (continued)

[Example data matrix: a spreadsheet with one row per record and columns for Id#, Age, County, and additional variables (Variable 3, Variable 4); the sample rows list the counties Richland, Fairfield, Orangeburg, and Lexington.]

Thank You!

Everyday Program Evaluation
Sheena Cretella, MSPH
Program Evaluator, SC DHEC Diabetes Division