HMP Evaluation Overview and Tools. UNE-CCPH, June 22, 2011. Presenters: Kira Rodriguez, Michelle Mitchell, Patrick Madden, Pamela Bruno-MacDonald.

Presentation transcript:

HMP Evaluation Overview and Tools, UNE-CCPH, June 22, 2011. Presenters: Kira Rodriguez, Michelle Mitchell, Patrick Madden, Pamela Bruno-MacDonald

2 UNE-CCPH Evaluation Team
Core Evaluation Team:
Kira Rodriguez, MHS, HMP/Crosscutting Lead
Michelle Mitchell, MSocSc, KIT Lead, Local HMP Lead
Pamela Bruno-MacDonald, MPH, PAN Lead
Ruth Dufresne, MPH, CVH/Diabetes Lead
Praphul Joshi, PhD, Tobacco Lead
Patrick Madden, MBA, Data/Surveillance
Jennie Aranovitch, HMP Evaluation Administrative Assistant
Technical scientific advisers (as needed)

3 Steps in evaluation practice: Engage Stakeholders, Describe the Program, Focus the Evaluation Design, Gather Credible Evidence, Justify Conclusions, Ensure Use and Lessons Learned. Standards: Accuracy, Utility, Feasibility, Propriety.

4 Accuracy: The evaluation produces findings that are considered correct.

5 Feasibility: The evaluation is viable and pragmatic.

6 Utility: The information needs of evaluation users are satisfied.

7 Propriety: The evaluation is ethical (i.e., conducted with regard for the rights and interests of those involved and affected).

8 Cross-reference of steps and relevant standards

Steps in Evaluation Practice | Activities | Standards
Engaging stakeholders | Stakeholder identification; Evaluator credibility; Formal agreements; Rights of human subjects | Utility/13-A; Utility/13-B; Propriety/15-B; Propriety/15-C
Describing the program | Complete and fair assessment; Program documentation | Propriety/15-C; Accuracy/16-A
Focusing the evaluation design | Evaluation impact; Political viability; Cost effectiveness; Complete and fair assessment; Described purposes and procedures | Utility/13-G; Feasibility/14-B; Feasibility/14-C; Propriety/15-E; Accuracy/16-C
Gathering credible evidence | Information scope and selection; Defensible information sources; Valid information; Reliable information; Systematic information | Utility/13-C; Accuracy/16-D; Accuracy/16-E; Accuracy/16-F; Accuracy/16-G
Justifying conclusions | Values identification; Analysis of quantitative information; Analysis of qualitative information; Justified conclusions | Utility/13-D; Accuracy/16-H; Accuracy/16-I; Accuracy/16-J
Ensuring use and sharing lessons learned | Report clarity; Report timeliness and dissemination; Disclosure of findings; Impartial reporting | Utility/13-E; Utility/13-F; Propriety/15-F; Accuracy/16-K
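
For teams that keep this cross-reference in electronic form, it can be stored as a simple lookup structure. The sketch below is illustrative only: the step names and standard codes are copied from the table above, while the dictionary and the standards_for function are assumptions introduced here, not part of the HMP materials.

```python
# Illustrative sketch: the cross-reference table above as a lookup structure.
# Step names and standard codes come from the table; everything else is hypothetical.
STEP_STANDARDS = {
    "Engaging stakeholders": ["Utility/13-A", "Utility/13-B", "Propriety/15-B", "Propriety/15-C"],
    "Describing the program": ["Propriety/15-C", "Accuracy/16-A"],
    "Focusing the evaluation design": ["Utility/13-G", "Feasibility/14-B", "Feasibility/14-C",
                                       "Propriety/15-E", "Accuracy/16-C"],
    "Gathering credible evidence": ["Utility/13-C", "Accuracy/16-D", "Accuracy/16-E",
                                    "Accuracy/16-F", "Accuracy/16-G"],
    "Justifying conclusions": ["Utility/13-D", "Accuracy/16-H", "Accuracy/16-I", "Accuracy/16-J"],
    "Ensuring use and sharing lessons learned": ["Utility/13-E", "Utility/13-F",
                                                 "Propriety/15-F", "Accuracy/16-K"],
}

def standards_for(step: str) -> list[str]:
    """Return the standards associated with an evaluation step (empty list if unknown)."""
    return STEP_STANDARDS.get(step, [])

print(standards_for("Justifying conclusions"))
# ['Utility/13-D', 'Accuracy/16-H', 'Accuracy/16-I', 'Accuracy/16-J']
```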

9 Elements of HMP Evaluation Plan*
Title Page
Program Description
Evaluation Question Overview
Intended Users and Uses
Evaluation Focus – Data Collection Matrix
Methods
Analysis and Interpretation Plan
Dissemination and Sharing Plan
* Adapted from a presentation by Lavinghouse and Snyder at the CDC/AEA Summer Institute.

10 Example: Users and Uses

User Group | Evaluation Focus/Use | Involvement & Input
HMP Program Managers | Understand overall effectiveness; Program-specific improvement information; Report to funders | Monthly meetings to review design, tools, and evaluation deliverables/findings; Year-end deliverable notebooks directed to this group; Make decisions about evaluation design
Local Coalitions/Schools (e.g., HMP Directors & School Health Coordinators) | Improve local programs; Share successes/outcomes with funders, boards of directors, lead agencies, school administrators, etc. | Leadership Council input and feedback on evaluation design; Surveys; Interviews
Legislators | Understand impact of programs | Provide snapshots/summaries of program successes and outcomes

11 Example: Data Collection Matrix

Evaluation Section | Evaluation Questions | Indicators | Data Sources | Frequency
Implementation/Process | How many evidence-based strategies were implemented by local HMPs, and in what settings? | #,% of strategy milestones completed on time; #,% of strategies implemented per setting | KIT; KIT or Survey | Quarterly; Annual
Intermediate Outcomes (3-5 years) | How many P&E changes were made? In what settings? At the community and state level? | #,% of smoke-free municipal grounds policies; #,% of schools with best-practice PE/recreation policies | Environmental indicator surveys; School health surveys; Key informant interviews with HMP Directors and SHCs | Survey every 3-5 years per setting; Biannual interviews
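
As an illustration of how the process indicators in the matrix above might be computed from a KIT export, here is a minimal sketch. The file name (kit_export.csv), the column names (strategy_id, setting, milestone_due, milestone_completed), and the date format are hypothetical assumptions, not the actual KIT layout.

```python
# Minimal sketch of computing the process indicators from a KIT export.
# The CSV layout (file name, column names, ISO date format) is assumed for illustration.
import csv
from datetime import date

def pct(part: int, whole: int) -> float:
    """Percentage rounded to one decimal place; 0.0 when the denominator is zero."""
    return round(100.0 * part / whole, 1) if whole else 0.0

with open("kit_export.csv", newline="") as f:  # hypothetical export, one row per milestone
    rows = list(csv.DictReader(f))

# #,% of strategy milestones completed on time
on_time = sum(
    1 for r in rows
    if r["milestone_completed"]
    and date.fromisoformat(r["milestone_completed"]) <= date.fromisoformat(r["milestone_due"])
)
print(f"Milestones on time: {on_time} of {len(rows)} ({pct(on_time, len(rows))}%)")

# #,% of strategies implemented per setting (share of all distinct strategies seen in each setting)
strategies_by_setting: dict[str, set] = {}
for r in rows:
    strategies_by_setting.setdefault(r["setting"], set()).add(r["strategy_id"])
total_strategies = len({r["strategy_id"] for r in rows})
for setting, strategies in sorted(strategies_by_setting.items()):
    print(f"{setting}: {len(strategies)} strategies ({pct(len(strategies), total_strategies)}%)")
```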

12 Primary Data Collection – EI Survey of Municipalities
Environmental Indicator (EI) Surveys are a tool to look at policy and environmental changes in the settings that State and Local HMPs work with.
The 2011 Municipalities EI Survey covers policies and environments around:
– tobacco,
– physical activity and nutrition,
– chronic disease policies for employees and the public, and
– emergency response capabilities.
Previous surveys of municipalities were conducted in 2004 and 2007.
Baseline and outcome information will be gleaned from these surveys.
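
A hedged sketch of how baseline and outcome information might be gleaned by comparing survey waves follows. The data structure, the policy field, and the town values are invented placeholders for illustration; they do not reflect the actual EI survey instrument or its results.

```python
# Illustrative sketch: comparing EI survey waves (2004, 2007, 2011) on one policy indicator.
# All wave data below are placeholder values, not actual survey results.
def policy_prevalence(wave: dict[str, bool]) -> float:
    """Share of responding municipalities reporting the policy in place, as a percentage."""
    return round(100.0 * sum(wave.values()) / len(wave), 1) if wave else 0.0

waves = {  # municipality -> has smoke-free grounds policy? (placeholder values)
    2004: {"Town A": False, "Town B": False, "Town C": True},
    2007: {"Town A": False, "Town B": True, "Town C": True},
    2011: {"Town A": True, "Town B": True, "Town C": True},
}

baseline = policy_prevalence(waves[2004])
latest = policy_prevalence(waves[2011])
print(f"Baseline (2004): {baseline}%  Outcome (2011): {latest}%  Change: {latest - baseline:+.1f} pts")
```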

13 5 CLICKS TO EVALUATION DOCUMENTS: Grantee Resources > Resources and Documents > Useful Documents > Evaluation Resources

14 3 Breakouts
Secondary Data Sources for Evaluation
A Framework for Telling Your Story
Coalition Self-Assessment Tools

15 Contact Information
Please contact Kira Rodriguez with any questions.