Evaluation: Practical Evaluation (Michael Quinn Patton)

Presentation transcript:

Evaluation

Practical Evaluation Michael Quinn Patton

Systematic collection of information about the:
– Activities
– Characteristics
– Outcomes
of programs, personnel, and products, to be used by specific people to:
– Reduce uncertainties
– Improve effectiveness
– Make decisions

What have we done? How well have we done it? For whom have we done it? How much have we done? How effective has our program been? What could we do better or differently?

Benefits of Program Evaluation
– Reflect on progress: where we're coming from and where we're going
– Improve programs
– Influence policy makers and funders: help ensure funding and sustainability
– Build community capacity and engage the community
– Share what works and what doesn't with others
– Strengthen accountability

4 Standards: Useful, Feasible, Proper, Accurate (Joint Committee on Standards for Educational Evaluation, 1994)

Useful: Will the results be used to improve practice or to allocate resources better? Will the evaluation answer stakeholders' questions?

Feasible: Does the political environment support this evaluation? Do you have the personnel, time, and monetary resources to do it in-house? Do you have resources to contract with outside consultants? If you can't evaluate all parts of the program, which parts can you evaluate?

Proper: Is your approach fair and ethical? Can you keep individual responses confidential?

Accurate: Are you using appropriate data collection methods? Have interviewers been trained if you are using more than one? Have survey questions been tested for reliability and validity?
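As an illustration of one accuracy check, the sketch below estimates internal-consistency reliability (Cronbach's alpha) for a small set of pilot survey items. The response data, variable names, and the informal 0.70 benchmark are assumptions for this example, not part of the original slides.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal-consistency reliability for a set of survey items.

    items: 2-D array, rows = respondents, columns = items.
    """
    k = items.shape[1]                              # number of items
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical pilot-test responses: 5 respondents x 4 items on a 1-5 scale.
pilot_responses = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 5, 4, 5],
    [3, 3, 3, 4],
    [1, 2, 2, 1],
])

print(f"Cronbach's alpha: {cronbach_alpha(pilot_responses):.2f}")
# Values around 0.70 or higher are commonly treated as acceptable in pilot work.
```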

Step 1: Engage Stakeholders

Those involved in program operations: administrators, managers, staff, contractors, sponsors, collaborators, coalition partners, funding officials

Those served or affected by the program: clients, family members, neighborhood organizations, academic institutions, elected officials, advocacy groups, professional organizations, skeptics, opponents

Primary intended users of the evaluation: those in a position to do or decide something regarding the program. In practice, usually a subset of the stakeholders already listed.

Step 2: Describe the Program

– Mission
– Need
– Logic model components: inputs, outputs, outcomes
– Objectives: outcome and process
– Context: setting, history, environmental influences
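Purely as an illustration (not from the slides), these Step 2 elements can be recorded in one structured place so the program description stays complete and consistent. The field names and the example program below are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class ProgramDescription:
    """Minimal container for the Step 2 elements listed above."""
    mission: str
    need: str
    inputs: list = field(default_factory=list)       # logic model: resources
    outputs: list = field(default_factory=list)      # logic model: activities/products
    outcomes: list = field(default_factory=list)     # logic model: expected changes
    process_objectives: list = field(default_factory=list)
    outcome_objectives: list = field(default_factory=list)
    context: dict = field(default_factory=dict)      # setting, history, environment

# Hypothetical example.
program = ProgramDescription(
    mission="Reduce tobacco use among county residents",
    need="Adult smoking prevalence above the state average",
    inputs=["quitline staff", "state grant funding"],
    outputs=["counseling sessions delivered", "media campaign"],
    outcomes=["increased quit attempts", "lower smoking prevalence"],
    context={"setting": "rural county health department"},
)
```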

Step 3: Focus the Design

Goals of Focusing: the evaluation assesses the issues of greatest concern to stakeholders and, at the same time, uses time and resources as efficiently as possible.

Questions to be answered to focus the evaluation:
– What questions will be answered? (i.e., what is the real purpose, and what outcomes will be addressed?)
– What process will be followed?
– What methods will be used to collect, analyze, and interpret the data?
– Who will perform the activities?
– How will the results be disseminated?

Step 4: Gather Credible Evidence

– Data must be credible to the evaluation audience
– Data-gathering methods are reliable and valid
– Data analysis is done by credible personnel
– "Triangulation": applying different kinds of methods and data to answer the same question

Indicators translate general program concepts into specific measures.
Sample indicators: participation rates, client satisfaction, changes in behavior or community norms, health status, quality of life, expenditures
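As a small, hedged illustration of turning a concept into a specific measure, the snippet below computes a participation-rate indicator from program records. The counts and variable names are invented for this example.

```python
# Hypothetical counts taken from program records.
eligible_clients = 480      # people the program intended to reach
clients_attending = 312     # people with at least one recorded visit

participation_rate = clients_attending / eligible_clients
print(f"Participation rate: {participation_rate:.1%}")  # -> Participation rate: 65.0%
```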

Data Sources
– Routine statistical reports: census, vital statistics, NHANES
– Program reports: log sheets, service utilization, personnel time sheets
– Special surveys

Sources of Data
– People: participants, staff, key informants, representatives of advocacy groups
– Documents: meeting minutes, media reports, surveillance summaries
– Direct observation

Selected Techniques for Gathering Evidence

Step 5: Justify Conclusions

Justification Steps:
– What are the findings?
– What do the findings mean?
– How do the findings compare with the objectives for the program?
– What claims or recommendations are indicated for program improvement?
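One way to make the comparison with program objectives concrete is to tabulate each finding against its target. The sketch below does this with hypothetical targets and observed values; none of the measures or numbers come from the original slides.

```python
# Hypothetical objectives (targets) and observed findings.
objectives = {
    "participation_rate": 0.60,      # target: at least 60% of eligible clients served
    "client_satisfaction": 0.80,     # target: at least 80% of clients satisfied
    "quit_attempt_increase": 0.10,   # target: 10 percentage-point increase
}
findings = {
    "participation_rate": 0.65,
    "client_satisfaction": 0.74,
    "quit_attempt_increase": 0.12,
}

# Compare each finding with its objective and report whether the target was met.
for measure, target in objectives.items():
    observed = findings[measure]
    status = "met" if observed >= target else "not met"
    print(f"{measure}: observed {observed:.2f} vs target {target:.2f} -> {status}")
```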

Step 6: Ensure Use and Share Lessons Learned

“Evaluations that are not used or inadequately disseminated are simply not worth doing.” “The likelihood that the evaluation findings will be used increases through deliberate planning, preparation, and follow-up.” (Practical Evaluation of Public Health Programs, Public Health Training Programs)

Activities to Promote Use and Dissemination:
– Designing the evaluation from the start to achieve intended uses
– Preparing stakeholders for eventual use by discussing how different findings will affect program planning
– Scheduling follow-up meetings with primary intended users
– Disseminating results using targeted communication strategies

Group Work
– Describe the ideal stakeholder group for your project evaluation.
– What questions will be answered? (Include the questions inherent in your objectives.)
– What data will be collected, analyzed, and interpreted? How will this get done, and by whom?
– How will you disseminate your findings?