Evaluation in the Field: Putting Concepts into Action. Janet Myers, PhD, MPH; Richard Vezina, MPH. CAPS HIV Prevention Conference, April 21, 2006.

Overview  Warm-up  Where does Evaluation fit?  Approaches to Evaluation  Examples  Q&A: Evaluating Your Programs

Warm-Up

Who here is … Evaluator? Service Provider? Administrator?

4 Questions:  What are the benefits of evaluating your programs?  What are the challenges to evaluating your programs?  What needs (besides $) do you have in order to plan/conduct evaluation?  What resources do you have/use for evaluation?

Where does Evaluation fit?

[Diagram] Mission → Goals → Objectives → Activities → Outcomes → Impact. Program & Evaluation PLANNING covers the earlier stages; IMPLEMENTATION covers the later ones, with Process Evaluation and Outcome Evaluation running alongside.

Mission  Provides the vision  How this work makes a difference in the world  Broadest scope

Goals (Ross & Mico, 1980; McKenzie & Smeltzer, 2001)  “A future event toward which a committed endeavor is directed”  Simple & concise  2 basic components: “Who will be affected” and “What will change as a result of the program”

Objectives  Specific steps that contribute to a goal. Often several objectives per goal.  Good objectives are SMART: S – specific, M – measurable, A – attainable, R – realistic, T – time-bound

Good Objectives Show… (McKenzie & Smeltzer, 2001)  What will change: Outcome that will be achieved  When it will change: Conditions under which the outcomes will be observed  How much change: Criteria for deciding whether the outcome has been achieved  Who will change: Target population

Activities  Internal: administrative, etc.  External: the services you provide to clients  Based on your goals/objectives

Outcomes  Changes that occur in people being served by your program  Attribution: To the best extent possible, show that change is a result of your program (but note…causality is difficult)  Standards are typically different for evaluation (vs. research)  To assess, you need at least 2 time points (pre- and post-) and/or a comparison group
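The two-time-point idea above can be sketched in a few lines of code. This is a minimal illustration with made-up scores (all names and numbers here are hypothetical, not from the presentation): each participant is measured before and after the program, and the evaluator looks at the average change.

```python
# Minimal sketch of a pre/post outcome comparison (hypothetical data):
# same participants measured at two time points, as the slide describes.
from statistics import mean

pre = [55, 60, 48, 70, 62]    # knowledge scores before the program
post = [68, 72, 59, 80, 70]   # scores for the same people afterward

changes = [after - before for before, after in zip(pre, post)]
print(f"Mean pre-test score:  {mean(pre):.1f}")    # 59.0
print(f"Mean post-test score: {mean(post):.1f}")   # 69.8
print(f"Mean change:          {mean(changes):.1f}")  # 10.8
```

As the slide cautions, a change like this does not by itself prove the program caused it; a comparison group helps rule out other explanations.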

Impact  The scope of the program’s effects, the duration of its outcomes, and the extent of its influence on the broader context (for example, HIV incidence)  Attribution: Showing causality can be more challenging, because you are looking for more diffuse effects  Usually broad and long-term  Typically not in the scope of program evaluation

Approaches to Evaluation

Why do we evaluate?  To determine if objectives are being met  To improve quality of the program  To decide how to change content  To identify the effects of the program

Process vs. Outcome Evaluation  Process Demographics (Who’s being trained?) Reaction to content (“Smile Sheets”) Service units delivered  Outcome Changes in knowledge/attitudes/beliefs Changes in behavior Impact on patients/clients

Process Evaluation can help us…  Create a better learning environment  Improve presentation skills  Show accountability  Reflect the target populations  Track service units

Outcome Evaluation can help us…  Show the program’s effects  Allow for comparisons over time  Provide specific guide points for improving programs  Show accountability

Planning Your Evaluation (1)  Figure out your questions: What will this be used for?  Consider your Resources Staffing Time Materials $$$  Choose Methods Quantitative: Surveys, pre/post tests, etc. Qualitative: Interviews, focus groups, etc.

Planning Your Evaluation (2)  Select Indicator(s) Relevant Measurable Improvable  Instrument/Tool Development Don’t reinvent the wheel!  Analysis: Get answers to your questions  Reporting: Formal & Informal

Examples