Monitoring and Evaluating Interventions: A Workshop. Chris Duclos, PhD, JSI Research & Training Institute.

Today:
- Discuss M&E purposes
- Learn about the evaluation cycle & components
- Understand evaluation designs
- Prepare for an evaluation
- Discuss ethical considerations
- Prevention M&E: how it differs
- Discuss challenges in prevention M&E

Why All the Fuss?

Exercises
- Organize into groups
- Pick-pocket exercise
- Identify purposes

To know how well you’re doing, you must have some place you’re trying to get to. “If you don’t know where you’re going, you’ll end up somewhere else.”

Why Evaluate (Purposes)?
- Ensure program effectiveness and appropriateness
- Demonstrate accountability
- Contribute to the HIV/AIDS knowledge base
- Improve program operations and service delivery

Evaluation
Evaluation is the systematic collection of information about a program in order to enable stakeholders:
- to better understand the program,
- to improve program effectiveness, and/or
- to make decisions about future programming.

Critical Evaluation Questions
- What do you want your project to accomplish?
- How will you know if you have accomplished your goals?
- What activities will your project undertake to accomplish your goals?
- What factors might help or hinder your ability to accomplish your goals?
- What will you want to tell others who are interested in your project?

Program Planning Process: PLAN → IMPLEMENT → EVALUATE → IMPROVE (and back to PLAN)

Essential Steps to Evaluation (FHI, Impact, USAID manual)
1. Identify program goals and objectives
2. Define the scope of the evaluation
3. Define evaluation questions & indicators
4. Define methods
5. Design instruments and tools
6. Carry out the evaluation
7. Analyze data and write a report
8. Disseminate and use data

Components of Project-Level Evaluation
There are 4 general components of comprehensive program evaluation:
- Formative evaluation: How do we make the program better?
- Process evaluation: How was the program implemented?
- Outcome evaluation: Did the program meet its objectives?
- Impact evaluation: Was the ultimate goal of the program achieved?

Application to Your Program:
- Identify program goals
- For each goal:
  - Identify process objectives
  - Identify outcome objectives
- For each objective:
  - Identify indicators
  - Identify data source
  - Plan data collection
  - Plan data analysis
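As an illustration only (not part of the original workshop materials), the goal → objective → indicator structure above can be sketched as a small data model. All class names, field names, and example values below are assumptions invented to mirror the slide, borrowing the seniors example that appears later in the deck.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative sketch of the goal -> objective -> indicator structure described above.
# All names and values here are hypothetical.

@dataclass
class Objective:
    description: str          # a specific process or outcome objective
    kind: str                 # "process" or "outcome"
    indicators: List[str]     # how progress will be measured
    data_source: str          # where the data come from
    collection_method: str    # how the data will be collected
    analysis_plan: str        # how the data will be analyzed

@dataclass
class Goal:
    description: str
    objectives: List[Objective] = field(default_factory=list)

# Example: one goal with a single outcome objective
goal = Goal(
    description="Increase condom use among sexually active single seniors",
    objectives=[
        Objective(
            description="Increase reported condom use within 12 months",
            kind="outcome",
            indicators=["% of surveyed seniors reporting condom use at last sex"],
            data_source="Pre/post participant survey",
            collection_method="Self-administered questionnaire",
            analysis_plan="Compare pre- and post-program percentages",
        )
    ],
)
print(f"Goal: {goal.description} ({len(goal.objectives)} objective[s])")
```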

Program Goals and Objectives
- Well-developed goals and objectives are critical to evaluation.
- Objectives are specific steps that contribute to a goal; there are often several objectives per goal.
- Good objectives are SMART:
  S – specific
  M – measurable
  A – attainable
  R – realistic
  T – time-bound

Program Outcome Model

Every program has goals, objectives, and activities. Every program evaluation should have corresponding indicators: process indicators (for activities), outcome indicators (for objectives), and impact indicators (for goals).

Process Evaluation
In the program outcome model, process indicators track program activities.

What is Process Evaluation?
Process evaluation addresses how, and how well, the program is functioning.
It can help to:
- Create a better learning environment
- Show accountability to funders
- Reflect the target populations
- Track services

Process Evaluation (cont’d)
Key questions in process evaluation:
- Who is served?
- What activities or services are provided?
- Where, when, and how long is the program?
- What are the critical activities?
- How do these activities connect to the goals & intended outcomes?
- What aspects of the implementation process are facilitating success or acting as stumbling blocks?

Process Evaluation (cont’d)
- Identify how an outcome is produced
- Identify strengths & weaknesses of a program
- Create a detailed description of the program

Process Evaluation Objectives
- Improve current activities
- Provide support for sustainability
- Provide insight into why certain goals are or are not being accomplished
- Help leaders make decisions

Methods of Data Collection
- Quantitative vs. qualitative
- Surveys, interviews, activity databases, etc.
- Observation

Exercise – Draft Process Evaluation Plan
- In your initial groups, organize by experience
- Use the least experienced member’s program
- Articulate one goal with objectives
- Come up with two process evaluation questions
- Define measurable process outcomes, indicators, and a data collection method

Exercise worksheet – Goal: ____________________
Columns: Objectives | Outcome | Indicator | Data Source | Collection Method | Data Analysis

Outcome Evaluation
In the program outcome model, outcome indicators track program objectives.

OUTCOMES… are what a program is accountable for.

Outcome Evaluation
Outcome evaluation:
- Measures the extent to which a program produces its intended improvements
- Examines effectiveness, goal attainment, and unintended outcomes
- In simple terms: “What’s different as a result of your efforts?”

Outcome Evaluation (cont’d)
Key questions in outcome evaluation:
- To what degree did the desired change(s) occur?
- Outcomes can be immediate, intermediate, or longer-term.
- Outcomes can be measured at the patient, provider, organization, or system level.

Outcomes
Outcomes are changes in: behavior, skills, knowledge, attitudes, condition, or status.
Outcomes should be:
- Related to the business of the project
- REALISTIC and ATTAINABLE
- RELEVANT to the project
- Within the program’s sphere of influence

Outcomes…
An outcome is logical and reasonable if it is reasonable to believe that it can be accomplished within the timeframe the program has, based on:
- The program’s previous experience
- Context
- Resources

An Example of Outcome Evaluation
GOAL: Increase sexually active single seniors’ knowledge and use of condoms
POSSIBLE EVALUATION QUESTIONS:
- Have seniors increased their knowledge about the use of condoms?
- Have seniors increased their use of condoms?
- How do we know that the outreach and education activities are responsible for the changes?

Outcome Evaluation Design
- Single-group designs: one participant group, measured pre-program and post-program.
- Experimental designs: random assignment to a participant group and a comparison group, each measured pre-program and post-program.

Outcome Evaluation Design (cont’d)
- Quasi-experimental designs: nonrandom assignment; the comparison group usually does not receive the program.
- Posttest-only designs: the weakest option, but better than nothing; compare results to local and/or national data.
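To make the design logic concrete, here is a minimal analysis sketch (not from the workshop) using the seniors/condom-knowledge example from the earlier slide. All scores, group sizes, and variable names are invented for illustration.

```python
import statistics

# Hypothetical pre/post knowledge scores (0-10) for the seniors example.
# All numbers are made up; they are not workshop data.
participants_pre  = [4, 5, 3, 6, 5, 4, 7, 5]
participants_post = [7, 8, 6, 8, 7, 6, 9, 7]
comparison_pre    = [5, 4, 6, 5, 4, 6, 5, 5]
comparison_post   = [5, 5, 6, 6, 4, 6, 5, 6]

def mean_change(pre, post):
    """Average change from pre-program to post-program."""
    return statistics.mean(b - a for a, b in zip(pre, post))

# Single-group design: only the participant group's pre/post change is available.
print("Participant group change:", mean_change(participants_pre, participants_post))

# Experimental / quasi-experimental design: compare the participant group's change
# to the comparison group's change (a simple difference-in-changes).
program_effect = (mean_change(participants_pre, participants_post)
                  - mean_change(comparison_pre, comparison_post))
print("Estimated program effect:", program_effect)
```

The comparison-group contrast is what begins to answer the example slide’s third question: whether the outreach and education activities, rather than something else, are responsible for the observed change.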

Newer Methods
Rolling group designs:
- All groups receive the intervention, at different times; all provide pretest & posttest data; there are no wait-list controls.
- Groups that receive the intervention later serve as a control group until they, too, get the intervention.
- A pretest taken prior to the intervention is conceptualized as if it were a posttest of a “real” control group.

Newer Methods
Internal referencing strategy:
- A pretest–posttest single-group design in which content-relevant material covered in the intervention and material not covered are both assessed at pretest and posttest.
- The untrained (unpresented) material acts as a control for the presented material.
- The nonintervention material assessed should be conceptually related to, but distinct from, the intervention material.
- Effectiveness is inferred when improvement is seen from pretest to posttest on the intervention material, but little or no change on the nonintervention material.
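As a sketch of how the internal referencing comparison could be summarized (again, not part of the original slides; the item sets and scores are hypothetical):

```python
import statistics

# Hypothetical pretest/posttest scores on two item sets for the same participants:
# items covered by the intervention vs. related items not covered.
intervention_items = {
    "pre":  [3, 4, 2, 5, 4, 3],
    "post": [7, 8, 6, 8, 7, 7],
}
nonintervention_items = {
    "pre":  [4, 3, 5, 4, 4, 3],
    "post": [4, 4, 5, 5, 4, 4],
}

def gain(scores):
    """Mean pretest-to-posttest gain for one item set."""
    return statistics.mean(post - pre for pre, post in zip(scores["pre"], scores["post"]))

covered_gain = gain(intervention_items)
control_gain = gain(nonintervention_items)

print(f"Gain on intervention items:    {covered_gain:.2f}")
print(f"Gain on nonintervention items: {control_gain:.2f}")
# Effectiveness is suggested when the first gain is clearly larger than the second.
```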

Data Collection
- Begins before the program starts – needs assessments
- Think about what kind of data would answer the question
- Think about the best method for collecting the data, and how often
- Collect only what you need
- Nothing’s simple

Digging Through People’s Files
- Agency/program records
- Meeting records or minutes
- Other program records (e.g., schools)
- Public records (e.g., police or court)
Pros: Cheap and fast!
Cons: Have to get permission; may be biased because the data were collected for another reason.

Custom Data Collection
- Allows you to collect what you need, how you want
- Develop your own instruments or use standardized ones
- Typically occurs through surveys, in-person interviews, or focus groups
Pros: Can be fairly cheap; lets you ask questions the way you want.
Cons: Some people lie; interviewer bias; interviewee bias.
IMPORTANCE OF TESTING OR PILOTING

Things to Remember
- Collect only data that you will use & that are relevant
- Involve all staff involved in the data collection phase in up-front question formation
- Revise collection strategies based on initial analyses – what’s working? what’s still missing?
- Base changes to existing tracking/data collection strategies on what is learned from evaluation

Exercise worksheet – Goal: ____________________
Columns: Objectives | Outcome | Indicator | Data Source | Collection Method | Data Analysis

Impact Evaluation
In the program outcome model, impact indicators track program goals.

Impact Evaluation
- “Impact” is sometimes used to mean “outcome.”
- Impact is perhaps better defined as a longer-term outcome, such as improved patient outcomes.
- In global M&E, impact is often measured as the incidence or prevalence of disease.

A note about impact…
- Most program evaluations focus on measuring the process and outcomes of a program.
- Measuring impact requires significant resources that most programs don’t have.
- It’s also difficult to link the more immediate effects of a program to broad, often community-level, impacts.

Ethical Considerations
- At a minimum, all evaluation projects must ensure that they are fully in line with ethical research principles.
- Issues:
  - Help or benefit to others; do no harm; act fairly; respect others
  - Consideration of risks & benefits
  - Disruption of participants’ lives, emotional consequences, safety concerns, social harm

Ethical Considerations (cont’d)
- Keep evaluation procedures as brief & convenient as possible to minimize burden
- Do not ask emotionally troubling questions unless absolutely necessary
- Provide incentives
- Provide informed consent
- Protect confidentiality
- Ensure safety
- Obey HIPAA requirements
- Obtain IRB review where applicable

American Evaluation Association Principles
- Systematic inquiry
- Competence
- Integrity/honesty
- Respect for people
- Responsibilities for general and public welfare

Prevention Evaluation Differences
- The very nature of prevention poses unique challenges.
- A key difference in evaluating prevention programs is that you are trying to determine what DID NOT occur.
- You are measuring a reduction or delayed onset before it happens.
- Comparison groups then become essential.

Challenges in Evaluating Prevention Programs
- Timeframe
- Measurement
- Results
- Statistical significance
- Accountability
- Competition

Resources
- Kellogg Foundation
- American Evaluation Association
- CDC MMWR, September 17, 1999, Vol. 48, No. RR-11, “Framework for Program Evaluation in Public Health”
- CSAP’s Prevention Pathways: Online Courses

Hope This Helped
Contact me:
Christine Duclos, PhD, MPH
JSI Research & Training Institute, Inc.
Blake St. #320
Denver, CO