Program Evaluation Spero Manson PhD


Program evaluation is a careful investigation of a program’s characteristics and merits. Its purpose is to provide information about the effectiveness of an activity or set of activities in order to optimize the outcomes, efficiency, and quality of health care.

Evaluation examines a program's structure, activities, and organization, as well as the broader social environment in which it operates. Evaluation also appraises the achievement of a program's goals and objectives and the extent of its impact and cost.

We study, we plan, we research. And yet, somehow evaluation remains as much an art as a science.

The Basic Steps in Program Evaluation
- Describe the program or activities to evaluate
- Define goals and objectives
- Create a Logic Model
- Pose the evaluation question(s)
- Select the methods - the information needed to answer the question
- Gather the information
- Report on and learn from results

Describe the Program or Activities to Evaluate
- Need clarity as to the program and activities that are the focus of the evaluation
- Must be able to distinguish the program from the broader environment in which it operates
- Premium placed on specifying the nature of the activities: who, what, where, when, how

Define the Program's Goals and Objectives
- Goals: a broad, general statement of what the program hopes to accomplish
- Objectives: measurable, time-specific outcomes that are expected to be achieved as a result of the program or activities

Define the Program's Goals and Objectives
- Establishing measurable, time-specific outcomes is a major key to an evaluation's credibility
- The more specific they are, the easier they are to measure
- The absence of uniformly accepted definitions or levels of performance introduces ambiguity – the bane of program evaluation!

Define the Program's Goals and Objectives: Example
- Goal: Reduce the risk of diabetes in youth by increasing their participation in physical activity
- Objective: Increase the number of students participating in physical activity in the wellness center by 50% during the next school year
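
To make the objective concrete, here is a minimal sketch, in Python with hypothetical counts, of checking whether a measurable, time-specific objective like the one above was met.

```python
# Minimal sketch (hypothetical counts) of checking the example objective:
# "Increase the number of students participating in physical activity
#  in the wellness center by 50% during the next school year."

baseline_participants = 80   # hypothetical count from the prior school year
current_participants = 126   # hypothetical count from the evaluation year

percent_change = (current_participants - baseline_participants) / baseline_participants * 100
objective_met = percent_change >= 50

print(f"Change in participation: {percent_change:.1f}%")
print("Objective met" if objective_met else "Objective not met")
```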

Create a Logic Model
- An overall description of program activities and potential outcomes and impacts
- Useful for planning the evaluation, defining questions, and determining what information you can measure

Logic Model Components
- Context: influences, circumstances, resources, stakeholders
- Program goals, objectives
- Outputs, activities: what will be done or result
- Outcomes: potential measurable data, information
- Impacts: short- and long-term benefits

Logic Model Example
- Context (influences, circumstances, resources, stakeholders): tribe, school, health program, Move It! grant resources
- Program goals, objectives: summer camp to provide nutrition education and increase physical activity
- Outputs, activities (what will be done): daily walks, introduce new activities, prepare meals, classes on nutrition
- Outcomes (potential measurable data, information): increased participation, distance, and # steps per day; knowledge of nutrition and fitness; awareness of their risk for diabetes
- Impacts (short- and long-term benefits): weight loss, decreased obesity, prevention of diabetes
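
For planning purposes, a logic model can also be captured as a simple data structure. Below is a minimal Python sketch of the example above; the groupings and key names are illustrative, not part of the original slide.

```python
# A minimal sketch of the Move It! logic model as a plain data structure.
logic_model = {
    "context": ["Tribe", "school", "health program", "Move It! grant resources"],
    "goals_objectives": ["Provide nutrition education", "Increase physical activity"],
    "outputs_activities": ["Daily walks", "Introduce new activities",
                           "Prepare meals", "Classes on nutrition"],
    "outcomes": ["Participation, distance, # steps per day",
                 "Knowledge of nutrition and fitness",
                 "Awareness of diabetes risk"],
    "impacts": ["Weight loss", "Decreased obesity", "Prevention of diabetes"],
}

# Walking the model in order mirrors how evaluation questions are posed.
for component, items in logic_model.items():
    print(component, "->", "; ".join(items))
```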

Evaluation: Art as well as Science
They're harmless when they're alone, but get a bunch of them together to evaluate a program … watch out!!

Pose the Evaluation Question(s)
- The most important step: it determines the methods of the evaluation
- You can have more than one question
- Pose the questions after reviewing the logic model, when you can see the possible questions

Types of Evaluation Questions
- Process evaluation questions: Did we do what we said we would do? Documentation of activities, review
- Outcome evaluation questions: Did we achieve any short- or long-term outcomes as a result of our program?

Typical Evaluation Questions
- To what extent did the program achieve its goals and objectives?
- What are the characteristics of the individuals and groups who participated in the program?
- For which individuals or groups was the program most effective?
- How enduring were the effects?

Typical Evaluation Questions
- Which features (e.g., activities, settings, care strategies) of the program were most effective?
- How applicable are the program's objectives and activities to other participants in other settings?
- What are the relationships between the costs of the program and its effects?
- To what extent did changes in social, political, and/or financial circumstances influence the program's support and outcomes?

Evaluation Questions: Examples
- Process evaluation: Did the school implement the Move It! Campaign activities during the summer school session?
- Outcome evaluation: Did student awareness of their risk for diabetes increase after the summer school session? Did at least ½ of the student population participate in physical activity each day? What percent of students experienced weight loss during the summer school session?

Select the Methods
- What you need to do to answer your evaluation questions – depends on the question!
- Various methods:
  - Pre/post test on knowledge, attitudes
  - Track participation rates in activities
  - Track clinical parameters – weight, heart rate, blood glucose
  - Evaluate satisfaction with activities
- Always consider a comparison if possible:
  - Before vs. after comparisons
  - Participants vs. non-participants
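
As an illustration of a before vs. after comparison, here is a minimal Python sketch using hypothetical paired pre/post knowledge scores; a real evaluation would also apply an appropriate statistical test.

```python
# A minimal before/after comparison sketch with hypothetical pre/post
# knowledge scores for the same participants (paired measurements).
from statistics import mean

pre_scores = [55, 60, 48, 72, 65, 58, 70]    # hypothetical pre-test scores
post_scores = [68, 71, 55, 80, 66, 75, 82]   # hypothetical post-test scores

changes = [post - pre for pre, post in zip(pre_scores, post_scores)]
print(f"Mean pre-test score:  {mean(pre_scores):.1f}")
print(f"Mean post-test score: {mean(post_scores):.1f}")
print(f"Mean change:          {mean(changes):.1f}")
print(f"Participants who improved: {sum(c > 0 for c in changes)} of {len(changes)}")
```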

Evaluation Designs
- Evaluations with concurrent controls in which participants are randomly assigned to groups
- Benefits: If properly conducted, can establish the extent to which a program caused outcomes
- Concerns: More difficult to implement, logistically and methodologically
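
A minimal sketch of how random assignment to program and concurrent control groups might be carried out; the participant IDs and fixed seed are illustrative only.

```python
# Randomly assign participants to a program group and a control group.
import random

participants = ["P01", "P02", "P03", "P04", "P05", "P06", "P07", "P08"]

random.seed(42)                 # fixed seed so the assignment can be reproduced
random.shuffle(participants)    # shuffle in place

midpoint = len(participants) // 2
program_group = participants[:midpoint]
control_group = participants[midpoint:]

print("Program group:", program_group)
print("Control group:", control_group)
```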

Evaluation Designs
- Evaluations with concurrent controls in which participants are not randomly assigned to groups
- Benefits: Easier to implement
- Concerns: A wide range of potential biases may occur because, without an equal chance of selection, participants in the program may be systematically different from those in the control group. Also, the two groups in the evaluation may be systematically different from other, nonparticipating groups.

Evaluation Designs
- Evaluations with self-controls require pre-/post-measures and often are referred to as longitudinal evaluations or before-and-after designs
- Benefits: Relatively easy to implement logistically; provides data on change and improvement
- Concerns: Must be certain that measurements are appropriately timed. Without a control group, you cannot tell whether apparent program effects are also present among nonparticipants.

Evaluation Designs
- Evaluations with historical controls use data collected from participants in other evaluations
- Benefits: Easy to implement, unobtrusive
- Concerns: Must be certain that "normative" comparisons are applicable to participants in the evaluation.

Threats to Validity of Evaluation Maturation – as a part of normal human development, individuals mature intellectually, emotionally, and socially. This new maturity may be as important as the program in producing change.

Threats to Validity of Evaluation History – historical events may occur that can bias results or produce changes similar to those intended by the program, e.g., new educational campaigns that encourage the community at large to change its behavior, changes in the structure and financing of health care, etc.

Threats to Validity of Evaluation Instrumentation – unless the measures or tools used to collect the data are dependable, one cannot be confident that the data are accurate.

Threats to Validity of Evaluation Attrition – the participants who remain in a program may be, and indeed often are, different from those who drop out.

Gather the Information
- Set up a process for gathering the information
- Create forms, tracking sheets, surveys
- Plan for how you will analyze or review the information when it is collected:
  - Hand counts, tallies
  - Spreadsheets, statistical databases
  - Common themes
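
For the hand-count/tally option, here is a minimal Python sketch of tallying coded themes from open-ended responses; the codes below are hypothetical.

```python
# Tally coded themes from open-ended survey responses (hypothetical codes).
from collections import Counter

coded_responses = [
    "liked daily walks", "wants more activities", "liked daily walks",
    "nutrition classes helpful", "liked daily walks", "wants more activities",
]

theme_counts = Counter(coded_responses)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```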

Questions to Ask in Choosing a Data Source
- What variables, constructs, and concepts need to be measured? Are they sufficiently well defined to be measured?
- Can I borrow or adapt a currently available measure, or must a new measure be created?
- If an available measure seems appropriate, has it been used in circumstances similar to the current evaluation?

Questions to Ask in Choosing a Data Source
- Do I have the technical skills, financial resources, and time to create a valid measure?
- Do I have the technical skills, financial resources, and time to collect information with the chosen measure?
- Are participants likely to be able to fill out the forms, answer the questions, and provide the information called for by the measure?

Questions to Ask in Choosing a Data Source
- When medical and other confidential records are relevant data sources, are these records likely to:
  - Contain the necessary information?
  - Be complete?
  - Be timely?
  - Be reliable?
  - Be accessible?
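
One way to check the "Be complete?" question against a records-based data source is a simple completeness screen; the field names and records below are hypothetical.

```python
# Flag records that are missing required fields (hypothetical fields/records).
required_fields = ["participant_id", "visit_date", "weight_kg"]

records = [
    {"participant_id": "P01", "visit_date": "2016-06-01", "weight_kg": 62.5},
    {"participant_id": "P02", "visit_date": "2016-06-01"},    # missing weight
    {"participant_id": "P03", "weight_kg": 70.1},             # missing visit date
]

incomplete = [r for r in records if any(f not in r for f in required_fields)]
print(f"{len(incomplete)} of {len(records)} records are incomplete")
```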

Questions to Ask in Choosing a Data Source To what extent will users of the evaluation’s results have confidence in the nature of the information, the manner in which it was collected, and its sources?

Report on and Learn from Results
- The most commonly forgotten step
- Communicate results to stakeholders
- Review results – lessons learned
- Develop a plan to respond to the results, including future activities to address them
- Communicate with parents, community
- Include in the grant progress report!
- Publish in appropriate scientific forums to establish best practices specific to our communities