Summary Slide PROGRAM EVALUATION PA430 – Week 7 – 2/29-3/1/00.

PROGRAM EVALUATION A systematic assessment of the operation and/or outcomes of a program or policy Comparison against a set of explicit or implicit standards Means of contributing to the improvement of the program or policy

Key Elements Systematic assessment Focus is operations and outcomes of program Standards for comparison Purpose is to contribute to improvement of program/policy

Definitions Program – a specific government activity (examples: Head Start, toxic waste cleanup, immunization of children) Policy – a broad, officially recognized statement of objectives tied to a variety of government activities (examples: health and safety of medical products)

Main questions of evaluation research Outcome evaluation Is the program reaching the goals it was meant to accomplish? Includes both intended and unintended results Impact evaluation What happens to participants as a result of the program?

Main questions of evaluation research Process evaluation What is the program actually doing? What kinds of services are being provided? Are clients satisfied with the program? Helps in understanding of outcome data

Reasons for Evaluation Research Formative – to help improve design of program. Improve delivery of program/activity. Summative – to provide information at the end of a program (or at least one cycle of it) about whether it should be continued or modified

How is evaluation research different from other types of research? Utility – it is intended for use by program administrators Program-derived questions – questions are derived from concerns of program communities Judgmental quality – Tends to compare the “what is” with the “what should be”

Action setting – must be field-based research. Goals of evaluation sometimes conflict with goals of program or its administrators Role conflicts – it is difficult for program administrators to remove themselves from commitment to and positive view of their program

Publication – basic research is usually published in academic journals. Most evaluation research is not published – remains “in-house” Allegiance – research has dual role – both program improvement and contribution to understanding of a particular policy area or evaluation research

Who performs evaluation research? Three basic ways Assign the task to a staff member of the agency Hire an outside, independent evaluation or consulting firm (sometimes a university researcher) Open bidding to all potential evaluators (often through an RFP – request for proposal) Periodic evaluation of programs often required as condition of grant (either public or private granting agency)

Inside vs Outside Evaluation Is the evaluator a member of the agency staff or not Concerns over the evaluator’s perspective Do they have a stake in the study results Are they too removed from the program (too “ivory tower”) Is the evaluator competent

Objectivity – what are the possible biases of the evaluator Program knowledge – how well does the evaluator understand the program (process, goals, etc.) Potential for utilization – evaluators often must take an active role in moving evaluation from report to action (implementation)

Step 1: Understand the Program Begin with a general knowledge of the field but quickly develop an in-depth understanding of the specific program. How? Research by others in the general and specific area Written materials of the program Field research including interviews

Why this is important To develop a sense of the issues – separate the wheat from the chaff To formulate relevant, incisive questions To interpret the evidence/findings To make sound recommendations for program change or continuation To write a thorough, useable report

Step 1: Understand the Program Develop a characterization of the program (reality vs the illusion) read previous evaluations talk to program directors observation data-based inquiry What is the program trying to achieve? Begin with official goals (if available) get other, more contemporary information from program managers communicate with clients

Step 1: Understand the Program How does the program expect to achieve its goals? Not just did the program work, but what made it work examine program’s theories of change - the set of beliefs that underlie action an explanation of the causal links between program and outcomes
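A theory of change can be made explicit by writing out its causal links. Below is a minimal sketch (not from the lecture) that represents a hypothetical job-training program's theory of change as a chain of if-then statements; the program and the links are illustrative assumptions only.

```python
# A minimal sketch of a program's theory of change written as an explicit chain
# of if-then links. The job-training program and the links below are hypothetical,
# used only for illustration.

theory_of_change = [
    ("job-training workshops are offered", "unemployed clients attend sessions"),
    ("clients attend sessions", "clients gain interview and resume skills"),
    ("clients gain interview and resume skills", "clients apply for more jobs"),
    ("clients apply for more jobs", "employment among participants increases"),
]

# Each link is one causal assumption the evaluation can examine.
for cause, effect in theory_of_change:
    print(f"IF {cause} THEN {effect}")
```

Laying the links out this way makes it easier to ask not just whether the program worked, but which link in the chain broke down if it did not.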

Step 2: Plan the Evaluation Identify key questions for study decide on analysis method: quantitative, qualitative, or both develop measures to answer questions plan data collection to operationalize key measures plan an appropriate research design collect and analyze data write and disseminate report promote appropriate use of the results

Step 2: Plan the Evaluation Additional considerations long-term vs short-term study questions should examine both intended and unintended impacts of program practicalities (clout of stakeholders, uncertainties, decision timetable) advisory committee ethical issues

Step 3: Develop measures Desired outcomes effects on persons served effects on agencies effects on larger systems (networks) effects on the public unintended outcomes both positive and negative Interim markers of progress towards outcomes real changes desired may lie far in future

Step 3: Develop measures components of program implementation (program processes) how the program is carried out how the program operates and for whom program quality resources, inputs, and environment budgets, staff, location management, years of operation client eligibility standards
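One common way to organize Step 3 is an indicator matrix that ties each measure to an indicator and a data source before collection begins. The sketch below uses entirely hypothetical measures and sources, chosen only to illustrate the structure.

```python
# A minimal sketch, with hypothetical measures, of an indicator matrix linking
# each outcome, interim, process, and input measure to an indicator and a data source.

measures = [
    {"type": "outcome", "measure": "employment six months after exit",
     "indicator": "% of participants employed", "source": "follow-up survey"},
    {"type": "interim", "measure": "skill gains during the program",
     "indicator": "pre/post test score change", "source": "program records"},
    {"type": "process", "measure": "service delivery",
     "indicator": "sessions delivered vs. planned", "source": "staff logs"},
    {"type": "input", "measure": "resources and environment",
     "indicator": "budget and staff hours used", "source": "agency budget"},
]

# Print the matrix as a simple aligned table.
for m in measures:
    print(f"{m['type']:>8} | {m['measure']:<34} | {m['indicator']:<30} | {m['source']}")
```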

Step 4: Collect data Data sources existing data informal interviews observations formal interviews, written questionnaires program records data from outside institutions

Step 5: Select a research design Identify people/units to be studied how study units will be selected kinds of comparisons to be drawn timing of the investigation Outcome studies underlying logic is: compare program participants before and after receiving program compare participants with non-participants

Step 5: Select a research design Informal designs self-evaluation by administrators, staff, and clients expert judgment (outsider knowledge) Formal designs post-test only pre-test, post-test comparison group time series designs
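The pre-test/post-test comparison-group design in the list above follows the logic of a simple difference-in-differences: contrast the change among participants with the change among non-participants. Here is a minimal sketch with made-up numbers, not course data.

```python
# A minimal sketch, with made-up numbers, of the logic behind a pre-test/post-test
# comparison-group design: compare the change among program participants with the
# change among non-participants (a simple difference-in-differences).

participants = {"pre": 52.0, "post": 61.0}  # hypothetical mean outcome scores
comparison = {"pre": 50.0, "post": 53.0}

change_participants = participants["post"] - participants["pre"]  # +9.0
change_comparison = comparison["post"] - comparison["pre"]        # +3.0
estimated_effect = change_participants - change_comparison        # +6.0

print(f"Participant change:       {change_participants:+.1f}")
print(f"Comparison-group change:  {change_comparison:+.1f}")
print(f"Estimated program effect: {estimated_effect:+.1f}")
```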

Step 6: Analyze and interpret data Whether quantitative or qualitative data, goal is to convert a mass of raw data into a coherent, organized report Types of analytical strategy describe, count factor, cluster (divide into parts) compare, find commonalities covariation tell the story
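To make a few of these strategies concrete, the sketch below uses hypothetical client records (not any course data) to show describe/count, compare, and covariation in a handful of lines.

```python
# A minimal sketch, using hypothetical client records, of three analytical
# strategies from the slide: describe/count, compare groups, and covariation.
from statistics import mean

records = [
    {"group": "program", "sessions": 10, "score": 68},
    {"group": "program", "sessions": 7, "score": 61},
    {"group": "comparison", "sessions": 0, "score": 52},
    {"group": "comparison", "sessions": 0, "score": 55},
]

# Describe / count: how many clients did the program serve?
print("program clients:", sum(r["group"] == "program" for r in records))

# Compare: mean outcome score by group.
for g in ("program", "comparison"):
    print(g, "mean score:", mean(r["score"] for r in records if r["group"] == g))

# Covariation: do more sessions go with higher scores? (simple Pearson r)
xs = [r["sessions"] for r in records]
ys = [r["score"] for r in records]
mx, my = mean(xs), mean(ys)
num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
print("correlation (sessions, score):", round(num / den, 2))
```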

Step 7: Write the report What should the report look like? It depends!! May require more than one report (different audiences) comprehensive report may be required by sponsor of agency/evaluation - may be too much for most audiences executive summary

Step 7: Write the report Possible topics (full report) summary of study results findings, implications, recommendations problem with which program deals nature of the program context (history, sponsorship, setting) beneficiaries staff how study was conducted suggestions for further evaluation

Step 7: Write the report Other report types summary report for clients and public short, focus on highlights executive summary broader, useful for variety of audiences Ultimate goal: a report that is clear, timely, generalizable to other similar programs, inclusive of the organization's views, and of high utility