Building an Organizational Evaluation Infrastructure / Culture to Improve the Management and Outcomes of Health Programs
Judith Hager, MA, MPH
Molly Bradshaw, MPH
Geraldine Oliva, MD, MPH
Family Health Outcomes Project, University of California, San Francisco
October 24, 2001
An Evaluation Infrastructure
– An institutional philosophy/ethic of providing quality, relevant, cost-effective services
– A shared, systematic approach/framework
– Staff with an understanding of evaluation and basic evaluation skills
– Access to evaluation resources (tools, information, data, etc.)
Why an Evaluation Infrastructure?
– A formal system of accountability
– Influence external decision making
– Assist program management decisions: how best to use resources, improve client and staff satisfaction
– Promote team building
Evaluation Is:
– The systematic investigation of the quality and effectiveness of organized efforts or activities
– The systematic assessment of the merit, worth, or significance of an object (CDC)
A Program Is
– Any organized set of activities intended to achieve an outcome (supported by resources)
– Can be an initiative, a departmental program, or a project
Three Approaches to Building an Evaluation Infrastructure
– State MCH-sponsored evaluation course
– Nonprofit community collaboration framework
– County health department capacity building
MCH “Evaluation Course” for County MCH Teams
– Two workshop days with county teams
– Teams choose and develop a “real” program to evaluate
– Technical assistance by phone or on-site
– Assignments and feedback
Solano Coalition for Better Health (a 501(c)(3) nonprofit)
– Comprised of hospitals, health plans, the health department, CBOs, and community clinics
– Contracts for development, facilitation, and data collection and analysis
– Three goals / community health indicators
– Members agree to share information
Mendocino County Health Department
– Mandate for all division/program managers
– Philosophy of program improvement and resource allocation
– Two-day workshop
– Participants develop evaluations for group critique
– Manuals and train-the-trainer sessions
Elements of an Evaluation Infrastructure
– Mandate
– Purpose
– Model/approach
– Training
– Expertise
– Data capability
– Gestalt / integrated
Evaluation Planning Framework (CDC)
– Engage stakeholders
– Describe the program
– Focus the evaluation design
– Gather and analyze evidence
– Justify conclusions
– Ensure use and share lessons learned
Creating and Using a Logic Model
Purposes:
– Understand the program: how it works and the results it is intended to produce
– Test the program’s logic
More About Why…
– Examine and improve broad, fuzzy objectives
– Show the “chain of events” linking inputs to results
– Clarify the difference between activities and outcomes
– Identify gaps in logic and unstated assumptions
– Help determine what to evaluate and the key questions
– Build understanding and consensus about the program
Building a Logic Model
Evaluation logic models:
– depict how a program works to achieve its intended outcomes
– may take the form of a flow chart, table, diagram, etc.
– have common elements
Elements of an Evaluation Logic Model
– Inputs (resources)
– Activities (interventions)
– Outputs
– Outcomes (short-, intermediate-, and long-term)
Some models also include other elements, such as a problem statement, assumptions, environment, and program target (e.g., 13- to 17-year-olds, defined by age, sex, and location)
Today, We Will Use an Adaptation of the UWEX* Logic Model
A graphic representation that shows the logical relationships between inputs, outputs, and outcomes relative to a situation
Elements:
– Problem statement / situation
– Inputs
– Outputs
– Outcomes
– Assumptions
– Environment
* Ellen Taylor-Powell, University of Wisconsin–Extension
Program Logic Model: Preparation
– Problem statement
– Program description
– Goals and objectives
– Stakeholders / use of the evaluation
– Program concepts / action theory
– Literature review
UWEX Logic Model (diagram)
Situation → Inputs → Outputs → Outcomes
Assumptions (theories that guide your program)
Environment (external factors)
What It Tells Us
– Inputs: programmatic investments, the resources allocated
– Outputs: activities and participation (the effort: what the program does and whom it targets)
– Outcomes: short, intermediate, and long term (with what results)
Evaluation Planning (diagram)
Inputs (programmatic investments) → Outputs (activities, participation) → Outcomes (short, medium, and long term)
Inputs
– Staff (including any special requirements)
– Money
– Location
– Volunteers
– Partners
– Equipment
– Technology
Outputs: What the Program Does and Who It Reaches
Activities:
– Treatment
– Classes
– Counseling
– Case management
– Curriculum design
– Trainings
– Conferences
Participation:
– Participants
– Providers
– Users
Outcomes: What Results for Individuals, Agencies, Communities…
Short-term (learning):
– Access, awareness, knowledge, attitudes, skills, opinions, aspirations, motivation
Intermediate (action):
– Behavior, practice, decisions, policies, systems change
Long-term (conditions):
– Mortality, morbidity, quality of life, environment
Logical Linkages: An Example Series of If-Then Relationships
IF the program invests time and money, THEN a parenting curriculum is designed;
IF the curriculum is designed, THEN parents increase their knowledge;
IF parents increase their knowledge, THEN parenting improves;
IF parenting improves, THEN rates of child abuse decrease.
(Inputs → Outputs → Outcomes)
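A minimal sketch of this chain in code (Python; purely illustrative). Representing each link as a (condition, result) pair makes the “test logic” purpose from an earlier slide mechanical: every THEN should reappear as the IF of the next link, and any mismatch exposes a gap in the chain of events:

# The if-then chain from the slide, as (condition, result) pairs.
chain = [
    ("program invests time & money", "parenting curriculum is designed"),
    ("parenting curriculum is designed", "parents increase knowledge"),
    ("parents increase knowledge", "parenting is improved"),
    ("parenting is improved", "rates of child abuse decrease"),
]

# "Test logic": each link's result must be the next link's condition;
# otherwise there is a gap in the program's chain of events.
for (_, result), (condition, _) in zip(chain, chain[1:]):
    assert result == condition, f"logic gap between '{result}' and '{condition}'"

print(" -> ".join([chain[0][0]] + [result for _, result in chain]))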
Program Example (Problem: Child Abuse)
Inputs: staff, money, partners
Outputs: design parent education curriculum; provide six training sessions; targeted parents attend
Outcomes: parents increase knowledge of child development; parents learn new ways to discipline; parents use improved parenting skills; reduced rates of child abuse and neglect
Where Does Evaluation Fit?
Evaluation attaches two questions to each step of the logic model: What do you want to know? What data do you need? For the child-abuse program example:
– Design parent ed. curriculum: quality of the curriculum
– Provide 6 training sessions: number of sessions delivered
– Targeted parents attend: number of parents attending sessions; demographics of parents
– Parents increase knowledge / learn new ways to discipline: increase in knowledge/skill (pre/post-session survey)
– Parents use improved parenting skills: behavior change (follow-up interview/observation)
– Reduced rates of child abuse and neglect: decrease in rates (agency records)
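A minimal sketch of how this evaluation plan could be kept alongside the logic model (Python; the steps and questions come from the slide, while data sources the slide does not name are marked as assumptions):

evaluation_plan = [
    {"step": "Design parent ed. curriculum",
     "want_to_know": "quality of the curriculum",
     "data_needed": "curriculum review (assumed source)"},
    {"step": "Provide 6 training sessions",
     "want_to_know": "number of sessions delivered",
     "data_needed": "program records (assumed source)"},
    {"step": "Targeted parents attend",
     "want_to_know": "number and demographics of parents attending",
     "data_needed": "attendance logs (assumed source)"},
    {"step": "Parents increase knowledge / learn new discipline",
     "want_to_know": "increase in knowledge and skill",
     "data_needed": "pre/post-session survey"},
    {"step": "Parents use improved parenting skills",
     "want_to_know": "behavior change",
     "data_needed": "follow-up interview / observation"},
    {"step": "Reduced rates of child abuse and neglect",
     "want_to_know": "decrease in rates",
     "data_needed": "agency records"},
]

for row in evaluation_plan:
    print(f"{row['step']}: {row['want_to_know']} -> {row['data_needed']}")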
Elements of an Evaluation Infrastructure (recap)
– Mandate
– Purpose
– Model/approach
– Training
– Expertise
– Data capability
– Gestalt / integrated
Limitations of an Evaluation Infrastructure
– Requires staff commitment and time
– It is difficult to evaluate one’s own work
– May not support rigorous evaluation
– Not the answer to management problems
– Staff still need access to expertise
Recommendations
– Leadership must build trust
– Develop a common language
– Introduce an evaluation model
– Provide training
Recommendations (continued)
– Build a team approach
– Provide evaluation tools
– Allocate time, with a long-term view
– Provide guidelines about what is important to evaluate
Recommendations (continued)
– Understand that evaluation is complex:
  – Accountability extends to the denominator population
  – Understand the stages of evaluation
– Know when consultants work best
Program Planning and Evaluation