1 Evaluation as Continuous Improvement
The Health Disparities Service-Learning Collaborative
Suzanne B. Cashman
February 27, 2007
2 "Good evaluation" is nothing more than "good thinking."
Evaluation is the systematic collection of information about the activities, characteristics, and outcomes of programs, personnel, and products, for use in reducing uncertainties, improving effectiveness, and making decisions. (Patton, 1997)
3 Evaluation as Assessment/Improvement
- A mechanism to tell the story
- Becomes less of a burdensome add-on
- Useful learnings
  - For yourself
  - For others
4 Why Evaluate?
- Reduce uncertainties
- Measure program achievement
- Improve effectiveness
- Demonstrate accountability
- Make programmatic decisions
- Build constituency
- Influence policy
5 Why are you engaged in evaluation?
6 Comparison of Academic Research and Practical Evaluation

              Academic Research        Practical Evaluation
Purpose       Test hypotheses          Improve program/practice
Method        Controlled environment   Context sensitive
Statistics    Sophisticated            Simpler
7 Program Evaluation
- Commitment to following the "rules" of social research
- But more than the application of methods: also a political and managerial activity, providing input into the process of policy making and allocation for planning, designing, implementing, and continuing programs
8 Program Evaluation
Rooted in scientific methodology, but responsive to resource constraints, the needs and purposes of stakeholders, and the nature of the evaluation setting.
9 Key Questions
- What is the aim of the assessment?
- Who or what wants/needs the information?
- What resources are available?
- Who will conduct the assessment?
- How can you ensure results are used?
10 Evaluation should:
- Strengthen projects
- Use multiple approaches
- Address real issues
- Create a participatory process
- Allow for flexibility
- Build capacity
(W.K. Kellogg Foundation, 1998)
11 Evaluation should tell us:
- What has been done
- How well it has been done
- How much has been done
- How effective the work/program has been
12 Reasons to Evaluate
- Measure program achievement
- Demonstrate accountability
- Examine resource needs and effectiveness
- Improve operations; obtain and give feedback
- Influence policy
- Expand voices
13 Evaluation Framework (CDC)
I. Engage Stakeholders
II. Describe the Program
III. Focus the Design
IV. Gather Credible Evidence
V. Justify Conclusions
VI. Ensure Use and Share Lessons Learned
14 I. Stakeholders
People who have a "stake" in what will be learned from an evaluation and what will be done with the knowledge. They include:
- People who manage or work in the program/organization
- People who are served or affected by the program, or who work in partnership with the program
- People who are in a position to do or to decide something about the program
(CDC, 1998)
15 Stakeholders
- Stakeholders' information needs and intended uses serve to focus the evaluation
- A variety of stakeholders may mean:
  - more than one focus (policy implications vs. documentation of local activities)
  - varied levels of involvement
16 Stakeholders
- Who are your stakeholders?
- How do their needs and desires differ from one another?
17 II. Describe Program
- Need
- Expectations
- Activities
- Context
18 Expectations
- Outcome objectives: a statement of the amount of change expected for a given problem/condition, for a specified population, within a given timeframe (a hypothetical worked example follows below)
- Process objectives: a statement of the amount of change expected in the performance or utilization of the interventions that are related to the outcome
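The distinction can be made concrete with a small worked example. The sketch below is purely illustrative and not from the slides: the condition, community, and figures are all assumptions. It states an outcome objective as a target change in prevalence over a timeframe and checks whether follow-up data meet it (Python is used only as convenient notation).

    # Hypothetical outcome objective: reduce adult smoking prevalence in the
    # target community from 22% to 18% within three years (illustrative figures).
    baseline = 0.22   # prevalence when the program begins
    target = 0.18     # prevalence the objective commits to reaching
    observed = 0.19   # prevalence measured at the three-year follow-up

    expected_change = baseline - target    # change the objective calls for
    achieved_change = baseline - observed  # change actually observed

    print(f"Target reduction:   {expected_change * 100:.1f} percentage points")
    print(f"Achieved reduction: {achieved_change * 100:.1f} percentage points")
    print("Outcome objective met." if observed <= target else "Outcome objective not yet met.")

A matching process objective would instead track delivery of the intervention itself, for example the number of cessation counseling sessions held per quarter, rather than the change in prevalence.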
19 III. Focus the Design
- Questions to answer
- Process to follow
- Methods to use
- Activities to develop and implement
- Results to disseminate
20 Clarify Individual, Systems, or Community Level
- Individual: individually targeted services or programs, often for people at high risk
- Systems: change organizations, policies, laws, or structures
- Community: focus is on community norms, attitudes, beliefs, and practices
21 IV. Gather Credible Evidence
- Types of data: demographic, health status, expenditures, quality of life, eligibility, utilization, capacity
- Sources of data: statistical reports, published studies, voluntary organizations, program reports, media articles, government reports, state surveys
22 Thinking about data
- Match the data to the questions: what kinds of information would be worthwhile?
- As much as possible, use data that are being created as a regular part of the program
- Collect and analyze data from multiple perspectives
- Keep available resources in mind
23 Thinking about data (continued)
- Where might we find them?
- How might we obtain them?
- What types should we consider?
- What do we do now that we have them?
24 Who can help us collect and make sense of data?
- Community partners
- Student participants
- College administrative offices
- Faculty colleagues (and their students)
- Students who participated in previous programs
- Campus service-learning centers
25 Indicators of Well-being: Dimensions to Consider (Cohen, MDPH)
- Assets, traditional: social indicators, quality of life, self-reports of health
- Assets, less traditional: resiliency, satisfaction, resources and investment
- Deficits, traditional: disease, utilization of medical services
- Deficits, less traditional: gaps among groups; education, economic, cultural, and safety deficits
26 Indicators of Well-being: Dimensions to Consider (continued)
- Assets, traditional: use of prenatal care, self-reported health, screening rates, % insured, graduation rate
- Assets, less traditional: quality-adjusted life years, social networks, rescue response time, support for needle exchange, volunteerism
- Deficits, traditional: age-adjusted death rate, hospitalizations, smoking prevalence
- Deficits, less traditional: lack of support for arts/culture, crimes per capita
27 Specific Data Collection Methods
- Surveys
- Interviews
- Focus groups
- Literature search
- Structured observations
- Critical events log
- Institutional documentation
28 Now that we have the data...
- Analyze
  - Quantitative (statistical software; a minimal sketch follows below)
  - Qualitative (systematic review and assessment)
- Synthesize information
- Follow a framework of concepts
- Write reports
- Disseminate
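As a minimal sketch of the quantitative step above (assuming, for illustration only, pre/post survey ratings on a 1-5 scale; the scores and names are hypothetical, not program data), the fragment below computes simple descriptive statistics using only Python's standard library:

    # Summarize hypothetical pre/post survey ratings collected from participants.
    from statistics import mean, stdev

    pre_scores = [2, 3, 3, 2, 4, 3, 2, 3]    # ratings before the program
    post_scores = [4, 4, 3, 4, 5, 4, 3, 4]   # ratings after the program

    def summarize(label, scores):
        """Print simple descriptive statistics for one wave of responses."""
        print(f"{label}: n={len(scores)}, mean={mean(scores):.2f}, sd={stdev(scores):.2f}")

    summarize("Pre ", pre_scores)
    summarize("Post", post_scores)

    # The mean change gives a first, descriptive look at the program's effect;
    # inferential tests and the qualitative analysis would follow.
    print(f"Mean change: {mean(post_scores) - mean(pre_scores):+.2f}")

In practice this step is usually done in a statistical package or spreadsheet, but the logic is the same: summarize each wave of data, then compare the waves against the program's objectives.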
29 V. Justify Conclusions
- Review the findings
  - What do they mean?
  - How significant are they?
  - How do the findings compare to the objectives of the program?
  - What claims or recommendations are indicated?
30 VI. Ensure Use and Share Lessons
- Through deliberate planning, preparation, and follow-up
- Collaborate with stakeholders for meaningful:
  - communication of results (process and outcome)
  - decisions based on results
  - new assessment plans emerging from results
  - reflection on the assessment process
31 Challenges
- Important things are difficult to measure
- Complexity
- Measurement validity
- Time
- Proof of causation
- Need to be sensitive to context
- Resources
32 Challenges
What are the challenges you face?
33 Summary: Characteristics of Evaluation
- An evolving process
- A variety of approaches
- More than collecting and analyzing data
- Critical design issues
- Reconciles competing expectations
- Recognizes and engages stakeholders
34 References
- Bell R, Furco A, Ammon M, Muller P, Sorgen V. Institutionalizing Service-Learning in Higher Education. Berkeley: University of California; 2000.
- Centers for Disease Control and Prevention. Practical Evaluation of Public Health Programs. 1998.
- Kramer M. Make It Last: The Institutionalization of Service-Learning in America. Washington, DC: Corporation for National Service; 2000.
- Patton M. Utilization-Focused Evaluation. Sage Publications; 1997.
- W.K. Kellogg Foundation. Evaluation Handbook. 1998.