1 Evaluation as Continuous Improvement The Health Disparities Service-Learning Collaborative Suzanne B Cashman February 27, 2007

2 “Good evaluation” is nothing more than “good thinking.” It is the systematic collection of information about the activities, characteristics, and outcomes of programs, personnel, and products, used to reduce uncertainties, improve effectiveness, and make decisions. Patton, 1997

3 Evaluation as Assessment/Improvement  Mechanism to tell the story: becomes less of a burdensome add-on  Useful lessons For yourself For others

4 Why Evaluate? Reduce uncertainties Measure program achievement Improve effectiveness Demonstrate accountability Make programmatic decisions Build constituency Influence policy

5 Why are you engaged in evaluation?

6 Comparison of Academic Research and Practical Evaluation

             Academic Research        Practical Evaluation
Purpose      Test hypotheses          Improve program/practice
Method       Controlled environment   Context sensitive
Statistics   Sophisticated            Simpler

7 Program Evaluation  Commitment to following the “rules” of social research  But more than the application of methods… also a political and managerial activity: input into the process of policy making and resource allocation for planning, designing, implementing, and continuing programs

8 Program Evaluation  Rooted in scientific methodology, but responsive to resource constraints, needs/purposes of stakeholders, and nature of evaluation setting

9 Key Questions  What is the aim of the assessment?  Who/What wants/needs the information?  What resources are available?  Who will conduct the assessment?  How can you ensure results are used?

10 Evaluation should:  Strengthen projects  Use multiple approaches  Address real issues  Create a participatory process  Allow for flexibility  Build capacity W.K. Kellogg Foundation, 1998

11 Evaluation should tell us:  What has been done  How well it has been done  How much has been done  How effective the work/program has been

12 Reasons to Evaluate  Measure program achievement  Demonstrate accountability  Examine resource needs and effectiveness  Improve operations, obtain and give feedback  Influence policy  Expand voices

13 Evaluation Framework (CDC) I. Engage Stakeholders II. Describe Program III. Focus the Design IV. Gather Credible Evidence V. Justify Conclusions VI. Ensure Use and Share Lessons Learned

14 I. Stakeholders  People who have a “stake” in what will be learned from an evaluation and what will be done with the knowledge  They include: People who manage or work in the program/organization People who are served or affected by the program, or who work in partnership with the program People who are in a position to do or to decide something about the program CDC, 1998

15 Stakeholders  Stakeholders’ information needs and intended uses serve to focus the evaluation  Variety of stakeholders may mean: more than one focus (policy implications vs documentation of local activities) varied levels of involvement

16 Stakeholders  Who are your stakeholders?  How do their needs and desires differ from one another?

17 II. Describe Program  Need  Expectations  Activities  Context

18 Expectations  Outcome Objectives: statement of the amount of change expected for a given problem/condition, for a specified population, within a given timeframe  Process Objectives: statement of the amount of change expected in the performance or utilization of interventions that are related to the outcome

19 III. Focus the Design  Questions to answer  Process to follow  Methods to use  Activities to develop and implement  Results to disseminate

20 Clarify  Individual, Systems, or Community Level Individual: individually targeted services or programs, often for people at high-risk Systems: change organizations, policies, laws, or structures Community: focus is on community norms, attitudes, beliefs, practices

21 IV. Gather Credible Evidence  Types of data Demographic, health status, expenditures, quality of life, eligibility, utilization, capacity  Sources of data Statistical reports, published studies, voluntary organizations, program reports, media articles, government reports, state surveys

22 Thinking about data  Match the data to the questions – what kinds of information would be worthwhile?  As much as possible, use data that are being created as a regular part of the program  Collect and analyze data from multiple perspectives  Keep available resources in mind

23 Thinking about data (continued)  Where might we find them?  How might we obtain them?  What types should we consider?  What do we do now that we have them?

24 Who can help us collect and make sense of data?  Community partners  Student participants  College administrative offices  Faculty colleagues (and their students)  Students who participated in previous programs  Campus service-learning centers

25 Indicators of Well-being: Dimensions to Consider (Cohen, MDPH)

             Traditional                  Less Traditional
Assets       Social indicators,           Resiliency, Quality of life,
             Self-reports of health       Satisfaction, Resources & Investment
Deficits     Disease, Utilization of      Gaps among groups, Education,
             medical services             Economics, Cultural, Safety deficits

26 (continued)

             Traditional                  Less Traditional
Assets       Use of pre-natal care        Quality-adjusted life years
             Self-reported health         Social networks
             Screening rates              Rescue response time
             % insured                    Support for needle exchange
             Graduation rate              Volunteerism
Deficits     Age-adjusted death rate      Lack of support for arts/culture
             Hospitalizations             Crimes per capita
             Smoking prevalence
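The "age-adjusted death rate" indicator above is itself a small computation. As a minimal sketch of direct standardization (the age strata, death counts, populations, and standard-population weights below are invented for illustration, not taken from the presentation):

```python
# Direct standardization: age-adjusted rate = sum over age strata of
# (stratum crude death rate) x (standard-population weight).
# All numbers are hypothetical, for illustration only.

strata = [
    # (age group, deaths, population, standard-population weight)
    ("0-24",  40,  50_000, 0.35),
    ("25-64", 300, 40_000, 0.50),
    ("65+",   900, 10_000, 0.15),
]

def age_adjusted_rate(strata, per=100_000):
    """Weight each stratum's crude death rate by its standard-population share."""
    return sum(deaths / pop * weight for _, deaths, pop, weight in strata) * per

rate = age_adjusted_rate(strata)
print(f"Age-adjusted death rate: {rate:.1f} per 100,000")
```

Because the weights come from a fixed standard population, rates computed this way are comparable across communities with different age structures, which is what makes the indicator useful for the gap analyses the slide mentions.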

27 Specific Data Collection Methods  Surveys  Interviews  Focus groups  Literature search  Structured observations  Critical events log  Institutional documentation

28 Now that we have the data…  Analyze Quantitative (statistical software) Qualitative (systematic review and assessment)  Synthesize information Follow framework of concepts  Write reports  Disseminate
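The analysis step above can be sketched with nothing more than Python's standard library; the survey scores, coded theme labels, and all counts below are hypothetical, invented for illustration:

```python
from collections import Counter
from statistics import mean, stdev

# Hypothetical post-program survey: 1-5 satisfaction scores, plus
# open-ended comments already coded into themes by the evaluators.
scores = [4, 5, 3, 4, 5, 2, 4, 5, 4, 3]
themes = ["mentoring", "logistics", "mentoring", "community impact",
          "mentoring", "logistics", "community impact"]

# Quantitative: simple descriptive statistics.
print(f"n={len(scores)}  mean={mean(scores):.2f}  sd={stdev(scores):.2f}")

# Qualitative: frequency of coded themes, most common first.
for theme, count in Counter(themes).most_common():
    print(f"{theme}: {count}")
```

Even this small a summary covers both branches of the slide: descriptive statistics for the quantitative data and a theme-frequency tally for the systematically coded qualitative data, ready to be written up against the program's framework of concepts.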

29 V. Justify Conclusions  Review findings  What do they mean? How significant are they?  How do the findings compare to the objectives of the program?  What claims or recommendations are indicated?

30 VI. Ensure Use and Share Lessons  Through deliberate planning, preparation, and follow-up  Collaborate with stakeholders for meaningful: Communication of results (process and outcome) Decisions based on results New assessment plans emerging from results Reflection on the assessment process

31 Challenges  Important things difficult to measure  Complexity  Measurement validity  Time  Proof of causation  Need to be sensitive to context  Resources

32 Challenges  What are the challenges you face?

33 Summary: Characteristics of Evaluation  Evolving process  Variety of approaches  More than collecting and analyzing data  Critical design issues  Reconciles competing expectations  Recognizes and engages stakeholders

34 References  Bell R, Furco A, Ammon M, Muller P, Sorgen V. Institutionalizing Service-Learning in Higher Education. Berkeley: University of California.  Centers for Disease Control and Prevention. Practical Evaluation of Public Health Programs.  Kramer M. Make It Last: The Institutionalization of Service-Learning in America. Washington, DC: Corporation for National Service.  Patton M. Utilization-Focused Evaluation. Sage Publications.  W.K. Kellogg Foundation. Evaluation Handbook.