Evaluation: Asking the Right Questions & Using the Answers

Evaluation: Asking the Right Questions & Using the Answers Presented by Annemarie Charlesworth, MA, UCSF National Center of Excellence in Women's Health, November 3, 2006

Part 1 - Evaluation Overview Part 2 - Steps to Program Planning and Evaluation Part 3 - The Logic Model: A Tool for Planning and Evaluation

Part 1 - Evaluation Overview What is Evaluation? Evaluation is the process of collecting information about your program in order to make decisions about it. It complements program management by improving and accounting for program effectiveness.

How is Evaluation Helpful? Gain insight. Change practice. Assess effects. Affect participants. (Selected uses for evaluation in public health practice, by category of purpose, are detailed on the following slides; source: http://www.cdc.gov/mmwr/preview/mmwrhtml/rr4811a1.htm)

Gain Insight Assess needs, desires, and assets of community members. Identify barriers and facilitators to service use. Learn how to describe and measure program activities and effects.

Change Practice Refine plans for introducing a new service. Characterize the extent to which plans were implemented. Improve the content of educational materials. Enhance the program's cultural competence.

Change Practice (cont.) Verify that participants' rights are protected. Set priorities for staff training. Make midcourse adjustments for improvement. Improve the clarity of health communication messages. Mobilize community support for the program.

Assess Effects Assess skills development by program participants. Compare changes in provider behavior over time. Compare costs with benefits. Find out which participants do well in the program. Decide where to allocate new resources.

Assess Effects (cont.) Document the level of success in accomplishing objectives. Demonstrate that accountability requirements are fulfilled. Aggregate information from several evaluations to estimate outcome effects for similar kinds of programs. Gather success stories.

Affect Participants Reinforce program/intervention messages. Stimulate dialogue/raise awareness regarding health issues. Broaden consensus among coalition members regarding program goals. Teach evaluation skills to staff and other stakeholders. Support organizational change and development.

Types of Program Evaluation Goals-based evaluation (identifying whether you're meeting your overall objectives). Process-based evaluation (identifying your program's strengths and weaknesses). Outcomes-based evaluation (identifying benefits to participants/clients).

Type of evaluation depends on what you want to learn… Start with: 1) what you need to decide (why are you doing this evaluation?); 2) what you need to know to make that decision; and 3) how best to gather and understand that information.

Key questions to consider when designing a program evaluation: 1. For what purposes is the evaluation being done, i.e., what do you want to be able to decide as a result of the evaluation? 2. Who are the audiences for the information from the evaluation (e.g., funders, board, management, staff, clients)? 3. What kinds of information are needed to make those decisions and/or enlighten your intended audiences (e.g., the program's inputs, activities, and outputs; the clients who experience the program; its strengths and weaknesses; benefits to clients; how and why the program failed)? Don't worry about what type of evaluation you need or are doing; worry about what you need to know to make your program decisions, and about how you can accurately collect and understand that information.

Key questions (cont.) 4. From what sources should the information be collected (e.g., employees, customers, clients, program documentation)? 5. How can that information be collected in a reasonable fashion (e.g., questionnaires, interviews, examining documentation, observation, focus groups)? 6. When is the information needed (so, by when must it be collected)? 7. What resources are available to collect the information? (The sketch after this slide turns the checklist into a reusable template.)
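
The seven questions above amount to a reusable checklist. As a purely illustrative sketch in Python (the class and field names are ours, not from the presentation), the checklist can be captured as a small record so that every new evaluation plan answers the same questions:

    from dataclasses import dataclass, field

    @dataclass
    class EvaluationPlan:
        # One record per evaluation, mirroring the seven key questions.
        purpose: str              # 1. What do you want to be able to decide?
        audiences: list           # 2. Funders, board, management, staff, clients...
        information_needed: list  # 3. What information supports those decisions?
        sources: list             # 4. Employees, clients, program documentation...
        methods: list             # 5. Questionnaires, interviews, document review...
        deadline: str             # 6. When is the information needed?
        resources: list = field(default_factory=list)  # 7. Staff time, budget, tools

    plan = EvaluationPlan(
        purpose="Decide whether to expand the program to a second site",
        audiences=["funders", "board", "program staff"],
        information_needed=["participation rates", "participant outcomes"],
        sources=["clients", "program documentation"],
        methods=["questionnaires", "interviews"],
        deadline="before the spring board meeting",
    )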

Evaluation should be considered during program planning and implementation… Not just at the end!

It is not enough to have a goal… Goals exist because some action is needed, and you can't argue for an action without a deep understanding of the problem and the need it creates: Problem → Need → Action → Goal.

Part 2 - Steps to Program Planning and Evaluation

10 Steps to Planning a Program (and its evaluation!) 1. Assess needs and assets: extent, magnitude, and scope of the problem; summary of what's already being done; gaps between needs and existing services; community support. 2. Set goals and objectives: long-term goals specific to the target population; link short-term objectives to goals. 3. Define the intervention/treatment: program components to accomplish the objectives and goals; one or two activities should support each objective.

10 Steps to Planning a Program (and its evaluation!) 4. Develop the program/logic model. 5. Choose the type(s) of data collection (e.g., surveys, interviews). 6. Select your evaluation design (e.g., one-group pre/post-test vs. comparison-group pre/post-test; see the sketch below).
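
A minimal worked sketch of step 6, using made-up scores (none of these numbers come from the presentation): a one-group design measures change as the post-test mean minus the pre-test mean, while a comparison-group design also subtracts the comparison group's change to estimate what would have happened without the program.

    def mean(values):
        return sum(values) / len(values)

    # Hypothetical test scores for program participants and a comparison group.
    program_pre  = [52, 61, 58, 64]
    program_post = [70, 75, 69, 78]
    comparison_pre  = [55, 60, 57, 62]
    comparison_post = [58, 63, 59, 66]

    one_group_change  = mean(program_post) - mean(program_pre)        # +14.25
    comparison_change = mean(comparison_post) - mean(comparison_pre)  # +3.00
    adjusted_effect   = one_group_change - comparison_change          # +11.25

    print(f"One-group pre/post change:  {one_group_change:+.2f}")
    print(f"Comparison-adjusted effect: {adjusted_effect:+.2f}")

The comparison group makes the estimate more credible: in this toy example, roughly 3 points of the 14-point gain would likely have occurred without the program.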

10 Steps to Planning a Program (and its evaluation!) 7. Pilot test tools 8. Collect data 9. Analyze data 10. Report, share, and act on the findings

Part 3 - The Logic Model: A Tool for Planning and Evaluation Picture of how your organization does its work Communicates its “rationale” Explains hypotheses and assumptions about why the program will work Links outcomes with activities

Logic models help you chart the course ahead… They allow you to better understand the challenges, the resources available, the timetable, and the big picture as well as the smaller parts.

Basic Logic Model 1. Resources/Inputs → 2. Activities → 3. Outputs → 4. Outcomes → 5. Impact. The model connects Planned Work with Intended Results. Planned Work: 1. Resources/Inputs (human, financial, organizational, community, etc.); 2. Program Activities (processes, tools, events, technology, actions, etc.). Intended Results: 3. Outputs (direct products of activities); 4. Outcomes (changes in behavior, knowledge, skills, etc., short- and long-term); 5. Impact (ultimate system-wide change, intended and unintended). *From the W.K. Kellogg Foundation Logic Model Development Guide

Basic Logic Model template: Resources → Activities → Outputs → Short- and Long-term Outcomes → Impact. Resources: in order to accomplish our set of activities, we will need the following. Activities: in order to address our problem or asset, we will conduct the following activities. Outputs: we expect that, once completed or under way, these activities will produce the following evidence. Short- and long-term outcomes: we expect that, if completed or ongoing, these activities will lead to the following changes in 1-3, then 4-6 years. Impact: we expect that, if completed, these activities will lead to the following changes in 7-10 years.

Example Logic Model for a free clinic to meet the needs of the growing numbers of uninsured residents (Mytown, USA):
Resources: IRS 501(c)(3) status • diverse, dedicated board of directors representing potential partners • endorsements from Memorial Hospital, Mytown Medical Society, and United Way • donated clinic facility • job descriptions for board and staff • first year's funding ($150,000) • clinic equipment • board & staff orientation process • clinic budget.
Activities: launch/complete search for executive director • board & staff conduct Anywhere Free Clinic site visit • planning retreat • design and implement funding strategy • volunteer recruitment and training • secure facility for clinic • create an evaluation plan • PR campaign.
Outputs: # of patients referred from the ER to the clinic/year • # of qualified patients enrolled in the clinic/year • # of patient visits/year • # of medical volunteers serving/year • # of patient fliers distributed • # of calls/month seeking info about the clinic • Memorandum of Agreement for free clinic space.
Short- and long-term outcomes: change in patient attitudes about the need for a medical home • change in # of scheduled annual physicals/follow-ups • increased # of ER/physician referrals • decreased volume of unreimbursed emergencies treated in the Memorial ER • patient co-payments supply 20% of clinic operating costs.
Impact: 25% reduction in # of uninsured ER visits/year • 300 medical volunteers serving regularly each year • clinic is a United Way agency • clinic endowment established • 90% patient satisfaction for 5 years • 900 patients served/year.
In this example, the people of Mytown, USA are striving to meet the needs of growing numbers of uninsured residents who are turning to Memorial Hospital's Emergency Room for care. Because that care is expensive and not the best way to offer it, the community is working to create a free clinic. Throughout the chapters, Mytown's program information is dropped into logic model templates for program planning, implementation, and evaluation. Produced by the W.K. Kellogg Foundation.
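
One hypothetical way to make the five columns concrete (the class, field, and method names are ours, for illustration only) is to represent a logic model as a record whose fields match the template, populated here with a few items from the Mytown example:

    from dataclasses import dataclass

    @dataclass
    class LogicModel:
        # The five columns of the basic logic model.
        resources: list   # what we need in order to run the activities
        activities: list  # what we will do to address the problem or asset
        outputs: list     # direct, countable evidence that the work happened
        outcomes: list    # changes expected in 1-3, then 4-6 years
        impact: list      # system-wide change expected in 7-10 years

        def planned_work(self):
            return self.resources + self.activities

        def intended_results(self):
            return self.outputs + self.outcomes + self.impact

    clinic = LogicModel(
        resources=["donated clinic facility", "first year's funding ($150,000)"],
        activities=["volunteer recruitment and training", "PR campaign"],
        outputs=["# of patient visits/year"],
        outcomes=["increased # of ER/physician referrals"],
        impact=["25% reduction in # of uninsured ER visits/year"],
    )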

Outcomes and impacts should be S.M.A.R.T.: Specific, Measurable, Action-oriented, Realistic, and Timed.
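
As a rough, purely hypothetical sketch (the function and its heuristics are ours), a computer can plausibly flag only two of the five criteria, a numeric target (Measurable) and a time frame (Timed); Specific, Action-oriented, and Realistic still require human judgment:

    import re

    def smart_flags(objective):
        # Crude heuristics: a digit suggests a measurable target; a time
        # word or a "/year"-style rate suggests a time frame.
        return {
            "measurable": bool(re.search(r"\d", objective)),
            "timed": bool(re.search(r"\b(by|within|annually|years?|months?)\b|/year", objective)),
        }

    print(smart_flags("25% reduction in # of uninsured ER visits/year"))
    # -> {'measurable': True, 'timed': True}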

One size does not fit all! Many different types of logic models Experiment with models that suit your program and help you think through your objectives

Useful for all parties involved (funder, board, administration, staff, participating organizations, evaluators, etc.): conveys the purpose of the program; shows why it's important; shows what will result; illustrates the actions that will lead to the desired results; provides a basis for determining whether actions will lead to results; serves as a common language; enhances the case for investment in your program.

Strengthen community involvement: created in partnership, logic models give all parties a clear roadmap; help build community capacity and strengthen community voice; help all parties stay on course or intentionally decide to go off course; their visual nature communicates well with diverse audiences.

Logic models are used throughout the life of your program: planning, program implementation, and program evaluation. They may change throughout the life of the program! Fluid; a "working draft." Responsive to lessons learned along the way. Reflect ongoing evaluation of the program.

The Role of the Logic Model in Program Design/Planning Helps develop strategy and create structure/organization Helps explain and illustrate concepts for key stakeholders Facilitates self-evaluation based on shared understanding Requires examination of best-practices research

The Role of the Logic Model in Program Implementation Backbone of the management plan. Helps identify and monitor necessary data. Helps improve the program. Forces you to achieve and document results. Helps prioritize the critical aspects of the program for tracking.

The Role of the Logic Model in Program Evaluation Provides information about progress toward goals Teaches about the program Facilitates advocacy for program approach Helps with strategic marketing efforts

References
W.K. Kellogg Foundation. Logic Model Development Guide. http://www.wkkf.org/pubs/tools/evaluation/pub3669.pdf
Schmitz, C. & Parsons, B.A. (1999). "Everything You Wanted to Know About Logic Models but Were Afraid to Ask." http://www.insites.org/documents/logmod.pdf
University of Wisconsin Cooperative Extension. http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
CDC Evaluation Working Group. Logic Model Bibliography. http://www.cdc.gov/eval/logic%20model%20bibliography.PDF
CDC/MMWR. Framework for Program Evaluation in Public Health. http://www.cdc.gov/mmwr/preview/mmwrhtml/rr4811a1.htm
McNamara, C. (last revision Feb 16, 1998). "Basic Guide to Program Evaluation." http://www.managementhelp.org/evaluatn/fnl_eval.htm