The Evaluation Plan

Presentation transcript:

The Evaluation Plan

Session Purpose
To understand how evaluation can be useful
To understand how your logic model helps to focus an evaluation
To understand both implementation and outcome evaluation

What is Evaluation?
The systematic collection of information about a program in order to enable stakeholders to better understand the program, to improve program effectiveness, and/or to make decisions about future programming.

What's in it for you?
Understand and improve your program
Test the theory underlying your program
Tell your program's story
Be accountable
Inform the field
Support fundraising efforts

Evaluation Principles
Evaluation is most effective when it:
Is connected to program planning and delivery
Involves the participation of stakeholders
Supports an organization's capacity to learn and reflect
Respects the community served by the program
Enables the collection of the most information with the least effort

Logic Model
Program Goals: the overall aims or intended impacts of the program
Resources: the resources dedicated to or consumed by the program
Activities: the actions that the program takes to achieve desired outcomes
Outputs: the tangible, direct results of a program's activities
Outcomes: the benefits to clients, communities, systems, or organizations
External Factors: what else affects the program
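If you maintain program plans electronically, these components map naturally onto a simple record structure. Below is a minimal sketch in Python (hypothetical; the class and field names are illustrative, not part of the original training materials) showing how the pieces fit together:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """One program's logic model. Class and field names are illustrative."""
    goals: list[str]              # overall aims or intended impacts
    resources: list[str]          # dedicated to or consumed by the program
    activities: list[str]         # actions taken to achieve desired outcomes
    outputs: list[str]            # tangible, direct results of activities
    outcomes: list[str]           # benefits to clients, communities, systems
    external_factors: list[str] = field(default_factory=list)

# Illustrative entries, loosely based on the job-training example used later.
job_training = LogicModel(
    goals=["Participants gain stable employment"],
    resources=["Job counselors", "Training curriculum"],
    activities=["Provide training session series to two groups of clients"],
    outputs=["2 training session series held"],
    outcomes=["Participants learn job-seeking skills"],
    external_factors=["Local labor market conditions"],
)
```

Keeping the components in one structure makes it easier to build the evaluation plan templates shown later from the same source.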

Putting Your Plans Together
[Diagram: the logic model (Resources → Activities → Outputs → Outcomes) feeds two evaluation plan tables: an implementation plan listing Activities and Outputs with a Data Collection Method and Effort for each, and an outcomes plan listing Outcomes and Indicators with a Data Collection Method and Effort for each.]

Implementation and Outcomes
Evaluating Outcomes: What changes occurred as a result of your work?
Evaluating Implementation: What did you do? How well did you do it?

Evaluating Outcomes
Outcomes: the changes you expect to see as a result of your work
Indicators: the specific, measurable characteristics or changes that represent achievement of an outcome. They answer the question: How will I know it?

Evaluating Outcomes: Indicators = What to Measure
Meaningful
Direct
Useful
Practical

Evaluating Outcomes: Direct vs. Indirect Indicators
Outcome: Participating new mothers have their children immunized.
  Indirect: #/% of new mothers who are aware of the importance of immunization.
  Direct: #/% of children of participating mothers who are up-to-date on immunizations.
Outcome: Referrals to your services from targeted doctors increase.
  Indirect: #/% increase in your client base.
  Direct: #/% increase in your client base from targeted doctors.
Outcome: Targeted teens learn about certain environmental health hazards.
  Indirect: # of students who receive a brochure on the topic.
  Direct: #/% of students who can identify 3 health hazards.
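Direct indicators like these are usually reported as both a count and a percentage of the group being measured. A minimal sketch of that arithmetic in Python (the records are invented for illustration):

```python
# Hypothetical tracking records for the immunization example above.
children = [
    {"child_id": 1, "immunizations_up_to_date": True},
    {"child_id": 2, "immunizations_up_to_date": False},
    {"child_id": 3, "immunizations_up_to_date": True},
]

up_to_date = sum(1 for c in children if c["immunizations_up_to_date"])
pct = 100 * up_to_date / len(children)
# Report the direct indicator as #/%, as in the examples above.
print(f"{up_to_date}/{len(children)} children ({pct:.0f}%) up-to-date on immunizations")
```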

Evaluating Outcomes: Template
Columns: Indicators; Data Collection Method; Level of Effort (have, low, med, high)

Evaluating Outcomes: Example
Outcome: Participants learn job-seeking skills
  Indicator: #/% of participants who can meet criteria in a mock interview
  Method: Observation of a mock interview conducted at the end of the training session, using an observation checklist
  Effort: Low
  Indicator: #/% of participants who develop a quality resume
  Method: Job counselors review resumes based on a quality checklist
  Effort: Have
Outcome: Participants obtain and carry out job interviews
  Indicator: #/% of participants who go on at least 2 job interviews
  Method: Tracking by job counselors

Evaluating Implementation
Activities and Outputs: the "what" (the work you did, and the tangible results of that work)
Additional Questions: the "why" (understanding how well you did, and why)

Evaluating Implementation: Understanding how well you did
What information will help you understand your program implementation? Think about:
Participation
Quality
Satisfaction
Context

Evaluating Implementation: Template
Columns: Activities; Outputs & Implementation Questions; Data Collection Method; Data Collection Effort (have, low, med, high)
Rows are grouped by program component, with separate rows for Outputs and for Questions.

Evaluating Implementation: Example
Activities:
  Develop/revise curriculum for training series
  Meet with potential program clients
  Provide training session series to two groups of clients
Outputs (Effort: Have):
  Updated curriculum (Method: program records)
  2 training session series held (Method: records)
  #/rate of participation by group (Method: records, logs)
Questions:
  Are we getting the clients we expected to get? (Participation) (Method: review of participant intake data; Effort: Low)
  Are they satisfied with the training? What did they like most, least? (Satisfaction) (Method: participant survey; Effort: Med)

Data Collection
Determine what methods you will use to collect the information you need:
Choose the method
Decide which people or records will be the source of the information
Determine the level of effort involved in using that method with that population

Data Collection Methods
Review documents
Observe
Talk to people
Collect written information
Pictorial/multimedia

Issues to Consider
Resist pressure to "prove"
Start with what you already collect
Consider the level of effort it will take to gather the data
Prioritize: What do you need to collect now, and what can wait until later?

Data Collection: Level of Effort
Instrument development
Cost/practicality of actually collecting the data
Cost of analyzing and presenting the data

Qualitative Data
Usually in narrative form, not numbers
Collected through focus groups, interviews, and open-ended questionnaire items, but also poetry, stories, diaries, and notes from observations

Quantitative Data
Pieces of information that can be expressed in numerical terms, counted, or compared on a scale
Collected in surveys, attendance logs, etc.

Both Types of Data are Valuable
Qualitative information can provide depth and insight about quantitative data
Some information can only be collected and communicated qualitatively
Both methods require a systematic approach

What do your data tell you?
Are there patterns that emerge? Patterns for sub-groups of your population? Patterns for different components of your program?
What questions do the data raise? What is surprising? What stands out?
What are other ways the data should be analyzed?
What additional information do you need to collect?
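If your quantitative data live in a spreadsheet or database export, sub-group patterns like these can be surfaced with a few lines of analysis. A minimal sketch using pandas (the file and column names are invented for illustration):

```python
import pandas as pd

# Hypothetical export of participant records.
df = pd.read_csv("participants.csv")  # columns: site, age_group, completed_training

# Look for patterns by sub-group: completion rate per site and age group.
completion = df.groupby(["site", "age_group"])["completed_training"].mean()
print(completion.sort_values())  # low-completion sub-groups surface first
```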

Communicating Findings
Who is the information for? How will you tell them what you know?

Communicating Findings
"Information that is not effectively shared with others will not be effectively used."
Source: Building a Successful Evaluation, Center for Substance Abuse Prevention

Audience: Who needs the findings, and what do they need?
Who are the audiences for your results? Which results?
Staff
Board
Funders
Partners
Other agencies
Public

Different ways to communicate
Decide what format is appropriate for different audiences:
Written report
Short summaries
Film or videotape
Pictures, displays
PowerPoint presentations
Graphs and other visuals

Whatever communication strategy you choose:
Link the findings to the program's desired outcomes
Include the "good, the bad, and the ugly"
Be sure you can support your claims
Acknowledge knowledge gaps

Continuous Learning Cycle
[Cycle diagram: Logic Model → Evaluation Planning → Data Collection → Reflection/Improvement, feeding back into the Logic Model]

Thanks for Your Participation!
Measure results. Make informed decisions. Create lasting change.
Innovation Network, Inc.
1625 K Street, NW, 11th Floor
Washington, DC 20006
(202) 728-0727
Website: www.innonet.org
Robin: Extension 104; rkane@innonet.org
Veena: Extension 107; vpankaj@innonet.org