
Insights from an evaluator and professor: HOW TO MEASURE IMPACT. Paul Penley, PhD, Director of Research in theological education, Excellence in Giving

External Communication WHY MEASURE OUTCOMES

“I always wanted to know the measurable difference made from our significant investment of time and money in the Global Leadership Summit. Like many ministries, we had stories of impact but no concrete figures. Excellence in Giving's research changed all that. The data has now sharpened our strategies and given us confidence to approach donors we didn’t have before. I’ve been sharing the results with every financial and ministry partner we have around the world since we received the first Impact Report!” - Gary Schwammlein, WCA President

Internal Improvement WHY MEASURE OUTCOMES

DOES THE ALLIANCE ACCELERATE IMPACT?

#1 Internal Improvement #2 External Communication WHY MEASURE OUTCOMES

How to Measure Impact

How do you know when your mission is accomplished? WHAT TO MEASURE

DEFINITIONS: DEFINE THE MISSION. DEFINE GOALS: Specific, Attainable, Measurable.

DEFINITIONS: DEFINE KEY TERMS → “Leaders trained” HOW? 1. Clarify the problem 2. Define success

WHAT TO MEASURE: INPUTS → ACTIVITIES → OUTPUTS → OUTCOMES

We had 220 Graduates VS. 68% of Graduates started a church, school, or ministry within 5 years WHAT TO MEASURE
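
To make the contrast concrete in data terms, here is a minimal Python sketch that computes an output count versus an outcome rate from hypothetical graduate follow-up records; the field name started_ministry_within_5yrs and the records themselves are invented for illustration, not the presenter's data.

    # Minimal sketch: output (a count) vs. outcome (a rate) from hypothetical records.
    graduates = [
        {"name": "A", "started_ministry_within_5yrs": True},
        {"name": "B", "started_ministry_within_5yrs": False},
        {"name": "C", "started_ministry_within_5yrs": True},
    ]

    output = len(graduates)  # "We had 220 graduates" is just this kind of count
    outcome_rate = sum(g["started_ministry_within_5yrs"] for g in graduates) / output

    print(f"Output: {output} graduates")
    print(f"Outcome: {outcome_rate:.0%} started a church, school, or ministry within 5 years")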

How to Measure Impact

Practical EXERCISE: 1) Write down your organization’s mission statement and 1 goal. 2) Find a word (usually a verb) or phrase that needs to be defined more clearly. 3) Identify 1 long-term outcome you could measure and which activities and outputs are related to it. WHAT TO MEASURE

“Measuring outcomes is important, but our ministry can’t do it.” - every ministry leader HOW TO MEASURE “Our mission is so important we can't afford to lead blindly.” - humble ministry leader

Where do we get the data? 1. ACTIVITIES & OUTPUTS → Organizational Data 2. OUTCOMES → Beneficiary Surveys HOW TO MEASURE

Where do we start? Too many outcomes to measure. How do you decide which ones to include in the survey? HOW TO MEASURE

Gather & Categorize: #1 Intended OUTCOMES #2 Observed OUTCOMES HOW TO MEASURE

INTENDED OUTCOMES

Where to find Observed OUTCOMES to gather and categorize: • Impact stories • Structured interviews • Focus groups HOW TO MEASURE

FIND Outcome THEMES (Qualitative Research) → CREATE Closed-Ended QUESTIONS (Quantitative Research) HOW TO MEASURE
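
As one way to carry out this step, the sketch below tallies hypothetical theme codes assigned during qualitative coding and turns the most common ones into answer options for a closed-ended question. The theme codes are invented for illustration, not the presenter's actual coding scheme.

    from collections import Counter

    # Hypothetical theme codes assigned to interview excerpts during qualitative coding.
    coded_excerpts = [
        "started_new_ministry", "mentors_others", "started_new_ministry",
        "teaches_weekly", "mentors_others", "started_new_ministry",
    ]

    theme_counts = Counter(coded_excerpts)
    top_themes = [theme for theme, _ in theme_counts.most_common(3)]

    # Each frequent theme becomes an answer option in a closed-ended survey question.
    question = {
        "text": "Which of the following have you done since completing the program?",
        "options": top_themes + ["none_of_the_above"],
    }
    print(question)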

Quantitative DATA answers a key question about stories of impact: Is this story the exception or the rule for program impact? HOW TO MEASURE

After being rescued from a brothel in Thailand, Nani is the first girl in her family to graduate from high school. 89% of girls in aftercare graduate high school without returning to the sex trade.
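
To answer the exception-or-rule question, a proportion like the 89% figure can be reported with a margin of error. The sketch below computes a standard Wilson 95% confidence interval for a survey proportion; the counts (89 of 100 respondents) are illustrative, not the program's actual sample size.

    import math

    def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
        """95% Wilson score interval for a proportion (z = 1.96)."""
        p = successes / n
        denom = 1 + z**2 / n
        center = (p + z**2 / (2 * n)) / denom
        margin = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
        return center - margin, center + margin

    # Illustrative numbers: 89 of 100 surveyed graduates report the outcome.
    low, high = wilson_interval(89, 100)
    print(f"Outcome rate: 89% (95% CI {low:.0%} to {high:.0%})")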

Track Impact VARIABLES: • Location • Experience • Age • Depth of Participation HOW TO MEASURE
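
Once those variables are tracked, outcome rates can be compared across segments. A minimal sketch, assuming hypothetical survey records with an invented depth-of-participation field:

    from collections import defaultdict

    # Hypothetical survey records; field names and values are illustrative.
    responses = [
        {"depth": "full_program", "achieved_outcome": True},
        {"depth": "full_program", "achieved_outcome": True},
        {"depth": "partial",      "achieved_outcome": False},
        {"depth": "partial",      "achieved_outcome": True},
    ]

    by_segment = defaultdict(list)
    for r in responses:
        by_segment[r["depth"]].append(r["achieved_outcome"])

    for segment, outcomes in by_segment.items():
        rate = sum(outcomes) / len(outcomes)
        print(f"{segment}: {rate:.0%} achieved the outcome (n={len(outcomes)})")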

5 Steps to Survey RELIABILITY: 1. Ask it 3 different ways 2. Require specific details 3. Avoid leading questions 4. Use validated instruments 5. Get 360-degree feedback HOW TO CREATE SURVEYS
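
For step 1 (asking the same thing three different ways), one standard check that the items hang together is Cronbach's alpha. The slides do not name a specific statistic, so this is an illustrative sketch with made-up 1-5 ratings:

    def cronbach_alpha(items: list[list[float]]) -> float:
        """Cronbach's alpha; items holds one list of respondent scores per question."""
        k = len(items)        # number of questions
        n = len(items[0])     # number of respondents
        def var(xs):
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
        item_vars = sum(var(item) for item in items)
        totals = [sum(item[i] for item in items) for i in range(n)]
        return (k / (k - 1)) * (1 - item_vars / var(totals))

    # Hypothetical ratings: the same construct asked three different ways.
    q1 = [4, 5, 3, 4, 5]
    q2 = [4, 4, 3, 5, 5]
    q3 = [5, 5, 2, 4, 4]
    print(f"Cronbach's alpha: {cronbach_alpha([q1, q2, q3]):.2f}")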

Survey USEFULNESS: • Test it with a small group • Create templates for how you will report the data HOW TO CREATE SURVEYS

1) IMPACT REPORT: • Acceleration • Lasting Outcomes 2) IMPROVEMENT REPORT: • Negative trends • Success Factors HOW TO CREATE SURVEYS
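
For the improvement report, spotting negative trends means comparing outcome rates across survey waves. A minimal sketch with invented outcome names and rates from two hypothetical waves:

    # Hypothetical outcome rates from two survey waves (names and numbers invented).
    prior_wave  = {"graduates_leading_a_ministry": 0.68, "graduates_mentoring_others": 0.54}
    latest_wave = {"graduates_leading_a_ministry": 0.61, "graduates_mentoring_others": 0.57}

    for outcome, prior in prior_wave.items():
        latest = latest_wave[outcome]
        label = "negative trend" if latest < prior else "holding or improving"
        print(f"{outcome}: {prior:.0%} -> {latest:.0%} ({label})")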

How to Measure Impact

Practical EXERCISE: 1) OUTCOME: Write down 3 common actions your graduates take because of their training. 2) OUTCOME INDICATORS: For 1 action, write down 3-5 ways graduates carry out that action differently in their context. 3) SURVEY QUESTION: Use those different ways to create 1 multiple-choice survey question. HOW TO MEASURE

1. Which is better: 1-time longitudinal studies OR ongoing feedback loops? 2. When does transformation happen? WHEN TO MEASURE

TRANSFORMATION BEFORE AFTER

3 typical times to MEASURE: 1) BEFORE program participation 2) END of program participation 3) 1-5 years AFTER program completion WHEN TO MEASURE
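
When the same question is asked at those points, change can be summarized per respondent rather than by comparing group averages. A minimal sketch, assuming hypothetical matched before/after scores on a 1-5 scale:

    # Hypothetical matched scores for the same respondents before and after the program.
    before = [2, 3, 2, 4, 3]
    after  = [4, 4, 3, 5, 4]

    changes = [a - b for b, a in zip(before, after)]
    avg_change = sum(changes) / len(changes)
    improved = sum(c > 0 for c in changes) / len(changes)

    print(f"Average change: {avg_change:+.1f} points")
    print(f"Respondents who improved: {improved:.0%}")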

How to Measure Impact

Practical EXERCISE: 1) MEASUREMENT INTERVALS: Pick up to 3 times when you could survey your beneficiaries. 2) SURVEY INTEGRATION: Identify current paperwork into which you could integrate surveys. WHEN TO MEASURE

Insights from a professor and evaluator: HOW TO MEASURE IMPACT. Paul Penley, PhD, Director of Research in theological education, Excellence in Giving