
It’s the learning, not the result which counts most in evaluation
Randall Pearce, THINK: Insight & Advice
The 7th Australasian Better Boards Conference, 5 July 2013

Evaluation – What is it?
‘A systematic way of answering questions about projects, policies and programs’
1. Is it needed or worthwhile?
2. Is it having an effect?
3. At what cost?
4. How could it be improved?
5. Are there better alternatives?
6. Are there any unintended consequences?

NFP Evaluation – What it isn’t

Who evaluates?

Why do (or don’t) they evaluate? Source: New Philanthropy Capital

What do they gain? Source: New Philanthropy Capital

Who should evaluate?

Internal evaluators
Advantages: better overall and informal knowledge; less threatening/known to staff; less costly
Disadvantages: may be less objective; evaluation could be a part-time responsibility; may not be trained in evaluation

External evaluators
Advantages: more objective; able to dedicate time and attention; greater evaluation expertise
Disadvantages: need to learn about the organisation and its culture; unfamiliar to staff; more costly

When to conduct evaluation?

Proactive Evaluation (before a program starts): to synthesise information to inform program design
Clarificative Evaluation (during program development): to clarify the program design and how it operates
Interactive Evaluation (during program delivery): to improve program delivery; involves stakeholders in the evaluation
Monitoring Evaluation (once the program is in a settled stage): to monitor program progress for accountability and improvement
Impact Evaluation (during or after program implementation): to assess what has been achieved, to learn and be accountable

Source: K Roberts (adapted from Owen and Rogers, 2006)

Dispelling myths: Theory of change?
Not needed, because the evaluator will reconstruct the logic of the actual program, not the theoretical model:
Foundational activities → Activities → Outputs → Immediate outcomes → Intermediate outcomes → Long-term outcomes → Organisational goals

Dispelling myths: Mountain of data?
Most data is just information… we are looking for insight into what it means
Historical data is more valuable than a mountain of current data
Your evaluator should identify the few ‘dashboard’ measures that you will need to evaluate
Once an evaluation has been conducted, you can use the dashboard forever

Dispelling myths: A wad of cash?
Think of what is at stake versus the internal budget allocation – any activity with a value in excess of $200K should be evaluated
Governments and foundations often allow for 10% to be spent on evaluation
There are many ways that NFPs can reduce the cost of evaluations
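The two figures on this slide amount to a simple rule of thumb, sketched below. The function name and the example program value are illustrative assumptions, not part of the presentation:

```python
# Rule-of-thumb sketch of the figures on this slide: evaluate any activity
# valued above $200K, and budget roughly 10% of its value for evaluation
# (the share governments and foundations often allow).
EVALUATION_THRESHOLD = 200_000  # $ value above which evaluation is warranted
EVALUATION_SHARE = 0.10         # typical allowance for evaluation spend

def evaluation_budget(program_value: float) -> float:
    """Suggested evaluation spend, or 0.0 if the program is below the threshold."""
    if program_value <= EVALUATION_THRESHOLD:
        return 0.0
    return program_value * EVALUATION_SHARE

print(evaluation_budget(500_000))  # 50000.0
```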

Using the results of evaluation
Share them… as widely as you can
Some evaluators will agree to write a summary which protects the egos of those involved
Action learning/research is a participative approach based on a four-part cycle: taking action, measuring the results of the action, interpreting the results, and planning what change needs to take place for the next cycle of action
The best projects conclude with a Summit workshop

Beyond program impact evaluation

Learning along the way: Documentation
Documents successes and failures
Summarises key documents in one place
Records a timeline/sequence of events
Isolates key measures for the future
Supports performance appraisal for staff and board
Helps orient staff, volunteers and contractors

Learning along the way: Full cost accounting
Full costs and expenses need to be calculated to arrive at the true financial picture
Need to include:
Budget allocation
Cash donations
In-kind services
Pro-bono services
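As a minimal sketch, full cost accounting is just the sum of the components listed above; all dollar figures below are hypothetical:

```python
# Full cost accounting sketch: the true financial picture is the sum of every
# resource consumed, not just the budget allocation. All figures are hypothetical.
cost_components = {
    "budget_allocation": 120_000,
    "cash_donations": 30_000,
    "in_kind_services": 15_000,
    "pro_bono_services": 10_000,
}

full_cost = sum(cost_components.values())
print(f"Full (true) cost: ${full_cost:,}")  # Full (true) cost: $175,000
```

Counting only the budget allocation here would understate the true cost by almost a third, which is the point of the slide.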

Learning along the way: Full value assessment
Captures all non-financial outputs in addition to financial information
For example, while social media produces a host of measures, there are no financial equivalents as there are in traditional media (i.e. TARPs)
Need to identify data sources for year-on-year comparison in future

Learning along the way: Organisational behaviour and governance
Qualitative research reveals issues around organisational behaviour and governance which can affect outcomes
Project governance can be examined independently of personalities to pinpoint areas for change/improvement

Learning along the way: Relationship building
The evaluation process has been described as ‘cathartic’ by key players
Helps defuse tensions that build up during a campaign
Gives stakeholders a voice/builds goodwill for the future
Aids communication ‘across the political/media divide’

Over to you… Questions

For more information, contact: Randall Pearce NOTE: For a copy of this presentation, please provide your business card at the end of the session or