OECD/INFE Tools for evaluating financial education programmes
Adele Atkinson, PhD, Policy Analyst, OECD
With the support of the Russian/World Bank/OECD Trust Fund

Outline
- Terminology
- Motivation for focusing on evaluation
- Overview of the OECD/INFE research and tools developed under the Trust Fund
- Introduction to the two practical guides
- Focus on the High-level Principles

Terminology
Monitoring and evaluating financial education programmes: what do we mean?
Monitoring:
- keeping track of the day-to-day inputs and processes involved in delivering the education
- checking whether targets were met by monitoring inputs and outputs
Evaluation:
- assessing the outcomes and impact for participants
- analysing the cost-effectiveness of the programme

Motivation for focusing on evaluation
Widespread policy interest in the role of financial education requires answers to pressing questions:
- Does financial education work?
- What makes it work?
- How does it help consumers?
- When is it the appropriate method, and when is consumer protection required?

Benefits for programme delivery
Good evaluation allows:
- identification of elements that can be scaled up or replicated
- a method to test different approaches to see which are most cost-efficient
- indications of where fine-tuning could be useful
- the ability to show that objectives are being met, and to reward staff
- the opportunity to share experiences rather than repeating mistakes

The challenges faced
46 authorities from 29 countries responded to an INFE request for information about the extent to which they were evaluating and the challenges they faced; 28 authorities in 23 countries had carried out evaluations.
The most frequently faced challenges:

Evaluation research and tools
- Fact-finding stocktake of programme evaluation amongst INFE members (2009)
- Comparative and analytical framework for evaluation of financial education (2010)
- OECD/INFE High-level Principles on evaluation of financial education programmes (2011)
- OECD/INFE Guides to Evaluation (2011)

OECD/INFE Guide available online
A non-technical, 7-page guide answering questions such as:
- Why evaluate?
- What types of questions will an evaluation answer?
and providing guidance on:
- the principles of a good evaluation
- the key steps
- suggested methods

Detailed guidance also available online
A 16-page, detailed guide in non-technical language, containing:
- information about the theory of change
- detailed guidance on the steps of evaluation
- information about analysis and interpretation of data
- reminders about reporting the results
- an annex with information about additional resources

OECD/INFE High-level Principles
Five key principles:
1. New programmes should be evaluated; try also to evaluate existing programmes
2. Include evaluation in the budget
3. Evaluation should be credible: consider an external evaluator or reviewer
4. Design the evaluation in accordance with the objectives and type of programme
5. Report what worked, and what didn't work
Three steps: planning, implementation, reporting

1. Evaluation: an essential element of FE programmes
Programme planning should include evaluation.
New programmes:
- Develop a strategy for monitoring and evaluation alongside programme design
- Keep in mind the benefit of collecting information before the programme starts
All programmes:
- Encourage dialogue and collaboration with key stakeholders to ensure clarity and consistency of aims and objectives
- Reassure providers that evaluation is not designed to judge them, but to improve efficiency where appropriate and to identify successful programmes that will ensure the best possible outcomes for future participants

2. Budget for evaluation
A good evaluation ensures that resources are being well spent: it is a wise expense!
- Find out how much other evaluations have cost and gather estimates before finalising the budget
- The amount of money available shouldn't determine the design of the evaluation, but it may indicate the need to prioritise certain aspects of it
- Look for ways of reducing costs: for example, sharing questionnaires, drawing on existing data, international methodology and contacts, and piloting programmes before large-scale roll-out

3. External evaluators: adding credibility, skills and experience
External evaluators bring skills and independence.
Other ways of ensuring credibility:
- Use technology: administrative systems and websites can automate data collection, and electronic games can store scores and measure improvement (see the sketch below)
- Ask (well-designed) questions of participants and non-participants, trainers and designers: use surveys, tests, interviews and focus groups
- Corroborate findings where possible: check bank statements, pension fund records and credit counselling services
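To make the technology point concrete, here is a minimal Python sketch of automated score collection: a log that stores each participant's pre- and post-course quiz scores and computes the average improvement. The class, participant IDs and scores are all hypothetical, invented for illustration; they are not part of any OECD/INFE tool.

```python
# Hypothetical sketch: an administrative system that logs each participant's
# quiz score before and after a course and reports the average improvement.
from statistics import mean

class ScoreLog:
    def __init__(self):
        self.pre = {}   # participant id -> score before the course
        self.post = {}  # participant id -> score after the course

    def record(self, participant_id, phase, score):
        """Store a score under phase 'pre' or 'post'."""
        getattr(self, phase)[participant_id] = score

    def mean_improvement(self):
        """Average post-minus-pre change, over participants with both scores."""
        both = self.pre.keys() & self.post.keys()
        return mean(self.post[p] - self.pre[p] for p in both)

log = ScoreLog()
log.record("p001", "pre", 4.0); log.record("p001", "post", 7.0)
log.record("p002", "pre", 5.0); log.record("p002", "post", 6.0)
print(f"Mean improvement: {log.mean_improvement():.1f} points")  # -> 2.0
```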

4. Appropriate evaluation design
- Continuous monitoring: count, measure and quantify (how many participants, hours of contact, leaflets distributed, etc.)
- Measure change according to programme type and objectives: monitor improved awareness, evaluate behaviour change strategies, test knowledge
- Identify ways of attributing change, according to programme design: create a control group through a lottery for places or random marketing of courses
- Undertake comparisons of knowledge, behaviour and attitudes before vs. after (and long after); participants vs. non-participants; targets vs. achievements; budget vs. expenditure; and the opinions of providers vs. users (a sketch of one such comparison follows)
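The comparison logic above can be illustrated with a short sketch. The Python below (all names and scores hypothetical) draws a control group by lottery and computes a simple difference-in-differences: the participants' before/after change net of the control group's change. Difference-in-differences is one standard way of attributing change; the principle itself does not prescribe a specific estimator.

```python
# Illustrative sketch (all figures hypothetical): allocate limited places by
# lottery so the unsuccessful applicants form a natural control group, then
# read the results as a simple difference-in-differences.
import random

def assign_by_lottery(applicants, n_places, seed=1):
    """Randomly allocate places; everyone else becomes the control group."""
    rng = random.Random(seed)
    treated = set(rng.sample(applicants, n_places))
    control = [a for a in applicants if a not in treated]
    return sorted(treated), control

def diff_in_diff(part_pre, part_post, ctrl_pre, ctrl_post):
    """Programme effect net of whatever change both groups experienced."""
    return (part_post - part_pre) - (ctrl_post - ctrl_pre)

applicants = [f"a{i:03d}" for i in range(100)]
treated, control = assign_by_lottery(applicants, n_places=50)

# Hypothetical mean knowledge scores (0-10) before and well after the course:
effect = diff_in_diff(part_pre=5.1, part_post=7.4, ctrl_pre=5.0, ctrl_post=5.6)
print(f"Estimated programme effect: {effect:+.1f} points")  # -> +1.7
```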

5. Reporting
Reporting is critical for the future of FE programmes.
- Avoid over-generalisation: check carefully, and get advice on whether findings may apply more widely
- Also report the method and limitations of the evaluation
- Disseminate the findings widely, using different styles of reporting (newsletter, academic paper, etc.)
- Draw on the report when making future funding decisions and designing future programmes
- Compare your results to those of other programmes

Questions, comments, further information: OECD/INFE Russian Trust Fund