
Tactics for improving evidence base

TACTICS OF ALL TACTICS FOR IMPROVING EVIDENCE BASE: YOU, ME, OUR COLLEAGUES
If we understand the need for evidence
If we have the will to do it
If we have the time
If we have the resources

TACTICS OF ALL TACTICS: YOU, ME, OUR COLLEAGUES
If we understand the need for evidence
If we have the will to do it
If we have the time
If we have the resources <- INCREDIBLY, over the next two years, resources are not necessarily the limiting factor

Why do we need evidence?
Track progress
Foster adaptive management
Ensure accountability
Catalyse learning
If we are serious about results, evidence is key.

What evidence? Monitoring and evaluation rely on different methodological approaches to generate specific types of evidence:

Dimension          | Monitoring                                                                          | Evaluation
Frequency          | Periodic, occurs regularly                                                          | Episodic
Function           | Tracking / oversight                                                                | Assessment
Purpose            | Improve efficiency; provide information for reprogramming to improve outcomes       | Improve effectiveness, impact, value for money, future programming, strategy and policymaking
Focus              | Inputs, outputs, processes, workplans (operational implementation)                  | Effectiveness, relevance, impact, cost-effectiveness
Methods            | Routine review of reports, registers, administrative databases, field observations  | More rigorous research design (e.g. scientific), complex and intensive
Information source | Routine or surveillance system, field observation, reports, progress reports, rapid assessment, program review meetings | Same sources used for monitoring, plus additional surveys and/or analysis, special studies
Cost               | Consistent, recurrent costs spread across the implementation period                 | Episodic, but needs to be budgeted for
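The "tracking / oversight" function of monitoring described above can be illustrated with a minimal sketch: comparing reported actuals against workplan targets and flagging deviations for follow-up. The indicator names, figures and tolerance threshold are hypothetical, chosen only to show the idea.

```python
def monitoring_check(targets, actuals, tolerance=0.10):
    """Flag indicators whose reported actuals fall more than
    `tolerance` (as a fraction) below their workplan targets.
    This is routine monitoring: tracking outputs against plan."""
    deviations = {}
    for indicator, target in targets.items():
        actual = actuals.get(indicator, 0)
        shortfall = (target - actual) / target
        if shortfall > tolerance:
            deviations[indicator] = round(shortfall, 2)
    return deviations

# Hypothetical workplan figures for one reporting period
targets = {"trainings_delivered": 20, "reports_submitted": 12}
actuals = {"trainings_delivered": 14, "reports_submitted": 12}
print(monitoring_check(targets, actuals))  # {'trainings_delivered': 0.3}
```

Note that this only answers the monitoring question ("are we on track?"); the evaluation column of the table — effectiveness, impact, value for money — needs a more rigorous research design, not a spreadsheet check.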

What tools?
Tactic 1: ask your peers
Tactic 2: use existing resources

What tools?
Tactic 3: use Google

Is the evidence good enough?
Tactic 4: self-assess, for example using analytical frameworks such as the BOND tool (which we have referred to in the Annexes to the PPA extension)
Tactic 5: ask your peers to review…
Tactic 6: if peers (or you yourself) are not available or lack the right knowledge, use external support

Subliminal message: use NING

Do we have the time?
Tactic 7: identifying, collecting, managing and analysing evidence takes time. Ask for, or allocate, more time for this.

Do we have the time?
Tactic 8: do we need to learn about time management? (Time-management books are available in most airports.)

Do we have the time?
Tactic 9: consider hiring a consultant to provide support and relieve the pressure, especially for the more complex analysis

TACTICS OF ALL TACTICS: YOU, ME, OUR COLLEAGUES
If we understand the need for evidence
If we have the will to do it
If we have the time
If we have the resources <- INCREDIBLY, over the next two years, resources are not necessarily the limiting factor

Some further points to consider in developing your tactics
Analysis: I have the data but do not know what to do with it
How do you deal with time-lag issues in observing results? (Example: see Tuesday's presentation)
Staff turnover will happen – how do you deal with that?
How do you change things when a programme is already being implemented (with partners) and the wrong type of evidence is being gathered?
How do you work in an environment where gathering evidence is not a priority?
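For the first point above ("I have the data but do not know what to do with it"), a useful starting tactic is a simple first-pass summary before reaching for anything more sophisticated: latest value, period-on-period change, and overall direction of travel. The quarterly values below are hypothetical, purely to show the shape of such a summary.

```python
def summarise_indicator(values):
    """First-pass summary of a routinely collected indicator:
    latest value, change since the previous period, overall
    trend relative to the first period, and the mean."""
    latest, previous = values[-1], values[-2]
    if latest > values[0]:
        trend = "up"
    elif latest < values[0]:
        trend = "down"
    else:
        trend = "flat"
    return {
        "latest": latest,
        "change": latest - previous,
        "trend": trend,
        "mean": sum(values) / len(values),
    }

# Hypothetical quarterly values for one outcome indicator
print(summarise_indicator([40, 44, 43, 50]))
# {'latest': 50, 'change': 7, 'trend': 'up', 'mean': 44.25}
```

Even a summary this crude supports the uses of evidence listed earlier: tracking progress, prompting adaptive-management questions (why did Q3 dip?), and giving a concrete figure to report against.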

Thank you