Independent IOD and CDIP Project Evaluations - An external perspective
WIPO 2016 Evaluation Seminar, 29 January 2016
Glenn O'Neil, Evaluation consultant


Independent IOD and CDIP Project Evaluations - An external perspective
WIPO 2016 Evaluation Seminar, 29 January 2016
Glenn O'Neil, Evaluation consultant
oneil@owlre.com | www.owlre.com

Evaluation cycle: Methodology → Implementation → Findings → Use

This is a simplified diagram of the evaluation process, working in a cyclical way, with each step leading into the next. Ultimately the whole experience should feed back into improved methodology, and so on. I will use this diagram to make some reflections on evaluation at WIPO.

Here I have highlighted some points that I have noticed about evaluation in WIPO, per step of the cycle.

Methodology: procedures in place; variety of methods; pre-post designs; largely inclusive
- Procedures are in place and are largely respected.
- A variety of methods can be used; programs/projects often collect interesting data (note the need for monitoring).
- Some 90% of evaluations are post-only. With WIPO programs/projects, if prepared well, some pre-data can exist, allowing for a stronger pre-post design.
- Evaluations at WIPO are inclusive: stakeholders and staff do participate.

Implementation: reaching stakeholders; isolating WIPO's contribution
- Reaching all relevant stakeholders is often difficult, also given that WIPO is often one or two steps removed from beneficiaries.
- It is difficult to analyse data so as to isolate WIPO's contribution, often because WIPO activities (and IP in itself) are part of a larger "package" that relies on many different inputs to "succeed".

Findings: manageable feedback; structured reception
- Feedback is manageable in that there are not hundreds of people commenting on drafts (in my experience).
- Evaluation reports are received in a relatively structured way; the CDIP, for example.

Use: often direct and intended use; input into program design/direction
- Reports are often used directly, and for what they were intended.
- Programs/projects do use findings as an input into program/project design.
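The pre-post point above can be made concrete with a small sketch. Assuming a hypothetical participant survey scored on a 1-5 scale before and after a WIPO training project (all figures below are illustrative, not from an actual evaluation), a pre-measure lets the design report change per participant rather than a single post-only snapshot:

```python
from statistics import mean

# Hypothetical pre/post survey scores (1-5 scale) for six participants.
# Illustrative data only, not from an actual WIPO evaluation.
pre_scores = [2.1, 2.8, 3.0, 2.5, 2.2, 3.1]
post_scores = [3.4, 3.9, 3.7, 3.2, 3.5, 4.0]

# With a pre-measure, we can compute the change for each participant,
# which a post-only design cannot do.
changes = [post - pre for pre, post in zip(pre_scores, post_scores)]

print(f"Mean pre score:  {mean(pre_scores):.2f}")
print(f"Mean post score: {mean(post_scores):.2f}")
print(f"Mean change:     {mean(changes):.2f}")
```

The design strength comes from the paired comparison: each participant serves as their own baseline, which is why collecting even some pre-data when a project starts makes a later evaluation considerably stronger.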

Context: the main influences on evaluation

However, evaluation does not occur in isolation. Here I have introduced what I identify as the main influences on the evaluation cycle: evaluation policies and institutions, the evaluation field, program design and management, the organisational setting, funding, and people. These contextual factors can be both enabling and hindering, depending on the situation.

What I highlight in bold are the three influences that I see as the most important, in WIPO and elsewhere. Note that they are all internal factors and therefore "possible" to control, to a certain extent:

- Evaluation policies and institutions are key to enabling evaluation in an organisation. WIPO has built its institution over time, and its policy is flexible rather than "dogmatic"; for example, it does not enforce a given method or approach.
- Organisational setting: the notion of culture is important, and elements such as RBM and the CDIP have encouraged M&E in WIPO.
- Program design and management: the acceptance of M&E by programs is key, incorporating monitoring elements, indicators, baselines and measurable objectives. This we see starting to happen in WIPO.

In this final slide, I highlight five areas where improvements could be made:

1. Link between evaluation policy and programs: programs need to ensure that they adopt the minimum (and more!) required for M&E and monitoring, and that staff have the time and resources to be involved.
2. Funding: funding for evaluation needs to be budgeted for by programs.
3. Isolating WIPO's contribution deserves more thought. Does the evaluation field (academia, professional associations, the evaluation community) have methods to tackle this better, such as contribution analysis or comparative studies (e.g. a country with and a country without WIPO support)?
4. People: participation of WIPO staff in evaluation design/TOR and input at all steps is important. This must be kept manageable, but the link between participation and use should be clearly shown.
5. Use: use of findings can be seen directly in WIPO; however, these "squiggles" show other ways that use occurs:
   - It can be unintended and direct.
   - It can be unintended and indirect.
   - It can be intended and indirect.
   - It can be intended, indirect, and not occur.
   This last point is not an improvement per se; it is just to be aware that evaluation reports can be used in unintended and indirect ways.
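The comparative-study idea in point 3 can be sketched as a simple difference-in-differences calculation. Assuming a hypothetical outcome indicator measured before and after an intervention in a country with WIPO support and in a comparison country without it (all figures below are invented for illustration), the contribution estimate is the difference between the two changes:

```python
# Hypothetical outcome indicator (e.g. an IP-related measure), before
# and after the intervention period. Figures are illustrative only.
supported = {"before": 120, "after": 180}    # country with WIPO support
comparison = {"before": 115, "after": 140}   # country without support

change_supported = supported["after"] - supported["before"]      # 60
change_comparison = comparison["after"] - comparison["before"]   # 25

# Subtracting the comparison country's change removes the trend that
# would have occurred anyway, giving a rough contribution estimate.
estimated_contribution = change_supported - change_comparison
print(estimated_contribution)  # 35
```

This is deliberately simplistic: it assumes the two countries would otherwise have followed parallel trends, which is exactly the kind of assumption that contribution analysis and more careful comparative designs are meant to test.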

Thank you!
@glenn_oneil
Contacts: oneil@owlre.com / glennoneil
www.owlre.com
www.intelligentmeasurement.net