1 Workshop: EVALUATION PROCESS – PROGRAMMES AND PROJECTS
Waterford, 10th April 2006
Dr Jim Fitzpatrick, Managing Director
Fitzpatrick Associates, 10 Lad Lane, Dublin 2

CONTENT 2
- Nature/basis of comments
- Types of evaluation
- Evaluation as a "process"
- Key steps in doing evaluation
- Methodologies in practice
- Common problems
- Specific cross-border issues
- Some "tips"

NATURE/BASIS OF COMMENTS 3
- Wide practical experience, especially at programme level
- EU co-financed and other work
- Related work
- INTERREG and other cross-border programmes
- All external evaluation

4 THE PROJECT CYCLE
[Diagram of the project cycle]
Source: 2004 PCM Manual

EVALUATION CYCLE AND FOCUS 5

Evaluation question | Ex-ante | Interim/Mid-term | Post mid-term
------------------- | ------- | ---------------- | -------------
Rationale           | ***     | **               |
Relevance           | ***     | **               | *
Effectiveness       | *       | **               | ***
Efficiency          | *       | **               | ***
Impact              |         | *                | ***

(More stars = greater evaluation emphasis at that stage.)

A PROCESS, NOT JUST A TECHNIQUE 6
[Diagram: EVALUATION at the centre, "a balancing act" between:]
- Managing the team
- Stakeholder involvement
- Research
- Analysis
- Time, resources, budget
- Client, user relations

TASKS IN A TYPICAL EVALUATION PROCESS 7
1. Establish/understand the context
   - Who is the "client"?
   - Why is the evaluation being done?
   - Is any specific use intended?
   - What kind of evaluation is needed?
2. Obtain/prepare/agree the brief (Terms of Reference)
   - Is there one? Is it clear? Is it agreed? Should you write one?
3. Prepare the work plan (proposal)
   - Overall approach (how the brief is interpreted, and how to go about it)
   - Analytical framework (the overall logic)
   - Methodology/techniques (e.g. CBA, CEA, MCA; a CBA sketch follows this slide)
   - Work programme (the data and data collection)
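The slide above names cost-benefit analysis (CBA) among the candidate techniques. As a minimal sketch of the arithmetic behind a CBA, assuming illustrative cash flows and a 5% discount rate (both invented for the example, not workshop figures):

```python
# Minimal cost-benefit analysis (CBA) sketch: discount annual costs and
# benefits to present values, then compare them. All figures and the 5%
# discount rate are illustrative assumptions.

def present_value(cash_flows, rate):
    """Discount a list of annual cash flows (year 0 first) to present value."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

costs = [100.0, 20.0, 20.0, 20.0]    # capital outlay in year 0, then running costs
benefits = [0.0, 60.0, 60.0, 60.0]   # benefits start in year 1

pv_costs = present_value(costs, rate=0.05)
pv_benefits = present_value(benefits, rate=0.05)

print(f"NPV: {pv_benefits - pv_costs:.1f}")                  # > 0 suggests worthwhile
print(f"Benefit-cost ratio: {pv_benefits / pv_costs:.2f}")   # > 1 suggests worthwhile
```

The same skeleton extends to cost-effectiveness analysis (CEA) by replacing monetised benefits with a physical outcome measure per unit of discounted cost.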

EVALUATION TASKS CONTINUED 8
4. Evaluation team/resources (budget)
   - Number of people / person-days
   - Types of people
   - Necessary expertise (e.g. on technical aspects)
5. Doing the evaluation
   - Implement the method/work programme
   - Manage client relations
   - Manage the team
   - Deal with unexpected issues
6. The output/report/schedule
   - Meetings: how often, how many, when?
   - Nature of the report (length? style?)
   - Presentations?

EVALUATION METHODOLOGY IN PRACTICE 9
- Trying to establish whether the intervention did (or will) make a difference
- So the "with-without" scientific method is at its core
- Formal quantitative techniques are very desirable, but very difficult in practice
- MCA/scoring, weighting and ranking are the most used (see the sketch after this list)
- Others useful are:
  - before vs. after (time-series)
  - places that do and don't have the intervention ("control group")
  - "expert" opinion
  - views of stakeholders
- You always need some framework for answering the evaluation questions (samples available)
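Since scoring, weighting and ranking (MCA) is flagged above as the most used approach, here is a minimal sketch of how such an exercise can be computed. The criteria, weights and option scores are invented for illustration:

```python
# Minimal multi-criteria analysis (MCA) sketch: score options against
# weighted criteria and rank them. Criteria, weights and scores are
# illustrative assumptions, not taken from the workshop.

criteria_weights = {"relevance": 0.40, "effectiveness": 0.35, "efficiency": 0.25}

# Panel scores per option on a 1-5 scale for each criterion.
option_scores = {
    "Project A": {"relevance": 4, "effectiveness": 3, "efficiency": 5},
    "Project B": {"relevance": 5, "effectiveness": 4, "efficiency": 3},
    "Project C": {"relevance": 3, "effectiveness": 5, "efficiency": 4},
}

def weighted_score(scores, weights):
    """Sum of score * weight over all criteria."""
    return sum(scores[c] * w for c, w in weights.items())

# Rank options from highest to lowest weighted score.
ranking = sorted(option_scores.items(),
                 key=lambda item: weighted_score(item[1], criteria_weights),
                 reverse=True)

for rank, (option, scores) in enumerate(ranking, start=1):
    print(f"{rank}. {option}: {weighted_score(scores, criteria_weights):.2f}")
```

The value of the exercise lies less in the final numbers than in forcing an explicit, shared framework of criteria and weights, which is the "framework for answering the evaluation questions" the slide calls for.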

COMMON PROBLEMS IN PRACTICE 10
- Poor initial project/programme design
- Inability to control for external influences (see the sketch after this list)
- Poor/unavailable indicators (too few, too many, or not really capturing the essence of the intervention)
- Lack of consensus about the purpose of the evaluation
- "Scope creep"
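On the inability to control for external influences: a toy illustration, with made-up numbers, of the "with-without" logic from the previous slide, showing how a comparison group nets out background change that a naive before-after reading would credit to the programme:

```python
# Toy difference-in-differences sketch with made-up numbers: a comparison
# ("control") group helps net out external influences that a simple
# before-vs-after comparison would wrongly attribute to the programme.

outcomes = {
    "assisted_region":   {"before": 100.0, "after": 130.0},  # received the programme
    "comparison_region": {"before": 100.0, "after": 115.0},  # did not
}

naive_effect = (outcomes["assisted_region"]["after"]
                - outcomes["assisted_region"]["before"])       # +30, overstates impact

background_change = (outcomes["comparison_region"]["after"]
                     - outcomes["comparison_region"]["before"])  # +15 external trend

programme_effect = naive_effect - background_change             # +15 attributable

print(f"Naive before-after change: {naive_effect:+.0f}")
print(f"Change in comparison region: {background_change:+.0f}")
print(f"Difference-in-differences estimate: {programme_effect:+.0f}")
```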

PARTICULAR ISSUES IN THE INTERREG CONTEXT 11
"Evaluation meets cross-border co-operation!"
- Different member state/regional contexts
- Different national attitudes towards evaluation
- Projects/programmes relatively small
- Interventions often "soft" in nature, and harder to evaluate
- Cross-border evaluation teams needed
- Language/communication problems
- More costly (travel, telecoms, translation, time)

SOME PRACTICAL TIPS 12
- Ensure programme/project planning is good (monitoring and evaluation considered at the outset)
- Take time to get a shared understanding of what's happening
- Ensure there is some kind of method/framework being used
- Use indicators "sensibly": they are the fuel of monitoring/evaluation, not the thing itself
- Cross-border work needs cross-border teams
- Remember it's a process, not just a technique

QUESTIONS AND ANSWERS 13
Looking forward to hearing both the questions and the answers!