Evaluation from the External Evaluator’s Perspective


Evaluation from the External Evaluator's Perspective

MINISTERUL FINANŢELOR PUBLICE
Autoritatea de Management pentru Cadrul de Sprijin Comunitar
Unitatea Centrală de Evaluare

Evaluation Working Group – second training seminar for evaluation staff of the 2007-2013 Romanian NSRF and Operational Programmes

Dr Jim Fitzpatrick, Fitzpatrick Associates Economic Consultants, Ireland
May 18, 2006

Content
- Evaluation as a "process"
- Tasks in a typical evaluation process
- Methodologies in practice
- Common problems in practice
- Procurement of evaluation - typical stages
- Some "tips" for evaluation commissioners
- Pitfalls the evaluator faces
- Some tips for evaluators
- Wider issues for the future

Some overall considerations from the external consultant's perspective
- a wide variety of contexts (e.g. doing v supervising, policy v service delivery, ex ante v ex post, technical v non-technical)
- "planning" and "doing" are closely related
- experience across a wide range of organisations, topics, etc.
- overlaps with the planning of other types of assignment
- an external consultancy perspective

Evaluation is a process, not just a technique! Evaluation is a balancing act between:
- client and user relations
- research and analysis
- managing the team
- stakeholder involvement
- time, resources and budget

Tasks in a typical evaluation process
1. Establish/understand the context
   - who is the "client"?
   - why is the evaluation being done?
   - is any specific use intended?
   - what kind of evaluation is needed?
2. Obtain/prepare/agree the brief (ToR)
   - is there one? is it clear?
   - write one?
   - is it agreed?

Evaluation tasks (continued)
3. Prepare the work plan (proposal)
   - overall approach (i.e. how the brief is interpreted, how to go about it)
   - analytical framework (i.e. the overall logic)
   - methodology/techniques (e.g. CBA, CEA, MCA, benchmarking; see the CBA sketch below)
   - work programme (i.e. the data and data collection, e.g. surveys, interviews*)
4. Evaluation team/resources (budget)
   - number of people/person-days
   - types of people
   - necessary expertise (e.g. on technical aspects)

*data means not just statistics; it includes other information
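Since CBA recurs as a candidate technique, here is a minimal sketch of the discounting calculation at its core. The cash flows and the 5% discount rate are invented for illustration; a real appraisal would take both from the programme's financial data and national guidance.

```python
# A minimal cost-benefit analysis (CBA) sketch: net present value of a
# stream of yearly net benefits. All numbers are invented for illustration.

def npv(net_flows, discount_rate):
    """Discount yearly net benefits (benefits minus costs, year 0 first)
    back to their present value and sum them."""
    return sum(flow / (1 + discount_rate) ** year
               for year, flow in enumerate(net_flows))

flows = [-100.0, 30.0, 40.0, 40.0, 30.0]  # year-0 outlay, then net benefits
print(f"NPV at 5%: {npv(flows, 0.05):.1f}")  # ~24.1: positive, so worthwhile
```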

Evaluation tasks (continued)
5. Doing the evaluation
   - implement the method/work programme
   - client relations
   - manage the team
   - deal with unexpected issues
6. The output/report/schedule
   - meetings: how often, how many, when?
   - nature of the report, e.g. length? style? presentations?

Evaluation methodology in practice
- trying to establish whether the intervention did (or will) make a difference, so "with-without" comparison (the scientific method at its core)
- formal quantitative techniques are very desirable, but very difficult in practice
- MCA (scoring, weighting and ranking) is the most used (see the sketch after this list)
- others that are useful: before v after (time-series); places that do and don't have the intervention ("control group"); "expert" opinion; views of stakeholders
- you always need some framework for answering the evaluation questions (samples available)
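As MCA by scoring, weighting and ranking is the most used technique, a minimal sketch follows. The criteria, weights and scores are invented for illustration; in a real evaluation they would be agreed with the steering group and scored by the team or an expert panel.

```python
# A minimal multi-criteria analysis (MCA) sketch: score options against
# weighted criteria, then rank them. All values are invented.

criteria_weights = {"effectiveness": 0.4, "efficiency": 0.3, "relevance": 0.3}

# Scores on a 1-5 scale for each option against each criterion.
option_scores = {
    "Option A": {"effectiveness": 4, "efficiency": 2, "relevance": 5},
    "Option B": {"effectiveness": 3, "efficiency": 4, "relevance": 3},
}

def weighted_score(scores, weights):
    """Weighted sum of criterion scores for one option."""
    return sum(weights[c] * s for c, s in scores.items())

# Rank options from best to worst by weighted score.
for rank, option in enumerate(
    sorted(option_scores, reverse=True,
           key=lambda o: weighted_score(option_scores[o], criteria_weights)),
    start=1,
):
    total = weighted_score(option_scores[option], criteria_weights)
    print(f"{rank}. {option}: {total:.2f}")  # Option A scores 3.70, B 3.30
```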

Common problems in practice
- poor initial project/programme design
- inability to control for external influences
- poor or unavailable indicators (too few, too many, not really capturing the essence of the intervention)
- lack of consensus about the purpose of the evaluation
- "scope creep"

Procurement of evaluation - typical stages
- policy issue or topic, regulatory requirement
- Terms of Reference, brief
- invitations, tendering
- selection, contracting
- inception
- managing, undertaking, analysing
- reporting

Common Challenges, Good Practice

Stage 1. Policy issue, topic, need
   Challenges: lack of clarity; lack of client consensus
   Good practice: clarity and consensus; spell out objectives

Stage 2. Terms of Reference, brief
   Challenges: the problems of stage 1 flow in here; over-specifying the methodology; the "piece of string" problem
   Good practice: don't rush the ToR; focus on objectives

Stage 3. Procurement, tendering
   Challenges: poor procedures; overly formalised competitive procedures which prevent dialogue

Stage 4. Selection, contracting, inception
   Challenges: disproportionate procedures; rush to get started
   Good practice: proportionality; don't rush the inception

Common Needs, Challenges and Good Practice (cont'd)

Stage 5. Managing, undertaking, analysing
   Challenges: unrealistic deadlines; absence of data; data collection v analysis; multiple stakeholders; "scope creep"
   Good practice: good project management; role of Steering Committees?; avoid surprises

Stage 6. Reporting
   Challenges: emphasis on written "tomes"; difficulty finalising reports; "reversing" compromises into reports
   Good practice: separation of consultants' recommendations from the policy decision

Some comments on the procurement process
- you need to balance competition with the need for dialogue with evaluators
- can you invite too many bidders?
- need for guidance on scale
- circulate all replies to all bidders!
- ability and availability of the client's representatives

Some practical tips for evaluation commissioners
- ensure programme/project planning is good (monitoring and evaluation considered at the outset)
- make sure the Terms of Reference have:
   - clarity
   - focus
   - an indication of scale
- relationships:
   - be open post-selection
   - avoid surprises
   - take time to reach a shared understanding of what's happening
- ensure there is some kind of method/framework being used
- performance indicators: use them "sensibly", and note they are the fuel of monitoring/evaluation, not monitoring/evaluation itself

Pitfalls the evaluator faces
- misunderstanding the context
- objectives unclear or not agreed
- client unclear or not agreed
- lack of balance, being one-dimensional
- thinking you already know the answer
- work that is not used in the end
- having no analytical framework
- being over-ambitious
- not having the right expertise
- failing to consult stakeholders
- not allowing time for project/process management
- no "intellectual leadership"
- a report that does not do the work justice

Some practical tips for the evaluator
- watch for "scope creep"
- keep re-reading the brief
- estimate the time needed and double it!
- avoid surprising the client
- don't over-promise
- structure the report early on
- set internal deadlines
- remember: SATISFACTION = PERCEPTIONS MINUS EXPECTATIONS (S = P - E), illustrated below
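A toy illustration of the S = P - E rule of thumb, with invented scores, showing why "don't over-promise" matters:

```python
# Satisfaction = Perception - Expectation (S = P - E).
# Scores are invented, say on a 1-10 scale.
perception = 7         # what the client feels was delivered
print(perception - 9)  # promised 9 -> S = -2: a dissatisfied client
print(perception - 6)  # promised 6 -> S = +1: same work, a happy client
```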

Wider Issues for the Future
- the extent of an evidence-informed evaluation culture
- the need for research and evaluation to "speak" to policy makers
- the need for more basic, neutral data collection
- the balance between "independence" and "relevance"
- the emphasis on the costs of research/evaluation v the costs of poor policy decisions
- more inter-disciplinary research and evaluation (e.g. "economic" v "social")
- over-evaluation of some areas, under-evaluation of others