Evaluation Helpdesk Peer reviews of Cohesion policy evaluations
Terry Ward, Applica / ISMERI EUROPA
ESF Evaluation Partnership Meeting, Brussels, 28 September 2017
Helpdesk activities
Three main activities of the Helpdesk in relation to the ESF:
- to report on all evaluations of Cohesion policy programmes undertaken in the current programming period and provide short reviews of these
- to review the evaluation plans of MAs for the current period and revisions of these
- to undertake peer reviews by experts of selected evaluations of Cohesion policy programmes
Peer reviews
Aim: to subject selected evaluations to critical appraisal by leading evaluation experts, highlighting strengths, weaknesses and ways of improving
Purpose: to encourage MAs to undertake good-quality evaluations and to demonstrate how to do so
From the evaluations reviewed, shortcomings are evident, as discussed at the Evaluation Partnership meeting in Milan in October last year:
- evaluation questions: too many, too imprecise, too ambitious
- methods specified not appropriate for addressing the evaluation questions
- inadequate supervision of the evaluation process
These failures originate partly from deficiencies in the evaluation process itself:
- ToRs poorly framed
- restrictive procurement regulations
- disproportionate weight of price in the selection criteria
Peer reviews – extended approach
Implication: to improve evaluations, the whole evaluation process needs to be reviewed to identify the reasons for shortcomings and how to correct them, i.e. advising not only evaluators but also the MAs employing them
Subject 'evaluation dossiers', rather than just final evaluation reports, to critical review, covering:
- ToRs
- selection of evaluators
- budget
- inception and interim reports as well as the final report
- other documents relating to the process
This requires the cooperation of MAs to give access to reports, documents and details of the evaluation process
Peer review meetings are opened up to MAs (and possibly evaluators) so that they can learn directly from the experts' comments and advice and enter into discussion with them
Putting these steps into practice
Three extended peer reviews in 2017:
- the first in Malta on ERDF programmes in May
- the second in Berlin on ESF programmes in July
- the third will take place in Tallinn in October
Two dossiers are the subject of review at each meeting:
- completed programme: dossier of the ex post evaluation relating to the period
- ongoing programme: dossier of the planned evaluation and ToR (Malta) and of the mid-term evaluation of the programme (Berlin, Tallinn)
Both dossiers subjected to critical appraisal by 3-4 leading evaluation experts, with the MA present at the meeting (plus the evaluators in Berlin)
Written assessments by the experts on the various elements in the dossiers circulated to participants ahead of the meeting
Zoom on peer review meeting in Berlin 14 July
One-day meeting: 17 participants, 2 evaluation dossiers, discussion in German and English (simultaneous translation)
The meeting was a pilot exercise: the first time that evaluators participated
Specific points from the discussion:
- two-step tendering process: participation competition followed by award competition
- selection criteria in the participation competition: experience and professional qualifications
- successful tenderers in the participation competition invited to prepare a proposal and make a presentation; details discussed at the presentation; offer revised and fine-tuned
- price is part of the tender competition: the budget is not mentioned in the ToR
- conflict between political and programme agendas: pressure to present results earlier than available, so the evaluation was too early to capture effects fully
- the ex post evaluation logic conflicts with the timely planning of the data needed for randomisation approaches and other innovative methods
Results of the extended peer reviews – first observations
Discussion of the elements in the evaluation dossier at the meeting helped to better understand the process and identify strong and weak points in the cycle
In one case, national procurement regulations were found to hinder the selection of evaluators, pointing to a need for change
Expert guidance on ways of improving the evaluation was more tangible and to the point because of the understanding of the underlying process and the MA's aims
The criteria used to assess the evaluations, sent out before the meeting, were useful for:
- highlighting the minimum set of standards with which evaluations should comply
- demystifying the peer review process by making the approach transparent
Extended peer reviews require more effort from both peer reviewers and MAs, but result in more useful outcomes: MAs learn as well as evaluators
Evaluation Helpdesk is ready and willing to support MAs in similar way
How to widen the effects? As emphasised by Gábor TÓTH at the last ESF Evaluation Partnership meeting in March this year:
- follow an evaluation from its inception and advise on each part of the process before it is put into effect, so as to improve it
- use such evaluations as 'showcases' so that other MAs can learn from them
- invite other MAs to participate in the meetings and learn from the discussion
Thank you for your attention