EVALUATION HELPDESK 2014-2020
Quality assessment of Evaluation Plans
Evaluation Network Meeting, 5-6 November 2015
Evaluation Helpdesk 2014-2020
Introduction
Quality assessment of evaluation plans (EPs) is one of the tasks of the Evaluation Helpdesk; the other tasks are:
- to collect and summarise evaluations carried out over the 2014-20 period and to maintain a database on evaluations and evaluation plans (Task 1)
- to manage peer reviews of selected evaluations (Task 2)
- to organise training for Managing Authorities (MAs) on evaluation (Task 3)
- to provide methodological support to MAs on evaluation and related issues (Task 4)
The quality assessments are based on a structured approach developed by high-level evaluation experts during last year's 'Pilot' Evaluation Helpdesk.
The structured approach covers all key requirements set out in the Commission's Guidance Document on Evaluation Plans.
Evaluation Helpdesk 2014-2020
Structured approach for quality assessment of evaluation plans
Six focus areas for assessing the quality of evaluation plans:
1. Management and planning: evaluation function, use of available evidence, time planning, quality management
2. Responsibility and coordination: partnership involvement, MA coordination, cross-MA coordination, budget
3. Design and methods: evaluation design, selection of designs and methods, results orientation, contribution to results
4. Data availability and data systems: data requirements, data availability, comprehensive data sets
5. Skills and expertise: evaluation independence, internal expertise, evaluation networks and providers, training and development
6. Use and communication: evaluation users, evaluation communication, analysis and comparison at EU level
Evaluation Helpdesk 2014-2020
Implementation of the structured approach and work carried out so far
- Development of a template with questions covering the six focus areas
- Assessments carried out by Applica, Ismeri Europa and a network of national experts
- Since the start in September 2015, over 50 evaluation plans have been reviewed (i.e. just over one a day on average)
- Breakdown by country: 14 French OPs; 8 German; 4 Slovak; 3 Romanian; 2 each for Croatia, the Czech Republic, Finland, Poland, Spain and the UK; 1 each for Italy, Malta, the Netherlands, Portugal, Slovenia and Sweden; plus 4 EPs of ETC programmes
- Breakdown by funding: 27 plans of OPs funded by the ERDF and Cohesion Fund, 8 ESF OPs and 16 OPs with mixed funding
Evaluation Helpdesk 2014-2020
Main findings from the assessments carried out so far
The evaluation plans are relatively complete and coherent with regard to:
- The set-up of an operating evaluation function to plan, procure, coordinate and manage evaluations
- The division of responsibilities and coordination of MAs in terms of stated powers and obligations
- The involvement of relevant partners and stakeholders in defining the evaluation plan and implementing it
- The independence of evaluators and of the units responsible for evaluations
- The identification of users of evaluations and the communication and dissemination of evaluation findings
- The budget set aside for the evaluations planned
Evaluation Helpdesk 2014-2020
Main findings from the assessments carried out so far (continued)
The evaluation plans are less complete and coherent with regard to:
- Quality management and the application of quality criteria to review the deliverables of the evaluation process
- The internal expertise available for evaluations and the involvement of external sources of expertise
- The training and development of the internal staff of the evaluation unit and the identification of training needs and sources
- The scheduling of evaluations and the feeding of evaluation findings into decision-making
- Coordination between MAs in terms of arrangements to exchange information on cross-cutting aspects of evaluation
- The planning of the operational requirements of evaluations to enable analysis and comparison at EU level
Evaluation Helpdesk 2014-2020
Main findings from the assessments carried out so far (continued)
But the main weaknesses common to many plans relate to:
- The limited use made of existing evidence from past evaluations and research to identify the main gaps in knowledge about the effects of the programmes and measures supported
- Evaluation design, especially the failure to set out the key evaluation questions to be investigated and why
- The approaches or methods selected to address these evaluation questions and the rationale for their choice
- The identification of the data required to answer the evaluation questions in enough detail to be able to define data sources and check availability
- The assessment of the data available and the identification of possible gaps and deficiencies, including, where counterfactual analysis is planned, in data for non-recipients of funding
- The formulation of a plan to fill gaps in data and correct deficiencies