Study on the monitoring and evaluation systems of the ESF [Contract VC/2017/0131]
ESF Evaluation Partnership Meeting, 7 December 2017, Brussels

Approach:
- 4 online surveys + selection of 53 EPs based on several criteria: all MS covered, category of region, multi-fund/mono-fund, YEI, use of CIE, and budget.
- Results presented are based on the replies and clarifications received. New replies and clarifications helped to refine the identified difficulties and develop proposals, but did not change the characterisation of the systems.
- Replies received from all MS for Surveys A & B. Survey C: no replies from BE, DE, ES, FR, IT, MT, RO. Survey D: no replies from ES, IT, MT. Consultation: no replies from DE, HR, HU, MT.
- We present preliminary findings, focusing not on statistics but on the interpretation and analysis of problems and solutions. Only the most important problems are presented here; the report includes an analysis of all issues identified. Further assessment, complemented with other sources, is foreseen.
Monitoring: problems and possible solutions (preliminary results)

Monitoring requirements

Problems:
- Complexity of requirements
- Relevance & specification of (common) indicators

Possible solutions:
- Reduce the number of indicators
- Introduce flexible reporting requirements
- Revise indicators:
  - to harmonise indicators/definitions across funds
  - to facilitate the use of administrative registers
  - to broaden the scope (e.g. education and soft indicators)
  - to simplify complex indicators (e.g. "other disadvantaged")

Problems:
- Complexity of requirements: MAs report difficulties in understanding the requirements (67%) and in coordinating with beneficiaries (65%). MCs find the monitoring and reporting requirements challenging and the regulations too detailed.
- Relevance and specification of common indicators: indicators focused on labour market status and transitions are not appropriate for all priority axes.

Solutions:
- Reduce the number of indicators: different views (PS vs common indicators). MAs and beneficiaries would like to see the personal data collection requirements reduced (removed or made optional). Particular mention was made of sensitive personal data, not only to reduce the administrative burden but also to avoid the risk of deterring participation (see below in relation to primary data collection), and of indicators related to the household situation.
- Flexible reporting requirements: report only on indicators relevant to the specific activities, outputs and results of a programme (e.g. by TO).
- Harmonise indicators across funds: particularly beneficial (in terms of reducing the burden) where multi-fund monitoring systems already exist, and could help to pave the way for more such systems in future. MC respondents also stressed that this would make evaluation and monitoring easier.
- Facilitate the use of registers: for example by having more flexible indicator definitions (see section on use of registers). This implies dropping indicators for which information cannot be obtained from registers (e.g. homelessness, improved labour market situation, household situation), and it limits the potential use of soft indicators.
Monitoring: problems and possible solutions (preliminary results)

Data collection (direct data collection)

Problems:
- High volume of data requested
- Difficulties with data collection on entry
- Difficulties with data collection upon and after exit

Possible solutions:
- Reduce the number of variables (particularly sensitive data and household situation)

Possible solutions (MS level):
- Increase the use of existing administrative data
- Good practices reported by beneficiaries:
  - allocation of staff to assist participants with questionnaires
  - simplification of questionnaires (easy-to-understand language)
  - collecting data during implementation instead of upon entry

Problems:
- Volume of data requested: the most common problem for beneficiaries and MAs.
- Collecting data on entry:
  - Collection of personal data: participants refusing to provide personal data, obtaining data for minors, strict data protection laws.
  - Data collection methods/forms: e.g. questionnaires too difficult, problems for migrants, difficulties for those with limited IT skills in filling in online applications, etc.
- Collecting data upon and after exit:
  - Collecting data for participants that dropped out (the time of exit is not known in advance, so data collection cannot be scheduled).
  - Reaching specific groups (e.g. migrants and young people who change contact details).
  - Low response rates from participants six months after exiting the operation.

A hypothetical participant-record sketch covering these observation points is shown after this slide.
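As a purely hypothetical illustration (not part of the survey material), the sketch below models a participant record with the observation points discussed above: entry, exit and six months after exit. All field names are assumptions; the optional sensitive fields echo the proposal to make such variables optional.

```python
# Hypothetical sketch of a participant record with the ESF observation points
# discussed above (entry, exit, six months after exit). Field names are
# illustrative assumptions, not the actual common-indicator definitions.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ParticipantRecord:
    participant_id: str
    entry_date: date
    labour_status_on_entry: str                        # e.g. "unemployed", "inactive"
    household_situation: Optional[str] = None          # sensitive; proposal: make optional
    exit_date: Optional[date] = None                   # unknown for drop-outs until it happens
    result_on_exit: Optional[str] = None               # e.g. "in education", "employed"
    result_6_months_after_exit: Optional[str] = None   # often missing: low response rates

# Example: a drop-out for whom exit and follow-up data could not be collected.
record = ParticipantRecord("P-001", date(2017, 3, 1), "unemployed")
print(record)
```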
Monitoring: problems and possible solutions

Data collection (administrative registers)

Problems:
- Inconsistencies between variables
- Tight privacy/data protection laws
- Synchronising existing data with ESF observation points
- Lack of cooperation of relevant organisations in the country

Possible solutions:
- Introduce more flexible indicator definitions
- Tolerate differences between available data and ESF observation dates

Total OPs using administrative data: 16 of the 68 OPs covered by the survey (13 MS).

Problems:
- Inconsistency between variables: administrative registers do not cover all variables and/or the whole population; differences in definitions.
- Participants refusing to provide the personal data required to link registers (e.g. an ID) may also be a problem (a minimal linkage sketch follows this slide).

Possible solutions (MS level):
- Take the necessary legal steps to facilitate/increase the use of existing administrative data.
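To make the register-linkage point concrete, here is a minimal, purely illustrative sketch (not drawn from any Member State system) of joining ESF participant records to an administrative register on a personal identifier; all field names and values are hypothetical.

```python
# Minimal illustrative sketch (not from the study): linking ESF participant
# records to an administrative register via a personal identifier.
# Field names ("person_id", "employment_status") are hypothetical.
import pandas as pd

participants = pd.DataFrame({
    "person_id": ["A1", "A2", "A3"],           # identifier collected on entry
    "entry_date": ["2017-03-01", "2017-04-15", "2017-05-02"],
})

register = pd.DataFrame({
    "person_id": ["A1", "A3", "A4"],           # records held by the register
    "employment_status": ["employed", "unemployed", "inactive"],
})

# Left join keeps all ESF participants; missing matches show as NaN,
# illustrating the coverage gap when a participant refuses to provide the ID
# or when the register does not cover the whole population.
linked = participants.merge(register, on="person_id", how="left")
print(linked)
```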
Monitoring: problems and possible solutions (preliminary results)

Use of data

Problems:
- MC meetings: difficulties with the assessment of performance/progress
- Impact evaluations: limitations in available data
- Performance incentives: issues in defining relevant targets and baselines; unintended outcomes from the performance incentive system

Possible solutions:
- Improve access to monitoring and performance data (Open Data Portal delayed)
- Organise training and mutual learning events for MAs and MCs in the preparation of OPs
- Monitor costs per output or per result (see the sketch after this slide)
- Proposals on the relevance of indicators (mentioned above)

Problems:
- Two fifths of MC respondents (39%) reported some difficulties with the assessment of performance and progress, though fewer (23%) had issues with access to the information needed to review the OP.
- MC meetings, recurring issues: timeliness of information and inappropriate presentation of data; meetings organised at short notice, with insufficient time for preparation; not enough time allocated to the discussion of performance and effectiveness.
- Unintended outcomes: use of targets to limit funding, discouragement of potential beneficiaries, encouragement of rapid use of funds ahead of result orientation, "creaming" of participants.

Possible solutions:
- Training and mutual learning events to set targets and baselines.
- Seven in ten MCs (72%) supported the idea of monitoring costs per output or per result. Concerns: administrative burden, difficulty of comparing the results of different types of intervention, and the methods required being complicated and not cost-efficient.
- Other possible solutions mentioned:
  - 50% of MCs would welcome more active involvement of the Commission in their meetings (i.e. not only as an observer). Rather than having voting rights, respondents suggest that the EC should have greater competence in making recommendations, bringing good examples from other countries, raising new ideas and drawing attention to recent developments.
  - A proposal from MCs was to have a common operational framework/guidelines for MCs: adopting a common approach to assessment processes could improve efficiency.
  - Three quarters (76%) of MC respondents were in favour of a more efficient system of performance incentives, but no specific proposals were made in this respect.

Possible solutions (MS level):
- Improve access to monitoring and performance data (e.g. IT, NL & LT)
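As a hedged illustration of the cost-per-output/result idea, the snippet below computes unit costs for two invented operations; the figures, operation names and output/result definitions are hypothetical and not taken from the study.

```python
# Illustrative sketch only (hypothetical figures): monitoring cost per output
# and per result, as proposed for MC discussions.
operations = [
    # (name, expenditure in EUR, participants supported, participants in employment after exit)
    ("Training scheme", 1_200_000, 800, 320),
    ("Job-search support", 450_000, 600, 150),
]

for name, expenditure, outputs, results in operations:
    cost_per_output = expenditure / outputs   # EUR per participant supported
    cost_per_result = expenditure / results   # EUR per participant in employment after exit
    print(f"{name}: {cost_per_output:,.0f} EUR/output, {cost_per_result:,.0f} EUR/result")
```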
Evaluation: problems and possible solutions (preliminary results)

Quality of evaluation plans

Problems:
- Evaluation questions not always specified
- Evaluation methods confused with data collection methods and not linked to the envisaged questions
- Little information on data requirements & data collection procedures and on the use of findings

Possible solutions:
- Common obligatory evaluation questions (MCs in favour, MAs mixed views)
- Require detailed information on data requirements in EvPs

Most respondents (77%) reported no difficulty in preparing the EvP; however, weaknesses were identified in the assessments of EvPs by the Helpdesk. Some of these shortcomings may reflect a deliberate intention on the part of MAs to allow expert contractors to propose the most appropriate evaluation methodology and questions. However, this strategy has various drawbacks, in particular that the data required for the methods proposed may not be available when the evaluation is undertaken, which can result in significant delays and/or compromises in the quality and robustness of the evaluations.

EC competence:
- Common evaluation questions:
  - In favour: would simplify and accelerate the evaluation process (for instance by helping MAs in the preparation of the evaluation plans and in drafting the tender specifications) and facilitate the comparison and aggregation of results within and across Member States. Questions should be very general and basic, separate for each TO/IP.
  - Against: difficult to make them relevant and useful for all ESF programmes; issues of data availability and the needs of the supported groups; extra administrative burden.
- Respondents were also asked about mid-term evaluations (MAs against, MCs in favour):
  - In favour: would help to make the necessary amendments to the OPs.
  - Against: few results available, particularly when operations do not begin at the same time and have varying speeds of implementation; difficult (and costly) to conduct multiple simultaneous evaluations.

MA competence:
- Partnership with data providers: to better plan data availability, data needs and data collection.
- Common evaluation framework: would help rationalise evaluations and ensure coherence. Common guidelines for all OPs can enable the harmonisation of practices across OPs, the active involvement of all OPs and, overall, faster and more efficient processes. It would set a minimum standard for all evaluations.
- See also the proposals on the use of findings (later slide).

Possible solutions (MS level):
- Partnership with data providers to prepare evaluation activities (e.g. BG)
- Common evaluation framework for all OPs in the MS (e.g. ES, PT)
Evaluation: problems and possible solutions (preliminary results)

Implementation of evaluations

Problems - reasons for delays in evaluations:
- Delay in the implementation of the OP / late start of projects
- Issues with data availability
- Difficulties with the procurement process (e.g. low-quality offers)
- Difficulties with the set-up of the evaluation system
- Lack of resources within MAs

Possible solutions:
- Simplify evaluation requirements for small programmes

The 36 EvPs foresee 615 evaluations, of which 90 are already delayed.

Problems:
- Procurement process: e.g. low-quality offers, which may mean the process has to be relaunched.
- Difficulties with evaluation systems: for example, governance, defining evaluation competences.

Solutions:
- Focus evaluation efforts on areas receiving the majority of funds.
- Terms of reference: specify data requirements (see next slide on data availability) and ensure quality offers & relevant expertise (see later). The ToR need to be comprehensive; they are key to addressing many of the problems encountered in evaluations.

Possible solutions (MS level):
- Include data requirements in the ToR and focus on quality criteria
Evaluation: problems and possible solutions (preliminary results)

Availability of data

Problems:
- Data required not available from the MIS
- Direct data collection difficulties
- Difficulties in accessing administrative registers (legal restrictions)
- Difficulties in obtaining data on the control group

Possible solutions:
- Obligatory specification of data requirements in EvPs

Problems:
- Limited availability of appropriate/relevant data (63% of MAs).
- Data not available from the MIS: parallel data collection required.
- Direct data collection difficulties: low response rates & sensitive personal information (similar issues mentioned under monitoring).
- Difficulties with control group data: related both to identifying a control group (with characteristics similar to those of ESF participants) and to obtaining the required data from the individuals concerned (a matching sketch follows this slide).
- Note: some of the problems on data availability are similar to those outlined under monitoring (data collection, use of registers). The solutions here only address the evaluation element.
- Interestingly, these issues arise not only where data are collected directly from participants but also where administrative registers are used, highlighting that even when registers are made accessible for the purposes of ESF monitoring/evaluation, the available data may not fulfil the evaluation needs.
- Difficulties in this respect reflect the fact that data needs and requirements have generally not been adequately dealt with in the evaluation plans, so that no provision has been made to ensure that the necessary data would be available.

Solutions (EC level):
- Obligatory specification of data requirements in EvPs, so that data needs are taken into account in programme design.

Solutions (MS level):
- The ToR should specify, in addition to the data that are already available (e.g. from the MIS), what data need to be collected.

Possible solutions (MS level):
- Better definition of evaluation questions in EvPs
- Include data requirement specifications in the ToR for potential evaluators
- Facilitate access to administrative data
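To make the control-group point concrete, the sketch below shows one common way of identifying a comparison group with characteristics similar to those of ESF participants: nearest-neighbour matching on a few observed covariates. The data, covariates and approach are purely illustrative assumptions, not taken from any Member State's CIE practice.

```python
# Illustrative sketch only (hypothetical data): selecting a comparison group
# with characteristics similar to ESF participants, as used in simple
# matching-based counterfactual impact evaluations.
import numpy as np

# Covariates: (age, months unemployed before entry)
participants = np.array([[24, 6], [35, 12], [47, 3]], dtype=float)
candidates   = np.array([[22, 5], [30, 14], [50, 2], [40, 9]], dtype=float)

# Standardise so both covariates carry comparable weight.
mean, std = candidates.mean(axis=0), candidates.std(axis=0)
p_std = (participants - mean) / std
c_std = (candidates - mean) / std

# For each participant, pick the nearest non-participant (Euclidean distance).
for i, p in enumerate(p_std):
    distances = np.linalg.norm(c_std - p, axis=1)
    match = int(distances.argmin())
    print(f"Participant {i} matched to candidate {match}: {candidates[match]}")
```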
Evaluation: problems and possible solutions (preliminary results)

Relevant expertise

Problems:
- Lack of technical expertise within MAs & MCs on evaluations, particularly CIEs
- Contractors without the necessary expertise, due to the selection criteria
- Difficulties in finding suitable experts (ESF & evaluation techniques)

Possible solutions:
- Facilitate access to evaluation expertise and good practices

Possible solutions (MS level):
- Strengthen the capacity of MAs (e.g. training, mutual learning events)
- Reduce the weight of price in the award criteria
- Increase the maximum price in the ToR to attract evaluators
- Encourage the involvement of academic evaluation experts

Problems:
- Lack of expertise in CIE: partly because this method was not used in previous programming periods.
- Contractors without the necessary expertise due to the selection criteria: too much emphasis on price over technical quality.
- The difficulty of finding suitable experts is exacerbated by the increased number of evaluations required and is more problematic in small countries.

Solutions:
- The solutions identified are not related to evaluation requirements at EU level.
- Increase maximum prices: for 80% of MA respondents, the financial resources available for evaluations are not an issue. Higher prices may also attract foreign experts with more suitable profiles.
Evaluation: problems and possible solutions (preliminary results)

Use of findings

Problems:
- Evaluations from the previous programming period completed after OP design
- Evaluations do not assess impact (e.g. too general)
- Limited information and lack of formal procedures to ensure the follow-up of recommendations

Possible solutions:
- Include a section in all AIRs to report on the follow-up of recommendations

Problems:
- Evaluations completed after OP design: see possible solutions on the implementation of evaluations.
- Evaluations do not assess impact: see possible solutions on relevant expertise/quality of evaluation plans. A sign that evaluation is treated as a purely administrative procedure.
- Limited information on the follow-up of recommendations in the AIRs (assessment of submitted AIRs), and lack of formal procedures.

Solutions:
- Section in the AIR on the follow-up of recommendations: this is already included in the 2017 and 2019 AIRs and in the final report, although the information is lacking in many cases. Most MA respondents agreed on the usefulness of the existing section on evaluation findings in the AIR, although some mentioned the administrative burden.

Possible solutions (MS level) - examples of actions to follow up recommendations:
- biennial report on the status of each recommendation (IT)
- upload of recommendations in the MIS (CZ)
- indicator on the % of recommendations addressed (ES) (see the sketch after this slide)
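As a simple hypothetical sketch of an indicator of the Spanish type mentioned above (share of recommendations addressed), the snippet below tracks the follow-up status of evaluation recommendations and computes the percentage addressed; the statuses and figures are invented for illustration.

```python
# Hypothetical sketch: tracking the follow-up of evaluation recommendations
# and computing the share addressed (statuses and data are invented).
recommendations = [
    {"id": "R1", "status": "addressed"},
    {"id": "R2", "status": "in progress"},
    {"id": "R3", "status": "addressed"},
    {"id": "R4", "status": "rejected"},
]

addressed = sum(1 for r in recommendations if r["status"] == "addressed")
share = 100 * addressed / len(recommendations)
print(f"{addressed}/{len(recommendations)} recommendations addressed ({share:.0f}%)")
```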
M&E of Conditional Support: steps taken so far
- Literature review on M&E schemes for Conditional Support (CS)
- Adaptation of the monitoring scheme from DG DEVCO/DG NEAR to CS related to the ESF (and other DG EMPL funds)
- Adaptation of the Comprehensive Evaluation Framework from OECD/DAC to ESF-related CS
- Construction of a potential intervention logic for ESF-related CS
- Analysis of the practical implications for the M&E of CS along the lines of the intervention logic
Next steps?
- Focus groups with selected participants (survey respondents & experts) will be held in January 2018 to debate proposals for improving monitoring and evaluation requirements for the next programming period.
- A set of proposals considering two options:
  - Option A: unchanged ESF scope and delivery mechanisms (but with some possible changes)
  - Option B: the ESF integrated into a single Human Capital fund
- Design of the structure and formulation of an M&E framework for a CS programme

Focus groups: the selection of participants has been based on various criteria relating to respondents (experience, type of stakeholder, MS, organisation of the ESF in the country, type of monitoring and evaluation system, etc.). Other participants are also invited (e.g. experts, EC officials).

A set of proposals for changes will be prepared based on the preliminary findings, complemented with other sources of information. They will include the possible solutions presented here plus others, together with a more detailed assessment of the feasibility and implications of changes in the M&E requirements.

Conditional support: final formulation of the M&E framework for CS, following any inputs required and the establishment of conditions for the CS framework.
Thank you!
esf@applica.be