Creating Robust Monitoring Systems
Dr Susan Denman, Head of Research, Monitoring & Evaluation
Open Days Workshops: Managing Regional Development
September 2004, Brussels
Creating Robust Monitoring Systems
How can we ensure that we get high-quality data which are useful for programme management and evaluation purposes?
Structure of Presentation
- Welsh context
- Definitions
- Issues at design and planning stages
- Targeting projects for quality
- Issues at data collection and entry stages
- Reporting
The Welsh Context (1)
- Three main programmes: Objective 1, Objective 2 and Objective 3
- Four Community Initiatives: INTERREG IIIA, URBAN II, LEADER+ and EQUAL
The Welsh Context (2)
- Total European funding of 2,180 million
- Total value of 4,675 million (including private sector match funding)
- Largest programme is Objective 1
- Main aims:
  - to create 43,000 net additional jobs
  - to contribute to raising Gross Domestic Product (GDP)
  - to reduce economic inactivity
ED&T Group of Welsh Assembly Government: WEFO (organisational chart)
- Project Management: ERDF, INTERREG, ESF, EAGGF, FIFG, LEADER+, EQUAL, LCD, URBAN
- Policy and Strategy: RM&E, Policy Branch, Communications, Assembly Business
- Finance and Corporate Services (Paying Authority): Finance, Payments, Article 4, Article 10, IT
RM&E: Research, Monitoring & Evaluation Branch
Head of Branch, plus three teams:
- Monitoring and Reporting (4 staff: 1 researcher, 3 administrators)
  Responsibilities: PMC Papers, Annual Monitoring Business Plans, Annual Implementation Reports, Local Partnership Reports
- Evaluation (1 researcher, 2 research and evaluation specialists)
  Responsibilities: Data Quality, Programme Level Evaluation, M&E advice and guidance
- Corporate (1 administrator, 1 researcher)
  Responsibilities: Training, Dissemination, Synthesis Papers, Ex-Post Evaluation, Closure Reports
Definitions
- Monitoring is the regular and systematic examination of the resources, activities and results to inform management decisions.
- Audit is the verification of the legality and regularity of the implementation of resources.
- Evaluation is a broad concept concerning the judgement made on the value of the initiative, which can encompass the needs which have to be met and the effects produced by it. Evaluation concerns not only outcomes but also the processes by which they have been produced.
Design and Planning
Consider monitoring and evaluation at the outset:
- start with what, not how
- involve users from the start
- choose indicators with care
- ensure the hierarchy of indicators is coherent
- build capacity
Indicators
It helps if indicators:
- are clearly linked with goals, objectives or targets
- cover monitoring and evaluation needs
- are measurable on a regular basis using reliable methods
- provide clear and simple information
- are measurable against baselines
- are congruent with other monitoring indicator systems
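As an illustration of measuring an indicator against a baseline and a target, the sketch below is a hypothetical Python example, not part of any actual programme tooling. The target figure comes from the Objective 1 jobs aim above; the `latest` value and the `Indicator` structure itself are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """A monitoring indicator linked to a programme target."""
    name: str
    baseline: float   # value at programme start
    target: float     # value to be reached by programme end
    latest: float     # most recent measured value (illustrative)

    def progress(self) -> float:
        """Share of the baseline-to-target distance achieved so far."""
        span = self.target - self.baseline
        if span == 0:
            return 1.0
        return (self.latest - self.baseline) / span

# Hypothetical reading against the 43,000 net additional jobs target
jobs = Indicator("net additional jobs", baseline=0, target=43_000, latest=10_750)
print(f"{jobs.progress():.0%}")  # → 25%
```

Reporting progress as a share of the baseline-to-target distance, rather than as a raw count, keeps indicators comparable across programmes of very different sizes.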
Data Provided by Projects
Projects should:
- understand the purpose of centrally collected data
- collect other useful data
- have access to training, support and guidance
- be clear on definitions
- be checked for the quality of their plans
The Project Lifecycle and Monitoring
Lifecycle (Project Sponsor Gateway): project development → submission of application → project appraisal → project approval → delivery and review → payments → audit → closure
Support along the lifecycle: WEFO guidance; advice on prioritisation from Thematic Advisory Groups; WEFO advise; WEFO appraise; advice from thematic groups, cross-cutting theme advisors and local partnerships; WEFO monitor; advice from local partnerships and project officers; assistance from local partnerships and project officers; support and aftercare from local partnerships and project officers
Data Collection and Storage
- timing and methods
- dealing with inaccuracies, gaps, double counting and under-reporting
- designing a good IT system
- testing and evaluating the whole monitoring system
- management checks of data entry
- liaison with global grants/grant schemes
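Management checks of data entry, such as catching double counting and gaps, can be partly automated. The sketch below is a minimal, hypothetical Python example; the field names `project_id` and `jobs_created` are assumptions for illustration, not any programme's actual schema.

```python
import csv
import io

def check_records(rows, key="project_id", required=("project_id", "jobs_created")):
    """Flag duplicate keys (double counting) and missing values (gaps)."""
    problems, seen = [], set()
    for i, row in enumerate(rows, start=1):
        # Gap check: every required field must be filled in
        for field in required:
            if not row.get(field):
                problems.append(f"row {i}: missing {field}")
        # Double-counting check: each key may appear only once
        k = row.get(key)
        if k in seen:
            problems.append(f"row {i}: duplicate {key} {k}")
        seen.add(k)
    return problems

# Illustrative data-entry file with one gap and one duplicate
data = csv.DictReader(io.StringIO("project_id,jobs_created\nP1,12\nP2,\nP1,12\n"))
for problem in check_records(list(data)):
    print(problem)
# prints:
# row 2: missing jobs_created
# row 3: duplicate project_id P1
```

Running checks like these at data entry, rather than at reporting time, means inaccuracies can be queried with the project sponsor while the records are still fresh.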
Use of Data
- involve users in report design
- co-ordinate release of data
- ensure users understand data limitations
- ensure that definitions are known and understood by all
- analyse the data collected
- help with data interpretation