ESF Evaluation Plan England

ESF Evaluation Plan England 2014-2020
Bruce Byrne, Head of England ESF evaluation team

Development of the evaluation plan

- January to March 2014: develop ideas
- April: workshop
- May: first draft
- June: consultation with the ex ante evaluators
- Timing to be agreed: second draft
- Timing to be agreed: management approval
- November/December: submission to the Monitoring Committee

Principles for where to evaluate

- Requirements: what is "the impact"? How and why does the provision work? How can it be improved? How do our different contracting arrangements help provision to be a success?
- Data availability: counterfactual impact evaluations, for example, need rich data to ensure you have a genuine counterfactual.
- Expertise: both in evaluation methods and in knowledge of the data.
- Budgets: where these will come from.
- What we already know about what works: there is less need for evaluation where we already have good evidence.
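To illustrate why counterfactual work is so data-hungry, here is a minimal sketch of nearest-neighbour matching on a propensity score, one common way of constructing a comparison group. All data, scores and function names are invented for illustration; a real evaluation would estimate the scores from rich participant characteristics (age, prior employment, qualifications) and check match quality.

```python
def match_controls(treated_scores, control_scores):
    """For each treated unit, pick the index of the control unit with
    the closest propensity score (nearest-neighbour matching with
    replacement). Purely illustrative - no balance checks, no caliper."""
    matches = []
    for score in treated_scores:
        diffs = [abs(c - score) for c in control_scores]
        matches.append(diffs.index(min(diffs)))
    return matches

# Toy example: propensity scores assumed to have been estimated
# elsewhere (e.g. by a logistic regression on participant data).
treated = [0.62, 0.35]
controls = [0.10, 0.36, 0.60, 0.90]
print(match_controls(treated, controls))  # -> [2, 1]
```

The point of the sketch is that a credible match needs control units whose scores sit close to the treated ones, which is only possible when the underlying data on both groups is rich enough.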

Other considerations

Role of Local Enterprise Partnerships
- We don't think (most) local bodies have the capacity to add value by conducting their own evaluations.
- We do need their buy-in, to help providers see the importance of evaluation.
- We do need to ensure that results feed back to local bodies to aid decisions about what provision they want in their area.

Oversight of the plan
- The 2007-13 programme has oversight from the Evaluation Sub-committee (and, to a lesser extent, directly from the National Monitoring Committee).
- For 2014-2020 we are looking at ways to make this role more strategic by making it a joint ERDF/ESF group and by adding independent evaluation expertise to the group.

Improving how we use data

- All participants should sign consent for their data to be linked for evaluation purposes.
- All providers should make available participant names and contact details for the leavers survey, and all participants should sign consent to share these for evaluation purposes.
- Data requirements for evaluation need to be embedded at the beginning of the programme.
- Requirements for evaluation should be built into contracts in the same way that requirements for management information currently are.
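The consent points above can be sketched as a simple consent-gated linkage step: only participants who have signed consent are joined to outcome records. Field names, IDs and outcome values here are all hypothetical; real ESF linkage would involve secure matching against administrative datasets.

```python
def link_for_evaluation(participants, admin_records):
    """Join participant records to administrative outcome records by a
    shared ID, keeping only participants who consented to linkage.
    Illustrative only - field names are invented for this sketch."""
    outcomes = {rec["id"]: rec["outcome"] for rec in admin_records}
    linked = []
    for p in participants:
        if p.get("consent_to_link") and p["id"] in outcomes:
            linked.append({"id": p["id"], "outcome": outcomes[p["id"]]})
    return linked

participants = [
    {"id": "P1", "consent_to_link": True},
    {"id": "P2", "consent_to_link": False},  # excluded: no consent
    {"id": "P3", "consent_to_link": True},   # excluded: no admin match
]
admin_records = [{"id": "P1", "outcome": "in work"},
                 {"id": "P2", "outcome": "in training"}]
print(link_for_evaluation(participants, admin_records))
# -> [{'id': 'P1', 'outcome': 'in work'}]
```

The sketch shows why consent has to be collected at enrolment: records without it simply drop out of the evaluation dataset, shrinking the sample available for impact analysis.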

Progress (1)

Good progress on:
- learning lessons from 2007-13
- establishing oversight
- arrangements for improving data
- timings

Outstanding work to do on:
- what the evaluation team will look like
- the approach to evaluation, e.g. big projects looking across priorities and delivery areas, or a series of more focused studies followed by a synthesis of findings?
- budgets required
- specifying individual evaluations

Progress (2): Early thoughts on individual evaluations

2015
- YEI implementation evaluation

2016
- ESF implementation evaluations
- Set up the leavers survey

2017
- First year of fieldwork for the leavers survey
- Start of the counterfactual impact evaluation

2018
- Final year of fieldwork for the leavers survey
- Conclusion of the counterfactual impact evaluation
- Theory-based impact evaluation
- YEI impact evaluation

Summary: Key points in building the England ESF Evaluation Plan

- Consulting on the plan will be particularly important as we are moving to a new organisational set-up.
- Ensuring clarity of purpose on what each evaluation will add and how we will use it.
- Doing more counterfactual impact evaluations can require a lot of preparation: ensuring the understanding of the data, the expertise in methods, and the right resources are in place.
- Getting the data sorted out now, so that when it comes to doing the evaluations everyone's life will be a lot easier.
- Securing early approval from the National Monitoring Committee, so that we can put data collection in place and start the YEI evaluation.