Connections and issues in the European evaluation context Professor Murray Saunders President of the European Evaluation Society.

Congratulations on the formation of the Slovak Society for Evaluation, and welcome to the National Evaluation Societies and Networks in Europe (NESE).

A word about what we mean by evaluation (a changing landscape)
What the evaluation community in Europe identifies as important
What is gained by sharing and working together
Discussing the key issue of use and usability

My own background
Began evaluation work in 1982, from a research background in educational change, policy and development
Helped to found the UK Evaluation Society in 1992; President between 2001 and 2003
Chaired the development group which formed the IOCE in 2003
Current President of the European Evaluation Society, Director of the Centre for the Study of Education and Training (CSET), and Professor of Evaluation, Lancaster University
Still defining what evaluation is…

My own background

Evaluative practice
For me (and this is unstable!!), an evaluative practice is the routine, rule-governed behaviour prompted by an evaluative impulse, i.e. an impulse to attribute 'value' or 'worth', in some way, to a process, a programme, an object, a policy, a development or an intervention.

Evaluative practice
Evaluative practice concerns the purposeful gathering, analysis and discussion of evidence from relevant sources about the quality, worth and impact of provision, development, policy or practice. It is how we attribute value.
Evaluative practice also concerns things such as:
Balancing diverse ethical interests
Managing stakeholders
Charting a way through 'difficult' or 'inconvenient' truths

Ideas for a first work programme for NESE
Possible objectives:
Exchanges of information
Presentation of good practices
Monitoring of evaluation activities and context
Encouraging use and usability
Possible means:
Page on the EES website
Permanent contacts
Continued moderation by the EES plus a national society
Meeting in Lisbon, October 2008
Meeting in Münster, 8–9 October 2009

Monitoring evaluation in Europe:
supply of evaluators
education activities regarding evaluation
institutional arrangements within the public sector
activities by the supreme audit offices
pluralism within each policy domain
scope of evaluations

Overall strategy for professional development
Filling gaps
Complementing national training
Contacts, conversations and exchanges of information
Jointly sponsoring
Thinking about capability, competence and standards

Picking up the issues of use and usability…

What counts as use?
"Use refers to the extent to which the outputs of an evaluation are used as a resource for onward practice, policy or decision making."

What counts as usability?
"Usability refers to the way an evaluation design shapes the extent to which its outputs can be used."

How might 'use' be encouraged?
'Use' is enhanced by inclusivity:
Authentication of focus and instrumentation
Interest in outputs/findings
Social capital building
New knowledge as socially owned
Increased chance of changes in practice
How do different types of evidence/data (e.g. narratives or statistics) determine 'use' practices?
Political use of different types of evidence

'Process use'
Process use refers to the unintended effects of carrying out an evaluation, or 'asking questions':
Foregrounding new issues
Drawing attention to 'hot spots' or problem areas
Forcing attention on difficult areas
Providing a 'voice' for the powerless
Drawing attention to time-lines
Making participants think about 'audience' and 'users'
Policing role

Use as 'engagement' (from less to more):
Dissemination practice: report, executive summary, article
Presentational practice: seminars, presentations, active workshops
Interactional practice: working alongside colleagues; analysis of situational enabling and constraining factors
Embodiments

Issues concerning 'use'
Embedded in decision-making cycles (clear knowledge of when decisions take place and who makes them)
Clear understanding of organisational memory (how evaluations might accumulate)
Capacity of an organisation to respond:
Systemic processes (feeding into structures that are able to identify and act on implications)
Organisations that are lightly bureaucratised (complex adaptive systems) are better placed to respond to 'tricky' or awkward evaluations
Evaluations that are strongly connected to power structures (what does this mean?)
Evaluations that are congruent: suggestions based on evaluation need to build on what is already in place

Designing evaluations for usability: some critical questions
Reasons and purposes [planning, managing, learning, developing, accountability]
Uses [providing and learning from examples of good practice, staff development, strategic planning, PR, provision of data for management control]
Foci [activities, aspects, emphasis to be evaluated; should connect to the priority areas for evaluation]
Data and evidence [numerical, qualitative, observational, case accounts]
Audience [community of practice, commissioners, yourselves]
Timing [coincidence with decision-making cycles, life cycle of projects]
Agency [yourselves, external evaluators, a combination]

Key issues going forward?
In last year's survey (15 societies and networks in Europe), these were some of the issues identified for the evaluation community in Europe:
Evaluation use and usability
Raising politicians' awareness of evaluation
Promoting research on evaluation
Promoting and defining standards and good practice
Supporting evaluation capacity builders in the public service
Promoting evaluation training
Setting up evaluation societies