New frontiers for evaluation: challenges to evaluation practice and knowledge base. Philippe Larédo, ENPC and University of Manchester. International Conference on IoIR.

Starting point
- The comeback of evaluation.
- Main source mobilised: the recent review conducted for the OECD with Luke Georghiou.
- Two main "surprises" (compared to the 1990s):
  - growing trust by policymakers and a stronger articulation with the decision-making process
  - a changing set of issues (institutional transformation, a changing understanding of the challenges faced)
… which question evaluation practices and require reconsidering part of the accumulated knowledge base on evaluation.

Three main messages
- Rethinking delivery: "being taken seriously" drives new requirements on evaluation products and delivery processes.
- Changing foci require adapting processes (an issue as much for policymakers as for research analysts).
- For research in evaluation methods, the issue is less one of refining existing tools than of reconsidering approaches to existing questions and of devising new designs for changing understandings.

The presentation
Focus on changing foci:
1- system review and institutional "renewal"
2- excellence, frontier science and the evaluation of "institutes"
Analysing for each: delivery, process and methodological issues.

I- NSI & institutional transformation: the "old" model revisited
1. Political shaping of the institutional framework: revisiting the external (OECD reviews) vs internal (advisory bodies) divide:
   - reporting mechanisms
   - new forms of external review
2. 'Research' evaluation: from the "performance & relevance" of individual instruments (mainly programmes) to:
   - the relevance of present funding arrangements
   - portfolios of programmes

System reviews
National policy evaluations ("OECD without OECD", e.g. Finland):
- based on nationally selected international peers --> 'authoritative' reports (credibility for political staff)
- process and methods are secondary
- main issue = "pragmatism" in recommendations (what is politically acceptable and feasible in terms of implementation)
Monitoring systems (e.g. GPRA and PART):
- de facto, mostly new "reporting" systems (reshaping the format of institutions' annual reports) --> one danger (e.g. Australian university monitoring): forgetting institutions!
- quantitative data = the will to "benchmark" and/or "rank" institutions --> one danger: caught between the 'banal' (what is available) and the 'ad hoc' (no possible comparisons) --> thus a central research issue for evaluation research = 'positioning' indicators (see the illustrative sketch below)
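One way to see what a 'positioning' indicator adds over raw counts or league tables is the classical activity (relative specialisation) index: the share of an institution's output in a field divided by the world share of that field. The sketch below is a minimal illustration, not part of the original slides; the fields and publication counts are hypothetical.

```python
# Illustrative 'positioning' indicator: the activity (relative specialisation) index.
# AI = (institution's share of its own output in a field) / (world share of that field).
# AI > 1: the institution is relatively specialised in the field; AI < 1: under-represented.
# All counts below are hypothetical.

institution_pubs = {"energy": 120, "life sciences": 80, "ICT": 40}
world_pubs = {"energy": 50_000, "life sciences": 210_000, "ICT": 90_000}

inst_total = sum(institution_pubs.values())
world_total = sum(world_pubs.values())

for research_field in institution_pubs:
    inst_share = institution_pubs[research_field] / inst_total
    world_share = world_pubs[research_field] / world_total
    activity_index = inst_share / world_share
    print(f"{research_field:14s} activity index = {activity_index:.2f}")
```

Unlike a ranking, such an indicator positions an institution relative to a reference set without implying that more is always better, which is the point of contrasting 'positioning' indicators with the 'banal' and the 'ad hoc'.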

Reviewing funding arrangements (1)
Complex evaluations of key funding structures (e.g. RCN, FFF and FWF).
- Main characteristics: professional, international consortia; multiple entry points; de facto encompassing (whatever the terms of reference); benchmarking often central; new products (see above).
- Methodological issues: most evaluations conclude on the professionalisation of the evaluated body and of the corresponding ministry. But what are the 'relevant standards'?
- Process: consortia are selected through very complex and detailed calls, while it is very difficult to anticipate the hierarchy of aspects (especially the points to deepen) --> one critical issue = the administrative shaping of evaluations.

Reviewing funding arrangements (2): delivery issues
The changing landscape of 'decision-making':
- no longer advice feeding into administrative decision-making
- more and more feeding into a public debate (which started before the evaluation and will go on after it: see Austria)
Impacts on publications:
- two steps: evaluation files & an evaluation synthesis
- the need to delineate targeted audiences, and the issue of writing the synthesis adequately
Impact on the delivery process: not a one-off product but repeated interactions with 'stakeholders'.

Evaluating portfolios of programmes
The panel-based model (e.g. the Finnish Academy of Sciences, the Framework Programme).
Characteristics:
- not a meta-evaluation
- tackling broader issues (composition, coherence and relevance of the portfolio, relevance of the implementation structures...)
Process problems:
- ongoing: at best preliminary 'characterisation' studies (recipients, effects, evaluations of individual programmes) plus the usual panel meetings
- problem: the need for 'professional studies' of the transversal problems identified
- major issue: organising a two-step panel-based evaluation process
Delivery issue: reports are mostly 'boring', with the usual long list of recommendations. How to frame the synthesis and the interaction?

Programme portfolios: methodological issues
Still problems at programme level:
- the relationship between programme aims and evaluation criteria: where was the 'problem-solving dimension' in the FP5 evaluation? And how to cope with "societal effects" or "social benefits"?
- the relationship between the effects identified and their interpretation: e.g. discussing the skewed distribution of effects (see the illustrative sketch below).
Major work needed at portfolio level:
- analysing the composition of the portfolio
- assessing the relevance and performance of implementation structures: which references, which "benchmarks"...
- benchmarking: the need for a 'clearing house'?
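The 'skewed distribution of effects' point can be made concrete with simple concentration measures at project level. The sketch below is an illustration only, not part of the original slides; the project-level effect scores are simulated, and the top-decile share and Gini coefficient are two common ways of summarising how concentrated effects are.

```python
# Illustrative summary of a skewed distribution of programme effects across projects.
# The effect scores are simulated (hypothetical), drawn from a heavy-tailed distribution.
import numpy as np

rng = np.random.default_rng(42)
effects = rng.lognormal(mean=0.0, sigma=1.5, size=200)   # hypothetical project-level effects

sorted_effects = np.sort(effects)[::-1]                   # largest effects first
top_decile = sorted_effects[: len(effects) // 10]
top_decile_share = top_decile.sum() / effects.sum()

# Gini coefficient as a summary measure of concentration (x sorted ascending)
x = np.sort(effects)
n = len(x)
gini = (2 * np.arange(1, n + 1) - n - 1) @ x / (n * x.sum())

print(f"Share of total effects from the top 10% of projects: {top_decile_share:.1%}")
print(f"Gini coefficient of the effect distribution: {gini:.2f}")
```

With a distribution this skewed, a small fraction of projects accounts for a large share of the measured effect, which is precisely why interpreting average or aggregate programme effects is problematic.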

II- Evaluation & capability building
A fast-growing focus for policy and evaluation:
- the shaping and core funding of institutes by institutions, e.g. the Helmholtz Association, CSIC, INSERM...
- the multiplication of programmes for "centres of excellence", "competence centres"... (see the overview by Technopolis)
- the rapid deployment of national (and regional) systems for evaluating university research (following the UK RAE)
Lines of change: periodic and articulated to the institution's strategic programming; introducing competitive processes; based on an international peer model reviewing quality ("excellence"); direct connection with funding.

Evaluation & 'institutes': process issues
Process: the "international peer-based model" (even with delegation to outside bodies, e.g. EMBO).
- How to cope with aspects other than 'academic quality'?
- Path and organisational dependency: how can the model internalise these?
- Critical 'ex-ante' shaping by required formats (often highly specified).
Delivery: the articulation between evaluation and funding.
- Often blurred mechanisms within institutions.
- The specific case of university research: forgetting "universities as institutions" (a key incoherence of most, if not all, existing mechanisms).

Capacity building: methodological issues
New keywords: excellence, fragmentation, attractiveness, frontier science...
"Picturing" the landscape:
- the role of mapping approaches and 'positioning' tools and indicators (e.g. the Shanghai ranking)
- handling the normative dimension: is more, or higher, necessarily better?
Measuring transformation: changing relations between action, outputs, outcomes and effects (see the sketch below):
- time needed for new (human) capability building
- markers of output = new articles, patents, diplomas
- outcome = capacity mobilised (directly via contracts or indirectly via mobility)
- effect = performance or result of the mobilisation (with all the well-known problems of attribution)
Assessing policy: what is the relation between a given policy support and the construction of new capabilities? (another type of "project fallacy"?)
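To keep the output / outcome / effect distinction operational when collecting evidence, it can help to record the three levels separately for each supported centre. The sketch below is only an illustration, not part of the original slides; the centre, counts and descriptions are invented.

```python
# Illustrative record separating outputs, outcomes and effects of a capability-building action.
# All names and figures are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class CapabilityRecord:
    action: str                                        # the policy action being assessed
    outputs: List[str] = field(default_factory=list)   # markers: articles, patents, diplomas
    outcomes: List[str] = field(default_factory=list)  # capacity mobilised (contracts, mobility)
    effects: List[str] = field(default_factory=list)   # results of mobilisation (attribution problems apply)

record = CapabilityRecord(
    action="five-year core funding of a hypothetical 'competence centre'",
    outputs=["38 journal articles", "4 patent applications", "12 PhDs awarded"],
    outcomes=["9 industry contracts mobilising the new capacity",
              "6 PhD graduates recruited by regional firms"],
    effects=["new product line reported by two partner firms (attribution uncertain)"],
)

for stage in ("outputs", "outcomes", "effects"):
    print(f"{stage}: {getattr(record, stage)}")
```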

Some conclusions
- We are still in the infancy of the discussion on articulating evaluation and decision-making --> impacts on evaluation products, interaction with audiences and the 'administrative shaping of evaluations'.
- Positioning problems and actors' capacity and strategy are major issues --> shifting from input-output indicators to "positioning" indicators.
- The growing focus on capabilities asks for important methodological developments.
- Issues of institutional relevance entail new "two-step" (or even more) processes.