How to measure the impact of R&D on SD? Laurence Esterle, MD, PhD, Cermes and Ifris, France. Cyprus, 16–17 October 2009. Linking science and policy in SD research.

Questions from the RPF
1. What kind of indicators are needed to measure the contribution of R&D to SD?
2. Which are the most appropriate methodologies for developing indicators that measure the contribution of R&D to SD, and for implementing their use?
3. Which resources are needed to support the development and assessment of these indicators?
4. Are there any other appropriate methods?

1st question: What kind of indicators are needed to measure the contribution of R&D to SD?
Measuring the contribution of R&D to SD is:
- Important for stakeholders and for society
- A justification of public funding for R&D
- Part of the ex-post evaluation process: measuring the effects of the policy intervention
- Useful for improving the design and implementation of the intervention
Measuring the impact of R&D on SD calls for outcome indicators.

Impact measurement: a new conceptual challenge?
There is a wide variety of impacts:
- Economic or social
- Direct or indirect
- Short term or long term
- Tangible, or related to knowledge and skill effects
Many reports from evaluation experts suggest that:
- Impact measurement represents a 'new conceptual challenge'
- Despite these efforts, impact measurement remains one of the most methodologically intractable areas of evaluation

Impact measurement: an unrealistic challenge? (1)
Conceptual gap:
- Innovation does not develop according to a linear model (from basic research to use in society)
Technical gap:
- The question of attribution: what part of the observed change can be attributed to research?
- The question of additionality: how to ascribe an observed effect to the results of research alone?
Epistemological gap:
- There is a chain of events from the analytical knowledge gained by research to the synthetic knowledge represented by capabilities and expertise, and from there to decision and action.
- Measuring such impact therefore requires modelling this whole chain.

Impact measurement: an unrealistic challenge? (2)
Delivery gap:
- Research results are by nature unpredictable and uncertain, and spin-offs risky
- The research life cycle is very long, and the gap between discovery and application may be years or even decades
For these reasons, there is no satisfactory response to the challenge of measuring the impact of R&D taken in a narrow sense, and such measurements are in danger of losing credibility and relevance for their users.

2nd question: Which are the most appropriate methodologies for developing indicators that measure the contribution of R&D to SD, and for implementing their use?
A 'modest' alternative: measuring the activities of research groups that will contribute to SD.
Background:
- Direct measurement of the contribution of R&D to SD looks illusory
- But it is possible to trace and measure the activities of the research groups that will contribute, directly or indirectly, to SD
Method in three steps (see the sketch below):
- Analyse the needs of the policy makers and the goals of the R&D programme
- Trace the activities of the research groups that are likely to answer these needs and objectives: traditional activities (e.g. knowledge production) and socio-economic activities (e.g. links with enterprises, expert assessments, communication towards society, ...)
- Build the corresponding indicators
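To make the three steps concrete, here is a minimal Python sketch, assuming hypothetical policy needs, traced activities, and (group, activity) survey records; all names and data are illustrative, not taken from the RD4SD exercise.

```python
# A minimal sketch of the three-step method; every name and value
# here is an illustrative assumption.

# Step 1: needs of the policy makers / goals of the R&D programme
policy_needs = ["inform climate policy", "engage the public"]

# Step 2: activities of research groups likely to answer each need
activities_by_need = {
    "inform climate policy": ["expert assessments", "publications cited in SD reports"],
    "engage the public": ["media communication", "public conferences"],
}

# Step 3: one counting indicator per traced activity
def indicators_for_need(need, records):
    """Count, among (group, activity) records collected e.g. by survey,
    the activities traced as answering the given policy need."""
    counts = {activity: 0 for activity in activities_by_need[need]}
    for _group, activity in records:
        if activity in counts:
            counts[activity] += 1
    return counts

records = [("group A", "expert assessments"),
           ("group A", "public conferences"),
           ("group B", "expert assessments")]
for need in policy_needs:
    print(need, indicators_for_need(need, records))
# inform climate policy {'expert assessments': 2, 'publications cited in SD reports': 0}
# engage the public {'media communication': 0, 'public conferences': 1}
```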

Examples of the issues examined during the workshop
Adopting an interdisciplinary style of research
- Related activities: participation of researchers from various disciplines in the project/programme
Effective translation of research results into science advice for policy
- Related activities: production of new knowledge; participation of researchers in expert assessments for policy makers; participation of researchers in international conferences (e.g. on climate)
Making research findings available, in accessible forms, to key decision makers and the wider public
- Related activities: communication towards policy makers and the public; participation in public conferences and debates

From identifying activities to building indicators
First phase: identify the needs and expectations of the stakeholders
Second phase: identify the related activities
Third phase: identify the corresponding indicators and the possible sources of data

Examples of indicators
Activity: communication for policy makers, international organisations, etc.
- Indicators: number of participations in SD conferences and workshops; number of expert assessments; number of publications cited in SD reports
Activity: communication for the public
- Indicators: number of communications in various media; number of participations in public conferences
Activity: cooperation with professionals
- Indicators: number of participatory research projects; number of field research projects
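As one concrete example, the indicator 'number of publications cited in SD reports' could be computed roughly as below, assuming the reports are available as plain text and the group's publications are identified by DOI; both assumptions are illustrative, not part of the original exercise.

```python
# A rough sketch for one indicator from the list above: how many of a
# group's publications (identified here by DOI, an assumption) are
# cited in a collection of SD reports held as plain text.
def publications_cited(report_texts, dois):
    """Count the group's DOIs that appear in at least one SD report."""
    return sum(1 for doi in dois
               if any(doi in text for text in report_texts))

group_dois = ["10.1000/example.1", "10.1000/example.2"]
reports = ["... as argued in 10.1000/example.1 ...",
           "a report citing no publication of the group"]
print(publications_cited(reports, group_dois))  # -> 1
```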

Benefits/expectations from policy makers, corresponding activities of researchers, indicators, and origin of data:

Impact on people's behaviour
- Non-academic communication. Indicators: number of communications through various media; number of participations in public conferences. Origin of data: media database (Factiva); surveys
- Learning and education activities. Indicators: number of participations in non-academic courses; involvement of stakeholders in the research project. Origin of data: surveys

Informing policies
- Communication towards policy makers. Indicators: number of publications cited in SD reports; number of communications to SD conferences. Origin of data: data from SD reports; surveys
- Interaction with policy makers. Indicators: number of participations in evaluations; number of reports. Origin of data: surveys

3rd question (1): Which resources are needed to support the development and assessment of these indicators?
Building indicators means bridging the gap between detailed data and interpreted information. An indicator is a sign of a complex system and should be seen as a process rather than a product.
Needs and resources:
- Collecting data organised in databases
- Capabilities in indicator building (methodology, tools, etc.)
- Expertise in science policy, for identifying the questions and the policy issues and for analysing and interpreting the results

One major issue: collecting data
There are many possible sources of data and databases:
- Acquiring existing databases (e.g. scientific publication databases, patent databases, media databases): in these cases, the database should be adapted and exploited according to the needs
- Building internal databases (e.g. project, publication, human resources, and funding databases): in these cases, the data should be collected (e.g. by surveys) and the database built according to quality standards and controls (see the sketch below)
In either case, the database should be maintained and kept up to date for long-term use.
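A minimal sketch of an internal activity database with two of the quality controls mentioned above, schema validation and de-duplication; the fields, the record format, and the in-memory storage are all illustrative assumptions.

```python
import datetime

REQUIRED_FIELDS = {"group", "activity", "date"}

class ActivityDB:
    """Toy internal database for survey-collected activity records."""

    def __init__(self):
        self._rows = []
        self._seen = set()

    def add(self, record):
        """Validate one record and reject exact duplicates."""
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            raise ValueError(f"missing fields: {missing}")
        datetime.date.fromisoformat(record["date"])  # rejects malformed dates
        key = (record["group"], record["activity"], record["date"])
        if key in self._seen:  # e.g. the same answer from two surveys
            return False
        self._seen.add(key)
        self._rows.append(record)
        return True

db = ActivityDB()
db.add({"group": "A", "activity": "expert assessments", "date": "2009-10-16"})
db.add({"group": "A", "activity": "expert assessments", "date": "2009-10-16"})
print(len(db._rows))  # -> 1: the duplicate was not stored
```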

3rd question (2): How to assess the interest and the quality of indicators?
By using an evaluation process, at regular, periodic intervals:
- For assessing the relevance of the indicators: by stakeholders and external experts in science policy and evaluation
- For assessing the methodologies used and the quality of the indicators: with external experts in indicator methodology and science policy

4th question: Are there any other appropriate methods for measuring the contribution of R&D to SD?
Another quantitative approach: building indicators that measure behavioural additionality
- This answers the question: what difference does the intervention (e.g. launching an R&D programme for SD) make to behaviour? Does it change practices towards SD?
- An essential dimension is the situation considered as the alternative to the intervention (including no intervention at all); see the sketch below
- One of the difficulties in the field of SD is that the effect of the intervention on the behaviour of society is indirect
Qualitative approaches can also be used:
- Surveys
- Field studies and socio-anthropological methods
- Participatory approaches
Qualitative and quantitative methods can also be combined, e.g. building indicators together with a jury approach (RD4SD exercise).
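A minimal sketch of a behavioural-additionality indicator: comparing the share of research groups reporting an SD practice under the intervention with the share under the alternative situation (here, no programme). The survey data and the simple difference in rates are illustrative assumptions; a real study would also have to address selection into the programme.

```python
def adoption_rate(groups):
    """Share of surveyed groups reporting the SD practice."""
    return sum(g["adopted_sd_practice"] for g in groups) / len(groups)

# Hypothetical survey answers for groups inside and outside the programme
funded = [{"adopted_sd_practice": True},
          {"adopted_sd_practice": True},
          {"adopted_sd_practice": False}]
unfunded = [{"adopted_sd_practice": False},
            {"adopted_sd_practice": True}]

# Behavioural additionality as a simple difference in adoption rates
print(adoption_rate(funded) - adoption_rate(unfunded))  # -> ~0.17
```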

The advantages of developing a dual (quantitative and qualitative/participative) approach
- The process of building indicators is in itself a way to make explicit the implicit questions regarding the issues of the intervention
- The presentation of indicators can itself generate debate and contribute to improving them
- Participative debates allow the various stakeholders (policy makers, professionals, researchers, citizens, ...) to express their opinions and concerns
- A dual approach is a source of information, interaction, and learning from which all parties can benefit.

Thank you very much for your attention!