How to measure the impact of R&D on SD?
Laurence Esterle, MD, PhD, Cermes and Ifris, France
Linking science and policy in SD research
Cyprus, 16 – 17 October 2009
Questions from the RPF
1. What kind of indicators are needed to measure the contribution of R&D to SD?
2. Which are the most appropriate methodologies for developing indicators that measure the contribution of R&D to SD, and for implementing their use?
3. Which resources are needed to support the development and assessment of these indicators?
4. Are there any other appropriate methods?
1st question: What kind of indicators are needed to measure the contribution of R&D to SD?
Measuring the contribution of R&D to SD is:
– Important for stakeholders and for society
– A justification of public funding for R&D
– Part of the ex-post evaluation process: measuring the effects of the policy intervention
– Useful for improving the design and implementation of the intervention
Measuring the impact of R&D on SD therefore calls for outcome indicators.
Impact measurement: a new conceptual challenge?
There is a wide variety of impacts:
– Economic or social
– Direct or indirect
– Short term or long term
– Tangible or related to knowledge and skill effects
Many reports from evaluation experts suggest that:
– Impact measurement represents a 'new conceptual challenge'
– Despite these efforts, impact measurement remains one of the most methodologically intractable areas of evaluation
Impact measurement: an unrealistic challenge? (1)
Conceptual gap:
– Innovation does not develop according to a linear model (from basic research to use in society)
Technical gap:
– The question of attribution: what part of the observed change can be attributed to research?
– The question of additionality: how to ascribe an observed effect to the results of research alone?
Epistemological gap:
– There is a chain of events from the analytical knowledge gained by research to the synthetic knowledge represented by capabilities and expertise, and from there to decision and action
– Measuring such impact therefore requires modelling this whole chain
Impact measurement: an unrealistic challenge? (2)
Delivery gap:
– Research results are by nature unpredictable and uncertain, and spin-offs are risky
– The research life cycle is very long, and the gap between discovery and application may be years or even decades
For these reasons, there is no satisfactory response to the challenge of measuring the impact of R&D in a narrow sense, and such measurements are in danger of losing credibility and relevance for their users.
2nd question: Which are the most appropriate methodologies for developing indicators that measure the contribution of R&D to SD, and for implementing their use?
A 'modest' alternative: measuring the activities of research groups that will contribute to SD
Background:
– Direct measurement of the contribution of R&D to SD looks illusory
– But it is possible to trace and measure the activities of the research groups that will contribute, directly or indirectly, to SD
A method in three steps:
– Analyse the needs of policy makers and the goals of the R&D programme
– Trace the activities of the research groups that are likely to answer these needs and objectives: traditional activities (e.g. knowledge production) and socio-economic activities (e.g. links with enterprises, expert advice, communication toward society)
– Build the corresponding indicators
Examples of the issues examined during the workshop
Adopt an interdisciplinary style of research
– Related activity: participation of researchers from various disciplines in the project/programme
Ensure effective translation from research results to science advice for policy
– Related activities: production of new knowledge; participation of researchers in expert advice for policy makers; participation of researchers in international conferences (e.g. on climate)
Make the research findings available in accessible forms to key decision makers and the wider public
– Related activities: communication towards policy makers and the public; participation in public conferences and debates
From identifying activities to building indicators
First phase: identify the needs and expectations of the stakeholders
Second phase: identify the related activities
Third phase: identify the corresponding indicators and the possible sources of data
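Purely as an illustrative sketch (not part of the original slides), the three phases can be expressed as a simple data structure that links each stakeholder expectation to related activities and to candidate indicators with their data sources; the entries below are hypothetical examples drawn loosely from the tables that follow.

```python
# Illustrative sketch: linking stakeholder expectations (phase 1) to activities
# (phase 2) and to indicators with their data sources (phase 3).
# All entries are hypothetical examples, not an official indicator set.

from dataclasses import dataclass, field


@dataclass
class IndicatorMapping:
    expectation: str       # phase 1: stakeholder need or expectation
    activities: list[str]  # phase 2: related research activities
    indicators: dict[str, str] = field(default_factory=dict)  # phase 3: indicator -> data source


mappings = [
    IndicatorMapping(
        expectation="Informing SD policies",
        activities=["Communication towards policy makers"],
        indicators={
            "Number of publications cited in SD reports": "SD reports",
            "Number of communications to SD conferences": "surveys",
        },
    ),
    IndicatorMapping(
        expectation="Impact on people's behaviour",
        activities=["Non-academic communication"],
        indicators={
            "Number of communications through various media": "media database",
            "Number of participations in public conferences": "surveys",
        },
    ),
]

for m in mappings:
    print(m.expectation)
    for indicator, source in m.indicators.items():
        print(f"  {indicator} (source: {source})")
```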
Examples of indicators
Activity: communication for policy makers, international organisations, etc.
– Corresponding indicators: number of participations in SD conferences and workshops; number of expert assessments; number of publications cited in SD reports
Activity: communication for the public
– Corresponding indicators: number of communications in various media; number of participations in public conferences
Activity: cooperation with professionals
– Corresponding indicators: number of participative research projects; number of research projects in the field
Benefits/expectations from policy makers, corresponding researcher activities, indicators, and origin of data:
Impact on people's behaviour
– Non-academic communication: number of communications through various media (media database, e.g. Factiva); number of participations in public conferences (surveys)
– Learning and education activities: number of participations in non-academic courses; involvement of stakeholders in the research project (surveys)
Informing policies
– Communication towards policy makers: number of publications cited in SD reports (data from SD reports); number of communications to SD conferences (surveys)
– Interaction with policy makers: number of participations in evaluations; number of reports (surveys)
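As a minimal sketch only, assuming a hypothetical flat list of activity records gathered by survey, the indicators in the table above amount to counting records per activity type; the record format and category names are invented for illustration.

```python
# Minimal sketch: computing indicator values by counting hypothetical
# activity records collected through surveys.

from collections import Counter

# Each record: (researcher, activity_type); all rows are invented.
records = [
    ("Researcher A", "media_communication"),
    ("Researcher A", "public_conference"),
    ("Researcher B", "publication_cited_in_sd_report"),
    ("Researcher B", "sd_conference_communication"),
    ("Researcher C", "media_communication"),
]

indicator_values = Counter(activity for _, activity in records)

print("Number of communications through various media:",
      indicator_values["media_communication"])
print("Number of publications cited in SD reports:",
      indicator_values["publication_cited_in_sd_report"])
```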
3rd question (1): Which resources are needed to support the development and assessment of these indicators?
Building indicators means bridging the gap between detailed data and interpreted information. An indicator is a sign of a complex system and should be seen as a process rather than a product.
Needs and resources:
– Data collection, organised in databases
– Capabilities in indicator building (methodology, tools, etc.)
– Expertise in science policy, to identify the questions and policy issues and to analyse and interpret the results
One major issue: collecting data
There are many possible sources of data and databases:
– Acquiring existing databases (e.g. scientific publication, patent or media databases): in this case the database has to be adapted and exploited according to the needs
– Building internal databases (e.g. project, publication, human resources and funding databases): in this case the data have to be collected (e.g. by surveys) and the database built according to quality standards and controls
In any case, the database has to be maintained and kept up to date for long-term use.
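As a sketch of what such an internal database might look like (the table layout, column names and example row are assumptions, not part of the presentation), a lightweight store such as SQLite can hold one record per researcher activity, with a collection date on each record so the base can be maintained and updated for long-term use:

```python
# Illustrative sketch of a small internal database for researcher activities.
# Table layout and example rows are hypothetical.

import sqlite3

conn = sqlite3.connect("sd_indicators.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS activities (
        id INTEGER PRIMARY KEY,
        researcher TEXT NOT NULL,
        activity_type TEXT NOT NULL,  -- e.g. 'media_communication', 'sd_conference'
        activity_date TEXT NOT NULL,  -- ISO date of the activity
        source TEXT NOT NULL,         -- e.g. 'survey', 'media database'
        collected_on TEXT NOT NULL    -- when the record was collected (for updates)
    )
""")
conn.execute(
    "INSERT INTO activities (researcher, activity_type, activity_date, source, collected_on) "
    "VALUES (?, ?, ?, ?, ?)",
    ("Researcher A", "sd_conference", "2009-06-15", "survey", "2009-09-01"),
)
conn.commit()

# Example indicator: number of SD conference participations per researcher.
for researcher, n in conn.execute(
    "SELECT researcher, COUNT(*) FROM activities "
    "WHERE activity_type = 'sd_conference' GROUP BY researcher"
):
    print(researcher, n)

conn.close()
```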
3rd question (2): How to assess the interest and the quality of indicators?
By using an evaluation process, carried out at regular intervals:
– To assess the relevance of the indicators: with stakeholders and external experts in science policy and evaluation
– To assess the methodologies used and the quality of the indicators: with external experts in indicator methodology and science policy
4th question: Are there any other appropriate methods for measuring the contribution of R&D to SD?
Another quantitative approach: building indicators that measure behavioural additionality
– This answers the question: what difference does the intervention (e.g. launching an R&D programme for SD) make to behaviour? Does it change practices towards SD?
– An essential dimension is the situation considered as the alternative to the intervention (including no intervention at all)
– One of the difficulties in the field of SD is that the effect of the intervention on the behaviour of society is indirect
Qualitative approaches can also be used:
– Surveys
– Field studies and socio-anthropological methods
– Participatory approaches
Qualitative and quantitative methods can also be combined, e.g. building indicators together with a jury approach (RD4SD exercise).
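Purely for illustration (all groups and figures below are hypothetical), a behavioural-additionality indicator compares the change in an observed practice among research groups reached by the intervention with the change in a comparison group that stands for the alternative situation:

```python
# Hypothetical sketch of a behavioural additionality estimate
# (difference-in-differences style comparison; all figures invented).

# Share of research groups reporting SD-oriented practices, before and after
# the launch of a (hypothetical) SD R&D programme.
funded  = {"before": 0.30, "after": 0.55}   # groups reached by the programme
control = {"before": 0.28, "after": 0.35}   # comparison group (no intervention)

change_funded  = funded["after"] - funded["before"]
change_control = control["after"] - control["before"]

behavioural_additionality = change_funded - change_control
print(f"Estimated behavioural additionality: {behavioural_additionality:+.2f}")
# Here: +0.18, i.e. 18 percentage points attributable to the intervention,
# under the (strong) assumption that the control group is a valid counterfactual.
```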
The advantages of developing a dual (quantitative and qualitative/participative) approach
– The process of building indicators is in itself a way to make explicit the implicit questions regarding the issues addressed by the intervention
– The presentation of indicators can itself generate debate and contribute to improving the indicators
– Participative debates allow the various stakeholders (policy makers, professionals, researchers, citizens...) to express their opinions and concerns
– A dual approach is a source of information, interaction and learning from which all parties can benefit.
Thank you very much for your attention!
esterle@vjf.cnrs.fr