Options for Evaluating Research: Inputs, Outputs, Outcomes
Michele Garfinkel, Manager, Science Policy Programme
ICSTI ITOC Workshop, 19 January 2015, Berlin

Today’s talk
About EMBO
A policy view of research assessment
Stakeholder roles

About EMBO
European Molecular Biology Organization (Maria Leptin, Director)
Founded 1964, Heidelberg, DE
Funded by the European Molecular Biology Conference
–27 Member States
–3 cooperation agreements
Advancing policies for a world-class European research environment

Science Policy Programme
Three main areas: biotechnology, responsible conduct of research, scientific publishing
Biotechnology
–Governance
–Technology assessment
Scientific publishing
–Open access
–Data
–Responsibilities of editors, administrators, authors

Scientific publishing The publication of scientific information is intended to move science forward. More specifically, the act of publishing is a quid pro quo in which authors receive credit and acknowledgment in exchange for disclosure of their scientific findings.

Journal name as proxy for quality
Journal Impact Factor: a librarian’s number
The concern is not use, but misuse
–Research assessment
–“JIF: a subscription for the price of the IF”
Why has this been adopted for research assessment?
–Cross-disciplinary
–Intuitive and reflective
–Prospective
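The arithmetic behind the JIF is simple: citations received in one year to a journal's items from the previous two years, divided by the number of citable items published in those two years. A minimal sketch, using entirely hypothetical counts (the function name and figures are illustrative, not from any real journal):

```python
def journal_impact_factor(citations_to_prev_two_years: int,
                          citable_items_prev_two_years: int) -> float:
    """Two-year JIF: mean citations per citable item from the prior two years."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# Hypothetical journal: 1,200 citations in 2014 to material published in
# 2012-2013, during which the journal published 400 citable items.
jif = journal_impact_factor(1200, 400)
print(jif)  # 3.0
```

Note that this is a journal-level average, which is exactly why its use to judge individual articles or researchers is contested.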

Research assessment is an ecosystem
Funders
Researchers
Journals
Other assessors?

What DORA sets out
Main recommendation: Do not use journal-based metrics, such as Journal Impact Factors, as a surrogate measure of the quality of individual research articles, to assess an individual scientist’s contributions, or in hiring, promotion, or funding decisions.
Implementation?
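The statistical reasoning behind this recommendation is that citation distributions are highly skewed: a journal-level mean (the JIF) is dominated by a few highly cited papers and says little about a typical article. A small sketch with made-up citation counts illustrates the gap between the mean and the median:

```python
import statistics

# Hypothetical citation counts for 10 articles in one journal.
# A couple of highly cited papers dominate the average.
citations = [0, 0, 1, 1, 2, 2, 3, 4, 15, 72]

mean = statistics.mean(citations)      # journal-level average (JIF-like)
median = statistics.median(citations)  # the typical individual article

print(mean, median)  # 10.0 2.0
```

Here the JIF-like mean (10.0) is five times the median (2.0), so the journal-level number is a poor predictor of any single article's citation performance.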

What DORA sets out
Research institutions and funding agencies: be clear on evaluation criteria and consider all contributions
Publishers: do not use the JIF as a marketing tool, make more article-level metrics available, make all reference lists open, remove limits on reference-list length

What DORA sets out
Metrics suppliers: provide methodology and data in a useful form; account for variation in article types (reviews vs. research articles)
Researchers: as assessors, review for scientific content; as authors, cite the appropriate (primary) literature; challenge bad practices

What DORA does not say
Metrics-based research assessment is wrong
JIF is flawed for assessing journals
Citations are a flawed metric
There is a simple alternative
Publishers are to blame
Thomson Reuters is to blame

What DORA does not say
Metrics-based research assessment is wrong
JIF is flawed for assessing journals
Citations are a flawed metric
There is a simple alternative
Publishers are to blame
X is to blame
Altmetric Score

Incremental advances
More institutions and funders are emphasizing biosketches and “select your 5 best papers” strategies over the IF
Constructive discussions with Thomson Reuters
–More interest in dialogue and a willingness to improve the JIF as a metric
Competition is good for everyone

Incremental advances
Engagement with funders
Engaging additional research communities
Study national/regional variations
Editorials forthcoming
–Key point: better analyses needed
Policy analysis
–Implementation and governance issues, metrics, stakeholders

It’s the system (?)
This is not (just) about overworked or lazy promotion committees and rapacious journals
The reward system in science is (becoming) warped
Resources for thorough evaluation are not available
Journal articles have become the currency of rewards rather than a contribution to knowledge

Research Assessment: Stakeholders
Researchers
Publishers
Research administrators
Funders
Metrics researchers
Metrics providers
Decision-makers

What should we be assessing?
We are great at measuring inputs (funding, numbers of students)
We are good at measuring outputs (numbers of papers, some impact measures)
Outcomes measurements are a problem

What should we be assessing?
Papers
–And how they are discovered?
Data
–And how they are discovered?
Reviewing?
Teaching?
Committee work?
Responsible conduct?

Ongoing work
Workshops
–Governance issues
–Stakeholders
Engagement with funders