LIDA 2014, 16-20 June 2014, Zadar, Croatia
The metrics acumen: supporting individual researchers in assessment
Paul Wouters, Centre for Science and Technology Studies (CWTS)

New roles for research libraries
- Increased bibliometric services at university level, available through databases
- Increased self-assessment via "gratis bibliometrics" on the web (h-index, Publish or Perish, etc.)
- Emergence of altmetrics
- Increased demand for bibliometrics at the level of the individual researcher
- Societal impact measurements required
- Career advice: where to publish?

Mushroom growth of evaluation
- A relatively recent phenomenon (since the mid-1970s)
- Formal evaluation protocols: performance indicators all over the place, but citation indicators hardly visible
- Science policy studies tend to underestimate the proliferation and impact of indicator-based evaluations
- Recent studies focus on performance-based funding
- "Anecdotal evidence" shows the proliferation of especially the Hirsch index and the JIF

Peter Dahler-Larsen, The Evaluation Society
- "Evaluations are not something that the individual can reject"
- Evaluation as a disembedded reflexive social practice
- Evaluation consists of: evaluand, criteria, systematic methodology, purpose

Evaluations are liminal
"One often has the feeling that there should have been a clear-cut plan for the purpose and process of an evaluation, but this is often not the case. (…) people realize too late that they had very different notions of plans for evaluation (…) The purpose of the evaluation constitutes an ongoing controversy rather than a common logical starting point." (p. 15)

Evaluation Machines
- Primary function: make stuff auditable
- Mechanization of control: degradation of work and trust? (performance paradox)
- Risks for the evaluand and defensive responses
- What are their costs, direct and indirect?
- Microquality versus macroquality: lock-in
- Goal displacement & strategic behaviour

Constitutive effects
- Limitations of conventional critiques (e.g. 'perverse or unintended effects')
- Effects on: interpretative frames; content & priorities; social identities & relations (labelling)
- Spread over time and levels
- Not a deterministic process
- Democratic role of evaluations

Effects of indicators
- Intended effect: behavioural change
- Unintended effects: goal displacement, structural changes
- The big unknown: effects on knowledge?
- Institutional rearrangements
- Does quality go up or down?

Responses of the scientific community
- Strategic behaviour
- Ambivalence
- Sophisticated understanding of indicators and citation numbers
- Responses vary by discipline, style, position (Hargens and Schuman 1990)
- "Self-interest" is not a valid explanation

The ACUMEN project, with Mike Thelwall and Judit Bar-Ilan

Academic Careers Understood through Measurements and Norms
- European 7th Framework collaborative project
- Capacities, Science in Society 2010
- Grant Agreement: institutional partners in 7 countries

ACUMEN research
- comparative analysis of peer review systems in Europe
- assessment of scientometric indicators in performance evaluation
- analysis of the gender dimension in researcher evaluation
- Common Data Strategy
- assessment of webometric (and altmetric) indicators
- ethnographic study of important evaluation events

15 European countries: Bulgaria, Czech Republic, Denmark, Estonia, Finland, France, Germany, Hungary, Israel, Italy, Netherlands, Poland, Slovenia, Spain, United Kingdom

4 academic disciplines: (a) astronomy and astrophysics; (b) public and occupational health; (c) environmental engineering; (d) philosophy (including history and philosophy of science)

Aim: to give researchers a voice in evaluation
➡ evidence-based arguments
➡ shift to dialogue orientation
➡ selection of indicators
➡ narrative component
➡ Good Evaluation Practices
➡ envisioned as web service

[Portfolio diagram: expertise, output, influence, narrative]

ACUMEN Portfolio
Career Narrative: links expertise, output, and influence together in an evidence-based argument; included content is negotiated with the evaluator and tailored to the particular evaluation.

Output: publications; public media; teaching; web/social media; data sets; software/tools; infrastructure; grant proposals

Expertise: scientific/scholarly; technological; communication; organizational; knowledge transfer; educational

Influence: on science; on society; on economy; on teaching

Evaluation Guidelines:
- aimed at both researchers and evaluators
- development of evidence-based arguments (what counts as evidence?)
- expanded list of research output
- establishing provenance
- taxonomy of indicators: bibliometric, webometric, altmetric
- guidance on the use of indicators
- contextual considerations, such as stage of career, discipline, and country of residence

Use of the ACUMEN Portfolio
- As a self-evaluation tool, to give insights into your career
- As part of a job, grant or promotion application, evaluators:
  – request an ACUMEN Portfolio from candidates
  – may request a full or cut-down Portfolio
  – compare the candidates' Portfolios with the help of the guidelines

Data sources
- Bibliometric sources
  – Limitations in terms of accuracy and interpretation
  – Google Scholar, Scopus, Google Books and Web of Science recommended as main sources
- Webometric sources
  – More limitations in terms of accuracy and interpretation
  – Many different web and social web sources (e.g., Twitter, Mendeley)
  – Cover types of impact invisible to bibliometric indicators

Details
Evaluators: portfolio selection
- Select the aspects of the portfolio most relevant to the task, discipline and seniority of applicants
- Full portfolio (10 hours) only for serious cases
Academics: portfolio completion
- Start with your CV, and get a librarian's help?
Evaluators: portfolio evaluation
- Compare candidates based on the importance, relevance, and reliability of the indicators, driven by the narrative
- Take into account academic age (and typical team size)
- Start by reading and checking the narrative; narrow down candidates on the basis of the narrative; compare the full ACUMEN Portfolios for similar candidates

Detailed advice about indicators
- Read the Guidelines for Good Evaluation Practice for help interpreting the indicators
- Reliability and importance of each indicator
- Bibliometrics, e.g., the reliability of Google Scholar results
- Webometrics, e.g., whether the number of tweets of an article is relevant

Academic age
- Start date: PhD defence
- Corrections for:
  – Part-time work
  – Work outside academia
  – Number of children raised after the start of the PhD (1 year per child; 0.5 year when care is shared between carers)
  – Special allowances (e.g. disability, illness-related time off work > 6 months)
- Academic age = number of years since PhD defence − corrections
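
A minimal sketch of this calculation in Python, for concreteness. The function name, parameter names, and default values are illustrative assumptions based on the slide above, not part of the ACUMEN specification.

```python
from datetime import date

def academic_age(phd_defence: date,
                 today: date,
                 years_part_time_equiv: float = 0.0,
                 years_outside_academia: float = 0.0,
                 children_raised: int = 0,
                 shared_care: bool = False,
                 special_allowance_years: float = 0.0) -> float:
    """Academic age = years since PhD defence minus corrections.

    Assumption (per the slide): each child raised after the start of
    the PhD counts as 1 year, or 0.5 year when care is shared.
    """
    years_since_phd = (today - phd_defence).days / 365.25
    per_child = 0.5 if shared_care else 1.0
    corrections = (years_part_time_equiv
                   + years_outside_academia
                   + children_raised * per_child
                   + special_allowance_years)
    return max(0.0, years_since_phd - corrections)

# Example: PhD defended mid-2005, two children with shared care,
# one year spent outside academia -> roughly 7 years of academic age
print(round(academic_age(date(2005, 6, 1), date(2014, 6, 16),
                         years_outside_academia=1.0,
                         children_raised=2, shared_care=True), 1))
```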

Narrative
- Highlights achievements, ambitions and interests
- Links the three sub-portfolios together
- Presents a self-perspective
- Situation dependent
- Not too long: not more than 500 words

Expertise (1)
Scientific/scholarly
- Theoretical
- Subject
- Methodological
- Originality/independence
A few sentences briefly summarising the specific expertise. Evidence to support claims should be provided, such as citing a paper.

Expertise (2)
Knowledge transfer
- Reviewing
- Entrepreneurship
Educational expertise
- Courses taught or developed
- Other educational expertise
Examples (not more than 3: the most prominent ones)

Output (1)
Scholarly outputs (total count + top 3 for each sub-factor)
- Books
- Book chapters
- Reviews
- Editorials
- Journal articles
- Conference papers

Output (2)
Communication to the general public (total count + top 3 for each sub-factor)
- Press stories
- Encyclopedia articles
- Popular books/articles
Teaching
- Textbooks (total count + top 3)
- Online courses (top 3)
- Students completed (counts of undergraduates, Master's, and PhD students)

Influence (1)
Influence on science
- Total and average citations, preferably both from Google Scholar and from Web of Science or Scopus
- Article citations: number of citations received by each of the top 3 articles
- h-index: the largest number h such that at least h articles have received at least h citations
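
Since the h-index definition above is compact, a short sketch may help. The computation itself is standard; the function name here is illustrative.

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that at least h articles have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank   # the top `rank` articles all have >= rank citations
        else:
            break
    return h

# Example: five articles cited [10, 8, 5, 4, 3] times -> h-index = 4
print(h_index([10, 8, 5, 4, 3]))
```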

Influence (3)
Influence on science (continued): top 3
- Scholarly prizes
- Editing, editorial board membership and reviewing
- Conference/program committee memberships
- Downloads: download counts for the top 3 downloaded articles
- Invited talks: total counts per type + details of top 3

Influence (7)
Influence on economy
- Income
- Consultancies (top 3)
- Citations from patents (total count + details of top 3)
- Citations to patents (total count + details of top 3)
- Spin-offs (count)

Influence (8)
Influence on education/teaching (top 3)
- Teaching awards
- Online views of presentations: number of views of top 3, if substantial, on SlideShare, YouTube, Vimeo, online learning environments, etc.
- Syllabus mentions: online syllabuses or course-notes pages listing the academic's works (top 3); include only when educational uptake is important

Portfolio & Guidelines
➡ Instrument for empowering researchers in the processes of evaluation
➡ Takes into consideration all academic disciplines
➡ Suitable for other uses (e.g. career planning)
➡ Able to integrate into different evaluation systems