Research evaluation at CWTS: meaningful metrics, evaluation in context
Ed Noyons, Centre for Science and Technology Studies, Leiden University. RAS Moscow, 10 October 2013
Outline: Centre for Science and Technology Studies (CWTS, Leiden University) history in short; CWTS research programme; recent advances.
25 years of CWTS: history in short
25 years of CWTS: history in short (1985-2010)
Started around 1985 by Anthony van Raan and Henk Moed; one and a half FTE funded by the university; context is science policy and research management; mainly contract research and services (research evaluation); staff stable around 15 people (10 researchers); main focus on publication and citation data (in particular Web of Science).
25 years of CWTS: history in short (2010 - …)
Block funding since 2008; since 2010 moving from mainly services with some research to a research institute with services; new director Paul Wouters; new recruitments: now ~35 people.
CWTS research programme
Research and services
Bibliometrics (in the context of science policy) is ...
Opportunities: research accountability => evaluation; need for standardization and objectivity; more data available.
Vision: quantitative analyses beyond the 'lamppost'
Other data; other outputs; research 360º: input, societal impact/quality, the researchers themselves.
Background of the CWTS research programme
Existing questions: how are actors doing, what does a field look like, what are the main developments, which university performs best, ...
New questions: How do scientific and scholarly practices interact with the "social technology" of research evaluation and monitoring knowledge systems? What are the characteristics, possibilities and limitations of advanced metrics and indicators of science, technology and innovation?
Current CWTS research organization
Chairs: Scientometrics; Science policy; Science, technology & innovation.
Working groups: Advanced bibliometrics; Evaluation Practices in Context (EPIC); Social sciences & humanities; Society Using Research Evaluation (SURE); Career studies.
Back to bibliometrics: a look under the lamppost
Recent advances at CWTS
Platform: Leiden Ranking.
Indicators: new normalization to address multidisciplinary journals and (journal-based) classification.
Structuring and mapping: advanced network analyses; publication-based classification.
Visualization: VOSviewer.
The Leiden Ranking
Platform: Leiden Ranking http://www.leidenranking.com
Based on Web of Science ( ); only universities (~500); the only dimension is scientific research; indicators (state of the art): production, impact (normalized and 'absolute'), collaboration; size-dependent and size-independent variants; research institutes will be added.
Leiden Ranking – world top 3 (PP(top 10%))
Normalized impact; the expected value is 10%; stability intervals enhance certainty and prevent misinterpretation (a bootstrap sketch follows below).
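The slide does not spell out how the stability intervals are computed; CWTS has described constructing them by bootstrapping, so here is a minimal sketch under that assumption (the function name, the 1,000 resamples, and the central-95% choice are all illustrative):

```python
# Bootstrap sketch of a stability interval for PP(top 10%): the share of a
# university's publications that are among the 10% most cited in their field.
# Resample the publication set with replacement, recompute the indicator,
# and report percentiles of the resulting distribution.
import random

def pp_top10_stability_interval(is_top10_flags, n_resamples=1000, seed=1):
    """is_top10_flags: one boolean per publication (True = in world top 10%)."""
    rng = random.Random(seed)
    n = len(is_top10_flags)
    scores = []
    for _ in range(n_resamples):
        sample = [rng.choice(is_top10_flags) for _ in range(n)]
        scores.append(100.0 * sum(sample) / n)  # PP(top 10%) of the resample
    scores.sort()
    # Central 95% of the bootstrap distribution as the stability interval.
    return scores[int(0.025 * n_resamples)], scores[int(0.975 * n_resamples)]
```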
Russian universities (impact)
RAS is not a university; it will be added in the next edition.
Russian universities (collaboration)
Impact Normalization (MNCS)
Dealing with field differences. We will look at the Mean Normalized Citation Score (MNCS). It is similar to PP(top 10%), but the MNCS is an average and therefore more sensitive to outliers.
Background and approach
Impact is measured by the number of citations received, excluding self-citations; fields differ in citing behavior, so a citation in one field is worth more than a citation in another; normalization: by journal category or by citing context (see the formula below).
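In formula form, the MNCS is usually defined as follows in the bibliometrics literature (the slide does not spell out the exact CWTS operationalization of fields and document types):

```latex
% Mean Normalized Citation Score for a set of n publications:
%   c_i = citations received by publication i (self-citations excluded)
%   e_i = expected citations: the mean number of citations of all
%         publications in the same field, year, and document type as i
\[
  \mathrm{MNCS} = \frac{1}{n} \sum_{i=1}^{n} \frac{c_i}{e_i}
\]
% MNCS = 1: the set is cited exactly at the world average of its fields.
```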
Issues related to journal category-based approach
Scope of category; Scope of journal.
Journal classification 'challenge' (scope of category), e.g. cardio research
Approach: source-normalized MNCS
Source normalization (a.k.a. citing-side normalization): no field classification system; citations are weighted differently depending on the number of references in the citing publication; hence, each publication has its own environment to be normalized by.
Source-normalized MNCS (cont’d)
Normalization based on citing context; normalization at the level of individual papers (e.g., paper X): the average number of references in papers citing X; only active references are considered: references in the period between publication and being cited, and references covered by WoS (see the sketch below).
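A minimal illustrative sketch of citing-side weighting under the assumptions above; the data structures and function names are hypothetical, and the exact CWTS operationalization (e.g., of "active" references) differs in detail:

```python
# Illustrative sketch of source (citing-side) normalization: each citation
# to a publication is weighted by 1 / (number of active references in the
# citing publication). All names and data structures here are hypothetical;
# the exact CWTS operationalization differs in detail.

def active_reference_count(citing_pub, cited_year, covered_journals):
    """Count 'active' references only: references published in the window
    between the cited publication's year and the citing publication's year,
    and appearing in journals covered by the database (WoS)."""
    return sum(
        1
        for ref in citing_pub["references"]
        if cited_year <= ref["year"] <= citing_pub["year"]
        and ref["journal"] in covered_journals
    )

def source_normalized_citations(pub, citing_pubs, covered_journals):
    """Weighted citation count of `pub`: each citing publication contributes
    1 / (its number of active references)."""
    total = 0.0
    for citing in citing_pubs:
        n_active = active_reference_count(citing, pub["year"], covered_journals)
        if n_active > 0:
            total += 1.0 / n_active
    return total
```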
Networks and visualization
Collaboration, connectedness, similarity, ...
VOSviewer: collaboration Lomonosov Moscow State University (MSU)
WoS ( ); top 50 most collaborative partners; co-published papers.
Other networks
Structure of science output (maps of science); oeuvres of actors; similarity of actors (benchmarks based on profile); …
Publication based classification
Structure of science independent from journal classification
Publication based classification (WoS 1993-2012)
Publication-based clustering (each publication in exactly one cluster); independent from journals; clusters based on citing relations between publications; three levels: top (21), intermediate (~800), bottom (~22,000); challenges: labeling and dynamics (a toy sketch follows below).
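CWTS clusters the full WoS citation network with its own large-scale methodology, which the slide does not detail. Purely to illustrate the idea that clusters follow citing relations rather than journal boundaries, here is a toy sketch using the Louvain community detection bundled with networkx (an assumed stand-in, not the CWTS algorithm; the citation pairs are invented):

```python
# Rough illustration of publication-based clustering on a citation network.
# CWTS uses its own large-scale methodology; this sketch uses the Louvain
# community detection shipped with networkx (>= 2.8) purely to show the idea:
# clusters are derived from citing relations, not from journal categories.
import networkx as nx

# Hypothetical citation pairs: (citing publication id, cited publication id).
citations = [("p1", "p2"), ("p1", "p3"), ("p2", "p3"),
             ("p4", "p5"), ("p5", "p6"), ("p4", "p6")]

# Treat citations as undirected links for clustering purposes.
graph = nx.Graph()
graph.add_edges_from(citations)

# Each publication ends up in exactly one cluster.
clusters = nx.community.louvain_communities(graph, seed=42)
for i, cluster in enumerate(clusters):
    print(f"cluster {i}: {sorted(cluster)}")
```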
Map of all sciences (784 fields, WoS 1993-2012)
Map legend: each circle represents a cluster of publications; colors indicate groups of fields/disciplines (social and health sciences; cognitive sciences; maths and computer sciences; biomedical sciences; physical sciences; earth, environmental and agricultural sciences); distance represents relatedness (citation traffic); circle surface represents volume.
Positioning of an actor in the map
Activity overall (world and, e.g., Lomonosov Moscow State University, MSU): proportion of MSU output relative to the world; activity per 'field' (world and MSU): proportion of MSU in the field; relative activity of MSU per 'field'; scores between 0 (blue) and 2 (red), with 1 (green) if the proportion in the field is the same as overall (sketch below).
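A minimal sketch of the relative-activity score described above; the field names and counts are invented for illustration:

```python
# Sketch of the relative activity score per field: the actor's share of world
# output in a field, divided by the actor's share of world output overall.
# A score of 1 means the actor is as active in the field as it is overall;
# the maps color scores between 0 (blue) and 2 (red), with 1 green.

def relative_activity(actor_by_field, world_by_field):
    actor_total = sum(actor_by_field.values())
    world_total = sum(world_by_field.values())
    overall_share = actor_total / world_total
    return {
        field: (actor_by_field.get(field, 0) / world_count) / overall_share
        for field, world_count in world_by_field.items()
    }

# Hypothetical counts: the actor holds 1% of world output overall, but 2%
# of the output in "physics", giving a relative activity of 2.0 there.
world = {"physics": 50_000, "biology": 100_000}
actor = {"physics": 1_000, "biology": 500}
print(relative_activity(actor, world))  # {'physics': 2.0, 'biology': 0.5}
```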
Positioning Lomonosov MSU
Positioning Russian Academy of Sciences (RAS)
Alternative view of Lomonosov MSU (density)
Using the map: benchmarks
Benchmarking on the basis of the research profile: the distribution of output over 784 fields; compute the profile of each university in the Leiden Ranking; compare to the MSU profile; identify the most similar (see the sketch below).
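The slide does not name the similarity measure used to compare profiles; cosine similarity over the 784-field output distributions is one plausible choice, used here purely as an illustration:

```python
# Illustrative benchmark: compare universities by the similarity of their
# research profiles (distribution of output over fields). The slides do not
# specify the measure; cosine similarity is assumed here as a plausible choice.
import math

def cosine_similarity(profile_a, profile_b):
    """Profiles are dicts mapping field id -> publication count."""
    fields = set(profile_a) | set(profile_b)
    dot = sum(profile_a.get(f, 0) * profile_b.get(f, 0) for f in fields)
    norm_a = math.sqrt(sum(v * v for v in profile_a.values()))
    norm_b = math.sqrt(sum(v * v for v in profile_b.values()))
    return dot / (norm_a * norm_b)

def most_similar(target_profile, candidate_profiles, top_n=6):
    """Rank candidate universities by profile similarity to the target."""
    ranked = sorted(candidate_profiles.items(),
                    key=lambda kv: cosine_similarity(target_profile, kv[1]),
                    reverse=True)
    return ranked[:top_n]
```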
Universities most similar to MSU (in the Leiden Ranking)
FR – University of Paris-Sud 11; RU – Saint Petersburg State University; JP – Nagoya University; FR – Joseph Fourier University; CN – Peking University; JP – University of Tokyo. For these universities we have cleaned data.
Density view MSU
Density view St. Petersburg State University
VOSviewer (Visualization of Similarities) http://www.vosviewer.com
Freely available application; software to create maps; input: publication data; output: maps based on similarities among publication elements: co-authors; co-occurring terms; co-cited articles; … (a minimal sketch of the similarity computation follows below).
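Van Eck and Waltman's VOSviewer papers describe normalizing co-occurrence counts with the association strength measure; a minimal sketch of that computation (the input format and example data are invented):

```python
# VOSviewer maps items (authors, terms, articles) by the similarity of their
# co-occurrences. Its documented normalization is the association strength:
#   sim(i, j) = c_ij / (s_i * s_j)
# with c_ij the co-occurrence count and s_i, s_j the items' total occurrences.
from collections import Counter
from itertools import combinations

def association_strengths(item_sets):
    """item_sets: e.g. one set of co-authors per publication."""
    occurrences = Counter()
    cooccurrences = Counter()
    for items in item_sets:
        occurrences.update(items)
        for i, j in combinations(sorted(items), 2):
            cooccurrences[(i, j)] += 1
    return {
        pair: count / (occurrences[pair[0]] * occurrences[pair[1]])
        for pair, count in cooccurrences.items()
    }

# Example: author co-occurrence across three (hypothetical) papers.
papers = [{"ivanov", "petrov"}, {"ivanov", "petrov", "smith"}, {"smith"}]
print(association_strengths(papers))
```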
More information: CWTS and methods
THANK YOU
Basic model in which we operate (research evaluation)
Research in context
Example (49 research communities of a Finnish university)
'Positive' effect vs. 'negative' effect
RC with a 'positive' effect
Most prominent field: impact increases.
RC with a 'negative' effect
Most prominent field: impact stays the same; less prominent field: impact decreases.
Wrap up: normalization
Normalization based on journal classification has its flaws; we have recently developed an alternative; test sets in recent projects show small (but relevant) differences.