Tools for Effective Evaluation of Science: InCites
David Horky, Country Manager – Central and Eastern Europe
david.horky@thomsonreuters.com
AGENDA
– Challenges of evaluating science and research
– Bibliometrics as a way of effective evaluation of science
– Normalised metrics
Who are the stakeholders in research evaluation?
– External entities: government agencies and funding organizations
– University management: committees, chancellor, provost
– Individuals: faculty, staff, students
– University departments: institutional research, academic affairs, tech transfer, etc.
Overview
What problems are people trying to solve?
– Aggregate, track, and analyze output
– Create systems to do this
– Develop reports
– Publicize their work
– Enable researchers to do deep analyses
What do people do today?
– Disparate approaches
– Ad hoc
Research Analytics: Source and Foundation
Our resources are based on the Web of Science, the gold standard for research evaluation used by universities and governments around the world. The Web of Science is the best resource available because it has:
– Authoritative and trustworthy content
– A consistent, well-understood collection with a 25+ year archive
– Multidisciplinary coverage, with no emphasis on any one subject area
Thomson Reuters presents not just citation and record counts, but the meaning behind the numbers. Global averages and percentiles enable our customers to benchmark, make comparisons, and ultimately make the right decisions.
[Diagram: Web of Science data flows through Thomson Reuters expertise and processing – address unification, data cleansing & standardization, normalization and baselines – into the Research Analytics resources.]
Web of Science®: the STANDARD for bibliometrics
– “Multidisciplinary” coverage – enables analysis of the whole context of scientific research
– “Multiyear” coverage – enables analysis of the history and development of the sciences
– “Cover to cover” policy – enables following the flow of a topic regardless of communication type
– “ALL authors, ALL addresses” – enables analysis by author name and by institution
– “ALL cited references” – enables analyses of literature that is not indexed
This CONSISTENCY enables large-scale counting and reliable analyses.
Major assessments rely on Web of Science data
– US National Science Foundation: Science & Engineering Indicators
– European Commission: European Union Science & Technology Indicators
– US National Research Council: Doctoral Program Ranking
– Governments in France, Australia, Italy, Japan, the UK, Portugal, Norway, Spain, Belgium, South Korea, Canada, etc. also use it to shape higher education policy.
The external vs. internal view of research
Research Performance Profiles (Indicators)
– Macro analysis
– Compare your institution / country to your peers
– All institutions within a region
– All nations/territories of the world
Global Comparisons (Citation Report)
– Customized data set
– Measure and compare the internal entities at your institution
– Article, author, department, and collaboration analysis
– Typically all the records from an institution
Research Performance Profiles
Research Performance Profiles: Institutional Comparison
Confidential – Thomson Reuters – Not for Redistribution
[Screenshot callouts:]
– Time trends
– Output vs. performance
– Where do we stand?
Research Performance Profiles: compare institutions in a particular field
What are the overall trends for our fields?
Global Comparisons
CHALLENGES OF RESEARCH EVALUATION USING CITATION METRICS
Citation behavior differs greatly between disciplines:
– Life sciences: many articles, highly cited, citations tail off quickly
– Mathematics: low citation counts, but articles continue to be cited for many years
How can I account for these differences?
Older material has had more time to accumulate citations, so how can I account for the different ages of the publications I wish to compare?
– How can I fairly compare the performance of a researcher with a long publication history to one with a short publication history?
Citation distribution is very uneven:
– Only a small number of publications are highly cited
– Most publications have few or no citations
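The unevenness described above is why simple averages can mislead. A minimal sketch with made-up citation counts (illustrative numbers, not Web of Science data) shows how a single highly cited paper drags the mean far above what a typical paper in the set achieves:

```python
import statistics

# Illustrative citation counts for a hypothetical field-year set
# (assumed numbers, not Web of Science data). Most papers collect
# few citations; one paper is very highly cited.
citations = [0, 0, 0, 1, 1, 2, 2, 3, 4, 5, 8, 12, 250]

mean = statistics.mean(citations)      # pulled up by the single outlier
median = statistics.median(citations)  # what a typical paper achieves

print(f"mean = {mean:.1f}, median = {median}")
```

Here the mean is around 22 while the median paper has 2 citations, which is why the slides below turn to baselines and percentiles rather than raw averages.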
Thomson Reuters value-added metrics
– Basic bibliographic information about the article (including its field)
– Number of citations
– The Journal Impact Factor from the latest edition of the Journal Citation Reports
Thomson Reuters value-added metrics
2nd-generation citation data: the articles that have cited the citing articles.
Thomson Reuters value-added metrics
Expected performance metrics: we calculate the number of citations a typical article would be expected to receive. This is calculated for each journal (Journal Expected Citations, JXC) and for each category (Category Expected Citations, CXC). These metrics are also normalized for year and document type.
Thomson Reuters value-added metrics
Although it is not displayed on this screen, we also calculate the ratio between actual and expected performance. This gives meaning to the raw citation counts and serves as a normalized performance measure.
JXC ratio = 157 / 45.09 = 3.48
CXC ratio = 157 / 3.66 = 42.90
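The ratio calculation above is a simple division of actual citations by the expected baseline. A small sketch using the figures from this slide, taking the JXC and CXC baselines as given:

```python
# Actual-vs-expected citation ratios from the slide. The JXC and CXC
# baselines (45.09 and 3.66) are taken as given; in InCites they come
# from year- and document-type-matched averages for the journal and
# category respectively.
actual = 157               # citations received by the paper
jxc_baseline = 45.09       # expected citations for this journal
cxc_baseline = 3.66        # expected citations for this category

jxc_ratio = actual / jxc_baseline   # ~3.5x the journal norm
cxc_ratio = actual / cxc_baseline   # ~43x the field norm

print(f"JXC ratio = {jxc_ratio:.2f}, CXC ratio = {cxc_ratio:.2f}")
```

A ratio above 1.0 means the paper outperforms its baseline; the gap between the two ratios shows how much the choice of baseline (journal vs. category) matters.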
Thomson Reuters value-added metrics
The percentile, computed against the set of documents in the same field and the same year. This paper is in the top 0.2% of all papers in “General & Internal Medicine” for the year 2007.
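A field- and year-matched percentile rank like the one above can be sketched as follows. The exact tie-handling InCites uses is not specified here, so the definition below (share of same-field, same-year papers with strictly fewer citations) is an assumption, and the data is illustrative:

```python
def citation_percentile(paper_citations, field_year_counts):
    """Percent of same-field, same-year papers with strictly fewer
    citations than this paper (assumed definition; InCites' exact
    tie-handling is not given on the slide)."""
    below = sum(1 for c in field_year_counts if c < paper_citations)
    return 100.0 * below / len(field_year_counts)

# Illustrative field-year citation counts (not real data):
field = [0, 1, 1, 2, 3, 3, 5, 8, 13, 400]

p = citation_percentile(400, field)
print(f"percentile = {p:.0f} -> top {100 - p:.0f}% of the field")
```

Because the comparison set is restricted to the same field and year, the skew and age effects from the earlier "challenges" slide are controlled for.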
Citing articles: additional information, with a link back to the Web of Science.
InCites: Global Comparisons
Many pre-defined reports are presented for immediate use.
– Citation metrics provide fundamental information on the papers within a dataset and their collective citation influence.
– Disciplinarity metrics: the Disciplinarity Index measures the concentration of a set of source articles across a set of categories; the Interdisciplinarity Index communicates the extent to which a collection of source articles is or is not diversely multidisciplinary in nature.
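The slides name the Disciplinarity Index but do not give its formula. One common way to measure concentration across categories is a Herfindahl-style sum of squared shares; the sketch below is an illustrative assumption, not the published InCites definition:

```python
def concentration_index(category_counts):
    """Herfindahl-style concentration: sum of squared shares of papers
    per category. 1.0 = all papers in one category; tends toward 1/N
    when output spreads evenly over N categories. Illustrative formula,
    not the InCites definition."""
    total = sum(category_counts)
    return sum((n / total) ** 2 for n in category_counts)

focused = concentration_index([95, 3, 2])        # highly disciplinary
spread = concentration_index([25, 25, 25, 25])   # evenly spread output
print(f"focused = {focused:.3f}, spread = {spread:.2f}")
```

Under this assumed measure, a high value indicates disciplinary focus and a low value indicates interdisciplinary spread, matching the direction the slide describes.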
InCites: Global Comparisons
Collaboration metrics provide fundamental information on the authors, institutions, and countries represented within the source-articles dataset.
View the data in time series to understand trends.
InCites: Global Comparisons
The Source Articles Listing and additional ranking reports enable detailed examination and evaluation of the papers, authors, institutions, etc. that produced the source articles.
InCites: Global Comparisons
This is a useful tool for making direct comparisons between entities such as two individual authors. The normalized data and multiple indicators make for a comprehensive and accurate evaluation, and the graphical summaries provide instant understanding.
Collaborations
Are these collaborators contributing to our performance? We can make direct comparisons at a glance.
InCites: Citation Report
The Citing Articles Listing and associated reports provide unique insight into the body of published papers, authors, institutions, and countries influenced by the source articles.
Summary
– Authoritative, consistent data from the world’s leading provider of research evaluation solutions.
– A tailored data set, plus the ability to create your own sub-sets and associated metrics, provides specificity: answers to questions at a local level.
– Context around the data, such as baselines and percentiles, gives the metrics genuine meaning and comparative value.
– Standard report tools generate consistent results and can be combined with your own datasets, such as funding data.
– A web-based tool, accessible by any number of selected users within your institution; centralized access ensures everyone gets the same data.
Thank you!
More info: http://in-cites.com/rsg/
David Horky, Country Manager – Central and Eastern Europe
david.horky@thomsonreuters.com