RESEARCH EVALUATION WORKSHOP UNITED KINGDOM OCTOBER 2010
WHY EVALUATE RESEARCH PERFORMANCE?
– Quantitative analysis is the main tool of science
– Communicating research results is complex
– Personal knowledge is no longer sufficient for decision making
– Need to be selective over which research projects to support
– Peer review was the foundation of policy decisions:
  – Library collection decisions
  – Foundations allocating limited funding
  – Government offices weighing national research needs
WHY EVALUATE RESEARCH PERFORMANCE?
– Evaluation and strategic planning
  – Periodic evaluation of research performance
  – Institution-, departmental- or researcher-level assessments
– Accreditation, tenure, faculty review
– Performance indicators
  – Used in strategic planning
  – Reporting to government bodies, boards of directors/trustees
– Research centers
  – Find new staff
  – Develop lines of investigation
  – Compete for funds
HOW IS RESEARCH EVALUATED?
– Research: volume, income, reputation
– Prestigious awards: Nobel Prizes
– Innovation: industry income and patents
– Teaching: Academic Reputation Survey, higher degrees
– International mix: national/international staff and students
– Citation analysis: normalised for volume and subject area (see the sketch below)
– Peer evaluation: reputational survey
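Normalisation is the key technical step in citation analysis: raw citation counts are not comparable across fields or publication years, so each paper's count is divided by the average for its field and year. Below is a minimal Python sketch with invented paper records; real evaluation agencies use more elaborate baselines (document type, percentile approaches).

    # Field- and year-normalised citation impact (toy sketch).
    from collections import defaultdict

    # Hypothetical records: (field, year, citations) for each paper.
    papers = [
        ("chemistry", 2008, 12),
        ("chemistry", 2008, 3),
        ("sociology", 2008, 4),
        ("sociology", 2008, 2),
    ]

    # Baseline: average citations per (field, year).
    totals = defaultdict(lambda: [0, 0])        # (field, year) -> [sum, count]
    for field, year, cites in papers:
        totals[(field, year)][0] += cites
        totals[(field, year)][1] += 1
    baseline = {k: s / n for k, (s, n) in totals.items()}

    # Normalised impact = citations / field-year average;
    # 1.0 means cited exactly at the average for that field and year.
    for field, year, cites in papers:
        print(field, year, round(cites / baseline[(field, year)], 2))

With these toy numbers, a chemistry paper with 12 citations and a sociology paper with 4 both score above 1.0, which is exactly the point of normalising by subject area.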
HOW IS RESEARCH EVALUATED? SOUTH AFRICA
THE GROWING USE OF BIBLIOMETRICS
– Nations with significant science enterprises have embraced bibliometrics
– Today, bibliometrics programs with large teams of analysts are firmly established in many nations
  – These groups issue bibliometric reports, often called science indicators studies, at regular intervals
– In almost all cases, the publication and citation data of Thomson Reuters form the basis of their bibliometric analyses
INSTITUTIONS USING WEB OF SCIENCE CITATION DATA FOR EVALUATION (SELECTED)
– United Kingdom: KCL, HEFCE, St Andrews
– Germany: IFQ, Max Planck Society, DKFZ, MDC
– Netherlands: NWO & KNAW
– France: Min. de la Recherche, OST (Paris), CNRS
– European Union: EC's DG XII (Research Directorate)
– US: NSF's biennial Science & Engineering Indicators report (since 1974)
– Canada: NSERC, FRSQ (Quebec), Alberta Research Council
– Australia: Australian Academy of Science, government laboratory CSIRO
– Japan: Ministry of Education; Ministry of Economy, Trade & Industry
– People's Republic of China: Chinese Academy of Sciences
– Multiple rankings agencies
THE DATA
A BRIEF HISTORY OF THE CITATION INDEX
– Concept first developed by Dr Eugene Garfield (Science, 1955)
– The Science Citation Index (1963)
  – SCI in print (1960s)
  – Online as SciSearch in the 1970s
  – CD-ROM in the 1980s
  – Web interface, Web of Science (1997)
– Content enhanced:
  – Social Sciences Citation Index (SSCI)
  – Arts & Humanities Citation Index (AHCI)
– The citation index was primarily developed for information retrieval
  – The development of electronic media and powerful search tools has increased its use and popularity for research evaluation
THE VALUE OF A CITATION
Why do people cite?
– To pay homage / give credit to pioneers
– To identify a methodology
– To provide background reading
– To quote
– To authenticate data, reproduce work, etc.
– To publish corrections
– To criticise or disclaim someone's work or opinions
Citations are an indicator of an article's impact and usefulness to the research community; they are the mode by which peers acknowledge each other's research.
The value of a citation is only as important as its source.
– A citation from a prestigious peer-reviewed journal clearly carries more weight than a citation from non-scholarly material.
– How can you be sure that the citing source is reputable?
Garfield, E., "When to Cite," Library Quarterly, v. 66, pp. 449-458, 1996.
WHY NOT INDEX ALL JOURNALS?
– 40% of the journals account for 80% of the publications and 92% of cited papers
– 4% of the journals account for 30% of the publications and 51% of cited papers
(A sketch of the underlying computation follows below.)
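Concentration figures like these come from ranking journals by output and accumulating their shares. A small Python sketch of that computation, using an invented toy distribution rather than real Web of Science data:

    # What fraction of journals accounts for a given share of papers?
    def journal_share_for_coverage(paper_counts, coverage=0.80):
        counts = sorted(paper_counts, reverse=True)   # most productive first
        target = coverage * sum(counts)
        running = 0
        for i, c in enumerate(counts, start=1):
            running += c
            if running >= target:
                return i / len(counts)                # fraction of journals needed
        return 1.0

    # Skewed toy distribution: a few big journals, a long tail of small ones.
    toy = [900, 700, 500, 300] + [5] * 96
    print(journal_share_for_coverage(toy))            # -> 0.04 for this toy data

With this toy distribution, 4% of the journals already cover 80% of the papers; it is this real-world skew that makes selective indexing defensible.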
HOW DO WE DECIDE WHICH JOURNALS TO INDEX?
– Approximately 2,500 journals are evaluated annually
  – 10-12% are accepted
– Thomson Reuters editors are:
  – Information professionals
  – Librarians
  – Experts in the literature of their subject area
(Diagram: journals under evaluation are filtered by journal 'quality' into Web of Science.)
THOMSON REUTERS JOURNAL SELECTION POLICY
– Publishing standards: peer review, editorial conventions
– Editorial content: addition to knowledge in a specific subject field
– Diversity: international and regional influence of authors, editors, advisors
– Citation analysis: editors' and authors' prior work
GLOBAL RESEARCH REPRESENTATION: WEB OF SCIENCE COVERAGE

Region               # Journals   Share
Europe                    5,573     49%
North America             4,251     38%
Asia-Pacific                965      9%
Latin America               272      2%
Middle East/Africa          200      1%

Language             # Journals   Share
English                   9,114     81%
Other                     2,147     19%
CONSISTENCY IS THE KEY TO VALIDITY: COMPARE APPLES WITH APPLES
– Full range of scholarly research disciplines
– Adheres to a consistent selection policy
  – Ensures that publications and citations are comparable
– Consistent indexing
  – Cover-to-cover indexing
  – All author names
  – All author addresses
PRIMARY USERS OF CITATION DATA IN RESEARCH EVALUATION
– External entities: government agencies and funding organizations
– University management: committees, provost, vice provosts
– University departments: institutional research, academic affairs, tech transfer, etc.
– Individuals: faculty, staff, students
EXTERNAL ENTITIES: RESEARCH EVALUATION
Government agencies and funding organizations:
– Higher Education Funding Council for England, UK
– National Science Foundation, USA
– European Commission, EU
– L'Observatoire des Sciences et des Techniques (OST), France
– National Institute for Science and Technology Policy (NISTEP), Japan
– Human Sciences Research Council, South Africa
UNIVERSITY MANAGEMENT: RESEARCH EVALUATION
Management, including committees, provost, vice provosts
INSTITUTIONAL LEVEL RESEARCH EVALUATION
(Chart: number of citations to North American scientific papers. Source: Thomson Reuters U.S. and Canadian University Science Indicators.)
NUMBER OF RESEARCHERS BY DEPARTMENT AND ROLE
PRODUCTIVITY BY UNIVERSITY DEPARTMENT
Allows users to analyse output and performance broken down by their institution's departments (see the sketch below).
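A minimal sketch of the aggregation behind such a department report, assuming publication records already carry a department tag (all names and numbers here are invented):

    # Papers and average citations per department (toy sketch).
    from collections import Counter, defaultdict

    records = [
        {"dept": "Chemistry", "cites": 14},
        {"dept": "Chemistry", "cites": 2},
        {"dept": "Physics",   "cites": 7},
    ]

    paper_counts = Counter(r["dept"] for r in records)
    cite_totals = defaultdict(int)
    for r in records:
        cite_totals[r["dept"]] += r["cites"]

    for dept, n in paper_counts.items():
        print(f"{dept}: {n} papers, {cite_totals[dept] / n:.1f} cites/paper")

In practice the hard part is not the arithmetic but mapping author addresses to departments consistently, which is why consistent indexing of all author addresses matters.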
SUBJECT CATEGORY LEVEL RESEARCH EVALUATION
University departments: institutional research, academic affairs, tech transfer, etc.
DOCUMENT TYPE BY DEPARTMENT
IN WHICH JOURNALS HAS OUR CHEMISTRY DEPT. PUBLISHED?
HOW HAVE THOSE PAPERS PERFORMED?
HOW MANY CITATIONS HAS THE CHEMISTRY DEPT. RECEIVED?
IN WHICH JOURNALS WERE THE CITING PAPERS PUBLISHED?
INDIVIDUAL LEVEL RESEARCH EVALUATION
Individuals: faculty, staff, students
EVALUATING INDIVIDUALS
Nancy J. Rothwell, DBE, FRS
– President & Vice-Chancellor, University of Manchester
– Dame Commander of the Order of the British Empire
– Fellow of the Royal Society
– Research Chair, Medical Research Council
EVALUATING INDIVIDUALS
– Number of articles: 485
– Sum of the times cited: 18,943
– Average citations per item: 39.06
– h-index: 70
(See the sketch below for how these metrics are computed.)
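The h-index and the averages above are simple functions of the per-article citation counts. A Python sketch with toy numbers (not Prof. Rothwell's actual record):

    # h-index: the largest h such that h articles have at least h citations each.
    def h_index(citations):
        cites = sorted(citations, reverse=True)
        h = 0
        while h < len(cites) and cites[h] >= h + 1:
            h += 1
        return h

    counts = [70, 55, 41, 12, 3, 0]                  # toy per-article citation counts
    print("articles:", len(counts))                  # 6
    print("sum of times cited:", sum(counts))        # 181
    print("average citations/item:", round(sum(counts) / len(counts), 2))
    print("h-index:", h_index(counts))               # 4: four articles with >= 4 cites

Note how the h-index is insensitive to the most-cited outliers: replacing the 70 with 700 would change the sum and the average but leave the h-index at 4.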
WHO CITED THIS AUTHOR'S RESEARCH?
A very international profile, illustrating the global impact of Prof. Rothwell's research.
EVALUATING INDIVIDUALS: INDIRECT INFLUENCE
CONSISTENCY IS THE KEY TO VALIDITY
– Full range of scholarly research disciplines
– Adheres to a consistent selection policy
  – Ensures that publications and citations are comparable
– Consistent indexing
  – Cover-to-cover indexing
  – All author names
  – All author addresses