Using InCites to Evaluate Research Performance (Advanced)

Presentation transcript:

Using InCites to Evaluate Research Performance (Advanced)

Objectives: During this session we will explore scenarios and questions related to evaluating research performance at three levels:
– Individual researcher
– Academic department
– Institution
We will discuss each scenario/question and provide evidence-based responses using metrics and reports taken from the various modules of InCites. The aim of this workshop is to show, with practical examples, how InCites data can be applied to provide citation-based evidence to support research evaluation decisions. This is an interactive workshop: participants are encouraged to raise questions they have regarding research evaluation at their institutions, and the group will discuss how InCites can be used to provide a solution. Thomson Reuters will also present upcoming developments to the InCites platform, including improvements to optimise the integration of Research Analytics data. (Slides 55-63)

GUIDELINES FOR CITATION ANALYSIS
Compare like with like – the Golden Rule
Use relative measures, not just absolute counts
More applicable to the hard sciences than to the arts and humanities
Know your data parameters:
– journal categories
– author names
– author addresses
– time periods
– document types
Obtain multiple measures
Recognize the skewed nature of citation data
Ask whether the results are reasonable
Understand your data source
For further guidance on citation metrics, download the white papers at:
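
To make the "relative measures" point concrete, here is a minimal Python sketch of a field-normalized citation score: actual citations divided by the average citations of comparable papers (same category, year and document type). The categories and baseline values are invented for illustration and are not InCites figures.

```python
# Illustrative only: a category-normalized citation score, i.e. actual citations
# divided by the average citations of papers from the same category, year and
# document type. The baseline numbers below are invented for the example.
expected_cites = {
    # (category, publication year, document type) -> average citations
    ("Environmental Studies", 2012, "Article"): 6.4,
    ("Oncology", 2012, "Article"): 14.8,
}

def normalized_impact(citations, category, year, doc_type):
    """Return citations relative to the category/year/type baseline (1.0 = expected)."""
    baseline = expected_cites[(category, year, doc_type)]
    return citations / baseline

# A raw count of 10 citations means different things in different fields:
print(normalized_impact(10, "Environmental Studies", 2012, "Article"))  # ~1.56, above expected
print(normalized_impact(10, "Oncology", 2012, "Article"))               # ~0.68, below expected
```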

Know your dataset (1): Options for building a Research Performance Profile (RPP)
Address based
– Extraction of Web of Science records based on author affiliations; creates a snapshot in time of the work produced at your institution.
– Extraction of WoS records that contain at least one occurrence of the affiliation in the address field. All potential address variants are searched, with input from you.
– This is a straightforward, relatively fast way to create an RPP dataset, and by far the most common method.
Author based
– Reflects your internal groups (specific departments, schools, etc.).
– Data will include papers by current staff produced at prior institutions, if publication lists are provided.
– This does require effort on your part to provide information for each author.

Know your dataset (2): Address-based vs author-based RPP
Address-based dataset
+ Fast to build
+ Easy to maintain and to keep up to date
+ Dataset data (including metrics/percentiles) can be delivered through FTP or another file-exchange system
– Authors are not uniquely identified
– No differentiation between departments
– Cannot use the API to pull data (the API can only be used to pull the dataset's WoS data, such as abstracts and affiliations, without any InCites metrics)
Author-based dataset
+ Authors are uniquely identified
+ Departments are differentiated, so bibliometric information can be provided at department level
+ Dataset data (including metrics/percentiles) can be delivered through FTP or another file-exchange system, and the dataset (with InCites metrics) can also be pulled through an API
– More complex to build if repository data are not "clean"
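
Building an author-based dataset usually means reconciling the author name variants found on Web of Science records against an internal staff list. The sketch below illustrates that clean-up step in Python; the names, researcher IDs, departments and record IDs are all hypothetical.

```python
# A minimal sketch of the clean-up an author-based dataset needs: map the name
# variants found on Web of Science records to one internal researcher ID and
# department. All names, IDs and departments below are hypothetical.
variant_to_researcher = {
    "Smith, J.":    ("R0001", "Biological Sciences"),
    "Smith, J.A.":  ("R0001", "Biological Sciences"),
    "Smith, Jane":  ("R0001", "Biological Sciences"),
    "Kowalski, P.": ("R0002", "Computer Science"),
}

records = [
    {"ut": "WOS:000111", "authors": ["Smith, J.A.", "Kowalski, P."]},
    {"ut": "WOS:000222", "authors": ["Smith, Jane", "Lee, K."]},   # "Lee, K." not yet mapped
]

papers_by_department = {}
unmatched = set()
for rec in records:
    for name in rec["authors"]:
        if name in variant_to_researcher:
            _, dept = variant_to_researcher[name]
            papers_by_department.setdefault(dept, set()).add(rec["ut"])
        else:
            unmatched.add(name)   # names to resolve before the next data delivery

print(papers_by_department)       # papers grouped by internal department
print("Still to disambiguate:", unmatched)
```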

Individual Researcher Evaluation – what do you want to measure/analyse/identify?

Individual Researcher
1. How many papers have I published? (Slide 8)
2. What types of papers do I publish? (Slide 8)
3. Which is my most cited paper? (Slide 9)
4. In which journals do I publish? (Slide 10)
5. In which journals should I look to publish my research? (IF) (Slide 11)
6. Who do I collaborate with (other institutions) and which are the best-performing collaborations? (Slides 12, 13 & 14)
7. Who do I collaborate with (within my organisation) and with which co-authors does the research perform best? (Slides 12, 13 & 14)
8. Who is funding my research? (Slide 15)
9. How can I be more successful at procuring funding? (Slide 16)
10. Do I have papers which have an impact above the journal average? (Slide 16)
11. Do I have papers which have an impact above the field average? (Slide 16)
12. How many papers do I have in the top 1%, 5% or top 10% of their field? (Slide 17)
13. Can you think of any other author-related questions/topics?

Author Profile – Author Publication Report (Slide 8)
1. How many papers have I published?
2. What types of documents do I publish?
4. In which journals do I publish?

Author – Source Articles Listing (Slide 9)
3. Which is my highest cited paper?

Journal Ranking (Slide 10)
4. In which journals do I publish? How does the performance compare to similar research?

5. In which journals should I publish? (Slide 11)
The papers in this journal have a below-expected impact. The journal ranks in the 2nd quartile of its category in the JCR. The author might want to publish in journals that are in the 1st quartile.
Journals in the 1st quartile for Environmental Studies – 2012 JCR.

Collaborating Authors – List Report
Who do I collaborate with (within my organisation) and which are the best-performing collaborations?
6. Who do I collaborate with (internal and external) and which are the best-performing collaborations?

Collaborating Authors – Ego Network Report
Who do I collaborate with (within my organisation) and which is the best-performing collaboration?

Collaborating Authors – Ego Network Report
Who do I collaborate with (internal and external) and which are the best-performing collaborations?

Funding Agency Listing (Slide 15)
8. Which funding bodies are funding my research? Which funding agency occurs most frequently? With which agency does the research provide the greatest return on investment? (Order by average impact, or view the Source Articles Listing for paper-level metrics.)
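
The same ranking can be reproduced on any paper-level export: group papers by the funding agencies they acknowledge, then compare frequency and average normalized impact per agency. A rough Python sketch with invented data:

```python
# Sketch: rank funding agencies by how often they appear and by the average
# normalized impact of the papers acknowledging them. Data are invented.
papers = [
    {"funders": ["NSERC"], "norm_impact": 1.8},
    {"funders": ["NSERC", "Genome Canada"], "norm_impact": 0.9},
    {"funders": ["Genome Canada"], "norm_impact": 2.4},
]

by_funder = {}
for p in papers:
    for f in p["funders"]:
        by_funder.setdefault(f, []).append(p["norm_impact"])

for funder, impacts in sorted(by_funder.items(), key=lambda kv: -len(kv[1])):
    avg = sum(impacts) / len(impacts)
    print(f"{funder}: {len(impacts)} papers, average normalized impact {avg:.2f}")
```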

9. How can I be more successful at procuring funding? (Slide 16)
Provide evidence that your research has an impact above what is expected in the journals and categories where you publish.
10. Do I have papers which have an impact above the journal expected citations?
11. Do I have papers which have an impact above the category expected citations?

Summary Metrics
How many papers do I have in the top 1%, 5% and top 10% of the categories where I publish?
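
Counting papers in the top 1%, 5% and 10% only requires a citation percentile per paper (in InCites, a lower percentile means a more highly cited paper). A minimal sketch of the counting step, with invented percentiles:

```python
# Sketch: count papers falling in the top 1%, 5% and 10% of their field, given a
# citation percentile per paper (where 1 means more cited than 99% of comparable
# papers). The percentiles below are invented for the example.
paper_percentiles = [0.4, 3.2, 7.5, 12.0, 48.0, 55.3, 81.0]

for threshold in (1, 5, 10):
    count = sum(1 for p in paper_percentiles if p <= threshold)
    print(f"Papers in top {threshold}%: {count}")
```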

Academic Department – what do you want to measure/analyse/evaluate?

Academic Department (author dataset; focus on Biological Sciences at Simon Fraser University)
1. What is the total output of this department? (Slides 20 & 21)
2. What types of documents do we produce? (Slide 22)
3. In which journals do we publish? (Slide 23)
4. Which are our highest cited papers? (Slide 24)
5. Which papers have exceeded the journal average impact? (Slide 24)
6. Which papers have exceeded the category average impact? (Slide 24)
7. What percentage of our research is uncited? (Slide 25)
8. Who are our top-producing authors? (Slide 26)
9. Which of our authors have a better than expected performance in the journals and categories where they publish? (Slide 27)
10. Which authors could be mentors to our faculty members? (Slide 28)
11. Which institutions do we collaborate with? (Slide 29)
12. Which are the best-performing collaborations (impact in field, using percentiles)? (Slide 30)
13. Are there collaborations which are not providing a return on investment? (Slide 31)
14. Who are potential new collaborators? (Citing Articles) (Slides 32 & 33)
15. Which funding agencies are funding our research? (Slide 34)
16. Has our research impact changed over time? Is it better in recent years? (Slide 35)
17. Has our department grown in size? (Slide 36)
18. Has our output increased over time and how does this compare to the total output of the university? (Slide 37)
19. How does our performance compare to other departments? (Slide 38)
20. Can you think of any other department-related topics/questions?

1. What is the total output of the department? (Slide 20)

1. What is the total output of the department? (Slide 21)
Document type breakdown

2. What types of documents do we produce? (Slide 22)
Measure the performance of the document types.

Journal Ranking
In which journals do we publish our research? How does the impact compare to similar research?

Department Source Article Listing
4. Which are the highest cited papers for the department?
5. Which papers have exceeded the journal expected impact?
6. Which papers have exceeded the category expected impact?

Department Summary Metrics
What percentage of our research is uncited?
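
The uncited share is simply the fraction of papers with zero citations; a short check on an exported list of citation counts (the counts here are invented):

```python
# Sketch: percentage of a department's output that is still uncited.
citation_counts = [0, 0, 3, 12, 0, 7, 1, 0, 25]          # invented per-paper counts
uncited_share = sum(1 for c in citation_counts if c == 0) / len(citation_counts)
print(f"% documents uncited: {uncited_share:.1%}")        # 44.4% in this toy example
```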

Department Author Ranking
Who are our top-producing authors?

Department Author Ranking
Which authors have a better than expected performance in the journals and categories where they publish?

10. Which authors could be mentors to our faculty members? (Slide 28)
List the authors in the department who have published a minimum number of papers and identify those with a performance above the journal/category expected impact.
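
One way to express that rule of thumb: keep authors who clear a minimum output threshold and whose average actual/expected ratios exceed 1.0 for both the journal and the category baseline. A sketch with invented author data:

```python
# Sketch: flag potential mentors as authors with at least `min_papers` publications
# whose average impact exceeds both the journal- and category-expected baselines
# (ratios above 1.0). Author names and figures are invented.
authors = [
    {"name": "Author A", "papers": 42, "journal_ratio": 1.6, "category_ratio": 1.9},
    {"name": "Author B", "papers": 7,  "journal_ratio": 2.3, "category_ratio": 2.1},
    {"name": "Author C", "papers": 30, "journal_ratio": 0.8, "category_ratio": 1.1},
]

min_papers = 20
mentors = [a["name"] for a in authors
           if a["papers"] >= min_papers
           and a["journal_ratio"] > 1.0
           and a["category_ratio"] > 1.0]
print(mentors)  # ['Author A']: B is impactful but below the output threshold
```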

Department – Institutional Collaborations
Which institutions do our authors collaborate with?

Department – Institutional Collaborations
Which are the best-performing collaborations?

Department – Institutional Collaborations
Are there collaborations which are not providing a return on investment?
Note: the time period must be taken into consideration. Investigate these papers further with the Source Articles Listing.
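
A rough way to apply that note programmatically is to exclude papers that are too recent to have accrued citations, then flag collaborating institutions whose remaining joint papers average below the expected impact. Illustrative only; the institutions, years and cut-off are invented:

```python
# Sketch: flag collaborating institutions whose joint papers sit below the expected
# impact, excluding very recent papers that have not had time to gather citations.
from statistics import mean

joint_papers = [  # invented data: (collaborating institution, year, normalized impact)
    ("Univ X", 2009, 0.4), ("Univ X", 2011, 0.6),
    ("Univ Y", 2010, 1.7), ("Univ Y", 2013, 0.2),   # the 2013 paper is too recent to judge
]

cutoff_year = 2012                      # only judge papers with a citation window
by_institution = {}
for inst, year, impact in joint_papers:
    if year <= cutoff_year:
        by_institution.setdefault(inst, []).append(impact)

low_roi = {inst: mean(vals) for inst, vals in by_institution.items() if mean(vals) < 1.0}
print(low_roi)  # {'Univ X': 0.5}: Univ Y's older paper is above expected
```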

Citing Article Set – Institution Ranking
Who are potential new collaborators?
Authors from Simon Fraser Biological Sciences have collaborated with Univ Sussex on one paper, while authors from Univ Sussex have cited 18 papers from the Biological Sciences Department: this could be a potential new collaboration.

Source Article Listing – Citing Article Dataset (Slide 33)
These are the 18 papers that cited publications from the Biological Sciences Department at Simon Fraser University.
Which papers were influential to authors at Univ Sussex? How influential are the 2nd-generation citing papers?

Department Funding Agency Listing
Which funding bodies are funding our research?

Department – Trended Graph
Has our research impact changed over time? Is it better in more recent years?
Publications from 2002 to 2007 are slightly under or slightly above the expected impact at the journal level; publications from 2008 onwards have an improved impact relative to the journal.
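
The same trend can be recomputed from a paper-level export by averaging the journal actual/expected ratio per publication year. A sketch with invented numbers:

```python
# Sketch: trend the journal-relative impact (actual cites / journal-expected cites)
# by publication year, to see whether recent papers outperform the journals they
# appear in. All numbers are invented.
from collections import defaultdict
from statistics import mean

papers = [  # (publication year, actual citations, journal-expected citations)
    (2005, 4, 6.0), (2006, 5, 5.5), (2007, 6, 6.2),
    (2008, 12, 7.0), (2009, 15, 7.5), (2010, 11, 6.8),
]

by_year = defaultdict(list)
for year, actual, expected in papers:
    by_year[year].append(actual / expected)

for year in sorted(by_year):
    print(year, round(mean(by_year[year]), 2))   # values drift above 1.0 from 2008 on
```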

Department Size (Slide 36)
Has our department grown in size (defined by number of authors)? Run this report periodically.

Department – Trended Output
Has our output increased over time, and how does this compare to the total output of the university?
Comparison: Department of Biological Sciences vs Simon Fraser University.

Department Comparison
How does our performance compare to other departments?
A comparison of the Summary Metrics reports for Biological Sciences and Computer Science.

University – what do we want to measure/analyse/evaluate?

University
1. How does our productivity compare to institution x? (Slide 41)
2. Has our citation impact changed over time and how does that compare to university x? (Slide 42)
3. What effect do international collaborations have on our impact? (Slide 43)
4. Which are our field strengths? How does that compare to university x? (Slide 44)
5. Has our research focus changed over time? How does that compare to university x? (Slide 45)
6. In comparison to other institutions within our country, how are we performing in field x? (Slide 46)
7. How can we identify top researchers for recruitment? (Slides 47 & 48)
8. How does our research reputation compare to university x? (Slide 49)
9. How does our institutional income compare to university x? (Slide 50)
10. How does our teaching performance compare to university x? (Slide 51)
11. Which research areas need more support? (Slide 52)
12. Which metrics can support promotion/tenure decisions? (Slide 53)
13. Which metrics/reports are best to provide evidence that a research strategy has been successful over time? (Slide 54) Example: has a recruitment drive in year x provided a return on investment?

Institutional Comparisons (Slide 41)
1. How does our productivity compare to other institutions?

Institutional Comparisons
Has our citation impact (average cites) changed over time, and how does that compare to other universities?

Institutional Comparisons – ESI Disciplines
What effect do international collaborations have on our impact?

4. Which are our field strengths? How does this compare to university x? (Slide 44)
University of Glasgow vs University of Edinburgh

Institutional Comparisons – % of papers in institution for ESI Disciplines
Has our research focus changed over time? How does that compare to university x?

Institutional Comparisons – Comparison of UK institutions in UoA 2014 Clinical Medicine
In comparison to other UK institutions, how are we performing in 'Clinical Medicine'?
All UK institutions in Clinical Medicine, ordered by Impact Relative to Subject Area.

7. How can we identify top researchers for recruitment? (Slide 47)
Institutions may have various strategies for identifying new researchers to recruit. The following example uses data in InCites.
The process:
a) Identify international or domestic collaborations (where do you want to recruit from?)
b) Identify the top collaborating institutions within that region
c) Drill down to the authors providing the most impactful research (that contributes to your impact)

(Slide 48)
a) Look at which countries your researchers most frequently collaborate with and select one.
b) Here I have selected 'England' and examined the institutions by average impact.
c) I then selected Univ London Imperial Coll, since this collaboration has the highest average impact, and viewed the authors from just this institution.
The report provides a list of authors who may be potential new recruits. They have collaborated with authors at Simon Fraser, and by using the Average Percentile or other normalised metrics you can identify the most impactful research by the collaborating authors. These authors could be potential recruits, since the collaboration has proven valuable to Simon Fraser's research impact.
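
The drill-down described above can also be run on a paper-level export: filter to joint papers with the chosen institution and rank its co-authors by average citation percentile (lower is better). The institution names, authors and percentiles below are invented:

```python
# Sketch of the drill-down described above: restrict to joint papers with one
# collaborating institution and rank its co-authors by average citation percentile
# (lower percentile = more highly cited). Institutions, names and data are invented.
from collections import defaultdict
from statistics import mean

joint_papers = [
    {"institution": "Univ London Imperial Coll", "external_authors": ["Author P"], "percentile": 2.5},
    {"institution": "Univ London Imperial Coll", "external_authors": ["Author P", "Author Q"], "percentile": 9.0},
    {"institution": "Univ Oxford", "external_authors": ["Author R"], "percentile": 35.0},
]

target = "Univ London Imperial Coll"
percentiles_by_author = defaultdict(list)
for p in joint_papers:
    if p["institution"] == target:
        for author in p["external_authors"]:
            percentiles_by_author[author].append(p["percentile"])

ranking = sorted(percentiles_by_author.items(), key=lambda kv: mean(kv[1]))
for author, percs in ranking:
    print(author, round(mean(percs), 1))   # potential recruits, most impactful first
```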

8. How does our research reputation compare to university x? (Slide 49)

Institutional Profiles – Finance Indicators
How does our institutional income compare to university x?

10. How does our teaching performance compare to university x? (Slide 51)

11. Which research areas need more support? (Slide 52)
Web of Science categories that have an impact below what is expected (Impact Relative to Subject Area value below 1).
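
Identifying those areas amounts to filtering the category-level report for Impact Relative to Subject Area values below 1. A minimal sketch with invented values:

```python
# Sketch: list Web of Science categories whose Impact Relative to Subject Area
# is below 1 (i.e. below the world baseline for that category). Values invented.
category_impact = {
    "Ecology": 1.45,
    "Biochemistry & Molecular Biology": 0.82,
    "Plant Sciences": 0.67,
    "Zoology": 1.10,
}

needs_support = sorted((v, k) for k, v in category_impact.items() if v < 1.0)
for value, category in needs_support:
    print(f"{category}: {value}")   # lowest-performing categories first
```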

12. Which reports/metrics can support promotion/tenure decisions? (Slide 53)
Reports:
– Source Articles Listing (individual author)
– Summary Metrics (individual author)
– Collaborating Institutions (individual author)
– Funding Ranking (individual author)
– Trended Report (individual author)
– Can you think of any others?
Metrics:
– Journal Actual/Journal Expected
– Category Actual/Category Expected
– Percentile in Field
– % Documents Uncited

13. Which metrics/reports are best to provide evidence that a research strategy has been successful? (Slide 54)
E.g. has a recruitment drive (in a specific area of research) in year x provided a return on investment?
– Impact Relative to Subject Area (overall or trended)
– % documents in institution (overall or trended)
– % documents cited (overall or trended)
– % documents cited relative to subject area (overall or trended)
– Aggregate Performance Indicator
– Can you think of any other reports/metrics?

THE NEXT GENERATION OF INCITES (Slide 55)
InCites: a single destination for all research assessment and evaluation needs, bringing together tools, content and services:
Global Comparisons, Research In View, Institutional Profiles, Research Profiles, Journal Citation Reports, Essential Science Indicators, Semi-Custom Analytics, Custom Analytics, Syndicated Analytics, Custom Data Sets
Web based, cloud & API, mobile access

THE NEXT GENERATION INCITES (Slide 56)
– Simplified interface for at-a-glance information from your desktop or mobile device
– Compelling, customizable visualizations with links to the underlying data
– Compare institutions and individuals anytime, anywhere
– Access both summary-level and detailed faculty profiles and benchmarks to make informed staffing decisions

(Slides 57-63: no transcript text)

Thank You!