Relative Citation Rate (RCR): A new way to measure the influence of a publication on its field of research
CSR Advisory Council, October 20, 2014
George Santangelo, Ian Hutchins, Fai Chan
Office of Portfolio Analysis (OPA), Division of Program Coordination, Planning, and Strategic Initiatives (DPCPSI)

Office of Portfolio Analysis
Improving portfolio analysis at NIH:
Modeling NIH output
Tool development
Database management

Office of Portfolio Analysis
Limitations of current bibliometrics in detecting the impact/value of a publication
Commonly used metrics:
Impact Factor: journal-level metric
Publication Count: field-dependent, use-independent
Citation Rates: field- and journal-dependent
h-index: field-dependent, time-dependent
Relative Citation Rate (RCR):
Need: an article-level metric that is independent of field, journal, and time
Assumption: citation of a publication reveals its value to, or influence on, the citer
RCR normalizes citations to the publication's co-citation network

Office of Portfolio Analysis
RCR: How is the paper of interest cited relative to other papers in its co-citation network?
Data source: Thomson Reuters Science Citation Index Expanded
0 = never cited
1 = average
2 = twice the average
>20 = exceptionally highly cited
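The normalization described above can be sketched in a few lines of code. This is a hypothetical illustration of the idea only, not NIH's exact algorithm: an article's citation rate is divided by the expected citation rate of its field, where the field is defined by the article's co-citation network. The function names and the sample numbers are invented for demonstration.

```python
# Hypothetical sketch of the RCR idea (not the official NIH algorithm):
# an article's citation rate is benchmarked against the average rate
# of the papers in its co-citation network.

def citations_per_year(total_citations, years_since_publication):
    """Article-level citation rate."""
    return total_citations / max(years_since_publication, 1)

def relative_citation_rate(article_rate, cocitation_network_rates):
    """RCR ~ article citation rate / mean rate of its co-citation network.
    1 means cited at the network average; 2 means twice the average."""
    expected = sum(cocitation_network_rates) / len(cocitation_network_rates)
    return article_rate / expected

# Example: a paper cited 40 times over 5 years (8 citations/year),
# in a co-citation network averaging 4 citations/year, has RCR = 2.
rate = citations_per_year(40, 5)
rcr = relative_citation_rate(rate, [2.0, 4.0, 6.0])
print(rcr)  # 2.0
```

Because the benchmark comes from each paper's own co-citation network rather than from its journal, the resulting value is comparable across fields, which is the property the slide contrasts with the Impact Factor and h-index.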

Office of Portfolio Analysis
RCR validation study using subject matter experts
Is there a correlation between RCR and expert assessment of value/quality/impact?
684 papers published in 2009, supported by R01s and representing a range of RCRs
537 IRP investigators were recruited with the approval of their SDs
Papers were assigned based on the content of the reviewer's published work
3-5 PIs received the same set of 5 publications
Publications and responses were handled via a secure intranet site
6 questions were asked, each with a 5-point response scale
Response as of Oct 14: 832 responses to 235 papers, ≥2 responses/paper

Office of Portfolio Analysis
Fields of science represented in the RCR validation study:
Cell Biology, Genetics and Genomics, Neuroscience, Chromosome Biology, Developmental Biology, Immunology, Molecular Biology and Biochemistry, Cancer Biology, Stem Cell Research, Molecular Pharmacology, Systems Biology, Epidemiology, Clinical Research, Virology, Computational Biology, Biomedical Engineering & Biophysics, Chemical Biology, Microbiology and Infectious Diseases, Structural Biology, Health Disparities, Social and Behavioral Science

Office of Portfolio Analysis
Relative Citation Rate – Review Criteria
IMPORTANCE: Rate whether the question being addressed is important to answer. (1 = Not Important, 2 = Slightly Important, 3 = Important, 4 = Highly Important, 5 = Extremely Important)
METHODS: Rate whether you agree that the methods are appropriate and the scope of the experiments adequate. (1 = Strongly Disagree, 2 = Disagree, 3 = Neutral, 4 = Agree, 5 = Strongly Agree)
ROBUSTNESS: Rate how robust the study is based on the strength of the evidence presented. (1 = Not Robust, 2 = Slightly Robust, 3 = Moderately Robust, 4 = Highly Robust, 5 = Extremely Robust)
HUMAN HEALTH RELEVANCE: Rate the likelihood that the results could ultimately have a substantial positive impact on human health outcomes. (1 = Very Unlikely, 2 = Unlikely, 3 = Foreseeable but Uncertain, 4 = Probable, 5 = Almost Certain)
LIKELY IMPACT: Rate the impact that the research is likely to have or has already had. (1 = Minimal Impact, 2 = Some Impact, 3 = Moderate Impact, 4 = High Impact, 5 = Extremely High Impact)
OVERALL EVALUATION: Provide your overall evaluation of the value and impact of this publication. (1 = Minimal or No Value, 2 = Moderate Value, 3 = Average Value, 4 = High Value, 5 = Extremely High Value)

Office of Portfolio Analysis
Relative Citation Rate (RCR) validation study
[Figure: OVERALL EVALUATION — reviewer score plotted against Relative Citation Rate (log scale)]

Office of Portfolio Analysis
Relative Citation Rate (RCR) validation study
[Figure: LIKELY IMPACT — reviewer score plotted against Relative Citation Rate (log scale)]
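The kind of relationship these figures examine can be checked with a simple correlation between log-scaled RCR values and reviewer scores. The sketch below uses invented data purely for demonstration; the real study's values are not reproduced here.

```python
# Illustrative correlation check of the sort the validation study
# performs: do expert reviewer scores track log-scaled RCR values?
# All data below are synthetic, for demonstration only.
import math
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                    sum((y - my) ** 2 for y in ys))
    return num / den

rcr_values = [0.2, 0.5, 1.0, 2.0, 5.0, 12.0]      # hypothetical RCRs
reviewer_scores = [1.2, 2.0, 2.8, 3.5, 4.1, 4.8]  # hypothetical 1-5 scores
r = pearson([math.log(v) for v in rcr_values], reviewer_scores)
print(f"correlation: {r:.3f}")
```

Taking the logarithm of RCR before correlating mirrors the log-scaled x-axis in the figures: citation metrics are heavily right-skewed, so the log transform keeps a few very highly cited papers from dominating the fit.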

Office of Portfolio Analysis
[Figure: Relative importance on article value (random forest) — contribution of each individual review category to the overall score]
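A random-forest analysis like the one on this slide can be sketched as follows. This is a minimal, hypothetical reconstruction: the category names mirror the review criteria, but the reviewer responses are synthetic, and the weighting (overall score driven mostly by likely impact and importance) is an assumption made purely so the example has structure to recover.

```python
# Minimal sketch of estimating per-category contributions to an
# overall score with a random forest. Data are synthetic; the true
# weights below (0.5 / 0.3 / 0.1) are assumptions for illustration.
import random
from sklearn.ensemble import RandomForestRegressor

random.seed(0)
categories = ["importance", "methods", "robustness",
              "health_relevance", "likely_impact"]

X, y = [], []
for _ in range(300):
    row = [random.uniform(1, 5) for _ in categories]   # 1-5 scale ratings
    overall = (0.5 * row[4] + 0.3 * row[0] + 0.1 * row[1]
               + random.gauss(0, 0.2))                 # noisy overall score
    X.append(row)
    y.append(overall)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
for name, importance in zip(categories, model.feature_importances_):
    print(f"{name}: {importance:.2f}")
```

The forest's `feature_importances_` sum to 1 and rank the categories by how much each contributes to predicting the overall evaluation, which is the "contribution of each individual category" reading of the slide's chart.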

Office of Portfolio Analysis
Uses of RCR to compare impact
Validated use:
Compare individual publications within a network
Possible uses:
Compare the output of individual investigators
Compare the output of cohorts (portfolios, programs, mechanisms, ...)
Flag low-impact work for inspection
Follow trends
Possible misuses of RCR (or any bibliometric assessment):
Determining the importance of the endeavor
Predicting the long-term impact of the work
Evaluating the effectiveness of "downstream" application (e.g., patentable results)

Office of Portfolio Analysis Questions? Comments?