Presentation transcript:

1 Study on Direct Ranking of Applications: Advantages & Limitations
Amy Rubinstein, Ph.D., Scientific Review Officer
Adrian Vancea, Ph.D., Program Analyst
Office of Planning, Analysis and Evaluation

2 Current System for Evaluating and Ranking Applications Reviewed in CSR Study Sections
Applications are assigned to 3 reviewers, who provide preliminary impact scores (1-9) and critiques.
After panel discussion of each of the top 50% of applications, all panel members vote on a final overall impact score.
Each application's score is the average of all panel members' votes multiplied by 10, resulting in final scores of 10-90 (see the sketch below).
R01 applications are assigned a percentile based on the scores of applications reviewed in the relevant study section in that round and the previous 2 rounds.
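To make the arithmetic above concrete, here is a minimal sketch in Python; the function name and the example votes are illustrative assumptions, not taken from the slides.

```python
# Minimal sketch of the scoring arithmetic: final score = mean vote x 10.

def final_impact_score(votes):
    """Average the panel's 1-9 votes and multiply by 10, yielding a
    final overall impact score between 10 and 90."""
    return round(10 * sum(votes) / len(votes))

# Hypothetical example: five votes of 2, 2, 3, 3, 4 average to 2.8,
# giving a final overall impact score of 28.
print(final_impact_score([2, 2, 3, 3, 4]))  # 28
```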

3 Why Consider Direct Ranking?
The number of applications reviewed by NIH is at or near historic highs, and award rates are at historic lows.
Using raw scores, it can be difficult to differentiate among the top 1-20% of applications reviewed in study sections.
Concern about an application's funding potential results in compression of scores in the 1-3 range (final scores between 10 and 30).
The current system of percentiles is used to rank applications reviewed in different study sections; however, score compression leaves many applications with the same percentile, making funding decisions more difficult.

4 Percentile Base Report (Council Date: 2015/01; IC: CSR)
Percentile: score (– where no value is available):
1%: –    11%: 25   21%: 33
2%: –    12%: 26   22%: 34
3%: –    13%: 27   24%: 35
4%: –    14%: 28   25%: 36
6%: 20   16%: 29   27%: 37
8%: –    18%: 30   29%: 38
9%: 23   19%: 31   31%: 39
10%: 24  20%: 32   33%: 40
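One plausible way to use such a base table is as a score-to-percentile lookup. The sketch below is an assumption about that use, built only from the pairs recoverable above; it is not the official NIH conversion procedure.

```python
# Hypothetical score-to-percentile lookup from the base table above.
score_to_percentile = {
    20: 6, 23: 9, 24: 10, 25: 11, 26: 12, 27: 13, 28: 14, 29: 16,
    30: 18, 31: 19, 32: 20, 33: 21, 34: 22, 35: 24, 36: 25, 37: 27,
    38: 29, 39: 31, 40: 33,
}

def percentile_for(score):
    # Assumption: a score falling between listed entries takes the
    # percentile of the next listed (higher, i.e. worse) score.
    eligible = [s for s in score_to_percentile if s >= score]
    return score_to_percentile[min(eligible)] if eligible else None

print(percentile_for(27))  # 13, i.e. the 13th percentile
```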

5

6 Potential Advantages of a Rank Order Method
Reviewers would not be forced to give applications higher (worse) overall impact scores than they think the applications deserve.
Reviewers would be required to distinguish between applications of similar quality and to separate the very best from the rest.
Reviewers would have the opportunity to re-rank applications after hearing the discussion of all applications, something that is less practical with the current system.

7 Challenges Associated with Direct Ranking
Applications from New Investigators are reviewed in a separate cluster but must be integrated into the final rank order of all reviewed applications.
Applications cannot be ranked with respect to applications from the previous two rounds, as is done with the percentile system.
Reviewers in study sections that cover highly diverse scientific areas may find direct ranking more difficult.
Private ranking may lack the transparency of the current system, in which reviewers who vote outside the range set by the assigned reviewers must provide justification during or after the discussion.

8 Pilot Study for Direct Ranking of Applications
The pilot was carried out in parallel with the current review system in the 2014_10 and 2015_01 council rounds.
Applications were scored as usual; reviewers were also asked to privately rank their top 10 discussed R01 applications in a separate column on the score sheet.
Rank data were analyzed for informational purposes only and were not used to influence funding decisions.

9 Participating Study Sections
32 chartered scientific review groups (SRGs) from the 2014_10 and 2015_01 council rounds.
Number of discussed R01 applications per SRG ranged from 12 to 39 (average 26.12).
Number of reviewers per SRG ranged from 13 to 31 (average 22.97).

10 Data Analysis
Measure the correlation between the percentiles/scores and the direct ranking results (sketched in code below):
– Each application has an associated percentile/score.
– Associate an "average rank" with each application.
– Expect good correlation.
Propose a method for breaking ties using the ranking results.
Visualize the correlation between ranking and percentiles.
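A minimal sketch of the correlation step, assuming per-application percentiles and average ranks are already in hand; the numbers below are invented for illustration.

```python
# Correlate percentiles with average ranks (illustrative data only).
from scipy.stats import pearsonr, spearmanr

percentiles = [2, 5, 9, 11, 14, 18, 21, 25, 30, 38]   # one per application
avg_ranks = [1.2, 2.5, 3.1, 4.6, 4.9, 6.8, 7.0, 8.3, 9.1, 9.7]

r, _ = pearsonr(percentiles, avg_ranks)
rho, _ = spearmanr(percentiles, avg_ranks)
print(f"Pearson r = {r:.2f}, Spearman rho = {rho:.2f}")
```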

11 Source Data Format
Legend: NP = not present; CF = conflict; NR = not ranked.

12 Data with Imputed Ranks

13 Data with Imputed Ranks
The next step is to calculate the average rank for each application.

14 Average Rank
For application A, only 19 - 1 = 18 reviewers can rank it (one reviewer is in conflict).
– Average rank = average of the 18 ranks = 83/18 = 4.61 (see the sketch below)
For application B, only 19 - 2 = 17 reviewers can rank it (one reviewer not present, one in conflict).
– Average rank = average of the 17 ranks = 165/17 = 9.71
[Table: each reviewer's rank (R1-R19) for applications A and B; A contains one CF entry, B contains one NP and one CF entry, with average ranks 4.61 and 9.71.]
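The calculation can be sketched as follows; the per-reviewer ranks are invented so that they reproduce the slide's totals for application A (18 usable ranks summing to 83), and NP/CF entries are excluded from both the numerator and the denominator.

```python
# Average rank over reviewers who could rank the application;
# "NP" (not present) and "CF" (conflict) entries are skipped.

def average_rank(ranks):
    usable = [r for r in ranks if isinstance(r, int)]
    return sum(usable) / len(usable)

# 19 reviewers, one in conflict -> 18 usable ranks summing to 83.
ranks_a = [1, 3, "CF", 5, 2, 4, 6, 10, 9, 8, 7, 2, 4, 6, 1, 3, 5, 4, 3]
print(round(average_rank(ranks_a), 2))  # 4.61, i.e. 83/18
```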

15 Data with Imputed Ranks

16 Correlation Coefficient Between Rank and Percentile

17 Comparing Applications with Similar Percentiles
How can one differentiate between two applications? We want something natural and easy to understand.
19 - 5 = 14 common reviewers were in a position to rank/compare both applications.
5 reviewers considered A better than B; 9 reviewers considered B better than A.
Conclusion: B is better than A (see the sketch below).
[Table: each reviewer's ranks (R1-R19) for applications A and B, including the NP/CF entries that leave 14 common reviewers.]
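The head-to-head tally described on this slide can be sketched as below, assuming each application's ranks are stored per reviewer with non-rank entries (NP/CF/NR) as strings; a lower rank means a better application.

```python
# Count, among reviewers who ranked both applications, how many placed
# each one higher; only common reviewers contribute.

def head_to_head(ranks_a, ranks_b):
    prefer_a = prefer_b = 0
    for ra, rb in zip(ranks_a, ranks_b):
        if isinstance(ra, int) and isinstance(rb, int):  # common reviewer
            if ra < rb:
                prefer_a += 1
            elif rb < ra:
                prefer_b += 1
    return prefer_a, prefer_b

# On the slide: 14 common reviewers, 5 prefer A, 9 prefer B,
# so B is ranked ahead of A.
```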

18 Direct Comparison Matrix
[Matrix of pairwise comparisons among applications 7-14 (percentiles 14%, 14%, 16%, 18%, 21%, 21%, 21%, 21%): each cell gives the number of common reviewers who ranked the row application better, e.g. 11/20 means 11 out of 20 common reviewers ranked application 7 as better than application 8; * indicates ties.]
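Extending the pairwise tally to every pair of applications yields such a matrix. The sketch below is a hypothetical construction under the same assumptions as above, with None standing in for NP/CF/NR entries.

```python
# Build the direct comparison matrix: cell (a, b) records how many of the
# reviewers common to a and b ranked a better, and how many were common.
from itertools import combinations

def comparison_matrix(rank_table):
    """rank_table maps application id -> list of per-reviewer ranks."""
    matrix = {}
    for a, b in combinations(rank_table, 2):
        common = [(ra, rb) for ra, rb in zip(rank_table[a], rank_table[b])
                  if ra is not None and rb is not None]
        wins_a = sum(ra < rb for ra, rb in common)
        matrix[(a, b)] = (wins_a, len(common))  # e.g. (11, 20) reads "11/20"
    return matrix
```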

19 Comparing Two Applications with the Same Percentile
score (average): 27; percentile: 11% (the same for both applications)
score range: A: 2, 3, 4; B: 2, 3
reviewers' preference by score: 3/19 (16%); 13/19 (68%) awarded each application the same score
ranking range: A: 1-NR; B: 2-NR
reviewers' preference by ranking: 13/18 (72%) preferred A, 5/18 (28%) preferred B, so A is stronger than B

20 Visualization of Binning for All SRGs

21 Visualization of Binning for Single SRG

22 Reviewer Comments
Ranking helped reviewers prioritize applications and improved score spreading.
Reviewers were more engaged in discussions because of the need to rank.
It was difficult to rank applications that the reviewer did not read.
Ranking may provide some complementary information but should not replace the current system.

23 Questions and Next Steps
Does ranking add value to the peer review process?
Could the rank ordering exercise be used as a tool by SROs to help panels spread scores and become more engaged in discussion?
Can rank ordering be used by program staff to break ties or to provide additional information for funding decisions?

24 Direct Ranking Pilot Working Group Members
Dr. Ghenima Dirami, SRO, Lung Injury and Repair study section
Dr. Gary Hunnicutt, SRO, Cellular, Molecular and Integrative Reproduction study section
Dr. Raya Mandler, SRO, Molecular and Integrative Signal Transduction study section
Dr. Atul Sahai, SRO, Pathobiology of Kidney Disease study section
Dr. Wei-qin Zhao, SRO, Neurobiology of Learning and Memory study section
Dr. Adrian Vancea, Program Analyst, Office of Planning, Analysis and Evaluation
Dr. Amy Rubinstein, SRO, Gene and Drug Delivery Systems study section

25 Post Ranking Pilot: Q & A
Office of Planning, Analysis and Evaluation