1 REVIEWER ORIENTATION TO ENHANCED PEER REVIEW April 2009

2 Changes to Review Beginning with May/June 2009 Meetings
- Enhanced Review Criteria for certain mechanisms
- Templates for Structured Critiques
- Scoring of Individual Review Criteria
  - All applications will receive criterion scores from assigned reviewers
- New 1 to 9 Scoring Scale

3 Goals of the Changes
- Clearer understanding of the basis of application ratings
- More emphasis on impact and less emphasis on technical details
- Succinct, well-focused critiques that evaluate, rather than describe, applications
- Routine use of the entire rating scale

4 Before the Review Meeting
When reading applications, the assigned reviewers should:
- Address all applicable criteria and other review considerations
- Identify major strengths and weaknesses
- Assign scores to each of the 5 “core” criteria
- Assign an overall impact/priority score

5 Preparation of Critiques
When writing your critiques:
- Use bullet points to make succinct, focused comments
- Short narratives may occasionally be appropriate, but should be rare
- Focus on major strengths and weaknesses (those that influenced your overall rating of the application)

6 Features of Critique Templates
- Boxes for evaluating:
  - Each core review criterion
  - Other applicable review criteria and considerations
  - Overall impact of the application
- A box for “advice to applicants”
- Hyperlinks to web pages providing descriptions of review criteria and additional review considerations

7 Excerpt from a Critique Template: Criterion
- List major strengths and weaknesses that influenced the overall impact/priority score
- Limit text to ¼ page per criterion, although more text may occasionally be needed
- Do not enter scores on critiques

Template excerpt:
1. Significance (please limit text to ¼ page)
   Strengths
   Weaknesses
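
The template structure above lends itself to a simple data sketch. The following Python is purely illustrative: the class, the choice of 1,000 characters as a stand-in for “¼ page”, and every name in it are assumptions, not the actual critique template.

```python
# Hypothetical sketch of one criterion section of a critique template;
# "1/4 page" has no exact size, so 1,000 characters is an assumed stand-in.
QUARTER_PAGE_CHARS = 1000

class CriterionCritique:
    def __init__(self, criterion):
        self.criterion = criterion  # e.g. "Significance"
        self.strengths = []         # bullet points, per the guidance above
        self.weaknesses = []        # note: no scores are entered here

    def text_length_ok(self):
        """Soft check only: more text may occasionally be needed."""
        total = sum(len(bullet) for bullet in self.strengths + self.weaknesses)
        return total <= QUARTER_PAGE_CHARS

section = CriterionCritique("Significance")
section.strengths.append("Addresses a major gap in the field.")
section.weaknesses.append("Impact on clinical practice is modest.")
print(section.text_length_ok())  # True
```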

8 Excerpt from a Critique Template: Protected Form Fields and Drop-downs
- Protected elements (drop-down boxes and form fields) are shaded gray
- Part of each template is a PROTECTED form
- Reviewers should NOT unprotect the forms!

9 Scoring Individual Review Criteria
- There are 5 “core” criteria for most types of grant applications
- For example, the core criteria for R01s are:
  - Significance
  - Investigator(s)
  - Innovation
  - Approach
  - Environment
- Use the 9-point scale (1 = exceptional, 9 = poor) for the five “core” review criteria
- Do not enter scores in the critique
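
As a minimal sketch of the scoring rule this slide states (one 1-to-9 integer per core criterion), the following Python is illustrative only; the criterion names come from the slide, but the function and validation logic are assumptions, not part of IAR.

```python
# Hypothetical sketch, not part of IAR: validate one 1-9 score per core criterion.
CORE_CRITERIA = ["Significance", "Investigator(s)", "Innovation",
                 "Approach", "Environment"]  # from the slide above

def validate_criterion_scores(scores):
    """Raise ValueError unless every core criterion has one integer score from 1 to 9."""
    for criterion in CORE_CRITERIA:
        if criterion not in scores:
            raise ValueError("missing score for " + criterion)
        value = scores[criterion]
        if not isinstance(value, int) or not 1 <= value <= 9:
            raise ValueError(criterion + ": score must be an integer from 1 (exceptional) to 9 (poor)")
    return scores

# Example: a strong application with a weaker Approach section.
validate_criterion_scores({"Significance": 2, "Investigator(s)": 1,
                           "Innovation": 3, "Approach": 5, "Environment": 1})
```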

10 Overall Impact/Priority Scores
- Consider the criterion strengths and weaknesses of each application in determining an overall impact/priority score
- Recognize that this is a NEW scoring system and focus on the guidelines for its use
- The new scoring system is intended to reflect the “real-world” range of quality among applications typically seen in actual study sections
- It is ESSENTIAL that reviewers take advantage of this unique opportunity to use the entire 1 to 9 range

11 Scoring Descriptions

Impact  | Score | Descriptor   | Additional Guidance on Strengths/Weaknesses
--------|-------|--------------|----------------------------------------------------
High    | 1     | Exceptional  | Exceptionally strong with essentially no weaknesses
High    | 2     | Outstanding  | Extremely strong with negligible weaknesses
High    | 3     | Excellent    | Very strong with only some minor weaknesses
Medium  | 4     | Very Good    | Strong but with numerous minor weaknesses
Medium  | 5     | Good         | Strong but with at least one moderate weakness
Medium  | 6     | Satisfactory | Some strengths but also some moderate weaknesses
Low     | 7     | Fair         | Some strengths but with at least one major weakness
Low     | 8     | Marginal     | A few strengths and a few major weaknesses
Low     | 9     | Poor         | Very few strengths and numerous major weaknesses

Non-numeric score options: NR = Not Recommended for Further Consideration, DF = Deferred, AB = Abstention, CF = Conflict, NP = Not Present, ND = Not Discussed

- Minor Weakness: an easily addressable weakness that does not substantially lessen impact
- Moderate Weakness: a weakness that lessens impact
- Major Weakness: a weakness that severely limits impact
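
The table reads naturally as a lookup. A small illustrative sketch follows; the dictionary and helper function are hypothetical, not any NIH tool.

```python
# Hypothetical lookup mirroring the scoring-description table above.
SCORE_DESCRIPTIONS = {
    1: ("High", "Exceptional"),  2: ("High", "Outstanding"),  3: ("High", "Excellent"),
    4: ("Medium", "Very Good"),  5: ("Medium", "Good"),       6: ("Medium", "Satisfactory"),
    7: ("Low", "Fair"),          8: ("Low", "Marginal"),      9: ("Low", "Poor"),
}

def describe_score(score):
    impact, descriptor = SCORE_DESCRIPTIONS[score]
    return "%d = %s (%s impact)" % (score, descriptor, impact)

print(describe_score(5))  # prints: 5 = Good (Medium impact)
```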

12 Before Attending the Review Meeting
- Post critiques to the Internet Assisted Review (IAR) Web module
- Enter criterion scores and the overall impact/priority score in IAR
- Do not enter scores as part of the critique!
  - Ensures better data integrity
  - Allows scores to be placed where needed (e.g., summary statements, Commons status)
  - Makes scores available for future analysis

13 IAR: New Drop Down for Five Core Criteria
- Reviewers will see new drop-down menus in IAR for entering scores for each criterion
[Screenshot: IAR screen with the new drop-down highlighted]

14 IAR: Assigned Reviewers Must Submit a Critique to Upload Scores
- Reviewers must close the critique file before submitting

15 IAR: Entering Scores and Critiques
- Assigned reviewers may not submit criterion or preliminary scores without a critique
  - If a reviewer tries to save a criterion and/or preliminary score without uploading the critique, an error message will appear
- The maximum file size for a critique is 1 MB
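
These two rules amount to a pre-submission check. Below is a hedged sketch of that check; the function, the interpretation of 1 MB as 1,048,576 bytes, and the error wording are assumptions, not IAR’s actual behavior.

```python
import os

MAX_CRITIQUE_BYTES = 1_048_576  # "1 MB"; the exact byte count is an assumption

def check_before_saving_scores(critique_path):
    """Hypothetical mirror of IAR's rule: no critique file, no scores."""
    if critique_path is None or not os.path.isfile(critique_path):
        return (False, "Error: upload a critique before saving criterion or preliminary scores.")
    if os.path.getsize(critique_path) > MAX_CRITIQUE_BYTES:
        return (False, "Error: the critique file exceeds the 1 MB maximum.")
    return (True, "OK to save scores.")
```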

16 IAR: New Header Information in Critique
- The preliminary IAR critique now includes criterion scores in its header

17 IAR: Updating Criterion Scores
- Criterion scores can be updated in IAR during the submit, edit, and final scoring phases
- Because the critique header contains the criterion scores, the PDF of the critique file is regenerated each time the scores are edited
  - If the criterion scores change, the PDF critique changes

18 IAR: New Popup Listing Criterion Scores
- A new View All Scores link on the List of Applications screen will display the criterion scores for each application

19 At the Review Meeting: Procedure for Discussed Applications
- Assigned reviewers will discuss the strengths and weaknesses of each application
  - Recommend an overall impact/priority score
  - Criterion scores will not be discussed by the committee
- All eligible members will record an overall impact/priority score (as is presently true)

20 IAR: Edit Criterion Scores on Voter Sheet
- Criterion scores can easily be edited by using the voter sheet

21 After the Review Meeting: Updating Scores or Critiques
- Assigned reviewers whose opinions changed as a result of the discussion at the meeting should use IAR:
  - To modify their criterion scores
  - To post revised critiques
- If criterion scores are edited, the PDF of the critique file is regenerated

22 Summary Statements
- Overall impact/priority scores of discussed applications will be the average of the scores voted by all eligible reviewers, multiplied by 10
- Final scores will range from 10 to 90, in whole numbers
- Summary statements for ALL applications will include the criterion scores and critiques posted by assigned reviewers
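
A worked example of the arithmetic on this slide: average the eligible reviewers’ votes and multiply by 10. The exact rounding convention is an assumption; the slide states only that final scores are whole numbers from 10 to 90.

```python
def final_impact_score(votes):
    """Average the eligible reviewers' 1-9 votes, multiply by 10, round to a whole number."""
    mean = sum(votes) / len(votes)
    return int(round(mean * 10))  # rounding convention assumed, not stated on the slide

# Example: votes of 2, 3, and 2 average 2.33..., giving a final score of 23.
print(final_impact_score([2, 3, 2]))  # prints: 23
```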

23 For additional information: Enhancing Peer Review at NIH Web Site

Thank you for your review service