CLINICAL TRIAL METHODOLOGY COURSE 2017 WEBINAR SERIES


To view and hear the content, please join on a computer or device with speakers. Use the Chat Box to enter questions for the speaker. Instructions for how to obtain CMEs will be provided at the end of the webinar.

AAN CME Disclosure The presenter has no commercial or financial interests, relationships, activities, or other conflicts of interest to disclose.

Significance / Innovation / Preliminary Data / Approach / Rigor / Recruitment Feasibility / Analyses. Significance (scored criterion): Define the problem and demonstrate the impact of solving it. Common approaches include citing disease incidence and economic costs. When dealing with a rare disorder, explain how the knowledge gained could have broader impact. Define the impact on treatment, diagnosis, or prognostication if the project is completed. Innovation (scored criterion): Innovation can be technological, methodological, or conceptual. Most commonly it is the development, early adoption, or first adoption of a novel technology. Methods: a novel trial design, a novel outcome measure, etc. A novel hypothesis that challenges an existing paradigm also qualifies.

Approach. Preliminary data (scored under Approach): An opportunity to demonstrate that the applicant/team is capable of carrying out the proposed plan. Addresses feasibility: recruitment, and the feasibility of the assay, technology, and analytics. May include unpublished innovations. Approach (scored; the most common target of criticism, since weaknesses are found here): How will you accomplish your aims? Provide a clear, precise delineation of inclusion criteria, exclusion criteria, the subject enrollment mechanism, and the intervention.

Approach. Primary outcome: Must be clearly stated and easy to understand. Provide published or preliminary evidence that it is a feasible, reproducible, sensitive, and specific measure of the variable of interest (rigor). Secondary outcomes: justify each one. Refer to the statistical analysis plan (rigor), with justification of the expected effect size and the sample size. Recruitment (a common problem): Provide all available evidence to support your claim that you can recruit a sufficient number of subjects. Be realistic; a common error is to be overly optimistic. Briefly address inclusion of women, children, and minorities, as well as the safety plan; the longer description belongs in the Human Subjects section.

Environment & investigator. Environment: Demonstrate that adequate institutional/network resources exist to support the proposed research. For a single-institution grant, list department/school/university resources (equipment, infrastructure, people). For a network grant, list both network resources and individual site resources. Investigator: The new Biosketch format lets you tell your story. List your strengths and weaknesses, and explain how you will address your weakness(es), usually by using collaborators. List your areas of expertise and your publications. Online tool to create a Biosketch: My NCBI » SciENcv

Grant preparation (administrative): Timing, institutional support, administrative needs. Multicenter grants require administrative planning, subcontract negotiation, site budgets, and support letters; plan 6-12 months ahead. Prepare the budget early and revise it often as you think of new items. Ask other sites for their budgets early. Justify each and every item and person, especially if the budget is non-modular. If you request more than $500K in direct costs per year, get NIH permission first. Make sure to meet with your institutional officials to ensure that you have the support letters you need.

Summary statement. Header information includes the Review Group (example: NSD-K), the SRG Action, and the Impact Score (range 10-90, or ** if not discussed). Summary of discussion (if scored): this is what was said at the meeting about your application, and this discussion determined the final score; it is longer when the committee is divided. Individual critiques (minimum of 3), each addressing the scored criteria: Significance, Investigator(s), Innovation, Approach, Environment. Discussed after scores: Protections for Human Subjects; Data and Safety Monitoring Plan; Inclusion of Women, Minorities, and Children; Vertebrate Animals; Biohazards; Applications from Foreign Organizations; Select Agents; Resource Sharing Plans; Authentication of Key Biological and/or Chemical Resources; Budget and Period of Support.

WRIGHT Qs: What study sections love and hate (Wright). What happens during the study section (Wright). What are the review criteria (Wright). What is an ND (Not Discussed)?

WHAT STUDY SECTIONS HATE: Sloppiness (grammar, spelling, organization) in ALL sections. Text that is too dense: games played with font size and spacing, small images. Overly ambitious proposals (especially for junior investigators or training grants). Dependent aims! Lack of serious statistical support. Poor justification for your approach (populations, testing instruments, outcome measures, alternative hypotheses). Outrageous budgets. Lack of attention to human subjects.

WHAT STUDY SECTIONS LOVE: A clean, supported story. A solid connection from significance to foundation (preliminary data), approach, outcomes, and interpretation of results. Thoughtful discussion of limitations and of how the outcome of the grant will guide discovery even if negative. Evidence that you can do the study: recruitment (type and number), expertise in the group, and especially evidence that you have done it before. A clear path to the next step, especially for a go/no-go grant. A strong statistical plan.

WHAT HAPPENS IN STUDY SECTION

Scoring Descriptions

Additional guidance on strengths and weaknesses:

Score  Descriptor    Additional Guidance on Strengths/Weaknesses
1      Exceptional   Exceptionally strong with essentially no weaknesses
2      Outstanding   Extremely strong with negligible weaknesses
3      Excellent     Very strong with only some minor weaknesses
4      Very Good     Strong but with numerous minor weaknesses
5      Good          Strong but with at least one moderate weakness
6      Satisfactory  Some strengths but also some moderate weaknesses
7      Fair          Some strengths but with at least one major weakness
8      Marginal      A few strengths and a few major weaknesses
9      Poor          Very few strengths and numerous major weaknesses

Minor weakness: an easily addressable weakness that does not substantially lessen impact. Moderate weakness: a weakness that lessens impact. Major weakness: a weakness that severely limits impact.
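The 1-9 scale above is a fixed mapping from score to descriptor, which can be sketched as a simple lookup (a hypothetical illustration; the function name and error handling are not part of any NIH tool):

```python
# Hypothetical lookup table for the NIH 1-9 criterion scoring scale,
# mirroring the score/descriptor table above (1 is best, 9 is worst).
SCORE_DESCRIPTORS = {
    1: "Exceptional", 2: "Outstanding", 3: "Excellent",
    4: "Very Good", 5: "Good", 6: "Satisfactory",
    7: "Fair", 8: "Marginal", 9: "Poor",
}

def describe(score: int) -> str:
    """Return the descriptor for a single criterion score."""
    if score not in SCORE_DESCRIPTORS:
        raise ValueError("NIH criterion scores run from 1 (best) to 9 (worst)")
    return SCORE_DESCRIPTORS[score]
```

For example, a criterion scored 5 ("Good") is still described as strong, but carries at least one moderate weakness.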

What Happens in Study Section: Initial scores. Primary Reviewer 1 provides an overview of your proposal. The scientific foundation reviewer presents the preclinical supporting data. Assigned reviewers 2-(10) each get their shot at what's wrong with the grant. The floor then opens to other, non-assigned reviewers, with questions going back and forth to the assigned reviewers. Final points by the assigned reviewers. Final scores (a tightening of the range is sought). Budget.

Evaluation Criteria in Grant

Review Criteria. Each criterion is scored 1-9; the overall IMPACT score is a total judgment, not an average. Significance (incremental advance vs. changing the field). Investigators (experience, experts, stats). Innovation (too little, too much). Approach (90% of the review). Environment (mostly covered if at an academic facility). Human Subjects (target diversity and genders). Other (training grant, etc.). Budget. Overall impact score 1-9 x 10 = final score (e.g., 30). Final percentile rank: the percentage of grants in the study section that scored better than yours (e.g., 50%). The ranking of your grant, and the payline, are based on your percentile, not your raw score.
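The arithmetic described above can be sketched as follows. This is a simplified illustration, not official NIH code: it assumes the final impact score is the mean of the panel members' 1-9 votes multiplied by 10, and that the percentile counts applications with a better (lower) score:

```python
# Minimal sketch of the impact-score and percentile arithmetic described
# above. Assumes a simple mean of member votes; the real process involves
# panel discussion and a historical scoring base.

def final_impact_score(member_scores):
    """Mean of the panel's 1-9 votes, times 10 (range 10-90)."""
    return round(10 * sum(member_scores) / len(member_scores))

def percentile(my_score, all_scores):
    """Approximate percent of applications scoring BETTER (lower)."""
    better = sum(1 for s in all_scores if s < my_score)
    return round(100 * better / len(all_scores))
```

For example, if every member votes 3, the final score is 30; if 40% of the comparison pool scored below 30, the application is at roughly the 40th percentile.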

OMG, MY PROPOSAL WAS NOT DISCUSSED (ND) AND NOT SCORED! This does not mean the idea is doomed; it means the proposal had enough flaws to push its score into the non-competitive range. Read the reviews carefully and respond to EVERY comment. The category for truly doomed applications is Not Recommended for Further Consideration (NRFC). Understanding the percentile: a percentile is the approximate percentage of applications that received a better overall impact/priority score from the study section during the past year. For applications reviewed in ad hoc study sections, a different base may be used to calculate percentiles. All percentiles are reported as whole numbers. Only a subset of all applications receive percentiles; the types of applications that are percentiled vary across NIH Institutes and Centers. The summary statement will identify the base that was used to determine the percentile.

IT'S OK, YOU CAN RE-APPLY

(Slide: a ballplayer's year-by-year batting line, making the point that even elite hitters fail most of the time and keep stepping up to the plate. Some cells were lost in transcription; the complete rows are:)

YEAR  AB   H    HR  RBI  BB  AVG
2004  263   77  17   40  14  .293
2005  575  176  42  102  72  .306
2010  587  166  36  103  69  .283
2011  389   99  23   61  52  .254

Incomplete rows, as transcribed: 2006 582 181 116 66 .311; 2007 604 196 107 94 .325; 2008 626 189 124 .302; 2009 535 164 39 74 .307; 2012 581 178 41 93 81; 2013 430 132 58 55; 2014 144 30 63 .269; 2015 152 44 7 22 .289; 2016 29 8 3 .276.
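The AVG column in the table above is simply hits divided by at-bats, rounded to three decimals, which is what makes the slide's point: a .300 hitter, an elite result, still fails seven times out of ten. A one-line check (the function name is ours, not from the slide):

```python
# Batting average = hits / at-bats, rounded to the conventional
# three decimal places. E.g., the 2005 line above: 176 hits in
# 575 at-bats gives .306.

def batting_average(hits: int, at_bats: int) -> float:
    """Hits over at-bats, rounded to three decimals."""
    return round(hits / at_bats, 3)
```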

Evaluation form for CMEs: bit.ly/r25eval. Email ninds-ctmc-info@umich.edu with questions. CTMC webinars and slides will be archived on the course website: https://nett.umich.edu/training/ctmc