Assessment of Proposed Research: National Cancer Institute Provocative Questions Initiative Case Study. Elizabeth Hsu, American Evaluation Association, 2012.

Presentation transcript:

Assessment of Proposed Research: National Cancer Institute Provocative Questions Initiative Case Study
Elizabeth Hsu
American Evaluation Association 2012 Annual Meeting, October 27, 2012
Co-Authors: Samantha Finstad, Emily Greenspan, Jerry Lee, James Corrigan (NIH); Leo DiJoseph, Duane Williams, Joshua Schnell (Thomson Reuters)

Outline
- Analysis Questions
- Applicant characterization study
- Novelty and relevance study

Analysis Questions
- What was the breadth of the investigators who applied to the PQ initiative?
- Did the PQ initiative compel investigators to propose new lines of research? ("Novelty" as compared to the PI's own prior work and to the general NIH pool)
- Did the ideas presented in the applications fill gaps in the NCI portfolio, as identified by the PQ RFA? ("Relevance" as compared to the RFA text)

Outline
- Analysis Questions
- Applicant characterization study
- Novelty and relevance study

Methodology: Applicant Characterization
- Discipline
  - Determined using a combination of degree, institutional department, primary degree field, expertise, and PI biosketch
  - MD and MD/PhD applicants were not further characterized by discipline
- Investigator status
  - Based on the NIH definitions of new investigator and early stage investigator
  - Flags in the NIH database; suspect flags confirmed using prior grant history and date-of-degree information
- Multiple-PI applications
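The flag-confirmation step described above is only outlined on the slide; the snippet below is a minimal sketch of how such a check might look, assuming a prior-award count and degree date are available for each applicant. The function, field names, and example values are hypothetical, not the study's actual code.

```python
# Rough sketch of the investigator-status confirmation described on this slide:
# start from the NIH new-investigator flag, override it if prior R01-equivalent
# awards exist, and use the degree date to separate early stage investigators.
# Field names and inputs are illustrative assumptions.
from datetime import date

def confirm_status(flagged_new: bool, prior_r01_equivalents: int,
                   degree_date: date, application_date: date) -> str:
    """Return 'New', 'Early Stage', or 'Experienced' for one applicant."""
    if not flagged_new or prior_r01_equivalents > 0:
        return "Experienced"
    years_since_degree = (application_date - degree_date).days / 365.25
    # NIH treats a new investigator within 10 years of the terminal degree
    # (or end of clinical training) as an early stage investigator.
    return "Early Stage" if years_since_degree <= 10 else "New"

print(confirm_status(True, 0, date(2005, 6, 1), date(2011, 10, 1)))  # Early Stage
```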

Applicant Characterization

Investigator Status (% of PQ applicants):
- New Investigator: 20.7%
- Early Stage Investigator (NI subset): 15.1%
- Experienced Investigator: 64.2%

Investigator Discipline (% of PQ applicants):
- Basic/life sciences: 47.7%
- Behavioral: 1.6%
- Epidemiology: 2.1%
- Physical science/engineering: 13.3%
- MD: 15.3%
- MD/PhD: 20.0%

Multi-PI Applications (% of PQ applications):
- 1 PI: 77.7%
- 2 PIs: 17.5%
- 3+ PIs: 4.8%

For comparison:
- NIH awarded PIs: ~30% MD or MD/PhD
- NCI R01/R21 applicants: ~30% MD or MD/PhD
- NIH R01: ~30% New Investigator/Early Stage Investigator
- NCI R01: ~33% New Investigator/Early Stage Investigator

Applicant Characterization
The PQ RFA attracted:
- More new investigators/early stage investigators
- More MDs and MD/PhDs
Every PQ received multi-PI applications.

Outline
- Analysis Questions
- Applicant characterization study
- Novelty and relevance study
  - Methodology
  - Results
  - Conclusions

Methodology: Novelty and Relevance
- Text similarity algorithm, as previously described
- Self novelty: compare the PQ application to prior applications by the same PI
- General novelty: compare the PQ application to a sample of NIH applications, excluding the PI's own applications
- Relevance: compare the PQ application to the question text (question, background, feasibility, implications of success) as posed in the RFA
- Scores scaled from 0 to 1; thresholds chosen by manual review
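The slide does not spell out the similarity algorithm beyond "as previously described"; the sketch below shows one plausible formulation (TF-IDF vectors with cosine distance via scikit-learn) that yields 0-1 distances for self novelty, general novelty, and relevance. The input texts and names are invented for illustration and are not the study's actual implementation.

```python
# Minimal sketch of 0-1 text distances for self novelty, general novelty, and
# relevance, assuming a TF-IDF / cosine-distance formulation (illustrative only).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def text_distance(doc_a: str, doc_b: str) -> float:
    """Return a distance in [0, 1]: 0 = very similar wording, 1 = no overlap."""
    tfidf = TfidfVectorizer(stop_words="english").fit_transform([doc_a, doc_b])
    return 1.0 - float(cosine_similarity(tfidf[0], tfidf[1])[0, 0])

# Hypothetical inputs standing in for application and RFA text.
pq_application = "Mechanisms by which obesity contributes to cancer risk ..."
prior_applications = ["Adipose tissue signaling in metabolic disease ..."]
nih_sample = ["Neural circuits underlying reward behavior ..."]
rfa_question_text = "How does obesity contribute to cancer risk? ..."

# Best (minimum) distance to each comparison set, plus distance to the RFA text.
self_novelty = min(text_distance(pq_application, d) for d in prior_applications)
general_novelty = min(text_distance(pq_application, d) for d in nih_sample)
relevance_distance = text_distance(pq_application, rfa_question_text)
print(self_novelty, general_novelty, relevance_distance)
```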

Methodology: Measurement Goals
[Diagram comparing three document sets against the Provocative Question (PQ) RFAs and the FY 2012 pre-RFA portfolio analysis:]
- 754 PQ applications received
- A comparison cohort of NIH applications
- HHS/NIH applications found as similar to the PQ RFA text in the portfolio analysis study
- Comparisons yield measures of novelty, relevance, and "coincidental" relevance

Methodology: Plotting Text Similarity Scores
[Plot schematic: text distance runs from 0 (increasing document similarity) to 1 (decreasing document similarity). One axis is the distance comparing RFA text to PQ application text, splitting Relevant from Not Relevant; the other axis is the distance comparing PQ application text to prior application text, splitting Not Novel from Novel.]
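A hedged sketch of how such a quadrant plot might be drawn with matplotlib follows; the threshold values, the synthetic data, and the assignment of the two distances to the x and y axes are assumptions for illustration, since the study chose its thresholds by manual review.

```python
# Illustrative quadrant plot of relevance distance vs. by-self novelty distance.
# Thresholds and data here are placeholders; the study set thresholds manually.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
relevance_distance = rng.uniform(0, 1, 200)   # PQ application vs. RFA text
novelty_distance = rng.uniform(0, 1, 200)     # PQ application vs. prior applications

RELEVANCE_THRESHOLD = 0.5   # below = relevant (assumed value)
NOVELTY_THRESHOLD = 0.5     # above = novel (assumed value)

fig, ax = plt.subplots()
ax.scatter(relevance_distance, novelty_distance, s=12, alpha=0.6)
ax.axvline(RELEVANCE_THRESHOLD, linestyle="--")
ax.axhline(NOVELTY_THRESHOLD, linestyle="--")
ax.set_xlabel("Distance: RFA text vs. PQ application (smaller = more relevant)")
ax.set_ylabel("Distance: PQ application vs. prior applications (larger = more novel)")
ax.set_xlim(0, 1)
ax.set_ylim(0, 1)
ax.set_title("Novelty vs. relevance of PQ applications")
plt.show()
```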

Outline
- Analysis Questions
- Applicant characterization study
- Novelty and relevance study
  - Methodology
  - Results
    - Did the PQ initiative compel investigators to propose new lines of research?
    - Did the ideas presented in the applications fill gaps in the NCI portfolio, as identified by the PQ RFA?
  - Conclusions

Did Applicants Change Their Research Direction?
- In general, applicants stayed close to their prior lines of research; the median by-self novelty distance was 0.08
- There were notable exceptions where investigators appeared to move far from prior applications: multiple outliers with high novelty scores were found for questions 5, 11, 17, 18, 21, and 23

Were Applications Similar to Other Existing NIH Research (Non-Self)?
Questions with the largest proportion of novel applications relative to the NIH-wide set:
- Question 3 (100%)
- Question 2 (87%)
- Question 13 (81%)
- Question 12 (75%)
- Question 4 (73%)
A comparison group is needed to better understand the general trends.

Did the PQs Compel Investigators to Propose New Lines of Research?
- PQ applicants generally did not stray far from previous lines of research
- Subtle scientific differences may not be captured by the text similarity scores
- PQ application text was generally more similar to the PI's previous work than to the NIH-wide pool
- A comparison group is needed to evaluate how the similarity seen here compares with responses to other RFAs

Outline
- Analysis Questions
- Applicant characterization study
- Novelty and relevance study
  - Methodology
  - Results
    - Did the PQ initiative compel investigators to propose new lines of research?
    - Did the ideas presented in the applications fill gaps in the NCI portfolio, as identified by the PQ RFA?
  - Conclusions

Did Applications Address Gaps in Scientific Research as Intended by the RFA?
[Chart: "Relative Ranking of Relevance: PQ vs. Portfolio Analysis Applications." Relevance distance plotted by question number, with PQ RFA responses shown against portfolio analysis applications.]

Did Applications Address Gaps in Scientific Research as Intended by the RFA?
Proxy examination question: did the text of the PQ applications show a closer relationship to the RFA text than previously existing grants did?
- Drawbacks of using RFA text:
  - It may refer to specific examples (e.g., a particular cancer site or drug), biasing scores toward applications that refer to that example
  - Some applications repeat the RFA text verbatim
- Application level: 613 of the PQ applications were found to be relevant
- Question level:
  - Some questions received only a small percentage of applications below the relevance threshold
  - Some questions received applications that were generally no more responsive than previously existing applications
  - Questions that received the most responsive applications were 2, 16, 22, and 24
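The question-level summary implied here can be illustrated with a small aggregation sketch: for each PQ, compute the share of applications whose relevance distance falls below the threshold, separately for PQ responses and portfolio-analysis applications. The column names, the threshold value, and every row of data below are invented for illustration.

```python
# Sketch of the question-level relevance summary: share of applications below the
# relevance-distance threshold, by question and by group (PQ response vs.
# pre-RFA portfolio-analysis application). All values here are hypothetical.
import pandas as pd

applications = pd.DataFrame({
    "question": [2, 2, 16, 16, 22, 24],
    "group": ["PQ", "portfolio", "PQ", "portfolio", "PQ", "PQ"],
    "relevance_distance": [0.31, 0.62, 0.28, 0.55, 0.40, 0.35],
})

RELEVANCE_THRESHOLD = 0.45  # assumed; the study chose thresholds by manual review

share_relevant = (
    applications
    .assign(relevant=lambda df: df["relevance_distance"] < RELEVANCE_THRESHOLD)
    .groupby(["question", "group"])["relevant"]
    .mean()
    .unstack("group")
)
print(share_relevant)
```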

Were Applications Generally Novel and Relevant?
[Scatter plots: "Relevance and Best (Minimum) By-Self Novelty of PQ Applications with Investigator Status" for PQ 1 (n = 77), PQ 5 (n = 64), PQ 14 (n = 47), and PQ 17 (n = 29), and "Relevance and Best (Minimum) General Novelty of PQ Applications with Investigator Status" for PQ 1 (n = 84), PQ 5 (n = 67), PQ 14 (n = 50), and PQ 17 (n = 32). Axes: relevance (1 minus relevance distance) vs. minimum novelty distance; markers indicate whether the PI is a new or early stage investigator (Yes/No).]

Outline
- Analysis Questions
- Applicant characterization study
- Novelty and relevance study
  - Methodology
  - Results
  - Conclusions

Methodology Conclusions
- The novelty score detected applications that were very similar to prior work with high recall
- It is challenging to distinguish scientific nuances and extensions of previous work
- The current relevance measurement is dependent upon the target text
- A comparison group may provide a more thorough understanding of both the novelty and relevance metrics
Lessons learned:
- A larger cohort is needed to avoid missing novelty spoilers
- Similarity measurements are sensitive to many factors, including cohort size and scaling parameters
- Specific aims text may be more informative than abstract text for measuring relevance

Questions?

Applicant Discipline
- Basic/Life sciences: 47.7%
- Behavioral: 1.6%
- Epidemiology: 2.1%
- MD: 15.3%
- MD/PhD: 20.0%
- Physical science/Engineering: 13.3%

Applicant Investigator Status
- New Investigator: 20.7%
- Early Stage Investigator: 15.1%
- Experienced Investigator: 64.2%

Multi-PI Applications
- 1 PI: 77.7%
- 2 PIs: 17.5%
- 3+ PIs: 4.8%

Did Applications Address Gaps in Scientific Research as Intended by the RFA?
[Chart: "Relative Ranking of Relevance: PQ Applications vs. Portfolio Analysis Applications," comparing PQ RFA responses with portfolio analysis applications by question; application counts are shown for both groups.]

Were Applications Generally Novel and Relevant?
For the majority of questions, applications were found to be similar to the applicant's prior work and relevant to the RFA.