
1 Assessment of Proposed Research: National Cancer Institute Provocative Questions Initiative Case Study
Elizabeth Hsu, American Evaluation Association 2012 Annual Meeting, October 27, 2012
Co-Authors: Samantha Finstad, Emily Greenspan, Jerry Lee, James Corrigan (NIH); Leo DiJoseph, Duane Williams, Joshua Schnell (Thomson Reuters)

2 Outline
- Analysis Questions
- Applicant characterization study
- Novelty and relevance study

3 Outline
- Analysis Questions
- Applicant characterization study
- Novelty and relevance study

4 Analysis Questions
- What was the breadth of the investigators who applied to the PQ initiative?
- Did the PQ initiative compel investigators to propose new lines of research?
  - "Novelty" as compared to the PI's own prior work (self) and to NIH research in general
- Did the ideas presented in the applications fill gaps in the NCI portfolio, as identified by the PQ RFA?
  - "Relevance" as compared to the RFA text

5 Outline
- Analysis Questions
- Applicant characterization study
- Novelty and relevance study

6 Methodology: Applicant Characterization
Discipline:
- Determined using a combination of degree, institution department, primary degree field, expertise, and PI biosketch
- MD and MD/PhD not further characterized
Investigator status:
- NIH definitions of new investigator (NI) and early stage investigator (ESI)
- Flags in the NIH database; suspect flags confirmed using prior grant history and date-of-degree information
Multiple-PI applications
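The status check described on this slide lends itself to a small rule-based pass over applicant records. The sketch below is a hypothetical stand-in only: the field names, the reliance on prior R01 history, and the way the ten-year ESI window is tested against degree year are assumptions, not the actual NIH database flags or confirmation workflow used in the study.

```python
# Hypothetical sketch of the investigator-status confirmation step.
# Record fields and decision rules are assumptions for illustration only.
from datetime import date

def classify_investigator(pi_record: dict, as_of: date) -> str:
    """Assign Experienced / New / Early Stage Investigator status from prior
    grant history and date-of-degree information."""
    has_prior_r01 = bool(pi_record.get("prior_r01_awards"))      # prior grant history
    years_since_degree = as_of.year - pi_record["degree_year"]   # date-of-degree info

    if has_prior_r01:
        return "Experienced Investigator"
    # New Investigator (NI): no prior substantial independent NIH research award.
    # Early Stage Investigator (ESI): NI subset within 10 years of the terminal degree.
    if years_since_degree <= 10:
        return "Early Stage Investigator"
    return "New Investigator"

print(classify_investigator({"prior_r01_awards": [], "degree_year": 2006},
                            as_of=date(2012, 10, 27)))   # -> Early Stage Investigator
```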

7 Applicant Characterization
Investigator Status (% of PQ applicants):
- New Investigator: 20.7%
- Early Stage Investigator (NI subset): 15.1%
- Experienced Investigator: 64.2%
Investigator Discipline (% of PQ applicants):
- Basic/life sciences: 47.7%
- Behavioral: 1.6%
- Epidemiology: 2.1%
- Physical science/engineering: 13.3%
- MD: 15.3%
- MD/PhD: 20.0%
Multi-PI Applications (% of PQ applications):
- 1 PI: 77.7%
- 2 PI: 17.5%
- 3+ PI: 4.8%
For comparison: NIH awarded PIs: ~30% MD or MD/PhD; NCI R01/R21 applicants: ~30% MD or MD/PhD; NIH R01: ~30% New Investigator/Early Stage Investigator; NCI R01: ~33% New Investigator/Early Stage Investigator

8 Applicant Characterization
The PQ RFA attracted:
- More new investigators/early stage investigators
- More MDs and MD/PhDs
Every PQ received multi-PI applications

9 Outline
- Analysis Questions
- Applicant characterization study
- Novelty and relevance study
  - Methodology
  - Results
  - Conclusions

10 Methodology: Novelty and Relevance
Text similarity algorithm, as previously described
- Self novelty: compare the PQ application to prior applications by the same PI
- General novelty: compare the PQ application to a sample of NIH applications, excluding the PI's own applications
- Relevance: compare the PQ application to the question text (question, background, feasibility, implications of success) as posed in the RFA
Scores scaled from 0 to 1; thresholds chosen by manual review
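The deck describes the similarity algorithm only as "previously described," so the sketch below is an assumed stand-in for the kind of computation involved: TF-IDF vectors compared with cosine similarity and converted into a 0-to-1 distance. The vectorization choice, sample texts, and scaling are assumptions, not the implementation used by Thomson Reuters.

```python
# A minimal sketch of a 0-to-1 text-distance computation (assumed TF-IDF + cosine;
# the actual algorithm is not specified in the slides).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def text_distance(doc_a: str, doc_b: str) -> float:
    """Return a distance in [0, 1]: 0 = identical term profile, 1 = no overlap."""
    tfidf = TfidfVectorizer(stop_words="english").fit_transform([doc_a, doc_b])
    similarity = cosine_similarity(tfidf[0], tfidf[1])[0, 0]
    return 1.0 - similarity

# Illustrative placeholder texts, not real application or RFA content.
pq_application = "We propose to study tumor dormancy and late recurrence mechanisms."
prior_application = "Our prior work examined tumor dormancy in breast cancer models."
rfa_question = "Why do certain cancers recur after long periods of dormancy?"

self_novelty = text_distance(pq_application, prior_application)  # vs. same PI's prior work
relevance = text_distance(pq_application, rfa_question)          # vs. RFA question text
print(f"self-novelty distance {self_novelty:.2f}, relevance distance {relevance:.2f}")
```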

11 Methodology: Measurement Goals
- 754 PQ applications received (FY2012)
- 38,088 comparison cohort NIH applications (FY1974-2011)
- 14,008 HHS/NIH applications found as similar to the PQ RFA text in the portfolio analysis study (FY2007-2011)
- 24 Provocative Question (PQ) RFAs (FY2012)
[Diagram relating these sources through the measured comparisons: pre-RFA portfolio analysis ("coincidental" relevance), relevance, and novelty]

12 Methodology: Plotting Text Similarity Scores
[Figure: text distance axis from 0 to 1, where smaller distance means greater document similarity. The distance between the RFA text and the PQ application text separates relevant from not relevant; the distance between the PQ application text and prior application text separates novel from not novel.]
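Once an application has the two distances plotted on this slide, labeling it is a simple threshold test. The 0.5 cut-offs below are placeholders; the slides say only that the real thresholds were chosen by manual review.

```python
# Sketch of turning the two text distances into novel / relevant labels.
# Threshold values are assumed placeholders, not the manually chosen cut-offs.
NOVELTY_THRESHOLD = 0.5    # distance above this => "novel"
RELEVANCE_THRESHOLD = 0.5  # distance below this => "relevant"

def label_application(novelty_distance: float, relevance_distance: float) -> dict:
    """Classify one application from its two distances (0 = identical, 1 = unrelated)."""
    return {
        "novel": novelty_distance > NOVELTY_THRESHOLD,        # far from prior applications
        "relevant": relevance_distance < RELEVANCE_THRESHOLD, # close to the RFA question text
    }

print(label_application(novelty_distance=0.08, relevance_distance=0.35))
# -> {'novel': False, 'relevant': True}
```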

13 Outline
- Analysis Questions
- Applicant characterization study
- Novelty and relevance study
  - Methodology
  - Results
    - Did the PQ initiative compel investigators to propose new lines of research?
    - Did the ideas presented in the applications fill gaps in the NCI portfolio, as identified by the PQ RFA?
  - Conclusions

14 Did Applicants Change Their Research Direction?
- In general, applicants stayed close to their prior lines of research
- Median by-self novelty distance = 0.08
- Notable exceptions where investigators appeared to move far from prior applications
- Multiple outliers with high novelty scores found in questions 5, 11, 17, 18, 21, and 23
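As a rough sketch of the summary behind this slide, the snippet below computes the median by-self novelty distance for a set of applications and flags high-distance outliers. The slides do not say how outliers were identified, so the interquartile-range fence and the sample numbers here are assumptions.

```python
# Sketch of summarizing by-self novelty distances and flagging outliers.
# The IQR fence is an assumed outlier rule; sample distances are made up.
from statistics import median, quantiles

def novelty_summary(distances: list[float]) -> dict:
    q1, _, q3 = quantiles(distances, n=4)     # quartiles of the distance distribution
    upper_fence = q3 + 1.5 * (q3 - q1)        # classic IQR outlier fence
    return {
        "median": median(distances),
        "outliers": [d for d in distances if d > upper_fence],
    }

print(novelty_summary([0.05, 0.07, 0.08, 0.09, 0.10, 0.45]))
# median near 0.08, with 0.45 flagged as an outlier
```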

15 Were Applications Similar to Other Existing NIH Research (Non-Self)?
Questions with the largest proportion of novel applications relative to the NIH-wide set were:
- Question 3 (100%)
- Question 2 (87%)
- Question 13 (81%)
- Question 12 (75%)
- Question 4 (73%)
A comparison group is needed to better understand the general trends.
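A per-question proportion like those listed above can be tabulated by counting how many of a question's applications exceed the general-novelty threshold. The grouping field, the 0.5 threshold, and the sample records below are illustrative assumptions, not the study's data.

```python
# Sketch of tabulating the share of "generally novel" applications per question.
# Records and threshold are placeholders for illustration.
from collections import defaultdict

def novel_share_by_question(records, threshold=0.5):
    """records: iterable of (question_number, general_novelty_distance)."""
    totals, novel = defaultdict(int), defaultdict(int)
    for question, distance in records:
        totals[question] += 1
        novel[question] += distance > threshold   # count distances above the threshold
    return {q: novel[q] / totals[q] for q in totals}

sample = [(3, 0.8), (3, 0.7), (2, 0.6), (2, 0.3), (13, 0.9)]
print(novel_share_by_question(sample))   # -> {3: 1.0, 2: 0.5, 13: 1.0}
```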

16 Did the PQs Compel Investigators to Propose New Lines of Research?
- PQ applicants generally did not stray far from previous lines of research
- Subtle scientific differences may result in low similarity scores
- PQ application text was generally more similar to the PI's previous work than to the NIH-wide pool
- A comparison group is needed to evaluate how the similarity seen here compares with responses to other RFAs

17 Outline
- Analysis Questions
- Applicant characterization study
- Novelty and relevance study
  - Methodology
  - Results
    - Did the PQ initiative compel investigators to propose new lines of research?
    - Did the ideas presented in the applications fill gaps in the NCI portfolio, as identified by the PQ RFA?
  - Conclusions

18 Did Applications Address Gaps in Scientific Research as Intended by the RFA?
[Figure: relative ranking of relevance, comparing the distance to the RFA text for PQ RFA responses vs. portfolio analysis applications, plotted by question number; per-question sample sizes (N) are listed in full on slide 27.]

19 Did Applications Address Gaps in Scientific Research as Intended by the RFA?
Proxy examination question: did the text of the PQ applications show a closer relationship to the RFA text than previously existing grants did?
Drawbacks of using RFA text:
- May refer to specific examples (e.g., a particular cancer site or drug), biasing toward applications that refer to that example
- Some applications repeat the RFA text verbatim
Application level:
- 613 of the applications were found to be relevant
Question level:
- Some questions received only a small percentage of applications below the relevance threshold
- Some questions received applications that were generally no more responsive than previously existing applications
- Questions that received the most responsive applications were 2, 16, 22, and 24
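One simple way to operationalize the proxy question above is to ask, for each PQ, what fraction of PQ applications sit closer to the RFA text than the typical pre-existing (portfolio-analysis) application does. The baseline choice (the median of the pre-existing applications) and the numbers below are assumptions, not the ranking method used in the study.

```python
# Sketch of comparing PQ responses to the pre-existing portfolio for one question.
# Distances are illustrative placeholders.
from statistics import median

def share_more_responsive(pq_distances, portfolio_distances):
    """Fraction of PQ applications closer to the RFA text than the median
    pre-existing application identified in the portfolio analysis."""
    baseline = median(portfolio_distances)
    return sum(d < baseline for d in pq_distances) / len(pq_distances)

pq = [0.35, 0.42, 0.55, 0.61]          # PQ RFA responses
portfolio = [0.58, 0.63, 0.70, 0.66]   # previously existing applications
print(f"{share_more_responsive(pq, portfolio):.0%} of PQ applications beat the baseline")
```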

20 Were Applications Generally Novel and Relevant?
[Figure: scatter plots of relevance (1 − relevance distance) against minimum novelty distance, with markers indicating whether the PI is an NI or ESI. Panels show relevance and best (minimum) by-self novelty for PQ 1 (n = 77), PQ 5 (n = 64), PQ 14 (n = 47), and PQ 17 (n = 29), and relevance and best (minimum) general novelty for PQ 1 (n = 84), PQ 5 (n = 67), PQ 14 (n = 50), and PQ 17 (n = 32).]

21 Outline
- Analysis Questions
- Applicant characterization study
- Novelty and relevance study
  - Methodology
  - Results
  - Conclusions

22 Methodology Conclusions
- The novelty score detected applications that were very similar to prior work, with high recall
- It is challenging to distinguish scientific nuances and extensions of previous work
- The current relevance measurement is dependent upon the target text
- A comparison group may provide a more thorough understanding of both the novelty and relevance metrics
Lessons learned:
- A larger cohort is needed to avoid missing novelty spoilers
- Similarity measurements are sensitive to many factors, including cohort size and scaling parameters
- Specific aims may be more informative than abstract text for measuring relevance

23 Questions?

24 Applicant Discipline
- Basic/Life sciences: 47.7%
- Behavioral: 1.6%
- Epidemiology: 2.1%
- MD: 15.3%
- MD/PhD: 20.0%
- Physical science/Engineering: 13.3%

25 Applicant Investigator Status
- New Investigator: 20.7%
- Early Stage Investigator: 15.1%
- Experienced Investigator: 64.2%

26 Multi-PI Applications
- 1 PI: 77.7%
- 2 PI: 17.5%
- 3+ PI: 4.8%

27 Did Applications Address Gaps in Scientific Research as Intended by the RFA?
Relative Ranking of Relevance: PQ Applications vs. Portfolio Analysis Applications
Sample sizes per question, n (PQ) / n (Portfolio Analysis):
Q1: 84 / 464, Q2: 15 / 129, Q3: 12 / 2,015, Q4: 15 / 885, Q5: 67 / 55, Q6: 31 / 1,206, Q7: 19 / 179, Q8: 19 / 817, Q9: 31 / 193, Q10: 27 / 193, Q11: 50 / 485, Q12: 28 / 4,571, Q13: 22 / 305, Q14: 50 / 1,024, Q15: 8 / 35, Q16: 9 / 822, Q17: 32 / 76, Q18: 69 / 536, Q19: 9 / 10, Q20: 31 / 3,218, Q21: 42 / 386, Q22: 24 / 522, Q23: 23 / 363, Q24: 37 / 3,155

28 Were Applications Generally Novel and Relevant?
For the majority of questions, applications were found to be similar to the applicant's prior work and relevant to the RFA

