Out of Control? Selecting Comparison Groups for Analyzing NIH Grants and Grant Portfolios
American Evaluation Association Meeting
Saturday, November 14, 2009
Session Purpose
– Explore the choices we make relating to comparison groups in a science management context
– Examples drawn from different NIH Institutes/Centers, in a variety of contexts
– Focused on the methodology of comparison group selection rather than the results of particular evaluations
– An opportunity to learn about efforts to select comparison groups and to discuss the strengths and weaknesses of the various methodological choices with other evaluation experts
Session Overview
3:30 – 3:35: Introduction
3:35 – 3:50: Christie Drew: Unsolicited P01s at NIEHS (P01 = multi-project program grant)
3:50 – 4:05: Jamelle Banks: NIH-Funded Research in the Context of a Scientific Field (NICHD)
4:05 – 4:20: Milton Hernandez: NIH Loan Repayment: Regression Discontinuity Analysis (OER)
4:20 – 4:35: Wesley Schultz: Use of Propensity Scores in a Longitudinal Science Study of Minority Biomedical Research Support (Cal State San Marcos/NIGMS)
4:35 – 5:00: Discussion
Common themes
– Establishing comparability
– Use of information technology
– Compromises
– Others…
Establishing a Comparison Set for Evaluating Unsolicited P01s at the National Institute of Environmental Health Sciences
AEA, November 14, 2009
Christie Drew – drewc@niehs.nih.gov – 919-541-3319
Martha Barnes, Jerry Phelps, Pat Mastin
Overview
– Brief overview of the P01 grant mechanism and study goals
– Finding a comparison group
  – Key challenges
  – Approach
NIH Extramural Grant Context
Many different types of awards are given:
– R: Research
– P: Center (coordinated multi-project)
– K: Career
– T: Training
– F: Fellowship
R01 = Research Project
– Discrete, specified, circumscribed project, performed by the named investigator(s), in a specific area of expertise
P01 = Research Program Project
– Broad-based, multi-disciplinary, long-term; large groups under the direction of an established researcher; specific coordinated objectives; each project supports a common theme
– Assumption: the "whole" > the sum of its parts
"Solicited" vs. "Unsolicited"
– Solicited = grants submitted in response to Funding Announcements or Program Announcements (specific $ set aside for funding)
– Unsolicited = everything else; "Investigator Initiated" is a synonym
– This analysis was focused on "unsolicited" P01s and R01s
– Decision context: the 2007 moratorium on unsolicited P01s, except "renewals"
Evaluation Plan
Five core questions:
1. What is the overall investment in the unsolicited P01 program?
2. Are P01s able to achieve scientific outcomes that are greater than the sum of their parts?
3. Do P01s achieve synergy among subprojects and with their home institutions?
4. What are the key roadblocks/challenges inherent in P01s?
5. Is there a typical "natural history" of P01s?
Phase 1 – Answer as many questions as possible by Dec 2008 using available data, then decide how to move forward with additional phases.
Compare Unsolicited P01s to Unsolicited R01s
The average P01 has 3x as many projects as an R01. Are P01s 3x as productive? How do we identify "the right" P01s to compare?
Unsolicited P01 profile at NIEHS (number of P01s by renewal count)
– 0 renewals: 29
– 1 renewal: 17
– 2 renewals: 6
– 3 renewals: 5
– 4 renewals: 2
– 5+ renewals: 4
NIEHS P01 Science
These categories are an abstraction of the PCC science codes, adapted from the T32 program analysis done in 2006.
Goal: Choose a Reasonable set of R01s for Comparison
Challenges (1): Variation in data quality
– The IMPAC II data system improved significantly over time
– Publication data, and especially publication data linked to grants, has improved considerably in the past 5 years
– PI track record of citing grants in publications improves over time
Responses
– Narrowed our detailed analysis to 23 P01 grants active 2002-2007 (excluded the one that started in 2007)
– Divided the cumulative # of publications by the # of years a grant had been operating (a sketch of this normalization follows)
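A minimal sketch of the per-year normalization described above, assuming a simple grant-level table of cumulative publication counts and start years (column names and values are hypothetical, not actual IMPAC II fields):

```python
import pandas as pd

# Hypothetical grant-level table; real fields would come from IMPAC II and
# the grant-publication linkage, which improved over the study period.
grants = pd.DataFrame({
    "grant_id":        ["P01-A", "P01-B", "R01-X", "R01-Y"],
    "mechanism":       ["P01",   "P01",   "R01",   "R01"],
    "first_year":      [2002,    2003,    2002,    2004],
    "cumulative_pubs": [120,     85,      30,      22],
})

END_YEAR = 2007  # end of the 2002-2007 analysis window

# Years each grant had been operating within the window.
grants["years_active"] = END_YEAR - grants["first_year"] + 1

# Divide cumulative publications by years active, as on the slide.
grants["pubs_per_year"] = grants["cumulative_pubs"] / grants["years_active"]

print(grants.groupby("mechanism")["pubs_per_year"].mean())
```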
Challenges (2): How to find a "scientific" match
– Nearest-neighbor matching, assisted by eSPA/Discovery Logic (see the sketch below)
  – A mathematical, "Google-style" context-matching approach: focuses on words that are unique relative to a broader set
  – If a P01 had multiple science areas across its subprojects, we tried to match each area
– Vetting with Program Officers
  – Provided 5-10 potential matches; each was approved or disapproved
  – Key criterion: "Would they publish in similar journals?"
– Given overlaps in science, some R01s matched many P01s; resolving these multiple matches was tricky
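The eSPA/Discovery Logic tool is proprietary, but the "unique words compared to a broader set" idea corresponds to TF-IDF weighting with cosine-similarity nearest neighbors. A minimal sketch of that general technique (not the actual tool), assuming grant abstracts are available as plain text with hypothetical IDs:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical abstracts keyed by grant ID; real input would be grant abstracts or specific aims.
p01_abstracts = {"P01-A": "arsenic exposure and airway epithelial signaling in asthma models"}
r01_abstracts = {
    "R01-X": "arsenic metabolism and oxidative stress in lung tissue",
    "R01-Y": "developmental neurotoxicity of organophosphate pesticides in zebrafish",
}

# Fit TF-IDF on the combined corpus so words that are rare in the broader set
# (and therefore distinctive) receive high weight.
corpus = list(p01_abstracts.values()) + list(r01_abstracts.values())
vectorizer = TfidfVectorizer(stop_words="english")
vectorizer.fit(corpus)

p01_vecs = vectorizer.transform(p01_abstracts.values())
r01_vecs = vectorizer.transform(r01_abstracts.values())

# For each P01, rank R01s by cosine similarity and keep the top candidates
# as a short list for Program Officers to approve or reject.
similarities = cosine_similarity(p01_vecs, r01_vecs)
r01_ids = list(r01_abstracts)
for i, p01_id in enumerate(p01_abstracts):
    ranked = sorted(zip(r01_ids, similarities[i]), key=lambda pair: pair[1], reverse=True)
    print(p01_id, "candidate R01s:", ranked[:10])
```

A real pipeline would also need a rule for R01s that score highly against several P01s, for example assigning each R01 only to its single best-scoring P01.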
Challenges (3)
Varying lengths of P01 programs
– Chose longer R01s when possible to ensure valid comparisons, but this is a study weakness
– Only R01s that began before 2006 were eligible
Small number in the study set (23) limited the comparisons
– Aggregated the results: compared the products of the 23 P01s to the products of the 98 R01s, rather than a matched case-control analysis (see the sketch below)
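A sketch of what the aggregated comparison might look like, assuming a per-grant table like the one in the earlier sketch (numbers are again hypothetical); the point is that the two sets are pooled and compared as wholes rather than pair by pair:

```python
import pandas as pd

# Hypothetical per-grant outputs; in the actual study there were 23 P01s and 98 R01s.
df = pd.DataFrame({
    "mechanism":     ["P01", "P01", "P01", "R01", "R01", "R01", "R01", "R01"],
    "pubs_per_year": [18.0, 22.5, 15.0, 5.0, 6.5, 4.0, 7.0, 5.5],
    "n_subprojects": [4, 3, 5, 1, 1, 1, 1, 1],
})

# Pool each set and compare aggregate productivity, rather than testing matched pairs.
agg = df.groupby("mechanism").agg(
    grants=("pubs_per_year", "size"),
    total_pubs_per_year=("pubs_per_year", "sum"),
    total_subprojects=("n_subprojects", "sum"),
)
agg["pubs_per_year_per_subproject"] = agg["total_pubs_per_year"] / agg["total_subprojects"]
print(agg)
```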
Summary of the Decisions
– Narrowed the analytical set to the 23 P01s active 2002-07
– Identified "matching R01s": the Discovery Logic "nearest neighbor" approach generated candidates, and Program Officers helped narrow, select, and identify better matches; 98 R01s were selected, and each P01 had 3-5 scientific matches
– Completed the analysis on the aggregated sets
– Included the solicited P01s as another reasonable comparison set for the unsolicited P01s
Questions
1. Was the comparison group reasonable?
2. What would we have gained/lost by doing a matched case-control analysis?
3. Are there other methods we should consider, such as propensity scores or regression discontinuity analysis? (An illustrative sketch of propensity-score matching follows.)
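For question 3, a minimal illustration of what a propensity-score approach could look like here: model the probability that a grant is a P01 from observable covariates, then match each P01 to the R01 with the nearest score. Covariate names and values are hypothetical, and this is not part of the evaluation described above:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical pooled table of P01s and R01s; is_p01 is the "treatment" indicator.
df = pd.DataFrame({
    "is_p01":       [1,   1,   0,   0,   0,   0],
    "annual_cost":  [1.8, 1.2, 0.4, 0.9, 1.1, 0.5],   # $M, illustrative
    "years_active": [6,   5,   6,   4,   5,   6],
    "science_area": [2,   1,   2,   1,   2,   3],      # coded science category
})

# Estimate the propensity of being a P01 from the covariates.
X = pd.get_dummies(df[["annual_cost", "years_active", "science_area"]],
                   columns=["science_area"])
model = LogisticRegression(max_iter=1000).fit(X, df["is_p01"])
df["propensity"] = model.predict_proba(X)[:, 1]

# 1:1 nearest-neighbor matching on the propensity score (with replacement).
p01s = df[df["is_p01"] == 1]
r01s = df[df["is_p01"] == 0]
for idx, row in p01s.iterrows():
    match_idx = (r01s["propensity"] - row["propensity"]).abs().idxmin()
    print(f"P01 row {idx} matched to R01 row {match_idx}")
```

With only a handful of covariates and grants this is purely illustrative; the small study set (23 P01s) would make such matching fragile in practice, which is one reason the aggregated comparison was used instead.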
P01 Evaluation Committee Members
Barnes, Martha; Drew, Christie; Eckert-Tilotta, Sally; Gray, Kimberly; Lawler, Cindy; Loewe, Michael; Mastin, Pat; Nadadur, Srikanth; Kirshner, Annette; Phelps, Jerry; Puente, Molly; Reinlib, Leslie
Additional Participants (Project Officers): Jerry Heindel, Kim McAllister, Claudia Thompson, Fred Tyson
Questions? Thank you!