Out of Control? Selecting Comparison Groups for Analyzing NIH Grants and Grant Portfolios
American Evaluation Association Meeting
Saturday, November 14, 2009

Session Purpose
Explore the choices we make relating to comparison groups in a science management context
Examples drawn from different NIH Institutes/Centers
Variety of contexts
Focused on the methodology of comparison group selection rather than results of particular evaluations
Opportunity to learn about efforts to select comparison groups and to discuss the strengths and weaknesses of the various methodological choices with other evaluation experts

Session Overview
3:30 – 3:35: Introduction
3:35 – 3:50: Christie Drew: Unsolicited P01s at NIEHS (P01 = multi-program project grant)
3:50 – 4:05: Jamelle Banks: NIH-Funded Research in the Context of a Scientific Field (NICHD)
4:05 – 4:20: Milton Hernandez: NIH Loan Repayment: Regression Discontinuity Analysis (OER)
4:20 – 4:35: Wesley Schultz: Use of Propensity Scores in a Longitudinal Science Study of Minority Biomedical Research Support (Cal State San Marcos/NIGMS)
4:35 – 5:00: Discussion

Common themes
Establishing comparability
Use of information technology
Compromises
Others…

Establishing a Comparison Set for Evaluating Unsolicited P01s at the National Institute of Environmental Health Sciences
AEA, November 14, 2009
Christie Drew – Martha Barnes, Jerry Phelps, Pat Mastin

Overview
Brief overview of the P01 grant mechanism and study goals
Finding a comparison group
– Key challenges
– Approach

NIH Extramural Grant Context
Many different types of awards are given:
– R: Research
– P: Center (coordinated multi-project)
– K: Career
– T: Training
– F: Fellowship
R01 = Research Project
– Discrete, specified, circumscribed project, performed by the named investigator(s), in a specific area of expertise
P01 = Research Program Project
– Broad-based, multi-disciplinary, long-term; large groups under the direction of an established researcher; specific coordinated objectives; each project supports a common theme
– Assumption: the "whole" > sum of its parts

"Solicited" v. "Unsolicited"
Solicited = grants submitted in response to Funding Announcements or Program Announcements (specific $ set aside for funding)
Unsolicited = everything else; "Investigator Initiated" is a synonym
This analysis was focused on "unsolicited" P01s and R01s
Decision context: 2007 moratorium on unsolicited P01s, except "renewals"

Evaluation plan
Five Core Questions:
1. What is the overall investment in the unsolicited P01 program?
2. Are P01s able to achieve scientific outcomes that are greater than the sum of their parts?
3. Do P01s achieve synergy among subprojects and with their home institutions?
4. What are the key roadblocks/challenges inherent in P01s?
5. Is there a typical "natural history" of P01s?
Phase 1 – Answer as many questions as possible by Dec 2008 using available data. Decide how to move forward with additional phases.

Compare Unsolicited P01s to Unsolicited R01s
The average P01 has 3x as many projects as an R01. Are P01s 3x as productive?
How do we identify "the right" P01s to compare?

Unsolicited P01 profile at NIEHS, by number of renewals:
0 renewals (29)
1 renewal (17)
2 renewals (6)
3 renewals (5)
4 renewals (2)
5+ renewals (4)

NIEHS P01 Science
These categories are an abstraction of the PCC Science codes – adapted from the T32 program analysis done in 2006.

Goal: Choose a Reasonable Set of R01s for Comparison

Challenges (1)
Variation in data quality
– The IMPAC II data system improved significantly over time
– Publication data, and especially publication data linked to grants, has improved considerably in the past 5 years
– PI track record of citing grants in publications improves over time
Responses
– Narrowed our detailed analysis to 23 active P01 grants (excluded the one that started in 2007)
– Divided the cumulative # of publications by the # of years a grant had been operating (sketched below)
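A minimal sketch of that per-year normalization, assuming each grant record carries a cumulative linked-publication count and a start year; the cutoff year and the example numbers are hypothetical, not study data:

```python
def pubs_per_year(cumulative_pubs: int, start_year: int, as_of_year: int = 2008) -> float:
    """Normalize a grant's cumulative publication count by its years of operation,
    so grants that began in different years can be compared on the same footing."""
    years_active = max(as_of_year - start_year, 1)  # guard against a zero-year denominator
    return cumulative_pubs / years_active

# Hypothetical example: a P01 funded since 1998 with 120 linked publications
print(f"{pubs_per_year(120, 1998):.1f} publications per grant-year")  # -> 12.0
```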

Challenges (2)
How to find a "scientific" match
– Nearest-neighbor match, assisted by eSPA/Discovery Logic: a mathematical, "google style" context-matching approach that focuses on words unique to a grant relative to the broader set (see the sketch below)
– If a P01 had multiple science areas across its subprojects, tried to match each area
– Vetting with Program Officers: provided 5-10 potential matches, and POs approved/disapproved each
– Key criterion: "Would they publish in similar journals?"
Given overlaps in science, some R01s matched many P01s; resolving those multiple matches was tricky
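The eSPA/Discovery Logic matching engine itself is not described here. As a rough illustration of the general idea, the sketch below ranks candidate R01s against a P01 by TF-IDF cosine similarity over project abstracts; the grant numbers and abstract text are made up, and scikit-learn stands in for whatever the actual system used:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical project abstracts keyed by grant number (placeholders, not real grants)
p01_abstracts = {"P01ES-EXAMPLE": "oxidative stress and airway inflammation in susceptible populations ..."}
r01_abstracts = {
    "R01ES-EXAMPLE-1": "ozone exposure and lung inflammation in a mouse model ...",
    "R01ES-EXAMPLE-2": "arsenic metabolism and skin cancer risk in exposed cohorts ...",
}

def candidate_matches(p01_text, r01_texts, top_n=5):
    """Rank R01s by text similarity to a P01 description.

    TF-IDF up-weights words that are distinctive to a document relative to the
    broader corpus; that is the spirit of the "google style" context matching above."""
    ids = list(r01_texts)
    matrix = TfidfVectorizer(stop_words="english").fit_transform(
        [p01_text] + [r01_texts[i] for i in ids]
    )
    scores = cosine_similarity(matrix[0:1], matrix[1:]).ravel()
    return sorted(zip(ids, scores), key=lambda pair: pair[1], reverse=True)[:top_n]

# The top-ranked candidates would then go to Program Officers for approval/disapproval
for p01_id, text in p01_abstracts.items():
    print(p01_id, candidate_matches(text, r01_abstracts))
```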

Challenges (3)
Varying lengths of P01 programs
– Chose longer R01s when possible to ensure valid comparisons, but this is a study weakness
– Only R01s that began before 2006 were eligible
Small number in the study set (23) limited the comparisons
– Aggregated the results – compared the products of 23 P01s to the products of 98 R01s, rather than a matched case-control analysis (see the sketch below)
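A minimal sketch of that aggregate-level comparison, using hypothetical per-grant publication rates in place of the real study data:

```python
import statistics

# Hypothetical pubs-per-grant-year values; the actual analysis used the
# 23 active unsolicited P01s and their 98 candidate-matched R01s.
p01_rates = [4.2, 6.8, 3.1, 5.5]
r01_rates = [1.9, 2.4, 1.1, 3.0, 2.2]

def summarize(label, rates):
    """Report aggregate productivity for one comparison set."""
    print(f"{label}: n={len(rates)}, mean={statistics.mean(rates):.2f}, "
          f"median={statistics.median(rates):.2f} pubs/grant-year")

# The two sets are contrasted as wholes rather than pairing each P01 with its matches
summarize("Unsolicited P01s", p01_rates)
summarize("Comparison R01s", r01_rates)
```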

Summary of the Decisions
Narrowed the analytical set to 23 active P01s
Identified "matching R01s": used the Discovery Logic "nearest neighbor" approach to identify candidates, and POs helped narrow/select better matches. Selected 98 R01s; each P01 had 3-5 scientific matches.
Analysis completed on the aggregated sets
Included the solicited P01s as another reasonable comparison set for the unsolicited P01s

Questions
1. Was the comparison group reasonable?
2. What would we have gained/lost by doing a matched case-control analysis?
3. Are there other methods, such as propensity scores or regression discontinuity analysis?

P01 Evaluation Committee Members
Barnes, Martha
Drew, Christie
Eckert-Tilotta, Sally
Gray, Kimberly
Lawler, Cindy
Loewe, Michael
Mastin, Pat
Nadadur, Srikanth
Kirshner, Annette
Phelps, Jerry
Puente, Molly
Reinlib, Leslie
Additional Participants (Project Officers)
Jerry Heindel
Kim McAllister
Claudia Thompson
Fred Tyson

Questions? Thank you!