Training Program Evaluation: Creating a Composite Indicator to Measure Career Outcomes — National Institutes of Health / National Cancer Institute & Thomson Reuters.


Training Program Evaluation: Creating a Composite Indicator to Measure Career Outcomes
National Institutes of Health / National Cancer Institute & Thomson Reuters
October 27, 2012
Presenter: Leo DiJoseph. Session Chair: Joshua Schnell.

©2012 Thomson Reuters

Acknowledgements
National Cancer Institute (NCI) / Center for Cancer Training (CCT): Jonathan Wiest, Julie Mason, Ming Lei, Jessica Faupel-Badger, Erika Ginsburg.
Discovery Logic / Thomson Reuters: Yvette Seger, Leo DiJoseph, Joshua Schnell, Laure Haak.

What did we evaluate? The NCI K program
NCI/CCT administers grant mechanisms (K Awards) intended to stimulate the career development of biomedical researchers (cf. the session by Julie Mason). Cohort summary:
– Fiscal Years 1970 to 2008
– 7 grant mechanisms
– 2,889 principal investigators: 1,204 awardees (35%), 1,685 non-awardees

What was the “view through the Logiscope”?
[Figure: NCI K Award program evaluation logic model]

What data did we collect?
Independent variables: individual demographics, prior training, sponsoring institution, application timeline, primary mechanism, and funding status of the K Award.
Dependent variables: grant activity, publications (productivity and quality), clinical trials, professional society memberships, committee service, health care practice, and NIH employment.
We defined a full cohort with selection rules, and a “bubble” cohort with propensity score matching, keeping individuals with p(Award = Yes) ≈ 0.5.
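The “bubble” selection can be sketched as follows. This is illustrative only: the function, the ID names, and the score values are placeholders, and the slide does not specify how the propensity scores themselves were fit (a logistic regression of award status on the independent variables is one common choice).

```python
# Sketch: select a "bubble" cohort of individuals whose estimated propensity
# of receiving a K award is close to 0.5. Scores are assumed precomputed.
def bubble_cohort(propensity, width=0.1):
    """Return the IDs whose propensity score lies within `width` of 0.5."""
    return [pid for pid, p in propensity.items() if abs(p - 0.5) <= width]

# Hypothetical scores for four principal investigators:
scores = {"pi_a": 0.45, "pi_b": 0.92, "pi_c": 0.55, "pi_d": 0.20}
print(bubble_cohort(scores))  # → ['pi_a', 'pi_c']
```

The intuition: near p ≈ 0.5, awardees and non-awardees were observably similar at application time, so outcome differences are more plausibly attributable to the award itself.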

What happened as we studied the data?
– Phase I: tabular and graphical descriptive summaries
– Phase II: linear and logit regression modeling and hypothesis testing of specific analysis questions
– Phase III: interpretation and revision, leading to… a problem
Missing data problem: recall was high for NIH grants, but lower for grants from other sources and for other outcomes (publications, patents, etc.). Because each outcome was analyzed separately, a larger fraction of the cohort was affected by at least one recall issue.

What was a typical missing data pattern?
[Chart: missing-data patterns across the Grants, Committees, and Publications outcomes (data missing vs. data available)]
– Missing at least one: 60%
– Missing all: 41%
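Summary figures like these can be tallied directly from per-person outcome records. The record layout below is an illustrative assumption, not the study's actual schema:

```python
# Sketch: tally missing-data patterns over the three outcome sources.
# `None` marks an outcome that could not be matched for that person.
OUTCOMES = ("grants", "committees", "publications")

def missingness(records):
    """Return (share missing at least one outcome, share missing all)."""
    at_least_one = sum(any(r[o] is None for o in OUTCOMES) for r in records)
    all_missing = sum(all(r[o] is None for o in OUTCOMES) for r in records)
    n = len(records)
    return at_least_one / n, all_missing / n

people = [
    {"grants": 3, "committees": None, "publications": 12},
    {"grants": None, "committees": None, "publications": None},
    {"grants": 1, "committees": 2, "publications": 5},
    {"grants": None, "committees": 1, "publications": None},
]
print(missingness(people))  # → (0.75, 0.25)
```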

What did we add to the model to compensate for missing outcome data?
– “Is (a) Researcher”: combines all sources of information about subsequent funded research activity.
– “Is Engaged”: captures any available indication of continued participation in the field, even if there is no evidence of funded research. The definition is restricted to {Is Researcher = No} cases, giving a three-level scale: not engaged, engaged, researcher.
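As a sketch, the three-level scale amounts to OR-ing per-source evidence flags with a precedence rule. The flag names below are illustrative placeholders (the actual evidence sources are listed on the following slides):

```python
# Sketch: derive the composite indicator from per-person boolean evidence
# flags. Field names are illustrative, not the study's actual schema.
RESEARCHER_SOURCES = ("nih_grant", "doe_grant", "icrp_grant",
                      "trial_key_personnel")
ENGAGED_SOURCES = ("nih_review_panel", "society_member",
                   "healthlink_listing", "faca_service", "medline_author")

def classify(person):
    """Map evidence flags to the scale: researcher > engaged > not engaged."""
    if any(person.get(s) for s in RESEARCHER_SOURCES):
        return "researcher"    # Is Researcher = Yes
    if any(person.get(s) for s in ENGAGED_SOURCES):
        return "engaged"       # Is Researcher = No, Is Engaged = Yes
    return "not engaged"       # no evidence found in any source
```

Note that the precedence rule enforces the restriction to {Is Researcher = No}: a person with any funded-research evidence is counted as a researcher even if engagement evidence also exists.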

Using the indicators, what did our cohort look like?

Indicator group | Individuals | % of cohort
Is Researcher = Yes | 1,555 | 54%
Is Engaged = Yes | 1,044 | 36%
Is Engaged = No | 290 | 10%

For comparison, the “not found” count for publications, as a single outcome, is 819 individuals.

How did we determine the indicator values?
Group 1: {Is Researcher = Yes}. Subsequent research activity with:
– NIH
– Department of Energy
– International Cancer Research Partnership (ICRP)
– Listed as key personnel on registered clinical trials

How did we determine the indicator values?
Group 2: {Is Engaged = Yes}. Subsequent research engagement, including:
– Participation on NIH review panels (non-grant outcomes)
– Membership in the Federation of American Societies for Experimental Biology (FASEB), the American Association for Cancer Research (AACR), or the American Society of Clinical Oncology (ASCO)
– Inclusion in the Healthlink physicians database
– Service on a Federal Advisory Committee (FACA), as reported through FIDO.gov

How did we determine the indicator values?
Group 3: {Is Engaged = Yes}. Individuals found as authors of articles in MEDLINE.

Did NCI K promote engagement?
Award status was crossed with the engagement indicator in a 2x2 contingency table (awardee engaged, awardee not engaged, non-awardee engaged, non-awardee not engaged) and tested with the Fisher exact test:

Total N | Odds Ratio (95% CI) | p value
1,… | (3.52, 8.34) | 8.2E-21

(… marks a value missing from the source.)
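A test of this kind can be reproduced with a self-contained two-sided Fisher exact test. The cell counts below are hypothetical, chosen only to illustrate the computation; they are not the study's actual counts:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact test and odds ratio for the 2x2 table
    [[a, b], [c, d]] (rows: awardee yes/no; columns: engaged yes/no)."""
    n, row1, col1 = a + b + c + d, a + b, a + c
    def p_table(x):
        # Hypergeometric probability of a table whose top-left cell is x,
        # holding the observed row and column margins fixed.
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = p_table(a)
    lo, hi = max(0, row1 - (n - col1)), min(row1, col1)
    # Two-sided p: total probability of tables no more likely than observed.
    p = sum(p_table(x) for x in range(lo, hi + 1)
            if p_table(x) <= p_obs * (1 + 1e-9))
    odds = (a * d) / (b * c) if b * c else float("inf")
    return odds, min(p, 1.0)

# Hypothetical counts: 90/10 engaged among awardees, 60/40 among non-awardees.
odds, p = fisher_exact_2x2(90, 10, 60, 40)
print(round(odds, 2), p < 0.001)  # odds ratio 6.0, strongly significant
```

For tables as large as this cohort (thousands of individuals), `scipy.stats.fisher_exact` is the more practical choice; the pure-Python version above is just a transparent sketch of what the test computes.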

How well did specific grant programs do?
There is some uncertainty for the smaller mechanisms, but the odds of engagement are generally significantly higher for awardees.

Grant | Total N | Odds Ratio (95% CI) | p value
K… | … | (2.00, 15.08) | …
K… | … | (2.44, 42.81) | 3.3E-05
K… | … | (2.75, 13.17) | 2.2E-08
K… | … | (1.16, 28.06) | …
K… | … | (0.87, 18.76) | …
K… | … | (0.37, 18.92) | …
K… | … | (0.97, 61.53) | 0.0414

(… marks a value missing from the source.)

Was the same effect present for the Bubble?
The results for the propensity-score-matched subset were not conclusive, except for the K08 mechanism.

Grant | Total N | Odds Ratio (95% CI) | p value
K… | … | (0.28, 23.42) | …
K07 | 30 | ∞ (0.62, ∞) | …
K08 | … | (1.47, 31.89) | 0.005
K11 | 0 | N/A | N/A
K… | … | (0.17, …) | …
K… | … | (0.12, 98.21) | 1
K… | … | (0.002, 14.84) | 1

(… marks a value missing from the source.)

Conclusions
– The composite indicator of engagement was effective in compensating for recall issues in matching to career outcome data.
– The composite indicator established a clear level of success for the K Award program, even for individuals for whom direct evidence of subsequent funded research was not easy to obtain.

Contact Information
Title: Training Program Evaluation: Creating a Composite Indicator to Measure Career Outcomes
Presenter: Leo DiJoseph
Chair: Joshua Schnell