Trial Registration: Lessons from ClinicalTrials.gov


1 Trial Registration: Lessons from ClinicalTrials.gov
Deborah A. Zarin, M.D. Director, ClinicalTrials.gov

2 Evidence of Problems with the Clinical Research Enterprise (CRE)
One practical problem: potential participants had trouble finding trials. Three scientific problems: not all trials are published; not all outcome measures (or adverse events) are published; changes to protocols are not always acknowledged.

3 Some Motivating Scandals
Paxil: GSK suppressed evidence on harms and lack of efficacy in children. Vioxx: Merck failed to report heart attacks. Celebrex: Pfizer reported misleading results.

4 Zarin CTSA Webinar (10-21-09)
CLASS Study (Celebrex). Source: Jüni P, Rutjes AW, Dieppe PA. BMJ. 2002 Jun 1;324(7349):1287-8.

5 ENHANCE (NCT00552097): Prespecified Endpoints
Source: Kastelein JJ et al. Am Heart J. 2005 Feb;149(2):234-9.

6 “…it appears that the study itself was not registered with ClinicalTrials.gov until October 31, 2007, a full 18 months after completion of the study. In addition, the endpoint indicated in the ClinicalTrials.gov web site [1] appears to differ from the endpoint described in the initial study design. [2]”

7 What do we know now? Selective reporting and “publication bias” are worse than previously believed. Unacknowledged (post-hoc) changes to protocol elements are rampant in the literature and undermine fundamental conclusions.

8

9 Summary of Findings: Fewer than half of NIH-funded trials registered at ClinicalTrials.gov after September 2005 and completed by December 2008 were published in a peer-reviewed biomedical journal indexed by MEDLINE within 30 months of trial completion. At a median of 51 months after study completion, a third of NIH-funded trials in the sample remained unpublished. Source: Ross JS et al. BMJ 2012;344:d7292.

10 Trial Transparency: The Big Picture
Zarin DA, Tse T. PLoS Med

11 History of ClinicalTrials.gov
FDAMA* Section 113 (1997) mandates a registry of Investigational New Drug application (IND) trials for serious or life-threatening diseases or conditions. ClinicalTrials.gov launched in February 2000. Calls for increased transparency of clinical trials: Maine state law; State Attorneys General; International Committee of Medical Journal Editors (ICMJE) statement (2004). ClinicalTrials.gov accommodates other policies. FDAAA† Section 801 (2007) expands the registry and adds results reporting requirements; Final Rule issued in September 2016. (* Food and Drug Administration Modernization Act of 1997. † Food and Drug Administration Amendments Act of 2007.)

12 About ClinicalTrials.gov
Clinical studies registry and results database: over 237,000 studies in all 50 states and 195 countries; privately and publicly funded studies involving humans; study information submitted by study sponsors or investigators. Web site and registry launched in February 2000; results database launched in September 2008; over 24,000 studies with results. Intended audience: registry, the public; results database, readers of the medical literature. Usage: 199M+ page views per month; 76,000 unique visitors per day.

13 Registry Record
Key Protocol Details: intervention(s) and outcome measure(s); eligibility details. Recruitment Information. Administrative Information (includes key dates). The record is expected to be corrected or updated throughout the trial's life cycle.

14 Archival Data: Tracking Changes in the Record
Each record is expected to be corrected or updated throughout the trial's life cycle, and all changes are tracked on a public archive site that is accessible from each record (through a “History of Changes” link). The archived tabular view shows the current outcome measures alongside the original (first registered) outcome measures. Source: Zarin DA et al. N Engl J Med. 2011 Mar 3;364(9):852-60.

15 De Angelis C et al. Clinical trial registration: a statement from the International Committee of Medical Journal Editors. N Engl J Med. 2004 Sep 16;351(12):1250-1.

16 Zarin DA, Tse T, Ide NC. Trial registration at ClinicalTrials.gov between May and October 2005. N Engl J Med. 2005 Dec 29;353(26):2779-87.

17

18 [Chart: weekly trial registration rates over time, rising from roughly 25-30 per week to 200-250 per week after the ICMJE policy took effect, and to more than 600 per week after FDAAA was enacted.]

19 Three Key Trial Disclosure Policies
ICMJE Policy (effective in 2005): scope is registration only; all phases; all intervention types; any funding source; enforced by refusal to publish (not legally binding). FDAAA Final Rule (issued in 2016): registration and results reporting; not Phase 1; drugs, biologics, and devices regulated by the FDA; any funding source; enforced by criminal proceedings and civil penalties (up to $10,000/day) and loss of HHS funding. Final NIH Policy: registration and results reporting; all phases; all intervention types (e.g., including behavioral interventions); NIH funding; enforced by loss of NIH funding.

20 Policies and laws are not sufficient:

21 Levels of Specification: ClinicalTrials.gov Initial Entry vs. Publication
Domain (e.g., “anxiety”): Depression. Specific measurement (e.g., “Hamilton Anxiety Rating Scale”): HAM-D (Hamilton Depression Rating Scale) in the initial registry entry vs. MADRS (Montgomery-Asberg Depression Rating Scale) in the publication. Specific metric (e.g., “change from baseline”): N/A in the initial registry entry vs. MADRS score ≥13 during the time frame in the publication. Method of aggregation (e.g., “proportion of participants with decrease ≥50%”): percentage of participants with the specific metric. Time frame (e.g., “12 weeks”): 24 weeks in the initial registry entry vs. 50 weeks after receiving the intervention for participants with HCV genotype 1 or 4, or 26 weeks after receiving the intervention for participants with HCV genotype 2 or 3, in the publication. Source: Zarin DA, Tse T. Trust but verify: trial registration and determining fidelity to the protocol. Ann Intern Med. 2013 Jul 2;159(1):65-7.

22

23 Can We Obtain a Unique List of all Initiated Trials?

24 Source: Zarin DA et al. N Engl J Med. 2017 Jan 26;376(4):383-391.

25 Are all trials being registered?
Study: reviewed all registered trials. “On Time” = registered within 3 weeks of the study start date; “Late” = registered more than 3 weeks after the study start date. Findings: 33% (16,342/49,751) of trials were “Late.” Variation within “Late” by funder type: 23.5% industry, 24.9% NIH, 38.7% other (e.g., academic). 57% of “Late” registrations were more than 12 months late. Source: Zarin DA et al. N Engl J Med. 2017 Jan 26;376(4):383-391.
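As an illustration of the “On Time” vs. “Late” classification described on this slide, here is a minimal sketch in Python. The 3-week threshold mirrors the slide's definition, but the record layout, field names, and NCT IDs are hypothetical examples, not the actual ClinicalTrials.gov data format.

```python
from datetime import date, timedelta

# 3-week window used on the slide to define "On Time" registration
ON_TIME_WINDOW = timedelta(weeks=3)

def classify_registration(study_start: date, first_registered: date) -> str:
    """Classify a registration as 'On Time' or 'Late'.

    'On Time' = registered within 3 weeks of the study start date;
    anything later counts as 'Late' (per the slide's definition).
    """
    return "On Time" if first_registered <= study_start + ON_TIME_WINDOW else "Late"

# Hypothetical example records (NCT IDs and dates invented for illustration)
records = [
    {"nct_id": "NCT00000001", "study_start": date(2014, 1, 6), "first_registered": date(2014, 1, 20)},
    {"nct_id": "NCT00000002", "study_start": date(2013, 5, 1), "first_registered": date(2014, 9, 15)},
]

for r in records:
    print(r["nct_id"], classify_registration(r["study_start"], r["first_registered"]))
```

Run over a full export of registry records, counting the two labels by funder type would reproduce the kind of breakdown reported above.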

26 NCT00136318 - Initial and Updated Entries for Primary Outcome Measures
Same comparison of the initial ClinicalTrials.gov entry vs. the publication as shown on slide 21. Source: Zarin DA, Tse T. Trust but verify: trial registration and determining fidelity to the protocol. Ann Intern Med. 2013 Jul 2;159(1):65-7.

27 Wager E, Williams P; Project Overcome failure to Publish nEgative fiNdings Consortium. "Hardly worth the effort"? Medical journals' policies and their editors' and publishers' views on trial registration and publication bias: quantitative and qualitative study. BMJ. 2013 Sep 6;347:f5248.

28 Are Registry Entries Specific Enough to Enable Detection of Important post-hoc Changes?

29 How specific are entries?
Issue: do registered primary outcome measures (POMs) provide sufficient specificity for evaluating the fidelity of published reports to the protocol? Study: sample of 40 NEJM and 40 JAMA articles (80 total); compared POMs across the articles, the ClinicalTrials.gov registry records, and the full protocols. Source: Zarin DA et al. N Engl J Med. 2017 Jan 26;376(4):383-391.

30 Four Levels of Specification in Reporting Outcome Measures
Level 1, Domain: Anxiety, Depression, Schizophrenia. Level 2, Specific Measurement: Beck Anxiety Inventory, Hamilton Anxiety Rating Scale, Fear Questionnaire. Level 3, Specific Metric: end value, change from baseline, time to event. Level 4, Method of Aggregation: continuous (mean, median) or categorical (proportion of participants with decrease ≥50%, proportion of participants with decrease ≥8 points). Together these yield a description of the measure at a specified time. Source: Zarin DA et al. N Engl J Med. 2011 Mar 3;364(9):852-60.
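To make the levels above concrete, here is a small illustrative sketch in Python of an outcome-measure record with one field per level of specification (plus the time frame). The class and field names are invented for illustration and are not part of any ClinicalTrials.gov schema; the example values are drawn from this slide.

```python
from dataclasses import dataclass

@dataclass
class OutcomeMeasure:
    """One outcome measure, with one field per level of specification."""
    domain: str        # Level 1: e.g., "Depression"
    measurement: str   # Level 2: e.g., "Hamilton Depression Rating Scale (HAM-D)"
    metric: str        # Level 3: e.g., "Change from baseline"
    aggregation: str   # Level 4: e.g., "Mean"
    time_frame: str    # e.g., "12 weeks"

    def is_fully_specified(self) -> bool:
        # A registry entry can only be checked against a publication
        # when every level (and the time frame) is present.
        return all([self.domain, self.measurement, self.metric,
                    self.aggregation, self.time_frame])

# Illustrative example using the levels named on this slide
pom = OutcomeMeasure(
    domain="Depression",
    measurement="Hamilton Depression Rating Scale (HAM-D)",
    metric="Change from baseline",
    aggregation="Mean",
    time_frame="12 weeks",
)
print(pom.is_fully_specified())  # True
```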

31 POMs Registered with Sufficient Specificity
Findings (N = 83 trials and 101 POMs). Levels of specificity: 0% domain only; +11.9% specific measurement; +42.6% specific metric; +45.6% method of aggregation; 94.1% with a specific time frame. Two inconsistencies: (1) two registrations with two different POMs for one article, published as a POM and as a secondary outcome measure (SOM); (2) discrepant analysis population for a POM: transplant recipients (article) vs. kidney donors (registry). Source: Zarin DA et al. N Engl J Med. 2017 Jan 26;376(4):383-391.

32 Policies and Laws Are Not Sufficient: Culture Change Requires Joint Efforts

33 Actions for Stakeholders
Funders: identify gaps and potential overlaps before funding new research; hold awardees accountable for accurate, timely, and complete reporting. Institutional Review Boards: identify past and ongoing trials that may inform the potential risks and benefits of proposed trials. Academic Medical Centers: provide scientific leadership and institutional resources to support trial reporting by investigators; take responsibility for ensuring all sponsored trials are reported; create educational resources to support quality trial documentation as part of training, and provide incentives for high-quality reporting.

34 Actions for Stakeholders
Trialists: search for similar trials (landscape analysis) before starting a trial; register and report results with specificity and accuracy. Journal Editors/Peer Reviewers: ensure prospective and complete trial registration occurred before publishing trial results; verify that data submitted in the manuscript are consistent with the prespecified registration and/or that discrepancies are explained. Meta-Researchers: continue characterizing and monitoring the clinical research enterprise. ClinicalTrials.gov & Other Databases: continue to improve user interfaces and enhance the resource materials available to users; continue to improve search interfaces.

35 Challenges Outside of Medical Domain
Need very strong incentives, along with major culture change. There is no "barrier" to starting (or repeating) a study: how can registration be enforced before a study begins? How can one inhibit the conduct of multiple studies with after-the-fact registration of the "favorite"? There are many journals, and registration only works if all of them require it. No official oversight or legal framework exists.

36 Potential Benefits of Registration (even if not universal)
Helps to define "best practices." Incentivizes specification of protocol details (there is evidence that this has improved clinical research in some areas). Allows for meta-research, though with risks of various biases depending on the analytic question.

