1 Changing Trial Designs on the Fly. Janet Wittes, Statistics Collaborative. ASA/FDA/Industry Workshop, September 2003

2 Context
- Trial that is hard to redo
- Serious aspect of serious disease
- Orphan

3 Statistical rules limiting changes
- To preserve the Type I error rate
- To protect the study from technical problems arising from operational meddling

4 Challenge: sense, rigor

5 [figure]

6 Challenge: senseless rigor mortis

7 Scale of rigor
- Overly rigid
- Rigorous: prespecified methods for change (preserves the Type I error rate)
- Unprespecified but reasonable change
- Invalid analysis: responders analysis, outcome-outcome analysis, completers analysis

8 Consequences
- No change during the study, OR
- Potential for the perception that the change was caused by the observed effect

9 Prespecified changes
- Sequential analysis
- Stochastic curtailment
- Futility analysis
- Internal pilot studies
- Adaptive designs
- Two-stage designs
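Of the prespecified mechanisms listed above, the internal pilot study is perhaps the simplest to sketch: re-estimate a nuisance parameter (here the outcome SD) partway through the trial and recompute the sample size, leaving the treatment effect unexamined. A minimal illustration with made-up numbers (a planned SD of 10, a pilot estimate of 13), not anything from the slides themselves:

```python
import math
from statistics import NormalDist

def n_per_arm(sigma, delta, alpha=0.05, power=0.90):
    """Per-arm sample size for a two-sample z-test of means:
    n = 2 * (z_{1-alpha/2} + z_{power})^2 * (sigma / delta)^2."""
    z = NormalDist().inv_cdf
    return math.ceil(2 * (z(1 - alpha / 2) + z(power)) ** 2 * (sigma / delta) ** 2)

# Design stage: the protocol assumes SD = 10 to detect a difference of 5.
n_planned = n_per_arm(sigma=10, delta=5)   # 85 per arm

# Internal pilot: the pooled SD estimated from early data is larger (13),
# so the sample size is re-estimated without unblinding treatment effects.
n_revised = n_per_arm(sigma=13, delta=5)   # 143 per arm
```

Because the pilot estimate pools across arms, the look reveals nothing about efficacy, which is why this kind of change can be prespecified without spending Type I error.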

10 Problems
- Technical: solved
- Operational: risks accepted
- Efficiency: understood

11 Add a DMC
- What if it acts inconsistently with guidelines?
- What if something really unexpected happens?
- DMC initiates change
- Steering Committee initiates change

12 Reasons for unanticipated changes
- Unexpected high-risk group
- Changed standard of care
- Statistical method defective
- Too few endpoints
- Assumptions of trial incorrect
- Other

13 Examples
1. Too much censoring; DMC extends trial
2. Boundary not crossed but DMC stops
3. Unexpected adverse event
4. Statistical method defective
5. Event rate too low; DMC changes design

14 #1. Endpoint-driven trial
- Trial designed to stop after 200 deaths
- Observations differed from expectations: recruitment, mortality rate
- At 200 deaths, follow-up for many people < 2 months
- DMC: change follow-up to a minimum of 6 months
- P-value: 0.20 planned; at end

15 #2. Boundary not crossed
- Endpoints: primary, 7-day MI; secondary, one-year mortality
- Very stringent boundary

16 What DMC sees
- Very strong result at 7 days
- No problem at 1 year
- Clear excess of serious adverse events

17 Haybittle-Peto bound (10%)

18 Haybittle-Peto bound (30%)

19 Haybittle-Peto bound (50%)

20 Haybittle-Peto bound (70%)

21 Haybittle-Peto bound (70%)
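Slides 17-21 are figures of the Haybittle-Peto bound at what appear to be increasing information fractions. The rule itself is easy to sketch (under the common convention, which the slides may or may not use exactly): every interim look applies one fixed, very stringent criterion, often |Z| > 3, so the final analysis can proceed at essentially the nominal level. The Z values below are invented for illustration:

```python
from statistics import NormalDist

# Fixed interim criterion: |Z| > 3 (two-sided p below about 0.003).
PETO_Z = 3.0

def haybittle_peto_decision(z, final_look=False, alpha=0.05):
    """Return 'stop' if the boundary is crossed, else 'continue'.
    Interim looks use the stringent Peto criterion; the final look
    uses the near-nominal two-sided threshold (~1.96 for alpha = 0.05)."""
    z_final = NormalDist().inv_cdf(1 - alpha / 2)
    threshold = z_final if final_look else PETO_Z
    return "stop" if abs(z) > threshold else "continue"

# A "very strong result" of Z = 2.8 does NOT cross the stringent
# interim boundary...
print(haybittle_peto_decision(2.8))                   # continue
# ...though the same Z would be significant at the final analysis.
print(haybittle_peto_decision(2.8, final_look=True))  # stop
```

This is exactly the tension of example #2: the DMC sees a result that would be convincing at the end of the trial but does not cross the prespecified interim boundary.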

22 #3. Unexpected adverse event: PERT study of the WHI
Prespecified boundaries for:
  Benefit        Harm
  Heart attack   Stroke
  Fracture       PE
  Colon cancer   Breast cancer

23 Observations
  Benefit        Harm
  -----          Stroke
  Fracture       PE
  Colon cancer   Breast cancer
                 Heart attack

24 Actions
- Informed the women about increased risk of stroke, heart attack, and PE
- Informed them again
- Stopped the study

25 #4. Statistical method defective
- Neurological disease; 20-question instrument, scale 0 to 80
- Anticipated about 20% would not come
- Planned multiple imputation; results:
- Values for ID 001: ?, ?; MI-imputed values: -22 and 176 (outside the 0-80 scale)
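The pathology on this slide, imputed scores of -22 and 176 on a 0-80 instrument, is what happens when an unbounded imputation model is applied to a bounded score. A hypothetical sketch (the data and model here are invented, not the trial's): a normal-model imputation with a large variance routinely draws values outside the scale.

```python
import random

random.seed(1)
SCALE_MIN, SCALE_MAX = 0, 80

# Hypothetical observed scores on the 0-80 instrument, with a few very
# low values inflating the variance.
observed = [78, 80, 75, 12, 79, 80, 74, 10, 77, 80]
mean = sum(observed) / len(observed)
sd = (sum((x - mean) ** 2 for x in observed) / (len(observed) - 1)) ** 0.5

# A naive normal-model imputation draws from N(mean, sd) with no regard
# for the bounds of the instrument...
draws = [random.gauss(mean, sd) for _ in range(1000)]
out_of_range = [d for d in draws if not SCALE_MIN <= d <= SCALE_MAX]

# ...so some imputed "scores" land outside [0, 80], just as the slide's
# -22 and 176 did.
print(f"{len(out_of_range)} of 1000 imputed values fall outside the scale")
```

A defensible imputation model for a bounded instrument must respect the bounds, e.g., by imputing on a transformed scale or by predictive mean matching; the point of the slide is that the defect surfaced only once real data went through the planned method.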

26 #5. Too few endpoints
- Example: approved drug
- Off-label use associated with an AE
- Literature: SOC event rate 20 percent
- Non-inferiority design, margin = 5 percentage points
- Sample size: 800/group

27 Observation
- 400 people randomized
- 0 events
- What does the DMC do?
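To see why this observation breaks the design, note that with zero events in n subjects the exact one-sided upper confidence bound on the event rate has a closed form, 1 - alpha^(1/n) (the basis of the "rule of three"). A quick check, not from the slides, shows how far the data sit from the 20 percent literature rate the design assumed:

```python
def upper_bound_zero_events(n, confidence=0.95):
    """Exact one-sided upper confidence bound on an event rate when
    0 events are observed in n subjects: p_U = 1 - (1 - conf)^(1/n)."""
    alpha = 1 - confidence
    return 1 - alpha ** (1 / n)

n = 400
p_upper = upper_bound_zero_events(n)
print(f"0 events in {n}: 95% upper bound ~= {p_upper:.4%}")  # about 0.75%
```

An upper bound under 1 percent against an assumed standard-of-care rate of 20 percent means the sample-size calculation behind 800/group no longer describes the trial being run, which is exactly the DMC's dilemma on the next slide.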

28 Choices
- Continue to recruit 1600
- Stop and declare no excess
- Choose some sample size
- Tell the Steering Committee to choose a sample size
- What if n = 1? 2? 5? 10?

29 Conclusions
- Ensure that the DMC understands its role
- Separate the decision-making roles of the DMC and the Steering Committee
- Distinguish between reasonable changes on the fly and cheating
- Expect fuzzy borders

30 Technical
- Changing plans can increase the Type I error rate
- We need to adjust for multiple looks
- How do we adjust for changes?
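The first two bullets can be made concrete with a small simulation (mine, not the speaker's): under the null, testing accumulating data at an unadjusted two-sided 5 percent level at each of five looks rejects far more than 5 percent of the time, which is why boundaries like Haybittle-Peto exist.

```python
import random
from statistics import NormalDist

random.seed(2003)
Z_CRIT = NormalDist().inv_cdf(0.975)  # ~1.96, unadjusted two-sided 5% level

def trial_rejects(n_per_look=50, n_looks=5):
    """Simulate one null trial (true effect = 0), testing at every look."""
    total, count = 0.0, 0
    for _ in range(n_looks):
        for _ in range(n_per_look):
            total += random.gauss(0, 1)
            count += 1
        z = total / count ** 0.5  # z-statistic for the running mean
        if abs(z) > Z_CRIT:
            return True  # "significant" at this look: trial stops, rejects
    return False

n_sim = 20_000
inflated = sum(trial_rejects() for _ in range(n_sim)) / n_sim
print(f"Overall Type I error with 5 unadjusted looks: {inflated:.3f}")
```

The simulated rate lands well above 0.05 (classical calculations put it near 0.14 for five looks), quantifying the slide's warning that unplanned looks and changes spend error probability even when each individual test looks innocent.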

31 Operational
- Unblinded assessments
- Subtle changes in procedures
- In clinical trials, the FDA and SEC