Data Standards for Pharmacometric Analysis Data Sets


Data Standards for Pharmacometric Analysis Data Sets Neelima Thanneer Bristol-Myers Squibb 1

Source Data Issues and Standard Rules for Imputation
Imputations should be avoided as far as possible, but they may be necessary, particularly for population PK datasets, because a complete dosing history is generally not available in the source data. Dose date/time imputation typically occurs in the following scenarios:
IV dosing, where every dose date and time should be recorded
Oral dosing, where interval dosing is recorded

Standard Rules for Imputation: Dose Clock Time
Dose clock time imputation for CRF designs where every single dose date and time is captured.
If a dose date is available but the time is missing:
If a trough sample was taken on the same day, use the trough time as the dose time (for IV, the start time of infusion); 5 minutes can be added.
If no troughs were taken, use the previous dose time; for BID/TID dosing, adjust the imputation based on the frequency.
If the first dose time is missing, impute it using the Day 1 lab time or the first post-dose sample time; for BID/TID dosing, adjust the imputation based on the frequency.
Records with imputed clock times should be flagged.
Account for missing doses and dose interruptions based on the number of tablets or on comments in a free-text variable.
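A minimal sketch of this imputation hierarchy in Python; the function and argument names (dose_date, trough_time, prev_dose_time, day1_lab_time) are illustrative assumptions, not part of any standard:

```python
from datetime import datetime, time, timedelta

def impute_dose_time(dose_date, trough_time=None, prev_dose_time=None,
                     day1_lab_time=None):
    """Impute a missing dose clock time following the hierarchy above.

    All arguments except dose_date are hypothetical stand-ins for values
    pulled from the source data; returns (imputed_datetime, flagged).
    """
    if trough_time is not None:
        # Trough drawn the same day: use the trough time, plus an optional
        # 5-minute offset (for IV this corresponds to the start of infusion).
        imputed = datetime.combine(dose_date, trough_time) + timedelta(minutes=5)
    elif prev_dose_time is not None:
        # Fall back to the previous recorded dose time
        # (adjust for BID/TID frequency where applicable).
        imputed = datetime.combine(dose_date, prev_dose_time)
    elif day1_lab_time is not None:
        # First dose: borrow the Day 1 lab draw time.
        imputed = datetime.combine(dose_date, day1_lab_time)
    else:
        return None, False  # leave missing rather than guess
    return imputed, True    # True => record must be flagged as imputed


# Example: dose date known, time missing, trough drawn at 07:55 the same day
dt, flagged = impute_dose_time(datetime(2023, 5, 1).date(), trough_time=time(7, 55))
print(dt, flagged)  # 2023-05-01 08:00:00 True
```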

Standard Rules for Imputation: Infusion Duration
If the infusion stop time is available but the start time is missing, the protocol-defined duration (e.g. 1 hour or 30 minutes) is used to determine the start of infusion, and vice versa if the stop time is missing.
If both infusion start and stop times are missing on Day 1, the pre-dose sample time or the end-of-infusion sample time is used together with the nominal infusion duration to determine the start of infusion.
Records with imputed dates/times should be flagged.
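A sketch of the start/stop reconstruction under these rules, with the nominal protocol duration passed in explicitly; the function name and defaults are illustrative assumptions:

```python
from datetime import datetime, timedelta

def impute_infusion_times(start=None, stop=None, nominal_duration=timedelta(hours=1)):
    """Fill in a missing infusion start or stop time from the protocol-defined
    nominal duration (e.g. 1 hour or 30 min). Returns (start, stop, imputed_flag)."""
    if start is None and stop is not None:
        return stop - nominal_duration, stop, True
    if stop is None and start is not None:
        return start, start + nominal_duration, True
    return start, stop, False


# Stop time recorded, start missing, 30-minute nominal infusion
start, stop, flag = impute_infusion_times(
    stop=datetime(2023, 5, 1, 10, 30), nominal_duration=timedelta(minutes=30))
print(start, flag)  # 2023-05-01 10:00:00 True
```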

Standard Rules for Imputation: Interval Dosing
When the CRF is designed to capture interval doses, with start and stop dates recorded and only the dose times relative to PK sampling recorded:
The variables ADDL and II are derived to capture the doses that were not recorded.
ADDL: number of additional doses exactly like the current one
II: interdose interval
If the dose times relative to the PK samples are not recorded, impute them using the IV imputation rules.
ADDL needs to be adjusted based on recorded time deviations.
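As an illustration, a once-daily oral regimen covering a week of unrecorded doses could be encoded in a single record with ADDL and II; the field names and units follow common NONMEM-style conventions and are assumptions rather than the draft standard:

```python
# One dosing record standing in for 7 daily 100 mg doses:
# the first dose is explicit, ADDL adds 6 more, II spaces them 24 h apart.
dose_record = {
    "TIME": 0.0,   # time of the first dose in the interval (h)
    "AMT": 100.0,  # dose amount (mg)
    "II": 24.0,    # interdose interval (h)
    "ADDL": 6,     # additional doses exactly like this one
    "EVID": 1,     # dosing event
}
# If the recorded dose time relative to the PK sample deviates from nominal,
# ADDL (and the record's TIME) must be adjusted accordingly.
```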

Influencing CRF Design
CRFs are not consistently designed to capture the information needed for pharmacometric analysis.
For BID and TID dosing, the CRF should capture the two or three prior dose times relative to each PK sample, and the first dose time should be collected.
Dose interruptions for BID and TID regimens should indicate which doses were missed.
For oral studies where the patient takes the drug at home, the CRF should be designed to collect the dose date and time relative to the PK sample drawn.
Pharmacometricians and programmers can influence CRF design by getting involved at the design stage.

Population PK Data Standards Initiative
Sponsored by the International Society of Pharmacometrics (ISoP).
Population PK analyses are among the most commonly included components of regulatory filings, yet the datasets for such analyses are prepared and documented inconsistently.
A group of around 20 enthusiastic pharmacometricians and data programmers came together at the beginning of 2016 to address this situation and work on its improvement. With the sponsorship of ISoP, the group envisions the development of population PK data standards for interchange and analysis.
Some of the benefits of this work will directly translate into improved:
Consistency and efficiency of pharmacometric datasets
Quality (fewer errors; enables development of open-source tools for automatic data checking)
Graphical exploration (enables development of open-source tools for automatic exploration)
Regulatory compliance and audit readiness (enables development of open-source tools for data submission)

Objective: Standardized Datasets Used Directly by Analysis Software
[Flow diagram: Source data → SDTM → Population PK standard dataset → Analysis, in parallel with ADaM]
NONMEM, Monolix, R, Stan, Julia, etc. can use standardized datasets directly; this is already the case for R and some other tools.
Common variable names and a common structure: the dataset will always look the same and is suitable for collaboration.
Typically SAS datasets.
Radivojevic, et al. ACoP 2016
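A purely illustrative sketch of what such a standardized dataset could look like; the column names below follow common NONMEM-style conventions and are assumptions, not the draft standard itself:

```python
import pandas as pd

# Hypothetical rows of a standardized PopPK dataset: one dosing record (EVID=1)
# and two observation records (EVID=0) for a single subject.
ppk = pd.DataFrame(
    {
        "ID":   [1, 1, 1],
        "TIME": [0.0, 1.0, 24.0],     # hours since first dose
        "AMT":  [100.0, None, None],  # dose amount (mg); missing for observations
        "DV":   [None, 5.2, 1.1],     # observed concentration
        "EVID": [1, 0, 0],            # 1 = dose, 0 = observation
        "MDV":  [1, 0, 0],            # missing dependent variable flag
    }
)
# Because the structure and names are fixed, R, NONMEM, and other tools can
# consume the same file without study-specific re-mapping.
ppk.to_csv("poppk_dataset.csv", index=False)
```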

What We Have Done
Draft standard for input data, covering:
Classification by information group
Source dataset and variable name
Whether input from the pharmacometrician is required
CDISC naming conventions and length requirements
Imputation and derivation guidance
Controlled terminology
General comments
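A hypothetical sketch of a single entry in such a specification, showing the attributes listed above for one variable; all field names and values are illustrative assumptions, not the working group's actual content:

```python
# Illustrative specification entry for one variable in the draft input standard.
spec_entry = {
    "information_group": "Dosing",
    "variable": "AMT",
    "source": "SDTM EX domain, EXDOSE",
    "pharmacometrician_input_required": False,
    "cdisc_naming": "8-character upper-case variable name",
    "imputation_guidance": "Impute missing dose times per the standard rules; flag imputed records",
    "controlled_terminology": None,
    "comments": "Dose amount per administration",
}
```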

What We Have Done: Current Status
Path to CDISC data standards for pharmacometrics:
Hand-over of a solid draft data structure
Review by CDISC teams and feasibility assessment
Implementation guide
Preparing a "Perspectives" article to outline our thinking and objectives

Next Steps
Scripts for further conversion into tool-specific dialects.
Once the standard is established, capture the deliverable in a white paper.
Longer term: start expanding to PK/PD data; develop and publish automation scripts.
For this initiative to succeed, it must make things faster, easier, and more efficient:
For modelers
For programmers
For any other stakeholders