Experience and process for collaborating with an outsource company to create the define file. Ganesh Sankaran, TAKE Solutions

Agenda
- Typical work flow when sponsors create the SDTM / ADaM in-house and collaborate with vendors for the Define files
- Define.xml sections
- Define.xml process: how do we go about extracting the information from the data & documents provided?
- Validating Define.xml & the typical checks
- Common issues
- Conclusion: how soon should the sponsor start?

Typical work flow when collaborating with a vendor for creating Define files
1. Sponsor provides the documents & draft data
2. Vendor runs the compliance / structure checks on the data
3. Vendor generates the draft Define.xml & runs the compliance checks on it
4. Vendor summarizes the issues / findings and delivers the draft define for review
5. Sponsor reviews the findings and updates the specification / datasets / annotations
6. Sponsor sends the updated annotations / specification / XPTs back to the vendor for a final delivery (Pass II)
7. Vendor re-runs the compliance checks and re-generates the final version of the Define (Pass II)

Inputs that are provided
- Annotated Case Report Form
- Mapping specification documents
- SAS datasets / XPTs
- Sponsor controlled terminology documents, if applicable
- Protocol, if Trial Design domains are to be produced
- Data Guide / supplemental document

Define.XML Sections
- TOC: metadata of datasets
- blankcrf (annotated CRF)
- Variable-level metadata
- Value-level metadata
- Controlled terminology
- Computational algorithms
- Supplemental data definition document
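To give a feel for where these sections live in the underlying XML, the following is a minimal, hand-written skeleton using Define-XML 2.0 element names and illustrative OIDs (in Define-XML 1.0 the computational algorithms sit in def:ComputationMethod rather than MethodDef):

    <MetaDataVersion OID="MDV.STUDY001" Name="Study STUDY001 Data Definitions"
                     def:DefineVersion="2.0.0" def:StandardName="SDTM-IG" def:StandardVersion="3.2">
      <def:AnnotatedCRF> ... </def:AnnotatedCRF>        <!-- link to the blankcrf -->
      <def:SupplementalDoc> ... </def:SupplementalDoc>  <!-- Data Guide and other supporting documents -->
      <def:ValueListDef> ... </def:ValueListDef>        <!-- value-level metadata -->
      <def:WhereClauseDef> ... </def:WhereClauseDef>    <!-- conditions selecting the value-level items -->
      <ItemGroupDef> ... </ItemGroupDef>                <!-- dataset (TOC) metadata -->
      <ItemDef> ... </ItemDef>                          <!-- variable-level metadata -->
      <CodeList> ... </CodeList>                        <!-- controlled terminology -->
      <MethodDef> ... </MethodDef>                      <!-- computational algorithms -->
      <def:leaf> ... </def:leaf>                        <!-- hyperlinks to the external documents -->
    </MetaDataVersion>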

Define.XML Section (not visible through the style sheet)
- xmlns: identifies the default namespace for this document
- ODMVersion: identifies the ODM version that underlies the schema for the Define-XML
- FileOID: a unique identifier for this file
- CreationDateTime: when this specific version of the define.xml file was created
- StudyName, StudyDescription, ProtocolName: study-level information
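As a sketch (placeholder values, Define-XML 2.0 namespaces), these attributes sit on the ODM root element, with the study-level information in GlobalVariables:

    <ODM xmlns="http://www.cdisc.org/ns/odm/v1.3"
         xmlns:def="http://www.cdisc.org/ns/def/v2.0"
         xmlns:xlink="http://www.w3.org/1999/xlink"
         ODMVersion="1.3.2"
         FileType="Snapshot"
         FileOID="www.sponsor.com/define/STUDY001"
         CreationDateTime="2015-06-17T10:30:00">
      <Study OID="STUDY001">
        <GlobalVariables>
          <StudyName>STUDY001</StudyName>
          <StudyDescription>STUDY001 Data Definitions</StudyDescription>
          <ProtocolName>STUDY001</ProtocolName>
        </GlobalVariables>
        <!-- MetaDataVersion with the dataset / variable / value-level metadata goes here -->
      </Study>
    </ODM>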

Define.XML components and how we generate them
- Metadata generation: domain level, variable level, value level
- ORIGIN, codelists, comments and computational algorithms
- blankcrf, Data Guide / supplemental documents
- Generate the Define.xml
- Validate the Define files

Define.XML process

Input Sheet for Define.XML Generation
- Domain-level input: a SAS-based macro utility creates the inputs for this sheet based on the datasets provided.
- Variable metadata: by reading through the metadata of the SAS datasets provided, the variable-level metadata input sheet is populated.
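To show where these two input sheets end up, here is a trimmed, hypothetical DM example of the dataset-level (ItemGroupDef) and variable-level (ItemDef) entries they feed; OIDs and attribute values are illustrative only:

    <ItemGroupDef OID="IG.DM" Name="DM" Domain="DM" SASDatasetName="DM"
                  Repeating="No" IsReferenceData="No" Purpose="Tabulation"
                  def:Structure="One record per subject" def:Class="SPECIAL PURPOSE"
                  def:ArchiveLocationID="LF.DM">
      <Description><TranslatedText xml:lang="en">Demographics</TranslatedText></Description>
      <ItemRef ItemOID="IT.DM.STUDYID" OrderNumber="1" Mandatory="Yes" KeySequence="1"/>
      <ItemRef ItemOID="IT.DM.AGE" OrderNumber="12" Mandatory="No"/>
      <def:leaf ID="LF.DM" xlink:href="dm.xpt"><def:title>dm.xpt</def:title></def:leaf>
    </ItemGroupDef>

    <ItemDef OID="IT.DM.AGE" Name="AGE" DataType="integer" Length="3" SASFieldName="AGE">
      <Description><TranslatedText xml:lang="en">Age</TranslatedText></Description>
      <def:Origin Type="Derived"/>
    </ItemDef>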

Input Sheet for Define.XML Generation
- ORIGIN information is extracted based on the annotations & mapping specification provided.
- For the variables for which CODELIST, COMPUTATION ALGORITHM and VALUELIST need to be populated, OIDs are assigned here.
- Based on the OIDs assigned in the variable-level sheet, the value-level input sheet and the codelist input sheet are generated by reading the data and the associated codelist files.
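For example, a variable whose origin, codelist OID and value-list OID have been filled in on the variable-level sheet might come out as follows (hypothetical VS variables, illustrative OIDs, Define-XML 2.0 syntax):

    <ItemDef OID="IT.VS.VSORRES" Name="VSORRES" DataType="text" Length="20" SASFieldName="VSORRES">
      <Description><TranslatedText xml:lang="en">Result or Finding in Original Units</TranslatedText></Description>
      <def:Origin Type="CRF"/>
      <def:ValueListRef ValueListOID="VL.VS.VSORRES"/>
    </ItemDef>

    <ItemDef OID="IT.VS.VSTESTCD" Name="VSTESTCD" DataType="text" Length="8" SASFieldName="VSTESTCD">
      <Description><TranslatedText xml:lang="en">Vital Signs Test Short Name</TranslatedText></Description>
      <CodeListRef CodeListOID="CL.VSTESTCD"/>
      <def:Origin Type="Assigned"/>
    </ItemDef>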

Input Sheet for Define.XML Generation
- Value-level input
- Codelist / computation methods input
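A rough sketch of the output generated from these two sheets, again using Define-XML 2.0 names, hypothetical OIDs and a single systolic blood pressure example:

    <def:ValueListDef OID="VL.VS.VSORRES">
      <ItemRef ItemOID="IT.VS.VSORRES.SYSBP" OrderNumber="1" Mandatory="No">
        <def:WhereClauseRef WhereClauseOID="WC.VS.VSTESTCD.SYSBP"/>
      </ItemRef>
    </def:ValueListDef>

    <def:WhereClauseDef OID="WC.VS.VSTESTCD.SYSBP">
      <RangeCheck SoftHard="Soft" def:ItemOID="IT.VS.VSTESTCD" Comparator="EQ">
        <CheckValue>SYSBP</CheckValue>
      </RangeCheck>
    </def:WhereClauseDef>

    <CodeList OID="CL.VSTESTCD" Name="Vital Signs Test Code" DataType="text">
      <CodeListItem CodedValue="SYSBP">
        <Decode><TranslatedText xml:lang="en">Systolic Blood Pressure</TranslatedText></Decode>
      </CodeListItem>
    </CodeList>

    <MethodDef OID="MT.DM.AGE" Name="Algorithm to derive AGE" Type="Computation">
      <Description><TranslatedText xml:lang="en">AGE = integer part of (RFSTDTC - BRTHDTC) / 365.25</TranslatedText></Description>
    </MethodDef>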

External Documents: blankcrf & Data Guide
- The annotated Case Report Form and supplemental documents such as the Data Guide are linked to the define.xml.
- The ORIGIN page numbers presented as part of the variable-level metadata must be hyperlinked to the corresponding CRF pages attached to the Define file.
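In Define-XML 2.0 terms, the document links and the CRF page hyperlinks are expressed roughly like this (leaf IDs, file names and page numbers are placeholders):

    <def:AnnotatedCRF>
      <def:DocumentRef leafID="LF.blankcrf"/>
    </def:AnnotatedCRF>
    <def:SupplementalDoc>
      <def:DocumentRef leafID="LF.DataGuide"/>
    </def:SupplementalDoc>

    <def:leaf ID="LF.blankcrf" xlink:href="blankcrf.pdf">
      <def:title>Annotated Case Report Form</def:title>
    </def:leaf>
    <def:leaf ID="LF.DataGuide" xlink:href="dataguide.pdf">
      <def:title>Data Guide</def:title>
    </def:leaf>

    <!-- inside an ItemDef: ORIGIN pointing at the CRF page(s) where the variable is annotated -->
    <def:Origin Type="CRF">
      <def:DocumentRef leafID="LF.blankcrf">
        <def:PDFPageRef PageRefs="12 45" Type="PhysicalRef"/>
      </def:DocumentRef>
    </def:Origin>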

Input Sheet for Define.XML Generation
Once the domain-level, variable-level, value-level and codelist sheets have been created, the external documents linked, the ORIGIN, computational algorithm & external dictionary information updated, and the inputs reviewed, the define.xml can be generated.

Validation Checks
Structural checks: types of checks on the metadata
1. Domain label mismatch
2. Variable label mismatch
3. Data type mismatch
4. Missing Expected & Required variables
5. Required / Expected variables with NULL values for all records
6. Non-standard SDTM variables
7. Variable names in lower case
8. Variable order mismatch
9. Variables with formats
10. Permissible variables present with NULL values for all records

Validate Define.XML
A valid Define.xml should be well formed and conform to the XML schemas, and should reference the correct versions of the CDISC standards.
Sample validation checks:
1. XML is well formed
2. All required elements are included and are not empty
3. OID attributes must be unique within a single metadata version: no duplicate def:leaf, def:ComputationMethod or def:ValueListDef elements
4. No duplicates in ItemGroupDef, ItemDef, ItemRef, Study, CodeList elements, etc.
5. Invalid data type value for CodeList elements
6. CodedValue must be unique within a single CodeList
7. Invalid codelist for a variable; values outside non-extensible controlled terminology
8. Invalid data type value for ItemDef elements
9. Invalid 'FileType', 'MedDRA' values
10. Invalid 'Repeating', 'Mandatory' values

Common Issues
- Origin is 'CRF' but the variable is not annotated; origin is 'Derived' but the variable is annotated in the CRF.
- Key variables not properly defined.
- When presenting custom domains, the domain assumptions should be followed; sometimes custom domains are derived without a topic variable.
- Subjects collected as part of external data (LB / EG) but not populated in the DM domain; all subjects must be present in the DM domain.
- One-to-one relationship missing across some of the paired variables, e.g. TEST / TESTCD, PARAM / PARAMCD, VISIT / VISITNUM, AVISIT / AVISITN, TPT / TPTNUM and TPT / TPTREF.
- Common variables across different domains having different ORIGIN derivations; if the derivation is the same across domains, "Copied from ADSL.XX" can be used.

Common Issues (contd.)
- Generally, XPT files up to 1 GB in size are fine. If an XPT file exceeds 1 GB, it must be split into smaller datasets, each not exceeding 1 GB (see the Study Data Specifications).
- Split files should have the same metadata structure so that concatenation / merging of the split datasets is feasible.
- Both the smaller split files and the larger (non-split) file should be included.
- The split datasets and the method applied should be documented in the Data Guide.
- If not following a linear approach, consistency between the ADaM and SDTM sources needs to be ensured.

Common Issues (contd.)
- When ADaM is derived in a parallel stream, extra effort might be required to ensure traceability and data lineage.

Conclusion
- Finalize the scope of the work being outsourced / to be performed by the vendor.
- Explain the process being followed and agree on a common form for the exchange of documents that could expedite the generation of the Define files.
- When working across a family of similar studies within the same indication, look to achieve better efficiency after a couple of iterations / studies.
- Identify the vendor(s) at least three months before you expect the first Define.XML to be published. If possible, do a pilot or demo define.

Thank You