SEND Submissions – Quality review by QA – Yes/No?

Daniel Potenta, Michael Wasko, Nina Häuselmann
PDS Life Sciences, Mt Arlington, New Jersey

Abstract
With FDA-mandated electronic data submissions in SEND format for NDA, ANDA and BLA supporting single-dose, repeat-dose and carcinogenicity studies starting after December 17, 2016, how confident are you that the datasets compiled into a submission package are complete and accurate for all studies?
- Are all studies relevant to the submission (even non-GLP studies) integrated into the submission package?
- Do you know which data need to be submitted electronically?
- How are you integrating study data from multiple CROs, vendors and LIMS, including assuring that a common controlled terminology was applied across the datasets?
- Does the Nonclinical Study Data Reviewer's Guide (nSDRG) contain the appropriate information for dataset warnings, and does it combine the data from all CROs and sites involved?
- Does your define file link all imported data?
- If incomplete or inaccurate datasets are submitted and returned due to errors, is your company prepared for the consequences of study submission delays?
The purpose of this poster is to share SEND experiences concerning dataset accuracy and the possible role of QA in the submission process.

SEND Standards to Consider in a Dataset Submission (continued)

Microscopic Findings (MI) Domain
Sponsors should ensure that the transformation of findings from MIORRES to MISTRESC closely adheres to the instructions in the SENDIG and the TCG issued October 2016. Modifiers for which variables are available (such as MISEV and MILAT) should be placed in the appropriate columns within the domain structure. Severities (minimal, mild, moderate, marked, severe) should be placed in MISEV and not duplicated in MISTRESC or SUPPMI.
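The MIORRES-to-MISTRESC split described above can be sketched in a few lines. This is only an illustration under an assumed raw capture format ("finding, modifier, severity" as a comma-delimited string); real LIMS exports vary, and this is not the SENDIG algorithm itself.

```python
# Sketch: separating severity from a collected microscopic finding so that
# MISEV holds the severity and MISTRESC keeps the finding plus modifiers.
# The comma-delimited raw format is a hypothetical example.
SEVERITIES = {"minimal", "mild", "moderate", "marked", "severe"}

def split_mi_finding(raw):
    """Return (MISTRESC, MISEV) from a comma-delimited raw finding."""
    parts = [p.strip() for p in raw.split(",")]
    sev = ""
    kept = []
    for p in parts:
        if p.lower() in SEVERITIES:
            sev = p.upper()      # severity belongs in MISEV only
        else:
            kept.append(p)       # other modifiers stay with the finding
    return ", ".join(kept), sev

print(split_mi_finding("necrosis, focal, minimal"))
# MISTRESC keeps "necrosis, focal"; MISEV holds "MINIMAL"
```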
Non-neoplastic findings in MISTRESC for which controlled terminology has not yet been established should be standardized in a way that ensures traceability between the counts in tables, listings and figures and the terms in MISTRESC. For example, "focal necrosis" and "necrosis" should be maintained as separate findings, and likewise for variants of inflammation.

Clinical Observations (CL) Domain
Only observations should be provided in CL; ensure that Events and Interventions are not included. Sponsors should ensure that the standardization of findings in CLSTRESC closely adheres to the SENDIG and TCG. The information in CLTEST and CLSTRESC, along with CLLOC and CLSEV where appropriate, should be sufficient to ensure traceability between the counts in tables, listings and figures and the unique terms in CLSTRESC. For example, if "vomitus, food" and "vomitus, clear" are tabulated separately in the study report, CLSTRESC should be standardized to "vomitus, food" and "vomitus, clear" rather than collapsed to "vomitus". (Note: CLSEV is not subject to controlled terminology, which makes data consistency and review even more challenging.) Differences between the representation in CL and the presentation of clinical observations in the study report should be noted in the nSDRG.

Trial Design and Trial Summary
The requirement to submit SEND data using a particular study data standard depends on its support by FDA, as listed in the FDA Data Standards Catalog at the time of study start. TSPARMCD = STSTDTC allows determination of the study start date and should be included in all SEND submissions. Ensure that the Trial Arms and Trial Sets represented in TA and TX closely adhere to the SENDIG in study designs with recovery and/or toxicokinetic animals; recovery and/or toxicokinetic animals should typically be presented in Trial Sets separate from the main Trial Arm.

Tumor Dataset
Carcinogenicity studies should include an electronic dataset of tumor findings to allow for a complete review.
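The "vomitus, food" versus "vomitus, clear" traceability check above can be automated. The mapping below is a hypothetical example of a raw-to-CLSTRESC standardization table; the point is simply to verify that every term tabulated in the study report is still recoverable from the standardized values.

```python
# Sketch: verifying that CLSTRESC standardization preserves the granularity
# used in the study report. The mapping and report terms are hypothetical.
raw_to_stresc = {
    "Vomit - food present": "vomitus, food",
    "Vomit - clear fluid":  "vomitus, clear",
}
report_terms = {"vomitus, food", "vomitus, clear"}

# Every term tabulated in the report must have a matching CLSTRESC value;
# collapsing both to "vomitus" would break this check.
stresc_values = set(raw_to_stresc.values())
missing = report_terms - stresc_values
assert not missing, f"Report terms with no CLSTRESC match: {missing}"
print("Traceability check passed")
```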
At this time, sponsors should include a tumor.xpt file following the specification in the SENDIG for its creation, regardless of whether or not the study is in SEND format.

Variables in SDTM and SEND: Required, Expected, and Permissible
CDISC data standards categorize SDTM and SEND variables as Required, Expected, or Permissible. In some instances sponsors have interpreted Permissible variables as optional, and in other cases sponsors have excluded Expected variables. For the purposes of SDTM and SEND submissions, all Required, Expected, and Permissible variables that were collected, plus any variables used to compute derivations, should be submitted. Examples of Permissible and Expected SEND variables that should be included, if available: baseline flags (e.g., last non-missing value prior to first dose) for laboratory results, vital signs, ECG, pharmacokinetic concentrations, and microbiology results. Whenever --DTC, --STDTC or --ENDTC, which have the role of timing variables, are included, the matching study day variables (--DY, --STDY, or --ENDY, respectively) should also be included. For example, in most Findings domains --DTC is Expected, which means that --DY should also be included.

Dates in SEND
Dates in SEND domains should conform to the ISO 8601 format. Examples of how to implement dates are included in the SENDIG.

SEND Versions
When submitting clinical or nonclinical data, sponsors should not mix versions within a study. As noted previously, the Data Standards Catalog lists the versions of both SEND and SDTM that are supported by FDA at any given time.

Maintenance of Controlled Terminologies
The use of supported controlled terminologies is recommended wherever available.
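Deriving --DY from --DTC and the reference start date follows the usual SDTM/SEND study-day convention: there is no day 0, so dates on or after the reference start date are offset by one. A minimal sketch:

```python
# Sketch of the SDTM/SEND study-day convention: study day 1 is the reference
# start date itself (there is no day 0), so non-negative offsets are + 1.
from datetime import date

def study_day(dtc: date, rfstdtc: date) -> int:
    delta = (dtc - rfstdtc).days
    return delta + 1 if delta >= 0 else delta

print(study_day(date(2017, 1, 5), date(2017, 1, 1)))    # day 5
print(study_day(date(2017, 1, 1), date(2017, 1, 1)))    # day 1 (start date)
print(study_day(date(2016, 12, 30), date(2017, 1, 1)))  # day -2
```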
If a sponsor identifies a concept for which no standard term exists, FDA recommends that the sponsor submit the concept to the appropriate terminology maintenance organization as early as possible to have a new term added to the standard dictionary; FDA considers this good terminology management practice. The creation of custom terms (i.e., so-called extensible code lists) for a submission is discouraged because it does not support semantically interoperable study data exchange. Furthermore, the use of custom or "extensible" code lists should not be interpreted to mean that sponsors may substitute their own nonstandard terms for existing equivalent standardized terms. Terminology maintenance organizations generally have well-defined change-control processes; sponsors should allow sufficient time for a proposed term to be reviewed and included in the terminology, as it is desirable to have the term incorporated into the standard terminology before the data are submitted. If custom terms cannot be avoided, the submitter should clearly identify and define them within the submission, reference them in the nSDRG, and use them consistently throughout the application. If a sponsor identifies an entire information domain for which FDA has not accepted a specific standard terminology, the sponsor may select a standard terminology to use, if one exists. FDA recommends that sponsors include this selection in the Study Data Standardization Plan (SDSP), or in an update to the existing plan, and reference it in the nSDRG. If no controlled terminology exists, the sponsor may define custom terms. The non-FDA-supported terms (whether from a non-supported standard terminology or sponsor-defined custom terms) should then be used consistently throughout all relevant studies within the application. Because SEND datasets are intended to largely eliminate the need for FDA reviewers to recreate report tables, quality is an important factor.
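A simple pre-submission check can flag values that are not in a published codelist, so nonstandard terms can be routed to the terminology request process early. The codelist extract below is a tiny hypothetical subset, not the actual CDISC CT file.

```python
# Sketch: flagging values absent from a controlled-terminology codelist.
# The severity codelist here is a hypothetical extract for illustration.
codelist_sev = {"MINIMAL", "MILD", "MODERATE", "MARKED", "SEVERE"}

def nonstandard_terms(values, codelist):
    """Return sorted unique values not found in the codelist (case-insensitive)."""
    return sorted({v for v in values if v.upper() not in codelist})

print(nonstandard_terms(["mild", "marked", "slight"], codelist_sev))
# ['slight'] would be a candidate for a new-term request
```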
Part of any quality-driven process includes the definition of processes, responsibilities, and education. We find this to hold true for SEND dataset generation and verification as well. Defining thorough processes and ownership early greatly reduces costly (financial and time) delays and complications when creating and submitting SEND datasets.

Conclusions
In conclusion, the following questions should be considered by each organization to ensure successful SEND submissions:
- Does your Quality Assurance unit take part in assuring compliance with regard to SEND electronic data submissions?
- Is any QC of the data performed by Quality Assurance?
- Is a compliance statement issued by your Quality Assurance unit?
- Who in your organization is responsible for the SDSP, and for ensuring that all studies are included in the SDSP?
- Who ensures that a TS.XPT is present for all studies in the electronic submission for an NDA, ANDA or BLA?
- Who is responsible for data accuracy (the answer is the study sponsor), and what quality measures are taken to review and ensure that all data are present in the electronic data submission?
- Do you outsource data verification of SEND datasets?
FDA's stance is that SEND data are owned by the study sponsor, who is responsible for the completeness and accuracy of the data in electronic submissions. Does your organization have measures in place to assure that this completeness and accuracy check is done?

SEND Standards to Consider in a Dataset Submission
SEND (Standard for the Exchange of Nonclinical Data) version 3.0 is the standard for current electronic data submissions. It requires that data for single-dose, repeat-dose and carcinogenicity studies be submitted in a standardized electronic format (SAS V5 .xpt files). Detecting dataset and report discrepancies is a QC effort, especially because controlled terminology plays a critical role in the accuracy of the data submitted.
The FDA uses the data in the STRESC variables of the MI, MA and CL domains to visualize finding incidences (Study Data Technical Conformance Guide (TCG), October 2016). If findings are too vague, they may not accurately reflect the incidence counts in the report, forcing FDA to revert to a paper review of the data. Thus, STRESC in your MI, MA and CL domains should match the incidence tables in the final report in order for FDA to use the electronic data effectively.

USUBJID (Unique Subject Identifier) is the critical link that holds a SEND dataset together: the same USUBJID must be used in all datasets. It is not uncommon for CROs and sponsors to assign different USUBJIDs; this must be discussed up front, and the same USUBJID MUST be used in all domains for the same animals.

According to the TCG, October 2016: "As noted in section 1.1, the submission of standardized study data will be required according to the timetable specified in the eStudy Data guidance. Sponsors submitting legacy data should provide a TS dataset (ts.xpt) which includes the study start date in the form of SSTDTC (TSPARMCD = SSTDTC) and TSVAL = 'yyyy-mm-dd'." In other words, every study in a submission package, whether it started before or after December 17, 2016, MUST include a TS.XPT file in the electronic submission. Additional information, such as the study title, may be added in the future.

There should be one cohesive define file, relrec.xpt and co.xpt per study (define file, relationship records and comments) linking files such as MA, MI and CL when, for example, microscopic pathology is read at another facility. Producing DEFINE, RELREC and CO becomes a manual process when integrating CRO data into the final study submission. The ideal time to implement SEND is prior to the conduct of the study, as it is very important that the results presented in the accompanying study report be traceable back to the original data collected.
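The USUBJID consistency requirement above lends itself to an automated cross-domain check before CRO deliverables are merged. The domain contents and study identifiers below are hypothetical; a real check would read the .xpt files.

```python
# Sketch: checking that every domain uses the same USUBJID values as the
# demographics (DM) domain. Rows and identifiers are hypothetical examples.
domains = {
    "DM": [{"USUBJID": "STUDY01-001"}, {"USUBJID": "STUDY01-002"}],
    "MI": [{"USUBJID": "STUDY01-001"}, {"USUBJID": "STUDY01-002"}],
    "CL": [{"USUBJID": "STUDY01-001"}, {"USUBJID": "STUDY1-002"}],  # typo
}

def usubjid_mismatches(domains, reference_domain="DM"):
    """Return {domain: sorted USUBJIDs not present in the reference domain}."""
    reference = {r["USUBJID"] for r in domains[reference_domain]}
    issues = {}
    for name, rows in domains.items():
        extra = {r["USUBJID"] for r in rows} - reference
        if extra:
            issues[name] = sorted(extra)
    return issues

print(usubjid_mismatches(domains))
# CL is flagged because "STUDY1-002" does not match "STUDY01-002"
```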
Each submitted SEND dataset should have its contents described with complete metadata in the define.xml file and within the nSDRG as appropriate. Sponsors should use the VISITDY variable if findings that were intended to be analyzed together were collected across multiple study days. This includes postmortem findings in OM, MA, MI and LB, terminal body weights in BW, and in-life observations in other Findings domains that are grouped in the study report. Although this is a non-binding recommendation, the community has asked FDA to consider using the DS domain to group such findings instead, since the VISITDY variable does not exist in some of the domains in which it is being asked to serve this grouping role.
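The intended use of VISITDY can be illustrated with a small sketch: the actual-day variable preserves when each result was really collected, while VISITDY carries the planned day used for grouping in the report. The records and variable values are hypothetical.

```python
# Sketch: grouping findings by planned day (VISITDY) while the actual study
# day (here BWDY, for body weights) records when each was truly collected.
# Records are hypothetical examples.
from collections import defaultdict

records = [
    {"USUBJID": "STUDY01-001", "BWDY": 28, "VISITDY": 28},
    {"USUBJID": "STUDY01-002", "BWDY": 29, "VISITDY": 28},  # weighed a day late
]

by_visit = defaultdict(list)
for r in records:
    by_visit[r["VISITDY"]].append(r["USUBJID"])
print(dict(by_visit))  # both animals tabulated together under planned day 28
```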

