What can we do? Answers from CSS Nonclinical Topics WG


What can we do? Answers from CSS Nonclinical Topics WG
Follow-up activity to the "Nonclinical: Bringing it all Together" presentation by Dr. Lilliam Rosario

Content
This presentation summarizes the CSS Nonclinical Topics introductory activity, in which Dr. Lilliam Rosario (Director, Office of Computational Science, FDA) asked "What can we do to help?" regarding the e-data submission review process and the nonclinical PhUSE projects.
Each idea was written on a post-it note and stuck to a poster (see picture on the title slide) describing e-data submission, processing, and use by reviewers. The ideas are listed verbatim, with "@…" indicating where on the poster each idea was posted.
The CSS attendees then reviewed the list and either answered each idea or recommended how to address it. Recommendations are listed on Slide 6.

Post-it ideas
@CDISC: Need ability to generate tabulated summaries leveraging SEND data.
@Define: Where should information be submitted? Nonclinical SDRG and/or define file? Does information need to be duplicated? Industry index of where to find information? Define usability: what is needed/helpful? Keys? Dictionary/codelists?
@Graph: Validation results: warnings/errors, consistency, interpretation? Some sponsors are not generating the datasets themselves; CROs do it for them, which sometimes leads to negotiations. Efforts to align this would be valuable.
@Scripts assessment: Visualization: we would like to see the data the way you see it.
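One of the define.xml questions above concerns dictionaries and codelists. For readers unfamiliar with the mechanics, a controlled-terminology codelist in Define-XML 2.0 is expressed roughly as the fragment below; the OID, name, and coded values are invented for illustration, not taken from any submission:

```xml
<!-- Illustrative sketch only: a Define-XML 2.0 CodeList for a
     severity variable. OID, name, and values are invented. -->
<CodeList OID="CL.SEVERITY" Name="Severity" DataType="text">
  <CodeListItem CodedValue="MINIMAL">
    <Decode>
      <TranslatedText xml:lang="en">Minimal</TranslatedText>
    </Decode>
  </CodeListItem>
  <CodeListItem CodedValue="MODERATE">
    <Decode>
      <TranslatedText xml:lang="en">Moderate</TranslatedText>
    </Decode>
  </CodeListItem>
</CodeList>
```

A variable's ItemDef then points at such a list via a CodeListRef, which is what makes the decoded values available to review tools.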

Post-it ideas
@Visualization: Expand visualization capabilities to other data types. Controlled terminology for histopathology and clinical signs; MedDRA into clinical.
@Tables: Is there a need to create group-comparison statistics and summary tables from SEND for reviewers?
@Kick-start: Checks for conformance and consistency before Kick-start? Sharing feedback from technical implementation (FDA to public). Sharing anything from FDA on what hinders reviews, e.g., quarterly "hot issues" made public.
@Modeling endpoints: "Flow if possible."
Define file: what is its usage in SEND?
Cross-domain visualization.
Rational plan for standards development, testing, implementation, and sunsetting; input for the next cycle.
SDRG.xml and define.xml: can this be one deliverable, usable by all?
Provide a mechanism whereby the reviewers routinely communicate their review experience and suggestions for improvement to the sponsors.

Breakout Discussion Outcome

Idea: Need ability to generate tabulated summaries leveraging SEND data.
NCT WG Recommendation: Possible PhUSE project.

Idea: Industry index of where to find information?
NCT WG Recommendation: Refer to the SEND Implementation User Group for assessment.

Ideas: Define usability: what is needed/helpful? Keys? Dictionary/codelists? Define file: what is its usage in SEND? Where should information be submitted: nonclinical SDRG and/or define file?
NCT WG Recommendation: Refer to the Define Codelist project team for the first item. FDA will provide more information regarding the use of the define file for nonclinical studies; then refer to the nSDRG project team for a recommendation on where information should be submitted.

Idea: Validation results: warnings/errors, consistency, interpretation? Some sponsors are not generating the datasets themselves; CROs do it for them, which sometimes leads to negotiations. Efforts to align this would be valuable.
NCT WG Recommendation: The nSDRG and cSDRG project teams will work on the representation of validation results in the SDRGs. Possible PhUSE project to align CRO and sponsor interpretations of warnings and errors.

Idea: Visualization: we would like to see the data the way you see it.
NCT WG Recommendation: FDA to take this under advisement.

Ideas: Expand visualization capabilities to other data types. Cross-domain visualization.
NCT WG Recommendation: The Visualization of Histopathology team and the Scripts team are interested in new ideas. Possible new PhUSE project for specific visualization goals (as was done for histology).

Idea: Controlled terminology for histopathology and clinical signs; MedDRA into clinical.
NCT WG Recommendation: Histopathology has controlled terminology for non-neoplastic findings, specimen, etc. General agreement that controlling clinical signs and mapping to MedDRA are not very valuable for nonclinical.

Idea: Is there a need to create group-comparison statistics and summary tables from SEND for reviewers?
NCT WG Recommendation: No.

Idea: Checks for conformance and consistency before Kick-start?
NCT WG Recommendation: Conformance and data consistency (i.e., number of noses) is expected from sponsors. Sponsors should assure the accurate representation and format of the data.

Ideas: Sharing feedback from technical implementation (FDA to public). Sharing anything from FDA on what hinders reviews, e.g., quarterly "hot issues" made public.

Idea: "Flow if possible."
NCT WG Recommendation: Possible new PhUSE project (like ADA) to explore representing flow cytometry data in SEND.

Idea: Rational plan for standards development, testing, implementation, and sunsetting; input for the next cycle.
NCT WG Recommendation: Refer to CDISC.

Idea: SDRG.xml and define.xml: can this be one deliverable, usable by all?
NCT WG Recommendation: Pending answer from FDA on how the define file is used.

Idea: Provide a mechanism whereby the reviewers routinely communicate their review experience and suggestions for improvement to the sponsors.
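The first row above asks for tabulated summaries leveraging SEND data. A minimal sketch of what such a group summary could look like, assuming SEND-style BW (body weight) and DM (demographics) records are already loaded into pandas DataFrames. Variable names follow the SENDIG (USUBJID, ARMCD, BWTESTCD, BWSTRESN); the function and the sample data are hypothetical, not part of any PhUSE deliverable:

```python
import pandas as pd


def group_summary(bw: pd.DataFrame, dm: pd.DataFrame) -> pd.DataFrame:
    """Join BW results to dose groups from DM, then summarize per group.

    Hypothetical sketch: returns n, mean, and standard deviation of the
    numeric standardized result (BWSTRESN) per dose group and test code.
    """
    merged = bw.merge(dm[["USUBJID", "ARMCD"]], on="USUBJID")
    return (
        merged.groupby(["ARMCD", "BWTESTCD"])["BWSTRESN"]
        .agg(n="count", mean="mean", sd="std")
        .reset_index()
    )


if __name__ == "__main__":
    # Invented example data for two dose groups of two animals each.
    dm = pd.DataFrame({
        "USUBJID": ["S1", "S2", "S3", "S4"],
        "ARMCD": ["CTRL", "CTRL", "HD", "HD"],
    })
    bw = pd.DataFrame({
        "USUBJID": ["S1", "S2", "S3", "S4"],
        "BWTESTCD": ["BW"] * 4,
        "BWSTRESN": [250.0, 260.0, 230.0, 240.0],
    })
    print(group_summary(bw, dm))
```

In practice the inputs would come from the submitted SAS transport (.xpt) files rather than in-memory frames, and a reviewer-facing table would add dose labels and units, but the groupby-and-aggregate step is the core of the idea.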