CT-PPS DB Info (Preliminary)

- The DB design will be the same as that currently used for the CMS Pixels, HCAL, GEM, and HGCAL databases.
- The DB is Oracle based.
- A DB for a sub-detector has three instances deployed:
  - Development (INT2R), at CERN IT
  - Integration (CMSINTR), at P5 (if needed)
  - Production (OMDS), at P5
- This DB stores:
  - Detector construction data
  - Detector conditions data (online & offline, if desired)
  - Detector configuration data

CT-PPS DB Intro, April 6, 2016, Umesh Joshi
CT-PPS DB Info (Preliminary)

- We are currently working on the following CMS sub-detectors:
  - Pixels: ongoing run (minimal) & FPIX Phase 1 upgrade
  - HCAL: ongoing run & upgrades
  - GEM: new setup. Near-future goals:
    - Construct 4 slices (end of 2016)
    - Configure the detector (working very closely with DAQ)
    - Conditions (working very closely with the data producers)
  - HGCAL: ongoing tests
CT-PPS DB Info (Database Schemas)

- For each DB, there will be 5 inter-connected core schemas:
  - CMS_CTP_CORE_CONSTRUCT (all detector hardware, electronics, etc.)
  - CMS_CTP_CORE_MANAGEMNT (institutions, locations, etc.)
  - CMS_CTP_CORE_IOV_MGMNT (IOVs & tags)
  - CMS_CTP_CORE_ATTRIBUTE (attributes of hardware & conditions)
  - CMS_CTP_CORE_COND (all metadata, including run info)
- ...and 2 inter-connected user schemas (to store data):
  - CMS_CTP_CTPPS_COND (all user-generated data; tables added as needed)
  - CMS_CTP_CTPPS_CONSTRUCT (rarely used)
CT-PPS DB Info (Info Stored)

- Detector components
  - Store all detector components and electronics (configuration & readout)
  - Track every component: ROCs, pixels, modules, etc.
  - Store all related data
- "Build the detector in the DB"
  - We have embraced the concept of building the detector in the DB
  - Use components stored in the DB to build devices:
    - Arm → Stations → RPs → Sensor Planes → Sensor Strips
  - Build the readout and control chains:
    - VFATs → Readout Boards → electronic channels
    - VFATs → Control Boards → electronic channels
  - Map detector components to readout and control chains:
    - Detector channel → electronic channel
- Detector configuration and monitoring then becomes straightforward
  - Store all configuration & monitoring data
  - Track the performance of individual channels
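The component chain described above (Arm through Sensor Strips) can be sketched as a nested record structure. This is only an illustration of "building the detector in the DB"; the class and field names below are hypothetical and not the actual DB schema.

```python
from dataclasses import dataclass, field

# Illustrative component hierarchy; names are hypothetical, not the real schema.
@dataclass
class Component:
    kind: str    # e.g. "Arm", "Station", "RP", "SensorPlane", "SensorStrip"
    label: str
    children: list = field(default_factory=list)

    def add(self, child):
        """Attach a child component and return it, for chained building."""
        self.children.append(child)
        return child

# Build a tiny slice of the Arm -> Station -> RP -> Plane -> Strip chain.
arm = Component("Arm", "45")
station = arm.add(Component("Station", "210"))
rp = station.add(Component("RP", "top"))
plane = rp.add(Component("SensorPlane", "0"))
plane.add(Component("SensorStrip", "0"))
```

In the actual DB the same parent/child relations are rows in the core CONSTRUCT schema rather than in-memory objects, but the tree shape is the same.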
CT-PPS DB Info (Data Loading)

- Loading of data into the DB is automated via a "DB Loader" process (cron job)
- Input data is in XML format (tightly coupled to the DB schema); the format depends on the table name, data type, type of part, etc.
- The XML format for each data type to be loaded will be provided by a person working on the DB
- Data load procedure (once the XML file format is defined):
  - Generate the XML files
  - Zip them
  - Copy them to a designated spool area
  - The DB Loader picks them up and loads the data
- The status of the load procedure is recorded in the DB
  - Schema: CORE MANAGEMNT; Table: CONDITIONS_DATA_AUDITLOG
CT-PPS DB Info (Data Access)

- The WBM interface is used to access data from the DB (principal method)
- WBM is currently in use for the CMS Run Registry and to access data from the Pixel, HCAL, GEM, and HGCAL DBs
- The WBM interface enables data from the DB to be viewed in various formats, e.g.:
  - in tabular format
  - as plots
  - as histograms
  - as the outputs of ROOT scripts
  - etc.
- NB: it will also be possible to dump data from the DB into a file and use a ROOT script to publish the data as desired (we have done this in the past)
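For the dump-and-script path mentioned above, the dumped file has to be parsed before a plotting script can use it. A minimal sketch, assuming a hypothetical tab-separated layout of one channel and one value per line (the real dump format depends on the table being exported):

```python
import csv

def read_dump(path):
    """Parse a tab-separated DB dump (hypothetical layout:
    channel<TAB>value per line) into (channel, value) pairs
    ready to be fed to a plotting/ROOT script."""
    rows = []
    with open(path, newline="") as f:
        for channel, value in csv.reader(f, delimiter="\t"):
            rows.append((channel, float(value)))
    return rows
```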
CT-PPS DB Info (Data Access)

- In general, users do not need to know how the data is stored in the DB. However, if they would like to, we will provide them with access to the DB and help if needed.
- Data, once written to the DB, is never deleted; changes are made as updates that disable the earlier versions.
- We (Valdas Rapsevicius) have also developed a mechanism to use the online DB together with WBM to:
  - store offline conditions data, together with tag and IOV, in the online DB
  - generate SQLite files of the various offline conditions
  - put the SQLite files in the offline drop box for loading into the CMS Offline DB
- This enables users to readily view the offline conditions used for a given tag & IOV without having to dump the data (as is currently done).
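The key idea in the mechanism above is keying each conditions payload by tag and IOV in a SQLite file. The sketch below shows only that idea with a hypothetical single-table layout; the real SQLite files destined for the offline drop box follow the CMS offline conditions format, not this table.

```python
import sqlite3

# Hypothetical table layout, for illustration only: real offline
# conditions SQLite files follow the CMS CondDB layout.
def write_conditions(db_path, tag, iov_since, payload):
    """Store one conditions payload keyed by (tag, IOV start)."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS conditions ("
        "tag TEXT, iov_since INTEGER, payload BLOB, "
        "PRIMARY KEY (tag, iov_since))"
    )
    con.execute(
        "INSERT OR REPLACE INTO conditions VALUES (?, ?, ?)",
        (tag, iov_since, payload),
    )
    con.commit()
    con.close()
```

With payloads keyed this way, looking up the conditions for a given tag & IOV is a single indexed query, which is what makes them readily viewable without a dump.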
CT-PPS DB Info (Sensor as Example)

- Assumptions:
  - The DB & Loader have been deployed
  - WBM has been deployed
  - The tests to be performed have been specified
  - The tables to contain the test data have been deployed
- After the sensors (wafers & detectors) arrive, register them in the DB:
  - Generate an XML file containing the sensors
  - Zip it
  - Copy it to the spool area
  - The sensors get loaded into the DB
- Load test data:
  - Encapsulate the test data in XML format (already known once the table is deployed)
  - Zip it
  - Copy it to the spool area
  - The data is loaded into the DB
- Retrieve data:
  - Use WBM to visualize the data
  - Use WBM to retrieve the data (as a text file)
  - Etc.
CT-PPS DB Info (Status)

- We have started work on the CT-PPS DB
  - The development DB has been deployed, and deployment of the production DB has been requested
- Set up the DDLs (Data Definition Language scripts) to deploy the DB
  - This includes deploying all tables, triggers, procedures, etc.
  - Check and make sure the framework is properly set up
- Set up the DB Loader for development & production
  - Need one Linux server to host the development version of the DB Loader (at CERN IT)
  - Need 2 Linux servers (for redundancy) to host the production versions of the DB Loader (at P5)
- For now we focus on the development DB at CERN IT