CT-PPS DB Intro (Umesh Joshi, April 6, 2016)

CT-PPS DB Info (Preliminary)
- The DB design will be the same as currently used for the CMS Pixels, HCAL, GEM, and HGCAL databases.
- The DB is Oracle based.
- A DB for a sub-detector has three deployed instances:
  - Development (INT2R) in CERN IT
  - Integration (CMSINTR) at P5 (if needed)
  - Production (OMDS) at P5
- This DB stores:
  - Detector construction data
  - Detector conditions data (online and, if desired, offline)
  - Detector configuration data

CT-PPS DB Info (Preliminary)
- We are currently working on the following CMS sub-detectors:
  - Pixels: ongoing run (minimal) & FPIX Phase 1 upgrade
  - HCAL: ongoing run & upgrades
  - GEM: new setup. Near-future goals are:
    - Construct 4 slices (end of 2016)
    - Configure the detector (working very closely with DAQ)
    - Conditions (working very closely with the data producers)
  - HGCAL: ongoing tests

CT-PPS DB Info (Database Schemas)
- For each DB there will be 5 inter-connected core schemas:
  - CMS_CTP_CORE_CONSTRUCT (all detector hardware, electronics, etc.)
  - CMS_CTP_CORE_MANAGEMNT (institutions, locations, etc.)
  - CMS_CTP_CORE_IOV_MGMNT (IOVs & tags)
  - CMS_CTP_CORE_ATTRIBUTE (attributes of hardware & conditions)
  - CMS_CTP_CORE_COND (all metadata, including run info)
- and 2 inter-connected user schemas (to store data):
  - CMS_CTP_CTPPS_COND (all user-generated data; tables added as needed)
  - CMS_CTP_CTPPS_CONSTRUCT (rarely used)
- A short connection sketch for browsing these schemas follows below.
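As a concrete illustration, here is a minimal Python sketch of how one might browse these schemas on the development instance. It assumes the cx_Oracle client library, a read-only account (the name "ctpps_reader" is made up), and that the TNS alias "int2r" resolves to INT2R; it only queries the standard Oracle data dictionary, not any particular CT-PPS table.

```python
# Minimal sketch, assuming cx_Oracle is installed and a read account exists.
import cx_Oracle

def list_ctpps_tables(user, password, dsn="int2r"):
    """List the tables owned by the CT-PPS core and user schemas."""
    with cx_Oracle.connect(user, password, dsn) as connection:
        cursor = connection.cursor()
        cursor.execute(
            "SELECT owner, table_name FROM all_tables "
            "WHERE owner LIKE 'CMS_CTP_%' ORDER BY owner, table_name"
        )
        for owner, table in cursor:
            print(f"{owner}.{table}")

# Example (hypothetical read-only account):
# list_ctpps_tables("ctpps_reader", "secret")
```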

CT-PPS DB Info (Info Stored)
- Detector components:
  - Store all detector components and electronics (configuration & readout).
  - Track every component: ROCs, pixels, modules, etc.
  - Store all related data.
- "Build the detector in the DB":
  - We have embraced the concept of building the detector in the DB.
  - Use the components stored in the DB to build devices:
    - Arm → Stations → RPs → Sensor Planes → Sensor Strips
  - Build the readout and control chains:
    - VFATs → Readout Board → electronic channels
    - VFATs → Control Boards → electronic channels
  - Map detector components to the readout and control chains:
    - Detector channel → electronic channel
- Detector configuration and monitoring then become straightforward:
  - Store all configuration & monitoring data.
  - Track the performance of individual channels.
- An illustrative sketch of this hierarchy follows below.
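To make the "build the detector in the DB" idea concrete, the following Python sketch mocks up the assembly hierarchy and a detector-channel to electronic-channel map. The part labels, the number of strips per plane, and the 128-channel VFAT granularity are illustrative assumptions; in the real system this information lives in the Oracle core schemas, not in Python objects.

```python
# Illustrative only: labels and counts are assumptions, not DB content.
from dataclasses import dataclass, field

@dataclass
class Part:
    kind: str        # e.g. "Arm", "Station", "RP", "SensorPlane", "SensorStrip"
    label: str
    children: list = field(default_factory=list)

    def add(self, child):
        """Attach a sub-component and return it, so assemblies can be chained."""
        self.children.append(child)
        return child

# "Build the detector in the DB": Arm -> Station -> RP -> Sensor Plane -> Sensor Strip
arm = Part("Arm", "arm45")
station = arm.add(Part("Station", "st210"))
rp = station.add(Part("RP", "rp_near_top"))
plane = rp.add(Part("SensorPlane", "plane0"))
strips = [plane.add(Part("SensorStrip", f"strip{i:03d}")) for i in range(512)]

# Map detector channels to the readout chain (VFAT -> electronic channel),
# assuming 128 channels per VFAT purely for illustration.
channel_map = {("plane0", i): (f"vfat{i // 128}", i % 128) for i in range(512)}
print(channel_map[("plane0", 130)])   # -> ('vfat1', 2)
```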

CT-PPS DB Info (Data Loading)
- Loading data into the DB is automated via a "DB Loader" process (cron job).
- Input data are in XML format, tightly coupled to the DB schema; the format depends on the table name, data type, type of part, etc.
- The XML format for each data type to be loaded will be provided by a person working on the DB.
- Data load procedure, once the XML file format is defined (see the sketch below):
  - Generate the XML files
  - Zip them
  - Copy them to a designated spool area
  - The DB Loader picks them up and loads the data
- The status of the load procedure is recorded in the DB:
  - Schema: CORE MANAGEMNT; Table: CONDITIONS_DATA_AUDITLOG
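Below is a hedged sketch of the client side of this procedure: write an XML payload, zip it, and copy it to the spool area for the DB Loader to pick up. The XML element names, the spool path, and the payload content are placeholders; the authoritative per-data-type XML format is the one provided by the DB team.

```python
# Client-side sketch of the load procedure; paths and element names are placeholders.
import shutil
import zipfile
import xml.etree.ElementTree as ET
from pathlib import Path

SPOOL_AREA = Path("/data/ctpps/dbloader/spool")   # placeholder spool path

def write_payload(xml_path, run, values):
    """Write a toy conditions payload; real element names come from the DB team."""
    root = ET.Element("ROOT")
    header = ET.SubElement(root, "HEADER")
    ET.SubElement(header, "RUN_NUMBER").text = str(run)
    dataset = ET.SubElement(root, "DATA_SET")
    for channel, value in values.items():
        data = ET.SubElement(dataset, "DATA")
        ET.SubElement(data, "CHANNEL").text = str(channel)
        ET.SubElement(data, "VALUE").text = str(value)
    ET.ElementTree(root).write(xml_path, xml_declaration=True, encoding="UTF-8")

def ship_to_spool(xml_path):
    """Zip the XML file and copy it to the spool area for the DB Loader cron job."""
    zip_path = xml_path.with_suffix(".zip")
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(xml_path, arcname=xml_path.name)
    shutil.copy(zip_path, SPOOL_AREA)

payload = Path("pedestals_run1234.xml")
write_payload(payload, run=1234, values={0: 2.31, 1: 2.28})
ship_to_spool(payload)
```

Once the loader has run, the outcome can be checked in the CONDITIONS_DATA_AUDITLOG table of the CORE MANAGEMNT schema mentioned above.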

CT-PPS DB Info (Data Access)
- Accessing data from the DB:
  - The WBM interface is the principal method for accessing data from the DB.
  - WBM is currently in use for the CMS Run Registry and for accessing data from the Pixel, HCAL, GEM, and HGCAL DBs.
  - The WBM interface lets data from the DB be viewed in various formats, e.g.:
    - in tabular format
    - as plots
    - as histograms
    - as the displayed output of ROOT scripts
    - etc.
- NB: It will also be possible to dump data from the DB into a file and use a ROOT script to publish the data as desired (we have done this in the past); a sketch of this route follows below.
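As an example of the dump-and-plot route mentioned in the NB, here is a small Python sketch that reads a dumped text file and histograms the values with matplotlib. The two-column file layout and the file name are assumptions; the workflow described on the slide used a ROOT script for this step.

```python
# Sketch, assuming the dump is a plain two-column text file (channel, value).
import matplotlib.pyplot as plt

channels, values = [], []
with open("ctpps_test_dump.txt") as dump:          # placeholder dump file name
    for line in dump:
        if line.strip() and not line.startswith("#"):
            channel, value = line.split()[:2]
            channels.append(int(channel))
            values.append(float(value))

plt.hist(values, bins=50)
plt.xlabel("measured value")
plt.ylabel("channels")
plt.title("Example dump from the CT-PPS conditions DB")
plt.savefig("dump_histogram.png")
```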

CT-PPS DB Info (Data Access)
- In general, users do not need to know the data storage details in the DB. However, if they would like to, we will provide them with access to the DB and help if needed.
- Data, once written to the DB, are never deleted; changes are made by applying updates and disabling the earlier versions.
- We (Valdas Rapsevicius) have also developed a mechanism that uses the online DB together with WBM to (see the sketch below):
  - store offline conditions data, together with tag and IOV, in the online DB
  - generate SQLite files of the various offline conditions
  - put the SQLite files in the offline drop box for loading into the CMS Offline DB
- This lets users readily view the offline conditions used for a given tag & IOV without having to dump the data (as is currently done).
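For orientation only, the sketch below shows the kind of tag/IOV bookkeeping an SQLite conditions file contains, using a toy schema. The table and column names (TAG, PAYLOAD, IOV) and the JSON payload are hypothetical; the files actually placed in the offline drop box follow the CMSSW conditions format produced by the mechanism described above.

```python
# Toy tag/IOV/payload schema, not the CMSSW conditions format.
import json
import sqlite3

conn = sqlite3.connect("ctpps_conditions_sketch.db")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE IF NOT EXISTS TAG     (NAME TEXT PRIMARY KEY);
    CREATE TABLE IF NOT EXISTS PAYLOAD (ID INTEGER PRIMARY KEY, DATA TEXT);
    CREATE TABLE IF NOT EXISTS IOV     (TAG_NAME TEXT, SINCE INTEGER, PAYLOAD_ID INTEGER);
""")
cur.execute("INSERT OR IGNORE INTO TAG VALUES (?)", ("CTPPS_Alignment_v1",))
cur.execute("INSERT INTO PAYLOAD (DATA) VALUES (?)",
            (json.dumps({"rp_near_top": {"dx": 0.012, "dy": -0.003}}),))
cur.execute("INSERT INTO IOV VALUES (?, ?, ?)",
            ("CTPPS_Alignment_v1", 273000, cur.lastrowid))
conn.commit()
conn.close()
```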

Assumptions DB & Loader have been deployed WBM has been deployed Tests to be performed specified Tables to contain test data have been deployed After the sensors (wafers & detectors) arrive, register them in the DB. Generate a XML file containing the sensors Zip it Copy to the spool area Sensors get loaded in the DB Load test data Encapsulate test data in XML format (already known when table is deployed) Zip it Copy to spool area Data is loaded in DB Retrieve data Use WBM to visualize data Use WBM to retrieve data (text file) Etc. 04/06/2016CT-PPS DB Intro April 6, 2016 Umesh Joshi 8 CT-PPS DB Info (Sensor as Example)

CT-PPS DB Info (Status)
- We have started work on the CT-PPS DB. The development DB has been deployed, and we have requested deployment of the production DB.
- Set up the DDLs (data definition language) to deploy the DB; this includes deploying all tables, triggers, procedures, etc. (see the sketch below).
- Check and make sure the framework is properly set up.
- Set up the DB Loader for development & production:
  - One Linux server is needed to host the development version of the DB Loader (in CERN IT)
  - Two Linux servers (for redundancy) are needed to host the production versions of the DB Loader (at P5)
- For now we focus on the development DB in CERN IT.
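As a sketch of the DDL step, deployment could be driven by a small script that runs each DDL file against the development instance with SQL*Plus. The script names, the account "ctpps_admin", and the TNS alias "int2r" are assumptions; in practice the DB experts perform this deployment.

```python
# Sketch only: file names and account are assumptions; sqlplus prompts for the password.
import subprocess

ddl_scripts = ["core_construct.sql", "core_iov_mgmnt.sql", "ctpps_cond.sql"]

for script in ddl_scripts:
    # -S runs sqlplus in silent mode; check=True stops on the first failing script.
    subprocess.run(["sqlplus", "-S", "ctpps_admin@int2r", f"@{script}"], check=True)
```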