The online Web-Based Monitoring (WBM) system of the CMS experiment consists of a web services framework based on Jakarta/Tomcat and the ROOT data display package. Due to security concerns, many monitoring applications of the CMS experiment cannot be run outside of the experimental site. To allow remote users access to CMS experimental status information, we therefore implemented a set of Tomcat/Java servlets running in conjunction with ROOT applications to present current and historical status information to the remote user in a web browser. The WBM services act as a portal to activity at the experimental site. In addition to HTML, JavaScript is used to mark up the results in a convenient folder schema; no special browser options are necessary on the client side.

The primary source of data used by WBM is the online Oracle database; the WBM tools provide browsing and transformation functions to convert database entries into HTML tables, graphical plots, XML, text and ROOT-based object output. The ROOT object output includes histogram objects and n-tuple data containers suitable for download and further analysis by the user. We devised a system of meta-data entries describing the heterogeneous database, which allows the user to plot arbitrary database quantities, including multiple-value-versus-time plots and time correlation plots. The server consists of two dual-core CPUs and a RAID configuration of 5 TB of disks, enough to store up to a year of monitoring output (older results are archived to tape storage). Examples of WBM tools and services which assisted shifters in their remote monitoring duties are marked in this poster with the WBM tag. A sketch of the metadata-driven browsing idea follows.
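To make the metadata-driven browsing concrete, here is a minimal servlet sketch in the spirit of the framework described above, assuming a hypothetical WBM_METADATA table that maps a logical quantity name to the Oracle table and column holding its value-versus-time records. The class, schema, connection string and parameter names are illustrative assumptions, not the actual WBM code.

```java
// Minimal sketch of a WBM-style browsing servlet (illustrative only).
// Assumes a metadata table WBM_METADATA(name, data_table, data_column)
// describing where each loggable quantity lives in the online database.
import java.io.IOException;
import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class ValueVsTimeServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String quantity = req.getParameter("name");   // e.g. "MagnetCurrent"
        resp.setContentType("text/html");
        PrintWriter out = resp.getWriter();
        out.println("<html><body><table border='1'>");
        out.println("<tr><th>Time</th><th>Value</th></tr>");
        try (Connection db = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ONLINEDB",   // placeholder
                "reader", "secret")) {
            // Step 1: resolve the logical name to a physical table/column.
            PreparedStatement meta = db.prepareStatement(
                "SELECT data_table, data_column FROM WBM_METADATA WHERE name = ?");
            meta.setString(1, quantity);
            ResultSet m = meta.executeQuery();
            if (m.next()) {
                // Step 2: fetch the value-versus-time records. The table and
                // column names come from the trusted metadata, not the user.
                String sql = "SELECT change_time, " + m.getString("data_column")
                           + " FROM " + m.getString("data_table")
                           + " ORDER BY change_time";
                ResultSet rows = db.prepareStatement(sql).executeQuery();
                while (rows.next()) {
                    out.println("<tr><td>" + rows.getTimestamp(1)
                              + "</td><td>" + rows.getDouble(2) + "</td></tr>");
                }
            }
        } catch (SQLException e) {
            throw new ServletException(e);
        }
        out.println("</table></body></html>");
    }
}
```

The same resolved rows could equally be streamed as XML, plain text, or filled into a ROOT histogram for download, matching the output formats listed above.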
CMS Remote Monitoring, Alan L. Stone (Fermilab). 15th IEEE NPSS Real Time Conference 2007, Fermilab, Batavia IL, April 29 – May 4, 2007. Poster panels: HCAL Test Beam; MTCC Results (1); MTCC Results (2); Process Summary; Web-Based Monitoring; Remote Operations Center; CMS Tracker Slice Test; Tracker DQM; RunSummaryTIF; Screen Snapshot Service; Tracker DCS; ROC.

The Magnet Test and Cosmic Challenge (MTCC) took place in two phases, in August & October of 2006. MTCC was the first time CMS included more than one sub-detector in the readout, and in the presence of the full & stable magnetic field up to 4.1 T. Remote data quality monitoring (DQM) shifts were taken from the FNAL ROC, and shifters were tasked with running as many DQM modules as possible as well as the IGUANA Event Display. Communication was through video and phone conferencing, logbook entries and frequent e-mails. Webcams were installed in the FNAL ROC & the CERN control room. [Event display: B=3.8 T, Run 2605, Event 3981.]

We describe recent CMS activities performed at the Fermi National Accelerator Laboratory (FNAL) Remote Operations Center (ROC). The FNAL ROC, built in 2005, was located on the 11th floor of Wilson Hall at Fermilab and was equipped with a dozen Linux PCs with multiple LCD displays, a gigabit network connection, a file & web server, and videoconferencing capability. The main focus of the FNAL ROC group is independent of location: we concentrate on working closely with our CMS colleagues in the areas of data acquisition, triggers, data monitoring, data transfer, software analysis and database access, in order to commission the sub-detectors with the common goal of efficient & high-quality data taking for physics.

Timeline: HCAL Test Beam → construction of the WH11 ROC → WBM project begins → MTCC I & II → DQM development → commissioning of detector & trigger systems.

Screen Snapshot Service (SSS) flow:
Snapshot Producer (private network): captures screen images periodically and sends them to the snapshot server; implemented as a Java application.
Snapshot Server: receives images from producers, converts them to PNG and serves them to consumers; runs under Tomcat.
Snapshot Consumer (public network): periodically fetches snapshots from the server and displays them in a web browser.
A minimal producer sketch follows.
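The poster states only that the producer is implemented as a Java application; the following is a hedged sketch of what such a producer could look like, capturing the screen with java.awt.Robot and POSTing a PNG to the server. The server URL and the raw HTTP POST upload are assumptions; the real service's wire format is not described here.

```java
// Hedged sketch of an SSS-style snapshot producer: periodically grab the
// screen and upload it as PNG. URL and protocol are illustrative guesses.
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.image.BufferedImage;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import javax.imageio.ImageIO;

public class SnapshotProducer {
    public static void main(String[] args) throws Exception {
        // Hypothetical upload endpoint on the snapshot server
        URL server = new URL("http://snapshot-server.example:8080/sss/upload");
        Robot robot = new Robot();
        Rectangle screen = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
        while (true) {
            BufferedImage img = robot.createScreenCapture(screen);
            HttpURLConnection conn = (HttpURLConnection) server.openConnection();
            conn.setDoOutput(true);
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "image/png");
            try (OutputStream out = conn.getOutputStream()) {
                // Robot yields raw pixels; encode as PNG for the server
                ImageIO.write(img, "png", out);
            }
            conn.getResponseCode();   // drain the response
            conn.disconnect();
            Thread.sleep(60_000);     // capture roughly once per minute
        }
    }
}
```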
Example of SSS running on a PC in the Tracker Analysis Centre at CERN, showing the Slow Controls System (25 Apr, CERN). TIB:TOP: the top level of the control system for powering the TIB+ side; the three buttons correspond to the power for the TIB, the TID & the main LV supply, and the first two are in error because the third is OFF. PLC values: an array of temperatures from different probes of the TIB system; the name of the probe (current value) is in the white (colored) box; green means the temperature is inside the acceptable window (the chiller is set to 15 C) & grey means the probe is not functional. PLC trending: a plot of the above values; the wayward trace shows one probe which currently needs investigation (it is disabled).

Tracker DQM: the top plot shows a summary of the strip noise for six layers of the TOB; the bottom plot presents the same information as a distribution. The panel on the right allows selection of a specific layer in order to look at its strip noise in more detail for effective troubleshooting; similar summary plots are filled for each tracker subsystem. A collection of plots on the left characterizes tracking and cluster performance: the signal-to-noise ratio and width of the clusters associated with tracks (top row), and the number of reconstructed hits per track and the number of tracks per event (bottom row). The panel on the right shows a list of all available histograms for monitoring tracking performance.

The slice test was assembled at the Tracker Integration Facility (TIF) at CERN in the summer & fall of 2006. It instrumented ~20% of the CMS Tracker system for readout from January to May 2007, in warm and then cold (-10 C) stages: Outer Barrel (TOB±), Inner Barrel (TIB±) & End Cap (TEC±). A cosmic trigger (~1.1 Hz) with adjustable geometry was used, and over 1 million events have been collected so far.

HCAL Test Beam: a slice test of the CMS calorimeter. Hadronic Barrel (HB): 2 wedges & 8 segments (40 deg); Endcap (HE): 4 segments (20 deg); Outer (HO): Rings 0, 1, 2; plus one ECAL tile, with real electronics. Located at the CERN SPS H2 beam line at Prevessin. Maximum intensities for 1E12 incident protons at 400 GeV/c: 9E7 pi+ (3E7 pi-) at 200 GeV/c, and 1E6 e± at 150 GeV/c. HCAL test beam data taking ran from June to September 2006 at the H2 facility. Data were transferred to the FNAL ROC, typically within 5 minutes of the end of a run, and then reconstructed; summary results were made available for online monitoring by anyone with a web browser. Particle ID used TOF & Cherenkov counters for very low energies (<9 GeV), muon veto & beam halo counters, and trigger counters against multiple interactions. Example summary plots include the "banana plots": ECAL vs HCAL & total energy (muon subtracted) for 50 GeV pions and electrons (50 GeV pi-, 50 GeV e-).

The HCAL Data Quality Monitor runs in live time at CERN & updates every ~1000 events, processing only HCAL detector information, e.g. the number of digis and occupancy maps. Output is HTML & PNG files, web-browser friendly; feedback can be prompt, so a problem is addressed in minutes (a loose cable, reinitializing electronics, etc.). Follow the links. It also serves as a commissioning tool: for example, in monitoring the pedestals one expects a Gaussian mean & RMS ~ 1.

Tracker DCS (WBM): detector control system data are imported from the Oracle database. Comfort displays visualize the DCS status of high and low voltage, current and temperature on a geometrical detector cartoon. Pointing the mouse cursor over a single channel pops up a window with the channel values; clicking on a channel gives more details and histogram plots.

Run 7636, Event 2648, collected on April 13th: the cosmic muon was detected in the TIB, TOB and TEC. The TIB (black) is in the center, followed by the TOB (lighter gray) and then the TEC (darker gray). In the IGUANA event displays, the track is drawn as a red line, the rechits are represented by green dots, and the clusters of rechits are shown as blue dots. IGUANA can run remotely on a live stream of data (about 1 event/min) through a WBM Event Proxy.

FNAL ROC support for SiTracker activities included data transfer, bookkeeping, Elog, Event Display, RunSummary & DCS monitoring (WBM).

RunSummaryTIF (WBM): this servlet is a query tool whose form allows the user to search for runs by various optional criteria such as date and time, run mode (Physics, Pedestal, Timing, Calibration), partition (trigger configuration), and more. An example of a date query is shown, with a multi-run result, and the result for Run 7647, which had 26 file fragments & 9736 events. From single-run results, links are provided to the Online DQM, the Online Elog, and the HV status. A method of returning a query in simple text format is also provided: users can query the database from a script or batch job using a wget command with several options, as in the sketch below.
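As an analogue of the wget usage just mentioned, here is a small sketch of a scripted text-mode query, written in Java to match the other examples. The host, path and query parameters are guesses for illustration, not the documented RunSummaryTIF interface.

```java
// Hedged sketch: fetch RunSummaryTIF results as plain text from a script,
// the equivalent of the wget usage mentioned in the poster. The host,
// path and parameter names below are illustrative assumptions.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;

public class RunSummaryQuery {
    public static void main(String[] args) throws Exception {
        // Hypothetical query: runs in a date window, returned as text
        URL query = new URL("http://cmswbm.example.cern.ch/RunSummaryTIF"
                          + "?format=text&fromDate=2007-04-01&toDate=2007-04-30");
        try (BufferedReader in = new BufferedReader(
                 new InputStreamReader(query.openStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);   // one run per line in text mode
            }
        }
    }
}
```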
DCS trending (WBM): information for the last hour of a single-channel device; histograms can be rescaled by value or by time.

The Remote Operations Center is located on the 1st floor of Wilson Hall at Fermilab. Its primary focus is to establish a ROC that supports the commissioning and operations of both the LHC and CMS by the spring of 2007. One of the main goals is to give CMS and accelerator scientists and engineers remote access to the information and displays available in the control rooms at CERN, so the consoles & layout of the room were chosen to match the specifications at CERN. Each console includes 2 console PCs (eye level) & 1 fixed-display PC (upper level), the latter connected to a projector. Network: open access (gigabit switch) & protected access (dedicated router). Videoconferencing is available in a dedicated adjacent conference room, with point-to-point capability from the console equipment. Official opening: 12 Feb 2007. The CMS Tracker Slice Test members were the first users of the ROC: they commissioned the brand-new infrastructure, took shifts and participated in live-time remote monitoring. The Tier-1 computing facility team has developed detailed monitoring displays & alarms to ensure the mass storage and processing facilities at Fermilab are operational, and the ROC is staffed daily by a Tier-1 shifter. CMS trigger & detector commissioning begins in late May 2007, in preparation for global data taking. Timeline: Tracker Slice Test → location move to WH1 → LHC beam & physics.

MTCC Results: a cosmic muon was detected by all four detectors participating in the run. The event display shows how the particle traversed the detector, with the reconstructed 4D segments in the muon drift tubes (magenta), the reconstructed hits in the HCAL (blue), the uncalibrated reconstructed hits in the ECAL (light green), and the locally reconstructed track in the Tracker (green). A muon track was reconstructed in the drift tubes and extrapolated back into the detector, taking the magnetic field into account. [I. Osborne]

The Tier-0 & Tier-1 facility teams at CERN & FNAL handled all the data transfer & mass storage. However, the files were not in a user-friendly (ROOT) format, and no database was in place to tell users the "when, what & where" of the files. The FNAL ROC group devised a solution for MTCC data bookkeeping & created an automation scheme (in Python) for job submission, file conversion and data quality monitoring: a 15-minute cron job checks for new data files at FNAL; a Condor-based work-cluster batch system runs the jobs; the Condor DAGMan facility synchronizes the processing file by file; and an Elog entry is made whenever new merged files are completed. All of it is available through WBM.

The MTCC Process Summary page contains information on the status of the MTCC data, ordered by the most recent run number (linked to RunSummary). The Process Summary color coding guides shifters on the state of transfer (CASTOR to dCache), conversion (streamer to ROOT) and/or DQM processing (Trigger, Tracker, HCAL or CSC, with links to WBM). For reference, the number of events, the total number of files per run, the stop time and the magnet current are also included (WBM).

DQM Browser (WBM): output of the online High Level Trigger (HLT) DQM, made available through the WBM tools to the CMS and FNAL shifters in live time for prompt monitoring & feedback to LTC experts.

Local Trigger & Control (LTC) muon bit pattern for run 2623 (a decoding sketch appears at the end of this transcript):
bit 0: Drift Tube (DT)
bit 1: Cathode Strip Chamber (CSC)
bit 2: RBC1 (RPC trigger for wheel +1)
bit 3: RBC2 (RPC trigger for wheel +2)
bit 4: RPC-TB (RPC trigger from the trigger board)
Only muon triggers were used in MTCC-I; calorimeter triggers were added in MTCC-II.

Magnet current as a function of time for the run (WBM).

The CSC muon chambers provide signals from anode wires (cathode strips) along constant η (φ). Each chamber has six layers of wires & strips. The 2D hits (recHits) are monitored in the r-φ plane. The chambers are arranged in rings around the beam pipe; there are 4 separate layers of rings, referred to as stations, with Station 1 (closest to the interaction point) through Station 3 (furthest from the IP) included in MTCC-I.

Tracker DQM output (WBM): the Tracker participated only in MTCC-I. Reprocessing of the MTCC-I Tracker data was crucial for algorithm & alignment studies: no additional tracker data of any kind would be available for several months, & no magnetic-field data for over a year. The FNAL ROC provided a common sample of ROOT files for tracker data analyses. The reprocessed data sample consisted of ~100 runs in total: B=0 T (11M events), B=3.8 T (14M events), B=4.0 T (2M events).
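Finally, returning to the LTC muon bit pattern listed earlier: a minimal decoding sketch, assuming the five trigger sources occupy bits 0-4 of an integer status word exactly as enumerated above. The word layout and class names are assumptions for illustration.

```java
// Hedged sketch: decode an LTC muon trigger bit pattern, assuming the
// five sources listed in the poster occupy bits 0-4 of a status word.
public class LtcBits {
    private static final String[] SOURCES = {
        "DT",      // bit 0: Drift Tube
        "CSC",     // bit 1: Cathode Strip Chamber
        "RBC1",    // bit 2: RPC trigger for wheel +1
        "RBC2",    // bit 3: RPC trigger for wheel +2
        "RPC-TB"   // bit 4: RPC trigger from the trigger board
    };

    public static String decode(int pattern) {
        StringBuilder fired = new StringBuilder();
        for (int bit = 0; bit < SOURCES.length; bit++) {
            if ((pattern & (1 << bit)) != 0) {
                if (fired.length() > 0) fired.append(" + ");
                fired.append(SOURCES[bit]);
            }
        }
        return fired.length() > 0 ? fired.toString() : "none";
    }

    public static void main(String[] args) {
        // e.g. pattern 0b00011 means the DT and CSC triggers both fired
        System.out.println(decode(0b00011));   // prints: DT + CSC
    }
}
```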