Slide 1: Commissioning, Global Runs & ROC
CMS Tier1 visit to FNAL, November 5, 2007
Kaori Maeshima

Slide 2: Why Remote Operation? Why at FNAL?
- Thousands of collaborators are located all over the world, and most of them are not resident at CERN. Collider HEP facilities have, however, never been more concentrated at a single site, so there is a need to disperse and disseminate operations.
- Advantages at FNAL for USCMS:
  - Natural base to serve the large USCMS community (LPC, the LHC Physics Center)
  - Tier-1 center and Data Operations team
  - Tevatron experiments' experience and resource sharing
  - Remote work base for LHC accelerator study and operation
  - Impact on the future way of operating HEP experiments (e.g. the ILC)
- Tools developed here for remote monitoring are NOT site specific, so they can be used at any ROC (e.g. the CMS Centre at CERN).

Slide 3: ROC Mini History
- Alan Stone arrives (full-time ROC person)
- Construction of the WH11NW ROC room
- HCAL test beam; WBM effort continues
- MTCC I and II; DQM work continues
- Location: move to WH1
- Tracker integration test
- Global commissioning runs (we are here)

Slide 4: Remote Operations (FNAL and CERN)
- Tools needed for remote status display (DQM, WBM, S3, etc.):
  - Must be easy to use and flexible, but robust
  - Must cooperate with firewalls and security
  - Must survive the trans-Atlantic crossing
  - Reuse where it makes sense: online and offline monitoring
- (Photos: the Fermilab Remote Operations Center and the CMS underground control room during the May 2007 Global Integration Run.) There are many of us (USCMS people) in action here and there!
- (Diagram labels: infrastructure, trust, one experiment, technical ground, tool development.)

Slide 5: CMS ROC Shift Activities at FNAL
- The first group to use the ROC for shifts was the CMS Tier-1 computing administration/operations team, taking shifts during weekday business hours:
  - Responsible for the FNAL Tier-1 resources (~50% of CMS computing in the U.S.)
  - Provides central support for the several university-based Tier-2 centers
- The first detector group to use the ROC for shifts was the silicon tracker (see Lenny's talk).
- Current: global runs and data operations. We are working closely with the commissioning people (Tiziano/Darin et al.), the DQM team (Emilio/Andreas et al.), subdetector DQM people, DPG, Data Operations, DBS, database/trigger/DAQ, and software support people.
- We would like to engage even more strongly with the Data Operations and DPG groups for the data handling (transfer, reconstruction, etc.) and analysis of the commissioning data from the real detector.

Slide 6: ROC Web Page
- Base page for many useful links
- Information kept up to date by Alan Stone

Slide 7: Background: WBM (Web-Based Monitoring)
- These are tools developed mainly by Bill Badgett et al. over the years of CDF running and monitoring. They have been found extremely useful by trigger, DAQ, and subdetector experts at both local and remote locations.
- In February 2006 we proposed bringing the WBM tools to CMS, and shortly afterwards we began development and implementation. WBM is a general tool, and CMS-specific applications are developing rapidly.
- In addition to the WBM software development, we also installed powerful server machines (cmsmon and cmsmon_dev) in order to distribute information from the experimental hall (P5) reliably to all CMS colleagues.

Slide 8: Examples of WBM: RunSummary Pages
- Clickable measurements: drill-down capability
- Plot creation: provides a ROOT TTree and histogram object in a file; resizable on resubmit
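The plot-creation feature above delivers a ROOT file containing a TTree and histogram objects. As a quick illustration of how a remote shifter might inspect such a file offline, here is a minimal PyROOT sketch; the file name and object names ("runsummary_12345.root", "runSummary", "hTriggerRate") are hypothetical placeholders, not the actual WBM output layout.

```python
# Minimal PyROOT sketch: inspect a ROOT file downloaded from a
# WBM-style RunSummary plot page.  The file and object names below
# are hypothetical placeholders, not the real WBM layout.
import ROOT

f = ROOT.TFile.Open("runsummary_12345.root")
if not f or f.IsZombie():
    raise RuntimeError("could not open file")

# List the objects stored in the file (TTree, histograms, ...)
for key in f.GetListOfKeys():
    print(key.GetName(), key.GetClassName())

# Read a histogram and a TTree, assuming objects with these names exist
hist = f.Get("hTriggerRate")
if hist:
    print("histogram entries:", hist.GetEntries())

tree = f.Get("runSummary")
if tree:
    tree.Print()           # show the branch structure
    for entry in tree:     # loop over the stored entries
        pass               # access entry.<branch_name> here

f.Close()
```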

Slide 9: Examples of WBM (cont.): DCS Information
- Access to the current "right-now" conditions, and to historical settings and trends
- Tracker DCS information from the TIF test
- HCAL channel information in preparation

Slide 10: More Run Monitor Tools
- CMS "Page 1": top-level status display, simple enough for even the most naive user (Oracle Portal)
- CMS Fermilab Data File Process Summary page: files copied to the FNAL Tier-1 site and the status of their processing (S. Murray)

Slide 11: DQM / RootBrowser
- (Screenshots: Silicon Tracker Integration Facility cosmic ray run; Hadron Calorimeter (HCAL) global integration / cosmic ray run.)
- New DQM GUI with user markup (Lassi): dynamic JavaScript displays with a Tomcat/Java backend
- Andreas Meyer and Lassi are at FNAL this week. We will be working intensely on core DQM, the DQM GUI, interfaces, sub-detector needs, etc.

Slide 12: Screen Snapshot Service (S3)
- Remote operations need remote knowledge:
  - Operations screens (e.g. RunControl, HV control, event display) are valuable for remote users to know what is going on
  - But access to those nodes is normally tightly restricted
- What is the Screen Snapshot Service?
  - A way to provide periodic, read-only copies of display images (snapshots) for remote viewing
  - Similar to products like VNC, pcAnywhere, and VGA2WEB, but without the cost or the danger of accidental remote control
  - Can be used to make private-network displays viewable on the public internet (useful for remote monitoring)
  - Uses commonly available technologies for portability and ease of use: Java, JSP, Tomcat
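The service itself is built with Java, JSP, and Tomcat as noted above; purely as an illustration of the underlying idea (a periodic, read-only screen capture pushed to a web server, so viewers can never take control of the node), here is a rough Python sketch. The libraries used (Pillow, requests), the upload URL, and the snapshot period are assumptions for illustration only, not part of the real S3 implementation.

```python
# Illustrative sketch only: periodically capture the local screen and
# push a read-only snapshot image to a web server for remote viewing.
# The real Screen Snapshot Service uses Java/JSP/Tomcat; the libraries
# (Pillow, requests) and the URL below are assumptions.
import io
import time

import requests
from PIL import ImageGrab  # Pillow; runs on the node whose screen is published

SNAPSHOT_URL = "https://example.org/s3/upload"  # hypothetical endpoint
PERIOD_SECONDS = 30                             # snapshot period

def publish_snapshot():
    image = ImageGrab.grab()          # capture the full screen
    buf = io.BytesIO()
    image.save(buf, format="PNG")     # encode the capture as PNG in memory
    buf.seek(0)
    # Upload as a simple multipart POST; remote viewers only ever see the
    # image, so there is no danger of accidental remote control.
    requests.post(
        SNAPSHOT_URL,
        files={"snapshot": ("screen.png", buf, "image/png")},
        timeout=10,
    )

if __name__ == "__main__":
    while True:
        publish_snapshot()
        time.sleep(PERIOD_SECONDS)
```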

Slide 13: Screen Snapshot Service Example
- Actual snapshots from a CMS global integration run: the IGUANA event display and CMS RunControl (see Alex Oh, CHEP'07, contribution 279)

Slide 14: CMS Collision Hall – Busy & Crowded
(Photo: CMS collision hall, 2007, with labels: solenoid, electromagnetic calorimeter, muon drift tubes, "silicon tracker goes here".)
Status & schedule:
- Most of the heavy lowering is done.
- The November global run has been extended to two weeks: Nov. 26 – Dec. 7.
- The decision on when to lower the tracker will be made this week (likely sometime in December).
- A cosmic run (0 T) with the tracker is to be scheduled early in 2008.
- Cosmic runs with the magnet on (3.8 T) are planned for April 2008.
- Summer 2008: collisions.

Slide 15: Increasing Complexity of the 2007 Global Runs
(Chart: subsystems participating in each monthly global run, GREM, GREJ, GREJ', GREA, GRES; rows include HF, DT, RPC, CSC FEDs, Tracker FEDs, HB, HO.)
- Progress in trigger/DAQ/DQM; also a cosmic trigger with DT using the GT
- GRES: GCT read out (configured manually)

Slide 16: GRES Summary, GREN Preparation
- GRES concluded (Sept. 26–28):
  - Ran 7 HB wedges, 3 HO sectors, 4 DT sectors, 4 RPC sectors, and the CSC and Tracker FEDs
  - Data (~5 GB) taken and shipped to T1-FNAL, T2-Florida, and T2-MIT, as well as to the CAF
  - Analysis of the GRES data has started and already shows important findings that could not be obtained from MC data (noise, alignment, timing, etc.)
- The next global run is GREN (Nov. 26 – Dec. 1):
  - Implies completing the transition to CMSSW 1_7_0 and XDAQ 3.11/SLC4, and all the necessary preparations beforehand
- Prompt GRES and GREN data reconstruction by the Data Operations group would really help the CMS commissioning!

Slide 17: Correlated DT, CSC Noise Run
(Plots: correlated noise seen in the DT and in the CSC, with the CSC HV off; test triggers.)
News from this morning: problem found! Elog posting by Frank Geurts (Fri Oct 5, 12:24:46), subject "CSC/DT Global Noise Run: found it": the major source was welding equipment.

Slide 18: "Slide from HCAL GRES Report" (by Pavel, from a run meeting on Oct. 19)
- Very interesting and useful data
- Progress with DT vs. HCAL timing, DB, pedestal definition, noisy channels
- Looking forward to the November run; a third of HB is now commissioned!
- Is it at all possible to try a private DT/HCAL run earlier than late November? (Note: this happened; the run took place on Oct. 22.)

Slide 19: Summary
- CMS is now in a new phase: most of the detectors are in the collision hall and commissioning proceeds with "global runs", while work on each sub-detector continues. Many more sub-detectors are now included in the global runs, and we can learn a lot from the real data.
- A coherent way of looking at the huge amount of information (data, monitoring information, ...) is crucial for CMS operation as well as for CMS remote operation (in most cases the jobs to accomplish are the same). Similarly, online and offline monitoring can share many common tools.
- More organized shift activities are to be expected, more and more, at P5 and at the ROC.
- Our wish list, from the experience of working on the global runs from the ROC:
  - A better notification mechanism for data file arrivals at the Tier-1
  - A better transfer-trigger mechanism (which runs are being transferred, ...)
  - Reconstruction!
  - DQM results data transfer
  - We still need to gain experience with the use of the DB
- The latency of the data transfer has become much, much better. Thank you!!!