Commissioning, Global Runs & ROC All USCMS Meeting October 05, 2007 Kaori Maeshima.


Thanks... Alan Stone, Bill Badgett, Andreas Meyer, Darin Acosta, Zongru Wan and many more for preparing this talk in a hurry.

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 3
CMS Collision Hall – Busy & Crowded
[Photos of the CMS collision hall, with labels: Solenoid, Electromagnetic calorimeter, Muon drift tubes, “Silicon tracker goes here”]
– 2/3 of CMS is in the collision hall; heavy lowering of the rest begins in Oct.
– Services to the central piece of the barrel; preparing for silicon tracker installation in Nov.
– Global Runs End of Month, since May 2007
– March 2008: pixel installation. April 2008: commissioning runs with magnet. Summer 2008: collisions.

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 4
CMS New Phase: Integration/Commissioning
Theme of this talk: integration and commissioning, towards Remote Operation (ROC) at FNAL.
– My last report here was ~1 year ago, right after the MTCC put “ROC” on the CMS map; since then there is also a CERN ROC (CMS centre).
– Today's talk focuses on updates on “Global Runs” and ROC-related activities: activities (TIF tests, Global Runs, ...), monitoring tool development (WBM, DQM, ...), and infrastructure (FNAL).
Overall status: during the year, much of the detector was lowered and placed in UX5. The components are being integrated and commissioned to be read out as ONE experiment, while the detailed work to fully understand each sub-detector continues.
Note: a more global commissioning overview talk will be given next week at the Friday Wine & Cheese seminar at FNAL, by Darin Acosta.

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 5
Increasing Complexity of the 2007 Global Runs
[Chart: sub-detectors participating in each run (GREM, GREJ, GREJ', GREA, GRES): HF, DT, RPC, CSC FEDs, Tracker FEDs, HB, HO]
– Progress in the Trigger as well: cosmic trigger with DT using the GT; for GRES, GCT read out (configured manually)
– More on progress in DQM & ROC participation later: integration of the DQM system, development of the individual DQM programs (USCMS / sub-detector / trigger / DAQ); the ROC has made a large contribution in this area

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 6
GRES Summary, GREN Preparation
GRES concluded last Friday (Sept. 26 – 28)
– Ran 7 HB wedges, 3 HO sectors, 4 DT sectors, 4 RPC sectors, and CSC and Tracker FEDs
– Data (~5 GB) taken and shipped to T1-FNAL, T2-Florida and T2-MIT, as well as the CAF
GREO is cancelled. The next global run is GREN (Nov. 26 – 30), with at most a short break before the end-of-year cosmic run (CCR1) (tentatively Dec. 10 – 22).
– This implies completing the transition to CMSSW170 and XDAQ 3.11/SLC4, and all the necessary preparations beforehand.
Need to focus on preparation for GREN, and to work on analysis of GRES (more on this after a few pictures).

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 7
From GREA: ECAL + DT signal
– ECAL & DT chambers synchronised with the CMS clock
– The DT trigger is synchronised with the DT readout
– The DT trigger is integrated with the CMS Global Trigger
– The ECAL & DT readout is integrated with the global DAQ
Shifters also at FNAL, reading out FEDs from each sub-detector

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 8
RPC-DT coincidence (GRES)

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 9
“Conclusions” slide from the HCAL report (by Pavel, from today's run meeting)
– Very interesting/useful data
– Progress with DT vs HCAL timing, DB, pedestal definition, noisy channels
– Looking forward to the November run
– A third of HB is now commissioned!!!
– Is it at all possible to try a private DT/HCAL run earlier than late November???

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 10
Analyses Possible from GRES
DataOps will make a “pass 1” reconstruction pass on GRES data based on this set of cfg’s.
Analysis of GRES data:
– Inter-synchronization studies: are systems (or multiple sectors of the same system) timed-in with respect to each other? Is the timing stable (during runs, between runs)?
– Reconstruction performance: are cosmic muons identified in the right place, with the right MIP signal in the calorimeters?
– Correlated noise

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 11
Correlated DT, CSC Noise Run
[Plots: DT and CSC (HV off!) test triggers]
News from this morning: problem found!!!!!! Elog posting by Frank Geurts, Fri Oct 5 12:24:46, Subject: “CSC/DT Global Noise Run: found it” – the major source: welding equipment....

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 12
From GRES: DT and HCAL – hot off the press! (thanks to Ianna, Mayda, Darin.....) A 10 degree shift?

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 13
GRES Run and Analysis Information
Run description, access to data, etc., to aid those doing analyses, are described here: –
A Global Run hypernews discussion forum is also being set up.

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 14
Why Remote Operation? Why at FNAL?
– Thousands of collaborators located all over the world, most of them not resident at CERN. Collider H.E.P. facilities have, however, never been more concentrated at a single site. Need to disperse and disseminate.
Advantages at FNAL for USCMS:
– Natural base to serve the large USCMS community: LPC (LHC Physics Center), Tier-1 center, Data Operation team
– Tevatron experiments' experience & resource sharing
– Remote work base for LHC accelerator study & operation…
– Impact on the future of HEP – a way to operate --- ILC

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 15
Remote Operations: FNAL – CERN
Tools needed for remote status display:
– Must be easy to use, flexible, drillable
– Coöperative with firewall, security
– Must survive the trans-Atlantic crossing
[Photos: Fermilab Remote Operations Center and CMS Underground Control Room, May 2007 Global Integration Run]
There are many of us (USCMS people) in action here and there! Infrastructure / trust – one experiment / technical ground.

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 16
3-fold approach to ROC
Infrastructure – in use since Feb. for CMS, LHC, & Outreach
– WH11 ROC room – completed and used for MTCC-I, II (2006), HCAL test beam and other tool development activities, and now serving as a multi-purpose room (meetings, discussions, software/hardware development, etc.)
Actions
– MTCC I, II (Aug. – Oct. 2006)
– HCAL test beam (summer 2006, 2007)
– SiTracker test (Feb. – Jun. 2007)
– T0 shifts, Data Operation (Feb. – )
– Global Runs End of Month (GREM, GREJ, GREJ', GREA, GRES)
Monitoring Tool Development (software & hardware)
– DQM (data quality monitoring) development (Trigger, HCAL)
– Web Based Monitoring Tool (WBM)

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 17
2007 CMS ROC Shift Activities (current: Global Runs & Data Operation)
The first group to use the ROC for shifts was the CMS Tier-1 computing administration/operations team, which took shifts during weekday business hours. This team is:
– Responsible for FNAL Tier-1 resources (~50% of CMS computing in the U.S.)
– Providing central support for the several university-based Tier-2 centers
The first detector group to use the ROC for shifts was the silicon tracker – a coordinated effort with the silicon tracker group & CMS ROC people working together. The remote shift operation at the ROC involved about 15 people from several different institutions from Feb. – Jun. 2007.
Remote monitoring included:
– Data quality
– Data transfer & bookkeeping
– Event display
– Detector Control System

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 18
ROC Web Page
– Base page for many useful links: global run info, Commissioning, Run Coordination
– Information kept up-to-date by Alan Stone

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 19
Background: WBM (Web-Based Monitoring)
– These are tools developed mainly by Bill Badgett et al. over the years of CDF running/monitoring. They have been found extremely useful by Trigger/DAQ/sub-detector experts at local and remote locations.
– In February 2006 we proposed to bring the WBM tools to CMS, and shortly after we began the development and implementation. WBM is a general tool, and CMS-specific applications are developing rapidly.
– In addition to WBM software tool development, we also installed powerful server machines (cmsmon and cmsmon_dev) in order to distribute information outside of the experimental hall (P5) reliably for all CMS colleagues.

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 20
Web-Based Monitoring
Wealth of information in the database
 Trigger rates, event rates, cross sections, beam conditions, temperatures, voltages, environmental conditions, etc.
 The database is the preferred locale for configuration and monitoring data persistency
 Oracle 10 located at the CMS site; replicated to the offline world
 Has current and historical status data; latency ~ < 1 second to ~1 minute
Behind a firewall for security reasons; need a portal to gain access
 Provide display of contents, and provide access control
Typical data present: “Value vs. Time”
 Needs tools to access, plot, download, correlate
Complex, heterogeneous database
 Many schemas, many designers; already have 140 schemas just in the online database & not nearly done
Central description needed: global meta-data descriptive tables to correlate across subsystems
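To make the “Value vs. Time” idea concrete, here is a minimal sketch, in the spirit of the Tomcat/Java WBM services, of a helper class that reads one monitored quantity from the online Oracle database over JDBC. The schema, table and column names (CMS_MON.PVSS_VALUES, DPNAME, CHANGE_DATE, VALUE) and the connection details are hypothetical placeholders, not the real CMS schema.

// Hedged sketch (assumed names): read one monitored quantity as
// (time, value) pairs from the online Oracle database via JDBC.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.ArrayList;
import java.util.List;

public class ValueVsTimeFetcher {
    private final String url;      // e.g. "jdbc:oracle:thin:@//dbhost:1521/cmsonr" (placeholder)
    private final String user;
    private final String password;

    public ValueVsTimeFetcher(String url, String user, String password) {
        this.url = url;
        this.user = user;
        this.password = password;
    }

    /** Returns (unix time [s], value) pairs for one data point name in a time window. */
    public List<double[]> fetch(String dpName, Timestamp from, Timestamp to) throws SQLException {
        String sql = "SELECT change_date, value FROM cms_mon.pvss_values "
                   + "WHERE dpname = ? AND change_date BETWEEN ? AND ? ORDER BY change_date";
        List<double[]> points = new ArrayList<double[]>();
        Connection conn = DriverManager.getConnection(url, user, password);
        try {
            PreparedStatement ps = conn.prepareStatement(sql);
            ps.setString(1, dpName);
            ps.setTimestamp(2, from);
            ps.setTimestamp(3, to);
            ResultSet rs = ps.executeQuery();
            while (rs.next()) {
                // Store the timestamp in seconds alongside the measured value.
                points.add(new double[] { rs.getTimestamp(1).getTime() / 1000.0,
                                          rs.getDouble(2) });
            }
            rs.close();
            ps.close();
        } finally {
            conn.close();
        }
        return points;
    }
}

A servlet could hand such a list to a plotting layer or stream it as a downloadable table, which is the kind of access, plot and download tooling the slide refers to.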

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 21
RunSummary Pages
 Clickable measurements with drill-down capability
 Plot creation: provides a ROOT TTree and histogram object in a file; resizable on resubmit

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 22
Environmental / Slow Control (Zongru Wan)
Access to current “right-now” conditions… and historical settings and trends…

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 23
Trends-over-Time Plots
– Search for performance anomalies; zoom in on problems
– Interactive, historical, downloadable; selection of type of data
[Example plots: EM Calorimeter (ECAL) Test Beam]

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 24
DQM / RootBrowser
[Screenshots: Silicon Tracker Integration Facility cosmic ray run; Hadron Calorimeter (HCAL) global integration / cosmic ray run]
– New DQM GUI with user markup (Lassi Tuura)
– Dynamic JavaScript displays with a Tomcat/Java backend

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 25
More Run Monitor Tools (Francisco Yumiceva)
– CMS “Page 1”: top-level status display, simple enough for even the most naïve user (Oracle Portal)
– CMS Fermilab Data File Process Summary Page: files copied to the FNAL Tier-1 site and status of processing (JSP) (S. Murray)

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 26
Screen Snapshot Service, S3
Remote operations need remote knowledge
 Operations screens, e.g. RunControl, HV control, event display, are valuable for remote users to know what is going on
 But there are normally tight restrictions on access to those nodes
What is the Screen Snapshot Service?
 A way to provide periodic, read-only copies of display images (snapshots) for remote viewing
 Similar to products like VNC, pcAnywhere, and VGA2WEB, but without the cost or the danger of accidental remote control
 Can be used to make private-network displays viewable on the public internet (useful for remote monitoring)
 Uses commonly available technologies for portability and ease of use: Java, JSP, Tomcat
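As an illustration of the producer side, the following is a minimal sketch, not the actual S3 producer (which is a Java Web Start application), of a Java program that captures the local screen and POSTs it periodically to a snapshot web service. The service URL and the 30-second interval are assumptions for illustration.

// Hedged sketch of a snapshot producer: capture the local display with
// java.awt.Robot and POST the PNG to a snapshot web service at a fixed interval.
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import javax.imageio.ImageIO;

public class SnapshotProducer {
    public static void main(String[] args) throws Exception {
        URL service = new URL("http://snapshot.example.org/sss/upload");  // placeholder URL
        Robot robot = new Robot();
        Rectangle screen = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());

        while (true) {
            // Grab the whole screen and encode it as PNG in memory.
            BufferedImage shot = robot.createScreenCapture(screen);
            ByteArrayOutputStream png = new ByteArrayOutputStream();
            ImageIO.write(shot, "png", png);

            // Periodic HTTP POST with the image payload.
            HttpURLConnection conn = (HttpURLConnection) service.openConnection();
            conn.setDoOutput(true);
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "image/png");
            OutputStream out = conn.getOutputStream();
            out.write(png.toByteArray());
            out.close();
            conn.getResponseCode();   // complete the request; ignore the body
            conn.disconnect();

            Thread.sleep(30000);      // snapshot interval (a guess)
        }
    }
}

Because the producer only pushes images outward, no inbound connection to the protected node is ever needed, which is what makes the read-only, firewall-friendly behaviour described above possible.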

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 27
Screen Snapshot Service Example
[Actual S3 snapshots from the CMS global integration run: IGUANA event display; CMS RunControl (see Alex Oh, CHEP’07 #279)]

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 28
Odds and Ends
– Playback storage manager service enabling online-environment DQM tests even when the DAQ is not running (Kurt Biery)
– Work on the DQM GUI with Lassi, Nov. 7 – (visiting FNAL)
– DQM core programming, base monitor work starting
– Summary of “Online DQM Baseline Architecture” linked from
– For all DQM-related issues we are working very closely with Andreas Meyer et al.
– IN 2007/025, “putting trigger scaler and luminosity information into the event stream”: hardware and software are being prepared, and we plan to do a preliminary test in GREN (G3, TTC, test/programming – Frank Chlebana; TrigXMonitor – Richard Ruiz, Lorenzo Agostino)
– We have also started considering a mechanism for interrupt-driven fast alarms at the ROC (for example, beam dump), based on what is available at P5, without distracting the P5/accelerator operation
– (From the computing side): an MTCC reprocessing to make files readable with CMSSW > 1_5_0 and with new reconstruction is in the works

ROC Mini History and Plan
– Alan Stone arrives – full-time ROC person
– Construction of WH11NW ROC room
– HCAL test beam; WBM effort continues.....
– MTCC I and II; DQM work continues...
– Location: move to WH1
– Tracker integration test
– Commissioning & DQM for physics runs.... (we are here)

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 30
Plans and Main Goals for Future Runs

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 31
Summary
– CMS is now in a new phase: 2/3 of the detector is in the collision hall. The rest of the heavy parts are planned to be lowered soon (pixel in March 2008).
– Commissioning with “global runs”, while work on each sub-detector continues. Many more sub-detectors are now included in the global runs, and we can learn a lot from the data. Please start looking at REAL DATA!
– Availability of a coherent way of looking at the huge amount of information (data, monitoring information, ...) is crucial for CMS operation as well as for CMS remote operation.
– CMS is being commissioned, and so is the Remote Operation, at the same time.
– The ROC at FNAL (a beautiful facility) is here for all USCMS colleagues. Please use it to commission your detector, and also get involved in shift taking – added bonus: shift service credit.

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”
---- Backup Slides ----

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 33
ROC Features
– 4 CERN-style consoles (8 workstations) shared by CMS & LHC scientists
– 4 projectors to share content within the ROC or with remote participants
– Videoconferencing installed for two consoles; webcams for remote viewing of the ROC
– Secure keycard access to the ROC from the Atrium and the 1East meeting room
– Secure network for console PCs: dedicated subnet, dedicated router w/ Access Control Lists to restrict access
– 12-minute video essay displayed on the large “Public Display”, used by docents from the Education Department to explain CMS and the LHC to tour groups
– High Definition videoconferencing system for the conference room; HD viewing of the ROC, and HD display capabilities in the ROC
– Secure group login capability for consoles, with persistent console sessions – allows multiple users to share common console settings
– Telephone lines share a common number; international services enabled
– Access to LHC Physics Center computing resources

Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”, slide 34
Screen Snapshot Service Mechanism (Kurt Biery)
Provides real-time images of monitor displays to remote sites.
[Diagram: Snapshot Producers 1…n (Java Web Start apps, with a control widget) on the private network send a periodic HTTP POST with the image payload to the Web Service, which keeps the images in a disk cache; Web Clients 1…m (e.g. a Remote Ops web client) on the public network, across the firewall, retrieve them with a normal HTTP request – a plain web page, no special configuration.]
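To make the web-service side of the diagram concrete, here is a minimal sketch, not the actual S3 implementation, of a servlet that could accept the POSTed snapshot, keep only the latest copy in a disk cache, and serve it to web clients on GET. The cache path and URL mapping are assumptions for illustration.

// Hedged sketch (not the real S3 code): one servlet endpoint that stores the
// most recent POSTed snapshot in a disk cache and returns it on GET.
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class SnapshotServlet extends HttpServlet {
    // Placeholder cache location; a real service would key this per producer.
    private static final File CACHE = new File("/tmp/snapshot-cache/latest.png");

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        CACHE.getParentFile().mkdirs();
        // Write to a temp file first, then rename, so readers never see a partial image.
        File tmp = new File(CACHE.getParentFile(), "upload.tmp");
        copy(req.getInputStream(), new FileOutputStream(tmp));
        tmp.renameTo(CACHE);
        resp.setStatus(HttpServletResponse.SC_OK);
    }

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        if (!CACHE.exists()) {
            resp.sendError(HttpServletResponse.SC_NOT_FOUND);
            return;
        }
        resp.setContentType("image/png");
        copy(new FileInputStream(CACHE), resp.getOutputStream());
    }

    /** Copies a stream to completion and closes both ends. */
    private static void copy(InputStream in, OutputStream out) throws IOException {
        try {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        } finally {
            in.close();
            out.close();
        }
    }
}

Keeping only the latest image per display is what makes the service "periodic, read-only" rather than interactive: clients poll an ordinary web page and never open a connection back into the private network.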

UXC: general logistic arrangement today
– Completing phase 1 of the +end programme on YB-1 and -2; phase 2 starts later this week
– ECAL barrel installed
– TK PP1, pipework and LV cabling off the critical path; HB/EB cabling starting
– Plug installed
– HF raising test
– CASTOR / TOTEM tests
– Fwd pipe and VAX (pump) installed