Commissioning, Global Runs & ROC
All USCMS Meeting, October 5, 2007
Kaori Maeshima
Thanks to Alan Stone, Bill Badgett, Andreas Meyer, Darin Acosta, Zongru Wan, and many more for helping prepare this talk in a hurry.
Oct. 5th, 2007, All USCMS Meeting, “Commissioning, Global Runs & ROC”
CMS Collision Hall – Busy & Crowded
–2/3 of the detector is in the collision hall; heavy lowering of the rest begins in Oct.
–Services to the central piece of the barrel
–Preparing for silicon tracker installation in Nov.
–Global Runs End of Month, since May 2007
–March 2008: pixel installation
–April 2008: commissioning runs with magnet
–Summer 2008: collisions
(Photo: CMS Collision Hall, 2007.08.30; labels: solenoid, electromagnetic calorimeter, muon drift tubes, “silicon tracker goes here”)
Theme of this talk: integration and commissioning, towards Remote Operation (ROC) at FNAL.
–My last report here was ~1 year ago, right after MTCC 2006: it put “ROC” on the CMS map (CERN ROC, CMS Centre).
–Today's talk focus: updates on “Global Runs” and ROC-related activities
  Activities (TIF tests, Global Runs, ...)
  Monitoring tool development (WBM, DQM, ...)
  Infrastructure (LHC@FNAL)
Overall status: during the year, many of the detector components were lowered and placed in UX5. They are being integrated and commissioned to be read out as ONE experiment, while the detailed work to fully understand each sub-detector continues. 2007 is a new phase for CMS: integration/commissioning.
Note: for a more global commissioning overview, see next week's Friday Wine & Cheese seminar at FNAL, by Darin Acosta.
Increasing Complexity of the 2007 Global Runs
(Chart: sub-detectors participating in each run, across GREM, GREJ, GREJ', GREA, GRES: HB, HO, HF, DT, RPC, CSC FEDs, Tracker FEDs)
Progress in the trigger as well: cosmic trigger with DT using the GT; in GRES, GCT read out (configured manually).
More on progress in DQM & ROC participation later: integration of the DQM system and development of the individual DQM programs (USCMS / sub-detector / trigger / DAQ); the ROC makes a large contribution in this area.
GRES Summary, GREN Preparation
GRES concluded last Friday (Sept. 26 – 28):
–Ran 7 HB wedges, 3 HO sectors, 4 DT sectors, 4 RPC sectors, and the CSC and Tracker FEDs
–Data (~5 GB) taken and shipped to T1-FNAL, T2-Florida, and T2-MIT, as well as the CAF
GREO is cancelled. The next global run is GREN (Nov. 26 – 30), with at most a short break before the end-of-year Cosmic Run (CCR1) (tentatively Dec. 10 – 22).
–This implies completing the transition to CMSSW 1_7_0 and XDAQ 3.11/SLC4, and all the necessary preparations beforehand.
Need to focus on preparation for GREN, and to work on analysis of GRES (more on this after a few pictures).
From GREA: ECAL + DT signal
–ECAL & DT chambers synchronised with the CMS clock
–The DT trigger is synchronised with the DT readout
–The DT trigger is integrated with the CMS Global Trigger
–The ECAL & DT readout is integrated with the global DAQ
Shifters also at FNAL, reading out FEDs from each sub-detector.
RPC-DT coincidence (GRES)
“Conclusions” slide from the HCAL report (by Pavel, from today's run meeting):
–Very interesting/useful data
–Progress with DT vs HCAL timing, DB, pedestal definition, noisy channels
–Looking forward to the November run; a third of HB is now commissioned!
–Is it at all possible to try a private DT/HCAL run earlier than late November?
Analyses Possible from GRES Data
DataOps will make a “pass 1” reconstruction pass on the GRES data based on this set of cfg’s.
Analysis of GRES data:
–Inter-synchronization studies: are systems (or multiple sectors of the same system) timed in with respect to each other? Is the timing stable (during runs, between runs)?
–Reconstruction performance: are cosmic muons identified in the right place, with the right MIP signal in the calorimeters?
–Correlated noise
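As a toy illustration of the inter-synchronization check, a "timed-in" test between two subsystems could be sketched like this. This is not CMS/CMSSW code: the hit pairs, function names, and the one-bunch-crossing (25 ns) tolerance are illustrative assumptions only.

```python
# Minimal sketch of an inter-synchronization check between two subsystems.
# Hypothetical data layout: each entry is (t_dt_ns, t_hcal_ns) for one
# matched cosmic-muon crossing; real GRES analyses use CMSSW, not this.
from statistics import mean, stdev

def timing_offset(pairs):
    """Mean DT-HCAL time difference and its spread, in ns."""
    diffs = [t_dt - t_hcal for t_dt, t_hcal in pairs]
    return mean(diffs), (stdev(diffs) if len(diffs) > 1 else 0.0)

def timed_in(pairs, bx_ns=25.0):
    """Call the systems 'timed in' if the mean offset is within
    one LHC bunch crossing (25 ns); the cutoff is an assumption."""
    offset, _ = timing_offset(pairs)
    return abs(offset) < bx_ns

# Toy example: DT consistently ~10 ns ahead of HCAL.
hits = [(110.0, 100.0), (135.0, 124.0), (161.0, 152.0)]
offset, spread = timing_offset(hits)
print(round(offset, 1))
print(timed_in(hits))
```

Timing stability between runs would then amount to comparing the per-run offsets and spreads over time.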
Correlated DT, CSC Noise
Run 20558, test triggers; DT and CSC (HV off!).
News from this morning: problem found! Elog posting by Frank Geurts (Fri Oct 5, 12:24:46), subject “CSC/DT Global Noise Run: found it”, https://cmsdaq.cern.ch/elog/CSC/2796: the major source was welding equipment.
From GRES: DT and HCAL
Hot off the press! (Thanks to Ianna, Mayda, Darin, ...) A 10-degree shift?
GRES Run and Analysis Information
Run description, access to data, etc., to aid those doing analyses, are described here:
–https://twiki.cern.ch/twiki/bin/view/CMS/GRESAnalysis
A Global Run hypernews discussion forum is also being set up.
Why Remote Operation? Why at FNAL?
–Thousands of collaborators are located all over the world; most of them are not resident at CERN.
–Collider HEP facilities have, however, never been more concentrated at a single site: we need to disperse and disseminate.
Advantages at FNAL for USCMS:
–A natural base to serve the large USCMS community (LPC, the LHC Physics Center)
–Tier-1 center and Data Operations team
–Tevatron experiments' experience & resource sharing
–Remote work base for LHC accelerator study & operation
–Impact on the future of HEP and the way we operate (ILC, ...)
Remote Operations: FNAL and CERN
Tools needed for remote status display:
–Must be easy to use, flexible, drillable
–Coöperative with firewall and security requirements
–Must survive the trans-Atlantic crossing
Infrastructure and trust: one experiment, common technical ground.
(Photos: Fermilab Remote Operations Center, LHC@FNAL; CMS underground control room, 2007 May Global Integration Run. There are many of us, USCMS people, in action here and there!)
3-fold Approach to ROC
Infrastructure:
–LHC@FNAL: in use since Feb. 2007 (CMS, LHC, & outreach)
–WH11 ROC room: completed at the end of 2005; used for MTCC-I/II 2006, the HCAL test beam, and other tool-development activities, and now as a multi-purpose room (meetings, discussions, software/hardware development, etc.)
Actions:
–MTCC I, II (Aug – Oct 2006)
–HCAL test beam (summer 2006, 2007)
–SiTracker test (Feb – Jun 2007)
–T0 shifts, Data Operation (Feb 2007 – )
–Global Runs End of Month (GREM, GREJ, GREJ', GREA, GRES, ..., 2007)
Monitoring tool development (software & hardware):
–DQM (data quality monitoring) development (Trigger, HCal)
–Web Based Monitoring tools (WBM)
2007 CMS ROC Shift Activities
The first group to use the ROC for shifts was the CMS Tier-1 computing administration/operations team, during weekday business hours:
–Responsible for FNAL Tier-1 resources (~50% of CMS computing in the U.S.)
–Provides central support for the several university-based Tier-2 centers
The first detector group to use the ROC for shifts was the silicon tracker: a coordinated effort with the silicon tracker group & CMS ROC people working together. The remote shift operation at the ROC involved about 15 people from several institutions from Feb – Jun 2007.
Remote monitoring included: data quality, data transfer & bookkeeping, event display, Detector Control System.
Current: global runs & Data Operation.
ROC Web Page: http://uscms.org/roc
Base page for many useful links (global run info, Commissioning, Run Coordination); information kept up to date by Alan Stone.
Background: WBM (Web-Based Monitoring)
These tools were developed mainly by Bill Badgett et al. over the years of CDF running/monitoring, and proved extremely useful to trigger/DAQ/sub-detector experts at local and remote locations. In February 2006 we proposed bringing the WBM tools to CMS, and shortly after began development and implementation. WBM is a general tool, and CMS-specific applications are developing rapidly. In addition to the WBM software, we also installed powerful server machines (cmsmon and cmsmon_dev) in order to distribute information outside of the experimental hall (P5) reliably to all CMS colleagues.
Web-Based Monitoring
–Wealth of information in the database: trigger rates, event rates, cross sections, beam conditions, temperatures, voltages, environmental conditions, etc.
–The database is the preferred locale for configuration and monitoring data persistency: Oracle 10 located at the CMS site, replicated to the offline world; holds current and historical status data; latency ~<1 second to ~1 minute.
–Behind a firewall for security reasons: need a portal to gain access, provide display of contents, and provide access control.
–Typical monitoring data is “Value vs. Time”: needs tools to access, plot, download, and correlate.
–Complex, heterogeneous database: many schemas, many designers; already 140 schemas just in the online database, and not nearly done. A central description is needed to correlate across subsystems: global meta-data descriptive tables.
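The “Value vs. Time” pattern can be sketched in a few lines. This is a minimal illustration only, using sqlite3 as a stand-in for the Oracle online database; the table name, columns, and values are hypothetical, not the real CMS schema.

```python
# Sketch of the "Value vs. Time" monitoring pattern behind WBM, with
# sqlite3 standing in for the Oracle online database. The hv_monitor
# table and its columns are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE hv_monitor (
    channel TEXT, t_utc INTEGER, value REAL)""")
rows = [("HB01", 1191582000, 7.2),
        ("HB01", 1191582060, 7.4),
        ("HB01", 1191582120, 7.1)]
conn.executemany("INSERT INTO hv_monitor VALUES (?, ?, ?)", rows)

# A portal page would run a parameterized query like this and render
# the (time, value) series as a plot, with access control in front.
series = conn.execute(
    "SELECT t_utc, value FROM hv_monitor "
    "WHERE channel = ? ORDER BY t_utc", ("HB01",)).fetchall()
print(len(series))
print(series[0][1])
```

The portal layer then only adds presentation and access control on top of such queries; the persistency itself stays in the database.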
RunSummary Pages
–Clickable measurements, drill-down capability
–Plot creation; provides ROOT TTree and histogram objects in a file
–Resizeable on resubmit
Environmental / Slow Control (Zongru Wan)
Access to current “right-now” conditions... and historical settings and trends.
Trends-over-Time Plots
–Search for performance anomalies: interactive, historical, downloadable
–Selection of the type of data; zoom in on problems
(Example: EM Calorimeter (ECAL) test beam)
DQM / RootBrowser
–Silicon Tracker Integration Facility cosmic ray run
–Hadron Calorimeter (HCAL) global integration / cosmic ray run
–New DQM GUI with user markup (Lassi Tuura): dynamic JavaScript displays with a Tomcat/Java backend
More Run Monitor Tools (Francisco Yumiceva)
–CMS “Page 1”: top-level status display, simple enough for even the most naïve user (Oracle Portal)
–CMS Fermilab Data File Process Summary Page: files copied to the FNAL Tier-1 site and status of processing (JSP) (S. Murray)
Screen Snapshot Service, S3
Remote operations need remote knowledge: operations screens (e.g. RunControl, HV control, EventDisplay) are valuable for remote users to know what is going on, but normally there are tight restrictions on access to those nodes.
What is the Screen Snapshot Service?
–A way to provide periodic, read-only copies of display images (snapshots) for remote viewing
–Similar to products like VNC, pcAnywhere, and VGA2WEB, but without the cost or the danger of accidental remote control
–Can be used to make private-network displays viewable on the public internet (useful for remote monitoring)
–Uses commonly available technologies for portability and ease of use: Java, JSP, Tomcat
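The read-only snapshot idea can be sketched as a tiny in-process model. The real service uses Java/JSP/Tomcat with periodic HTTP POSTs; here a Python dict stands in for its cache, and all names and payloads are illustrative.

```python
# Toy model of the Screen Snapshot Service: producers push periodic,
# read-only snapshots; clients can only fetch the latest image, never
# send input back (unlike VNC-style remote control). In the real
# service the transport is an HTTP POST to a Tomcat/JSP web service;
# here a dict stands in for its cache.
import time

class SnapshotService:
    def __init__(self):
        self._cache = {}  # source name -> (timestamp, image bytes)

    def post(self, source, image_bytes):
        """Producer side: periodic POST with an image payload;
        the newest snapshot replaces the previous one."""
        self._cache[source] = (time.time(), bytes(image_bytes))

    def latest(self, source):
        """Client side: read-only copy of the most recent snapshot."""
        ts, img = self._cache[source]
        return ts, img

svc = SnapshotService()
svc.post("RunControl", b"\x89PNG...fake...")   # fake payload for the sketch
svc.post("RunControl", b"\x89PNG...newer...")  # newer snapshot wins
_, img = svc.latest("RunControl")
print(img == b"\x89PNG...newer...")
```

Because clients only ever receive image bytes, there is nothing they can do to disturb the display node, which is the safety property the slide emphasizes.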
Screen Snapshot Service Example
IGUANA EventDisplay and CMS RunControl (see Alex Oh, CHEP’07 #279). Actual snapshots from the 2007.08.30 CMS global integration run.
More Things... Odds and Ends
–Playback storage manager service, enabling online-environment DQM tests even when the DAQ is not running (Kurt Biery)
–Work on the DQM GUI with Lassi, Nov. 7 – (visiting FNAL); DQM core programming and base monitor work starting
–Summary of “Online DQM Baseline Architecture” linked from http://indico.cern.ch/conferenceDisplay.py?confid=19909
–For all DQM-related issues we are working very closely with Andreas Meyer et al.
–IN 2007/025, “putting trigger scaler and luminosity information into the event stream”: hardware and software are being prepared, and we plan a preliminary test in GREN (G3, TTC, test/programming: Frank Chlebana; TrigXMonitor: Richard Ruiz, Lorenzo Agostino)
–We have also started considering a mechanism for interrupt-driven fast alarms at the ROC (for example, a beam dump), from what is available at P5, without distracting P5/accelerator operations.
–(From the computing side) an MTCC reprocessing to make files readable with CMSSW > 1_5_0, and with the new reconstruction, is in the works.
ROC Mini-History and Plan
(Timeline figure, 2004 – 2008: construction of the WH11NW ROC room; Alan Stone arrives as a full-time ROC person; HCAL test beam; MTCC I and II; tracker integration test; move to WH1 (LHC@FNAL); WBM effort and DQM work continue; commissioning & DQM for physics runs. “We are here”: late 2007.)
Plans and Main Goals for Future Runs
Summary
–CMS is now in a new phase: 2/3 of the detector is in the collision hall, and the rest of the heavy parts are planned to be lowered within 2007 (pixels in March 2008).
–Commissioning “global runs” continue while work on each sub-detector continues. Many more sub-detectors are now included in the global runs, and we can learn a lot from the data. Please start looking at REAL DATA!
–A coherent way of looking at the huge amount of information (data, monitoring information, ...) is crucial for CMS operation as well as for CMS remote operation. CMS is being commissioned, and so is Remote Operation, at the same time.
–The ROC at FNAL (a beautiful facility, LHC@FNAL) is here for all USCMS colleagues. Please use it to commission your detector, and get involved in shift-taking. Added bonus: shift service credit.
---- Backup Slides ----
LHC@FNAL ROC Features
–4 CERN-style consoles (8 workstations) shared by CMS & LHC scientists
–4 projectors to share content within the ROC or with remote participants
–Videoconferencing installed for two consoles; webcams for remote viewing of the ROC
–Secure keycard access to the ROC from the Atrium and the 1East meeting room
–Secure network for console PCs: dedicated subnet, dedicated router with Access Control Lists to restrict access
–12-minute video essay displayed on the large “Public Display”, used by docents from the Education Department to explain CMS and the LHC to tour groups
–High-definition videoconferencing system for the conference room; HD viewing of the ROC, and HD display capabilities in the ROC
–Secure group login capability for consoles, with persistent console sessions (allows multiple users to share common console settings)
–Telephone lines sharing a common number; international services enabled
–Access to LHC Physics Center computing resources
Screen Snapshot Service Mechanism (Kurt Biery)
(Diagram: snapshot producers 1..n, Java Web Start apps on the private network, make periodic HTTP POSTs with image payloads to a web service with a disk cache; web clients 1..m on the public network, across the firewall, retrieve the images with normal HTTP requests from a plain web page with no special configuration; a control widget is included. The service provides real-time images of monitor displays to remote sites.)
UXC: General Logistics Arrangement Today
–Completing phase 1 of the +end programme on YB-1 and -2; phase 2 starts later this week
–ECAL barrel installed
–TK PP1, pipework, and LV cabling off the critical path; HB/EB cabling starting
–Plug installed; HF raising test
–CASTOR/TOTEM tests
–Forward pipe and VAX (pump) installed