CMS Centres for Control, Monitoring, Offline Operations and Analysis


CMS Centres for Control, Monitoring, Offline Operations and Analysis
Lucas Taylor, Northeastern University, Boston
Patricia McBride, Kaori Maeshima, Erik Gottschalk, Fermilab
CHEP'07, Victoria, 2-7 September 2007

Abstract: The CMS experiment is about to embark on its first physics run at the LHC. To maximize the effectiveness of physicists and technical experts at CERN and worldwide, and to facilitate their communications, CMS has established several dedicated and inter-connected operations and monitoring centres. These include a traditional Control Room at the CMS site in France, a CMS Centre for up to 50 people on the CERN main site in Switzerland, and remote operations centres such as the LHC@FNAL centre at Fermilab. We describe how this system of centres coherently supports the following activities: CMS data quality monitoring, prompt sub-detector calibrations, and time-critical data analysis of express-line and calibration streams; and operation of the CMS computing systems for processing, storage, and distribution of real and simulated CMS data, both at CERN and at offsite centres. We describe the physical infrastructure that has been established, the computing and software systems, the operations model, and the communications systems necessary to make such a distributed system coherent and effective.

LHC & CMS startup

LHC timeline (2005-2010):
- Completion of magnet cryostating and tests
- Descent of the last magnet
- Sector 7-8 cooled to 1.9 K along 3 km
- All technical systems commissioned to 7 TeV operation and machine closed
- LHC beam commissioning starts
- First collisions at 14 TeV; integrated luminosity rising from (0.1 - 1) fb-1 to (1 - 10s) fb-1 to (10s - 100) fb-1

CMS timeline:
- Cosmic challenge on the surface
- CSA'07: the final computing challenge
- CMS closed and ready for the extended physics run
- Throughout: CMS commissioning, data quality monitoring, calibration, alignment, offline computing operations, ...

LHC & CMS startup - human implications

Jim Virdee described the challenges of the new LHC machine, the new CMS detector, and new physics. Fabiola Gianotti (Interlaken) identified some of the corresponding human difficulties we will face:
- "O(10**3) physicists in panic-mode using and modifying the software and accessing the database, GRID ..."
- "... at the beginning they will be confronted with most atypical (and stressful) situations, for which a lot of flexibility will be needed"

I'll describe how CMS Centres address this:
- By co-locating sub-detector offline experts (CERN, FNAL, ...)
- By hosting the CMS Computing operations teams
- By giving all 3000 collaborators live access to monitoring information
- By ensuring communications systems are effective

CMS Centers: the CMS Control Room

The traditional CMS Control Room controls CMS detector operations:
- Operates CMS: slow control, data acquisition, data transfer to the Tier-0
- Sub-detectors: operate the detector, calibrations, data-quality monitoring, constants to the HLT
- Online monitoring

[Diagram: the CERN Tier-0 feeds the Tier-1 centres (IN2P3, FNAL, CNAF, PIC, ASGC, GridKA, RAL), each serving several Tier-2 centres.]

CMS Centers: CMS Control Room, CMS Centre, and LHC@FNAL

The novel CMS Centre (at CERN) is linked to the Control Room:
- Mirror displays and communications
- Computing operations: Tier-0 production, data storage/transfer
- Sub-detectors: data quality monitoring (also post-Tier-0), calibration, good/bad runs, software fixes, express analysis
- Offline monitoring and control of CMS Computing operations

[Diagram: the Control Room handles detector operations and online monitoring; the CMS Centre at CERN, LHC@FNAL, and possibly other centres oversee the CERN Tier-0, the Tier-1 centres (IN2P3, FNAL, CNAF, PIC, ASGC, GridKA, RAL), and their Tier-2 centres.]
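The tiered topology described above can be sketched in a few lines. This is an illustrative model only, not CMS software: the Tier-1 site names are those from the slide, while the function and dataset names are hypothetical.

```python
# Illustrative sketch (not CMS software) of the Tier-0 -> Tier-1 fan-out
# described in the slide: the CERN Tier-0 distributes each dataset to
# every Tier-1 centre, which in turn serves its Tier-2 sites.
TIER1_SITES = ["IN2P3", "FNAL", "CNAF", "PIC", "ASGC", "GridKA", "RAL"]

def fan_out(dataset, tier1_sites=TIER1_SITES):
    """Return the transfer requests a Tier-0 would issue:
    one copy of the dataset to each Tier-1 centre."""
    return [(dataset, "Tier-0", site) for site in tier1_sites]

# Hypothetical dataset name, for illustration only.
transfer_requests = fan_out("/ExpressStream/Run001")
```

Each request tuple names the dataset, the source, and the destination, so a single dataset yields seven Tier-1 transfers; the Tier-1 to Tier-2 step would repeat the same pattern per region.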

LHC@FNAL

- Started operations this year (completed 8 Feb 2007; open house 12 Feb 2007)
- Joint centre for CMS & LHC
- Builds on the FNAL Remote Operations Centre used in the CMS Cosmic challenge

LHC@FNAL features:
- 4 CERN-style consoles (8 workstations)
- Videoconferencing for 2 consoles
- Webcams for remote viewing of the room
- Secure access, secure network
- High Definition (HD) systems: videoconferencing, webcams, ...
- Role Based Access for LHC controls
- Screen Snapshot Service (SSS)

LHC@FNAL teams

Tier-1 operations team:
- Responsible for the FNAL Tier-1
- Weekday, business-hour shifts
- Central support for seven Tier-2 centers (universities)

Detector groups, e.g. Si tracker:
- Remote shifts, Feb-Jun 2007
- Worked with the CERN tracker group on data quality monitoring, data transfer/bookkeeping, event display, DCS, ...

Outreach:
- Visits, a 12-minute outreach video, and photos from CERN

CMS Centre at CERN

Mid-way through construction; when complete it will include:
- New office space for ~250 people
- Main Monitoring & Operations Room (300 sq. m) with 22 consoles of ~5 screens each
- CMS Computing Operations Rooms, already in use by ~15 operators
- Meeting rooms: an auditorium plus ~6 smaller rooms, all with Polycom conference phones, some with Tandberg videoconferencing equipment
- Miscellaneous rooms: outreach, WiFi visitors room / training centre, rest area with kitchen, locker room, ...

CMS Centre at CERN: Main Room, Outreach Room, Meeting Rooms
[Photos of the rooms.]

CMS Centre at CERN: console layout
- Lower row of screens for working
- Top row of screens for monitoring (e.g. displays mirrored from the Control Room)

CMS Centre at CERN: location

The Centre will be in the former PS Main Control Room (pictured below); machine controls are now in the Prevessin "CERN Control Centre" (CCC).

CMS Centre at CERN ... today

- Detailed requirements study
- Technical studies completed: ergonomics, acoustics, fire, radiation, lighting, etc.
- Green light for construction

CMS Centre at CERN ... today

Technical installations underway:
- 22 desks, ~110 screens
- Network (2 x 10 Gbps uplinks), lots of wireless
- 40 kW electrical power
- Air conditioning: under-floor iced water to 20 peripheral fan-coil units
- Acoustic insulation: carpet, fibre-glass in the ceiling and consoles
- Lighting: natural, strips, spots; dimmable and zoned
- Outreach displays

CMS Centre at CERN

When completed (March 2008), the CMS Centre will look very similar to the CERN-LHC Control Centre shown below.

CMS Computing Operations

CMS "Data Operations" teams:
- Main team at the CMS Centre, CERN (15 people); second main team at LHC@FNAL
- Ensure CMS workflows function ~24/7

Control and monitoring systems:
- Increasing use of "WebTools" to initiate transfers, data placement and removal, production systems, ...

Communications:
- In person, and by telephone and IM with shared desktops
- Web, some VNC, maybe WebEx/EVO (?)

Screen Snapshot Service (SSS) for exporting arbitrary displays to the Web; used by CDF and CMS, maybe the LHC. Web browsers request snapshots from the snapshot service, which collects them from the monitored applications.
http://cmstacwww.cern.ch:8082/snapshot/ShowimageList.jsp
http://www-cdfonline.fnal.gov/java/snapshot/ShowimageList.jsp

Related CHEP talks:
- No. 81: Computing Operations at CMS Facilities
- No. 290: CMS Tier0 - design, implementation and first experiences
- No. 369: CMS Experiences with Computing Software and Analysis Challenges
- No. 370: Development of the Tier-1 Facility at Fermilab
- No. 266: CMS Offline Web Tools
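The SSS flow above (monitored applications push snapshots, browsers fetch the latest one) can be sketched in a few lines. This is a minimal illustration of the pattern, not the actual SSS implementation or API; all class and method names are hypothetical.

```python
# Hedged sketch of a Screen Snapshot Service-style flow: monitored
# applications periodically push screen grabs to a central service,
# which keeps only the most recent image per application for web
# browsers to fetch. Names are illustrative, not the real SSS API.
import time

class SnapshotService:
    """Keeps the latest snapshot per monitored application."""

    def __init__(self):
        self._latest = {}  # application name -> (timestamp, image bytes)

    def push(self, app, image):
        """Called by a monitored application with a fresh screen grab;
        the newer grab simply replaces the older one."""
        self._latest[app] = (time.time(), image)

    def latest(self, app):
        """Called by the web front end to serve the current snapshot."""
        _ts, image = self._latest[app]
        return image

    def applications(self):
        """Applications with at least one snapshot (akin to the
        ShowimageList pages linked above)."""
        return sorted(self._latest)

svc = SnapshotService()
svc.push("cms-dqm-console", b"...frame1")
svc.push("cms-dqm-console", b"...frame2")  # replaces frame1
```

In a real deployment the `latest()` call would sit behind an HTTP handler returning the image to the browser; keeping only the newest frame is what makes the service cheap enough to mirror arbitrary displays.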

Data Quality Monitoring

- The web-based DQM GUI gives access to information in the Control Room, offline centres, institutes, a Web café, an iPhone, ...
- Live shifter views: canned, with reference histograms
- Expert exploration of a repository of thousands of detailed histograms
- Both real-time and offline data
- DQM systems are being refined with practical experience from monthly commissioning runs

Related CHEP talks:
- No. 221: CMS Online Web Based Monitoring
- No. 279: The Run Control and Monitoring System of the CMS Experiment
- No. 253: CMS Event Display and Data Quality Monitoring for LHC Startup
- No. 266: CMS Offline Web Tools
- No. 434: Data Quality Monitoring and Visualization for the CMS Silicon Strip Tracker
- No. 268: Data Quality Monitoring for the CMS Electromagnetic Calorimeter
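The "canned shifter views with reference histograms" above rest on automated comparisons of live distributions against references. The following sketch shows one such comparison under stated assumptions: the chi-square-like metric and the threshold are illustrative choices, not the actual CMS DQM algorithm.

```python
# Hedged sketch of a reference-histogram check behind a live shifter
# view: compare a run's binned distribution against a canned reference
# and flag large deviations. Metric and threshold are illustrative.
def histogram_compatibility(live, reference):
    """Normalised chi-square-like distance between two binned histograms."""
    if len(live) != len(reference):
        raise ValueError("binning mismatch between live and reference")
    chi2 = 0.0
    for obs, exp in zip(live, reference):
        if exp > 0:
            chi2 += (obs - exp) ** 2 / exp
    return chi2 / len(live)

def shifter_flag(live, reference, threshold=2.0):
    """The 'OK' / 'CHECK' summary a shifter might see on a canned view."""
    return "OK" if histogram_compatibility(live, reference) <= threshold else "CHECK"

reference = [100, 200, 300, 200, 100]   # canned reference histogram
good_run  = [ 98, 205, 295, 198, 104]   # small statistical fluctuations
bad_run   = [100, 200,  30, 200, 100]   # one bin far off: dead region?
```

Here `shifter_flag(good_run, reference)` stays "OK" while the collapsed central bin in `bad_run` trips the "CHECK" flag, which is the kind of at-a-glance summary that lets 3000 collaborators monitor detector health without expert-level histogram inspection.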

Communications

- WWW: WebTools, DQM, SSS, ...
- Displays: local applications, DQM web clients, SSS mirrors (CMS Control Room displays mirrored to the CMS Centre at CERN and to LHC@FNAL)
- Phone: a headset per console; conference phones
- Videoconferencing: adjacent meeting room(s), possibly in the main room
- Webcams: maintain a sense of proximity; use with the phone for point-to-point "videoconferencing"
- iMac (under consideration): computer, screen, camera, microphone, and iChat for IM and few-person phone/video conferencing

Related CHEP talks:
- No. 35: Collaborative Tools and the LHC: An Update
- No. 408: Shaping Collaboration 2006: Action Items for the LHC

Summary

- CMS Centres are currently being established: the "CMS Centre" at CERN, LHC@FNAL, possibly others ...
- These centres will enhance communication and access to information, thereby helping all 3000 CMS collaborators to play their part in commissioning, offline data quality monitoring, calibrations, express analysis, and CMS Computing Operations ...
- ... which will, in turn, increase CMS competitiveness when we start taking LHC data in July 2008 at a new energy frontier.