

Infrastructures and Installation of the Compact Muon Solenoid Data AcQuisition at CERN on behalf of the CMS DAQ group TWEPP 2007 Prague, Czech Republic

TWEPP 2007, Prague. Attila RACZ / PH-CMD
Outline: Introduction, Underground area, Surface area, What next?

DAQ locations
DAQ elements are installed at the experimental site both in the underground counting rooms (USC55) and in the surface buildings (SCX5). The elements installed underground collect event fragments from about 650 detector data sources and transmit these fragments to the surface elements. They also elaborate a smart back-pressure signal, the Trigger Throttling System (TTS), that prevents the first-level trigger logic from overflowing the front-end electronics. The elements installed on the surface perform full event building (a partial event building already takes place underground) and run the High Level Trigger algorithms. Events that pass these filters are stored locally and transmitted later to the main CERN computing centre.
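The fragment-collection and back-pressure idea above can be sketched in a few lines. This is a toy model, not the real CMS DAQ software: the 650 sources come from the slide, while `BUFFER_LIMIT` and the class interface are hypothetical illustrations of how a TTS-style throttle flag could be derived from buffer occupancy.

```python
from collections import defaultdict

N_SOURCES = 650    # detector data sources (figure from the slide)
BUFFER_LIMIT = 4   # hypothetical: incomplete events buffered before throttling

class EventBuilder:
    """Toy fragment collector with a TTS-style back-pressure flag."""

    def __init__(self):
        self.fragments = defaultdict(dict)  # event_id -> {source_id: payload}
        self.throttle = False

    def add_fragment(self, event_id, source_id, payload):
        self.fragments[event_id][source_id] = payload
        # Assert back pressure when too many incomplete events pile up
        self.throttle = len(self.fragments) > BUFFER_LIMIT
        if len(self.fragments[event_id]) == N_SOURCES:
            return self.fragments.pop(event_id)  # full event built
        return None
```

A real system would of course do this in hardware (the FMM cards) and over a network fabric; the sketch only shows the bookkeeping.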

CMS experimental site: underground counting rooms, surface computer room


Underground DAQ elements
–500 FRL cards, each receiving data from one or two detector data sources
–650 sender cards plugged onto the detector readout cards
–650 S-link cables (total length 6 km) linking the senders to the FRLs
–56 FMM cards, which collect the status of every data source and produce the back pressure
–750 RJ-45 cables for TTS signals (total length 11 km)
–500 Myrinet Network Interface Cards (NICs) plugged onto the FRLs
–6 Myrinet switches of 256 ports each
–1000 optical patch cords connecting the FRL NICs to the switches (total length 38 km)
–50 Compact PCI crates, sub-divided into 60 logical crates, to house the FRLs (some crates contain dual backplanes)
–60 crate controller PCs with their control cables (1.6 km)
–30 opticables running between USC and SCX (216 fibers each, 200 m long)
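As a quick sanity check on the inventory above, the quoted counts and total lengths imply the following average runs per cable (all numbers from the slide; the table layout is mine):

```python
# (name, count, total length in km) -- figures from the slide
cables = [
    ("S-link",             650, 6.0),
    ("RJ-45 TTS",          750, 11.0),
    ("optical patch cord", 1000, 38.0),
    ("crate control",      60, 1.6),
]

# Average run per cable, in metres
avg_m = {name: km * 1000 / n for name, n, km in cables}

# Total routed copper/fibre length in the counting rooms
total_km = sum(km for _, _, km in cables)

for name, n, km in cables:
    print(f"{name}: {n} cables, avg {avg_m[name]:.1f} m each")
print(f"total routed length: {total_km:.1f} km")
```

The averages (roughly 9 m per S-link, 15 m per TTS cable, 38 m per patch cord) are consistent with intra-counting-room distances, while the 200 m opticables bridge the pit to the surface.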

DAQ rack types
[Figure: front view of a rack for electronics crates (air guide 2U, fan tray 1U, chassis 6U, heat exchanger 1U, tangential fans 4U, filling plates) and side view of a rack for computers (fans, air/water heat exchanger, water inlet/outlet, power distribution box). Up to 44U of usable space, 10 kW heat dissipation.]

Underground counting rooms
The DAQ elements, the detector readout electronics and other services (e.g. High Voltage systems, Detector Safety systems) are located in a dual-floor room, S1 and S2, with a capacity of ~100 and ~160 racks of 60x90 cm2 respectively. The rack assignment between sub-systems has been done so as to satisfy numerous constraints, among them:
–The number of racks needed for each function
–Keeping the latency minimal for trigger detectors and trigger logic
–The maximum cable length for inter-rack cabling
The final rack assignment was a very long iterative process (several years), and some small changes/additions are still being requested! Once the rack assignment was "stable", the DAQ cable trays for inter-rack communication were designed and installed:
–Trays located very close to the false-floor tiles to keep cable lengths minimal
–A list of every single cable/fiber, labeling…
–Keeping track of tray occupancy, always with a huge contingency…
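The constraints listed above (cable length limits, latency, proximity to peers) are the kind of thing one can score mechanically. A minimal sketch, entirely my own construction and not the procedure actually used by CMS: pick the free rack slot that minimises total cable run to a sub-system's already-placed peer racks, rejecting slots that would violate the maximum cable length.

```python
def manhattan(a, b):
    """Grid distance between two rack slots (row, column)."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def best_slot(free_slots, peer_slots, max_run_m, pitch_m=0.9):
    """Return the free slot minimising total cable run to peers.

    pitch_m: assumed centre-to-centre rack spacing (0.9 m, the rack depth
    from the slide, used here as an illustrative grid pitch).
    """
    best, best_cost = None, float("inf")
    for slot in free_slots:
        runs = [manhattan(slot, p) * pitch_m for p in peer_slots]
        if runs and max(runs) > max_run_m:
            continue  # violates the maximum inter-rack cable length
        cost = sum(runs)
        if cost < best_cost:
            best, best_cost = slot, cost
    return best
```

In practice the assignment was iterated by hand over years, since many of the real constraints (access, services, safety) do not reduce to a single cost function.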

DAQ racks in USC55, lower floor

DAQ racks in USC55, upper floor

USC DAQ installation schedule/status
–Rack welding in the counting rooms: Q3-Q
–CPCI crate controller PC installation: December '05
–Cable tray installation and rack equipment: January '06
–Rack manifold repairs: up to July '06
–Copper cabling and electronics installation: Q3-Q
–Optical patch cords: 7 "big" days between November '06 and January '07, with the help of many DAQ group members from the software side!
–Optical cable installation: March-April '07 (external company)
–Testing of every single element/cable/fiber: Q
Broken: 2 FMM cables, one optical patch cord, 3 FRLs and 2 CPCI backplanes. All these items have been replaced or repaired.
Since April '07, all underground DAQ hardware has been in use for sub-detector commissioning.

USC area, S1

USC PC area, S2

CMS DAQ installation crew

Surface DAQ elements
All DAQ surface elements are installed in the SCX5 DAQ building:
–640 Readout Unit/Builder Unit PCs (2U servers)
–160 2U servers for services: data storage, DQM, databases, Run Control…
–6 Myrinet switches of 256 ports each (the same as underground)
–Storage systems
–6 Gigabit Ethernet switches (256 ports each)
–~1200 Filter Unit PCs for a 50 kHz trigger rate (June 2008)
–Another ~1200 PCs for a 100 kHz trigger rate (sometime in 2009)
The room has a total capacity of 170 racks (see layout) and 800 kW of cooling. The remaining racks will be used for additional filter units as the LHC ramps up in luminosity, creating more data to analyse.
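The filter-farm sizing above follows simple back-of-envelope arithmetic. Assuming, as I do here, that HLT capacity scales linearly with the number of PCs (the slide does not state the scaling model), ~1200 PCs handling 50 kHz implies roughly 42 Hz of events per PC, and doubling the trigger rate to 100 kHz indeed calls for ~1200 more machines:

```python
PCS_AT_50KHZ = 1200     # Filter Unit PCs for 50 kHz (from the slide)
BASE_RATE_HZ = 50_000

def pcs_for(trigger_hz):
    # Linear-scaling assumption (mine): HLT throughput proportional to
    # PC count. Real sizing depends on HLT processing time per event.
    # Integer ceiling division avoids floating-point rounding surprises.
    return -(-trigger_hz * PCS_AT_50KHZ // BASE_RATE_HZ)

rate_per_pc_hz = BASE_RATE_HZ / PCS_AT_50KHZ   # ~42 Hz of events per PC
extra_for_100khz = pcs_for(100_000) - PCS_AT_50KHZ
```

With these assumptions `extra_for_100khz` comes out at 1200, matching the "again ~1200 PCs" on the slide.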

DAQ building (SCX)
[Figure: building cross-section. Computer rooms of 352 m2 (~6 m height) and 165 m2 (~3 m height) with a false floor; data fibers and commodity networks arrive from the pit. Conference rooms/labs and the Main Control Room. Overall dimensions ~14 x 30 m2.]

Rack layout in SCX5
[Figure: floor plan showing RUBU racks; servers/storage/switches; Filter Units for 50 kHz (June 2008); IT racks; Filter Units for 100 kHz (2009…); contingency…]

Why water-cooled racks?
Usually, plenum floors with forced cold air are used to cool down data centers; people do not like water close to PCs… But with 10 kW per rack and 2 kW/m2, air cooling is no longer efficient and would require a real storm: hot spots are created whatever you do! Water-cooled racks catch the heat at the very source, avoiding hot spots and making better use of the floor space.
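The numbers on the slide make the argument concrete. Taking the 60x90 cm rack footprint from the counting-room slide and the 10 kW per rack and 2 kW/m2 air-cooling figure quoted here (the combination of the two is my arithmetic, not the slide's):

```python
AIR_LIMIT_KW_M2 = 2.0      # practical air-cooling density (from the slide)
rack_area_m2 = 0.6 * 0.9   # 60 x 90 cm rack footprint
rack_kw = 10.0             # per-rack heat dissipation

# Local power density at a fully loaded rack
local_density = rack_kw / rack_area_m2       # ~18.5 kW/m2

# Floor area each rack would need if the heat were spread out and
# removed by air at the 2 kW/m2 limit
spread_m2 = rack_kw / AIR_LIMIT_KW_M2        # 5 m2 per 0.54 m2 rack

assert local_density > AIR_LIMIT_KW_M2       # air alone cannot keep up
```

A rack dissipates roughly nine times what air cooling can remove from its own footprint, which is why the heat has to be captured in the rack itself by the air/water heat exchangers.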

Heavy computer science…

Force10 switches and RUBU rack

Myrinet switch and storage

What next?
–Commissioning in USC will continue with real detectors (November 2007); readout from the surface building very soon
–Central Control Room installation (December 2007)
–1200 Filter Unit PCs to purchase and install for June 2008: 50 kHz trigger rate capacity, ready for the first LHC collisions
–~1200 more Filter Unit PCs to purchase and install for 2009…, depending on the LHC program