Status and plans for online installation. LHCb Installation Review, April 12th, 2005. Niko Neufeld for the LHCb Online team.

Niko Neufeld CERN, PH
2 Components of the Online System
–Experiment Control System (ECS): run-control and detector control
–Timing and Fast Control (TFC): timing and trigger distribution
–Data Acquisition (DAQ): data movement, data storage and data processing
–General infrastructure: control room, general-purpose (wireless) networking, etc.

3 ECS
Physical components of the ECS include:
–(A few) fibers (DSS, Ethernet connection SX85–UX85)
–Ethernet network (in UX85 and SX85)
–Control PCs
–CAN and SPECS cables for detector slow control (in the counting houses and towards the detector); these are covered in “Cabling of the Experiment” by V. Bobillier

4 DAQ
Physical components of the DAQ:
–Ethernet network
  Cables from and in D1, D2, D3 (detector electronics)
  Switches in D2
–Event Filter Farm
  Racks in D1
  PCs and switches in D1
–Storage system in SX85
–Barrack layout: EDMS , ,

5 TFC
Physical components of the TFC:
–Electronics modules (number [[FIXME]], 4 types), 2 racks
–Fibers from UX to SX via PZ
–Fibers from and in the D3 barrack (connecting TFC modules and detector electronics)
EDMS xxxyyy vz

6 General (online) infrastructure
–Wireless and wired networking in SX85 and 2889: ordered from IT/CS; expected to be ready by end of July 2005
–Workstations in the Control Room: will be installed gradually, as needed, from September 2005 onwards
–Central (disk) servers for ECS and DAQ tests: basic infrastructure will be installed by November 2005

7 Fibers (SX to UX)
Fibers from SX85 to UX85 (D2 barrack) in the PZ shaft:
–24 pairs of multi-mode fiber
  DSS (profibus) connectivity (2 pairs)
  DSS Ethernet (1 pair)
  ECS Ethernet (1–2 pairs)
–12 pairs of mono-mode fiber
  DAQ (1 pair)
  TTC machine timing signals (2 pairs)
  DSS PLC synchronization (2 pairs)
Installation finished by TS/EL/OF
Easy to add new fibers if needed
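As a sanity check, the pair allocations listed above can be tallied against the installed counts. This is a sketch in Python; the ECS Ethernet figure is quoted as 1–2 pairs and is taken here at its upper bound:

```python
# Spare-fiber check for the SX85 -> UX85 links (counts taken from the slide).

multimode_installed = 24          # pairs of multi-mode fiber in the PZ shaft
multimode_allocated = {
    "DSS profibus": 2,
    "DSS Ethernet": 1,
    "ECS Ethernet": 2,            # upper bound of the quoted 1-2 pairs
}

monomode_installed = 12           # pairs of mono-mode fiber
monomode_allocated = {
    "DAQ": 1,
    "TTC machine timing": 2,
    "DSS PLC synchronization": 2,
}

mm_spare = multimode_installed - sum(multimode_allocated.values())
sm_spare = monomode_installed - sum(monomode_allocated.values())
print(mm_spare, sm_spare)  # 19 multi-mode and 7 mono-mode pairs remain free
```

The large spare margin is consistent with the remark that adding new fibers later is easy.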

8 TFC fibers in UX85
–“L0” front-end fibers: long distance (up to ~60 m) from D3 to the detector front-end; 35 fibers, installed together with the bulk of cables/fibers (cf. V. Bobillier)
–“L1” fibers: short distance (all within the D3 barrack; max 15 m) to the detector crates. Will be installed by end of 2005 (TS/EL/OF for the patch-panel-to-patch-panel connections, LHCb for the patch cords)

9 Ethernet cabling UX85
Logical view of the D2 barrack (diagram)
–1544 cables from D3 (detector) to D2
–500 cables between D1 (farm) and D2
–406 cables internal to D2 (central DAQ and ECS)

10 Ethernet cabling UX85
Cables for the DAQ and ECS Ethernet networks
–Short distances (farm in UX85/D1) allow using cheap copper infrastructure
An important part of the Online installation:
–Total of ~2400 CAT6 UTP cables (4800 patch-panel connections)
–50 km of cable in total
–Maximum cable length 36 m; LHCb is ready for 10 Gigabit Ethernet over copper
–EDMS v4
Work in progress: 65% finished, should be ready by 01/05/05
–Work done by AMEC-SPIE, organised by IT/CS (M. Da Costa & E. Sallaz)
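The figures on the two cabling slides can be cross-checked with a few lines of Python. The per-link counts are from the previous slide; the average length simply divides the quoted 50 km by the cable count:

```python
# Sanity check on the UX85 Ethernet cabling figures.

links = {
    "D3 -> D2 (detector readout)": 1544,
    "D1 <-> D2 (farm)": 500,
    "internal D2 (central DAQ/ECS)": 406,
}

total_cables = sum(links.values())           # 2450, i.e. the quoted "~2400"
patch_panel_connections = 2 * total_cables   # each cable lands on two panels
avg_length_m = 50_000 / total_cables         # 50 km of cable in total

print(total_cables, patch_panel_connections, round(avg_length_m, 1))
# -> 2450 4900 20.4
```

The exact sum (2450 cables, 4900 connections) matches the rounded totals quoted above, and the ~20 m average sits comfortably below the 36 m maximum.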

11 Cabling between D2 and D1 (diagram)

12 Rack installation for the Event Filter Farm in the D1 barrack
Old 59U DELPHI racks have been refurbished
Each of the 50 racks will be preinstalled with:
–Power bars (6)
–Angles (88, with 2 screws each)
–Spacer bars (4)
–Rack-cooler door (1)
–Ethernet patch-cables (88)
The PCs will be mounted as late as possible (cost!). There is a lot of work per PC: unpacking, installing, connecting and testing
More manpower is needed for these tasks
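The per-rack quantities above translate into a sizeable total bill of materials for the 50 farm racks; a quick sketch:

```python
# Bill-of-materials totals for pre-installing the 50 farm racks in D1,
# using the per-rack quantities listed on the slide.

RACKS = 50
per_rack = {
    "power bars": 6,
    "angles": 88,
    "spacer bars": 4,
    "rack-cooler doors": 1,
    "Ethernet patch-cables": 88,
}

totals = {item: RACKS * n for item, n in per_rack.items()}
screws = totals["angles"] * 2      # each angle is fixed with 2 screws
print(totals["angles"], screws)    # 4400 angles, 8800 screws
```

Numbers like 8800 screws and 4400 patch-cables make concrete why the slide asks for additional manpower.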

13 Preparation of racks

14 Horizontal cooling of PC racks
Joint project of the LHC experiments, pioneered by LHCb
75 units built to our specifications by CIAT (10 kW cooling capacity each)
Will be installed in Point 8 from May 2005
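A rough heat-budget check shows what the 10 kW door capacity implies per rack. Note that the ~250 W per-server draw is an assumption for illustration, not a figure from the slide:

```python
# Heat-budget sketch for one rack-cooler door.
COOLER_CAPACITY_W = 10_000   # per rack-cooler door, per the slide
SERVER_POWER_W = 250         # ASSUMED draw of one 1U farm PC (not in slide)

max_servers_per_rack = COOLER_CAPACITY_W // SERVER_POWER_W
print(max_servers_per_rack)  # 40 servers fit within the cooling budget
```

Under this assumption a fully cooled rack comfortably accommodates the dense PC packing foreseen for the Event Filter Farm.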

15 Online System Integration Test: Real Time Trigger Challenge
A test of a vertical slice of the DAQ and software triggers, scheduled for June 2005
Prototype farm together with readout network, event-builders and trigger algorithms
Installed, cabled and operated in the manner anticipated for operation in Point 8
Complete with ECS for run-control (trigger algorithms) and slow-control (rack-coolers, CPU fans, etc.)
In a second stage, the TFC and Readout Boards can also be tested

16 Planning

17 Conclusions
–Installation of the LHCb Online system is well on track
–Infrastructure for detector commissioning will be ready for September 2005
–All parts of the Online system will be tested together in the Real Time Trigger Challenge
–Additional manpower is needed for the Event Filter Farm installation
Many thanks to the Experimental Area team and the IT/CS and TS/EL groups