R. Divià, U. Fuchs, P. Vande Vyvre – CERN/PH 13 June 2012.

Infrastructure

Racks:
- Present racks are problematic for recent machines (non-standard mechanical tolerances and rail positions)
- Possibly exchange some racks, or all of them
- New CIAT racks come with more powerful cooling doors (needed for GDCs and TDSs)
KVM:
- Replace the KVM solution (close to end-of-life)
Power:
- No change inside the racks (new PDUs should already be in place)
- Exchange the Hazemeyer distribution rack? UPS?
Fibres:
- 6 new DCAL SMs plus 2-1/3 SMs: 18 fibres for DAQ, 18 fibres for HLT
Tasks (DAQ + TC):
- Rack installation
- Power
Impact:
- No DAQ during interventions on the infrastructure

DAQ Network

Network:
- Major change: replace the central router (heavyweight object, needs special transport)
- Change the cabling structure (lots of cabling to be done)
Router:
- Use of the IT network frame contract
- Central network F10 router to be replaced by a Brocade
- Testing of the Brocade has started
Tasks (DAQ team + Transport):
- Order the Brocade router
- Install the new router
- Remove the CAT5 cabling
- Install fibres from the router to the racks (10 Gbps uplinks)
- Install top-of-rack switches
Impact:
- No DAQ during interventions on the network
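
To make the change of cabling structure concrete, here is a minimal Python sketch contrasting the old per-host CAT5 runs with the planned per-rack 10 Gbps fibre uplinks to the central router; the rack and host counts are assumptions chosen for illustration, not figures from the slides.

# Illustrative sketch only: rack and host counts below are assumptions.
RACKS = 20              # assumed number of DAQ racks
HOSTS_PER_RACK = 16     # assumed number of machines per rack

# Old scheme: every host has its own CAT5 run to the central router.
cat5_runs = RACKS * HOSTS_PER_RACK

# New scheme: copper patches stay inside the rack; only one 10 Gbps
# fibre uplink per top-of-rack switch goes back to the central router.
fibre_uplinks = RACKS

print(f"CAT5 runs to remove  : {cat5_runs}")
print(f"Fibre uplinks to pull: {fibre_uplinks}")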

Computing equipment

Computers:
- Replacement of a large fraction of the readout computers (LDCs)
- Other old machines to be replaced (ACR PCs and screens)
- Testing of the replacement models has started
- Remove 'old' machines; out-of-warranty machines can be used for labs
Storage:
- Replace old FC switches
- Replace old disk arrays with a new, more compact model
Tasks (DAQ team):
- Finish testing
- Market survey and price enquiry (possibly joining forces with IT)
- Order equipment
- Install equipment
Impact:
- Reduced DAQ during interventions on computing equipment
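
As a rough illustration of how the out-of-warranty machines could be flagged for removal from production and reuse in the labs, the following sketch filters a hypothetical inventory by purchase date; the hostnames, dates and three-year warranty period are all assumptions, not data from the slides.

from datetime import date

WARRANTY_YEARS = 3            # assumed warranty period
TODAY = date(2012, 6, 13)     # date of this presentation

# Hypothetical inventory: hostname -> purchase date
inventory = {
    "ldc01": date(2008, 3, 1),
    "ldc02": date(2011, 9, 15),
    "acr-pc1": date(2007, 11, 20),
}

def out_of_warranty(purchased, today=TODAY, years=WARRANTY_YEARS):
    """True if the machine was bought more than `years` ago."""
    return (today - purchased).days > years * 365

# Machines that can be retired from production and reused for labs.
for host, bought in sorted(inventory.items()):
    if out_of_warranty(bought):
        print(f"{host}: out of warranty, candidate for lab reuse")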

Services needed during LS1

Logbook:
- Must always be available
- Move it to the Meyrin site whenever needed
ACR:
- What has to be available and when (e.g. DCS)
DAQ system to support detector activities:
- Long CR1 downtimes are unavoidable
- May try to ensure minimal services (e.g. central servers only); this needs to be prepared in advance
- Will not always be possible
- Need input on when and what will be required
Re-commissioning:
- When?
- Testing and validation exercises involving multiple projects (CTP, DAQ, ECS, HLT, DCS, Offline...)
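
Since the logbook must stay reachable throughout LS1 even when CR1 is down, a trivial availability probe could be run from the Meyrin side to flag problems early; the sketch below is only an illustration of the idea, and the logbook URL in it is hypothetical.

import urllib.request

# Hypothetical URL: the real logbook address is not given in the slides.
LOGBOOK_URL = "http://alice-logbook.example.cern.ch/"

def logbook_reachable(url=LOGBOOK_URL, timeout=5):
    """Return True if the logbook web front-end answers within `timeout` seconds."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return response.status == 200
    except OSError:
        return False

if __name__ == "__main__":
    print("logbook up" if logbook_reachable() else "logbook DOWN")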

AOB

Protections needed in the CRs and ACR during other interventions:
- Power glitches/losses
- Cooling interruptions
- Mechanical interventions (dusting, moving, pulling...)
ACR:
- Re-allocation of resources (RC/TC driven)
- New equipment needed (apart from the PCs): projector, printers, screens (fewer and bigger?), anti-static protections
- Large screens (still OK, but for how much longer will they work?)