Vault Reconfiguration — IT DMM, January 23rd 2002 — Tony Cass


Slide 2: Reminder

- Estimated space and power requirements for LHC computing:
  - 2,500 m² — an increase of ~1,000 m²
  - 2 MW — a nominal increase of 800 kW (1.2 MW above the current load)
- Conversion of the tape vault to Machine Room area was agreed at the post-C5 meeting in June:
  - Best option for providing the space
  - Initial cost estimate of 1,300-1,400 kCHF
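As a rough cross-check of these figures, the sketch below (an illustrative Python calculation, not from the original slides) derives the current floor area, nominal capacity and actual load that the quoted increases imply.

```python
# Back-of-envelope check of the space and power deltas quoted on the slide.
# The "current" figures are inferred from the stated increases, not given
# explicitly on the slide.

target_space_m2 = 2500          # required for LHC computing
space_increase_m2 = 1000        # "increase of ~1,000 m2"
current_space_m2 = target_space_m2 - space_increase_m2         # ~1,500 m2 today

target_power_kw = 2000          # 2 MW target
nominal_increase_kw = 800       # "nominal increase of 800 kW"
increase_above_load_kw = 1200   # "1.2 MW above current load"

current_nominal_kw = target_power_kw - nominal_increase_kw     # ~1.2 MW installed
current_load_kw = target_power_kw - increase_above_load_kw     # ~0.8 MW drawn today

print(f"Existing machine-room area : ~{current_space_m2} m2")
print(f"Existing nominal capacity  : ~{current_nominal_kw} kW")
print(f"Existing actual load       : ~{current_load_kw} kW")
```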

Slide 3: Why Today's Talk?

- Serious money will be committed by Friday.
  - This is the last chance for Divisional Management to comment on the conversion.
- Some choices need to be validated:
  - No remote control of electrical power distribution
  - No pre-heating of the fresh air intake in case of fire
  - Future arrangement of the storage space in the former MG room

CERN.ch 4  Convert the tape vault to Machine Room area of ~1,200m 2 –False floor, finished height of 70cm –6  “In room” air conditioning units. »Total cooling capacity: 500kW –5  130kW electrical cabinets »Double power input »5 or 6 20kW normabarres/PDU u 3-4 racks of 44PCs/normabarre –2  130kW cabinets supplying “critical equipment area” »Critical equipment can be connected to each PDU »Two zones, one for network equipment, one for other critical services. –Plus smoke detection, but no fire extinction The Project

Slide 5: The Cost

Vault conversion cost breakdown (kCHF):

  Civil Engineering           446
  Electrical Installations    410
  Cooling and Ventilation     500
  Network infrastructure      120
  Fire Detection              135
  Total                    ~1,600

- The cost increase since June 2001 is almost entirely due to the addition of fire detection and the replacement of lighting, with some small increases in the cost of the false floor and similar items (access ramps/stairs, reconditioning the floor, …).
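For reference, the itemised costs sum to 1,611 kCHF, consistent with the quoted ~1,600 kCHF total; a trivial check:

```python
# Sanity check that the itemised costs add up to the quoted total.
costs_kchf = {
    "Civil Engineering": 446,
    "Electrical Installations": 410,
    "Cooling and Ventilation": 500,
    "Network infrastructure": 120,
    "Fire Detection": 135,
}
total = sum(costs_kchf.values())
print(f"Itemised total: {total} kCHF (quoted as ~1,600 kCHF)")   # 1611 kCHF
```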

Slide 6: The Schedule

- December 2001: start dismantling
  - Old false floor, electrical & data network and much of the ducting already gone
- February/March: major civil engineering, first the vault, then the MG room
- April: install air conditioning & smoke extraction
- May/June:
  - Install the new electrical distribution
  - Install the false floor
  - Install network cables and equipment
- End July: vault available to house new equipment

Slide 7: Any Alternatives?

- Rent space from, e.g., PSI, Digiplex or Telehouse:
  - Price before the dot-com crash: CHF 500/m²/month
  - Price now: CHF 150/m²/month for >500 m²
    - CHF 1.8M for 1,000 m² for one year
    - This even after I told them of our conversion costs
    - Includes some ongoing costs (e.g. HVAC maintenance), but not electricity consumption
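The annual rental figure follows directly from the quoted rate; the comparison with the one-off conversion cost of roughly 1.6 MCHF from the previous slide is a sketch, not a slide figure.

```python
# Annual cost of renting equivalent hosting space at the quoted post-crash rate.
rate_chf_per_m2_month = 150
area_m2 = 1000
months = 12

annual_cost_chf = rate_chf_per_m2_month * area_m2 * months
print(f"Rental cost: CHF {annual_cost_chf:,} per year")   # CHF 1,800,000
# versus a one-off vault conversion cost of roughly CHF 1.6M (slide 5)
```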

Slide 8: Items to approve

- Layout of storage zones in the old MG room:
  - Some for existing users
  - Some for the "PC Shop", to replace the barn and the archive store
  - The choice will never be perfect, but the layout seems reasonable.
- Remote supervision of the electrical supply:
  - Operators will be able to view the state of the electrical supply equipment at the normabarre level, but will not be able to switch power on or off remotely (manual control will be possible, unlike today).
    - ST recommend against remote switching, as the additional equipment introduces an additional risk of failure.
    - The same choice will be made for the main Machine Room in the future refurbishment.
- No pre-heating of replacement air in case of fire:
  - The input air could be as cold as -12°C if there is a fire in winter. Service managers wanted pre-heating, but
    - this is very difficult to achieve, and
    - do we really want two 120 kW heating coils under the false floor?

Slide 9: Using the extra space

- Air-conditioning capacity, not space, limits use of the vault:
  - Maximum heat load of 500 W/m²
    - The main Machine Room can support a heat load of 1.5 kW/m²
  - PCs generate 1,500 W/m² even with today's low-density racking.
  - Use the vault for equipment with low power density:
    - Tape robots! The pillars are a problem, but with some imagination a cluster of 8 robots can be fitted into the vault.
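The "cooling, not space" limit can be made concrete with a short calculation; the usable-fraction figure below is an inference from the quoted densities, not a number from the slides.

```python
# Heat-density arithmetic behind the "cooling, not space" limit.
# Area and cooling capacity are taken from the project slide (slide 4).

vault_area_m2 = 1200
vault_cooling_kw = 500
average_w_per_m2 = vault_cooling_kw * 1000 / vault_area_m2
print(f"500 kW over 1,200 m2 -> ~{average_w_per_m2:.0f} W/m2 on average")

vault_limit_w_per_m2 = 500       # quoted maximum heat load in the vault
pc_density_w_per_m2 = 1500       # PCs with today's low-density racking
usable_fraction = vault_limit_w_per_m2 / pc_density_w_per_m2
print(f"PCs would saturate the vault cooling with only ~{usable_fraction:.0%} "
      f"of the floor occupied")
```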

Slide 10: The Next Steps

- Create a new substation for B513:
  - To power 2 MW of computing equipment plus air-conditioning and ancillary loads
  - Included in the site-wide 18 kV loop — more redundancy
  - Favoured location: underground, but with the 5 transformers on top
- Refurbish the main Computer Room once enough equipment has moved to the vault.