Physical Infrastructure Issues In A Large Centre — July 8th 2003 — CERN.ch

Physical Infrastructure Issues In A Large HEP Centre — July 8th 2003 — CERN.ch

CERN.ch 3 Why the change of title?
• I only have experience with an HEP centre…
• The non-commercial nature of the task we support influences choices
  – CERN does not lose money if the centre is not working.
    » At worst, accelerator exploitation stops, but failures elsewhere are much more likely.
  – We can't write costs off against (increased) profits
    » Pressure to minimise investment
  – We know we are planning for the long term
    » B513 upgrade planning started in …; LHC operation starts in 2007 and continues for years.

CERN.ch 4 Fundamental requirements
• Flexibility
• Adaptability
• Futureproofing
• Able to cope with change

CERN.ch 5 Fundamental requirements
• Flexibility
• Reliability & Redundancy

CERN.ch 6 Plan for the rest of the talk
• Equipment
  – needs an electrical supply
  – turns electrical power into heat at 100% efficiency
  – takes up space
  – can start a fire

CERN.ch 7 Electrical Supply Issues
• Demand
• Multi-level redundancy
• Backup
• Low voltage distribution network
• Recovery from failure
• Adapted to the load

CERN.ch 8 Electrical Supply Issues
• Demand
  – How will this grow? Impact of
    » demand for CPU capacity
    » power efficiency of computing systems
  – The future is unclear
    » Plausible historical evidence that W/SpecInt is constant (see the sketch below)
    » Vendor statements about the power budget per system type
• Multi-level redundancy
• Backup
• Low voltage distribution network
• Recovery from failure
• Adapted to the load
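A minimal sketch of how the "W/SpecInt is roughly constant" observation can be turned into a power projection: if the watts per unit of SpecInt capacity stay flat, electrical demand grows in step with installed CPU capacity. All figures in the sketch (watts per SpecInt, starting capacity, growth rate) are invented for illustration; they are not taken from the talk.

```python
# Illustrative projection only: if W/SpecInt stays constant, power scales
# directly with installed capacity. All figures below are assumptions.
watts_per_specint = 0.15   # assumed: roughly a 150 W box delivering ~1000 SpecInt
capacity_2003 = 6.0e6      # assumed installed SpecInt capacity (about 6,000 boxes)
annual_growth = 1.25       # assumed yearly growth in required CPU capacity

for year in range(2003, 2008):
    capacity = capacity_2003 * annual_growth ** (year - 2003)
    print(f"{year}: {capacity:.2e} SpecInt -> ~{capacity * watts_per_specint / 1e3:.0f} kW")
```

With these assumed numbers the load passes 2 MW around 2007, the same order of magnitude as the figures quoted later in the talk.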

CERN.ch 11 Electrical Supply Issues
• Demand
• Multi-level redundancy
  – High voltage supply & switchboards
  – Transformers
  – UPS
  – Low voltage supply, switchboards and distribution
  – [Equipment power supplies]
• Backup
• Low voltage distribution network
• Recovery from failure
• Adapted to the load

CERN.ch 12 Electrical Supply Issues
• Demand
• Multi-level redundancy
• Backup
  – Independent high voltage supply
    » CERN is lucky to have supplies from both EDF and EoS
  – UPS options
    » Rotary
    » Static, with or without diesel backup
  – Test your backup!
    » Override complaints that this causes pain elsewhere.
• Low voltage distribution network
• Recovery from failure
• Adapted to the load

CERN.ch 13 Electrical Supply Issues
• Demand
• Multi-level redundancy
• Backup
• Low voltage distribution network
  – Cables can be run to PDUs as necessary
  – A pre-installed busbar system is much better
  – Flexibility: distribution network capacity should exceed the available power.
    » B513: 2MW available, network sized to deliver 3MW (see the sketch below)
• Recovery from failure
• Adapted to the load
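A hedged sketch of the kind of check an over-sized, pre-installed busbar network makes easy: equipment can be moved between busbars freely as long as no busbar exceeds its rating and the total stays within the available site power. The busbar names, ratings and loads below are invented; only the 2 MW available / 3 MW distribution figures echo the slide.

```python
# Toy feasibility check for a pre-installed busbar system (invented figures).
SITE_POWER_KW = 2000        # power actually available (the 2 MW from the slide)
BUSBAR_RATING_KW = 500      # assumed rating of each busbar
loads_kw = {                # hypothetical load currently hung on each busbar
    "bus-A": 420, "bus-B": 480, "bus-C": 350,
    "bus-D": 300, "bus-E": 150, "bus-F": 100,
}

assert all(kw <= BUSBAR_RATING_KW for kw in loads_kw.values()), "a busbar is overloaded"
assert sum(loads_kw.values()) <= SITE_POWER_KW, "total load exceeds available power"
print(f"{sum(loads_kw.values())} kW drawn of {SITE_POWER_KW} kW available; "
      f"distribution can deliver {BUSBAR_RATING_KW * len(loads_kw)} kW")
```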

CERN.ch 14 Electrical Supply Issues
• Demand
• Multi-level redundancy
• Backup
• Low voltage distribution network
• Recovery from failure
  – Pre-failure triage
    » We will support 250kW on the diesels; the physics load is shed after 5 minutes.
  – Power should stay off if it goes off
  – Group services using the low voltage network
    » so that it is easy to power on services, not simply machines
    » Consider a dedicated supply (busbar) for, e.g., network switches
• Adapted to the load

CERN.ch 15 Electrical Supply Issues
• Demand
• Multi-level redundancy
• Backup
• Low voltage distribution network
• Recovery from failure
• Adapted to the load
  – Switched-mode power supplies are badly behaved
    » they generate harmonics and high currents in the neutral conductor
  – Early PC systems had a power factor of 0.7. Recently installed systems are better behaved (0.95), in line with the EU directive. Can we assume 0.95 in future?
    » If not, we need to install filters or higher-rated equipment upstream (UPS, switchboards, transformers)
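A small worked example of why the power factor matters upstream: UPS, switchboards and transformers must be rated for apparent power (kVA), so the same real load needs noticeably more installed capacity at a power factor of 0.7 than at 0.95. The 2 MW load is an assumption in line with the figures used elsewhere in the talk.

```python
# Apparent power (kVA) the upstream equipment must carry for a given real load,
# at the two power factors quoted in the talk.
real_load_kw = 2000.0                    # assumed ~2 MW computing load

for power_factor in (0.7, 0.95):
    print(f"PF {power_factor:.2f}: {real_load_kw / power_factor:.0f} kVA upstream")
# PF 0.70: ~2857 kVA;  PF 0.95: ~2105 kVA
```

Roughly a third more transformer and UPS capacity is needed at 0.7, which is why filters or higher-rated equipment become necessary if the 0.95 assumption does not hold.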

CERN.ch 16 HVAC Issues
• Water cooling
• Air cooling
• Acceptable temperature
• Redundancy

CERN.ch 17 HVAC Issues
• Water cooling is efficient, but
  – Who has the infrastructure (anymore)?
    » and this infrastructure is not very flexible
  – Will vendors really ship systems requiring it?
  – We could have water-cooled 19" racks, but these still need the infrastructure, and how do you ensure heat transfer from the PCs?
• Air cooling
• Acceptable temperature
• Redundancy

CERN.ch 18 HVAC Issues
• Water cooling
• Air cooling is simple and flexible, but
  – air has a low heat capacity
  – limited to 2kW/m² in our high computer centre and just 600W/m² in the vault.
  – Environmental problems:
    » Noise in the vault is 79dB, close to the 85dB limit for continuous exposure. It may be legal, but it is not comfortable (so what?)
    » We will need an air flow of 510,000m³/h in the machine room, or 60 air changes per hour (see the sketch below).
  – Need to monitor air flow: convection may not be enough to drive adequate flow, even in a hot/cold aisle layout.
• Acceptable temperature
• Redundancy
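As a rough cross-check of the air-flow figure, the volume flow needed to carry away a heat load with air is Q = P / (ρ · cp · ΔT). The 2 MW load and the 12 K supply-to-return temperature rise below are assumptions chosen only to show that the quoted ~510,000 m³/h is the right order of magnitude.

```python
# Air flow needed to remove a heat load: Q = P / (rho * cp * dT)
rho, cp = 1.2, 1005.0        # air density (kg/m^3) and specific heat (J/(kg K))
heat_load_w = 2.0e6          # assumed 2 MW machine-room load
delta_t_k = 12.0             # assumed supply-to-return temperature rise

q_m3_s = heat_load_w / (rho * cp * delta_t_k)
print(f"~{q_m3_s * 3600:,.0f} m^3/h")    # about 500,000 m^3/h
```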

CERN.ch 19 HVAC Issues
• Water cooling
• Air cooling
• Acceptable temperature
  – Fortunately, PC equipment is relatively tolerant
  – The machine room set point is 21°C
  – The vault set point is 25°C; the machine room set point will rise to 26°C
    » We would need 27°C for a 3.5MW load across the ground floor (2250m²) — see the check below
  – But these are set points; local temperatures are higher
    » 30°C+ in the hot aisle in the vault
• Redundancy
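The 3.5 MW over the 2250 m² ground floor quoted above works out at roughly 1.6 kW/m², close to but still under the 2 kW/m² air-cooling limit from the previous slides; a one-line check:

```python
# Average heat density implied by the figures on this slide.
print(f"{3500.0 / 2250.0:.2f} kW/m^2")   # ~1.56 kW/m^2, vs. the 2 kW/m^2 limit
```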

CERN.ch 20 HVAC Issues
• Water cooling
• Air cooling
• Acceptable temperature
• Redundancy
  – With a 2MW load in the computer centre, losing the air conditioning is as dramatic as losing the power supply.
    » the heat rises above the acceptable level in 10 minutes (see the estimate below)
    » and there is no privileged situation for the critical load
  – Large HVAC equipment (chillers, cooling stations) doesn't come with dual power supplies…
  – There are still too many single points of failure in B513
    » Human error, though, has been the cause of most failures in the last 5 years. The equipment is reliable, but is it coming to the end of its reasonable working life?
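A back-of-the-envelope look at why the loss of cooling is so dramatic: with the cooling stopped, the room air alone heats at dT/dt = P / (ρ · cp · V). The room geometry below is assumed (the 2250 m² floor from the previous slide with an assumed 5 m height); the 2 MW load is from the slide. The air-only estimate is several degrees per minute; the thermal inertia of racks and building fabric is what stretches the real rise to the quoted ~10 minutes.

```python
# How fast the room air alone would heat once cooling is lost
# (ignores the thermal mass of equipment and building, which slows things down).
rho, cp = 1.2, 1005.0              # air density (kg/m^3), specific heat (J/(kg K))
floor_m2, height_m = 2250.0, 5.0   # assumed room geometry
heat_load_w = 2.0e6                # 2 MW load from the slide

air_heat_capacity_j_per_k = rho * floor_m2 * height_m * cp
print(f"~{heat_load_w / air_heat_capacity_j_per_k * 60:.1f} K per minute (air only)")
```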

CERN.ch 21 Space
• Is space a problem?
  – HVAC limits the heat load to 2kW/m².
  – 40x150W PCs in shelving units generate 1.25kW/m².
  – 40x150W PCs in 19" racks generate 5.5kW/m².
• Not quite the full picture:

Future Machine Room Layout
[Layout diagram: 18m double rows of racks, 12 shelf units or 36 19" racks per double row]
• 528 box PCs: 105kW
• 1U PCs: 288kW
• 324 disk servers: 120kW(?)
• You need aisles for access and space around the sides for electrical and HVAC equipment.
• Essential network equipment has low dissipation.

CERN.ch 23 Space
• Is space a problem?
  – HVAC limits the heat load to 2kW/m².
  – 40x150W PCs in shelving units generate 1.25kW/m².
  – 40x150W PCs in 19" racks generate 5.5kW/m² (see the sketch below).
• Taking into account corridors and network equipment
  – the machine room can hold over 6,000 white box PCs. This may be acceptable.
  – 1U PCs are required to saturate the power supply
    » but is this the goal?
    » We will probably have a mix of both by 2007
• Use space wisely
  – Place robots in areas with limited HVAC capacity…
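The heat-density figures above follow from the power per unit divided by the floor area each unit occupies, including its share of aisle space. A sketch with footprints assumed so as to reproduce the slide's numbers (the footprints themselves are not given in the talk):

```python
# Heat density = (PCs per unit * W per PC) / floor area per unit (incl. aisle share).
pcs_per_unit, watts_per_pc = 40, 150.0
assumed_footprint_m2 = {"shelving unit": 4.8, '19" rack': 1.1}   # assumed areas

for name, area in assumed_footprint_m2.items():
    print(f"{name}: {pcs_per_unit * watts_per_pc / 1000.0 / area:.2f} kW/m^2")
# shelving unit: ~1.25 kW/m^2; 19" rack: ~5.45 kW/m^2
# (only the rack case exceeds the 2 kW/m^2 HVAC limit)
```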

CERN.ch 24 Fire Risks, Detection and Suppression
• Fire is an ever-present risk for computer centres, and an area where the non-commercial nature of our business makes a difference.
  – We can afford to shut down systems on a first alarm
    » or, at least, with today's level of false alarms
  – We can't afford gas suppression (cost & volume…)
    » Are "hi-fog" systems really appropriate for a machine room?
• The major problem is smoke localisation
  – Given the air flow, sensitive detectors will be triggered even at some distance from the source of smoke.
  – Localisation of detection requires restriction of the air flow, which works against cooling needs.
• We are concerned about halogenated materials.
  – A small, localised incident can produce large volumes of acrid smoke, leading to widespread damage.

CERN.ch 25 Questions (not a conclusion)
• How will CPU and system power demands evolve?
  – Will the power consumption of HEP-relevant systems follow the general trend?
• Can we assume a well-behaved load in future, even for individually packaged boxes?
  – i.e. a power factor >0.95
• Will water cooling make a comeback?
• Will we ever see halogen-free PCs?
  – and at what cost premium?