CERN Computer Facilities Evolution
Wayne Salter, CERN IT Department (Computing Facilities), CH-1211 Geneva 23, Switzerland
HEPiX, May 2011

Overview
– Reminder of the current status of the CERN computer facilities
– Overview of the current issues and anomalies
– Summary and status of the various evolution projects
– Closing remarks

CERN Computer Facilities Overview
– Designed and built in the early 1970s
– Fully refurbished around 2000
  – Increased power and improved cooling
– Nominal current capacity: 2.5MW (including 240kW of critical power)
– Extended (usable) capacity: 2.9MW (including 340kW of critical power), forfeiting redundancy for all UPS systems
  – But action must be taken in the event of a UPS module failure!
– Small capacity at a local hosting centre (17 racks and up to 100kW)
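The nominal/extended figures reflect a standard trade-off in modular UPS sizing: holding one module in reserve (N+1) protects against a module failure, while loading all modules raises usable capacity. A minimal sketch of that arithmetic follows; the module count and rating are hypothetical round numbers (not CERN's actual configuration), chosen only so the two results land near the 2.5MW and 2.9MW figures above, and the installation is treated as a single modular UPS for simplicity.

```python
# Hypothetical illustration of the N+1 redundancy trade-off described above.
# Module count and rating are invented for the example, not CERN's real setup.

def usable_capacity_kw(n_modules: int, module_kw: float, redundant: bool) -> float:
    """With N+1 redundancy one module is held in reserve, so only
    n_modules - 1 may be loaded; without redundancy all may be loaded."""
    active = n_modules - 1 if redundant else n_modules
    return active * module_kw

MODULES, RATING_KW = 7, 420.0  # hypothetical: 7 modules of 420kW each

print(usable_capacity_kw(MODULES, RATING_KW, redundant=True))   # 2520.0 -> ~2.5MW nominal
print(usable_capacity_kw(MODULES, RATING_KW, redundant=False))  # 2940.0 -> ~2.9MW extended
```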

Current Issues and Anomalies
– Cooling for the critical UPS room insufficient
– Mixing of critical and physics equipment
  – Location and cooling
– No cooling of the CC when running on UPS, and insufficient stored cooling capacity when on diesel
– Insufficient critical power available
– Approaching the limit of available power for the building
– No redundancy for the critical UPS
– Usage of the full ‘available’ 2.9MW implies loss of redundancy for the physics UPS
– A/C of the CC coupled to the adjacent office building
– How to meet CERN’s needs in the longer term?

Evolution Options
– Local hosting
  – Provide additional ‘critical power’
  – Allow some level of business continuity
– On-going improvements of the CC
  – Improve efficiency and resilience of the cooling system
– Upgrade of the current computer centre
  – Increase capacity from 2.9 to 3.5MW
  – Increase ‘critical power’ from 340 to 600kW
  – Address a number of long-term issues
– Remote hosting
  – Address the increasing capacity needs of CERN
  – Increase business continuity coverage

Local Hosting Overview
Reasons:
– Lack of ‘critical power’
– Provide some level of business continuity (BC)
– Gain experience with remote hosting
History:
– Price enquiry for hosting – August 2009
– Tender for networking – October 2009
– Planned start – April 2010
– Actual local hosting contract start – 15th June 2010
– Connectivity contract start – 1st July 2010
Contracts for local hosting and network connectivity have been renewed for a second year.

Summary of Current Status
– Agreement for 40m² (actually 36m²) and 100kW (two power feeds)
– 2 dark fibres on diverse paths
– 17 IT racks + 2 network racks
– 209 systems installed
– 14 racks and 57kW used (average power density of 4.1kW/rack; see the arithmetic below)
  – Only 3 racks remaining for 43kW of unused power!
– Interventions at the local hosting site:
  – Small number of sys admin interventions (2-3)*
  – 7-8 service manager interventions*
  – 33 vendor interventions
  – Many installation interventions
* Worrying in terms of ‘real’ remote hosting!
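The density and headroom figures above follow directly from the slide's own numbers; a quick worked check:

```python
# Worked arithmetic from the slide: average rack power density and the
# remaining headroom at the local hosting site. All inputs are slide figures.

racks_total, power_total_kw = 17, 100.0  # contracted: 17 IT racks, 100kW
racks_used, power_used_kw = 14, 57.0     # currently installed load

print(f"average density: {power_used_kw / racks_used:.1f} kW/rack")  # -> 4.1 kW/rack
print(f"headroom: {racks_total - racks_used} racks, "
      f"{power_total_kw - power_used_kw:.0f} kW")                    # -> 3 racks, 43 kW
```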

Layout
[Figure: layout diagram of the hosting room]

Issues
– Everything took longer than foreseen!
  – Contracts with the hosting company and the network provider
  – Preparation of the CERN room at the hosting company: partitioning, power and fibre connections, access card reader, access to InsideEyes, …
  – Getting equipment into production
– Network connectivity expensive
– Room smaller than expected (36 cf. 40m²)
– Problems with the ramp
– However, no significant problems running systems remotely as far as we have seen so far

Summary of Local Hosting
– Despite a few teething problems, the experience has generally been good
  – However, due to proximity, not everything was done remotely
  – Need to understand the reasons for this and how to avoid it
– Still not at full capacity
– Struggling to utilise the full available power
  – Low average rack power density (~4kW)
– A good step towards remote Tier0 hosting

INTEL Suggested Improvements
A report from INTEL was received recently, based on a review of the CC done at the end of 2010. Its suggestions:
– Improve monitoring of power, temperature (inlet, outlet, return air), humidity, etc.
– Improve/balance the airflow
– Use variable speed fans for the AHUs (the air flow is ‘twice’ the required rate); see the sketch below for why this matters
– Separate the cooling for critical and non-critical equipment
– Run the CC at a higher temperature
– Implement free cooling properly
– Improve the exhaust route for hot air
[Figure: wind direction diagram]
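The variable-speed-fan suggestion is worth quantifying. By the standard fan affinity laws (textbook HVAC, not from the slides), fan power scales roughly with the cube of fan speed, so constant oversized airflow is disproportionately expensive. A minimal sketch, assuming the ‘twice the required rate’ figure from the slide:

```python
# Fan affinity law sketch: fan power scales ~ with the cube of fan speed
# (equivalently, of the airflow fraction). The cube law is standard HVAC
# theory; the numbers below are illustrative, not CERN CC measurements.

def relative_fan_power(flow_fraction: float) -> float:
    """Fan power relative to full speed, for a given fraction of design airflow."""
    return flow_fraction ** 3

# The slide notes airflow is roughly twice the required rate, so variable
# speed fans could, in the ideal case, run at ~50% of design flow:
print(relative_fan_power(0.5))  # -> 0.125, i.e. roughly 1/8 of full fan power
```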

On-going Improvements
Work already done:
– Improved monitoring (allows us to calculate PUE fairly accurately; see the sketch below)
– Correction of the air input temperature
– Reduced intake of hot exhaust air
– Improved mixing of outside and re-circulated air
  – Full automation of air selection (% of outside vs. re-circulated)
– Change to the anti-freeze protection (delay and better mixing)
– The above measures are predicted to result in an 80% saving in chiller power
– All control system components connected to the UPS
Foreseen potential further improvements:
– Higher server inlet temperature
– Understand the pressure drop from the AHUs to the servers
– Optimise the air flow (across all aisles)
– Mixing depending on relative humidity (RH)
– Use variable speed fans for the AHUs to reduce the air flow to only what is needed
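For reference, PUE (Power Usage Effectiveness) is the ratio of total facility power to IT equipment power, which is why it requires exactly the kind of monitoring listed above. A minimal sketch with hypothetical readings (not CERN measurements):

```python
# PUE = total facility power / IT equipment power (The Green Grid's standard
# metric). A value of 1.0 would mean zero cooling/distribution overhead.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness; lower is better, 1.0 is the ideal."""
    return total_facility_kw / it_load_kw

# Hypothetical instantaneous readings: IT load plus cooling, UPS losses, etc.
print(round(pue(total_facility_kw=3200.0, it_load_kw=2400.0), 2))  # -> 1.33
```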

Goals of the CC Upgrade Project
– Solve the cooling issue for the critical UPS room
  – New UPS systems in a different location
– Increase critical capacity to 600kW
– Increase overall power capacity to 3.5MW
– Restore N+1 redundancy for both critical and physics UPS systems
– Secure cooling for critical equipment when running on UPS, and extend the stored cooling capacity for physics when on diesel
– Decouple the A/C of the CC from the adjacent office building

Power Requirements
[Figure: power distribution diagram. The normal network feeds 3.5MW in total: 2.9MW to IT physics equipment via the physics UPS (4+1), and 600kW to IT critical equipment and its cooling via the critical UPS (3+1), backed by an N+1 diesel generator.]
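The ‘(4+1)’ and ‘(3+1)’ notations constrain the module sizes: in an N+1 arrangement, the N remaining modules must carry the full load after one failure. A back-of-envelope check derived only from the figures in the diagram:

```python
# Minimum per-module rating implied by an N+1 UPS arrangement: after one
# module fails, the remaining N must carry the full load on their own.

def min_module_rating_kw(load_kw: float, n: int) -> float:
    """Smallest module rating such that N modules alone carry the load."""
    return load_kw / n

print(min_module_rating_kw(2900.0, 4))  # physics UPS (4+1): >= 725 kW per module
print(min_module_rating_kw(600.0, 3))   # critical UPS (3+1): >= 200 kW per module
```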

Approach
The Barn area of the CC building is to be converted to house:
– 3 new electrical rooms
– An IT room with water-cooled racks to house critical equipment (up to 450kW)
– New ventilation systems for the critical area of the main computer room (and also for the critical UPS room in the basement)
In addition:
– New ventilation systems for the telecoms rooms (CIXP) will be installed in adjacent offices
– New critical UPS systems in the basement
– A new, partially sunken building for additional chillers and a storage tank for the cooling of the critical areas
– An additional storage tank for extending the stored cooling capacity for physics equipment
– Opportunity to install emergency evacuation stairs

Schedule and Status
– The ‘Barn’ was cleared of IT equipment at the end of October
– Removal of cabling and ducting finished
– Civil engineering (CE) work just commencing
  – Delayed due to difficulty in freeing offices
– CE works to be completed November 2011
– EL+CV installations: Nov 2011 – Nov 2012
– Increased physics power available August 2012, and increased critical power November 2012
– The project takes a long time and has a high cost!
[Barn video]

Remote Tier0 Hosting – Some History
How do we provide resources once the CERN CC is full?
– Studies for a new CC on the Prévessin site
  – Four conceptual designs (2008/2009)
  – Lack of on-site experience
  – Expensive!
  – Lack of support from management
– Interest from Norway in providing a remote hosting facility
  – Initial proposal not deemed suitable
  – Formal offer not convincing
– Interest from other member states

Status
Call for interest at the Finance Committee (FC), June 2010:
– How much computing capacity could be provided for 4MCHF/year?
– Is such an approach technically feasible?
– Is such an approach financially interesting?
– Deadline: end of November 2010
Response:
– Surprising level of interest – 23+ proposals
– Wide variation in the solutions and capacity offered
– Many offering > 2MW
– Assumptions and offers not always clearly understood
– Wide variation in electricity tariffs (a factor of 8!)

Follow-Up
Visits:
– Visits to many sites
– Other consortia invited to CERN
Goals:
– To understand the proposals better
– To clarify CERN’s needs
Benefits:
– Seeing existing installations
– Triggered us to reconsider some of our ideas/preconceptions
– Allowed the consortia to understand our needs better
– Collected information for the technical specification

Schedule
Proposed timeline:
– Official decision on whether to go ahead: spring 2011
– Official letter to be sent explaining the procedure
– Tender during summer 2011, for adjudication end 2011/early 2012
– Initial installation in the first half of 2013 to test the operational model
– Gradual build-up in capacity in line with experiment needs

Summary
– The CERN CC is reaching the end of its capacity
– Further improvements to the CC are possible and are being implemented in parallel (e.g. monitoring and cooling)
  – Better monitoring capabilities
  – Better efficiency, but
  – We do not get additional computing capacity from this!
– Three options for providing increased capacity:
  – Local hosting: limited additional capacity
  – CC upgrade to 3.5MW: slow and expensive
  – Remote hosting: interesting, but introduces new challenges; it could, however, allow us to address business continuity properly
