1
Data Centre Capacity and Electricity Use
How the EU Code of Conduct can help your data centre electricity bill
Chris Cartledge, Independent Consultant
C.Cartledge@sheffield.ac.uk
2
Summary
ICT Electricity Use Footprint
Data Centre Electricity Use
The Electricity Bill
PUE and DCIE
ASHRAE 2008
European Code of Conduct
3
University of Sheffield ICT Electricity Use (2008)
More than £1M/year, ~20% of total institution use
PCs dominate
Servers: 31% (including HPC & departmental)
4
University of Sheffield Data Centres Electricity
Servers, network, PABX
Over 40% of ICT use: £400,000 p/a
Including departmental & remote cabinets
5
Data Centre Study
Half a dozen universities in the North of England
Primary room: 50 cabinets, 120 kW; old, but possibly refurbished; was a mainframe room
Secondary room: 25 cabinets, 75 kW; recent (built to a price, since 2000); was a plant room
6
Typical University Data Centre Room
UPS, but no generator
Conventional aircon
Dark, usually with the lights off...
Open plan: no aisle containment
Low density: typically 3 kW/cabinet (1.5 kW/m²), up to about 10 kW/cabinet for HPC
Often no hot aisle/cold aisle separation, so cooling is not efficient
ALMOST FULL
7
Electricity
Typical bill: £350,000
Estates:
– Building and plant
– Pays the electricity bill
– Meter data often limited
– No input on IT spend
– Major projects, CDM, M&E, PABX, etc.
Computing Services:
– Must deliver the IT service
– No knowledge of the bill
– Unable to monitor use
– Buys equipment blind: VMware, thin client, SAN, PoE, IPT, etc.
* Limited communication and understanding *
8
Power Usage Effectiveness (PUE)
Preferred measure of data centre efficiency; some also quote data centre infrastructure efficiency (DCIE = 1/PUE)
PUE = total data centre power / IT power
Or, better, over a set period: PUE = total data centre energy consumption / IT energy consumption
For example, with 160 kW of IT equipment (two almost full data centres):
– Typical cooling, power conditioning, etc. overhead of 130 kW: PUE = (160+130)/160 = 1.81; annual electricity cost @ 14p/unit = £356,000
– Best practice overhead of 35 kW: PUE = (160+35)/160 = 1.22; cost @ 14p/unit = £239,000
– Potential saving: £116,500 (worked through in the sketch below)
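A minimal sketch of the arithmetic on this slide, assuming a flat 14p/kWh tariff and a constant 24×7 load (the function names are illustrative, not from the Code of Conduct):

```python
# Worked PUE/DCIE example matching the figures above.
# Assumes a constant electrical load billed at 14p/kWh all year round.

HOURS_PER_YEAR = 24 * 365
TARIFF_GBP_PER_KWH = 0.14  # 14p per unit, as quoted on the slide


def pue(it_kw: float, overhead_kw: float) -> float:
    """PUE = total facility power / IT power."""
    return (it_kw + overhead_kw) / it_kw


def annual_cost_gbp(it_kw: float, overhead_kw: float) -> float:
    """Annual electricity bill for the whole facility, in GBP."""
    return (it_kw + overhead_kw) * HOURS_PER_YEAR * TARIFF_GBP_PER_KWH


IT_KW = 160.0  # two almost full data centres

typical = annual_cost_gbp(IT_KW, 130.0)  # PUE 1.81 -> ~£356,000
best = annual_cost_gbp(IT_KW, 35.0)      # PUE 1.22 -> ~£239,000
print(f"Typical: PUE {pue(IT_KW, 130.0):.2f}, £{typical:,.0f}/year")
print(f"Best practice: PUE {pue(IT_KW, 35.0):.2f}, £{best:,.0f}/year")
print(f"Potential saving: £{typical - best:,.0f}")    # ~£116,500
print(f"DCIE = 1/PUE = {1 / pue(IT_KW, 130.0):.0%}")  # ~55%
```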
9
Data Centre Set Point
Reported average: 21.5°C (values from 20°C to 25°C)
ASHRAE recommendation now 18°C to 27°C; was 20°C to 25°C up to late 2008
4% saving claimed for each 1°C higher
Up to 20% aircon saving by raising to 26°C? (See the rough check below)
But at what risk to service? Less safe time in the event of aircon failure
What actual saving? Fans may work harder; aircon performance is not simple
Who initiates/manages such a change?
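A rough check of that "up to 20%" figure, assuming the claimed 4% saving per °C compounds multiplicatively; as the slide notes, real plant behaviour (fan power, free cooling) is not this simple:

```python
# Rough estimate of cooling energy saved by raising the set point.
# Assumes the quoted 4% saving per degree compounds; actual aircon
# behaviour (fan power, humidity control, free cooling) is more complex.

def cooling_saving(current_c: float, new_c: float,
                   saving_per_degree: float = 0.04) -> float:
    """Fraction of cooling energy saved by raising the set point."""
    return 1.0 - (1.0 - saving_per_degree) ** (new_c - current_c)

# From the reported 21.5 degC average up to 26 degC:
print(f"{cooling_saving(21.5, 26.0):.0%}")  # ~17%, broadly "up to 20%"
```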
10
PUE and Plant: Not the Whole Story!
Is the dark machine room really dark, with the lights out?
Is obsolete equipment actually switched off?
Is idle equipment actually switched off?
Are the most efficient servers being purchased?
Is storage being used efficiently (SAN)?
Is virtualisation used whenever practicable?
Is equipment in a hot aisle/cold aisle arrangement?
Is power consumption part of software evaluation?
11
European Code of Conduct
European Code of Conduct on Data Centres Energy Efficiency
Best Practice Guidelines to enable change:
– A plan for data centre management
– About 120 good practices, covering all aspects
Unlikely to become compulsory:
– HEFCE mindful of university independence
– But institutions can sign up, for brownie points
Real savings to be made
A standard plan for managing data centre capacity
12
Best Practice Guidelines
Easy to read and clearly presented
A complete project plan, with a description of the project team
Practices logically categorised, e.g. deployment of new IT services
Practices sequenced by time of implementation, e.g. on new IT equipment or plant refit
Practices scored 1-5 in terms of likely value
13
Group Involvement
Establish a cross-disciplinary change board:
– Consider impacts, ensure effective solutions
– Definition of standard IT hardware
– M&E implications of new services
– Audit existing equipment
Optimise and consolidate where possible: virtualisation, set point
Identify and deal with little-used and unused services
14
Sample
15
Some Top Rated Practices
Buy energy-efficient IT devices
Use virtualised servers and storage
Switch off hardware for unused services
Virtualise little-used services
Separate cold air from heated return air
Use free or economised cooling
Increase temperature set points
16
Conclusion
There is a lot that can be done to:
– Improve quality of provision
– Reduce electricity consumption and costs
– Meet the wider agenda
Good guidance, documentation and training are now available
There are issues:
– Split responsibilities
– Costs are currently hidden
– Investment may be needed to make progress
17
References
European Code of Conduct on Data Centres Energy Efficiency
– http://re.jrc.ec.europa.eu/energyefficiency/html/standby_initiative_data_centers.htm
ASHRAE, the American Society of Heating, Refrigerating and Air-Conditioning Engineers
– http://tc99.ashraetcs.org/documents/ASHRAE_Extended_Environmental_Envelope_Final_Aug_1_2008.pdf
The Green Grid, PUE nomenclature and supporting information
– http://www.thegreengrid.org/en/Global/Content/white-papers/Usage%20and%20Public%20Reporting%20Guidelines%20for%20PUE%20DCiE
Savings reference
– http://www.datacenterknowledge.com/archives/2009/01/29/hvac-group-says-data-centers-can-be-warmer/