
1 CTO Challenge William Tschudi wftschudi@lbl.gov 510-495-2417 February 27, 2008

2 –Selected LBNL data center findings –Future bold moves

3 Benchmarks of energy end use

4 Overall electrical power use (courtesy of Michael Patterson, Intel Corporation)

5 Your mileage will vary: the relative percentage of energy that goes to computing varied considerably across the centers benchmarked.

6 High-level metric: ratio of electricity delivered to IT equipment. Average: 0.57 (higher is better). Source: LBNL benchmarking.

7 High-level metric: average 0.57 (source: LBNL benchmarking). CTO Challenge: get everyone to this level.
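As a concrete illustration of the metric, a minimal sketch; the 570 kW / 1,000 kW split is a hypothetical facility chosen to land on the benchmark average:

```python
def it_power_ratio(it_power_kw: float, total_power_kw: float) -> float:
    """Fraction of total facility electricity delivered to IT equipment.

    LBNL benchmarking found an average of 0.57; higher is better.
    """
    if total_power_kw <= 0:
        raise ValueError("total power must be positive")
    return it_power_kw / total_power_kw

# Hypothetical facility: 570 kW of IT load in a 1,000 kW data center.
print(round(it_power_ratio(570, 1000), 2))  # 0.57
```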

8 On-line profiling tool: “Data Center Pro”
Inputs: description; utility bill data; system information (IT, cooling, power, on-site generation)
Outputs: overall picture of energy use and efficiency; end-use breakout; potential areas for energy efficiency improvement; overall energy use reduction potential

9 DOE Save Energy Now Data Center program: major program elements
1. Develop and test “DC Pro” software using pilot energy assessments
2. Create consensus metrics
3. Create and publicize Save Energy Now case studies based on pilot energy assessments
4. Create best-practice information and a training curriculum
5. Develop a Qualified Specialists program for data centers
6. Create guidelines for “Best-in-Class” data centers within various classes of data centers, including strategies for incorporating distributed generation technologies

10 Federal Energy Management Program: best practices showcased at Federal data centers; pilot adoption of Best-in-Class guidelines at Federal data centers; adoption of a to-be-developed industry standard for Best-in-Class at newly constructed Federal data centers
EPA: metrics; server performance rating & ENERGY STAR label; data center performance benchmarking
Industrial Technologies Program: tool suite & metrics; energy baselining; training; qualified specialists; case studies; certification of continual improvement; recognition of high energy savers; best-practice information; Best-in-Class guidelines
Industry: tools; metrics; training; best-practice information; Best-in-Class guidelines; IT work productivity standard

11 Energy assessment tools
Data center assessment output: overall energy performance (baseline) of the data center; performance of IT & infrastructure subsystems compared to benchmarks; prioritized list of energy efficiency actions and their savings, in terms of energy cost ($), source energy (Btu), and carbon emissions (Mtons)
Modules: IT (servers, storage & networking, software); power systems (UPS, distribution); cooling (air management, CRAC/CRAH, AHU, chillers); on-site generation (renewables, co-gen)

12 Energy efficiency opportunities are everywhere
Server load / computing operations: load management; server innovation
Cooling equipment: better air management; better environmental conditions; move to liquid cooling; optimized chilled-water plants; use of free cooling
Power conversion & distribution: high-voltage distribution; use of DC power; highly efficient UPS systems; efficient redundancy strategies
Alternative power generation: on-site generation; waste heat for cooling; use of renewable energy/fuel cells

13 HVAC best practices: air management; air economizers; humidification control; centralized air handlers; low-pressure-drop systems; fan efficiency; cooling plant optimization; water-side economizer; variable-speed chillers; variable-speed pumping; direct liquid cooling

14 Electrical best practices: UPS systems; self-generation; AC-DC distribution; standby generation

15 Best practices and IT equipment: power supply efficiency; standby/sleep power modes; IT equipment fans; virtualization; load shifting

16 Best practices, cross-cutting and misc. issues: motor efficiency; right-sizing; variable-speed drives; lighting; maintenance; continuous commissioning and benchmarking; heat recovery; building envelope; redundancy strategies

17 Design guidelines for ten best practices were developed. The guides are available through LBNL’s website and PG&E’s Energy Design Resources website.

18 CTO Challenge: some bold steps to improve energy efficiency (and save your customers money)
–Broaden recommended and allowable ranges of environmental conditions
–Debunk contamination and ESD fears
–Move to liquid cooling
–Integrate computing equipment and the building
–Minimize power conversion loss, end to end
–Facilitate IT – Facilities – CFO understanding

19 CTO Challenge: broaden recommended and allowable ranges of environmental conditions
–HVAC energy use can be greatly reduced if higher temperatures can be used for cooling IT equipment (using air or liquid)
–ASHRAE is addressing this, but not yet on a scientific basis
–IT equipment operating at 80 °F or higher has huge energy implications
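In practice, broadened bands reduce to a simple range check at the IT equipment inlet. A sketch with assumed Fahrenheit bands; the numeric limits below are illustrative placeholders, not ASHRAE's official values:

```python
# Illustrative check of IT inlet temperature against ASHRAE-style bands.
# Band limits are assumed placeholder values, not the published limits.
RECOMMENDED_F = (68.0, 77.0)   # assumed recommended range, deg F
ALLOWABLE_F = (59.0, 90.0)     # assumed allowable range, deg F

def classify_inlet_temp(temp_f: float) -> str:
    """Classify an inlet temperature reading against the two bands."""
    lo_rec, hi_rec = RECOMMENDED_F
    lo_all, hi_all = ALLOWABLE_F
    if lo_rec <= temp_f <= hi_rec:
        return "recommended"
    if lo_all <= temp_f <= hi_all:
        return "allowable"
    return "out of range"

print(classify_inlet_temp(80.0))  # "allowable" under these assumed bands
```

The point of the slide is that widening these bands (raising the upper limits) lets the cooling plant run warmer, which is where the energy savings come from.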

20 Temperature guidelines at the inlet to IT equipment [chart: ASHRAE recommended and allowable minimum/maximum temperatures]

21 Humidity guidelines at the inlet to IT equipment [chart: ASHRAE recommended and allowable minimum/maximum humidity]

22 CTO challenge: broaden environmental conditions [chart: ASHRAE recommended and allowable minimum/maximum ranges]

23 Metric (Total Power / IT Power): 1.2 for the planned LBNL supercomputer facility, versus an average of 1.74 for the facilities measured.
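The slide's figure is total facility power divided by IT power; a quick sketch, where the kW figures are invented purely to reproduce the slide's ratios:

```python
def total_to_it_ratio(total_power_kw: float, it_power_kw: float) -> float:
    """Total facility power over IT power (lower is better; 1.0 is ideal)."""
    if it_power_kw <= 0:
        raise ValueError("IT power must be positive")
    return total_power_kw / it_power_kw

# Invented kW figures that reproduce the slide's two ratios.
print(round(total_to_it_ratio(1740, 1000), 2))  # 1.74 (measured average)
print(round(total_to_it_ratio(1200, 1000), 2))  # 1.2  (planned LBNL facility)
```

Note that this ratio is the reciprocal of the IT-power fraction from the earlier benchmarking slide: 1 / 0.57 is roughly 1.75.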

24 CTO Challenge: debunk contamination and ESD fears
–Direct use of outside air for cooling can yield large HVAC savings, but fears of contamination hinder its adoption
–LBNL studies suggest this should not be a problem
–Failure data attributable to contamination has been requested; none has been produced
–ESD is poorly understood

25 Outdoor measurements [chart: particle concentrations vs. the IBM standard, the EPA annual health standard, and the EPA 24-hour health standard / ASHRAE standard]

26 Measurements inside the centers [chart: particle concentrations vs. the IBM standard, the EPA annual health standard, and the EPA 24-hour health standard / ASHRAE standard]

27 CTO Challenge: move to liquid cooling
–Liquid can remove roughly 3,500 times as much heat as air, per unit volume
–Liquid cooling could eliminate (or greatly reduce) the need for chillers
–Liquid is creeping in now; how do we accelerate it?
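The ~3,500x figure can be sanity-checked from volumetric heat capacity (density times specific heat) of water versus air; the property values below are rounded room-temperature figures:

```python
# Back-of-envelope check of the "~3,500x" claim: volumetric heat capacity
# (density * specific heat) of water vs. air, rounded room-temperature values.
WATER_J_PER_M3_K = 998.0 * 4186.0   # kg/m^3 * J/(kg*K)
AIR_J_PER_M3_K = 1.2 * 1005.0

print(round(WATER_J_PER_M3_K / AIR_J_PER_M3_K))  # 3464, i.e. roughly 3,500
```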

28 CTO Challenge: integrate computing equipment and the building
–Often multiple fans operate in series
–Air and liquid cooling
–High delta-T is efficient
–Eliminate boxes
–Control HVAC from the servers' on-board sensors (demo being planned)

29 CTO Challenge: minimize power conversion loss, end to end
–On-site generation
–Distribute high-voltage AC or DC
–Eliminate conversions through the use of DC
–Insist on high-efficiency power supplies and UPS systems
–Optimize DC conversions in the box
–AC to the chip?
–Redundancy

30 Measured UPS efficiency [chart: efficiency under redundant operation]

31 Data center power conversions [diagram: the power chain runs from the uninterruptible power supply (UPS: rectifier, battery/charger, inverter, bypass) through the power distribution unit (PDU) to the server. Inside the server, a multi-output power supply (PWM/PFC switcher converting AC to unregulated DC, then to regulated 3.3 V, 5 V, and 12 V outputs) feeds DC/DC voltage regulator modules supplying the processor (1.1–1.85 V), SDRAM (1.5/2.5 V), memory controller, graphics controller, I/O, and internal/external drives]
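The losses along that AC/DC chain multiply, which is why each stage matters. A sketch of the end-to-end arithmetic; the stage efficiencies are invented illustrative numbers, not measurements from the slides:

```python
from functools import reduce

def end_to_end_efficiency(stage_efficiencies) -> float:
    """Overall efficiency of a chain of power conversions (product of stages)."""
    return reduce(lambda acc, eff: acc * eff, stage_efficiencies, 1.0)

# Assumed illustrative stage efficiencies: UPS, PDU, server PSU, VRM.
stages = [0.90, 0.98, 0.75, 0.85]
print(round(end_to_end_efficiency(stages), 3))  # 0.562
```

Under these assumptions only about 56% of the power drawn at the UPS input reaches the chip, which is the motivation for eliminating conversions and insisting on high-efficiency supplies.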

32 UPS factory measurements [chart: efficiency under typical operation]

33 Power supply efficiency [chart: efficiency under typical operation]

34 CTO and CFO Challenge: facilitate IT – Facilities – CFO understanding
–Disconnect between facilities and IT
–Operating budget vs. capital budget
–Operating cost equals or exceeds the capital cost of IT equipment
–How do we get CFOs engaged?
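One way to make the operating-vs-capital point concrete for a CFO is to total the electricity cost over the equipment's life, with facility overhead folded in via a total-to-IT power multiplier. All numbers below (rack power, tariff, multiplier, lifetime) are invented assumptions:

```python
def lifetime_energy_cost(it_power_kw: float, years: float,
                         usd_per_kwh: float = 0.10,
                         overhead_factor: float = 2.0) -> float:
    """Electricity cost over the equipment's life.

    overhead_factor is the total-to-IT power ratio (cooling, conversion
    losses, etc.). The 0.10 $/kWh tariff and factor of 2.0 are assumed
    illustrative defaults, not measured values.
    """
    hours = years * 8760  # hours per year
    return it_power_kw * overhead_factor * hours * usd_per_kwh

# A hypothetical 10 kW rack over 4 years: roughly $70k of electricity,
# which can rival the capital cost of the IT gear itself.
print(round(lifetime_energy_cost(10, 4)))  # 70080
```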

35 Websites: http://hightech.lbl.gov/datacenters/ and http://www1.eere.energy.gov/industry/saveenergynow/partnering_data_centers.html

