24 x 7 Energy Efficiency February, 2007 William Tschudi

Data Center Research Roadmap
A “research roadmap” developed for the California Energy Commission outlines key areas for energy efficiency research, development, and demonstration, including strategies that can be implemented today.

Data Center research activities
- Benchmarking and 22 data center case studies
- Best practices identified
- Self-benchmarking protocol
- Power supply efficiency study
- UPS systems efficiency study
- Standby generation losses
- Performance metrics – computation/watt

LBNL data center demonstration projects
- “Air management”
- Outside air economizer
  - Contamination concerns
  - Humidity control concerns
- DC powering
  - Facility level
  - Rack level

LBNL data center Federal projects
- Case studies
- Technical assistance
- Emerging technology: investigating the use of infrared thermography as a visualization tool

Benchmarking energy end use

IT equipment load density

Overall power use in data centers [chart courtesy of Michael Patterson, Intel Corporation]

Performance varies
The percentage of energy that actually goes to computing varies considerably from center to center.

Percentage of power delivered to IT equipment [chart of benchmarked centers; average ratio: 0.49]

Benchmark results helped to find best practices
The ratio of IT equipment power to total power is an indicator of relative overall efficiency. Examining the individual systems and components in the centers that performed well helped to identify best practices. Let's talk about a few…
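
As an aside (not from the slides): this IT-to-total ratio is the reciprocal of what is now called PUE. A minimal sketch of the arithmetic, using the 0.49 benchmark average and a hypothetical 1,000 kW facility:

```python
def it_power_ratio(it_kw: float, total_kw: float) -> float:
    """Fraction of total facility power delivered to IT equipment."""
    return it_kw / total_kw

# Hypothetical 1,000 kW facility at the 0.49 benchmark average:
total_kw = 1000.0
it_kw = 0.49 * total_kw
ratio = it_power_ratio(it_kw, total_kw)
pue = 1.0 / ratio  # PUE (Power Usage Effectiveness) is the reciprocal of this ratio
print(f"IT ratio = {ratio:.2f}, PUE = {pue:.2f}")  # IT ratio = 0.49, PUE = 2.04
```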

Best practices topics identified through benchmarking

A word about appropriate environmental conditions…
- ASHRAE published thermal guidelines
  - The majority of IT suppliers participated
  - The guidelines allow most centers to relax setpoints relative to standard practice
- Recommended and allowable ranges of temperature and humidity are provided – at the inlet to the IT equipment
- High temperatures in the “hot aisles” and in the return air to the air conditioners are desirable

Temperature guidelines – at the inlet to IT equipment [chart: ASHRAE allowable maximum/minimum and recommended maximum/minimum temperatures]

Humidity guidelines – at the inlet to IT equipment [chart: ASHRAE allowable maximum/minimum and recommended maximum/minimum humidity]
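
For illustration, a small check of inlet readings against the recommended envelope. The numeric limits below (20-25 °C, 40-55% RH) are an assumption based on the 2004 ASHRAE Class 1 guidelines, not values read off the charts above:

```python
# Recommended envelope at the IT equipment inlet, assumed from the 2004
# ASHRAE thermal guidelines (20-25 degC, 40-55% RH).
RECOMMENDED_TEMP_C = (20.0, 25.0)
RECOMMENDED_RH_PCT = (40.0, 55.0)

def within_recommended(temp_c: float, rh_pct: float) -> bool:
    """True if an inlet reading falls inside the recommended envelope."""
    t_lo, t_hi = RECOMMENDED_TEMP_C
    rh_lo, rh_hi = RECOMMENDED_RH_PCT
    return t_lo <= temp_c <= t_hi and rh_lo <= rh_pct <= rh_hi

print(within_recommended(22.0, 45.0))  # True  -- inside the envelope
print(within_recommended(27.0, 45.0))  # False -- inlet too warm
```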

Best scenario – isolate cold and hot [diagram: cold aisle supply air at 70-75ºF]

Another isolation scheme

Fan energy savings – 75%
If mixing of cold supply air with hot return air can be eliminated, fan speed can be reduced.
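
The 75% figure is consistent with the fan affinity laws, under which fan power scales with the cube of fan speed. A minimal sketch of the arithmetic, assuming an ideal cube-law fan and an assumed speed reduction:

```python
def fan_power_fraction(speed_fraction: float) -> float:
    """Fan affinity law: power varies with the cube of fan speed."""
    return speed_fraction ** 3

# If isolation lets the fans slow to ~63% of design speed (an assumed figure),
# power falls to 0.63^3 ~= 0.25 -- roughly the 75% savings cited above.
speed = 0.63
print(f"Power fraction: {fan_power_fraction(speed):.2f}")      # 0.25
print(f"Energy savings: {1 - fan_power_fraction(speed):.0%}")  # 75%
```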

Better temperature control can allow raising the temperature in the entire data center [chart: ASHRAE recommended range vs. ranges measured during the demonstration]

Best practices – free cooling with air economizers

Encouraging outside air economizers
- Issue:
  - Many are reluctant to use air economizers
  - Outdoor pollutants and humidity control are considered an equipment risk
- Goal:
  - Encourage use of outside air economizers where the climate is appropriate
- Strategy:
  - Address concerns: contamination/humidity control
  - Quantify energy savings benefits

Project objectives
- Identify potential failure mechanisms
- Measure contamination levels inside and outside of data centers
- Observe humidity control
- Evaluate economizer effect on cumulative particulate exposure
- Compare particle concentrations to guidelines

Measurements inside the centers [chart: particle concentrations vs. the IBM standard, the EPA annual health standard, and the EPA 24-hour health standard and ASHRAE standard]

Outdoor measurements [chart: particle concentrations vs. the same IBM, EPA, and ASHRAE standards]

Indoor measurements (note scale)

Data center w/economizer

Humidity measurements [chart: ASHRAE recommended upper/lower limits and allowable upper/lower limits]

Findings
- Water-soluble salts in combination with high humidity can cause current leakage
- Static electricity (caused by humans) can occur with very low humidity
- Particle concentrations are typically an order of magnitude lower than the new ASHRAE limits (without an economizer)
- Filtration and humidity control on make-up air can provide environments similar to those in closed data centers

Best practices – Power conversion

Data Center power conversions [diagram: AC voltage conversions through the UPS (rectifier, battery/charger, inverter, bypass) and the server's multi-output power supply (PWM/PFC switcher producing unregulated DC, AC/DC and DC/DC stages, and voltage regulator modules delivering 12 V, 5 V, 3.3 V, and 1.1-1.85 V rails to the processor, SDRAM, graphics controller, drives, and I/O)]

Research illustrated large losses in power conversion [charts: measured efficiencies of uninterruptible power supplies (UPS) and of power supplies in IT equipment]
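
To see why cascaded conversions lose so much, multiply the stage efficiencies end to end. The stage values below are illustrative assumptions, not measurements from the LBNL study:

```python
from functools import reduce
from operator import mul

def chain_efficiency(stage_efficiencies):
    """End-to-end efficiency of cascaded power conversion stages."""
    return reduce(mul, stage_efficiencies, 1.0)

# Illustrative (assumed) stage efficiencies for a traditional AC path:
ac_stages = {
    "UPS (AC-DC-AC double conversion)": 0.88,
    "PDU / transformer": 0.97,
    "Server power supply (AC-DC)": 0.75,
}
efficiency = chain_efficiency(ac_stages.values())
print(f"Power reaching the server's DC rails: {efficiency:.0%}")  # ~64%
```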

DC powering data centers
Goal: show that a DC system could be assembled from commercially available components, and measure the actual energy savings – a proof-of-concept demonstration.

Included in the demonstration
- Side-by-side comparison of a traditional AC system with the new DC system
  - Facility-level distribution
  - Rack-level distribution
- Power measurements at conversion points
- Servers modified to accept 380 V DC
- Artificial loads to more fully simulate a data center

Typical AC distribution today [diagram: 480 V AC distribution; 380 V DC appears after the first-stage conversion in the server power supply]

Facility-level DC distribution [diagram: 480 V AC converted to 380 V DC for facility-level distribution]
380 V DC is delivered directly into the server, to the same point as in an AC-powered server. This eliminates the DC-AC conversion at the UPS and the AC-DC conversion in the server. Less equipment is needed as well.

Rack-level DC distribution [diagram: 480 V AC converted to DC at the rack level]

AC system loss compared to DC [charts: 7-7.3% measured improvement; 2-5% measured improvement vs. a rotary UPS]

Energy savings for a typical data center
- 20% or more facility-level energy savings, because:
  - Redundant UPS and server power supplies operate at reduced efficiency
  - Cooling loads would be reduced
  - The demonstration comparisons were against “best in class” systems, which performed better than the typical systems we benchmarked
- Further optimization of conversion devices/voltages is possible
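
A hedged back-of-envelope for how a conversion-chain saving could compound into a facility-level figure; both numbers below are assumptions chosen for illustration (the 20% claim rests on typical, not best-in-class, baselines):

```python
# Both figures below are assumptions for illustration, not measured values:
direct_saving = 0.12     # assumed power-chain saving vs. a *typical* AC site
cooling_per_watt = 0.7   # assumed W of cooling avoided per W not dissipated
facility_saving = direct_saving * (1 + cooling_per_watt)
print(f"~{facility_saving:.0%} facility-level savings")  # ~20%
```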

Demonstration set-up – see website for more detail

DC power – next steps
- DC power pilot installation(s)
- Standardize distribution voltage
- Standardize DC connector and power strips
- Server manufacturers develop power supply specification
- Power supply manufacturers develop prototype
- UL and communications certification

Design and Training Resources
Design guidelines are available through PG&E's Energy Design Resources website. A web-based training resource is available on LBNL's website.

website:

Discussion/Questions??