War Stories: Implementing ASHRAE's 2011 Allowable Ranges in Data Centers
Mark Monroe, Vice President & CTO, DLB Associates Consulting Engineers
Session FM 6.12: Energy/Power/Cooling
Opening Comments
Before ASHRAE first published the Thermal Guidelines book in 2004, there were no vendor-neutral standards for data center temperature and humidity; common temperatures were 64 – 68°F (18 – 20°C). ASHRAE's Thermal Guidelines recommend temperatures up to 80.6°F (27°C), with allowable temperatures even higher. Although a number of data centers operate above 68°F (20°C), those operating temperatures are often in the range of 72 – 75°F (22 – 24°C); only a minority of data centers operate above 75°F (24°C). Today's presentation shares case studies and experiences about operating at higher temperatures and/or using chiller-less approaches. We hope this will encourage more companies to take advantage of the ASHRAE Recommended and Allowable envelopes.
Agenda
PART 1: ASHRAE 2011 THERMAL GUIDELINES OVERVIEW
PART 2: CASE STUDIES
- Retrofit Enterprise Data Center (US Southeast)
- Greenfield Mega Internet Data Center (Europe)
- Existing Colocation Data Center (Europe)
- Greenfield Colocation Data Center (US Southeast)
- Greenfield Mega Internet Data Center (Europe)
CLOSING COMMENTS
GOAL: Demonstrate capital & operational savings through design aligned with ASHRAE Thermal Guidelines
Part 1: ASHRAE 2011 Thermal Guidelines Overview
Thermal Guidelines - ALLOWABLE Environmental Envelopes
Gartner Interactive Poll Question Result – Dec 2012
At what equipment inlet temperature do you run your data center?
- Colder than 65°F: 4%
- 65°F to 68°F: 10%
- 69°F to 72°F: 38%
- 73°F to 76°F: 30%
- 77°F to 80°F: 9%
- Warmer than 80°F: 1%
- Not sure, but the room feels nice and cool: 7%
- Not sure, but the room always feels warm: 1%
(ASHRAE Recommended Inlet 80.6°F)
Recommended and Allowable: This is Recommended. This is Allowable.
Is This How We Drive Our Data Centers? (speedometer gauge, MPH)
Is This How We Drive Our Data Centers? (temperature gauge, °F)
Why Condition Data Centers?
To reduce the possibility of IT equipment failures. (Svante Arrhenius – the Arrhenius equation)
Arrhenius Model
Net effect of raising inlet temperature 5°C? Can't tell without a detailed study.
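For reference, the Arrhenius relationship behind these two slides, in the standard textbook form used for temperature-driven failure acceleration (not taken from the deck): the rate constant is

$$k = A\, e^{-E_a/(k_B T)}$$

and the acceleration factor between two absolute temperatures $T_1 < T_2$ is

$$AF = \exp\!\left[\frac{E_a}{k_B}\left(\frac{1}{T_1}-\frac{1}{T_2}\right)\right]$$

where $E_a$ is the activation energy and $k_B$ is Boltzmann's constant. It is component temperature, not room setpoint, that enters this equation, which is why the slide says the net effect of a 5°C inlet increase cannot be judged without a detailed study.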
Google Disk Temperature/Reliability Study source:
Time At Temperature Is The Key
1 hr at 133,000 hrs + 1 hr at 120,000 hrs ≈ 2 hr at 126,500 hrs
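Reading the figures on this slide as a time-weighted average (assuming 133,000 hr and 120,000 hr are the reliability figures, e.g. MTTF, for one hour spent at each of two different temperatures):

$$\overline{MTTF} = \frac{1\,\text{hr}\times 133{,}000\,\text{hr} + 1\,\text{hr}\times 120{,}000\,\text{hr}}{2\,\text{hr}} = 126{,}500\ \text{hr}$$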
ASHRAE White Paper
ASHRAE Estimated Reliability
Lower bound, average, and upper bound are included since there is variation in server configuration & utilization.
Statistics In The ASHRAE Paper
Net X-Factor Across The Whole Year
The ASHRAE Weather Data Viewer provides statistical averages based on the most recent 25 years of data. In Las Vegas, IT hardware failure with a variable data center temperature is ONLY 13% higher than if the data center were operating at a tightly controlled temperature of 68°F (20°C).
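A minimal sketch of the net X-factor calculation described here: weight the relative failure rate (X-factor) for each inlet-temperature bin by the hours the data center spends in that bin over the year, then divide by total hours. The bin hours and X-factor values below are illustrative placeholders, not the ASHRAE tables or the Las Vegas figures.

```python
# Illustrative net X-factor: time-weighted average of relative failure rates.
# Bin hours and x-factors are placeholders, NOT ASHRAE published values.

def net_x_factor(bins):
    """bins: list of (hours_per_year, x_factor) tuples."""
    total_hours = sum(h for h, _ in bins)
    weighted = sum(h * x for h, x in bins)
    return weighted / total_hours

example_bins = [
    (3000, 0.85),   # hours at cool inlet temps, relative failure rate < 1
    (4000, 1.00),   # hours near the 20 C reference point
    (1500, 1.15),   # hours at warmer inlet temps
    (260,  1.30),   # short excursions near the top of the allowable range
]

print(f"Net annual X-factor: {net_x_factor(example_bins):.2f}")
```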
Avg. Net Failure Rates for Air-Side Economization
Weather data from the ASHRAE Weather Data Viewer (statistical averages based on the most recent 25 years of data). Peak temperature has minimal impact on server failure.
© ASHRAE; table reformatted by DLB Associates
Part 2: Case Studies
Case Study Overview (Site | Location | Type | Emphasis)
1. Retrofit | US Southeast | Enterprise | Increase Setpoint, Increase Free Cooling Hours, Improve Chiller Efficiency
2. Greenfield | Europe | Mega IDC | Increase Setpoint, Increase Free Cooling Hours, Reduced Mechanical Cooling Plant, Climate Analysis, X-factor Analysis
3. Existing DC | Europe | Colocation | Increase Setpoint, Increase Free Cooling Hours
4. Greenfield | US Southeast | Colocation | Increase Setpoint, Increase Free Cooling Hours, Improve Chiller Efficiency, Reduced Mechanical Cooling Plant, Climate Analysis, X-factor Analysis
5. Greenfield | Europe | Mega IDC | Increase Setpoint, Increase Free Cooling Hours, Reduced Mechanical Cooling Plant, Climate Analysis
Case Study 1: Retrofit Enterprise Data Center (US Southeast)
Case Study 1: Overview
The Green Grid – ROI of Cooling System Energy Efficiency Upgrades
- Study published by The Green Grid
- Member company in the southeastern U.S.
- Independently verified by TGG members
- Published for public distribution
- A step-by-step implementation of Energy Efficiency Measures (EEMs), with an ROI calculation on each one
- Included raising the temperature of the supply water and the data center halls
Case Study 1: Floor Plan & Metrics
- Raised Floor Area: 33,000 ft² (3,066 m²)
- Raised Floor Height: 24-36 in (61-91 cm)
- Cooling System: Chilled Water Plant feeding CRAH units
- Total IT Load: 2 MW
Case Study 1 Energy Efficiency Measures (EEM)
MEASURES IMPLEMENTED
- Variable Speed Drives on CRAHs
- Upgrade end-of-life CRAH units with newer, more efficient ones
- Improve rack airflow management with blanking plates and baffles
- Reposition CRAH temperature/humidity sensors to the front of IT racks
- Adjust temperature set points for CRAHs and chiller plant Leaving Water Temperature (LWT)
GOALS
- Implement EEMs one at a time; measure impact carefully
- Calculate savings and ROI of each EEM independently and together
Case Study 1 Details (in whitepaper)
Case Study 1 Details (PUE vs EEM Implemented)
Case Study 1 Details (Temperature Changes)
CHILLED WATER LEAVING WATER TEMP
- Raised from 44°F (6.7°C) to 46°F (7.8°C)
- Measured savings of 5% of the chiller energy
- Annual savings: 893,000 kWh, $101,000
- ZERO capital cost
INLET AIR TEMP
- Raised from 68°F (20°C) to 71°F (21.7°C)
- Raised 1°F (0.6°C) every few days; measured energy impact
- Enabled by airflow management EEMs (baffles, blanking panels, CFD modeling)
- 9.1% overall energy savings
Case Study 1 Overall Results
- 300 kW power savings
- $300,000 per annum financial savings
- $506,000 implementation cost
- Simple ROI: 1 year 8 months
- ROI of the temperature increase alone: ZERO months (immediate)
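The simple ROI figure follows directly from the cost and savings above:

$$\frac{\$506{,}000}{\$300{,}000/\text{yr}} \approx 1.7\ \text{yr} \approx 1\ \text{year, }8\ \text{months}$$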
Case Study 2: Greenfield Mega Internet Data Center (Europe)
Case Study 2 Overview
A European greenfield site was designed completely WITHOUT chillers. This was achieved through:
- Climate and cooling analysis
- Application of the ASHRAE Thermal Guidelines
- Alignment with the client's value system
By raising the temperature, operating in the upper end of the Recommended range, AND accepting occasional excursions beyond that range, a CHILLERLESS design was achieved.
Case Study 2 Floor Plan & Metrics
- Raised Floor Area: ~215,000 ft² (20,000 m²)
- Raised Floor Height: 48 in (122 cm)
- Cooling System: Waterside Economizer feeding in-row cooling units
- Total IT Load: UNDISCLOSED
(Floor plan legend: Cooling, Power, Server Space, Support Space. © DLB Associates)
Case Study 2: Energy Efficiency Measures (EEM)
The best practice of isolating the hot aisle from the cold aisle ensures mixing does not occur and was key to pushing the temperature limits of energy efficiency in the facilities. The servers exhaust heat into an enclosed hot aisle, where air is drawn by fans across a chilled water coil and back into the room. The room acts as one big cold aisle.
(Plan view: room as a cold aisle at 80°F [27°C]; enclosed hot aisles >105°F [41°C] behind the server rows. Elevation: fans draw hot-aisle air across a cooling coil. © DLB Associates)
Case Study 2 Details (Wet Bulb Temp. Information)
(Chart: annual wet bulb temperature frequency, as a percentage, over -22°F to 86°F [-30°C to 30°C]. © DLB Associates)
Case Study 3: Existing Colocation Data Center (Europe)
Case Study 3 Overview
An existing European data center was examined for opportunities to reduce PUE and decrease energy costs. The data center consists of two data halls (plus one future hall). The cooling system consists of air-cooled chillers which supply chilled water to the CRAC units in the server area. A total PUE of 1.70 was calculated, with the current PUE contributions from the various systems broken out in the accompanying chart.
Case Study 3 Floor Plan & Metrics
- Raised Floor Area: 2 x 18,000 ft² (2 x ~1,670 m²)
- Raised Floor Height: 2.6 ft (0.8 m)
- Cooling System: Chilled Water Plant with Air-cooled Chillers feeding CRAH Units
- Total IT Load: 8 MW
(First floor and second floor plans. © DLB Associates)
Case Study 3 Energy Efficiency Measures (EEM)
The following best practices were implemented, and their impact on PUE and energy costs was analyzed.
- Partial containment of the cold aisle & installation of blanking panels
  - Prevents mixing of air streams
- Increase temperature in data halls
  - Reduces fan energy in CRAC units (utilizing their VFDs)
  - Increases the number of hours of both partial and full free cooling in integrated free cooling mode
  - Reduces or eliminates cooling load on compressors
- Control CRAC units on cold aisle temperature (instead of return air temperature)
  - Required installation of cold aisle temperature sensors
- Control CRAC units as a whole instead of individually
  - Prevents units "fighting" each other
  - Reduces pump energy
- Balance floor tiles to equalize loads on CRAC units
  - Reduces fan energy in CRAC units
Case Study 3 Energy Efficiency Measures (EEM) CRAC Unit Control Scheme Before (left) & After (right) © DLB Associates
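A minimal sketch of the "control CRAC units as a group on cold-aisle temperature" idea from the list above. The setpoint, gain, speed limits, and sensor handling are illustrative assumptions only; the actual control scheme deployed at this site is not reproduced here.

```python
# Illustrative group control of CRAC fan speeds from cold-aisle sensors.
# Setpoint, gain, and speed limits are placeholder values, not the
# site's actual configuration.

SETPOINT_C = 24.0        # target cold-aisle temperature
KP = 10.0                # proportional gain, % fan speed per degree C of error
MIN_SPEED, MAX_SPEED = 30.0, 100.0

def group_fan_speed(cold_aisle_temps_c, current_speed_pct):
    """Drive all CRAC units to one common fan speed based on the warmest
    cold-aisle sensor, so individual units do not fight each other."""
    worst = max(cold_aisle_temps_c)
    error = worst - SETPOINT_C
    new_speed = current_speed_pct + KP * error
    return max(MIN_SPEED, min(MAX_SPEED, new_speed))

# Example: one warm spot nudges the shared fan speed up slightly.
print(group_fan_speed([23.1, 23.8, 24.4, 23.5], current_speed_pct=60.0))
```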
Case Study 3 Details (Temp. Impact on Chiller Efficiency)
(Chart: Chiller PUE vs. chilled water supply temperature, 44.6°F to 60.8°F [7°C to 16°C], at 47.3°F [8.5°C] OAT. © DLB Associates)
Chiller PUE decreases with increased supply water temperature due to increased partial free cooling.
Case Study 3 Overall Results (PUE Energy Savings)
The changes were expected to reduce the PUE by 0.10 purely from MECHANICAL changes to the existing system. The cost savings from the reduced PUE were projected over time (assuming increased load over time). © DLB Associates
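As a rough order-of-magnitude check on what a 0.10 PUE reduction is worth at this site's 8 MW IT load, assuming full load and an illustrative electricity rate of €0.10/kWh (neither assumption comes from the study, whose own projection also assumes load growth over time):

$$8{,}000\ \text{kW} \times 0.10 \times 8{,}760\ \text{h/yr} \approx 7.0\ \text{GWh/yr} \;\Rightarrow\; 7.0\ \text{GWh/yr} \times €0.10/\text{kWh} \approx €700{,}000/\text{yr}$$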
Case Study 4: Greenfield Colocation Data Center (US Southeast)
Case Study 4 Floor Plan & Metrics
- Raised Floor Area: 140,000 ft² (13,000 m²)
- Raised Floor Height: 36 in (91 cm)
- Cooling System: Modular Chilled Water Plants with Waterside Economizer feeding CRAH Units
- Total IT Load: 10 MW
(Site plan: server area, power plant, office, cooling plant above. © DLB Associates)
Case Study 4 Energy Efficiency Measures (EEM)
A Cooling System Analysis was performed for a greenfield data center located on the Eastern seaboard of the United States to understand the tradeoffs in terms of TCO & reliability. The analysis was applied to 3 different cooling system types to understand:
1) The amount of available free cooling at differing supply air temperatures (higher supply air temperatures mean MORE free cooling)
2) The magnitude of MECHANICAL cooling capacity needed (when outdoor conditions mean that free cooling is NOT suitable)
3) The annual average relative PUE for each cooling system (directly related to relative operating cost)
(Charts: hours per year per dry bulb temperature bin and per wet bulb temperature bin, -20°F to 120°F [-29°C to 49°C]. © DLB Associates)
Case Study 4 Details (Climate Analysis)
The Climate Analysis maps the statistical weather data and the desired operating temperature range inside the data center (IT Envelope) to the defined cooling system type to build a theoretical ANNUAL performance in terms of economizer hours and mechanical system PUE.
Climate Analysis Results (free cooling hours, % of year, and PUE per scenario):
- Scenario 1: Direct Air-Side Economizer, open air flow, 3°F (1.7°C) approach (1); Recommended envelope: 2,… free cooling hrs; A1 Allowable envelope: 5,… free cooling hrs
- Scenario 2: Chillers and CRAHs (Water-Side Economizer), closed air flow, 15°F (8.3°C) approach (1); Recommended envelope: 5,… free cooling hrs; A1 Allowable envelope: 8,… free cooling hrs
- Scenario 3: Indirect Adiabatic Cooling, closed air flow, 3°F (1.7°C) approach at 70% WBDE (1); Recommended envelope: 7,… free cooling hrs; A1 Allowable envelope: 8,… free cooling hrs
(1) For each system type, the approach temperature represents the differential between outdoor air conditions and IT inlet conditions.
© DLB Associates
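A minimal sketch of the climate-analysis step described above: for each hourly weather record, add the cooling system's approach temperature and count the hours for which the resulting supply air stays at or below the chosen IT envelope limit (those are free-cooling hours). The weather series below is a crude placeholder, not the site's weather file or the study's results; the envelope limits are the published ASHRAE Recommended and A1 Allowable maxima.

```python
# Illustrative free-cooling hour count from an hourly weather series.
import math

def free_cooling_hours(weather_f, approach_f, supply_limit_f):
    """Hours per year the outdoor source plus the approach temperature can
    meet the target supply-air limit without mechanical cooling."""
    return sum(1 for t in weather_f if t + approach_f <= supply_limit_f)

# Placeholder "weather": a crude seasonal swing around 60 F (stand-in for a
# real 8,760-hour weather file).
weather_f = [60 + 25 * math.sin(2 * math.pi * h / 8760) for h in range(8760)]

for envelope, limit_f in [("Recommended (80.6 F)", 80.6), ("A1 Allowable (89.6 F)", 89.6)]:
    hrs = free_cooling_hours(weather_f, approach_f=3.0, supply_limit_f=limit_f)
    print(f"{envelope}: {hrs} free-cooling hours ({hrs / 87.60:.0f}% of year)")
```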
Case Study 4 Details (Undersizing & Excursions)
If the maximum MECHANICAL cooling capacity (e.g., chiller tonnage) is UNDERSIZED, then EXCURSIONS occur. The occurrence of excursions is directly related to outdoor conditions. The EXCURSIONS can be considered in terms of annual DURATION (number of hours per year, i.e., how often they occur) and MAGNITUDE (what IT inlet temperature can be expected during the excursion).
At full IT load, sizing a chiller plant at 65% of the maximum tonnage (e.g., 0.65 x 325 tons per 1 MW) will result in:
- ~520 hours ABOVE 75.2°F (24°C), with a MAX temp of 84.2°F (29°C)
- ~150 hours above 78°F (25.5°C)
- ~10 hours above 80.6°F (27°C)
(Chart: cumulative excursion frequency [hours per year] vs. excursion temperature, 75.2°F to 86.0°F [24°C to 30°C], for 65% chiller capacity; max temp 84.2°F [29°C]. © DLB Associates)
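A minimal sketch of the excursion analysis described above: given a modeled year of hourly IT inlet temperatures under an undersized plant, count the cumulative hours above each threshold of interest and report the peak. The inlet profile below is a placeholder constructed only so its totals roughly match the figures quoted on this slide; it is not the study's simulation output.

```python
# Illustrative excursion duration/magnitude summary for one modeled year of
# hourly IT inlet temperatures (placeholder data, not the study's model).

def excursion_summary(inlet_f, thresholds_f):
    peak = max(inlet_f)
    hours_above = {t: sum(1 for x in inlet_f if x > t) for t in thresholds_f}
    return peak, hours_above

# Placeholder inlet profile: mostly 72 F with a short warm tail.
inlet_f = [72.0] * 8240 + [76.0] * 370 + [79.5] * 140 + [82.0] * 8 + [84.2] * 2

peak, hours_above = excursion_summary(inlet_f, [75.2, 78.0, 80.6])
print(f"Peak inlet: {peak:.1f} F")
for t, h in sorted(hours_above.items()):
    print(f"Hours above {t} F: {h}")
```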
Case Study 4 Details (Excursions & Impact on Failure Rate)
From an IT equipment perspective, provided that the maximum temperature remains within the ALLOWABLE envelope, the IMPACT of each excursion is NOT significant. The impact of the DURATION of the excursions can be mapped to annual X-factors to understand the effect on server failures. The table below identifies that a 65% capacity chiller plant translates to a NEGLIGIBLE increase in temperature-related IT equipment failures per year. © DLB Associates
Case Study 4 Results (Summary of Cooling Options)
Allowing elevated temperatures within a data center provides the opportunity to REDUCE or ELIMINATE the amount of MECHANICAL COOLING that is installed. This example shows that even in a location that may experience fairly high ambient temperatures and a fairly humid environment for a significant portion of the year, SIGNIFICANT first cost savings are achievable with a NEGLIGIBLE impact on ITE performance or longevity.
- Direct Airside Cooling
  - An efficient solution, BUT it will require supplemental mechanical cooling if supply air temperatures are to be maintained below the ASHRAE allowable thresholds.
  - In order to control HIGH HUMIDITY, a fully-sized mechanical cooling plant is required; otherwise high humidity conditions would occur (problematic for the facility).
- Chilled Water CRAHs with Waterside Economizer
  - Reasonable efficiency, especially if operated at elevated supply air temperatures.
  - Chillers may be intentionally undersized to achieve good Capex savings (~$ / MW) with negligible impact to ITE reliability.
- Indirect Adiabatic Cooling
  - MOST efficient solution; provides the MOST free-cooling hours and the lowest PUE.
  - Mechanical cooling can be ELIMINATED if supply air temperatures are allowed to rise up to the ASHRAE A1 Allowable threshold on the warmest days of the year.
Case Study 5: Greenfield Mega Internet Data Center (Europe)
Case Study 5: Overview
A European greenfield site was designed with a unique cooling system. Some unique features include:
- Seawater as the primary method of cooling
- Tempering the outlet seawater to minimize environmental impact
- Chillerless design
- Modular design with multiple failsafes
Case Study 5: Floor Plan & Metrics
- Raised Floor Area: 2 x … ft² (2 x … m²)
- Raised Floor Height: 48 in (122 cm)
- Cooling System: Chillerless Seawater Cooling
- Total IT Load: UNDISCLOSED
© DLB Associates
Case Study 5 Seawater Intake Temp. Investigation
The seawater intake temperature was investigated using data taken over several years. The temperature for an existing intake tunnel was compared with a new proposed location at a lower elevation. The proposed location had significantly less excursion time (baseline = 68°F [20°C]) than the existing intake.
(Chart: intake temperature over Year 1, Year 2, Year 3. © DLB Associates)
Case Study 5: Overall Results
A closed water-side cooling system was designed to minimize humidification and contamination concerns. This was a chillerless design with a PUE significantly BELOW 1.1. Seawater provided an economical and environmentally conscious cooling solution that was reliable and exceeded industry trends for cutting-edge PUE.
Case Study Overview: Techniques (applicability by case study 1-5)
General
- CFD modeling to predict outcomes: 1, 3, 4, 5
- Increase supply air temperature: 1, 2, 3, 4, 5
- CHILLERLESS design: 2, 5
- Reduced fan speed (VFDs): 1, 3
Optimize Controls
- CRAH Units & Chiller Plants: 3
- Temp. / Humidity sensor locations: 1, 3, 4
Improve Air Distribution
- Balance tiles / increase floor pressure: 3
- CFD model to review results: 3
Improve Airstream Separation
- Hot / Cold Aisle Containment: 1, 2, 3, 4, 5
- Blanking Panels: 1, 2, 3, 4, 5
Closing Comments
There are many ways to use the information presented in the ASHRAE Thermal Guidelines to save energy, CAPEX and OPEX in your data center. The examples presented today are all operational data centers that are benefiting from implementing higher operating temperatures and/or accepting excursions outside of their set points. Design professionals with ASHRAE expertise reduce the risk involved in balancing CAPEX, OPEX and energy efficiency against mission-critical operation. We hope this will encourage more companies to take advantage of the ASHRAE Recommended and Allowable envelopes.
END OF SLIDES Mark Monroe CTO & VP DLB Associates Consulting Engineers