ITOM 20.3: How IT Decisions Impact Data Center Facilities: The Importance of Collaboration
Lars Strong, P.E., Upsite Technologies, Inc.
Data Center World – Certified Vendor Neutral
Each presenter is required to certify that their presentation will be vendor-neutral. As an attendee, you have the right to enforce this policy of having no sales pitch within a session by alerting the speaker if you feel the session is not being presented in a vendor-neutral fashion. If the issue continues to be a problem, please alert Data Center World staff after the session is complete.
How IT Decisions Impact Data Center Facilities: The Importance of Collaboration
Decisions and actions typically under the jurisdiction of the IT side of data center management can have a profound impact on the mechanical systems, operating costs, and capacity of the data center. By understanding these impacts, IT and facilities management are able to develop a cooperative approach to managing the data center.
Specifying Server Type: Cooling Delta T (∆T)
Decisions about server types affect:
Cooling unit operating expense
Cooling unit acquisition capital expense
ΔT Through IT Equipment
Regardless of airflow management in the computer room, the temperature rise of the airflow through the IT equipment is constant. Every kilowatt of electricity consumed by IT equipment becomes a kilowatt of heat added to the flow of cooling air through the equipment. There is a fixed relationship between airflow, temperature differential, and heat (energy).
ΔT Through IT Equipment – Heat Transfer Equation
CFM = (3.16 x W) / ΔT

where:
CFM = cubic feet per minute of airflow through the server
W = power consumed by the server, in watts
3.16 = factor for the density of air at sea level, in relation to °F
ΔT = temperature rise of air through the server, in °F

IT Equipment Required Flow Rate
IT equipment delta T (°F):      15    20    25    30    35    40
Required flow rate (CFM/kW):   211   158   126   105    90    79
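A minimal sketch of this relationship in Python (function and variable names are illustrative, not from the presentation):

def required_cfm(power_kw, delta_t_f):
    """Airflow (CFM) needed to remove a given IT load at a given delta T.
    CFM = 3.16 x W / delta_T, with W in watts and delta_T in deg F (sea level).
    """
    watts = power_kw * 1000.0
    return 3.16 * watts / delta_t_f

# Reproduce the table above: required flow rate per kW at each delta T
for dt_f in (15, 20, 25, 30, 35, 40):
    print(dt_f, round(required_cfm(1, dt_f)))  # 211, 158, 126, 105, 90, 79

# The worked examples on the next slides:
print(round(required_cfm(500, 20)))  # 500 kW of "pizza box" servers: 79,000 CFM
print(round(required_cfm(500, 35)))  # 500 kW of "blade" servers: 45,143 CFM (the deck rounds to 45,140)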
ΔT Through IT Equipment
“Pizza box” servers at 20 °F ΔT consume 158 CFM/kW
“Blade” servers at 35 °F ΔT consume 90 CFM/kW
ΔT Through IT Equipment
500 kW of “pizza box” servers (1,000 servers) would require 79,000 CFM of chilled air.
500 kW of “blade” servers (1,600 servers in 100 chassis) would require 45,140 CFM of chilled air.
ΔT Through IT Equipment
“Blade” servers (35 °F ΔT):
Servers require a 43% lower airflow rate
Cooling unit fans require 81% less energy, since fan power scales with roughly the cube of the flow rate (see the sketch below)
At 80% fan speed: ~80% of the airflow rate but only ~51% of the fan energy (a 49% reduction)
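The savings figures follow from the fan affinity laws, under which fan power scales with the cube of the airflow rate. A short sketch assuming that cube law (the numbers are from the slides; the code itself is illustrative):

def fan_power_fraction(flow_fraction):
    """Fan affinity law: fan power scales with the cube of the flow rate."""
    return flow_fraction ** 3

blade_flow = 45_143 / 79_000  # blade vs. pizza-box airflow from the earlier example
print(f"{1 - blade_flow:.0%} less airflow")                         # ~43%
print(f"{1 - fan_power_fraction(blade_flow):.0%} less fan energy")  # ~81%

# Slowing fans to 80% speed gives ~80% of the flow for ~51% of the energy
print(f"{fan_power_fraction(0.8):.0%} of fan energy at 80% speed")  # 51%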
Specifying Server Class: A1, A2, A3, or A4
Decisions about ASHRAE Server Class affect:
Access to free cooling hours
Chiller operating efficiency
Capital investment for the mechanical plant
PUE anomalies
Warranty and associated reliability costs
Building footprint and real estate investment
ASHRAE Environmental Guidelines
ASHRAE Server Class: The “X” Factor
How do we know the effect of temperature on IT equipment reliability and life? How long is it OK to operate in the allowable temperature range?
Baseline = 24/7 operation at a 68 °F server inlet temperature
Based on OEM historical data from user failure reports
Captures performance variations above, at, and below the 68 °F baseline
Premise: the data center temperature follows Mother Nature
ASHRAE Server Class
Time-at-temperature weighted failure rate calculation for IT equipment in Chicago
[Table: inlet temperature bands (59 °F ≤ T ≤ 68 °F, 68 °F ≤ T ≤ 77 °F, 77 °F ≤ T ≤ 86 °F, 86 °F ≤ T ≤ 95 °F) with the “X” factor and percentage of hours for each band; the individual values did not survive extraction]
Net X-factor = 0.99
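The net X-factor is the hours-weighted average of the per-band X-factors. A minimal sketch of that calculation; the band values below are invented for illustration, since the slide's actual Chicago figures did not survive extraction:

# Hypothetical values -- NOT the slide's Chicago data
bands = [
    # (inlet temperature band, x-factor, fraction of annual hours)
    ("59-68 F", 0.91, 0.40),
    ("68-77 F", 1.00, 0.35),
    ("77-86 F", 1.13, 0.20),
    ("86-95 F", 1.31, 0.05),
]

net_x = sum(x_factor * hours for _, x_factor, hours in bands)
print(f"Net X-factor = {net_x:.2f}")  # ~1.0: failure rate comparable to a constant 68 F baseline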
ASHRAE Server Class
Time-weighted X-factor estimates for air-side economizers in selected U.S. cities, versus the 68 °F baseline, over 8,760 hours

Improved failure rates:
San Francisco 90%
Seattle 90%
Boston 96%
Denver 97%
Los Angeles 98%
Chicago 99%

Worse failure rates:
Miami 128%
Phoenix 120%
Houston 113%
Dallas 110%
Atlanta 104%
Washington DC 101%
Specifying Equipment That Breathes from Front to Back
Decisions on IT equipment breathing patterns affect:
Integrity of hot/cold aisle separation
Potential floor density thresholds
Fan and temperature set points
Integration of switches and servers
Specify IT Equipment That Breathes Front to Back
For the mechanical health of the data center, all equipment should be rack-mountable and breathe from front to rear. This is a recognized best practice of sources as diverse as:
European Code of Conduct for Data Centers
BICSI-002
ASHRAE TC9.9
Specify IT Equipment That Breathes Front to Back
CFD model, switch flow side to side:
3,500 sq. ft., 514 kW
3 kW rack with 73 °F to 90 °F intake air temperatures
6 kW switch rack ingesting 56 °F to 85 °F air (allowable by the manufacturer, but exceeding the internal SLA)
72 °F return set point, resulting in 54 °F supply
82,500 CFM cooling supply for 67,900 CFM IT demand
Specify IT Equipment That Breathes Front to Back
CFD model, switch flow front to back:
3 kW rack with 75 °F to 76 °F intake air temperatures
6 kW switch rack ingesting 75 °F to 76 °F air
Supply temperature increased from 54 °F to 75 °F, yielding 38% chiller plant energy savings
Cooling supply reduced from 82,500 CFM to 72,000 CFM, yielding 33.5% CRAH fan energy savings
Specifying Storage Type: Solid State or Tape
Decisions on Solid State vs Tape Storage affect:
Access to free cooling
Total operating energy budget
Mechanical plant acquisition and construction costs
Humidity management investment
Solid State vs Tape Storage
ASHRAE TC9.9 rate-of-temperature-change boundaries:
Tape storage: 9 °F per hour
Solid state storage: 36 °F per hour
Weigh the solid state storage premium against the cost avoidance of acquiring, constructing, and operating a mechanical plant.
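A sketch of how these boundaries might be applied in practice, for example when flagging temperature ramps in monitoring data. Only the two limits are from the slide; the function and example readings are illustrative:

# ASHRAE TC9.9 rate-of-change boundaries cited above, in deg F per hour
RATE_LIMITS_F_PER_HR = {"tape": 9.0, "solid_state": 36.0}

def ramp_within_limit(start_f, end_f, hours, media):
    """True if a temperature ramp stays inside the boundary for the media type."""
    rate = abs(end_f - start_f) / hours
    return rate <= RATE_LIMITS_F_PER_HR[media]

# A 12 deg F rise over one hour is fine for solid state but violates the tape limit
print(ramp_within_limit(65, 77, 1, "solid_state"))  # True
print(ramp_within_limit(65, 77, 1, "tape"))         # False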
Solid State vs Tape Storage
Therefore, the approximate 3:1 acquisition and maintenance premium associated with solid state storage should be evaluated against the cost avoidance of chiller acquisition and installation, as well as chiller operating cost over the life of the data center.
Specifying Cooling Unit Set Point
Decisions about Cooling Unit Set Points affect:
Chiller efficiency
Cooling unit efficiency and life (hot and cold cycling)
Humidity management operating expenses
Access to free cooling
General operational discipline
Specifying Cooling Unit Set Point
It is still common to see a 72 °F cooling unit return set point today, and lower set points in the range of 68 °F are not uncommon. These are often driven by IT conservatism or hot spot mitigation.
Specifying Cooling Unit Set Point
Cooling units often cool the air 18 °F, resulting in supply temperatures as low as 50 °F and potentially causing dew point problems. At 50 °F, 100% RH (condensation) is reached at 55 grains of moisture per pound of dry air, a condition that would be met at any of the following data center control settings:
60% RH at 65 °F
50% RH at 70 °F
45% RH at 75 °F (quite common)
36% RH at 80 °F
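A rough psychrometric check of those pairings, using the Magnus approximation at sea-level pressure (an illustrative sketch, not a design tool):

import math

def grains_per_lb(temp_f, rh_percent, pressure_hpa=1013.25):
    """Moisture content in grains of water per pound of dry air at a given
    dry-bulb temperature and relative humidity (Magnus approximation)."""
    temp_c = (temp_f - 32) / 1.8
    sat_vp_hpa = 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))
    vp_hpa = sat_vp_hpa * rh_percent / 100.0
    humidity_ratio = 0.622 * vp_hpa / (pressure_hpa - vp_hpa)  # lb water / lb dry air
    return humidity_ratio * 7000.0  # 7,000 grains per pound

print(round(grains_per_lb(50, 100)))  # saturation at a 50 F supply: ~53 grains/lb
for temp_f, rh in [(65, 60), (70, 50), (75, 45), (80, 36)]:
    print(temp_f, rh, round(grains_per_lb(temp_f, rh)))  # each pair lands near ~55 grains/lb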
Specifying Cages / Layout That is Compatible with Containment
Specifying Cages Compatible with Containment affects:
EVERYTHING!
Cages and containment don’t have to be mutually exclusive.
If the cage is not compatible with containment, there will be a need for extra airflow volume and lower cooling unit set points, resulting in higher operating costs.
Cages and Containment
Chimney cabinets provide full containment and can be deployed independent of layout.
Out of a total of 79 cabinets, 12 problem cabinets were fitted with chimney exhausts, yielding:
38% chiller plant energy savings
33.5% CRAH fan energy savings
Airflow Management Best Practices
Adhering to Airflow Management Best Practices Affects:
EVERYTHING!
Reduced required airflow rate, reducing fan energy cost
Increased cooling unit set points / supply air temperature, increasing cooling capacity, increasing free cooling hours, and lowering chiller energy costs
Deferred capital expenditure on cooling units, or even on a new data center
Raised Floor Open Area Management
Seal cable openings
Seal under power distribution units (PDUs) and other equipment
Check the perimeter of the raised floor plenum
No misplaced perforated tiles
Cable Management: Responsibility of IT
Cabling commonly impedes conditioned airflow under the raised floor.
Cabling also commonly impedes exhaust airflow out of the IT equipment.
Rack Open Area Management
Blanking panels
Rail seals
Sealing under racks
Few sites have done this fully.
Cabinet design plays a large role in AFM; cabinet AFM design needs to be considered by IT when purchasing cabinets.
Row / Aisle Open Area Management
Gaps between racks
Missing racks
Doors on the ends of aisles
Baffles or a full roof over racks
3 Key Things You Have Learned During this Session
1. There are many IT decisions that directly affect facilities management and the overall capacity and operating cost of the data center.
2. The goal of airflow management is to effectively cool IT equipment with the lowest possible flow rate at the warmest possible temperature.
3. Benefits of IT and facilities collaborating:
Reduced operating cost
Increased cooling capacity and prolonged data center life
Increased job satisfaction
Upsite’s 4 R’s of Airflow Management™
A holistic approach
An iterative process
‘Check in’ at the room level after making any AFM improvements
Thank you!
Lars Strong, P.E., Senior Engineer, Upsite Technologies
Follow Upsite for the latest news and information on data center AFM: @UpsiteTech, blog.upsite.com, and on LinkedIn.