
1 ITM 1.2 How IT Decisions Impact Data Center Facilities: The Importance of Collaboration Lars Strong P.E. Upsite Technologies, Inc.


2 Data Center World – Certified Vendor Neutral
Each presenter is required to certify that their presentation will be vendor-neutral. As an attendee, you have the right to enforce this no-sales-pitch policy by alerting the speaker if you feel a session is not being presented in a vendor-neutral fashion. If the issue continues to be a problem, please alert Data Center World staff after the session is complete.

3 How IT Decisions Impact Data Center Facilities: The Importance of Collaboration
Decisions and actions typically under the jurisdiction of the IT side of data center management can have a profound impact on the mechanical systems, operating costs, and capacity of the data center. By understanding these impacts, IT and facilities management can develop a cooperative approach to managing the data center.

4 Specifying Server Type: Cooling Delta T (ΔT)

5 ΔT Through IT Equipment
Regardless of airflow management in the computer room, the temperature rise of air through the IT equipment is constant (it is set by the equipment's design). Every kilowatt of electricity consumed by IT equipment becomes a kilowatt of heat added to the flow of cooling air through the equipment. There is a fixed relationship between airflow, temperature differential, and heat (energy).

6 ΔT Through IT Equipment – Heat Transfer Equation
CFM = (3.16 × W) / ΔT
CFM = cubic feet per minute of airflow through the server
3.16 = factor for the density of air at sea level, with temperatures in °F
W = watts of heat (electrical load) to be removed
ΔT = temperature rise of air through the server in °F

IT Equipment ΔT (°F):        15    20    25    30    35    40
Required flow rate (CFM/kW): 211   158   126   105   90    79
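As a sketch, the formula and table above can be reproduced in a few lines of Python (the function name and structure are illustrative, not from the presentation):

```python
def required_cfm(kw: float, delta_t_f: float) -> float:
    """Airflow needed to remove `kw` kilowatts of heat at a given
    temperature rise in deg F, using CFM = 3.16 * W / delta-T."""
    return 3.16 * (kw * 1000) / delta_t_f

# Reproduce the slide's table: CFM per kW at each delta-T
for dt in (15, 20, 25, 30, 35, 40):
    print(f"{dt:2d} F -> {required_cfm(1, dt):.0f} CFM/kW")

# The 500 kW comparison from the next slide:
print(required_cfm(500, 20))  # "pizza box" servers: 79,000 CFM
print(required_cfm(500, 35))  # "blade" servers: ~45,140 CFM
```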

7 ΔT Through IT Equipment
“Pizza box” servers at 20 °F ΔT consume 158 CFM/kW
“Blade” servers at 35 °F ΔT consume 90 CFM/kW

8 ΔT Through IT Equipment
500 kW of “pizza box” servers (1,000 servers) would require 79,000 CFM of chilled air
500 kW of “blade” servers (1,600 servers in 100 chassis) would require 45,140 CFM of chilled air

9 ΔT Through IT Equipment
“Blade” servers (35 °F ΔT):
Servers require a 43% lower airflow rate
Cooling unit fans require 81% less energy
At 50% fan speed: 50% airflow rate, but only 12.5% of fan energy (an 87.5% reduction)
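The fan-energy figures follow from the fan affinity laws, under which fan power scales roughly with the cube of flow (speed). A minimal check of the slide's numbers, assuming the cube law holds for the cooling unit fans:

```python
def fan_power_fraction(flow_fraction: float) -> float:
    """Fan affinity law: fan power scales ~ with the cube of flow."""
    return flow_fraction ** 3

# Blade (35 F dT, 90 CFM/kW) vs. pizza-box (20 F dT, 158 CFM/kW):
flow_frac = 90 / 158                              # ~57% of the airflow
print(f"airflow reduction: {1 - flow_frac:.0%}")  # ~43%
print(f"fan energy reduction: {1 - fan_power_fraction(flow_frac):.1%}")

# The slide's 50%-speed example:
print(fan_power_fraction(0.5))  # 0.125, i.e. an 87.5% reduction
```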

10 Specifying Server Class: Class (A1, A2, A3, or A4)

11 ASHRAE Server Class
Different server classes operate reliably at different maximum temperatures, allowing:
More free cooling hours and significantly reduced operating cost
Different chiller operating conditions
Dramatically reduced operating cost, and potentially a significant reduction in the cost to construct a new data center

12 ASHRAE Environmental Guidelines

13 ASHRAE Server Class
Increased failure rates from allowing servers to operate within the allowable range may not be as bad as they seem.
Example: a site in Chicago with 1,000 servers and an expected failure rate of 0.5% (5 failed servers over their lives).
Utilizing free cooling and allowing temperatures to vary within the allowable range raises that failure rate by 5%: only ¼ of a server (fewer than 1 in 1,000 servers) over 4 years.
Dramatically reduced operating cost, and potentially a significant reduction in the cost to construct a new data center
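The slide's failure arithmetic, spelled out (all values are taken from the slide; the variable names are illustrative):

```python
servers = 1000
lifetime_failure_rate = 0.005                        # 0.5% of the fleet
expected_failures = servers * lifetime_failure_rate  # 5 servers
relative_increase = 0.05                             # 5% higher failure rate
extra_failures = expected_failures * relative_increase
print(expected_failures, extra_failures)             # 5.0 and 0.25 servers
```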

14 Specifying Equipment That Breathes from Front to Back

15 Specify IT Equipment That Breathes Front to Back
Not doing so violates hot-aisle/cold-aisle segregation, resulting in the need for lower set points and higher airflow delivery.
Servers are not the only equipment IT is tasked with specifying: switch airflow is also critical.

16 Specify IT Equipment That Breathes Front to Back
For the mechanical health of the data center, all equipment should:
Be rack-mountable
Breathe from front to rear
This is a recognized best practice of sources as diverse as:
European Code of Conduct for Data Centers
BICSI-002
ASHRAE TC9.9

17 Specify IT Equipment That Breathes Front to Back
CFD model – switch flow side to side
3,500 sq. ft., 514 kW
3 kW rack with 73 °F to 90 °F intake air temperatures
6 kW switch rack ingesting 56 °F to 85 °F air (allowable by the manufacturer, but exceeding the internal SLA)
72 °F return set point, resulting in 54 °F supply
82,500 CFM cooling supply for 67,900 CFM IT demand

18 Specify IT Equipment That Breathes Front to Back
CFD model – switch flow front to back
3 kW rack with 75 °F to 76 °F intake air temperatures
6 kW switch rack ingesting 75 °F to 76 °F air
Supply temperature increased from 54 °F to 75 °F: 38% chiller plant energy savings
Cooling supply reduced from 82,500 CFM to 72,000 CFM: 33.5% CRAH fan energy savings
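The 33.5% CRAH fan saving is consistent with the cube-law fan affinity relationship applied to the modeled airflow reduction. A quick check (an assumption of the cube law, not stated in the presentation):

```python
def fan_energy_savings(cfm_before: float, cfm_after: float) -> float:
    """Fan affinity law: fan energy scales ~ with the cube of airflow."""
    return 1 - (cfm_after / cfm_before) ** 3

# CFD model: cooling supply reduced from 82,500 to 72,000 CFM
print(f"{fan_energy_savings(82_500, 72_000):.1%}")  # 33.5%
```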

19 Specifying Storage Type: Solid State or Tape

20 Solid State vs. Tape Storage
Dramatic differences in the required rates of temperature change could impact access to free cooling
Different relative humidity (RH) requirements change access to free cooling and the requirement for humidity management

21 Solid State vs. Tape Storage
Storage architecture decisions are another IT management responsibility.
ASHRAE TC9.9 states a significant difference in the allowable rate of temperature change:
Tape storage: 9 °F per hour
Solid state storage: 36 °F per hour

22 Solid State vs. Tape Storage
Therefore, the approximate 3:1 acquisition and maintenance premium associated with solid state storage should be evaluated against the avoided cost of chiller acquisition and installation, as well as chiller operating cost over the life of the data center.

23 Specifying Cooling Unit Set Point

24 Specifying Cooling Unit Set Point
Best practice is to specify the maximum allowable IT equipment inlet temperature and let the mechanical plant find its own level.
Managing temperature by thermostat set point frequently results in wasted mechanical plant energy, cycling, or heating by cooling equipment.

25 Specifying Cooling Unit Set Point
It is still common to see 72 °F cooling unit return set points today, and lower set points in the range of 68 °F are still not uncommon.
These are often driven by IT conservatism or hot spot mitigation.

26 Specifying Cooling Unit Set Point
Cooling units often cool the air 18 °F, resulting in supply temperatures as low as 50 °F, potentially causing dew point problems.
At 50 °F, 100% RH (condensation) is reached with 55 grains of moisture per pound of dry air, a condition that would be met at any of the following data center control settings:
60% RH @ 65 °F
50% RH @ 70 °F
45% RH @ 75 °F (quite common)
36% RH @ 80 °F
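A rough way to see why those settings hit condensation with a ~50 °F supply is to compute the dew point of each. This sketch uses the Magnus approximation for saturation vapor pressure (an assumption on my part; the presentation itself reasons in grains of moisture per pound of dry air):

```python
import math

def sat_vapor_pressure_hpa(t_c: float) -> float:
    """Magnus approximation for saturation vapor pressure (hPa)."""
    return 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))

def dew_point_f(t_f: float, rh: float) -> float:
    """Dew point (deg F) for a dry-bulb temperature and relative humidity."""
    t_c = (t_f - 32) / 1.8
    p = rh * sat_vapor_pressure_hpa(t_c)     # actual vapor pressure
    g = math.log(p / 6.112)
    td_c = 243.5 * g / (17.67 - g)           # invert the Magnus formula
    return td_c * 1.8 + 32

# The slide's control settings all yield dew points at or above 50 F:
for rh, t in [(0.60, 65), (0.50, 70), (0.45, 75), (0.36, 80)]:
    print(f"{rh:.0%} RH @ {t} F -> dew point {dew_point_f(t, rh):.1f} F")
```

Each setting comes out in the low 50s °F, so a 50 °F supply stream sits at or below the room's dew point, which is the condensation risk the slide describes.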

27 Specifying Cages / Layouts That Are Compatible with Containment

28 Cages and Containment
Cages and containment don't have to be mutually exclusive.
If the cage is not compatible with containment, extra airflow volume and lower cooling unit set points will be needed, resulting in higher operating costs.

29 Cages and Containment
Chimney cabinets provide full containment and can be deployed independent of layout.
Out of a total of 79 cabinets, 12 problem cabinets were fitted with chimney exhausts:
38% chiller plant energy savings
33.5% CRAH fan energy savings

30 Airflow Management Best Practices

31 Airflow Management Best Practices
Adhering to airflow management best practices can reduce the required airflow volume and allow you to raise cooling unit set points, lowering fan energy costs, lowering chiller energy costs, and increasing free cooling hours.

32 Airflow Management Best Practices
Raised floor open area management: 2 types (good and bad)
Good: supply tiles in the cold aisle, in front of IT equipment
Bad: supply tiles in open areas and hot aisles; unsealed cable openings; PDU and perimeter openings
It is easier to seal a cable opening before cabinets and IT equipment are in place.

33 Airflow Management Best Practices
Blanking panels and rail seals: few sites have deployed these fully.
Cabinet design plays a large role in airflow management (AFM); cabinet AFM design needs to be considered by IT when purchasing cabinets.

34 Airflow Management Best Practices
Cable management is the responsibility of IT.
Cabling commonly impedes conditioned airflow under the raised floor.
Cabling commonly impedes exhaust flow out of IT equipment.

35 3 Key Things You Have Learned During This Session
1. There are many IT decisions that directly affect facilities management and the overall capacity and operating cost of the data center.
2. The goal of airflow management is to effectively cool IT equipment with the lowest possible flow rate at the warmest possible temperature.
3. Benefits of IT and facilities collaborating:
   - Reduced operating cost
   - Increased cooling capacity and prolonged data center life
   - Increased job satisfaction

36 Thank You
Lars Strong, P.E.
Senior Engineer, Upsite Technologies
lstrong@upsite.com
Follow Upsite for the latest news and information on data center AFM:
@UpsiteTech
blog.upsite.com
On LinkedIn

