ITOM 20.3 How IT Decisions Impact Data Center Facilities: The Importance of Collaboration
Lars Strong, P.E., Upsite Technologies, Inc.

Data Center World – Certified Vendor Neutral
Each presenter is required to certify that their presentation will be vendor-neutral. As an attendee, you have the right to enforce this no-sales-pitch policy by alerting the speaker if you feel the session is not being presented in a vendor-neutral fashion. If the issue continues to be a problem, please alert Data Center World staff after the session is complete.

How IT Decisions Impact Data Center Facilities: The Importance of Collaboration
Decisions and actions typically under the jurisdiction of the IT side of data center management can have a profound impact on the mechanical systems, operating costs, and capacity of the data center. By understanding these impacts, IT and facilities management are able to develop a cooperative approach to managing the data center.

Specifying Server Type: Cooling Delta T (∆T)

Decisions about server types affect:
- Cooling unit operating expense
- Cooling unit acquisition capital expense

ΔT Through IT Equipment
Regardless of airflow management in the computer room, the temperature rise of air through the IT equipment is fixed by the equipment itself. Every kilowatt of electricity consumed by IT equipment becomes a kilowatt of heat added to the flow of cooling air through it. There is a fixed relationship between airflow, temperature differential, and heat (energy).

ΔT Through IT Equipment – Heat Transfer Equation
CFM = (3.16 × W) / ΔT
where:
- CFM = cubic feet per minute of airflow through the server
- W = watts of electrical power consumed by the server
- 3.16 = factor for the density of air at sea level, for temperatures in °F
- ΔT = temperature rise of air through the server in °F

IT Equipment Required Flow Rate
IT Equipment Delta T (°F):     15   20   25   30   35   40
Required flow rate (CFM/kW):  211  158  126  105   90   79
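
The relationship is easy to sanity-check in code. A minimal sketch of the slide's formula (Python), reproducing the table above:

```python
# Minimal sketch of the slide's heat-transfer relationship:
# CFM = (3.16 x W) / delta_T, where 3.16 is the sea-level air-density
# factor for temperatures in degrees Fahrenheit.

def required_cfm(watts: float, delta_t_f: float) -> float:
    """Airflow (CFM) needed to carry away `watts` of IT heat at a
    given temperature rise (degrees F) through the equipment."""
    return 3.16 * watts / delta_t_f

# Reproduce the slide's table of required flow rate per kW:
for dt in (15, 20, 25, 30, 35, 40):
    print(f"dT = {dt:2d} F -> {required_cfm(1000, dt):3.0f} CFM/kW")
```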

ΔT Through IT Equipment
- “Pizza box” servers at 20°F ΔT consume 158 CFM/kW
- “Blade” servers at 35°F ΔT consume 90 CFM/kW

ΔT Through IT Equipment
500 kW of “Pizza box” servers:
- 1,000 servers
- Would require 79,000 CFM of chilled air
500 kW of “Blade” servers:
- 1,600 servers (100 chassis)
- Would require 45,140 CFM of chilled air
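
These figures follow directly from the formula above; a quick check:

```python
# Quick check of the slide's numbers with CFM = 3.16 * W / dT:
print(3.16 * 500_000 / 20)  # "Pizza box" at 20 F dT -> 79,000 CFM
print(3.16 * 500_000 / 35)  # "Blade" at 35 F dT     -> ~45,140 CFM
```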

ΔT Through IT Equipment
“Blade” servers (35°F ΔT):
- Servers require 43% less airflow rate
- Cooling unit fans require 81% less energy
- At 80% fan speed (~80% airflow rate), fans draw only ~51% of full fan energy (a 49% reduction)
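
The outsized fan savings come from the fan affinity laws: fan power scales roughly with the cube of airflow. A sketch of that relationship, applied to the slide's numbers (the cube law is an idealization; real fans deviate somewhat):

```python
# Fan affinity law sketch: fan power scales ~ with the cube of airflow.
def fan_power_fraction(flow_fraction: float) -> float:
    return flow_fraction ** 3

print(fan_power_fraction(45_140 / 79_000))  # ~0.19 -> ~81% less fan energy
print(fan_power_fraction(0.80))             # ~0.51 -> ~49% less at 80% speed
```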

Specifying Server Class: ASHRAE Class A1, A2, A3, or A4

Decisions about ASHRAE Server Class affect:
- Access to free cooling hours
- Chiller operating efficiency
- Capital investment for the mechanical plant
- PUE anomalies
- Warranty and associated reliability costs
- Building footprint and real estate investment

ASHRAE Environmental Guidelines

ASHRAE Server Class
How do we know the effect of temperature on IT equipment reliability/life? How long is it OK to operate in the allowable temperature range? The “X” factor:
- Baseline = 24/7 operation at a 68°F server inlet temperature
- Built from OEM historical data on user failure reports
- Captures performance variations above, at, and below the 68°F baseline
- Premise: the data center temperature follows Mother Nature (i.e., inlet temperature tracks the outdoor air, as with economizer cooling)

ASHRAE Server Class
Time-at-temperature weighted failure rate calculation for IT equipment in Chicago (each band's X-factor is weighted by the fraction of hours the inlet spends in that band):

Inlet Temperature     X-factor
59°F ≤ T ≤ 68°F       0.865
68°F ≤ T ≤ 77°F       1.13
77°F ≤ T ≤ 86°F       1.335
86°F ≤ T ≤ 95°F       1.482

Net X-factor = 0.99
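
The net figure is a weighted average. A hedged sketch of the calculation — the slide does not give Chicago's hour fractions, so the weights below are invented placeholders, chosen only so the result lands near the slide's 0.99:

```python
# Hedged sketch of the time-at-temperature weighting. The hour
# fractions are placeholders (the slide omits them), picked so the
# weighted average reproduces the slide's net X-factor of ~0.99.
x_factor = {"59-68 F": 0.865, "68-77 F": 1.13, "77-86 F": 1.335, "86-95 F": 1.482}
hours = {"59-68 F": 0.65, "68-77 F": 0.22, "77-86 F": 0.09, "86-95 F": 0.04}  # assumed
net = sum(x_factor[band] * hours[band] for band in x_factor)
print(f"Net X-factor ~ {net:.2f}")  # ~0.99
```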

ASHRAE Server Class
Time-weighted X-factor estimates for air-side economization in selected U.S. cities, versus the 68°F baseline for all 8,760 hours:

Improved failure rates:       Worse failure rates:
San Francisco   90%           Miami           128%
Seattle         90%           Phoenix         120%
Boston          96%           Houston         113%
Denver          97%           Dallas          110%
Los Angeles     98%           Atlanta         104%
Chicago         99%           Washington DC   101%

Specifying Equipment That Breathes from Front to Back

Decisions on IT equipment breathing patterns affect:
- Integrity of hot/cold aisle separation
- Potential floor density thresholds
- Fan and temperature set points
- Integration of switches and servers

Specify IT Equipment That Breathes Front to Back
For the mechanical health of the data center, all equipment should:
- Be rack-mountable
- Breathe from front to rear
This is a recognized best practice of sources as diverse as:
- European Code of Conduct for Data Centres
- BICSI-002
- ASHRAE TC9.9

Specify IT Equipment That Breathes Front to Back
CFD model – switch flow side to side:
- 3,500 sq. ft., 514 kW
- 3 kW rack with 73°F to 90°F intake air temperatures
- 6 kW switch rack ingesting 56°F to 85°F air (allowable by the manufacturer, but exceeded the internal SLA)
- 72°F return set point, resulting in 54°F supply
- 82,500 CFM cooling supply for 67,900 CFM IT demand

Specify IT Equipment That Breathes Front to Back
CFD model – switch flow front to back:
- 3 kW rack with 75°F to 76°F intake air temperatures
- 6 kW switch rack ingesting 75°F to 76°F air
- Supply temperature increased from 54°F to 75°F: 38% chiller plant energy savings
- Cooling supply reduced from 82,500 CFM to 72,000 CFM: 33.5% CRAH fan energy savings
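
The 33.5% fan figure is consistent with the cube-law relationship sketched earlier:

```python
# CRAH fan saving implied by the cube law for the modeled flow change:
print(1 - (72_000 / 82_500) ** 3)  # ~0.335, i.e., ~33.5% fan energy saved
```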

Specifying Storage Type: Solid State or Tape

Decisions on Solid State vs. Tape Storage affect:
- Access to free cooling
- Total operating energy budget
- Mechanical plant acquisition and construction costs
- Humidity management investment

Solid State vs. Tape Storage
ASHRAE TC9.9 rate-of-temperature-change boundaries:
- Tape storage: 9°F per hour
- Solid state storage: 36°F per hour
Weigh the solid state storage premium against the cost avoidance of acquiring, constructing, and operating a mechanical plant.

Solid State vs. Tape Storage
Therefore, the approximate 3:1 acquisition and maintenance premium associated with solid state storage should be evaluated against the avoided cost of chiller acquisition and installation, as well as the chiller's operating cost over the life of the data center.

Specifying Cooling Unit Set Point

Decisions Specifying Cooling Unit Set Point Affect:
- Chiller efficiency
- Cooling unit efficiency and life (hot and cold cycling)
- Humidity management operating expenses
- Access to free cooling
- General operational discipline

Specifying Cooling Unit Set Point
It is still common to see a 72°F cooling unit return set point today, and lower set points in the range of 68°F are not uncommon, often driven by IT conservatism or hot-spot mitigation.

Specifying Cooling Unit Set Point
Cooling units often cool the air 18°F, resulting in supply temperatures as low as 50°F, potentially causing dew point problems. At 50°F, saturation (100% RH, i.e., condensation) is reached at 55 grains of moisture per pound of dry air, a condition that would be met at any of the following data center control settings:
- 60% RH @ 65°F
- 50% RH @ 70°F
- 45% RH @ 75°F (quite common)
- 36% RH @ 80°F
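
Those settings all sit near the same ~50°F dew point, which can be verified with the Magnus approximation. A sketch (the Magnus coefficients are standard approximations; a psychrometric chart remains the authoritative reference):

```python
import math

# Dew point via the Magnus approximation (the common 17.62 / 243.12
# parameterization; approximate, not chart-accurate).
def dew_point_f(temp_f: float, rh_percent: float) -> float:
    t_c = (temp_f - 32) / 1.8
    gamma = math.log(rh_percent / 100) + 17.62 * t_c / (243.12 + t_c)
    return 243.12 * gamma / (17.62 - gamma) * 1.8 + 32

for temp, rh in [(65, 60), (70, 50), (75, 45), (80, 36)]:
    print(f"{rh}% RH @ {temp} F -> dew point ~{dew_point_f(temp, rh):.0f} F")
```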

Specifying Cages / Layouts That Are Compatible with Containment

Specifying Cages Compatible with Containment affects:
- EVERYTHING!
- Cages and containment don't have to be mutually exclusive
- If the cage is not compatible with containment, there will be a need for extra airflow volume and lower cooling unit set points, resulting in higher operating costs

Cages and Containment
Chimney cabinets provide full containment and can be deployed independent of layout:
- Out of a total of 79 cabinets, 12 problem cabinets were fitted with chimney exhausts
- 38% chiller plant energy savings
- 33.5% CRAH fan energy savings

Airflow Management Best Practices

Adhering to Airflow Management Best Practices Affects:
- EVERYTHING!
- Reduced required airflow rate, which reduces fan energy cost
- Increased cooling unit set points / supply air temperature
- Increased cooling capacity, increasing free cooling hours and lowering chiller energy costs
- Deferred capital expenditure on cooling units, or even on a new data center

Raised Floor Open Area Management
- Seal cable openings
- Seal under power distribution units (PDUs) and other equipment
- Check the perimeter of the raised floor plenum
- No misplaced perforated tiles

Cable Management
- Responsibility of IT
- Cabling commonly impedes conditioned airflow under the raised floor
- Cabling commonly impedes exhaust airflow out of IT equipment

Rack Open Area Management
- Blanking panels
- Rail seals
- Under racks
- Few sites have done this fully
- Cabinet design plays a large role in AFM
- Cabinet AFM design needs to be considered by IT when purchasing cabinets

Row / Aisle Open Area Management
- Gaps between racks
- Missing racks
- Doors on the ends of aisles
- Baffles or a full roof over racks

3 Key Things You Have Learned During This Session
- There are many IT decisions that directly affect facilities management and the overall capacity and operating cost of the data center
- The goal of airflow management is to effectively cool IT equipment with the lowest possible flow rate at the warmest possible temperature
- Benefits of IT and facilities collaborating: reduced operating cost; increased cooling capacity and prolonged data center life; increased job satisfaction

Upsite’s 4 R’s of Airflow Management™
- A holistic approach
- An iterative process
- ‘Check in’ at the room level after making any AFM improvements

Thank You
Lars Strong, P.E.
Senior Engineer, Upsite Technologies
lstrong@upsite.com
Follow Upsite for the latest news and information on data center AFM:
@UpsiteTech | blog.upsite.com | On LinkedIn