Vette LiquiCool™ Solution

Presentation transcript:

Vette LiquiCool™ Solution
Rob Perry, Executive Manager
Arlene Allen, Director, Information Systems & Computing, University of California Santa Barbara

Data Center Trends – Exponential Growth
Explosive demand for services is driving Data Center spend:
- 31 billion Google searches every month – 10X growth in the last 2 years
- 1 billion Internet devices today – 1,000X growth since 1992
- The number of daily text messages exceeds the world population
[Chart: installed volume servers (thousands) in the USA, 2000-2010, showing 3X growth. Source: EPA 2007 Report to Congress]

Data Center Trends – Staggering Energy Consumption and Cost of Energy
- Energy unit price has increased an average of 4% YOY in the USA and 11% YOY globally
- Data Center energy consumption is growing by 12% annually
Source: EPA 2007 Report to Congress

Data Center Trends – Operating Expense Exceeds Capital Expense in Less Than 1 Year
- Data Center facility costs are growing 20% vs. IT spend of 6%
- Operating costs over the lifetime of a server are ~4X the original purchase cost
- Cooling infrastructure can consume up to 55% of Data Center energy
Source: Belady, C., "In the Data Center, Power and Cooling Costs More than IT Equipment it Supports", Electronics Cooling Magazine (Feb 2007)

Increasing Carbon Footprint
- Today, the average Data Center consumes energy equivalent to 25,000 houses
- 90% of large Data Centers will require more power and cooling in the next 3 years
- Without changes, Data Center greenhouse emissions are predicted to quadruple by 2020
Source: McKinsey & Company

UCSB – "The Problem"
UCSB's existing Data Center is being renovated for research computing, forcing the corporate/miscellaneous IT equipment into a new space. This new space is not designed to be a Data Center: the footprint is small, the power is limited by the existing building wiring, and a traditional air-cooling topology is not feasible. The new space's limitations require the load density to increase from a typical 6 kW or less to 10-16 kW per rack.

LiquiCool – "The Solution"
- Move the corporate/miscellaneous IT racks into the new space
- Tap into the existing building chilled water system
- Install Vette LiquiCool Rear Door Heat Exchangers on every rack
- Install a Vette LiquiCool Coolant Distribution Unit for a secondary loop
- Install a rack-mount UPS in every rack

LiquiCool – "The Solution"
LiquiCool™ – A complete cooling solution for the consolidation and scale-out of compute infrastructure in today's sustainable Data Centers:
- Reduces white space requirements by more than 55%
- Cuts cooling energy consumption by 50% or more compared to traditional air-cooled Data Centers
- Allows 8X the amount of compute power in a typical IT enclosure
- Lowers carbon footprint by 50% or more vs. air-cooling
Bottom line: payback in less than 1 year compared to traditional computer room air-conditioning (see the worked example below).
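To make the payback claim concrete, here is a minimal simple-payback sketch in Python. Every figure in it (rack load, CRAC overhead, energy price, hardware cost) is an illustrative assumption, not Vette pricing or a measured result:

```python
# Hypothetical simple-payback sketch; all inputs are assumptions.

def simple_payback_years(capex_usd, annual_savings_usd):
    """Years for cooling-energy savings to repay the hardware spend."""
    return capex_usd / annual_savings_usd

IT_KW = 20.0              # assumed rack IT load
CRAC_OVERHEAD = 0.5       # assumed cooling watts per watt of IT load (air-cooled)
SAVINGS_FRACTION = 0.5    # the slide's claim: cuts cooling energy by 50% or more
PRICE_PER_KWH = 0.12      # assumed energy price, USD

kw_saved = IT_KW * CRAC_OVERHEAD * SAVINGS_FRACTION       # 5.0 kW avoided
annual_savings = kw_saved * 8760 * PRICE_PER_KWH          # ~$5,256 per year
print(f"{simple_payback_years(5000.0, annual_savings):.2f} years")  # ~0.95
```

With those assumptions, a ~$5,000 per-rack cooling spend pays back in under a year; real numbers will vary with load, climate, and tariff.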

LiquiCool – How Does It Work?
- Based on IBM IP & technology licenses (>30 years of water-cooling experience)
- Rear Door Heat Exchanger (RDHx) replaces the existing rear door of the IT enclosure
- RDHx has chilled water Supply & Return quick connections at the bottom OR top
- A raised floor becomes optional
- Chilled water circulates through the tube+fin coil from the Supply connection
- Equipment exhaust air passes through the coil and is cooled before re-entering the room (see the heat-balance sketch below)
[Diagram: fin + tube heat exchanger on the rear of the enclosure; cold supply water in, heated water out]
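Because the door is a simple sensible heat exchanger, both the air and water sides obey Q = m·c_p·ΔT. A minimal Python sketch, using illustrative numbers rather than Vette data, shows the chilled-water flow a given rack load implies:

```python
# Sensible heat balance across a rear-door coil: Q = m_dot * c_p * dT.
# Numbers below are illustrative, not from the RDHx datasheet.

def water_flow_for_load(q_kw, dt_water_k):
    """Chilled-water flow (L/min) needed to absorb q_kw with a dt_water_k rise."""
    cp_water = 4.186      # kJ/(kg*K)
    rho_water = 1.0       # kg/L
    kg_per_s = q_kw / (cp_water * dt_water_k)
    return kg_per_s / rho_water * 60.0

# Example: a 20 kW rack with the water warming 10 K across the door
print(f"{water_flow_for_load(20, 10):.1f} L/min")  # ~28.7 L/min (~7.6 GPM)
```

That result sits comfortably inside the 6-10 GPM range quoted in the specifications later in the deck.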

LiquiCool System
- Passive RDHx provides 100% sensible cooling: no condensation, no need for reheat or humidification
- CDU creates a fully isolated, temperature-controlled secondary loop
- Chilled water source: city water, building chilled water, packaged chiller…
- Secondary loop: 10-17°C (50-63°F), water pressure 30-70 psi
- Primary chilled water: 7°C (45°F), water pressure 100-200 psi
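The condensation-free claim rests on holding the secondary loop above the room's dew point. A short Python sketch using the Magnus dew-point approximation (my illustration, not part of the LiquiCool documentation) shows how to check a candidate supply temperature:

```python
import math

# Magnus approximation for dew point; a common engineering formula.
def dew_point_c(t_c, rh_pct):
    a, b = 17.62, 243.12
    gamma = a * t_c / (b + t_c) + math.log(rh_pct / 100.0)
    return b * gamma / (a - gamma)

room_dp = dew_point_c(24.0, 50.0)   # ~12.9 C for a 24 C / 50% RH room
for setpoint_c in (10.0, 15.0):     # candidate secondary-loop supply temps
    status = "dry (sensible only)" if setpoint_c > room_dp else "condensation risk"
    print(f"{setpoint_c:.0f} C supply: {status}")
```

This is why the CDU blends the loop up to 10-17°C rather than passing the 7°C primary water straight to the doors.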

RDHx – External View
- Passive: no electrical connections, no moving parts, no fans, no power, no noise
- Attaches to the rear: no need to rearrange racks; does not consume valuable floor space, adds only 4-6" to the rear
- Close-coupled: neutralizes heat at the source
[Photos: top-feed and bottom-feed connections]

RDHx – Internal View
- Protective barrier
- Air-bleed valves
- Bottom-feed hose connections and drain valve
- Tube & fin coil

Thermal Image – Before & After: 100% Heat Neutralization

RDHx Cooling in Action
Temperature readings taken at the rear of a fully populated enclosure:
- Rear Door Heat Exchanger door opened – server leaving temp: 102°F (38.9°C)
- Rear Door Heat Exchanger door closed – server leaving temp: 74°F (23.5°C)
RDHx reduces leaving temperature by 28°F (15.4°C)!

RDHx is Compatible with Most Major IT Enclosures
- Industry-standard enclosure
- Remove the existing rack rear door & hinges
- Mount a transition frame (if needed)

RDHx General Specifications
- Max. cooling capacity: 33 kW
- Coolant: chilled water (above dew point)
- Dimensions: 76.6" H x 4.6" D x 23.6" W (1945 mm x 117 mm x 600 mm)
- Weight (empty): 63 lbs (29 kg)
- Liquid volume: 1.5 gallons (5.7 liters)
- Liquid flow rate: 6-10 GPM (23-38 L/min)
- Head loss: 7 psi (48 kPa) at 10 GPM (38 L/min)
- System input power: none required
- Noise: none
- Couplings: drip-free stainless steel quick-connects
- Connection location: bottom or top feed
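As a quick cross-check of these figures (my arithmetic from standard water properties, not datasheet text), the 33 kW maximum at the 38 L/min maximum flow implies roughly a 12.5 K water temperature rise across the door:

```python
# Consistency check: what water dT does 33 kW at 38 L/min imply?
cp, rho = 4.186, 1.0                 # kJ/(kg*K), kg/L for water
q_kw, flow_lpm = 33.0, 38.0          # figures from the spec slide
dt_k = q_kw / (flow_lpm / 60.0 * rho * cp)
print(f"Water dT at max load: {dt_k:.1f} K")   # ~12.5 K
```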

Coolant Distribution Unit (CDU)
- Water-to-water heat exchanger with pumps, controls, and chilled water valve
- Creates an isolated secondary cooling loop
- 100% sensible cooling, no condensation
- Small water volume (tens of gallons): easier to control water quality
- Redundant, fault-tolerant design
- 120 kW or 150 kW capacity; supports 6-12 RDHx
- Optional internal manifold for quick expansion
- SNMP & ModBus communications
- Power consumption: 2.6 kW
- Pump capacity: 63 GPM at 30 psi (240 L/min at 207 kPa)
- Primary head loss: 10.2 psi at 63 GPM (70 kPa at 240 L/min)
- Minimum approach temperature (100% load): 120 kW unit – 12°F (6.7°C); 150 kW unit – 8°F (4.4°C); 63 GPM (240 L/min) on primary and secondary
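A brief sketch of what the approach temperature means for loop planning, under the common reading that the secondary supply cannot get closer to the primary supply than the approach value at full load (the pairing logic here is my assumption; the temperatures come from the slides):

```python
# How warm will the secondary loop run at full load?
PRIMARY_SUPPLY_C = 7.0                        # building chilled water, per the slides
APPROACH_K = {"120kW": 6.7, "150kW": 4.4}     # minimum approach at 100% load

for unit, approach_k in APPROACH_K.items():
    coldest_secondary_c = PRIMARY_SUPPLY_C + approach_k
    print(f"{unit} CDU: coldest secondary supply ~{coldest_secondary_c:.1f} C at full load")
```

Both results land inside the 10-17°C secondary-loop window quoted earlier, which is consistent with the above-dew-point design intent.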

CDU Simplified

Floor-Mount CDU Internal – Front
- Controller
- Brazed plate heat exchanger
- Inverter drive
- Redundant valves
- Reservoir tank
- Redundant variable speed pumps
- Casters and drain

Floor-Mount CDU Internal – Rear
- Optional secondary loop distribution manifold
- Primary-side water filter
- Primary supply and return connections
- Optional secondary loop flex tails

Hose Kits & External Manifolds
- Connect to the flex tails on the CDU secondary side
- ISO B or sweated connections
- Standard & custom configurations
- Each Vette hose kit consists of a flexible Supply hose and a Return hose
- Factory assembled and tested to IBM specifications and standards
- Quick-connect drip-free couplings on one end OR both ends
- Straight hoses for raised-floor environments, right-angle hoses for non-raised-floor environments
- Standard lengths from 3 ft. to 50 ft.

Treatment of Cooling Water
Potential effects of non-treatment:
- Loss of heat transfer
- Reduced system efficiency
- Reduced equipment life
- Equipment failures or leaks
The four main risks: scale, fouling, microbiological growth, corrosion.
De-ionized water without inhibitors is corrosive!

Scenario I – Out of Space
- Add RDHx: double your load per rack
- Eliminate CRAC units
- 56% recovery of white space!

Scenario II – Out of Power/Capacity
- Add RDHx
- Remove (2) CRAC units
- Reduces cooling energy consumption to free up capacity for growth

Scenario III – High Density
- CRAC units can typically provide efficient environmental control only for rack densities of up to 5 kW per rack
- Adding RDHx allows 8X the compute power!

Reference Sites
Warwick University, Coventry, UK:
- Rack Entering Air Temperature (EAT): 80°F, 30% RH – within the revised ASHRAE TC9.9 recommended operating range
- RDHx Entering Water Temperature (EWT): 45°F
- RDHx water flow rate: 10 GPM
National Center for HPC, Taiwan

Reference Sites
Georgia Tech Super Computing Facility – 12 racks at ~24 kW each
[Photos: front and rear views]

Silicon Valley Leadership Group Case Study – Modular Cooling Systems
Sun Microsystems hosted the assessment of four commercially available modular cooling systems for data centers. The four systems were installed at Sun Microsystems' data center in Santa Clara, California: APC InRow RC, IBM/Vette RDHx, Liebert XD, and Rittal LCP+. LBNL (Lawrence Berkeley National Laboratory) initiated the study to investigate the energy implications of commercially available modular cooling systems compared to those of traditional data centers. Each rack density was 10 kW. The RDHx Coefficient of Performance (COP, the ratio of cooling provided by the module to the power required to drive the module) varied from 64.4 to 229.0, compared to 4.7 to 13.1 for the other systems. The RDHx power index (PI, the ratio of the power demand of the cooling module to the compute load) varied from 0.0004 to 0.0009, while competitors' figures were between 0.10 and 0.20, with the liquid-cooled rack at over 0.40. The RDHx COP and PI values indicate dramatically higher energy efficiency than the other systems. The results were released on July 1, 2008, during the Silicon Valley Leadership Group (SVLG) Data Center Energy Summit in Santa Clara, California.
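The two metrics are straightforward ratios; a minimal Python sketch makes the definitions concrete (the definitions come from the slide, but the sample module power is hypothetical, not a measured study result):

```python
# COP and PI as defined in the SVLG study slide.

def cop(cooling_kw, module_power_kw):
    """Cooling delivered by the module per unit of power driving it."""
    return cooling_kw / module_power_kw

def power_index(module_power_kw, compute_load_kw):
    """Module power demand as a fraction of the compute load it serves."""
    return module_power_kw / compute_load_kw

rack_kw, module_kw = 10.0, 0.05   # 10 kW rack; 50 W hypothetical pump share
print(f"COP = {cop(rack_kw, module_kw):.0f}")          # 200
print(f"PI  = {power_index(module_kw, rack_kw):.4f}")  # 0.0050
```

A passive door's only power draw is its share of the CDU pumps, which is why both metrics come out orders of magnitude better than fan-driven modules.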

SVLG "Chill Off" Results
Vette's LiquiCool™ solution led the field in cooling capacity and in cooling efficiency!

LiquiCool – Conclusive Savings for Energy, Space & Cost
- The largest share of Data Center OPEX growth is power and cooling related
- The cost of energy for cooling is a large (and growing) cost component
- Data Center consolidation, virtualization, and advanced hardware technology are driving higher power densities per rack and associated white space constraints
- Traditional air-cooling is increasingly infeasible
- Purchasing decisions can no longer be made solely on CAPEX; TCO must not only be considered, it is core
Value summary:
- Reduces white space requirements by more than 55%
- Cuts cooling energy consumption by 50% or more compared to traditional air-cooled Data Centers
- Allows 8X the amount of compute power in a typical IT enclosure
- Lowers carbon footprint by 50% or more vs. air-cooling
Bottom line: payback in less than 1 year compared to traditional computer room air-conditioning

Thank You