Closing the Gap to Free Cooling in the Data Center

Presentation transcript:

Closing the Gap to Free Cooling in the Data Center Ralph Wescott Data Center Manager May 30, 2012 PNNL-SA-86058 (2)

PNNL: a multiprogram research laboratory. FY 2011 business volume of $1.105B; 4,600 staff; 10,000 networked devices; unique laboratory equipment; scientific supercomputing. PNNL has a unique operating contract with DOE that enables private research. EMSL is a unique user facility.

ISB2/1 Perimeter CRAC Cooling: a typical 3,000 sq ft perimeter-CRAC-cooled room with 2 x 30 ton water-side economizer CRACs and 4 x 20 ton dry coolers. This layout is being established to help enclose the cold aisle and improve the PUE even further.

CSF/1811 Floorplan: a 10,000 sq ft data center.

Chilled Door Exploded View Source: Motivair Corporation

RDHx with doors closed

RDHx with doors open

Top view of the Rear Door Heat Exchangers in the open position. Each flows 13.6 gpm and removes up to 20 kW of heat. Air leaving the doors is room neutral.
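A quick way to sanity-check those per-door figures is the common water-side rule of thumb Q[BTU/hr] ≈ 500 × gpm × ΔT[°F]; a minimal sketch (the constant assumes ordinary chilled-water properties):

```python
# Rough cross-check of the per-door figures above, using the common water-side
# rule of thumb Q[BTU/hr] ~= 500 * gpm * dT[F].

BTU_PER_HR_PER_KW = 3412.14

flow_gpm = 13.6   # flow per rear-door heat exchanger (from the slide)
heat_kw = 20.0    # maximum heat removed per door (from the slide)

delta_t_f = heat_kw * BTU_PER_HR_PER_KW / (500.0 * flow_gpm)
print(f"Implied water temperature rise per door: {delta_t_f:.1f} F")  # ~10 F
```

The implied rise of roughly 10 F is consistent with the RDHx rise quoted in the final slide.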

CSF/1811 Cooling Flow: cooling flow and sensor points. Details of the sensors are on the next page, along with measurements averaged over a 24-hour period on 5/22/2012.

Energy Share Calculations: sensor readings averaged over 24 hrs on 5/22/2012.

 1   62.2 kW       Chiller/Compressor 1
 2   0.0 kW        Chiller/Compressor 2
 3   53.6 gpm      Chilled Water Flow to CRACs
 4   64.54 F       Chilled CRAC Water Return Temp
 5   45.62 F       Chilled CRAC Water Supply Temp
 6   3.0 kW        Chilled Water Pump 1
 7   1.38 kW       Chilled Water Pump 2
 8   815.69 gpm    Condenser (Closed) Loop Water Flow
 9   74.0 F        RDHx Loop Return Temp
10   63.67 F       RDHx Loop Supply Temp
11   111.88 gpm    RDHx Loop Water Flow Rate
12   3.83 kW       Condenser (Closed) Loop Pump 1
13   2.4 kW        Condenser (Closed) Loop Pump 2
14   1.8 kW        RDHx Loop Pump 3 (dedicated)
15   (no reading)  RDHx Loop Pump 4 (dedicated)
16   70.69 F       Condenser (Closed) Loop Return Temp
17   63.38 F       Condenser (Closed) Loop Supply Temp
18   58.97 F       Outside Air Temp (24-hr average)
19   22.23 kW      Well Pump 1
20   17.8 kW       Well Pump 2
21   (no reading)  Well Pump 3
22   21.84 kW      Well Pump 4
23   63.83 F       Condenser Chiller 1 Intake Temp
24   70.64 F       Condenser Chiller 1 Exit Temp
25   624 gpm       Condenser Chiller 1 Cooling Flow
26   n/a           Condenser Chiller 2 Intake Temp
27   (no reading)  Condenser Chiller 2 Exit Temp
28   0.0 gpm       Condenser Chiller 2 Cooling Flow

These readings fed the calculations used to apportion the fair share of the cooling costs (in kW) between the RDHx and the CRACs. Power consumption by the lights was ignored as trivial for this comparison; consumption by transformers and UPSs is included in the measuring points.
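One plausible reconstruction of the water-side portion of those calculations, again using Q[BTU/hr] ≈ 500 × gpm × ΔT; the flow and temperature values come straight from the table above, but treating this as the complete energy-share method is an assumption:

```python
# Heat removed by each water loop, from the 24-hr averaged sensor readings above.
# Rule of thumb: Q[BTU/hr] ~= 500 * gpm * dT[F];  1 kW = 3412.14 BTU/hr.

def loop_heat_kw(flow_gpm: float, return_f: float, supply_f: float) -> float:
    """Heat picked up by a water loop, in kW."""
    return 500.0 * flow_gpm * (return_f - supply_f) / 3412.14

crac_kw = loop_heat_kw(53.6, 64.54, 45.62)    # sensors 3-5: chilled water to the CRACs
rdhx_kw = loop_heat_kw(111.88, 74.0, 63.67)   # sensors 9-11: RDHx loop

print(f"CRAC chilled-water loop: {crac_kw:.1f} kW")  # ~148.6 kW
print(f"RDHx loop: {rdhx_kw:.1f} kW")                # ~169 kW
```

Both results line up with the heat-removal figures on the next slide, which supports reading the table this way.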

Contrast/Comparison

ISB2/1 GP Data Center:
CRACs: 99.3 kW to operate, 181.9 kW heat removed, Efficiency Ratio = 1.82
ISB2/1 Overall PUE = 1.54: (IT + CRACs)/IT = (181.9 + 99.3)/181.9 = 1.54

CSF/1811 Computer Lab:
RDHx: 19.5 kW to operate, 169.3 kW heat removed, Efficiency Ratio = 8.7
CRACs: 42.5 kW to operate, 148.6 kW heat removed, Efficiency Ratio = 3.5
CSF Overall PUE = 1.18: (IT + RDHx + CRACs)/IT = (336 + 19.5 + 42.5)/336 = 1.18

Wasted energy from UPSs and transformers is included; lights are not.

Of note in our application:
Groundwater-cooled CRACs are 2 times more efficient than air-cooled CRACs.
RDHx are 2.5 times more efficient than groundwater-cooled CRACs.
RDHx are 5 times more efficient than air-cooled CRACs.

Efficiency Ratio = how many kW of heat are removed per kW of energy applied.
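The efficiency ratios and PUE values above follow directly from the definitions given on the slide; a minimal sketch using only the slide's own numbers (for ISB2/1 the slide takes the 181.9 kW of heat removed as the IT load):

```python
# Efficiency Ratio = kW of heat removed per kW of cooling energy applied.
# Partial PUE as defined on the slide: (IT load + cooling power) / IT load.

def efficiency_ratio(heat_removed_kw: float, cooling_power_kw: float) -> float:
    return heat_removed_kw / cooling_power_kw

def pue(it_kw: float, cooling_power_kw: float) -> float:
    return (it_kw + cooling_power_kw) / it_kw

# ISB2/1 general-purpose data center (water-side economizer CRACs)
print(efficiency_ratio(181.9, 99.3))   # ~1.8
print(pue(181.9, 99.3))                # ~1.55

# CSF/1811 computer lab (RDHx plus groundwater-cooled CRACs)
print(efficiency_ratio(169.3, 19.5))   # ~8.7
print(efficiency_ratio(148.6, 42.5))   # ~3.5
print(pue(336.0, 19.5 + 42.5))         # ~1.18
```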

Capturing and tracking environmental data will become more important for maximizing efficiency and choosing the most effective cooling techniques for your locality. This is an example of the data available from modern building management software, which can feed DCIM applications to help measure data center efficiency and evaluate energy-efficiency improvements. We are enclosing our cold aisle in ISB2/1 (June 2012) and will be able to compare cooling costs before and after, giving us an immediate idea of the ROI for this project. That delta will also give us our carbon savings in a verifiable and easy-to-calculate manner.
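As a minimal sketch of how that before/after delta could be turned into dollar and carbon figures: the electricity rate, emissions factor, and the post-enclosure cooling load below are illustrative placeholders, not PNNL numbers.

```python
# Convert a measured drop in cooling power (before vs. after enclosing the cold aisle)
# into annual cost and carbon savings.  Rate, emissions factor, and the "after" load
# are placeholder values for illustration only.

HOURS_PER_YEAR = 8760
ELECTRICITY_RATE = 0.06    # $/kWh, assumed
EMISSIONS_FACTOR = 0.4     # kg CO2 per kWh, assumed grid average

def annual_savings(cooling_kw_before: float, cooling_kw_after: float):
    delta_kwh = (cooling_kw_before - cooling_kw_after) * HOURS_PER_YEAR
    return delta_kwh * ELECTRICITY_RATE, delta_kwh * EMISSIONS_FACTOR

# 99.3 kW is the ISB2/1 CRAC operating power from the earlier slide; 90.0 kW is a made-up "after" value.
dollars, kg_co2 = annual_savings(cooling_kw_before=99.3, cooling_kw_after=90.0)
print(f"~${dollars:,.0f}/yr and ~{kg_co2:,.0f} kg CO2/yr avoided")
```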

DCIM Software in ISB2/1: a screen shot of ISB2/1 taken 5/23/2012. The cooling kW and PUE graphs are saw-toothed because the compressors cycle on and off. Using air as a heat sink, efficiency is better in the winter and worse in the summer. We are updating this chart to reflect the recent removal of a 300 kVA transformer and older UPSs, which eliminated the 19.98 kW energy loss attributable to that equipment.

Final Thoughts

Groundwater benefits:
Very efficient for CRACs and especially RDHx.
Avoids a super-hot data center.
Avoids free-air contamination and security issues.

Immediate plans:
Centralize the building controls database for ease of access.
Expand DCIM applications to CSF (June 2012).
Increase metering to lights and CRACs for CSF.
Explore "touch cooling" for CPUs and RAM: it uses the existing piping installed for the RDHx at a much reduced flow, returns 120 F water to building operators, and, at current flow rates, achieves roughly six times the kW cooled.

PNNL is standardizing on Alerton controls for all new buildings and converting older buildings over; this will enable a much more robust, centralized SQL database for this information. The current RDHx rise is 10 F; with touch cooling raising the return to 120 F, the same flow will yield roughly 6 times the heat dissipated, plus the option of reusing that 120 F water for pre-heating labs during winter. Wet labs require 6 full air exchanges per hour by code, so heating and cooling that much air all day long is very expensive.
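The roughly six-fold figure follows from the same flow-times-temperature-rise relationship used earlier: at a fixed flow rate, the heat carried scales with the water temperature rise. A minimal sketch, assuming the roughly 63 F loop supply temperature from the sensor table as the inlet in both cases:

```python
# Why touch cooling at the same flow yields roughly 6x the heat dissipated:
# at a fixed flow rate, heat carried scales with the water temperature rise.

supply_f = 63.4                    # approximate loop supply temperature (assumed from sensor 17)
rdhx_return_f = supply_f + 10.0    # current RDHx rise of ~10 F
touch_return_f = 120.0             # target return temperature with touch cooling

scaling = (touch_return_f - supply_f) / (rdhx_return_f - supply_f)
print(f"Heat dissipated at the same flow: ~{scaling:.1f}x")  # ~5.7x, i.e. roughly 6x
```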

Ralph Wescott ralph.wescott@pnnl.gov 509-372-6901