Copyright Green Revolution Cooling

Presentation transcript:

Oil Immersion Cooling: Case Studies of Two Oil and Gas Data Centers
Christiaan Best
Copyright 2012 - Green Revolution Cooling

The CarnotJet™ System
[Diagram: a CarnotJet™ 42U rack holding any vertically mounted OEM server, with Ethernet and power cable guides, a liquid fill line, a rack PDU mount, and a CarnotJet pump module; heat flows from inside the rack to 30°–53° C water outside.]
Install any standard OEM rack server
» Any brand
» CPU and GPU compatible
» Fiber compatible
Submerge into ElectroSafe™ coolant
» Captures 100% of heat
» Requires no air cooling
Intelligent control system
» Heat expelled outside
» Alerts/monitoring software

Disclaimer
Information about installations is restricted under NDA. The format of this presentation is a quote from a customer, followed by GRC's explanation of why that particular benefit is possible. GRC explanations use generic data.

Case Study 1: CGG
Date first publicized: 2012
Size of installation: confidential (but growing)

Power Savings
"What we are seeing is a significant saving in terms of electricity. I would say it's not impossible to go up to a factor of two [power allocated to servers in a given data center]."
Laurent Clerc, VP of Information Technology

Power Savings Example
Typical installation savings: mPUE < 1.03, in any climate
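To make the mPUE claim concrete: mechanical PUE counts only cooling energy against IT energy, ignoring the distribution losses that full PUE includes. A minimal sketch of the arithmetic, where the 25 kW pump draw is an illustrative assumption of mine, not a figure from the slide:

```python
def mpue(it_kw: float, cooling_kw: float) -> float:
    """Mechanical PUE: (IT + cooling energy) / IT energy.
    Counts only mechanical (cooling) overhead, unlike full PUE,
    which also includes power-distribution losses."""
    return (it_kw + cooling_kw) / it_kw

# Illustrative: 1 MW of IT load cooled by pumps and dry towers
# drawing an assumed 25 kW lands under the slide's 1.03 ceiling.
print(mpue(1000, 25))  # 1.025
```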

Retrofit Savings
"We saturated the power envelope of this room by putting twice as many systems as we would normally have, if it had a normal way of cooling systems."
Laurent Clerc, VP of Information Technology

Example Power Infrastructure

Average power:

                           Conventional   Highly Efficient Air   GRC
  Server power             1.0 MW         1.0 MW                 0.85 MW
  Air handler / GRC        0.2 MW         0.1 MW                 0.04 MW
  "Other"                  0.5 MW         0.25 MW                0.09 MW
  Total                    1.7 MW         1.35 MW                0.98 MW

Equipment sizing (since peak is higher than average):

                           Conventional   Highly Efficient Air   GRC
  Battery backup           1.0 MW         1.0 MW                 0.85 MW
  Room power distribution  1.2 MW         1.1 MW
  Generator                1.9 MW         1.45 MW                1.02 MW
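The column totals in the average-power table imply an overall PUE for each design. This sketch recomputes the totals and PUEs from the row values; the PUE figures are derived by me, not stated on the slide:

```python
# Power figures (MW) from the "Average power" table.
designs = {
    "Conventional":         {"servers": 1.00, "cooling": 0.20, "other": 0.50},
    "Highly Efficient Air": {"servers": 1.00, "cooling": 0.10, "other": 0.25},
    "GRC":                  {"servers": 0.85, "cooling": 0.04, "other": 0.09},
}

for name, p in designs.items():
    total = sum(p.values())
    pue = total / p["servers"]  # facility power / IT power
    print(f"{name}: total {total:.2f} MW, PUE {pue:.2f}")
```

Running this reproduces the table's totals (1.70, 1.35, 0.98 MW) and shows the implied PUEs of 1.70, 1.35, and roughly 1.15.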

How to Nearly Double Your Data Center Server Capacity
1. Retrofit with GRC: servers take ½ the floor space
2. Reallocate MEP power to servers (add battery backup if necessary)
3. Add 80% more servers into the same data center
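The "80% more servers" figure can be approximated from the example power-infrastructure numbers. A rough sketch of the arithmetic, with the derivation being mine rather than the slide's:

```python
facility_mw = 1.70              # total power envelope of the conventional room
grc_it, grc_total = 0.85, 0.98  # GRC column: 0.85 MW of IT needs 0.98 MW total

# Immersed IT power the same 1.7 MW envelope can carry after retrofit:
it_after = facility_mw * grc_it / grc_total   # about 1.47 MW

# Removing fans cuts each server's draw ~15% (1.00 -> 0.85 MW for the same
# fleet), so server count scales as it_after / 0.85 versus the original
# 1.00 MW air-cooled fleet:
count_ratio = it_after / 0.85
print(f"about {count_ratio:.2f}x the servers")  # ~1.73x, near "nearly double"
```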

Data Center Operations
"Today there is not really much difference between managing this room [with the CarnotJet] and managing another traditional air-based computer [room]. … there's no noise, almost no noise, the temperature is very reasonable, there's no air draft, … makes for a much nicer environment overall."
Laurent Clerc, VP of Information Technology

Operations
[Video: removing a server from a rack]
Deployed on 4 continents
Some of the world's biggest data center operators
All of the "check boxes" filled: everything from data center labor unions to building codes to seismic reinforcement

Case Study 2
Start date: 2014
Size: 8 PFLOPS and growing

Efficiency
"The GRC system is reducing our cooling energy consumption by up to 90%, bringing down our total energy cost by around 35%."
Dr. Stuart Midgley, CTO

Reliability
"Any disruption to our data processing can delay client projects worth millions of dollars. That is why we choose the Green Revolution Cooling system."
"The CarnotJet system has proven to be exceptionally reliable and is helping reduce the frequency of failures."
Dr. Stuart Midgley, CTO

A More Reliable Cooling System
» Integrated, prepackaged cooling system
» State-of-the-art monitoring and controls
» Precisely engineered with guaranteed capacity
» Average of more than 99.998% uptime
» Inherently reliable architecture
» Very simple system: one moving part inside the data center for every 100 kW of power
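For a sense of scale, 99.998% uptime works out to roughly ten minutes of downtime per year. The conversion is standard arithmetic, not something stated on the slide:

```python
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def downtime_minutes(uptime_pct: float) -> float:
    """Expected downtime per (non-leap) year for a given uptime percentage."""
    return MINUTES_PER_YEAR * (1 - uptime_pct / 100)

print(f"{downtime_minutes(99.998):.1f} min/yr")  # ~10.5 minutes
```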

More Reliable Servers
» No hot spots & limitless rack power density
» More than 100 kW per rack
» ±3 °C, data-center-wide
» Improved electrical connector reliability
» No fans
» Many other reasons

A Quick Summary
» ½ the power
» Almost 2× the number of servers in your current facility
» More reliable cooling
» More reliable servers

A Select List of Other Customers
» Global Telecom [Redacted]
» [one of the world's largest ]
» [one of world's largest telecoms ]
» [In Top 10 of most powerful computing centers]
» [In Top 10 of most powerful computing centers]
» [In Top 100 of most powerful computing centers]

For Those Serious About Reducing Data Center Cost
Unmatched efficiency. Unmatched capital savings. Unmatched simplicity.
Christiaan Best, CEO
c. 512-771-2902
o. 512-692-8003
christiaan.best@grcooling.com