Potomac AFCOM Quarterly Meeting Hosted by: DuPont Fabros Technology.

Presentation transcript:

Potomac AFCOM Quarterly Meeting Hosted by: DuPont Fabros Technology

DFT’s ACC7 Data Center

DuPont Fabros Technology Data Centers: 4 Markets, 11 Data Centers, 240 MW Critical Load

COMPANY OVERVIEW:
• Founded in 1997, IPO in 2007 (NYSE: DFT)
• Owner, developer, and operator of highly reliable, highly efficient, carrier-neutral wholesale data centers
• Eleven (11) operating data centers with 240 megawatts (MW) of critical load capacity
• Largest clients include Facebook, Microsoft, Yahoo!, and Rackspace

OPERATING PROPERTIES:
Northern Virginia: ACC2 – 53,000 RSF; 10.4 MW | ACC3 – 80,000 RSF; 13.9 MW | ACC4 – 172,000 RSF; 36.4 MW | ACC5 – 176,000 RSF; 36.4 MW | ACC6 – 130,000 RSF; 26.0 MW | ACC7¹ – 246,000 RSF; 41.6 MW | VA3 – 147,000 RSF; 13.0 MW | VA4 – 90,000 RSF; 9.6 MW
Piscataway, NJ: NJ1¹ – 176,000 RSF; 36.4 MW
Chicago, IL: CH1 – 231,000 RSF; 36.4 MW
Santa Clara, CA: SC1¹ – 176,000 RSF; 36.4 MW

¹ Includes all phases of construction in totals
RSF: Raised Square Feet; MW: Megawatts

Agenda
➔ Data Center Industry Trends
➔ Project Goals – Unraveling Trends
➔ ACC7 Building Overview
➔ Chiller Assist
➔ MV & the Isolated-Parallel UPS
➔ Proof of Concept

Industry Trends
Examples:
• Higher operating temperatures for servers, enabled by relaxed ASHRAE limits
• Aisle containment – tighter management of hot/cold air
• All-overhead solutions (power/network/grounding) eliminate the need for raised flooring
• Auto-sensing power supplies don't need 120/208 V (240 V)
• Product application: incorporating new products available on the market (MV)
• Greater security & audit requirements
• Modular building approach/containers

Project Goals
Primary Goals: Create a design that addresses industry trends by developing a product that:
1. Is cheaper to build on a per-megawatt basis
2. Has lower maintenance costs
3. Has industry-leading PUE
LOW BUILD COSTS | LOW MAINT COSTS | LOWEST PUE

Project Goals
SUPPORTING GOALS:
1. MODULAR – where "modular" refers to a design that may be constructed expeditiously and unobtrusively in logical, incremental building blocks that help defer costs
2. FLEXIBILITY
   a. Offer a variety of room SIZES and LOAD densities
   b. Offer SECURITY options
   c. Support an array of CUSTOMER layouts
   d. Provisions for CONTAINER hook-ups

Building Overview & Computer Room Details

Building Layout
447,000 GSF; 405,000 NSF; 41.6 MW critical load
UPS ROOMS: Centrally located vs. linear
• (14) UPS rooms with (28) 1,600 kW UPS units vs. (32) 1,300 kW units (ACC5)
GENERATOR ROOMS:
• (28) 2,250 kW units in (14) rooms vs. (32) 2,250 kW units (ACC5)
A quick capacity comparison is sketched below.
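As a sanity check on those unit counts, here is a minimal sketch of installed UPS nameplate capacity for the two buildings; this is simple arithmetic on the slide's numbers and deliberately ignores how the isolated-parallel redundancy scheme allocates that capacity:

```python
def installed_ups_mw(unit_count: int, unit_kw: float) -> float:
    """Total nameplate UPS capacity in MW (unit count x unit rating)."""
    return unit_count * unit_kw / 1000.0

# Figures from the slide; the isolated-parallel reserve scheme is not modeled here.
print(f"ACC7: {installed_ups_mw(28, 1600):.1f} MW installed UPS vs 41.6 MW critical load")
print(f"ACC5: {installed_ups_mw(32, 1300):.1f} MW installed UPS vs 36.4 MW critical load")
```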

Computer Rooms
• 28 rooms; 249,000 sq ft; 1.48 MW (base) per room
• 8 divisible rooms: 4 divisible into 750 kW suites, 4 into 375 kW suites
• High-density rooms expandable to 2.0 MW by adding (2) PDU transformers
• Room 6 is split on day 1 into (2) 750 kW rooms

ACC7 – Computer Room Metrics
Typical computer room: ~8,500 sq ft; 1.486 MW with the ability to expand to 2.0 MW

ACC7 – Computer Room Details
• 2N electrical redundancy
• Total of (18) CRAH units (N+2), expandable to (22)
• No raised floor
• Lay-in ceiling at 13'-0"
[Floor plan callouts: critical distribution panels; 1.5 MW PDU transformers "A" & "B" (A1, A2, B1, B2); typical CRAH locations]

ACC7 – High-Density Computer Room
High-density solution with the ability to reach 2.0 MW: a 2nd set of 1.5 MW PDU transformers "A" & "B" supplements the base pair.
[Floor plan callouts: base and second-set A/B PDU transformer locations]

ACC7 – Base Computer Room with Cabinets
Flexible equipment layouts per client preference; 378 cabinets shown.

ACC7 – Subdivided Computer Room
Steel mesh wall segregates the room into smaller offerings.

Unraveling Trends – Security
More stringent security & audit requirements:
TREND: Physical segregation between servers & infrastructure
OPPORTUNITY: Provide flexible, optional security galleries
• Introduce steel mesh fencing
• Provides physical separation
• Displaces minimal floor space
• NO modifications to electrical/mechanical infrastructure

ACC7 – Computer Room with Gallery
Optional steel mesh gallery walls segregate client space from DFT equipment.

Unraveling Trends
HIGHER BRANCH CIRCUIT VOLTAGE SOLUTIONS:
TREND – Phasing out 120 V single-phase circuits
OPPORTUNITY: Utilize 230 or 240 V ($$ savings)
• Instant 2X power boost to standard busway products (see the sketch below):
  - 225 A bus at 208 V = 65 kW
  - 225 A bus at 415 V = 130 kW
• Single-pole breakers in lieu of double-pole CBs
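A minimal sketch of the three-phase arithmetic behind those two busway figures, using the standard √3·V·I relationship; the 80% continuous-load derating factor is our assumption to reconcile the numbers, since the slide only lists the resulting kW values:

```python
import math

def busway_kw(volts_ll: float, amps: float, derate: float = 0.8) -> float:
    """Approximate usable three-phase busway power in kW.

    volts_ll : line-to-line voltage, e.g. 208 or 415
    amps     : busway ampacity, e.g. 225
    derate   : continuous-load factor (0.8 assumed, per common practice)
    """
    return math.sqrt(3) * volts_ll * amps * derate / 1000.0

print(f"225 A at 208 V: {busway_kw(208, 225):.0f} kW")  # ~65 kW
print(f"225 A at 415 V: {busway_kw(415, 225):.0f} kW")  # ~129 kW (the slide rounds to 130)
```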

ACC7 - Computer Room Busway Layout 1.5 MW PDU XFMR’s “A” & “B” A B B1 B2 A1 A2

Unraveling Trends
Greater server tolerance for higher temperatures → higher supply & return air temperatures → higher supply & return chilled water temperatures → greater opportunity for free cooling → smaller chillers & pumps → greater energy savings → lower PUE
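Since that chain ends at PUE, here is a minimal illustration of the metric itself; the load and overhead numbers below are hypothetical examples for illustration, not ACC7 measurements:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power (ideal = 1.0)."""
    return total_facility_kw / it_load_kw

# Hypothetical numbers for illustration only -- not ACC7 figures.
it_kw = 1486.0        # one base computer room's IT load
overhead_kw = 300.0   # assumed cooling, fan, UPS, and transformer losses
print(f"PUE = {pue(it_kw + overhead_kw, it_kw):.2f}")  # ~1.20
```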

Chiller Assist
TREND: Higher operating temperatures
SOLUTION: Water-side economization plant with chiller assist
• Primary chilled water production by heat exchangers, providing 65-70°F water (regionally dependent)
• Year-round contribution from heat exchangers, with chiller assist as needed
• Increased chilled water temperature difference of up to 25°F (vs. the conventional 10°F)
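A hedged sketch of why the wider chilled-water temperature difference matters, using the common water-side approximation GPM ≈ BTU/hr / (500 × ΔT°F); the 500 factor and the example room load are assumptions for illustration, not values from the deck:

```python
def chilled_water_gpm(load_btuh: float, delta_t_f: float) -> float:
    """Required chilled-water flow in GPM for a given heat load and temperature difference.

    Uses the common approximation GPM = BTU/hr / (500 * dT), where the 500 factor
    comes from water's properties (8.33 lb/gal * 60 min/hr * 1 BTU/lb-F).
    """
    return load_btuh / (500.0 * delta_t_f)

load_btuh = 1_486_000 * 3.412  # ~1.486 MW of room load in BTU/hr (assumed example)
print(f"Flow at 10F delta-T: {chilled_water_gpm(load_btuh, 10):,.0f} GPM")
print(f"Flow at 25F delta-T: {chilled_water_gpm(load_btuh, 25):,.0f} GPM")  # 60% less flow
```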

Chiller Assist
HIGHER OPERATING TEMPERATURES:
• Allow for more hours of FREE COOLING (economizer operation), significantly smaller chillers, and lower pumping horsepower
• Reduce CRAH fan horsepower by 60% versus traditional units while delivering the same cooling capacity
• Require greater management of hot and cold air; containment is mandatory

Aisle Containment – Greater management of hot/cold air
[Diagram callouts: CRAH, cabinet, and ceiling space]

Medium Voltage & Isolated Parallel (I-P) UPS

Unraveling Trends – New Products
Medium vs. low voltage distribution:
TREND: Advances in MV circuit breakers, switchgear & PDUs
OPPORTUNITY: MV distribution in the data center
• Reduces feeders by a factor of 7X
• Significant cost savings
• Increased reliability
• Increased flexibility
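A rough sketch of where a feeder reduction of that order comes from: for the same power, feeder current scales inversely with distribution voltage. The 4,160 V and 480 V levels and the 0.95 power factor below are assumed typical values, not figures from the slides:

```python
import math

def feeder_amps(load_kw: float, volts_ll: float, pf: float = 0.95) -> float:
    """Three-phase line current for a given load, line-to-line voltage, and power factor."""
    return load_kw * 1000.0 / (math.sqrt(3) * volts_ll * pf)

load_kw = 1500.0  # one 1.5 MW PDU transformer's worth of load
print(f"480 V feeder:   {feeder_amps(load_kw, 480):,.0f} A")   # ~1,900 A
print(f"4,160 V feeder: {feeder_amps(load_kw, 4160):,.0f} A")  # ~220 A
# Roughly an order of magnitude less current per feeder at MV, which is the
# physics behind the deck's "reduces feeders by a factor of 7X" claim.
```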

Medium Voltage - Advantages

MV Advantages
• Greater safety, lower arc flash: arc-containing MV switchgear designs are PPE Category 0 under NFPA 70E
• Higher reliability: cooler-running ductbanks, fewer terminations
• Linear-actuated breakers: more reliable (fewer moving parts = longer-lasting CB), faster clearing times
• Greater flexibility in data center layout: longer feeders without huge cost
• Greener building: less copper, PVC, and steel

MV Product Advantages
Introducing the new "MV PDU" – a potentially disruptive technology ("I'm back from the future!")
• Deconstructed PDU with separate transformer and distribution panels
• Single "A" and "B" oil-filled PDU transformers replace the equivalent of ten (10) conventional 300 kVA PDUs
• Less footprint = greater cabinet yield

MV PDU
• Outstanding efficiency: 99.6% vs. 97.5% for dry-type (see the sketch below)
• Significant overload capability: 150% continuously
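As a rough illustration of what that efficiency gap means, here is a minimal sketch comparing transformer losses at the 1.5 MW PDU size used elsewhere in the deck; the annual-energy figure assumes a constant full load, which is our simplifying assumption rather than a number from the slides:

```python
def transformer_loss_kw(load_kw: float, efficiency: float) -> float:
    """Heat dissipated in a transformer delivering load_kw at the given efficiency."""
    return load_kw * (1.0 / efficiency - 1.0)

load_kw = 1500.0  # 1.5 MW PDU transformer size from the deck
loss_mv = transformer_loss_kw(load_kw, 0.996)   # oil-filled MV PDU
loss_dry = transformer_loss_kw(load_kw, 0.975)  # conventional dry-type

print(f"MV PDU loss:   {loss_mv:.1f} kW")    # ~6 kW
print(f"Dry-type loss: {loss_dry:.1f} kW")   # ~38 kW
print(f"Savings at constant full load: {(loss_dry - loss_mv) * 8760 / 1000:.0f} MWh/yr")
```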

ACC7 Proof of Concept Room

Installed 200 cabinets & LBs in CR1; both aisle containment and rack chimneys demonstrated.

ACC7 Proof of Concept Room
Various load configurations tested at the full 2 MW load, including CRAH failure scenarios.

Conclusions