MGHPCC as a Platform for MOC


Jim Culbert, Director of IT Services, MGHPCC. 12/06/2016

The MGHPCC Data Center and Consortium
A partnership between five universities, the Commonwealth, and industrial sponsors.

Platform for Collaborative Research Computing
RC requirements:
- High density
- Experimental by definition: flexibility
- Compute flexibility: commodity, GPU, FPGA, the next big thing…
- Networking flexibility: topology, technology
- Reconfigurable
- Regulatory and grant constraints

Collaborative RC (cont.)
Collaboration requirements:
- Low-friction interactions: 3 rows is easier than 3 counties
- Low-friction infrastructure:
  - Pre-install as much as possible
  - Pre-installed cabling to support ad hoc physical interconnect
  - Pre-configured MeetMe networking services
- Low-friction process: physical flexibility / swaps
- Collaborative design, construction, operation, and governance: if it's not working, we can change it

RC Implementation Options

Roll your own
  Pro: Flexible! Master of your own destiny!
  Con: Poor physical environment/support; gets old quickly; hard to collaborate. Expensive for the university (sometimes the PI).

Central IT DC
  Pro: More resource efficient.
  Con: Not usually designed to support research computing. Hard to collaborate outside the university.

Commercial DC
  Pro: Flexible! Good support. Can be coerced to support RC.
  Con: Expensive. Collaboration is possible, but everyone needs to do business with the same DC. Not agile/flexible. Paying for enterprise features you don't need.

Commercial Cloud
  Pro: Flexible! Can do RC these days. Great for small scale.
  Con: Unexpectedly expensive. Collaborators must agree on a cloud.

MGHPCC
  Pro: Flexible. Built for collaborative RC. Can add other collaborators easily. Cheap: power, efficient DC, utilization (people and equipment).
  Con: Must work and play well with others. Commute to western Mass to touch your equipment.
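The trade-offs above largely come down to amortized capital cost, power (scaled by facility efficiency), and staffing. A minimal sketch of that comparison; every number below is an illustrative placeholder, not MGHPCC or vendor pricing:

```python
# Hypothetical annualized cost model for comparing RC hosting options.
# All figures are illustrative placeholders, not actual pricing.

def annual_cost(capex, lifetime_yrs, it_load_kw, pue, power_rate_kwh, staff_cost):
    """Annualized cost of ownership for one hosting option."""
    amortized_capex = capex / lifetime_yrs
    # Total facility draw = IT load scaled by PUE, running 24x7.
    annual_kwh = it_load_kw * pue * 24 * 365
    return amortized_capex + annual_kwh * power_rate_kwh + staff_cost

# Compare a "roll your own" campus room (high PUE, full staffing burden)
# against an efficient shared facility for the same 50 kW IT load.
campus = annual_cost(capex=200_000, lifetime_yrs=10, it_load_kw=50,
                     pue=2.0, power_rate_kwh=0.15, staff_cost=60_000)
shared = annual_cost(capex=150_000, lifetime_yrs=10, it_load_kw=50,
                     pue=1.2, power_rate_kwh=0.08, staff_cost=30_000)
print(f"campus: ${campus:,.0f}/yr  shared: ${shared:,.0f}/yr")
```

Even with these rough inputs, the PUE and power-rate differences dominate, which is why a purpose-built shared facility can be the cheap option despite the commute.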

MGHPCC by the Numbers
- 100k square feet total: 30k square feet of computer room space (~1 acre), 15k admin, 55k everything else (entry rooms, loading docks, staging, recycling, MEP)
- 10 MW* compute power (14 kW compute, 6 kW networking); standard cabinet densities up to 25 kW, custom designs for 100 kW
- 718 pre-installed cabinets (31k U total): 580 (25k U) for computing and storage, 138 (6k U) for networking and critical storage
- 1000 strands (2-3 mi) of pre-installed OM2/OM3 structured cabling for tenant use
- 1000 linear feet of tenant-accessible, inter-cabinet cable tray
- High-capacity networking: 150 strands in the MGHPCC manhole, ~30% of that landed in ERs, more at Appleton and Cabot Streets. Providers: MIT, UMass, VZ, Comcast, Fibertech
- Multiple 10G WAN paths to members' home locations, 100G to I2, 200M commodity internet
- Room to expand: site power and space to build another MGHPCC
* 20% UPS/critical power
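The cabinet figures above imply some useful averages. A quick back-of-the-envelope check, using only the numbers stated on the slide (per-cabinet values are averages, not guaranteed densities):

```python
# Back-of-the-envelope averages from the slide's stated figures.
total_compute_kw = 10_000   # 10 MW of compute power
compute_cabinets = 580      # cabinets allocated to computing and storage
compute_u = 25_000          # ~25k U of computing/storage space

avg_kw_per_cabinet = total_compute_kw / compute_cabinets
avg_u_per_cabinet = compute_u / compute_cabinets

print(f"~{avg_kw_per_cabinet:.1f} kW and ~{avg_u_per_cabinet:.0f} U per cabinet")
```

This puts the fleet-wide average around 17 kW and 43 U per compute cabinet, consistent with the stated "standard densities up to 25 kW" headroom.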

Collaborative RC and Security
Designed to support this:
- Security working group during the design phase
- Matrix analysis of typical grant regulatory constraints: HIPAA, FISMA, FERPA, NIST SP 800 recommendations
- Auditable, computer-managed physical key control
- Pervasive HD video
- Physical isolation all the way to the carrier network demarc
- MGHPCC resources "look like" campus buildings/rooms

How We're Green
- Power portfolio is greater than 90% carbon free
- 1.2 PUE, comparable to best in class
- Free cooling (a.k.a. water-side economizer)
- In-row cooling
- Hot-aisle containment
- 400 V power distribution
- Continuous improvement: monitor, modify, measure, repeat
- Plus a million little things (no conditioning in electrical rooms, VFDs, trim chiller, ECO-mode UPS, motion sensors, etc.)
- US Green Building Council LEED Platinum certified

In the U.S., buildings account for 38% of CO2 emissions, 13.6% of potable water use (15 trillion gal./year), and 73% of U.S. electricity consumption.
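PUE (Power Usage Effectiveness) is simply total facility power divided by IT power, so a 1.2 PUE means only 20% overhead beyond the IT load. A minimal sketch of the calculation; the 1.2 target comes from the slide, but the component breakdown below is illustrative only, not MGHPCC's measured split:

```python
# PUE = total facility power / IT power. Illustrative component
# breakdown chosen to land near the slide's 1.2 figure.

def pue(it_kw, cooling_kw, power_dist_loss_kw, lighting_kw):
    """Compute PUE from an IT load and the non-IT facility overheads."""
    total_kw = it_kw + cooling_kw + power_dist_loss_kw + lighting_kw
    return total_kw / it_kw

# Hypothetical 1 MW IT load with modest cooling and distribution overhead:
print(round(pue(it_kw=1000, cooling_kw=140, power_dist_loss_kw=50, lighting_kw=10), 2))
```

Water-side economizers and hot-aisle containment attack the cooling term, and 400 V distribution plus ECO-mode UPS attack the distribution-loss term, which is how the overhead stays this small.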