1 Northwestern University Information Technology Data Center Elements Research and Administrative Computing Committee Presented October 8, 2007.

1 Northwestern University Information Technology Data Center Elements Research and Administrative Computing Committee Presented October 8, 2007

2 Evanston Data Center Physical Space
Staff offices
Room 171: Operations/Printing (1,950 ft²)
Room 173: Research (950 ft²)
Room 172: Hosting (3,850 ft²)
Mailroom (2,600 ft²)
Note: All areas shown have computer-room raised floors.

3 Chicago Data Center Physical Space
Room 232: Hosting (1,000 ft²)
Room 234: Departmental Hosting (800 ft²)
Note: These areas do not have computer-room raised floors.

4 Data Center Tier Classification
The Tier classifications were created to consistently describe the site-level infrastructure required to sustain data center operations, not the characteristics of individual systems or sub-systems. The requirements were established by the Uptime Institute, Inc. and are intentionally broad to allow innovation in achieving the desired level of site infrastructure performance, or uptime. The individual Tiers represent categories of site infrastructure topology that address increasingly sophisticated operating concepts, leading to increased site infrastructure availability.

5 Data Center Tiers

Tier I: Basic Site Infrastructure
Non-redundant capacity components and distribution paths. Any capacity component or distribution path failure will impact the computer systems, and planned work requires most or all of the systems to be shut down.

Tier II: Redundant Capacity Components Site Infrastructure
Redundant capacity components, but a single non-redundant distribution path serving the site's computer equipment. Redundant UPS modules and engine generators are required. A distribution path failure will cause the computer equipment to shut down.

Tier III: Concurrently Maintainable Site Infrastructure
Redundant capacity components and multiple distribution paths serving the site's computer equipment. Generally, only one distribution path serves the computer equipment at any time. Every capacity component and element of the distribution paths can be removed from service on a planned basis without shutting down any of the computer equipment.

Tier IV: Fault Tolerant Site Infrastructure
Redundant capacity systems and multiple distribution paths simultaneously serving the site's computer equipment. The site is not susceptible to disruption of service from a single unplanned worst-case event. Infrastructure maintenance may be performed by using redundant capacity components and distribution paths to safely work on the remaining equipment.
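The tier definitions differ along a few attributes: component redundancy, number of distribution paths, concurrent maintainability, and fault tolerance. A minimal Python sketch of that classification (a simplification for illustration; the Uptime Institute's actual criteria are broader than these four flags):

```python
# Distinguishing attributes of each tier, as described on the slide.
TIERS = {
    "I":   dict(redundant_components=False, multiple_paths=False,
                concurrently_maintainable=False, fault_tolerant=False),
    "II":  dict(redundant_components=True,  multiple_paths=False,
                concurrently_maintainable=False, fault_tolerant=False),
    "III": dict(redundant_components=True,  multiple_paths=True,
                concurrently_maintainable=True,  fault_tolerant=False),
    "IV":  dict(redundant_components=True,  multiple_paths=True,
                concurrently_maintainable=True,  fault_tolerant=True),
}

def minimum_tier(need_concurrent_maintenance, need_fault_tolerance):
    """Lowest tier whose topology meets the stated requirements."""
    for tier in ("I", "II", "III", "IV"):
        t = TIERS[tier]
        if need_concurrent_maintenance and not t["concurrently_maintainable"]:
            continue
        if need_fault_tolerance and not t["fault_tolerant"]:
            continue
        return tier

print(minimum_tier(True, False))  # III: concurrent maintenance first appears at Tier III
```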

6 Tier Rating of NUIT Data Centers
Northwestern University's central IT data centers are Tier I facilities.

Tier I: Basic Site Infrastructure
Non-redundant capacity components and distribution paths. Any capacity component or distribution path failure will impact the computer systems, and planned work requires most or all of the systems to be shut down.

Building Cost Comparison of Tier Facilities
Construction cost (+/- 30%):

                               Tier I     Tier II    Tier III   Tier IV
Raised floor (per sq. ft.)     $220       $220       $220       $220
Usable UPS output (per kVA)    $10,000    $11,000    $20,000    $22,000
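The cost table can be turned into a rough build-out estimate for a room of a given size and UPS capacity. A sketch using the slide's per-unit figures (the 3,850 sq. ft. and 300kVA in the example are illustrative inputs, not a costed project):

```python
# Construction cost figures from the slide (+/- 30%).
RAISED_FLOOR_PER_SQFT = {"I": 220, "II": 220, "III": 220, "IV": 220}
UPS_PER_KVA = {"I": 10_000, "II": 11_000, "III": 20_000, "IV": 22_000}

def build_cost(tier, floor_sqft, ups_kva):
    """Rough construction cost: raised floor plus usable UPS output."""
    return floor_sqft * RAISED_FLOOR_PER_SQFT[tier] + ups_kva * UPS_PER_KVA[tier]

# Example: a 3,850 sq. ft. room with 300kVA of usable UPS output.
print(build_cost("I", 3850, 300))   # 3847000
print(build_cost("IV", 3850, 300))  # 7447000
```

Note that the raised-floor cost is tier-independent; the premium for higher tiers sits almost entirely in the electrical plant.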

7 Electrical Infrastructure

Evanston Data Center
Room 172: Hosting
Dual power feeds
Two 150kVA UPS units running in parallel
750kVA diesel generator
Single distribution path

Room 173: Small Research Center
Single power feed
Single 160kVA UPS unit
No generator
Single distribution path

Chicago Data Center
Rooms 232 and 234
Single power grid
350kVA UPS unit
375kVA diesel generator
Single distribution path

8 Electrical Capacity for Evanston Data Center

Hosting Rooms 171 and 172
What is the protected load capacity? The safe electrical draw is 240kVA.
What is today's consumed load? Today's draw is 210kVA.
A power reduction plan is underway to shed load and bring the total draw down to 195kVA for these two rooms, leaving 45kVA of unused safe draw available.

Small Research Center Room 173
What is the protected load capacity? The safe electrical draw is 128kVA.
What is today's consumed load? Today's draw is 20kVA.
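The headroom arithmetic here is simple but worth making explicit: unused safe draw is protected capacity minus consumed load. A sketch:

```python
def headroom_kva(safe_draw_kva, current_draw_kva):
    """Unused protected electrical capacity available for new equipment."""
    return safe_draw_kva - current_draw_kva

# Hosting rooms 171/172, after the planned load shedding to 195kVA:
print(headroom_kva(240, 195))  # 45, the unused safe draw cited on the slide

# Small research center room 173 today:
print(headroom_kva(128, 20))   # 108
```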

9 Electrical Capacity for Chicago Data Center

Hosting Rooms 232 and 234
What is the protected load capacity? The safe electrical draw is 300kVA.
What is today's consumed load? Today's draw is 300kVA.
A power co-generation project to bring additional power into this location has been cancelled by Facilities Management, which instead recommends the purchase of a larger diesel generator.

10 Cooling Capacity

Evanston Data Center
Room 171 (Operations/Printing): 40 tons, supports up to 120kVA*
Room 172 (Hosting): 80 tons, supports up to 240kVA*
Room 173 (Research): 20 tons, supports up to 60kVA*
Mail Room: 60 tons, supports up to 180kVA*

*Aging units run at lower operating efficiency, reducing effective cooling capacity by 25%. An HVAC upgrade project to replace the aging units is underway and will be completed in June.

After the upgrade, the replaced units will support up to 510kVA in one room and up to 190kVA in another.

Chicago Data Center
Rooms 232/234 (Hosting and Departmental Hosting): 60 tons, supports up to 90kVA
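The Evanston figures are consistent with a nominal factor of roughly 4kVA of supported IT load per ton of cooling, derated 25% for the aging units. That conversion factor is an inference from the slide's numbers, not something the slide states; a sketch under that assumption:

```python
NOMINAL_KVA_PER_TON = 4.0  # assumed nominal factor, inferred from the slide's figures
AGING_DERATE = 0.25        # 25% inefficiency from the aging units

def supported_load_kva(cooling_tons, derate=AGING_DERATE):
    """IT load a room's cooling can support, after derating for unit age."""
    return cooling_tons * NOMINAL_KVA_PER_TON * (1 - derate)

print(supported_load_kva(40))  # Room 171: 120.0
print(supported_load_kva(80))  # Room 172: 240.0
print(supported_load_kva(60))  # Mail Room: 180.0
```

The Chicago figure (60 tons supporting 90kVA) does not fit the same factor, so the effective derating there is evidently steeper.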

11 Fire Suppression

Evanston Data Center
Pre-action sprinkler system
A fire protection project is underway; upgrades will add FM200 protection to Rooms 171, 172, and 173.

Chicago Data Center (Rooms 232: Hosting and 234: Departmental Hosting)
Pre-action sprinkler system
FM200 fire suppression system

12 Physical Security

Evanston Data Center
NetBotz: surveillance monitoring system
DDC sensors: environmental monitoring system
Marlok: access system
AI Phone: door call entry system

Chicago Data Center
NetBotz: monitoring system
Marlok: access system

13 Network Connectivity and Security

Evanston Data Center
Dual fiber connection to Evanston NUNet
Dual Cisco 6509 switches
Redundant NetScreen firewalls

Chicago Data Center
Dual fiber connections to Chicago NUNet
Dual Cisco 6509 switches
Redundant Cisco PIX firewall blades in the 6509 switches

14 Storage Capacity

Evanston Data Center
Storage Area Network: dual Cisco 9509 Fibre Channel directors
57.2 TB disk storage, 700 TB tape storage
Disk storage: IBM DS8100 enterprise disk array, IBM DS4500 disk array, IBM DS4100 disk array
Tape: IBM 3584 ATL with IBM Ultrium LTO1 and LTO2 tape drives

Chicago Data Center
Storage Area Network: dual Cisco 9509 Fibre Channel directors
4.2 TB disk storage, 100 TB tape storage
Disk storage: IBM DS4500 disk array
Tape: IBM 3310 ATL with IBM Ultrium LTO3 tape drives