CIT 470: Advanced Network and System Administration Slide #1
Data Centers

CIT 470: Advanced Network and System Administration Slide #2
Topics
Data Center: a facility for housing a large amount of computer or communications equipment.
1. Environment
2. Power
3. Racks
4. Containers

CIT 470: Advanced Network and System Administration Slide #3
Environmental Requirements
Temperature: 64-72F (17-22C)
- Temperature inside the case runs around 40F higher.
- Chips and hard disks fail around 120F.
Humidity: 35-65%
- Too low: static discharges.
- Too high: water condenses, causing short circuits.
Power
- Reliable, conditioned power.
Physical Security
- Prevent accidents, hacking, theft, and vandalism.
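The thresholds above can be encoded as a simple monitoring check. This is a minimal sketch; the function name and alert wording are my own, but the temperature and humidity limits are the ones from the slide.

```python
def check_environment(temp_f: float, humidity_pct: float) -> list[str]:
    """Return a list of alert strings; an empty list means all readings OK."""
    alerts = []
    # Operating range from the slide: 64-72 F.
    if not 64 <= temp_f <= 72:
        alerts.append(f"temperature {temp_f} F outside 64-72 F range")
    # Humidity range from the slide: 35-65% relative humidity.
    if humidity_pct < 35:
        alerts.append(f"humidity {humidity_pct}% too low: static discharge risk")
    elif humidity_pct > 65:
        alerts.append(f"humidity {humidity_pct}% too high: condensation risk")
    return alerts

print(check_environment(68, 50))   # in range: no alerts
print(check_environment(80, 20))   # both readings out of range
```

A real deployment would feed this from the HVAC or rack sensors and page the SA on a non-empty result.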

CIT 470: Advanced Network and System Administration Slide #4
Environment Features
- HVAC: Heating, Ventilation, Air Conditioning.
- UPS: power conditioning, <1 hr of battery.
- Generator: for long-term power outages.
- Accessibility: for moving large equipment.
- Card lock: SAs only; records entrances and exits.
- Fire suppression system: not water based.
- Humidity/water detection.

CIT 470: Advanced Network and System Administration Slide #5
HVAC
HVAC systems can fail due to:
- Power loss.
- Coolant leakage.
- Mechanical failure.
Detect HVAC failures early:
- Many HVAC units have sensors to report failures.
- Otherwise, place a temperature monitor in the data center.
HVAC maintenance:
- N+1 redundancy, so a single failure is not an immediate disaster.
- Usually handled by an external contractor.

CIT 470: Advanced Network and System Administration Slide #6
Advanced Cooling Devices
HP Smart Cooling
- Sensors in the machine room position louvers in the floor to direct cool air at hot spots.
- Lets the data center run at 27C instead of 13C.
IBM Heat eXchanger
- Water-cooled rear door for a rack.
- Removes ~15 kW (~50,000 BTU/hr) from the rack.
- Useful for dealing with hot spots.
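The heat-exchanger figure mixes units, so a quick conversion check is worth having on hand (1 kW = 3412.14 BTU/hr; the function name is illustrative):

```python
def kw_to_btu_per_hr(kw: float) -> float:
    """Convert a heat load in kilowatts to BTU per hour."""
    return kw * 3412.14

# The 15 kW rear-door figure works out to roughly 51,000 BTU/hr,
# consistent with the slide's ~50,000 BTU number:
print(round(kw_to_btu_per_hr(15)))
```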

CIT 470: Advanced Network and System Administration Slide #7
Space
Aisles
- Must be wide enough to move equipment.
Hot spots
- Result from poor air flow.
- Servers can overheat even when the average room temperature is low.
Work space
- A place for SAs to work on servers.
- Desk space, tools, etc.
Capacity
- Ensure that you have enough room to grow.

CIT 470: Advanced Network and System Administration Slide #8
Power
- UPS provides conditioned power.
- Generator provides auxiliary power.
- ATS (automatic transfer switch) switches to the generator if building power fails.
(Photo: Liebert UPS and battery bank.)
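As a back-of-envelope illustration of why the generator matters, UPS runtime can be estimated from battery capacity and load. The function name, the 90% inverter efficiency, and the example sizes are assumptions of mine; real batteries also derate nonlinearly with load, so treat this as a rough upper bound.

```python
def ups_runtime_minutes(battery_wh: float, load_w: float,
                        inverter_efficiency: float = 0.9) -> float:
    """Rough UPS runtime: usable battery energy divided by the load."""
    usable_wh = battery_wh * inverter_efficiency
    return usable_wh / load_w * 60

# A 5 kWh battery bank feeding a 10 kW room lasts well under an hour
# (~27 minutes), which is why the generator must pick up long outages:
print(ups_runtime_minutes(5000, 10000))
```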

CIT 470: Advanced Network and System Administration Slide #9
Power Distribution
Under-floor power
- Susceptible to water; requires water sensors.
- An overhead power bus is preferable.
Overhead power bus
- Drop power down to each rack individually.
- Don't run power cords between racks.
- Each rack has its own PDU.

CIT 470: Advanced Network and System Administration Slide #10
Power Distribution Units (PDUs)
- Different power sockets can be on different circuits.
- Individual outlet control (remote power cycling).
- Current monitoring and alarms.
- Network management (web or SNMP).
(Photo: APC Power Distribution Unit.)
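The per-outlet current readings a managed PDU reports (over SNMP or its web interface) make it easy to check circuit loading in software. A hedged sketch, assuming a hypothetical data layout of circuit name to per-outlet amp readings, a 20 A breaker, and the common practice of loading circuits to only 80% of the breaker rating:

```python
BREAKER_LIMIT_AMPS = 20
DERATE = 0.8  # load circuits to 80% of the breaker rating

def overloaded_circuits(readings: dict[str, list[float]]) -> list[str]:
    """Return names of circuits whose summed outlet current exceeds the
    derated breaker limit. `readings` maps circuit -> per-outlet amps."""
    limit = BREAKER_LIMIT_AMPS * DERATE
    return [name for name, amps in readings.items() if sum(amps) > limit]

# Circuit A totals 12.5 A (fine); circuit B totals 17.5 A (> 16 A limit):
print(overloaded_circuits({"A": [5.0, 4.5, 3.0], "B": [9.0, 8.5]}))  # ['B']
```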

CIT 470: Advanced Network and System Administration Slide #11
Typical Power Usage

Device                       Power Usage
Intel Xeon W GHz Quad Core   130 W
Intel Xeon E GHz Quad Core   80 W
Intel Xeon E GHz Dual Core   80 W
7200 RPM hard drive          7 W
10,000 RPM hard drive        14 W
15,000 RPM hard drive        20 W
DDR2 DIMM                    1.65 W
Video card                   20-120 W

Power supplies range from 50-90% efficiency, with the 80 PLUS program pushing recent units toward the higher end of that range.
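Power-supply efficiency means the wall draw is higher than the sum of the component figures in the table. A small sketch using example components from the table (the function name and parts list are illustrative):

```python
def wall_power_watts(component_watts: list[float],
                     psu_efficiency: float) -> float:
    """Estimate power drawn at the wall: DC component load divided by
    the power supply's efficiency."""
    return sum(component_watts) / psu_efficiency

# A quad-core CPU (130 W), two 15,000 RPM drives (20 W each), and four
# DDR2 DIMMs (1.65 W each) behind an 80%-efficient (80 PLUS) supply:
load = [130, 20, 20, 1.65, 1.65, 1.65, 1.65]
print(wall_power_watts(load, 0.80))  # about 221 W at the wall
```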

CIT 470: Advanced Network and System Administration Slide #12
The Power Problem
- Servers require much more power today.
- 4-year power cost = server purchase price.
- Upgrades may have to wait for electrical capacity.
- Power is a major data center cost:
  - $5.8 billion for server power.
  - $3.5 billion for server cooling.
  - $20.5 billion for purchasing hardware.
- Google and Microsoft are building data centers near cheap power on the Columbia and Colorado rivers.
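The "4-year power cost = purchase price" claim can be sanity checked with simple arithmetic. The wattage, electricity rate, and PUE multiplier below are my own illustrative assumptions, not figures from the slide:

```python
HOURS_PER_YEAR = 24 * 365  # 8760

def four_year_power_cost(avg_watts: float, dollars_per_kwh: float,
                         pue: float = 2.0) -> float:
    """Four years of electricity cost, scaled by PUE so that cooling
    and distribution overhead are counted along with the IT load."""
    kwh = avg_watts / 1000 * HOURS_PER_YEAR * 4 * pue
    return kwh * dollars_per_kwh

# A 300 W server at $0.10/kWh in an industry-average (PUE = 2) facility
# costs on the order of $2,100 over four years -- roughly the purchase
# price of a commodity server:
print(round(four_year_power_cost(300, 0.10)))
```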

CIT 470: Advanced Network and System Administration Slide #13
Power Usage Effectiveness (PUE)
PUE = total data center power / IT equipment power
- PUE = 2 means that for each watt used to power IT equipment, one watt is used for HVAC, power distribution, etc.
- PUE decreases toward 1 as a data center becomes more efficient.
PUE variation:
- Industry average = 2
- Microsoft = 1.22
- Google = 1.19
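The definition above is a single division; a tiny helper makes the overhead interpretation explicit (example figures are illustrative):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power over IT power."""
    return total_facility_kw / it_equipment_kw

# A site drawing 1000 kW in total to run 500 kW of IT load has PUE 2.0,
# i.e. one watt of overhead for every watt of compute:
print(pue(1000, 500))   # 2.0
# A Google-class facility wastes far less per IT watt:
print(pue(595, 500))    # 1.19
```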

CIT 470: Advanced Network and System Administration Slide #14
Physical Security
- Keycard + code to restrict access.
- Biometrics are becoming a popular alternative.
- Electronically log all accesses.
- Surveillance cameras.
- No windows.
- No floor/ceiling access.

CIT 470: Advanced Network and System Administration Slide #15
Racks
19" rack standard
- EIA-310D.
- Other standard numbers.
NEBS 21" racks
- Telecom equipment.
Features
- 2-post or 4-post.
- 1U = 1.75 in.
- Air circulation (fans).
- Cable management.
- Doors or open frame.
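The "1U = 1.75 in" figure is all the arithmetic needed to plan rack capacity. A sketch with illustrative function names and equipment sizes:

```python
RACK_UNIT_INCHES = 1.75  # EIA-310 rack unit height

def fits(rack_units_available: int, equipment_units: list[int]) -> bool:
    """True if the listed equipment (in U) fits the available space."""
    return sum(equipment_units) <= rack_units_available

# A standard 42U rack offers 42 * 1.75 = 73.5 inches of mounting space:
print(42 * RACK_UNIT_INCHES)  # 73.5
# Thirty 1U servers plus two 4U storage boxes and a 2U switch use 40U:
print(fits(42, [1] * 30 + [4, 4, 2]))  # True
```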

CIT 470: Advanced Network and System Administration Slide #16
Rack Purposes
Organize equipment
- Increase density with vertical stacking.
Cooling
- Internal airflow within the rack cools servers.
- Airflow is determined by the arrangement of racks.
Wiring
- Cable guides keep cables within racks.
Power infrastructure
- PDUs in racks distribute power to servers.

CIT 470: Advanced Network and System Administration Slide #17
Buying a Rack
Buy the right size
- Space for servers.
- Plus space for power, patch panel, etc.
Be sure it fits your servers
- Appropriate mounting rails.
- Shelves for non-rack-mountable servers.
Physical/environmental security
- Locking front and back doors if needed.
- Sufficient power and cooling.
- Power/environment monitors if needed.

CIT 470: Advanced Network and System Administration Slide #18
Wiring
It's important to be neat.
- So you can find the cable for the right server.
- Label both ends of each cable.
Prewiring
- Run power, network, and serial cables when the rack is set up.
- Greatly reduces the time to set up a new server.
Hiding messiness isn't being neat.
- Raised floors can hide cables.
- Some sites don't remove old cables, to avoid accidentally taking down the wrong server.

CIT 470: Advanced Network and System Administration Slide #19
Containers
Data center in a shipping container.
- 4-10x normal data center density.
- Up to 2,500 servers.
- Up to 12.9 kW/m².
Microsoft and Google fill entire data centers with containers.
- Industrialization of IT.
- Low PUE.