1
How Microsoft Does Data Centres
John Dwyer, Area Data Centre Manager, International Data Centre Solutions
2
The Global Services Foundation
Across the company, all over the world, around the clock
Services shown on the slide include Zurich, .Net Online, MBS Online, Office Labs, Sharepoint.Microsoft.com, and Azure (yellow box or text = pipeline)
3
Scale and Market Growth
Server infrastructure doubling every year
Network capacity: 9x growth in four years
Tripling of data centre instances
Dramatic expansion of server & network geo-diversity
Managed growth of power capacity & consumption
Source: http://www.internetworldstats.com
4
Data Centre Economics Have Changed!
Cost of physical space was once the primary consideration in data centre design
Cost of power and cooling has risen to prominence
Data centre managers must now prioritize investment in efficient power and cooling systems to lower the total cost of ownership (TCO) of their facilities
Belady, C., "In the Data Center, Power and Cooling Costs More than IT Equipment it Supports", Electronics Cooling Magazine (Feb 2007)
5
Site Selection
Internet population
Internet peering / network
Mobile users
Power pricing
Environmental factors
Construction costs
Tax climate
IT labor availability
Corporate citizenship
Composite heat map
6
Why Power Matters…
In 2006, U.S. data centres consumed an estimated 61 billion kilowatt-hours (kWh) of energy, about 1.5% of the total electricity consumed in the U.S. that year
In the EU, data centres consumed an estimated 56 billion kWh in 2007
As an industry segment, data centres are the fastest-growing energy segment in the US
Current projections are that data centre power consumption will exceed 100 billion kWh by 2011 in the US and by 2020 in the EU
That level of power consumption in the US would necessitate the construction of 10 additional power plants
7
Relevant Metrics at Microsoft
PUE / DCiE (The Green Grid): PUE = total facility power / IT equipment power; DCiE = (IT equipment power / total facility power) x 100%
DC utilization
Server utilization
Cost: moving from Cost = f(space) to Cost = f(power)
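For concreteness, here is a minimal sketch of how the two Green Grid metrics are computed; the meter readings are invented for illustration and do not come from the deck.

```python
# Minimal sketch of The Green Grid's headline metrics.
# Both meter readings below are made-up example values.

total_facility_power_kw = 1500.0  # everything the utility feed supplies
it_equipment_power_kw = 1000.0    # servers, storage, and network gear only

# PUE: total facility power over IT power (1.0 is ideal, lower is better)
pue = total_facility_power_kw / it_equipment_power_kw

# DCiE: the reciprocal of PUE, expressed as a percentage
dcie = (it_equipment_power_kw / total_facility_power_kw) * 100

print(f"PUE  = {pue:.2f}")    # PUE  = 1.50
print(f"DCiE = {dcie:.1f}%")  # DCiE = 66.7%
```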
8
SCRY
9
Setting Aggressive PUE Targets
10
Environmental Control Standards
11
Where Data Centre Power Goes
GFS' Infrastructure Services is focusing on all the pieces of the pie. Opportunities are everywhere.
Source: EYP Mission Critical Facilities Inc., New York
12
Where Data Centre Power Goes
GFS' Infrastructure Services is focusing on all the pieces of the pie. Opportunities are everywhere:
Widening the environmental envelope can remove chillers and drive the cooling slice to zero
Offline UPS technologies can drive electrical losses substantially down
Virtualization and active power management shrink the IT load itself
Source: EYP Mission Critical Facilities Inc., New York
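To illustrate why each slice matters, here is a hedged back-of-the-envelope PUE model. The overhead fractions are assumptions for illustration only; they are not the values from the EYP chart.

```python
# Toy PUE model. The overhead figures below are illustrative assumptions,
# not the EYP Mission Critical Facilities breakdown.

it_load_kw = 1000.0
overheads_kw = {
    "chillers": 350.0,    # removable by widening the environmental envelope
    "ups_losses": 100.0,  # reducible with offline/eco-mode UPS topologies
    "fans_other": 150.0,
}

def pue(it_kw, overheads):
    """Total facility power divided by IT power."""
    return (it_kw + sum(overheads.values())) / it_kw

print(f"baseline PUE:     {pue(it_load_kw, overheads_kw):.2f}")  # 1.60

# Drive the chiller slice to zero, as the slide suggests:
overheads_kw["chillers"] = 0.0
print(f"chiller-free PUE: {pue(it_load_kw, overheads_kw):.2f}")  # 1.25
```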
13
Three of our Data Centres
14
Data Centre Costs in the US
Land: 2%
Core & shell costs: 9%
Architectural: 7%
Mechanical / electrical: 82%
Since 2004: a 16% year-over-year increase
Where the costs are: >80% scale with power, <10% scale with space
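Applying the quoted breakdown to an assumed total budget makes the "costs scale with power" point concrete; the $200M figure below is purely illustrative.

```python
# Allocate an assumed data centre capital budget using the percentages
# quoted on this slide. The $200M total is an illustrative figure.

total_capex_usd = 200_000_000
breakdown = {
    "land": 0.02,
    "core_and_shell": 0.09,
    "architectural": 0.07,
    "mechanical_electrical": 0.82,
}

for item, share in breakdown.items():
    print(f"{item:>22}: ${total_capex_usd * share:>13,.0f}")

# Mechanical/electrical plant is sized by deliverable power, not floor area,
# which is why >80% of cost scales with power and <10% with space.
```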
15
SCRY – Window to Our World
16
SCRY Helps Demonstrate Continuous Improvement
22% improvement over 3 years, following Moore's Law, on existing data centres
Helps set goals for new data centres at Microsoft
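Read as a compound annual rate, the 22%-over-3-years headline works out to roughly 8% per year; this is simple arithmetic on the slide's number, not a SCRY output.

```python
# Convert the slide's "22% improvement over 3 years" into a compound
# annual improvement rate.

total_improvement = 0.22
years = 3

annual_rate = 1 - (1 - total_improvement) ** (1 / years)
print(f"~{annual_rate * 100:.1f}% improvement per year")  # ~7.9%
```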
17
Where We Think Things Are Going …
18
19
20
Futures – Containers (Chicago)
21
Why We Like Containers
1) Can deploy at scale
2) Plug and play
3) Drives innovation
4) Abstracts away religious wars in competitive bid (AC vs DC, air vs liquid)
5) Cost can include maintenance
6) Allows for easy cost and performance measurement
7) Creates an environment to drive competition around efficiency
8) One throat to choke
Question: Is this water cooling or air cooling?
22
Container Solutions
Use a standard ISO shipping container (40', 20', or 10' x 8' x 8'6") to hold servers
Portability allows the delivery and operation of servers in self-contained units
Moves costs from long-lead to short-lead equipment, increasing return on capital
Optimizes server delivery: 1,000U+ as a single unit vs. 40+U in a rack, with a single SKU & warranty
Containers are often seen as a solution for burst demand and temporary capacity; Microsoft's approach is different: use them as the primary packaging unit
Cost: it costs less to ship 2,000+ servers in one container than to ship and then install individual racks. Additional savings come from not needing raised floors or per-server fans, and from requiring far less wiring, packaging, and shipping.
23
Container Solutions
The container gives us the opportunity to test new technology such as increased supply air temperature, removal of fans from servers, managed airflow with hot-aisle containment, significantly increased watts per square foot (WPSF), and more efficient electrical distribution
Microsoft has standalone units running in production today and is running proofs of concept on newer technology
Energy efficiency: at more than 1,000 watts per square foot, containers allow us to power far more servers in a given area. This is the key reason containers have become economically viable
PUE in a proof of concept measured ~1.3 at peak
24
Container POC
GFS DCS ran a proof of concept on a container system in Seattle, Washington, USA
PUE came in between 1.2 and 1.3
Ran the unit up to full load, measured at 107 kW (178 watts per server)
Dropped power to the unit; it came back online with no issues
Fans and servers ran at full speed for 7 minutes on batteries
Container spec completed and vendor RFP underway
Battery temperature remained at 75 F (measured with a probe and an IR camera); the back is exposed to 85 F. A temperature probe still needs to be placed at the rear of the battery
Ambient air temperature had a large effect on the temperature inside the trailer because the trailer has no insulation
Permit and UL inspection took 90 days to obtain
Harmonics above 15%, varying across all 3 phases
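A quick arithmetic check of the POC figures quoted above, assuming the 178 W/server reading applies at the 107 kW full-load point:

```python
# Sanity-check the POC numbers: 107 kW at full load, 178 W per server.

full_load_kw = 107.0
watts_per_server = 178.0

servers = full_load_kw * 1000 / watts_per_server
print(f"~{servers:.0f} servers in the container")  # ~601

# At the measured PUE range, total facility draw for this unit would be:
for pue in (1.2, 1.3):
    print(f"PUE {pue}: ~{full_load_kw * pue:.0f} kW total")  # 128 / 139 kW
```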
25
Virtual Earth Case Study
Timeline: started with the container in late August; PO approved in October; first unit live in January. 5 months from engineering to live
Delays encountered:
1 month: flood plain re-plan (new location, elevated foundation)
1 month: Xcel Energy transformer install
1 month: final concrete delayed due to snow
Actual planning, permitting, and construction effort totaled about 3 months
First container live Jan 5th, third container Feb 2nd
Container vendor committing to a 6-8 week turnaround on orders long term
26
Containers - Chicago Data Centre
Elevated spine connection
Microsoft container: 2,400 servers, 375 kW, standard 40-foot shipping container, target PUE 1.25
Top floor: 10.8 MW traditional colo capacity
Ground floor: 20 MW container capacity
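Some figures that follow from the quoted Chicago specs; the container count assumes the full 20 MW ground-floor capacity is filled with identical 375 kW containers, which the slide does not state explicitly.

```python
# Derived figures from the Chicago container specs (arithmetic only).

servers_per_container = 2400
container_it_kw = 375.0
target_pue = 1.25
ground_floor_mw = 20.0  # assumed here to be all container IT capacity

print(f"{container_it_kw * 1000 / servers_per_container:.0f} W per server")               # ~156 W
print(f"{container_it_kw * target_pue:.0f} kW total draw per container at PUE 1.25")      # ~469 kW
print(f"~{ground_floor_mw * 1000 / container_it_kw:.0f} containers on the ground floor")  # ~53
```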
27
But More Change Is Coming…
28
Generation 4 Modular Data Centres: Challenging Many Data Centre Conventions
Prefabrication of shell and M&E plant
Pre-assembled containers
Power density > 1,000 watts / square foot
Totally flexible configuration (classes)
PUE < 1.1 (depending on class)
3-5 month time-to-market
Reduced cost of entry
Applying the Model-T approach
http://loosebolts.wordpress.com
http://blogs.technet.com/msdataCentres/