Slide 1: Leadership Computing Facility (LCF) Roadmap
Presented by Buddy Bland, Center for Computational Sciences, Leadership Computing Facility Project
Slide 2: Outline
- Systems
- Facilities upgrade
- Systems infrastructure: overview, networking, storage
- Software and science
Slide 3: Systems (CCS firsts, 1991–2008)
Slide 4: Facilities upgrade
Preparing the computer center for the next generation of Jaguar:
- Floor raised by 1 ft
- New chilled-water piping to support up to 7 MW of power
- New air-handling system
- New power distribution units
Slide 5: Systems infrastructure: overview (current and projected)

                          FY 2007   FY 2008   FY 2009
Networking
  External B/W (GB/s)         3         4         5
  LAN B/W (GB/s)             60       140       240
Archival storage
  Capacity (PB)               4        10        18
  Bandwidth (GB/s)            4        10        19
Central storage
  Capacity (PB)               0.22      1.0      10.0
  Bandwidth (GB/s)           10        60       240
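These projections are easier to interpret as transfer times. The following back-of-the-envelope calculation is a sketch added here, not taken from the slides; it assumes decimal units (1 PB = 10^6 GB) and idealized sustained rates:

```python
# Back-of-the-envelope: time to move each storage tier's full capacity
# at its quoted peak bandwidth (idealized; real transfers run slower).

tiers = {
    # name: (capacity in PB, bandwidth in GB/s) -- FY 2009 projections from the slide
    "Central storage (FY09)":  (10.0, 240),
    "Archival storage (FY09)": (18.0, 19),
}

for name, (capacity_pb, bw_gbs) in tiers.items():
    capacity_gb = capacity_pb * 1_000_000      # 1 PB = 10^6 GB (decimal units assumed)
    hours = capacity_gb / bw_gbs / 3600
    print(f"{name}: about {hours:.1f} hours to move {capacity_pb} PB at {bw_gbs} GB/s")
```

At these rates, draining the FY 2009 central file system would take roughly half a day, while the tape archive would take on the order of weeks, which is why the tape and disk sides are scaled together to keep the system balanced.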
Slide 6: Systems infrastructure: network
Shifting to a hybrid InfiniBand/Ethernet network:
- An InfiniBand-based network helps meet the bandwidth and scaling needs of the center
- The wide-area network will scale to meet user demand using currently deployed routers and switches
Roadmap: 2007: 100 TF system, 60 GB/s LAN, 3 GB/s WAN; 2008–2009: 1000 TF, 240 GB/s LAN, 5 GB/s WAN.
Slide 7: Systems infrastructure: network
Consistent planned growth in ORNL external network bandwidth. [Chart: ORNL and LCF backbone connectivity]
Slide 8: Systems infrastructure: storage (archival)
- HPSS software has already demonstrated the ability to scale to many petabytes
- Add two silos per year
- Tape capacity and bandwidth, and disk capacity and bandwidth, are all scaled to maintain a balanced system
- Use new methods to improve data-transfer speeds between the parallel file systems and the archival system (a scripted-transfer sketch follows this slide)
Archival storage roadmap: 2007: 100 TF system, 4 PB, 4 GB/s; 2008: 1000 TF, 10 PB, 10 GB/s; 2009: 1000 TF, 18 PB, 19 GB/s.
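The slides do not spell out those new methods, but HPSS sites commonly script transfers with the standard HPSS clients hsi and htar. The following is a minimal sketch, assuming hypothetical Lustre and HPSS paths that are not from the presentation, of bundling a run directory into a single HPSS archive with htar so tape drives stream instead of handling many small files:

```python
# Minimal sketch: archive a simulation output directory from the parallel
# file system into HPSS with htar (HPSS's bundling client).
# The paths below are hypothetical; site-specific options and layout differ.
import subprocess

scratch_dir = "/lustre/scratch/myrun"             # hypothetical Lustre path
hpss_archive = "/hpss/myproject/myrun.tar"        # hypothetical HPSS path

# htar bundles many small files into one HPSS archive member, one of the
# usual ways to keep transfers between a parallel file system and the
# archive running at tape speed.
result = subprocess.run(
    ["htar", "-cvf", hpss_archive, scratch_dir],
    capture_output=True, text=True,
)
if result.returncode != 0:
    raise RuntimeError(f"htar failed: {result.stderr}")
print(result.stdout)
```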
Slide 9: Systems infrastructure: storage (central)
- Increase scientific productivity by providing a single repository for simulation data
- Connect to all major LCF resources
- Connect to both the InfiniBand and Ethernet networks
- Potentially becomes the primary file system for the 1000 TF system (a striping sketch follows this slide)
Central storage roadmap: 2007: 100 TF system, 250 TB, 10 GB/s; 2008: 1000 TF, 1 PB, 60 GB/s; 2009: 1000 TF, 10 PB, 240 GB/s.
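The roadmap on slide 11 ties this center-wide file system to Lustre (the CFS Lustre Center of Excellence and LCF-wide file system items). Approaching aggregate bandwidths of this order from a single job generally requires striping files across many object storage targets, which the standard lfs utility controls. The sketch below uses a hypothetical directory path and stripe count chosen for illustration, not LCF policy:

```python
# Minimal sketch: set a wide Lustre stripe count on an output directory so a
# parallel job can approach the file system's aggregate bandwidth.
# Directory path and stripe count are hypothetical examples.
import subprocess

output_dir = "/lustre/widefs/myproject/run42"   # hypothetical center-wide FS path

# New files created in this directory inherit the striping (here, 32 OSTs).
subprocess.run(["lfs", "setstripe", "-c", "32", output_dir], check=True)

# Verify the layout that new files will inherit.
subprocess.run(["lfs", "getstripe", output_dir], check=True)
```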
Slide 10: Software and science: overview
Cutting-edge hardware lays the foundation for science at the petascale: scientists using a production petascale system with petascale application-development software.
- Establishing a fully integrated computing environment
- Developing the software infrastructure to enable productive utilization and system management
- Empowering scientific and engineering progress and allied educational activities using the petascale system
- Developing and educating the next generation of computational scientists to use the petascale system
The CCS Management Plan coordinates the transition to petascale production.
Slide 11: Software and science: roadmap to deliver Science Day 1
Timeline 2006–2009, tracking hardware milestones of 50 TF, 100 TF, 250 TF, and 1000 TF:
- HW: 50 TF (2006), 100 TF (2007), 250 TF (2008), 1000 TF (2009), leading to Science Day 1
- Testbeds: expand; quad-core; scale tests using Jaguar; expand
- OS (key releases): dual-core LWK; quad-core Catamount; quad-core Linux LWK; Baker LWK; petascale LWK
- File system: establish the CFS Lustre Center of Excellence at ORNL; SIO Lustre clients and an external cluster of Lustre servers; XT4 LW IO interface; MPI-Lustre failover; LCF-wide file system
- RAS: hardware supervisory system; Mazama system; fault monitor and prediction system; C/R (checkpoint/restart) on XT3 dual-core, on 100 TF, on 250 TF, and on 1 PF (a checkpoint/restart sketch follows this slide)
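The C/R items in the RAS lane refer to checkpoint/restart support on the Cray XT platforms. As a generic illustration of the idea only, application-level rather than the system-level C/R the roadmap tracks, and with hypothetical file and variable names, a job can periodically save its state so it resumes after a failure:

```python
# Minimal sketch of application-level checkpoint/restart: periodically save
# solver state so a job can resume after a node failure. All names here are
# hypothetical; this only illustrates the general C/R concept.
import os
import pickle

CHECKPOINT = "state.chk"

def save_checkpoint(step, state):
    tmp = CHECKPOINT + ".tmp"
    with open(tmp, "wb") as f:
        pickle.dump({"step": step, "state": state}, f)
    os.replace(tmp, CHECKPOINT)        # atomic rename: a crash never leaves a torn file

def load_checkpoint():
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT, "rb") as f:
            data = pickle.load(f)
        return data["step"], data["state"]
    return 0, {"field": 0.0}           # no checkpoint yet: fresh start

step, state = load_checkpoint()
while step < 1000:
    state["field"] += 1.0              # stand-in for one timestep of real work
    step += 1
    if step % 100 == 0:
        save_checkpoint(step, state)
```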
Slide 12: Science drivers
- Advanced energy systems (e.g., fuel cells, fusion)
- Biotechnology (e.g., genomics, cellular dynamics)
- Environmental modeling (e.g., climate prediction, pollution remediation)
- Nanotechnology (e.g., sensors, storage devices)
"Computational simulation offers to enhance, as well as leapfrog, theoretical and experimental progress in many areas of science and engineering…" [A Science-Based Case for Large-Scale Simulation (SCaLeS Report), Office of Science, U.S. DOE, July 2003]
Slide 13: Software and science: fusion
Slide 14: Software and science: biology
Slide 15: Software and science: climate
Slide 16: Software and science: nanoscience
Slide 17: Contacts
Arthur S. Bland
Leadership Computing Facility Project Director
Center for Computational Sciences
(865) 576-6727
blandas@ornl.gov