NCAR Supercomputing ‘Data Center’ Project
An NCAR-led computing ‘facility’ for the study of the Earth system
Outline
- A Problem
- An Opportunity
  – NSF’s Petascale Roadmap
- A Solution
  – Facility Proposal: Site and Cost
  – Partners
- The Scientific Payoff
- Next Steps
  – Schedule and Panels
NCAR Leadership in Supercomputing…
One of the founding missions of NCAR was: “… to provide, or arrange for provision of, facilities for the scientific community as a whole whose initial cost and upkeep lie beyond the capability of individual universities or research groups.” – Preliminary Plans for a National Institute for Atmospheric Research, 1959 – NCAR Blue Book
Note: the wording does not imply physical collocation. This mission does confer a responsibility that cannot be delegated – namely, maintaining a complete, integrated cyberinfrastructure (CI) system for modeling and data analysis that meets our scientific community’s needs.
Examples of NCAR simulation science today
- Global climate change ensembles
- Weather Research and Forecasting (WRF)
- Geophysical turbulence
- Fire storm front modeling
- Space weather
- More…
A problem
- The NCAR Mesa Lab computer facility is quickly becoming obsolete
- Power, cooling and floor space will be inadequate beyond the next procurement
- Science is being restricted by focusing on capacity ahead of capability
CMOS Trends Continue …
Chips: Faster, Cheaper but Hotter
An Opportunity
NSF’s Petascale Roadmap: “Overarching Recommendation: Establish a Petascale Collaboratory for the Geosciences with the mission to provide leadership-class computational resources that will make it possible to address, and minimize the time to solution of, the most challenging problems facing the geosciences.”
NSF Conclusions
- NSF is committed to developing and implementing a strategic plan for cyberinfrastructure
  – A broad-based plan involving the universities, Federal agencies, vendors, and international partners
- ATM, OCE, and EAR take different approaches to the realization of CI for their disciplines
  – Dependent on the readiness of each community
- A petascale facility is an integrating theme for the Geosciences community
  – High potential for the generation of new knowledge and a new paradigm for the conduct of research
  – Building and sustaining a petascale facility will be a significant challenge for budgets and technology
  – Consistent with NSF’s strategic vision for CI
A solution for NCAR
- A new computing facility (not at the Mesa Lab)
- Extensive investigations, working with consultants and assessing internal needs, resulted in a detailed set of options
- Provides for 5-20 years of computing (capacity and capability) diversity, based on current and predicted trends in CMOS technology
- Allows NCAR to reach beyond its current research scope
The facility needed
From the Data Center Expansion Report by NCAR’s Computing and Information Systems Lab:
- 20,000 sq. ft. initially, expandable to 60,000
- 4 MW of power (expandable to 13 MW) plus generators
- Cooling, etc.
- On 13 acres (20-year lifetime)
- Accommodates computers, staff, open space, and both initial and future requirements
Bird’s Eye View
Importance of Site Selection
- Limited selection of sites that meet the criteria:
  – Size (10-15 acres)
  – Electrical capacity (up to 24 MW)
  – Fiber optic route (dark fiber)
- Investigated: Marshall, Louisville, Longmont, Westminster
- New partners and options are now being sought: IBM, Colorado School of Mines
- (Slide annotations on site concerns: water, political complications, fiber optics; electrical capacity)
Cost Drivers
- Primary drivers:
  – Tier III reliability (mechanical systems, electrical systems)
  – Engineering
- Secondary drivers:
  – Building size
  – Land site
Costs:
- Facility – up to $75M (one time)
- Operations – $15M/year? (2X)
- Computing increments – $15M/year (2X)
- Computing infrastructure – $5M/year
The Scientific Payoff…
A petascale computer will enable scientists to…
- Do credible regional climate modeling for decision support; this requires resolving individual mountain ranges and ocean western boundary currents.
- Model climate and weather in a fully coupled mode.
- Better understand the marine biogeochemical cycles; this requires resolving ocean mesoscale eddies.
- Accurately simulate the dynamical, microphysical and radiative cloud processes.
- Improve seismic predictions and understand the structure of the inner core as well as the fine structure of the lower mantle.
A petascale computer will enable scientists to…
- Perform new research in solid earth and environmental engineering.
- Assimilate thousands of earthquakes, bringing the fine structure of the Earth’s mantle and inner core into focus.
- Study the physical basis of land surface parameterizations by modeling soils, topography and vegetation at sub-meter scales.
- More accurately predict the damaging effects of solar flares on satellites and power distribution systems by resolving the fine structure of the coronal magnetic field.
- Investigate energy management applications.
Science examples …
2005 Hurricane Katrina, Track Forecast
(Figure: 4 km, 62 h forecast and 12 km, 86 h forecast tracks compared with the observed track and the official forecast; landfall on 8/29 14Z at the Louisiana/Mississippi border)
2005 Hurricane Katrina, Track Forecast
(Figure: WRF ARW tracks – 4 km WRF initialized 8/27 00Z and 12 km WRF initialized 8/26 00Z; landfall on 8/29 14Z at the Louisiana/Mississippi border)
Hurricane Katrina Reflectivity at Landfall, 29 Aug 2005 14Z
(Figure: mobile radar observation vs. 4 km WRF, 62 h forecast)
Hurricane Katrina 72 h Forecast, Initialized 27 Aug 2005 00Z
(Figure: radar composite reflectivity vs. WRF 4 km maximum reflectivity)
WRF 4 km Moving-Grid Katrina Wind Forecast, Initialized 27 Aug 2005 00Z
(Figure: wind speed in m/s, with Saffir-Simpson categories – Cat 1: 33-42, Cat 2: 42-50, Cat 3: 50-59, Cat 4: 60-69, Cat 5: >69)
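Below is a minimal sketch (not from the slides) of how wind speeds like those plotted in this figure map onto the legend’s Saffir-Simpson categories; the legend’s ranges overlap slightly at their boundaries, so the sketch treats them as lower bounds, and the example speeds are illustrative.

```python
# Classify a sustained wind speed (m/s) into the Saffir-Simpson categories
# listed in the figure legend above. Category 0 means below hurricane strength.

def saffir_simpson_category(wind_ms: float) -> int:
    """Return the Saffir-Simpson category for a sustained wind in m/s."""
    lower_bounds = [33, 42, 50, 60, 69]  # Cat 1..5 lower bounds from the legend
    category = 0
    for bound in lower_bounds:
        if wind_ms >= bound:
            category += 1
    return category

# Illustrative speeds only, not values from the forecast shown above.
for w in (30, 45, 62, 75):
    print(f"{w} m/s -> Cat {saffir_simpson_category(w)}")
```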
Hurricane Katrina 24 h Accumulated Precipitation Forecast
(Figure: 4 km moving nested grid forecasts – 60 h forecast valid 8/29 12Z and 36 h forecast valid 8/30 12Z – compared with Stage IV observed 24 h precipitation)
Hurricane Ophelia Approaching Landfall, 4 km WRF forecast initialized 12 Sept 2005 00Z
(Figure: Wilmington radar vs. 63 h reflectivity forecast, valid 14 Sept 2005 15Z; Morehead City radar vs. 72 h reflectivity forecast, valid 15 Sept 2005 00Z)
72 h Ophelia Wind Forecast, Initialized 12 Sept 2005 00Z
72 h Rita Wind Forecast Initialized 19 Sep 2005 00 Z
Coupled Climate System Model
Integrated Space Weather Modeling
Thus …
Main Points
- Huge scientific discoveries await geoscience modelers at 1 PFLOPS and beyond.
- CMOS continues to get hotter and cheaper; the most recent acquisition tracks this trend.
- Every center is (or will be) facing facility challenges in the race to these discoveries. This situation is NOT unique to NCAR.
- NCAR now has a facility plan that, if successful, uniquely positions it as a world leader in geoscience simulation.
- The new facility is not a crisis: it is an opportunity.
The Opportunity
- Understanding of fundamental physical processes in the Sun-Earth system
- Environmental and energy applications not yet possible
- NCAR and partners will scope/define these options
  – Such a facility would be a computational equivalent of the Hubble Telescope for geoscience simulation.
Next Steps
The Schedule
- Formed NCAR project committee
- Form Blue Ribbon Panel and hold workshop – early Oct. 2005
- Project plan development – Oct-Dec
- Community engagement – Nov-Jan
- Formalize partnerships – Oct-Dec
- Present initial plan to UCAR Board and National Science Foundation – early October 2005
- Initiate facility – June 2006?
- First electrons – June 2008 to March 2009?
The Blue Ribbon Panel
- Rick Anthes
- Kelvin Droegemeier
- Tamas Gombosi
- Thomas Jordan
- Jean-Bernard Minster
- John Orcutt
- Tim Palmer
- Shukla
- David Yuen
- Plus some to be confirmed
Contacts at NCAR
- Tim Killeen (killeen@ucar.edu) – NCAR Director
- Lawrence Buja (southern@ucar.edu) and Peter Fox (pfox@ucar.edu) – co-chairs of the NCAR project team
- Aaron Anderson (aaron@ucar.edu) – computing facilities contact
- Jeff Reaves (jreaves@ucar.edu) – financial/contracts contact
Concluding remarks …
NCAR Strategy to Support Leadership Class Computing …
Design Strategy
- Commit to the minimal amount of infrastructure that makes sense (5-year term)
  – 2 MW electrical with matching mechanical
  – Scalable to 4 MW electrical with matching mechanical
  – Modest office & support space
  – Basic finishes
- Build the core and shell to support a 20-30 year life or beyond
  – Don’t compromise on the physical space
  – Allow for in-place retrofit
High Level Concepts
- Land for 20 years, building shell for 10, outfit for 5
- The strategy preserves the initial capital investment as computing hardware evolves rapidly
- Especially attractive for data-center construction, since 50% to 75% of the total cost is in mechanical and electrical outfitting
Facility Design Concept
(Diagram: an initial module of 20,000 sq. ft. and 4 MW, containing cooling & electrical, computers, and office space, expanding laterally with Module 2 and Module 3, each 20,000 sq. ft. and 4 MW)
The Conceptual Design …
Architectural View
(Rendering: initial building with Phase 2 and Phase 3 additions)
Other Key Design Features
- The practical limit of air cooling is about 150 Watts/sq. ft. (see the sketch after this list)
  – The design is capable of both air- and water-based cooling
  – Consistent with current computing trends
- Completely free span
  – Virtually all of the raised floor is usable for computing and mass storage equipment
- Systems of this size have massive wiring requirements
  – Power is deliverable both overhead and under floor
  – Networking and data cabling also run overhead and under floor
- Hydronic free cooling
  – Our cool, dry climate allows up to 3,500 hours of “free” cooling per year
  – Environmentally friendly
  – Operational cost savings
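As a rough illustration of why the ~150 W/sq. ft. air-cooling limit matters for a 4 MW module, the sketch below compares a module’s heat load with what its machine floor could reject by air alone. The module area and power come from the facility slides; the fraction of floor given over to equipment is an assumed, illustrative number.

```python
# Rough check: can one facility module reject its heat load with air cooling
# alone? Module figures are from the slides; MACHINE_FLOOR_FRACTION is an
# assumption made for illustration.

AIR_COOLING_LIMIT_W_PER_SQFT = 150.0  # practical air-cooling limit (slide)
MODULE_AREA_SQFT = 20_000.0           # one facility module (slide)
MODULE_POWER_W = 4e6                  # 4 MW electrical per module (slide)
MACHINE_FLOOR_FRACTION = 0.6          # assumed share of floor holding equipment

def air_cooling_headroom_w(machine_area_sqft: float, load_w: float) -> float:
    """Watts of air-cooling capacity left over; negative means air alone fails."""
    return machine_area_sqft * AIR_COOLING_LIMIT_W_PER_SQFT - load_w

machine_area = MODULE_AREA_SQFT * MACHINE_FLOOR_FRACTION
headroom = air_cooling_headroom_w(machine_area, MODULE_POWER_W)
print(f"Air-coolable load: {machine_area * AIR_COOLING_LIMIT_W_PER_SQFT / 1e6:.1f} MW")
print(f"Module load:       {MODULE_POWER_W / 1e6:.1f} MW")
print(f"Headroom:          {headroom / 1e6:+.1f} MW")  # negative -> water cooling needed
```

With these assumed numbers the air-coolable load is 1.8 MW against a 4 MW module, which is why the design provides for water-based cooling as well.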
The greater context …
NSF’s Cyberinfrastructure Planning, plus A Strategic Plan for High Performance Computing
Courtesy of Cliff Jacobs (NSF) and Deborah L. Crawford, Acting Director, Office of Cyberinfrastructure
NSF “Vision” Document
- A: Call to Action
- B: Strategic Plan for High Performance Computing (Aug. 1, 2005)
- C: Strategic Plan for Data (Nov. 15, 2005)
- D: Strategic Plan for Collaboration & Communication (Jan. 15, 2006)
- E: Strategic Plan for Education & Workforce Development (Jan. 15, 2006)
NSF finalizes the Cyberinfrastructure Vision Document on March 31, 2006
Creating a World-Class HPC Environment To Enable Petascale Science and Engineering
Strategic Plan for High Performance Computing (FY 2006 – 2010)
Strategic Plan for High Performance Computing (2006-2010)
(Diagram: the S&E community, agency partners, and the private sector surround science-driven HPC systems – compute engines, local storage, visualization facilities – provided by HPC resource providers, with portable, scalable applications software & services supplied by Software Service Providers (SSPs))
HPC Implementation Milestones (FY 2006-2007)
(Timeline chart spanning quarters from Oct-Dec ’05 to Jul-Sep ’07, with tracks for: specifying requirements for science-driven architectures; mid-range systems acquisition; the Leadership Computing Council; leadership system acquisition; a supporting software program; and a scalable applications software program. Milestones include releasing solicitations, making award(s), Phase I award(s), working with academic and vendor communities, establishing a Software Services Council, and ongoing annual activity.)
NSF directorate efforts
- Atmospheric Sciences (ATM)
- (Solid) Earth Sciences (EAR)
- Ocean Sciences (OCE)
Each has reports and recommendations, which led NSF to provide an over-arching strategy for the Geosciences.
Vision
- A computing facility spanning multiple (computer) generations, providing an end-to-end capability and capacity to address some of the most challenging environmental problems that face us today
- The people, expertise, partners and community working synergistically
The Benefit
- Unprecedented capability and capacity in a single virtual facility
- Opportunity to build new/stronger partnerships and relations (local, national and international)
- Jobs in the Front Range
- Resources for Colorado institutions to perform new work and attract new staff/students
Colorado: The place to compute
- Connected, concerned, and active
- World-class facility as part of a world-respected organization
- Strong local connections
- Strong community leadership and partnership
- Strong commercial connections
- Pushing technology development for Colorado: small and large companies
The Expertise
- NCAR – community facilitator and integrator (people)
- 45 years of building among the most advanced simulation, assimilation, analysis, and visualization codes/tools
- 45 years of direct benefit to the university and related community
The Mission for NCAR computing
- To reach out beyond NCAR’s traditional fields of geoscience
- To provide the intellectual leadership and infrastructure
- To build partnerships with academia, government and private industry to serve their communities
- To build, install, operate and maintain a world-class supercomputing facility
The Potential Partners
- Colorado School of Mines
- CU/Boulder
- IBM
- CSU
- NOAA/NIST
- USGS
- UK Met Office
Additional facility materials
Machine Room Power Budget: Upcoming Procurement
- Baseline + 50 kW safety margin: 200 kW
- Other supers (neglecting bluesky): 150 kW
- Bluevista: 270 kW (estimate)
- Power available for new system: 580 kW
- Max allowable power-price ratio (2007): 39 kW/$M
- Bluevista power-price ratio: 52 kW/$M
- By 2010 the power-price ratio may be 2.45 times larger (i.e. worse)
Conclusion: the 2007 procurement is borderline; a circa-2010 procurement will likely be unsupportable, in terms of power, at the Mesa Laboratory. (See the sketch below.)
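A minimal sketch of the arithmetic behind this slide. The line items and ratios are from the slide; the machine-room total capacity is an inference (the three listed loads plus the stated available power sum to 1,200 kW), and the budget figure is a hypothetical example loosely based on the $15M/year computing increments mentioned earlier.

```python
# Mesa Lab machine-room power budget, per the slide. TOTAL_CAPACITY_KW is
# inferred (listed loads + available power = 1200 kW); BUDGET_M is hypothetical.

BASELINE_PLUS_MARGIN_KW = 200.0  # baseline + 50 kW safety margin (slide)
OTHER_SUPERS_KW = 150.0          # other supers, neglecting bluesky (slide)
BLUEVISTA_KW = 270.0             # bluevista, estimate (slide)
TOTAL_CAPACITY_KW = 1200.0       # assumed room capacity, inferred from the slide

MAX_RATIO_2007_KW_PER_M = 39.0   # max allowable power-price ratio, 2007 (slide)
RATIO_GROWTH_BY_2010 = 2.45      # projected worsening of the ratio (slide)

available_kw = TOTAL_CAPACITY_KW - (
    BASELINE_PLUS_MARGIN_KW + OTHER_SUPERS_KW + BLUEVISTA_KW
)
print(f"Power available for new system: {available_kw:.0f} kW")  # 580 kW

BUDGET_M = 15.0  # $M, hypothetical example
draw_2007_kw = BUDGET_M * MAX_RATIO_2007_KW_PER_M
draw_2010_kw = draw_2007_kw * RATIO_GROWTH_BY_2010
print(f"2007: a ${BUDGET_M:.0f}M system may draw ~{draw_2007_kw:.0f} kW (borderline)")
print(f"2010: the same budget may draw ~{draw_2010_kw:.0f} kW (unsupportable)")
```

A $15M system at 39 kW/$M draws about 585 kW against the 580 kW available, which is the “borderline” 2007 case; scaling the ratio by 2.45 for 2010 yields roughly 1,430 kW, far beyond what the Mesa Lab room can supply.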
Site Selection Drives Schedule
(Timeline chart: the ML computer room is at maximum capacity, so the computing system from Procurement B must go elsewhere)
Reliability Drivers
- Designed as a Tier III facility
  – Concurrently maintainable (eliminates twice-yearly shutdowns, improves maintenance schedules)
  – Redundant paths
  – Redundant systems: N+1 (the required number of components plus one additional, to maintain redundancy; see the sketch after this list) and S+S (identical systems plus systems)
- Drivers for a Tier III facility
  – As devices continue to shrink, sensitivity to environmental excursions becomes more serious
  – Expectations of the user community: critical times of year for critical projects
  – A portfolio of applications with more real-time or near-real-time forecasting
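To make the N+1 idea in the list above concrete, here is a minimal sketch (not from the slides) of how one spare unit changes the chance that enough components survive, assuming independent failures; the component count and per-unit failure probability are made-up illustrative numbers.

```python
# Probability that an N+M redundant system still meets its load, assuming
# independent unit failures. All numeric inputs below are illustrative.
from math import comb

def availability(n_required: int, n_spares: int, p_fail: float) -> float:
    """P(at least n_required of the n_required + n_spares units are up)."""
    total = n_required + n_spares
    return sum(
        comb(total, k) * (1.0 - p_fail) ** k * p_fail ** (total - k)
        for k in range(n_required, total + 1)
    )

p = 0.02  # assumed chance any one unit is down at a given moment
for spares in (0, 1):  # plain N vs. N+1
    print(f"4 units + {spares} spare: {availability(4, spares, p):.2%} available")
# With these numbers, the shortfall probability drops from ~7.8% (N) to ~0.4% (N+1).
```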
Cost Breakdown
(Chart notes: includes the backbone for 4 MW but only half the electrical equipment (2 MW); excludes inflation and additional data center modules)
Petascale Future
- Leadership-class computing will require a leadership-class facility
- This design can scale along with computing to the PetaFLOP level
  – Without an initial petascale cost
  – Managing the risks (technology changes, funding) as part of the strategy
  – A significant competitive advantage
- Those who will be asked to do this will have to have a solid plan; this plan can get us there
Additional NSF materials
Recommendations
- Computer Services
  – Procure computing systems on a three-year technical refresh cycle, with the first procurement in 2007 and project completion in 2012
  – Phase I: two to three terascale systems (each 50-100 TFLOPS peak, approximately 200 TFLOPS aggregate)
  – Phase II: procure a new “centerpiece” system with 1 PFLOPS peak processing speed
- Data and Visualization Services
  – Equip the petascale collaboratory with storage systems capable of archiving approximately 100 PBytes per year of operation (see the sketch after this list)
- Interface and Collaboration Services
  – Leverage the technical expertise and cyberinfrastructure in existing national research computing facilities and universities by creating collaboratory nodes at centers with specialized domain expertise
  – Approximately 75 new hires at the petascale facility and the collaboratory nodes
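As a back-of-the-envelope check on what “approximately 100 PBytes per year” implies for the archive (the storage bullet above), the sketch below converts the annual volume into sustained ingest rates; only the 100 PB/year figure comes from the recommendations, and the unit conversions are standard.

```python
# Convert an archive growth target of ~100 PB/year into sustained ingest rates.
PB_PER_YEAR = 100.0                      # from the recommendation above
SECONDS_PER_YEAR = 365.25 * 24 * 3600.0  # standard conversion

bytes_per_year = PB_PER_YEAR * 1e15
per_day_tb = bytes_per_year / 365.25 / 1e12
sustained_gb_s = bytes_per_year / SECONDS_PER_YEAR / 1e9

print(f"~{per_day_tb:.0f} TB per day")                 # ~274 TB/day
print(f"~{sustained_gb_s:.2f} GB/s around the clock")  # ~3.2 GB/s sustained
```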
Cyberinfrastructure for the Atmospheric Sciences in the 21st Century
Ad Hoc Committee for Cyberinfrastructure Research, Development and Education in the Atmospheric Sciences, June 2004
Recommendation (3)
- Physical Infrastructure
  – Provide physical infrastructure typical of the best modern IT centers
  – Consider leasing a data center of the appropriate scale
- Planning
  – Form an Application Performance Characterization working group to focus on identifying the algorithmic and architectural elements needed for the design, procurement, and effective use of petascale systems and beyond
  – Continue planning through procurement phases for the continued provision of petascale and higher computational capability to the geosciences
General Recommendations
NSF should change the way in which it invests in CI for the geosciences in the following ways:
- Give human resource issues top priority in CI development, hardening, maintenance and distribution. The academic reward structure should recognize CI-related activities.
- Encourage the education of students, educators, support staff and scientists through exposure to the computational environment, including tools, data sets, and models.
- Provide the necessary support for communication and dissemination of ideas, technologies, and processes to promote the interdisciplinary understanding needed to successfully use CI in the atmospheric sciences.
- Help organize and coordinate CI activities across the geosciences and provide forums for innovation.
General Recommendations (2)
- Initiate a coordinated effort with NASA, NOAA and other agencies to provide geoscientists with the ways and means to find, effectively use, publish, and distribute geoscience data, including the development of appropriate standards for metadata to which grant recipients should be encouraged to conform.
- Provide funding for the entire CI software life cycle, including development, testing, hardening, deployment, training, support and maintenance, subject to ongoing review and demonstrated utility and sustainability.
- Invest in computing infrastructure and capacity building at all levels, including centers, campuses, and departments, in support of atmospheric science research and education.
- Support the development of geosciences cyber-environments that allow the seamless transport of work from the desktop to supercomputers to Grid-based computing platforms.
EARTH SCIENCES
SOLID EARTH GEOINFORMATICS
(Diagram: sedimentary & ancient life systems, geophysics & active tectonics, crustal architecture & evolution, earth surface processes, computational centers, information, visualization & numerical technologies, and education & outreach, spanning EarthScope, IRIS, SCEC, PaleoBio, CHRONOS, CSM, NCED, CUAHSI, SAHRA, GEON-Science, GERM, NAVADAT, and Computational Geophysics)
COMPUTATIONAL SOLID EARTH SCIENCES
1. Build a national facility that is second to none for computational solid Earth science.
2. Fund ten regional computation, data storage, and visualization clusters as platforms for web services.
3. Increase NSF EAR funding for research group and departmental clusters by a factor of 5 to 10, to $1.5M-$3M per year.
4. Change policies at the existing national centers.
5. Increase funding for education and training in computational Earth science.
Ocean Sciences
Survey Findings
- Expected requirement:
  – Ten to one thousand times the current ITI hardware capacity over the next five to ten years
  – Most critical bottlenecks occurring in the availability of CPU cycles, memory and mass-storage capacity, and network bandwidth
- Software systems:
  – Need to re-engineer models, and data analysis and assimilation packages, for efficient use on massively parallel computers
  – Advances in visualization techniques to deal effectively with increasing volumes of observations and model output
  – Well-designed, documented and tested community models of all types
- Extreme shortage of skilled ITI technical personnel accessible to the ocean sciences community
Survey Recommendations
- Improve access to high-performance computational resources across the ocean sciences
- Provide technical support for maintenance and upgrade of local ITI resources
- Provide model, data and software curatorship
- Facilitate advanced applications programming
Recommendation (2)
- Data and Visualization Services
  – Equip the petascale collaboratory with storage systems capable of archiving approximately 100 PBytes per year of operation
  – Invest approximately 10 percent of the overall computing resource in specialized data analysis and visualization systems, distributed among the nodes of the collaboratory
- Interface and Collaboration Services
  – Leverage the technical expertise and cyberinfrastructure available in existing national research computing facilities and universities to support the effective scientific use of the computing resources of the petascale facility
  – Leveraging resources can best be accomplished by creating collaboratory nodes at appropriate centers with specialized domain expertise
  – Approximately 75 new hires, distributed between the petascale facility itself and the collaboratory nodes, embedded within science teams
Petascale Collaboratory
“Overarching Recommendation: Establish a Petascale Collaboratory for the Geosciences with the mission to provide leadership-class computational resources that will make it possible to address, and minimize the time to solution of, the most challenging problems facing the geosciences.”