NCAR Supercomputing ‘Data Center’ Project An NCAR-led computing ‘facility’ for the study of the Earth system

Outline
A Problem
An Opportunity
–NSF’s Petascale Roadmap
A Solution
–Facility Proposal: Site and Cost
–Partners
The Scientific Payoff
Next Steps
–Schedule and Panels

NCAR Leadership in Supercomputing… One of the founding missions of NCAR was: “… to provide, or arrange for provision of, facilities for the scientific community as a whole whose initial cost and upkeep lie beyond the capability of individual universities or research groups.” – Preliminary Plans for a National Institute for Atmospheric Research – NCAR Blue Book. Note: the wording does not imply physical collocation. This mission does confer a responsibility that cannot be delegated, namely maintaining a complete, integrated cyberinfrastructure (CI) system for modeling and data analysis that meets our scientific community’s needs.

Examples of NCAR simulation science today
Global change climate ensembles
Weather Research and Forecasting (WRF)
Geophysical turbulence
Fire storm front modeling
Space weather
More…

A problem
NCAR Mesa Lab computer facility is quickly becoming obsolete
Power, cooling and floor space will be inadequate beyond the next procurement
Science is being restricted by focusing on capacity ahead of capability

CMOS Trends Continue …

Chips: Faster, Cheaper but Hotter

An Opportunity NSF’s Petascale Roadmap “Overarching Recommendation: Establish a Petascale Collaboratory for the Geosciences with the mission to provide leadership-class computational resources that will make it possible to address, and minimize the time to solution of, the most challenging problems facing the geosciences.”

NSF Conclusions
NSF is committed to developing and implementing a strategic plan for cyberinfrastructure
–Broad-based plan involving universities, Federal agencies, vendors, and international partners
ATM, OCE, and EAR take different approaches to the realization of CI for their disciplines
–Dependent on the readiness of the community
A petascale facility is an integrating theme for the Geosciences community
–High potential for the generation of new knowledge and a new paradigm for the conduct of research
–Building and sustaining a petascale facility will be a significant challenge to budgets and technology
–Consistent with NSF strategic vision for CI

A solution for NCAR
A new computing facility (not at the Mesa Lab)
Extensive investigations, working with consultants and assessing internal needs, produced a detailed set of options
Provides for 5-20 years of computing (capacity and capability) diversity, based on current and predicted trends in CMOS technology
Allows NCAR to reach beyond its current research scope

The facility needed
Data Center Expansion Report from NCAR’s Computing and Information Systems Lab
20,000 sq. ft. initially, expandable to 60,000 sq. ft.
4 MW power (expandable to 13 MW) + generators
Cooling, etc.
On 13 acres (20-year lifetime)
Accommodates computers, staff, open space, and initial and future requirements

Bird’s Eye View

Importance of Site Selection
Limited selection of sites that meet criteria
–Size (10-15 acres)
–Electrical capacity (up to 24 MW)
–Fiber optic route (dark fiber)
Investigated
–Marshall
–Louisville
–Longmont
–Westminster
New partners and options are now being sought
–IBM
–Colorado School of Mines
(Water, Political Complications, Fiber Optics) (Electrical Capacity)

Cost Drivers
Primary Drivers
–Tier III Reliability: Mechanical Systems, Electrical Systems
–Engineering
Secondary Drivers
–Building Size
–Land
Site Facility - up to $75M (one time)
Operations - $15M/year? (2X)
Computing increments - $15M/year (2X)
Computing infrastructure - $5M/year
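To put these figures in context, here is a minimal back-of-the-envelope sketch in Python. It assumes (these are not stated on the slide) that the annual figures stay flat and that the horizon is the 20-year site lifetime mentioned on the facility slide; the "(2X)" annotations are left as written and not interpreted.

```python
# Back-of-the-envelope lifetime cost sketch (illustrative only).
# Assumptions not on the slide: annual figures stay flat, and the horizon
# is the 20-year site lifetime from the "facility needed" slide.
facility_one_time_m = 75            # up to $75M, one time
annual_m = {
    "operations": 15,               # $15M/year? (slide marks this "2X")
    "computing_increments": 15,     # $15M/year (also marked "2X")
    "computing_infrastructure": 5,  # $5M/year
}
years = 20
total_m = facility_one_time_m + years * sum(annual_m.values())
print(f"Illustrative {years}-year total: ${total_m}M")  # -> $775M
```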

The Scientific Payoff…

A petascale computer will enable scientists to…
Do credible regional climate modeling for decision support; requires resolving individual mountain ranges and ocean western boundary currents.
Model climate and weather in a fully coupled mode.
Better understand the marine biogeochemical cycles; requires resolving ocean mesoscale eddies.
Accurately simulate the dynamical, microphysical and radiative cloud processes.
Improve seismic predictions and understand the structure of the inner core as well as the fine structure of the lower mantle.

A petascale computer will enable scientists to…
Perform new research in solid earth and environmental engineering.
Assimilate thousands of earthquakes, bringing the fine structure of the Earth’s mantle and inner core into focus.
Study the physical basis of land surface parameterizations by modeling soils, topography and vegetation at sub-meter scales.
More accurately predict the damaging effects of solar flares on satellites and power distribution systems by resolving the fine structure of the coronal magnetic field.
Investigate energy management applications.

Science examples …

2005 Hurricane Katrina, Track Forecast (figure): 4 km, 62 h forecast; 12 km, 86 h forecast; observed track; official forecast. Landfall on 8/29 14Z, Louisiana/Mississippi border.

2005 Hurricane Katrina, Track Forecast (figure): 4 km WRF initialized 8/27 00Z; 12 km WRF initialized 8/26 00Z; WRF ARW. Landfall on 8/29 14Z, Louisiana/Mississippi border.

Hurricane Katrina Reflectivity at Landfall, 29 Aug Z (figure): mobile radar observation vs. 4 km WRF, 62 h forecast.

Hurricane Katrina 72 h Forecast, initialized 27 Aug Z (figure): radar composite reflectivity vs. WRF 4 km maximum reflectivity.

WRF 4 km Moving-Grid Katrina Wind Forecast, initialized 27 Aug Z (figure). Wind speed legend (m/s): Cat 1 through Cat 5 (>69).

Hurricane Katrina 24 h Accumulated Precipitation Forecast (figure): 4 km moving nested grid; 60 h forecast valid 8/29 12Z and 36 h forecast valid 8/30 12Z vs. Stage IV observed 24 h precipitation.

Hurricane Ophelia Approaching Landfall, 4 km WRF forecast initialized 12 Sept Z (figure): Wilmington radar vs. 63 h reflectivity forecast (14 Sept Z); Moorhead City radar vs. 72 h reflectivity forecast (15 Sept Z).

72 h Ophelia Wind Forecast Initialized 27 Aug Z

72 h Rita Wind Forecast Initialized 19 Sep Z

Coupled Climate System Model

Integrated Space Weather Modeling

Thus …

Main Points
Huge scientific discoveries await geoscience modelers at 1 PFLOPS and beyond.
CMOS continues to get hotter and cheaper; the most recent acquisition tracks this trend.
Every center is (or will be) facing facility challenges in the race to these discoveries. This situation is NOT unique to NCAR.
NCAR now has a facility plan that, if successful, uniquely positions it as a world leader in geoscience simulation.
The new facility is not a crisis: it is an opportunity.

The Opportunity
Understanding of fundamental physical processes in the Sun-Earth system
Environmental and energy applications not yet possible
NCAR and partners will scope/define these options
–Such a facility would be a computational equivalent of the Hubble Telescope for geoscience simulation.

Next Steps

The Schedule
Formed NCAR project committee
Forming Blue Ribbon Panel and hold workshop - early Oct
Project plan development - Oct-Dec
Community engagement - Nov-Jan
Formalize partnerships - Oct-Dec
Present initial plan to UCAR Board and National Science Foundation - early October
Initiate facility - June 2006?
First electrons - June March 2009?

The Blue Ribbon Panel
Rick Anthes, Kelvin Droegemeier, Tamas Gombosi, Thomas Jordan, Jean-Bernard Minster, John Orcutt, Tim Palmer, Shukla, David Yuen
Plus some to be confirmed

Contacts at NCAR
Tim Killeen - NCAR Director
Lawrence Buja and Peter Fox - co-chairs of the NCAR project team
Aaron Anderson - computing facilities contact
Jeff Reaves - financial/contracts contact

Concluding remarks …

NCAR Strategy to Support Leadership Class Computing …

Design Strategy
Commit to the minimal amount of infrastructure that makes sense (5-year term)
–2 MW electrical, with matching mechanical
–Scalable to 4 MW electrical, with matching mechanical
Modest office & support space
Basic finishes
Build the core and shell to support a 10-year life or beyond
–Don’t compromise on the physical space
–Allow for in-place retrofit

High Level Concepts
Land for 20 years, building shell for 10, outfit for 5
This strategy preserves the initial capital investment as computing hardware evolves rapidly
Especially attractive for data-center construction, since 50% to 75% of the total cost is in mechanical and electrical outfitting

Facility Design Concept (diagram): an initial module plus Modules 2 and 3, each 20,000 sq. ft. at 4 MW, housing cooling & electrical, computers, and office space; the facility expands laterally.

The Conceptual Design …

Architectural View (rendering), showing the Phase 2 and Phase 3 additions.

Other Key Design Features
Practical limit of air cooling is about 150 Watts/sq. ft.
–Design is capable of both air- and water-based cooling
–Consistent with current computing trends
Completely free span
–Virtually all of the raised floor is usable for computing and mass storage equipment
Systems of this size have massive wiring requirements
–Power is deliverable both overhead and under floor
–Networking and data cabling also overhead and under floor
Hydronic free cooling
–Our cool dry climate allows for up to 3500 hours of “free” cooling per year
–Environmentally friendly
–Operational cost saving
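A quick arithmetic check shows why the design cannot rely on air cooling alone. This minimal sketch uses the ~20,000 sq. ft. / ~4 MW module figures from the design-concept slide and the ~150 W/sq. ft. air-cooling limit above; treating the full module area as cooled floor is an assumption.

```python
# Rough power-density check using numbers from this and earlier slides:
# one module is ~20,000 sq. ft. at ~4 MW, and the practical air-cooling
# limit is stated as about 150 W/sq. ft.
module_power_w = 4_000_000
module_area_sqft = 20_000
air_limit_w_per_sqft = 150

density = module_power_w / module_area_sqft          # 200 W/sq. ft.
print(f"Gross module power density: {density:.0f} W/sq. ft.")
print("Within the air-cooling limit?", density <= air_limit_w_per_sqft)  # False
# The 20,000 sq. ft. presumably includes support and office space (an assumption),
# so the raised-floor density would be even higher; hence the dual air/water design.
```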

The greater context …

NSF’s Cyberinfrastructure Planning, plus A Strategic Plan for High Performance Computing
Courtesy of Cliff Jacobs (NSF) and Deborah L. Crawford, Acting Director, Office of Cyberinfrastructure

NSF “Vision” Document
A: Call to Action
B: Strategic Plan for High Performance Computing (Aug. 1, 2005)
C: Strategic Plan for Data (Nov. 15, 2005)
D: Strategic Plan for Collaboration & Communication (Jan. 15, 2006)
E: Strategic Plan for Education & Workforce Development (Jan. 15, 2006)
NSF finalizes Cyberinfrastructure Vision Document: March 31, 2006

Creating a World-Class HPC Environment To Enable Petascale Science and Engineering Strategic Plan for High Performance Computing (FY 2006 – 2010)

Strategic Plan for High Performance Computing ( ) (diagram). Elements: private sector and agency partners; HPC resource providers; S&E community; portable, scalable applications software & services; software service providers (SSPs); science-driven HPC systems (compute engines, local storage, visualization facilities).

HPC Implementation Milestones (FY ) (timeline covering quarters Oct-Dec ’05 through Jul-Sep ’07). Tracks: specify requirements for science-driven architectures; mid-range systems acquisition; Leadership Computing Council; leadership system acquisition; supporting software program; scalable applications software program. Milestones include releasing solicitations, making awards, Phase I awards, establishing a Software Services Council, working with the academic and vendor communities, and ongoing annual activity.

NSF directorate efforts
Atmospheric Sciences (ATM)
(Solid) Earth Sciences (EAR)
Ocean Sciences (OCE)
Each has reports and recommendations
These led NSF to provide an over-arching strategy for the Geosciences

Vision
A multi-(computer)-generational computing facility providing an end-to-end capability and capacity to address some of the most challenging environmental problems that face us today
The people, expertise, partners and community working synergistically

The Benefit
Unprecedented capability and capacity in a single virtual facility
Opportunity to build new/stronger partnerships and relations (local, national and international)
Jobs in the Front Range
Resources for Colorado institutions to perform new work and attract new staff/students

Colorado: The place to compute
Connected, concerned, and active
World-class facility as part of a world-respected organization
Strong local connections
Strong community leadership and partnership
Strong commercial connections
Pushing technology development for Colorado: small and large companies

The Expertise
NCAR - community facilitator and integrator (people)
45 years of building among the most advanced simulation, assimilation, analysis, and visualization codes/tools
45 years of direct benefit to the university and related community

The Mission for NCAR computing
To reach out beyond NCAR’s traditional fields of geoscience
To provide the intellectual leadership and infrastructure
To build partnerships with academia, government and private industry to serve their communities
To build, install, operate and maintain a world-class supercomputing facility

The Potential Partners
Colorado School of Mines
CU/Boulder
IBM
CSU
NOAA/NIST
USGS
UK Met Office

Additional facility materials

Machine Room Power Budget: Up-coming Procurement
Baseline + 50 kW safety margin: 200 kW
Other supers (neglecting bluesky): 150 kW
Bluevista: 270 kW (estimate)
Power available for new system: 580 kW
Max allowable power-price ratio (2007): 39 kW/$M
Bluevista power-price ratio: 52 kW/$M
By 2010 the power-price ratio may be 2.45 times larger (i.e. worse)
Conclusion: the 2007 procurement is borderline; a circa-2010 procurement will likely be insupportable in terms of power at the Mesa Laboratory (see the sketch below).
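A minimal sketch of the arithmetic behind this conclusion. The only outside assumption is linking the kW/$M ratios to the ~$15M/year computing increment on the cost-drivers slide; everything else comes from the figures above.

```python
# Sketch of the Mesa Lab power-budget arithmetic from this slide.
available_kw = 580          # power left for a new system at the Mesa Lab
max_ratio_2007 = 39         # kW/$M, maximum supportable ratio in 2007
bluevista_ratio = 52        # kW/$M, ratio of the most recent system
growth_by_2010 = 2.45       # ratios may be ~2.45x larger (worse) by 2010
budget_m = 15               # assumed $15M/year computing increment

print("Needed at the 2007 limit:   ", max_ratio_2007 * budget_m, "kW")   # 585 kW
print("Needed at Bluevista's ratio:", bluevista_ratio * budget_m, "kW")  # 780 kW
print("Needed circa 2010:          ",
      round(bluevista_ratio * growth_by_2010 * budget_m), "kW")          # ~1911 kW
print("Available at the Mesa Lab:  ", available_kw, "kW")
# 585 kW needed vs. 580 kW available makes 2007 borderline;
# ~1900 kW circa 2010 is clearly not supportable at the Mesa Lab.
```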

Site Selection Drives Schedule (chart): the ML computer room is at maximum capacity, so the computing system from Procurement B must go elsewhere.

Reliability Drivers
Designed as a Tier III facility
–Concurrently maintainable (eliminates twice-yearly shutdowns, improves maintenance schedules)
–Redundant paths
–Redundant systems: N+1 (the appropriate number of components plus one additional to maintain redundancy); S+S (identical system plus system)
Drivers for a Tier III facility
–As devices continue to shrink, sensitivity to environmental excursions becomes more serious
–Expectations of the user community: critical times of year for critical projects
–More real-time or near-real-time forecast portfolio of applications

Cost Breakdown
Includes the backbone for 4 MW but only half the electrical equipment (2 MW)
Excludes inflation
Additional data center modules

Petascale Future
Leadership-class computing will require a leadership-class facility
This design can scale along with computing to the PetaFLOP level
–Without an initial petascale cost
–Manage the risks (technology changes, funding) as part of the strategy
–Significant competitive advantage
Those who will be asked to do this will have to have a solid plan; this plan can get us there

Additional NSF materials

Recommendations
Computer Services
–Procure computing systems on a three-year technical refresh cycle, with the first procurement in 2007 and project completion in 2012
–Phase I: two to three terascale systems (each TFLOPS peak, approximately 200 TFLOPS aggregate)
–Phase II: procure a new “centerpiece” system with 1 PFLOP peak processing speed
Data and Visualization Services
–Equip the petascale collaboratory with storage systems capable of archiving approximately 100 Pbytes per year of operation (see the sketch below)
Interface and Collaboration Services
–Leverage the technical expertise and cyberinfrastructure in existing national research computing facilities and universities by creating collaboratory nodes at centers where specialized domain expertise exists
–Approximately 75 new hires at the petascale facility and the collaboratory nodes
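To give a feel for what "approximately 100 Pbytes per year" implies, here is a minimal sketch of the average ingest rate. Treating ingest as a flat average is an assumption; real archive traffic would be bursty.

```python
# Rough average ingest rate implied by archiving ~100 PB per year.
petabytes_per_year = 100
bytes_per_pb = 10**15
seconds_per_year = 365.25 * 24 * 3600

avg_rate_gb_per_s = petabytes_per_year * bytes_per_pb / seconds_per_year / 1e9
print(f"Average sustained archive rate: ~{avg_rate_gb_per_s:.1f} GB/s")  # ~3.2 GB/s
```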

Cyberinfrastructure for the Atmospheric Sciences in the 21st Century Ad Hoc Committee for Cyberinfrastructure Research, Development and Education in the Atmospheric Sciences June 2004

Recommendation (3)
Physical Infrastructure
–Provide physical infrastructure typical of a modern best-in-class IT center
–Consider leasing a data center of the appropriate scale
Planning
–Form an Application Performance Characterization working group to focus on identifying the algorithmic and architectural elements needed for the design, procurement, and effective use of petascale systems and beyond
–Continue to plan through procurement phases for continued provision of petascale and higher computational capability to the geosciences

General Recommendations
NSF should change the way in which it invests in CI for the geosciences in the following ways:
–Give human resource issues top priority in CI development, hardening, maintenance and distribution. The academic reward structure should recognize CI-related activities.
–Encourage the education of students, educators, support staff and scientists through exposure to the computational environment, including tools, data sets, and models.
–Provide the necessary support for communication and dissemination of ideas, technologies, and processes to promote the interdisciplinary understanding needed to successfully use CI in the atmospheric sciences.
–Help organize and coordinate CI activities across the geosciences and provide forums for innovation.

General Recommendations (2)
–Initiate a coordinated effort with NASA, NOAA and other agencies to provide geoscientists with the ways and means to find, effectively use, publish, and distribute geoscience data, including the development of appropriate standards for metadata to which grant recipients should be encouraged to conform.
–Provide funding for the entire CI software life cycle, including development, testing, hardening, deployment, training, support and maintenance, subject to ongoing review and demonstrated utility and sustainability.
–Invest in computing infrastructure and capacity building at all levels, including centers, campuses, and departments, in support of atmospheric science research and education.
–Support the development of geosciences cyber-environments that allow the seamless transport of work from the desktop to supercomputers to Grid-based computing platforms.

EARTH SCIENCES

SOLID EARTH (diagram). Focus areas: geoinformatics; sedimentary & ancient life systems; geophysics & active tectonics; crustal architecture & evolution; earth surface processes; computational centers; information, visualization & numerical technologies; education & outreach. Programs and facilities shown: EarthScope, IRIS, SCEC, PaleoBio, CHRONOS, CSM, NCED, CUAHSI, SAHRA, GEON - Science, GERM, NAVADAT, Computational Geophysics.

COMPUTATIONAL SOLID EARTH SCIENCES
1. Build a national facility that is second to none for computational Earth science.
2. Fund ten regional computation, data storage, and visualization clusters as platforms for web services.
3. Increase funding from NSF EAR for research group and departmental clusters by a factor of 5 to 10, to $1.5M-$3M per year.
4. Change policies at the existing national centers.
5. Increase funding for education and training in computational Earth science.

Ocean Sciences

Survey Findings / Recommendations
Findings
Expected requirement
–Ten to one thousand times the current ITI hardware capacity over the next five to ten years
–Most critical bottlenecks occurring in the availability of CPU cycles, memory and mass-storage capacity, and network bandwidth
Software systems
–Need to re-engineer models, and data analysis and assimilation packages, for efficient use on massively parallel computers
–Advances in visualization techniques to deal effectively with increasing volumes of observations and model output
–Well-designed, documented and tested community models of all types
Extreme shortage of skilled ITI technical personnel accessible to the ocean sciences community
Recommendations
Improve access to high-performance computational resources across the ocean sciences
Provide technical support for maintenance and upgrade of local ITI resources
Provide model, data and software curatorship
Facilitate advanced applications programming

Recommendation (2)
Data and Visualization Services
–Equip the petascale collaboratory with storage systems capable of archiving approximately 100 Pbytes per year of operation
–Invest approximately 10 percent of the overall computing resource in specialized data analysis and visualization systems, distributed among the nodes of the collaboratory
Interface and Collaboration Services
–Leverage the technical expertise and cyberinfrastructure available in existing national research computing facilities and universities to support the effective scientific use of the computing resources of the petascale facility
–Leveraging resources can best be accomplished by creating collaboratory nodes at appropriate centers where specialized domain expertise exists
–Approximately 75 new hires, distributed among the petascale facility itself and the collaboratory nodes embedded within science teams

Petascale Collaboratory “Overarching Recommendation: Establish a Petascale Collaboratory for the Geosciences with the mission to provide leadership-class computational resources that will make it possible to address, and minimize the time to solution of, the most challenging problems facing the geosciences.”