NASA Center for Computational Sciences: Climate and Weather Research at NASA Goddard. Phil Webster, Goddard Space Flight Center, 9 September 2009.

NASA Mission Structure
To implement NASA's Mission, NASA Headquarters is organized into four Mission Directorates:
- Aeronautics: Pioneers and proves new flight technologies that improve our ability to explore and which have practical applications on Earth.
- Exploration Systems: Creates new capabilities and spacecraft for affordable, sustainable human and robotic exploration.
- Science: Explores the Earth, Moon, Mars, and beyond; charts the best route of discovery; and reaps the benefits of Earth and space exploration for society.
- Space Operations: Provides critical enabling technologies for much of the rest of NASA through the Space Shuttle, the International Space Station, and flight support.

Science Mission Directorate

Earth Science Division Overview
Overarching goal: to advance Earth System science, including climate studies, through spaceborne data acquisition, research and analysis, and predictive modeling.
Six major activities:
- Building and operating Earth observing satellite missions, many with international and interagency partners
- Making high-quality data products available to the broad science community
- Conducting and sponsoring cutting-edge research in 6 thematic focus areas
  – Field campaigns to complement satellite measurements
  – Modeling
  – Analyses of non-NASA mission data
- Applied Science
- Developing technologies to improve Earth observation capabilities
- Education and Public Outreach

Earth Science Division Focus Areas

Modeling, Analysis and Prediction (MAP) Program
Seeks an understanding of the Earth as a complete, dynamic system, with emphasis on climate and weather.
Key questions include:
- How is the Earth system changing?
- What are the forcing mechanisms driving observed changes?
- How does the Earth system respond to natural and human-induced changes?
- What are the consequences of Earth system change to society?
- What further changes can be anticipated, and what can be done to improve our ability to predict such changes through improved remote sensing, data assimilation, and modeling?
The MAP program supports observation-driven modeling that integrates the research activities in NASA's Earth Science Program.

NASA's Climate and Weather Modeling
- Spans timescales from weather to short-term climate prediction to long-term climate change
- Spans weather, climate, atmospheric composition, water & energy cycles, and the carbon cycle
- "Unique" in bringing models and observations together through assimilation and simulation
- Products to support NASA instrument teams and the atmospheric chemistry community
- Contributes to international assessments (WMO/UNEP, IPCC); contributions to IPCC/AR5 represent a new paradigm of "data delivery" for the NASA modeling community, in partnership with NCCS, PCMDI, and LLNL
- Contributes to WWRP and WCRP

Tomorrow's Science
- New missions: increased sensing of the Earth's climate system as recommended by decadal studies; more data and more types of data
- Advanced assimilation: use more data to produce better model initialization
- Higher resolution: better representation of atmospheric processes to improve prediction
- Greater complexity: understanding and predicting/projecting future climate
- Coupled ocean-atmosphere-land models, including the full carbon cycle
- Increased collaboration: of models, model output, and simulation & observational data sets

High-Resolution Climate Simulations with the GEOS-5 Cubed-Sphere Model
GMAO, GISS, NCCS, and SIVO staff are refining techniques for Intel Nehalem (e.g., concurrent serial I/O paths). SIVO's Bill Putman's 3.5-km (non-hydrostatic) simulations with the GEOS-5 Cubed-Sphere Finite-Volume Dynamical Core, run on approximately 4,000 Nehalem cores of the NASA Center for Computational Sciences (NCCS) Discover supercomputer, yielded promising results, including cloud features not seen in lower-resolution runs. The team is exploring techniques for more efficient memory use and parallel I/O, e.g., evaluating the effects of replacing Intel MPI with MVAPICH.
Figure: Low cloud features from the 3.5-km GEOS-5 Cubed Sphere for 2 January 2009 (left), compared to the GOES-14 first full-disk visible image on 27 July 2009 (center) and the 27-km (roughly ¼ degree) GEOS-5 Cubed Sphere for 2 January 2009 (right).
NCCS Discover Scalable Unit 5's Nehalem architecture and larger core count enable researchers to exploit methods for higher-resolution models, advancing NASA Science Mission Directorate science objectives.
Bill Putman, Max Suarez, NASA Goddard Space Flight Center; Shian-Jiann Lin, NOAA Geophysical Fluid Dynamics Laboratory
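As a rough illustration of the computational jump from the 27-km to the 3.5-km configuration mentioned above, the sketch below estimates cubed-sphere grid sizes. It assumes an N x N cells-per-face layout with a nominal spacing of one quarter of Earth's circumference divided by N; the constants and helper names are illustrative, not taken from GEOS-5.

```python
# Back-of-the-envelope sizing for a cubed-sphere grid, as a rough illustration
# of why the 3.5-km GEOS-5 runs are so much larger than the 27-km runs.
# Assumptions (not from the slide): an N x N cells-per-face cubed sphere whose
# nominal grid spacing is one quarter of Earth's circumference divided by N.

EARTH_CIRCUMFERENCE_KM = 40_075.0  # equatorial circumference

def cells_per_face_edge(target_dx_km: float) -> int:
    """Approximate cells per cube-face edge (N) for a target grid spacing."""
    return round(EARTH_CIRCUMFERENCE_KM / 4.0 / target_dx_km)

def total_columns(n_per_face: int) -> int:
    """Total horizontal grid columns: 6 faces of N x N cells."""
    return 6 * n_per_face * n_per_face

for dx in (27.0, 3.5):
    n = cells_per_face_edge(dx)
    print(f"dx ~ {dx:>4} km -> N ~ {n:5d} per face, ~{total_columns(n):,} columns")

# The ~3.5-km grid carries roughly 60x more columns than the ~27-km grid,
# before accounting for the shorter time step the finer grid also requires.
```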

GEOS-5 – Impact of Resolution
Figure: Kármán vortex streets simulated by GEOS-5 at 28-km, 14-km, 7-km, and 3.5-km resolution.

MERRA Project: Modern Era Retrospective-analysis for Research and Applications
- GMAO's 30-year reanalysis of the satellite era (1979 to present) using GEOS-5
- Largest assimilation data set available today
- The focus of MERRA is the hydrological cycle and climate variability
- Today's observing system: ~1.6M observations per 6-hour snapshot, close to 90% of them from satellites
- Public record supporting a broad range of scientific research
- Climate Data Assimilation System efforts will continue
- Single largest compute project at the NCCS
- Products are accessed online at the GES DISC
Michael Bosilovich, Global Modeling and Assimilation Office, NASA Goddard Space Flight Center
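Since the slide notes that MERRA products are served online by the GES DISC, here is a minimal sketch of reading one downloaded granule with the netCDF4 library. The file name and variable names are placeholders; the actual product, granule, and variable identifiers come from the GES DISC documentation.

```python
# Minimal sketch of working with a MERRA product file obtained from the GES DISC.
# The file name and variable names below are placeholders, not real identifiers.
from netCDF4 import Dataset  # pip install netCDF4
import numpy as np

path = "MERRA_example_granule.nc"           # hypothetical local file
with Dataset(path) as ds:
    t2m = ds.variables["T2M"][:]             # assumed 2-m air temperature field
    lat = ds.variables["lat"][:]
    lon = ds.variables["lon"][:]

# Simple diagnostic: area-unweighted global mean for each time slice
print("grid:", t2m.shape, "mean T2M (K):", np.round(t2m.mean(axis=(-2, -1)), 2))
```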

High-Resolution Modeling of Aerosol Impacts on the Asian Monsoon Water Cycle
Objectives include (1) clarifying the interactions between aerosols (dust and black carbon) and the Indian monsoon water cycle and how they may modulate regional climatic impacts of global warming, and (2) testing feedback hypotheses using high-resolution models as well as satellite and in situ observations.
The team runs the regional-scale, cloud-resolving Weather Research and Forecasting (WRF) Model at very high resolution: less than 10-km horizontal grid spacing with 31 vertical layers. To mitigate the large computational demands of over 200,000 grid cells, the team uses a triple-nest grid with resolutions of 27, 9, and 3 km. For the aerosol-monsoon studies, a radiation module within WRF links to the Goddard Chemistry Aerosol Radiation and Transport (GOCART) aerosol module.
Using the Discover supercomputer at the NASA Center for Computational Sciences (NCCS), the team conducted model integrations for May 1 to July 1 in 2005 and in a second year. Among other results, the studies documented the "elevated-heat-pump" hypothesis, highlighting the role of the Himalayas and Tibetan Plateau in trapping aerosols over the Indo-Gangetic Plain, and showed preliminary evidence of aerosol impacts on monsoon variability.
Figure: Rainfall distributions from Weather Research and Forecasting (WRF) Model simulations at 9-kilometer resolution (top row) and from Tropical Rainfall Measuring Mission (TRMM) satellite estimates (bottom row). Units are in millimeters per day. Both WRF and TRMM show heavy rain (red) over the Bay of Bengal and the western coast.
By using 256 Intel Xeon processors on Discover, the WRF Model can finish a 1-day integration in less than 3 hours.
William Lau, Kyu-Myong Kim, Jainn J. Shi, Toshi Matsui, and Wei-Kuo Tao, NASA Goddard Space Flight Center
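A small arithmetic sketch of the throughput implied by the slide. Only the 27/9/3-km nest ratios and the "1 simulated day in under 3 hours on 256 processors" figure come from the slide; the season-length wall-clock estimate is simply derived from that figure.

```python
# Rough throughput arithmetic for the nested WRF runs described above.
nest_dx_km = [27.0, 9.0, 3.0]          # triple nest, refinement ratio of 3
hours_per_sim_day = 3.0                # reported upper bound on 256 processors

refinement_ratios = [nest_dx_km[i] / nest_dx_km[i + 1] for i in range(2)]
sim_days_per_wall_day = 24.0 / hours_per_sim_day

print("nest refinement ratios:", refinement_ratios)           # [3.0, 3.0]
print("simulated days per wall-clock day: >=", sim_days_per_wall_day)
print("wall-clock days for a 2-month (61-day) season: <=",
      61 / sim_days_per_wall_day)                             # about 7.6 days
```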

Observing System Experiments: Evaluating and Enhancing the Impact of Satellite Observations
An Observing System Experiment (OSE) assesses the impact of an observational instrument by producing two or more data assimilation runs, one of which (the Control run) omits data from the instrument under study. From the resulting analyses, the team initializes corresponding forecasts and evaluates them against operational analyses.
The team runs the NASA GEOS-5 data assimilation system at a resolution of 1/2 degree longitude and latitude, with 72 vertical levels, and the GEOS-5 forecasting system at a resolution of 1/2 or 1/4 degree. The team uses high-end computers at the NASA Center for Computational Sciences (NCCS) and the NASA Advanced Supercomputing (NAS) facility. The mass storage allows continual analysis of model results with diagnostic tools.
This research has demonstrated the impact of quality-controlled Atmospheric Infrared Sounder (AIRS) observations under partly cloudy conditions. In modeling tropical cyclogenetic processes, the team found that using AIRS data leads to better-defined tropical storms and improved GEOS-5 track forecasts. Depicted in the figure is a set of experiments centering on April–May 2008, during which Tropical Cyclone Nargis hit Myanmar.
Figure: Impact of AIRS on the 1/2-degree Goddard Earth Observing System Model, Version 5 (GEOS-5) forecast for Tropical Cyclone Nargis. Upper left: Differences (AIRS minus Control) in 6-hour forecasts of 200 hPa temperature (°C, shaded) and sea-level pressure (hPa, solid line). Lower left: The 6-hour sea-level pressure forecast from the AIRS run shows a well-defined low close to the observed storm track (green solid line). Lower right: The corresponding 108-hour forecast for 2 May 2008 (landfall time) compares very well with the observed track. Upper right: The 6-hour sea-level pressure forecast from the Control run shows no detectable cyclone.
NASA computational resources hosted 70 month-long assimilation experiments and corresponding 5-day forecasts.
Oreste Reale and William Lau, NASA Goddard Space Flight Center
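The bookkeeping behind an OSE can be illustrated with a toy sketch: the same forecast-error metric is computed for the run that assimilates the instrument and for the Control run that withholds it, and the difference is the measured impact. Everything below (fields, error metric, array sizes) is synthetic and illustrative, not part of the GEOS-5 workflow.

```python
# Illustrative (not from the slide) sketch of OSE impact bookkeeping.
import numpy as np

rng = np.random.default_rng(0)
verifying_analysis = rng.normal(size=(181, 360))            # stand-in 2-D field

def rmse(forecast: np.ndarray, analysis: np.ndarray) -> float:
    """Root-mean-square forecast error against the verifying analysis."""
    return float(np.sqrt(np.mean((forecast - analysis) ** 2)))

# Synthetic forecasts: the "with instrument" run is given a smaller error here
# purely so the example produces a visible impact.
fcst_control = verifying_analysis + rng.normal(scale=1.0, size=(181, 360))
fcst_airs    = verifying_analysis + rng.normal(scale=0.8, size=(181, 360))

impact = rmse(fcst_control, verifying_analysis) - rmse(fcst_airs, verifying_analysis)
print(f"RMSE reduction from assimilating the instrument: {impact:.3f}")
```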

GEOS-5 Support of NASA Field Campaigns: TC4, ARCTAS, TIGERZ
A Goddard Space Flight Center team supports NASA field campaigns with real-time products and forecasts from the Global Modeling and Assimilation Office's GEOS-5 model to aid in flight planning and post-mission data interpretation. Recently supported campaigns include the TC4 (Tropical Composition, Cloud and Climate Coupling), ARCTAS (Arctic Research of the Composition of the Troposphere from Aircraft and Satellites), and TIGERZ missions.
The most-often-used GEOS-5 configuration was a 2/3-degree longitude by 1/2-degree latitude grid with 72 vertical levels. Data assimilation analyses were conducted every 6 hours. The NASA Center for Computational Sciences (NCCS) hosted the GEOS-5 model runs on its high-end computers and provided a multi-faceted data delivery system through its Data Portal.
The mission support was successful, with GEOS-5 products delivered on time for most of the mission duration, due to the NCCS ensuring timely execution of job streams and supporting the Data Portal. One example of mission success was a June 29, 2008 DC-8 flight's sampling of the Siberian fire plume transported to the region in the mid-troposphere, as predicted by GEOS-5.
Figure: 500-hectopascal (hPa) temperatures (shading) and heights (contours) during NASA's ARCTAS mission. An analysis from the GEOS-5 model is shown with 24- and 48-hour forecasts and validating analyses. These fields, with the accompanying atmospheric chemistry fields, were used to help plan the DC-8 flight on June 29, 2008.
The GEOS-5 systems were run on 128 processors of the NCCS Explore high-end computer, with a continuous job stream allowing timely delivery of products to inform flight planning.
Michele Rienecker, Peter Colarco, Arlindo da Silva, Max Suarez, Ricardo Todling, Larry Takacs, Gi-Kong Kim, and Eric Nielsen, NASA Goddard Space Flight Center
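For a sense of scale, the sketch below works out the size of the 2/3-degree by 1/2-degree, 72-level grid quoted above. The pole-inclusive latitude count is an assumption about the grid layout, not something stated on the slide.

```python
# Back-of-the-envelope size of the 2/3-degree x 1/2-degree, 72-level GEOS-5
# configuration used for campaign support.
dlon, dlat, nlev = 2.0 / 3.0, 0.5, 72

nlon = round(360.0 / dlon)          # 540 longitudes around a latitude circle
nlat = round(180.0 / dlat) + 1      # 361 latitudes, poles included (assumed)

columns = nlon * nlat
print(f"{nlon} x {nlat} = {columns:,} columns per level")
print(f"~{columns * nlev / 1e6:.1f} million grid points over {nlev} levels")
# Four analyses per day (6-hourly assimilation) over such a grid is what the
# NCCS job streams had to turn around in time for daily flight planning.
```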

NASA HPC
NCCS at Goddard Space Flight Center – focused on climate and weather research in the Earth Science Division of the Science Mission Directorate:
- Support code development
- Environment for running models in production mode
- Capacity computing for large, complex models
- Analysis & visualization environments
NAS at Ames Research Center – supports all Mission Directorates; for Earth Science: capability runs for test & validation of next-generation models.

NASA Center for Computational Sciences Large scale HPC computing Comprehensive toolsets for job scheduling and monitoring Large capacity storage Tools to manage and protect data Data migration support Help Desk Account/Allocation support Computational science support User teleconferences Training & tutorials Interactive analysis environment Software tools for image display Easy access to data archive Specialized visualization support Internal high speed interconnects for HPC components High-bandwidth to NCCS for GSFC users Multi-gigabit network supports on-demand data transfers HPC Compute Data Archival and Stewardship Code repository for collaboration Environment for code development and test Code porting and optimization support Web based tools Code Development* Analysis & Visualization* User Services* Capability to share data & results Supports community-based development Facilitates data distribution and publishing Data Sharing Data Transfer DATA Storage & Management Global file system enables data access for full range of modeling and analysis activities * Joint effort with SIVO NCCS Data Centric Climate Simulation Environment September 9, rd HPC User Forum, Broomfield, CO

Data Centric Architecture: Highlights of Current Activities
- Data Storage and Management: petabyte online storage plus technology-independent software interfaces to provide data access to all NCCS services
- Data Archiving and Stewardship: petabyte mass storage facility to support project data storage, access, and distribution; access to data sets in other locations
- High Performance Computing: building toward petascale computational resources to support advanced modeling applications
- Analysis and Visualization: terascale environment with tools to support interactive analytical activities
- Data Sharing and Publication: web-based environments to support collaboration, public access, and visualization
Current activities: Nehalem cluster upgrades; Dali – interactive data analysis; Data Portal & Earth System Grid; Data Management System

Interactive Data Analysis & Visualization Platform - Dali
Interactive data analysis systems:
- Direct login for users
- Fast access to all file systems
- Supports custom and 3rd-party applications
- Visibility and easy access to post data to the Data Portal
- Interactive display of analysis results
In-line and interactive visualization:
- Synchronize analysis with model execution
- Access to intermediate data as they are being generated
- Generate images for display back to the user's workstations
- Capture and store images during execution for later analysis
Develop client/server capabilities:
- Extend analytic functions to the user's workstations
- Data reduction (subsetting, field/variable/temporal extractions, averaging, etc.) and manipulation (time series, display, etc.) functions
Dali Analytics Platform: 1.2 TF peak, 128 cores, 2 TB main memory
- 8 nodes of 2.4 GHz Dunnington (quad-core) processors, 16 cores and 256 GB memory per node
- Direct GPFS I/O connections, ~3 GB/s bandwidth per node to the GPFS file system
- Software: CDAT, ParaView, GrADS, Matlab, IDL, Python, FORTRAN, C, Quads, LATS4D
- Currently configured as (8) 16-core nodes with 256 GB RAM per node, with the flexibility to support up to (2) 64-core nodes with 1 TB RAM per node
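As an illustration of the kind of data-reduction function listed above (subsetting plus averaging) that users run close to the data on Dali, here is a minimal NumPy sketch. The array shapes, coordinate layout, and function name are invented for the example and are not part of any NCCS tool.

```python
# Illustrative subsetting-plus-temporal-averaging data-reduction function.
import numpy as np

def subset_and_average(field, lats, lons, lat_bounds, lon_bounds):
    """Average a (time, lat, lon) field over a lat/lon box, per time step."""
    lat_mask = (lats >= lat_bounds[0]) & (lats <= lat_bounds[1])
    lon_mask = (lons >= lon_bounds[0]) & (lons <= lon_bounds[1])
    box = field[:, lat_mask][:, :, lon_mask]
    return box.mean(axis=(1, 2))             # one value per time step

# Synthetic daily field on a 1-degree grid for a 30-day month
lats = np.arange(-90, 91, 1.0)
lons = np.arange(0, 360, 1.0)
field = np.random.default_rng(1).normal(size=(30, lats.size, lons.size))

series = subset_and_average(field, lats, lons, lat_bounds=(10, 30), lon_bounds=(70, 90))
print("30-day area-mean time series:", np.round(series[:5], 3), "...")
```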

Data Management System
Improving access to shared observational and simulation data through the creation of a data grid.
Adopting an iRODS grid-centric paradigm:
- iRODS intermediates between vastly different communities: the world of operational data management and the world of operational scientific practice
Challenges:
- Creating a catalog of NCCS policies to be mapped into iRODS rules
- Creating an architecture for workflows to be mapped into iRODS microservices
- Defining metadata and the required mappings
- Capturing and publishing metadata
- Doing all of this without disrupting operations!

Data Portal and Earth System Grid
- Web-based environments to support collaboration, public access, and visualization
- Interfaces to the Earth System Grid (ESG) and PCMDI for sharing IPCC model data
- Connectivity to observational data, the Goddard DISC, and other scientific data sets
- Direct connection back to NCCS data storage and archive for prompt publication; minimizes data movement and multiple copies of data
- Sufficient compute capability for data analysis
Data Portal hardware: HP c7000 BladeSystem (128 cores, 1.2 TF, 120 TB of disk), with local disk, NFS, iRODS, and GPFS multi-cluster storage access, serving NASA, ESG, PCMDI, and other users.

Nehalem Cluster Upgrades
Additional IBM iDataPlex scalable compute unit added to the Discover cluster in FY09:
- Additional 512 nodes (+46 TFLOPS)
- 4,096 cores of 2.8 GHz Nehalem quad-core processors
- 24 GB RAM per node (+12 TB RAM)
- InfiniBand DDR interconnect
An additional 4K-core Nehalem scalable unit is to be integrated later this calendar year.
Performance:
- 2x speedup of some major NCCS applications
- 3x to 4x improvement in memory-to-processor bandwidth
- Dedicated I/O nodes to the GPFS file system provide much higher throughput
"Discover" cluster: 110 TF peak, 10,752 cores, 22.8 TB main memory, InfiniBand interconnect
- Base unit: 3.2 GHz Xeon Dempsey (dual-core) nodes
- SCU1 and SCU2: 2.6 GHz Xeon Woodcrest (dual-core) nodes
- SCU3 and SCU4: 2.5 GHz Xeon Harpertown (quad-core) nodes
- SCU5: 2.8 GHz Xeon Nehalem (quad-core) nodes
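A quick consistency check of the quoted "+46 TFLOPS" and "+12 TB RAM" figures from the node count and clock speed above. The 4 double-precision operations per clock per core is the usual peak-rate assumption for Nehalem-generation Xeons (SSE add plus multiply each cycle), not a number stated on the slide.

```python
# Peak-performance arithmetic for the FY09 Nehalem scalable compute unit.
nodes = 512
cores_per_node = 8                 # dual-socket quad-core Nehalem
clock_ghz = 2.8
flops_per_cycle = 4                # assumed peak DP rate per core

peak_tflops = nodes * cores_per_node * clock_ghz * flops_per_cycle / 1_000
ram_tb = nodes * 24 / 1_000        # 24 GB per node

print(f"peak: {peak_tflops:.1f} TFLOPS, memory: {ram_tb:.1f} TB")  # ~45.9 TF, ~12.3 TB
```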

Where we're going…
NASA is aggressively moving forward to deploy satellite missions supporting the Decadal Survey.* NCCS is moving forward to support the climate & weather research that will extract the scientific value from this exciting new data!
* The January 15, 2007 report entitled Earth Science and Applications from Space: National Imperatives for the Next Decade and Beyond.

Thank you

NCCS Architecture
Architecture diagram components:
- Discover compute cluster: existing 65 TF, FY09 upgrade ~45 TF, FY10 upgrade ~45 TF planned
- GPFS I/O nodes and direct-connect GPFS nodes; GPFS disk subsystems of ~1.3 PB, increasing by ~1.8 PB in FY10
- Archive: ~300 TB of disk and ~8 PB of tape
- Login, analysis, visualization, and data gateway nodes; Data Portal
- Internal services: management servers, license servers, GPFS management, PBS servers, data management, and other services
- NCCS LAN (1 GbE and 10 GbE)