
February 25, 2008 The Emerging Front Range HPC Collaboratory Dr. Rich Loft: Director, Technology Development Computational and Information Systems Laboratory National Center for Atmospheric Research

February 25, 2008 Front Range HPC Collaboratory Components… Facilities, HPC Systems, Grid Technology, Applications/Workflows, Science Collaborations, Science Drivers, Networking.

February 25, 2008 The National Center for Atmospheric Research
- Climate Modeling Expertise: Nobel Prize-winning IPCC AR4, community modeling approach, scalable models
- Grid Expertise: TeraGrid Resource Provider, Earth System Grid, Grid-BGC (BioGeoChemistry)
- New Facility Plans (2012): >1 PFLOPS, 8 MW, >100 PB archive
- High Performance Computing: 40 years of experience, 25 TFLOPS peak computers, 5 PB archive

February 25, 2008 Climate Modeling… (40 km version; credit: J. Hack, NCAR)

February 25, 2008 Climate of the Last Millennium (credit: Caspar Ammann, NCAR)

February 25, 2008 This is a Critical Moment in Climate Change Research…
Before IPCC AR4: reproduce historical trends, investigate climate change, run IPCC scenarios.
After 2007: investigate solutions, assess impacts, simulate adaptation strategies, work with the energy industry.

February 25, 2008 Mitigation: What we do next can change the amount of warming that occurs. Likelihood of warming (Knutti et al. 2005)

February 25, 2008 Dimensions slide (credit: L. Buja)

February 25, 2008 Why High Resolution? Ocean component of CCSM (Collins et al., 2006); eddy-resolving POP (Maltrud & McClean, 2005)

February 25, 2008 The Case for an HPC Collaboratory in the Western Interior
- An integrated approach to climate change will involve:
  – climate modeling with regional capabilities
  – mitigation efforts via, for example, carbon sequestration
- Most sequestration reservoir formations are in the Western Interior
- Unconventional oil and gas locations are in the Western Interior
- Regional climate change effects on the Great Plains biomass
- Half of all EPSCoR states are in the Western Interior
- Power costs
(Map label: DOE FutureGen Project in Texas)

February 25, 2008 Experimental Program to Stimulate Competitive Research (EPSCoR) Most of the Western Interior is EPSCoR

February 25, 2008 National Security Agency: "The power consumption of today's advanced computing systems is rapidly becoming the limiting factor with respect to improved/increased computational ability."

February 25, 2008 Moore’s Law = More Cores: Quad Core “Barcelona” AMD Processor… Can 8, 16, 32 cores be far behind?

February 25, 2008 The Path to Petascale is Through Massive Parallelism… Cray-2/8 (1986): 3.9 GFLOPS. Blue Gene/L: 64K dual-core nodes, 367 TFLOPS.
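A back-of-the-envelope sketch of why petascale implies massive parallelism (the per-core peak rates below are illustrative assumptions, not figures from the talk):

```python
# Rough estimate: cores needed to reach 1 PFLOPS aggregate peak,
# given per-core peak rates in the megaflop-to-gigaflop range.

PETAFLOP = 1.0e15  # target aggregate peak, FLOP/s

# Assumed per-core peak rates (illustrative only):
per_core_gflops = {
    "Cray-2 class vector CPU (~1986)": 0.5,   # a few hundred MFLOPS
    "Blue Gene/L PowerPC 440 core":    2.8,   # 700 MHz x 4 flops/cycle
    "commodity x86 core (~2008)":      10.0,  # multi-GHz with SIMD
}

for name, gflops in per_core_gflops.items():
    cores = PETAFLOP / (gflops * 1e9)
    print(f"{name:34s}: ~{cores:,.0f} cores for 1 PFLOPS peak")
```

Even with optimistic per-core rates, a petaflop of peak performance requires on the order of 10^5 cores or more, hence the emphasis on massive parallelism rather than faster single processors.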

February 25, 2008 Cheap Power in the Western Interior

February 25, 2008 Front-Range Collaboratory Resources
Current Systems:
- GECO (CSM+NREL): 2144 × 2.66 GHz Intel CPU Dell cluster (22.8 TFLOPS)
- NCAR: 2048 × 700 MHz PowerPC CPU Blue Gene/L (5.7 TFLOPS)
Planned Systems (FY2009):
- NCAR: ~30 TFLOPS system upgrade; 1.5 PB HPSS-based archive
- CU: ~100 TFLOPS MRI proposal to NSF
Farther Out: NCAR Supercomputing Center (NSC)

February 25, 2008 Blue Gene/L at NCAR: a platform for scalability research ("Frost")

February 25, 2008 NCAR is part of the TeraGrid, NSF's HPC Grid. TeraGrid is a facility that integrates computational, information, and analysis resources at the San Diego Supercomputer Center, the Texas Advanced Computing Center, the University of Chicago / Argonne National Laboratory, the National Center for Supercomputing Applications, Purdue University, Indiana University, Oak Ridge National Laboratory, the Pittsburgh Supercomputing Center, and the National Center for Atmospheric Research. (Map also labels partner sites: Caltech, USC-ISI, Utah, Iowa, Cornell, Buffalo, UNC-RENCI, Wisc.)

February 25, 2008 HPC Infrastructure End State (diagram): integrated HPC resource & visualization systems, WAN network (10 Gb/s), HPC storage cluster, MSS archive, TeraGrid shared filesystems, science gateways, data preservation, switch, data SAN.

February 25, 2008 Future HPC Systems Are Likely to be Heterogeneous (diagram): compute cabinets, accelerator cabinets, switch cabinets, I/O nodes, interactive nodes, network; front end, switch, compute, accelerators, I/O subsystem.

February 25, 2008 NCAR Summer Internships in Parallel Computational Science (SIParCS)
- Open to: upper division undergrads, graduate students
- In disciplines such as: CS, software engineering, applied math, statistics, ES science
- Support: travel, housing, per diem, 10 weeks salary
- 2008 application deadline: Feb 1st
- Number of interns selected: 8–10
- Members of traditionally underrepresented groups encouraged to apply.

February 25, 2008 Thanks! Any Questions?

February 25, 2008 Front Range HPC Collaboratory: a geographical view (map). Ring topology; multiple network providers (Level3, McCleod, Platte River Power Auth., ICG); upgradable to multiple 10 Gb/s waves. Sites: NCAR ~30 TFLOPS (FY2009), CU 100 TFLOPS (FY2009), GECO 11.4 TFLOPS, NSC 1 PFLOPS (2011).

February 25, 2008 NCAR's Community Model Approach to Enabling Collaborative Research

WRF: Weather Research and Forecasting Model
- Over 3000 registered users make it the most used meteorological research model
- Operational use by the National Weather Service, US Navy, US Army, USAF, South Korean Meteorological Service, and Indian Meteorological Department
- Special forecasts are made by NCAR over the Antarctic in support of international operations there
(Figure: 36 h reflectivity forecast; 4 km WRF model radar reflectivity, 10 June, Z)

CCSM: Community Climate System Model
- State-of-the-art coupled climate system model
- Open source code and freely available data
- Significant development collaborations with 22 US universities, the US Department of Energy (LANL, LLNL, ORNL, ANL, LBL, NERSC), and NASA
- Largest contributor to the 2007 IPCC AR4: 1.4° resolution, 11,000 model years simulated, >100 TB of output data
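A quick consistency check on the CCSM AR4 figures above (a rough sketch using only the numbers on this slide plus the 5 PB archive mentioned earlier in the talk; actual per-year output varies with resolution and output frequency):

```python
# Rough output volume per simulated year for the CCSM IPCC AR4 runs.
total_output_tb = 100.0   # ">100 TB of output data" (lower bound from the slide)
model_years = 11_000      # "11,000 model years simulated"

tb_per_year = total_output_tb / model_years
print(f"~{tb_per_year * 1024:.1f} GB of output per simulated model year (at least)")

# At that rate, the 5 PB archive cited earlier could hold roughly this many
# AR4-style model years:
print(f"~{5 * 1024 / tb_per_year:,.0f} model years per 5 PB of archive")
```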

February 25, 2008 IPCC

February 25, 2008 Atmospheric Resolution: Number of Northern Hemisphere Cyclones (chart comparing T95, T159, and T255 model resolutions with ERA). Credit: Jung et al. 2006.
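For readers unfamiliar with spectral truncation labels, a rough conversion to grid spacing helps interpret T95, T159, and T255 (a sketch assuming the linear-Gaussian-grid rule of thumb; effective resolution depends on the grid actually used):

```latex
% Approximate equatorial grid spacing for triangular truncation T_N on a
% linear Gaussian grid: \Delta x \approx 360^\circ / (2(N+1)).
\Delta x_{T95}  \approx \tfrac{360^\circ}{192} \approx 1.9^\circ \approx 210\,\mathrm{km}, \quad
\Delta x_{T159} \approx \tfrac{360^\circ}{320} \approx 1.1^\circ \approx 125\,\mathrm{km}, \quad
\Delta x_{T255} \approx \tfrac{360^\circ}{512} \approx 0.7^\circ \approx 78\,\mathrm{km}.
```

So the chart compares cyclone counts at roughly 210 km, 125 km, and 78 km grid spacing.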