CISL Update: Operations and Services. CISL HPC Advisory Panel (CHAP) Meeting, 4 October 2007. Tom Bettge, Director of Operations and Services.


CHAP Meeting 4 October 2007: CISL Update, Operations and Services. CISL HPC Advisory Panel Meeting, 4 October 2007. Tom Bettge, Director of Operations and Services, Computational and Information Systems Laboratory.

CHAP Meeting 4 October 2007 ... during the past six months: blueice and bluevista (system photos).

CHAP Meeting 4 October 2007 Resource Utilization

CHAP Meeting 4 October 2007 Challenges for CISL and CHAP (a good thing!!), from the April CHAP Meeting
- Witness: 1.7M GAUs available, 1.0M GAUs requested
- Witness: NCAR resources increasing (another bump in Sept 2008); university use could fall behind NCAR use
- Tapping "latent" GAUs? (CISL's challenge)
  - Special queue access by request for large university users
  - Consider requiring projects with large allocations to use them in a "paced" manner (monthly, quarterly, etc.; a pacing sketch follows this slide)
- Filling the surplus gap for university PIs
  - Awareness (CISL and NSF)
  - Mirror the NCC (only one project was selected for BTS in Nov; 5 were submitted)
  - Leave well enough alone: "Build it and they will come."
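One way to make the "paced" requirement concrete is to compare a project's cumulative usage against a straight-line pace over its allocation period. A minimal sketch in Python; the allocation size, usage numbers, and tolerance threshold below are illustrative assumptions, not CISL policy:

# Minimal sketch of a "paced usage" check for a large allocation.
# The allocation size, elapsed months, usage, and tolerance below are
# made-up illustrative values; they are assumptions, not CISL policy.

def pace_status(allocation_gaus, months_total, months_elapsed, gaus_used,
                tolerance=0.25):
    """Flag a project that is behind a straight-line monthly pace."""
    expected = allocation_gaus * months_elapsed / months_total
    behind = gaus_used < (1.0 - tolerance) * expected
    return expected, behind

expected, behind = pace_status(allocation_gaus=400_000, months_total=12,
                               months_elapsed=6, gaus_used=120_000)
print(f"Expected by now: {expected:,.0f} GAUs; behind pace: {behind}")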

CHAP Meeting 4 October 2007 Large Community Requests (10^6 GAUs)

CHAP Meeting 4 October 2007 Community (non-CSL) GAUs Used

CHAP Meeting 4 October 2007 Breakthrough Science (BTS) Initiative Spring 2007

CHAP Meeting 4 October 2007 Breakthrough Science Projects Usage
PI              | Affiliation                  | NSF Prog | Title                                                                                                     | GAUs Used
Eric Chassignet | Florida State                | OCE      | Building a New Version of CCSM3 with HYCOM as the Ocean Model                                             | 452,000
Dave Randall    | CSU                          | ATM      | Global Climate Modeling with Super Parameterization                                                       | 325,000
Brian Savage    | University of Rhode Island   | EAR      | 3-D Tomography of the Crust and Upper Mantle Beneath the Gulf Extensional Province and Baja California    | 1,000
Bill Smyth      | Oregon State                 | OCE      | Instability and Turbulence in a Sheared, Diffusively Unstable Fluid                                       | 279,000
Ron Cohen       | Carnegie Inst. of Washington | EAR      | Quantum Monte Carlo on Geophysical Materials                                                              | 521,000
David Yuen      | University of Minnesota      | EAR      | (1) Poro-elastic Wave Propagation in Oil Reservoirs; (2) 3-D Tsunami Wave Simulations                     | 39,000
Annick Pouquet  | NCAR                         | NCAR     | Small Scale Structures in MHD Turbulence: Toward a Better Understanding of the Magnetosphere              | 233,000
Rich Rotunno    | NCAR                         | NCAR     | High Resolution Hurricane Simulations                                                                     | 726,000

CHAP Meeting 4 October 2007 University and NCAR Capability Computing
PI           | Affiliation | NSF Prog | Title                                                                  | GAUs Used | Program
David Straus | COLA        | ATM      | Detection and Attribution of Climate Change: The Interactive Ensemble  | 424,000   | University Capability Computing
Chris Davis  | NCAR        | NCAR     | Advanced Hurricane WRF Model Trials                                    | 227,000   | NCAR Capability Computing

CHAP Meeting 4 October 2007 Program Distribution

CHAP Meeting 4 October 2007 BTS Outcomes
- CISL was able to provide large amounts of computing time through the use of special queues and the fair-share scheduler
- Three university projects have submitted large requests to CHAP because of BTS or UCC simulations:
  - Straus, 423K
  - Cohen, 250K
  - Smyth, 70K
- Very likely to provide an early Capability Computing Opportunity in 2008 with the availability of bluefire (some caveats here):
  - community eligibility TBD
  - start the solicitation earlier
  - require more detailed proposals
  - users provide NCAR with more precise benchmarks (NCAR will help in advance)
  - projects which need resources and are ready to go

CHAP Meeting 4 October 2007 ICESS Computing Status

CHAP Meeting 4 October 2007 ICESS System, Phase I and Phase II
Phase I (blueice), Jan 2007 to ~Jun 2008:
- 1.9 GHz POWER5+ p575 SMP nodes, dual-link Federation Switch
- 4 TB memory, 150 TB disk
- 1600 batch processors (16 processors/node)
- 4 bluesky equivalents, 12 TFLOPs peak
Phase II (bluefire), ~Jun 2008 to ~Jun 2011:
- 4.7 GHz POWER6 p575+ SMP nodes, quad-link Federation Switch
- 9.2 TB memory, 150 TB disk
- ~3200 batch processors (32 processors/node)
- 15.5 bluesky equivalents, 57.6 TFLOPs peak
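As a rough cross-check on the peak figures above: theoretical peak is processors times clock rate times floating-point operations per cycle. A sketch assuming 4 flops per cycle per POWER5+/POWER6 processor and counting only the batch processors listed above (so the bluefire figure overshoots the quoted 57.6 TFLOPs slightly):

# Rough cross-check of the quoted peak TFLOPs, assuming 4 floating-point
# operations per cycle per processor (two fused multiply-adds) and using
# only the batch-processor counts from the slide above.

def peak_tflops(processors, clock_ghz, flops_per_cycle=4):
    """Theoretical peak performance in TFLOPs."""
    return processors * clock_ghz * flops_per_cycle / 1000.0

print(f"blueice:  {peak_tflops(1600, 1.9):.1f} TFLOPs")   # ~12.2; slide quotes 12 TFLOPs peak
print(f"bluefire: {peak_tflops(3200, 4.7):.1f} TFLOPs")   # ~60.2; slide quotes 57.6 TFLOPs peak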

CHAP Meeting 4 October 2007 Sustained Capacity – with ICESS

CHAP Meeting 4 October 2007 ICESS Subcontract Performance Commitments
- blueice is 4 bluesky-equivalents
- bluefire is to be 15.5 bluesky-equivalents
The subcontract commits average code speedups for each code, run on the indicated range of processor counts; the committed columns are average blueice speedup over bluesky, average estimated bluefire speedup over bluesky, and estimated bluefire speedup over blueice.
- CAM (64, 256)
- HD3D (8, 16, 32, 64, 128)
- POP (8, 16, 24, 32, 48, 64, 128)
- WRF (1, 2, 4, 8, 16, 32, 64, 128, 256)
- Weighted Average*
*Weighted Average: 45% CAM, 10% HD3D, 20% POP, 25% WRF
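The footnote defines how the committed weighted average is formed from the four per-code speedups. A minimal sketch of that calculation; the per-code speedup values below are placeholders for illustration, not the subcontract commitments:

# Weighted-average speedup over bluesky, using the weights from the slide
# footnote: 45% CAM, 10% HD3D, 20% POP, 25% WRF.  The per-code speedups
# below are PLACEHOLDER values for illustration, not the committed numbers.

weights  = {"CAM": 0.45, "HD3D": 0.10, "POP": 0.20, "WRF": 0.25}
speedups = {"CAM": 3.1, "HD3D": 2.8, "POP": 3.3, "WRF": 3.0}  # placeholders

weighted_avg = sum(weights[code] * speedups[code] for code in weights)
print(f"Weighted-average speedup: {weighted_avg:.2f}")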

CHAP Meeting 4 October 2007 Facility Constraints

CHAP Meeting 4 October 2007 Liquid Cooling!!!!
- POWER6 uses even more direct methods of heat transfer and reduces the reaction time available for mechanical systems

CHAP Meeting 4 October 2007 bluefire Schedule
- April 2008: delivery; power up enough of bluefire to match the blueice production workload
- May 2008: ATP completed on partial bluefire
- June 2008: blueice decommissioned
- July 2008: ATP completed on entire bluefire
- Aug-Nov 2008: Capability Computing Opportunity (UCC and NCC details TBD)
- October 2008: bluevista decommissioned
- Dec 2008: full availability of bluefire (e.g., CSL not allocated fully until December 1)

CHAP Meeting 4 October 2007 MSS Plans

CHAP Meeting 4 October 2007 MSS Growth

CHAP Meeting 4 October 2007 MSS Observations
- Current growth rate is TB/month
- Capacity of the current archive (6 PB) will be exhausted June-Sept 2008
- Growth rate will be higher with bluefire availability
- Estimate capacity of ≥20 PB needed prior to the move to the NSC
- Near-term plan:
  - Engage vendors with estimated requirements
  - Request proposals for a 4-year contract to partner with NCAR for MSS expansion
  - Award, install, and begin production by Sept 2008
  - Phased decommission of current silos
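The exhaustion window follows from a straight-line projection of archive growth against the 6 PB capacity. A sketch of that projection; the current-holdings and growth-rate values are assumptions for illustration only, since the monthly growth figure is not stated above:

# Straight-line projection of Mass Storage System growth toward the 6 PB
# capacity of the current archive.  The starting size and monthly growth
# rate are ASSUMED values for illustration only.

current_pb = 5.2          # assumed current archive holdings, in PB
growth_tb_per_month = 80  # assumed growth rate, in TB/month
capacity_pb = 6.0         # capacity of the current archive (from the slide)

months_to_full = (capacity_pb - current_pb) * 1000.0 / growth_tb_per_month
print(f"Months until the {capacity_pb:.0f} PB archive is full: {months_to_full:.1f}")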

CHAP Meeting 4 October 2007 CAS2K7, September 9-13, 2007
- Keynote speakers: emphasis on petascale computing, interdisciplinary applications, and data access
- Trends:
  - Power/performance impacts on facilities are huge issues
  - Grids becoming increasingly useful
  - Important problems are interdisciplinary
  - Increasing parallelism, not faster clocks

CHAP Meeting 4 October 2007 Questions and Discussion

CHAP Meeting 4 October 2007 Main HPC Systems at NCAR
System    | Processor        | Clock   | Peak        | Power
bluevista | IBM POWER5 p575  | 1.9 GHz | 4.7 TFLOPs  | 210 kW
blueice   | IBM POWER5+ p575 | 1.9 GHz | 12.0 TFLOPs | 290 kW
bluefire  | IBM POWER6 p575+ | 4.7 GHz | 57.6 TFLOPs | 700 kW

CHAP Meeting 4 October 2007 HPC Resource Split: NCAR Capability Computing

CHAP Meeting 4 October 2007 Breakthrough Science (BTS) Initiative, Spring 2007
- First four months of blueice production use
- 75% of blueice devoted to BTS
  - NSF selected five projects
  - CHAP selected one project
  - NCAR selected two projects
- Consultant assigned to each project
- Special queues (bts_prm, bts_reg, bts_sby, bts_ded) provided prototype access for capability computing