TeraGrid Advanced User Support (AUS)
Amit Majumdar, SDSC
Area Director – AUS
TeraGrid Annual Review, April 6-7, 2009

Outline of This Short Talk
- Overview of TeraGrid-wide AUS
- Three Sub-Areas Under AUS
  - Advanced Support for TeraGrid Applications (ASTA)
  - Advanced Support Projects (ASP)
  - Advanced Support EOT (ASEOT)
- Benefit of TeraGrid-wide AUS
- Management and Operation of AUS
- Focus of PY05 Plan

Overview of TeraGrid-Wide AUS
- TeraGrid-wide AUS area initiated in August 2008
  - Until then, the effort was mostly RP-based
  - AUS staff are expert computational scientists (~90% hold a Ph.D.)
  - TG-wide AUS allows global management/coordination
    - Provides the best mix/match of staff expertise for users
    - Allows close interaction with DV, SGW, EOT, US, UFC
- This report/presentation covers both the pre-August RP-based effort and the post-August TG-wide AUS
  - TRAC allocations-based Advanced Support for TeraGrid Applications (ASTA) offered since early on

Three Sub-Areas Under AUS
- Advanced Support for TeraGrid Applications (ASTA)
- Advanced Support Projects (ASP)
- Advanced Support EOT (ASEOT)

AUS - ASTA
- Users request ASTA as part of the quarterly TRAC process
  - Reviewed by TRAC and a recommendation score provided
  - Scores taken into account to select ASTA projects
  - Other criteria:
    - Match of AUS staff expertise for the ASTA
    - RP site(s) where the PI has an allocation
    - ASTA work plan
    - Interest of the PI and PI team
- Additional mechanisms for ASTA introduced in Sept 2008:
  - Startup and Supplemental ASTA
  - The scope of a regular US activity elevated to ASTA
  - PI requests ASTA at the next TRAC (AUS work continues uninterrupted)

Number of ASTAs Started per Quarter – Trend
[Chart: quarterly count of new ASTAs, with the start of TeraGrid-wide AUS marked]
- Currently a total of ~40 active ASTAs from the last three TRACs, Startup/Supplemental requests, and continuing projects

ASTAs – Significant Deep & Broad Impact
- ASTAs breaking records
  - Astronomy: grid size, AMR depth
  - Earthquake simulations: grid size, 1 Hz, 100K cores
  - MD simulations: time scales, larger QM region
  - DNS turbulence: grid size (world-record grid-size benchmark)
- Other ASTA impacts
  - Million serial job submission
  - Workflow – data analysis
  - Math library analysis
  - Detailed visualization
  - SHMEM version of an MPI code
  - Innovative resource use
- 15 ASTAs contributed to Science Highlights
- Joint papers between ASTA staff and PI teams in journals, conferences, SC08 best poster, etc.
- Joint proposals between ASTA staff and PIs – several PetaApps and other NSF proposals

ASTA Example - CFD
- PI: Yeung, Georgia Tech
- World's largest DNS turbulence simulation – first-ever benchmarks on Ranger and Kraken
- Collaborative ASTA from three different RP sites with complementary expertise (a sketch of a collective-scaling measurement follows below)
  - FFT performance
  - MPI collective scaling
  - Performance analysis
  - Visualization
[Image: intense enstrophy (red iso-contours) in long but thin tubes surrounded by large energy dissipation (blue/green volume rendering). Image courtesy of Kelly Gaither (TACC), Diego Donzis (U. Maryland).]
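The MPI-collective-scaling work mentioned above typically starts from a micro-benchmark of the all-to-all exchange that dominates a pencil-decomposed 3-D FFT. The following is an illustrative sketch only, not the ASTA team's actual benchmark; the message size, iteration count, and reporting format are assumptions.

```c
/*
 * Illustrative sketch only -- not the ASTA team's actual benchmark.
 * Times MPI_Alltoall (the dominant collective in pencil-decomposed 3-D FFTs)
 * for a fixed per-rank message size: the kind of measurement used when
 * studying collective scaling on large systems.
 */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, nprocs;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    /* Assumed payload: 4096 doubles sent to every other rank. */
    const int count = 4096;
    const int iters = 50;

    double *sendbuf = malloc((size_t)count * nprocs * sizeof(double));
    double *recvbuf = malloc((size_t)count * nprocs * sizeof(double));
    for (int i = 0; i < count * nprocs; i++)
        sendbuf[i] = (double)rank;

    /* Warm-up call so the first timed iteration is not an outlier. */
    MPI_Alltoall(sendbuf, count, MPI_DOUBLE, recvbuf, count, MPI_DOUBLE,
                 MPI_COMM_WORLD);

    MPI_Barrier(MPI_COMM_WORLD);
    double t0 = MPI_Wtime();
    for (int it = 0; it < iters; it++)
        MPI_Alltoall(sendbuf, count, MPI_DOUBLE, recvbuf, count, MPI_DOUBLE,
                     MPI_COMM_WORLD);
    double elapsed = MPI_Wtime() - t0;

    /* Report the slowest rank's average time per collective. */
    double local_avg = elapsed / iters, max_avg;
    MPI_Reduce(&local_avg, &max_avg, 1, MPI_DOUBLE, MPI_MAX, 0,
               MPI_COMM_WORLD);
    if (rank == 0)
        printf("%d ranks: MPI_Alltoall avg %.6f s (%d doubles per rank pair)\n",
               nprocs, max_avg, count);

    free(sendbuf);
    free(recvbuf);
    MPI_Finalize();
    return 0;
}
```

Running such a kernel over a range of core counts (built with the site's MPI wrapper, e.g. mpicc) yields the scaling curve that guides decomposition and collective-algorithm choices.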

ASTA Example - MD
- PI: Roitberg, U. Florida – focuses on combined QM/MM simulations of biological enzymes
- Implemented high-performance, parallel QM/MM within AMBER
- Semi-empirical QM/MM now runs significantly faster
- Significantly larger QM regions can be studied in detail
- Resulted in progress towards treatment of Chagas' disease
- Work released through AMBER – impacts many TG and non-TG users; work transferable to CHARMM
- ASTA effort (a toy sketch of SCF convergence follows below):
  - Improve the serial and parallel efficiency of the DFTB QM/MM
  - Improve scaling to further accelerate the SCF convergence in QM calculations
[Image: a snapshot from a QM/MM simulation of the protein Nitrophorin-2 expressed in the triatome Rhodnius prolixus, a vector for Trypanosoma cruzi and a key target for Chagas' disease inhibitors. Image: A.E. Roitberg, G. Seabra (UFL), J. Torras (UPC-Spain), R.C. Walker (SDSC)]
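For readers unfamiliar with the term, the "SCF convergence" targeted above refers to the self-consistent-field cycle: a fixed-point iteration repeated until the electronic density stops changing. The toy C sketch below shows only the baseline pattern (simple linear mixing) that acceleration schemes such as DIIS improve on; it is not AMBER or DFTB code, and the update map, mixing parameter, and tolerance are invented for illustration.

```c
/*
 * Toy illustration only -- not the AMBER/DFTB implementation.
 * An SCF cycle is a fixed-point iteration: a "density" d is fed through
 * an update map until successive iterates agree to a tolerance.  Simple
 * linear mixing, shown here, is the baseline that acceleration schemes
 * (e.g. DIIS) improve on to cut the number of iterations.
 */
#include <math.h>
#include <stdio.h>

#define N 8  /* size of the toy "density" vector (assumed) */

/* Stand-in for the expensive Fock-build/diagonalize step. */
static void scf_map(const double *d_in, double *d_out)
{
    for (int i = 0; i < N; i++)
        d_out[i] = 0.5 * cos(d_in[i]) + 0.1 * d_in[(i + 1) % N];
}

int main(void)
{
    double d[N] = {0.0}, d_new[N];
    const double alpha = 0.6;   /* mixing parameter (assumed) */
    const double tol = 1e-10;

    for (int iter = 1; iter <= 200; iter++) {
        scf_map(d, d_new);

        /* Linear mixing: d <- (1-alpha)*d + alpha*F(d); track the largest
         * change between iterates as the convergence measure. */
        double max_delta = 0.0;
        for (int i = 0; i < N; i++) {
            double mixed = (1.0 - alpha) * d[i] + alpha * d_new[i];
            double delta = fabs(mixed - d[i]);
            if (delta > max_delta)
                max_delta = delta;
            d[i] = mixed;
        }

        printf("iter %3d  max |delta| = %.3e\n", iter, max_delta);
        if (max_delta < tol) {
            printf("converged after %d iterations\n", iter);
            break;
        }
    }
    return 0;
}
```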

AUS - ASP  Foundation work: installation of domain science, HPC specific software, associated optimization on machines, debugging, training, and interaction with users  Projects with opportunity of broad impact  Identified, with input from users, by AUS staff, other TG WGs  Example and potential projects  Benchmark widely used community Molecular Dynamics (NAMD, AMBER, other) and Materials Science (CPMD, VASP, other) applications on TeraGrid machines and provide well documented information  Beneficial to both users and TRAC reviewers  Provide usage scenario-based, technical documentation on effective use of profiling, tracing tools on TeraGrid machines  Analyze, benchmark hybrid and multi-core programming techniques  Document exemplary ways of using TG resources from ASTA projects, science highlights etc. 10

AUS - Advanced Support Project: Molecular Dynamics Benchmark
[Chart: Molecular Dynamics benchmark results]

AUS - ASEOT
- Contribute to and deliver advanced HPC/CI topics in training, tutorials, and workshops
- AUS outreach to the user community
- Organize/participate in workshops
  - FT workshop with Blue Waters, DOE labs, vendors
  - Special MD-HPC session at ACS
- Work with XS-WG and EOT to create a venue for petascale users and SDCI-based tool developers
- Interaction with other NSF-funded projects (e.g., DataNet, iPlant) to understand/anticipate their CI needs

Benefit of TeraGrid-wide AUS
- Rare pool of excellent staff for NSF – expertise and experience in HPC, domain science, and architectures
- Coordinated TeraGrid-wide AUS – mix/match of expertise from different RP sites
  - Molecular Dynamics expert + HPC performance tools expert
  - FFT expert + data (HDF5) expert
  - Parallel I/O expert + visualization expert
  - PETSc expert from one RP working on another RP site's machine
- Innovative ASP – significant user impacts
  - e.g., AMBER and NAMD experts collaborating on code scaling
  - e.g., IPM and TAU experts collaborating on profiling and tuning
- Complementary advanced training opportunities (at TG09, possibly SC09)

Management and Operation of AUS
- Managed by a team of AUS POCs from participating RP sites
  - Bi-weekly telecon for AUS projects, matching of AUS staff, contributing to reports, etc.
- Bi-weekly technical tele/webcon among AUS staff
  - Technical presentations on ASTA and ASP by AUS technical staff
  - Technical insight gained by all AUS staff on ASTAs, issues, resources
  - AUS staff expertise and research interests shared
  - Collaborative environment for AUS staff for current/future projects
- Other ASP-specific telecons as needed
- All presentations and meeting minutes available on the Wiki

Focus of PY05 Plan
- ASTA – build on experience from PY04 ASTAs
  - Utilize team expertise from PY04
  - Seek out new fields (economics, social science)
  - Thorough work plans with PIs
  - Utilize PY04 ASTA work, e.g., VASP optimized on machine X – impacts all current/future users
- ASP
  - Petascale, Data, Viz ASPs – possibly jointly with DV, SGW
  - User survey provides ideas
- EOT
  - Continue outreach to DataNet, iPlant
  - ASTA for groups under-represented in HPC/CI – with EOT/Pathways