Advanced User Support
Amit Majumdar
5/7/09
Outline
- Three categories of AUS
- Update on operational activities
- AUS.ASTA
- AUS.ASP
- AUS.ASEOT
Three Categories of AUS
I. Advanced Support for TeraGrid Applications (AUS.ASTA)
   - AUS staff work on a particular user's code (usually)
   - Guided/initiated by the allocations process
II. Advanced Support for Projects (AUS.ASP)
   - Projects initiated by AUS staff (jointly with users) – impacts many users
III. Advanced Support for EOT (AUS.ASEOT)
   - Advanced HPC/CI part of EOT
Update on Operational Activities
- Bi-weekly telecon among AUS POCs from every RP site
  o Matching of AUS staff to ASTA projects; discussion of ASP and EOT activities
- Web/tele-con among AUS technical staff
  o Bi-weekly technical tele/web-conference on ASTA and other projects, using ReadyTalk
  o 13 presentation sessions (including today), 22 technical presentations
ASTAs Started in October 2008 – TRAC

#  | PI name             | Field         | AUS staff
1  | Jordan/USC          | Geosciences   | Cui (SDSC), Urbanic (PSC), Kim (TACC), Wong (NICS), Bock (NCSA)
2  | Mahesh/U. Minn      | CFD           | Tatineni (SDSC), Crosby (NICS), Cazes (TACC)
3  | Roitberg/U. Florida | Biochem       | Walker (SDSC), Pamidighantam (NCSA), Halloy (NICS)
4  | Voth/U. Utah        | Biochem       | Blood (PSC), Crosby (NICS), Peterson (TACC)
5  | Buehler/MIT         | Civil Engr    | Walker (SDSC)
6  | Durisen/IU          | Astro         | Henschel, Berry (IU)
7  | Papavassiliou/OU    | CFD           | Reddy (PSC)
8  | Van Einde/UCSD      | Str. Engr     | Choi (SDSC), Halloy (NICS), Cazes (TACC)
Startup/Supplemental ASTAs

#        | PI name                                      | Field              | AUS staff
1        | Schulten/UIUC                                | BioChem            | Koesterke, Liu, Milfeld (TACC)
2        | Ferrante/U. Washington                       | CFD/DNS/turbulence | Taha (NCSA), Pekurovsky (SDSC), Bock (NCSA), Peterson (TACC), Crosby (NICS)
3        | Tabakin/U. Pitt                              | Quantum Computing  | Gomez (PSC)
4        | Karimabadi/UCSD                              | Space Physics      | Tatineni (SDSC)
5        | Mashayek/UIC                                 | CFD                | Taha (NCSA)
6        | Malamataris/GM                               | CFD                | Brook (NICS)
7        | Hu/CSULB                                     | Physics            | Barth (TACC)
8        | Yeung/Gtech                                  | CFD/DNS            | Wong (NICS), Pekurovsky (SDSC)
9        | Finol/CMU                                    | CFD                | Jana (PSC)
10       | Geoffrey/CMU                                 | CS                 | Jana (PSC)
11,12,13 | Roberts/RTI; Karniadakis/Brown; Coveney/UCL  | CS/CFD/Chemistry   | Karonis (NIU, ANL)
14       | Cobb/ORNL                                    | Neutron Science    | Lynch, Chen (ORNL)
15       | Kanamitsu/SIO-UCSD                           | Climate Science    | Peterson (TACC)
16       | Burke/Pitt                                   | Medicine           | Brown (PSC)
ASTAs Started in January 2009 – TRAC

#  | PI name                 | Field             | AUS staff
1  | Aidun/GT                | CFD               | Dillman (Purdue), Khurram (NCSA)
2  | Cheatham/U. Utah        | Biochem           | Pamidighantam (NCSA), Crosby (NICS), Milfeld (TACC), Madrid (PSC), Walker (SDSC)
3  | Gygi/UCD                | Materials Science | Fahey (NICS), Kim (TACC)
4  | Jansen/RPI              | CFD               | Wong (NICS), Peterson (TACC), O'Neal (PSC)
5  | Van de Walle/UCSB       | Materials Science | Liu (TACC), Vanmoer (NCSA)
6  | Yip/MIT                 | Materials Science | Halloy (Yip)
7  | Scheraga/Cornell        | Chemistry         | Blood, Mahmoodi (PSC)
8  | Oleynik/U. South Florida| Materials Science | Barth (TACC), Crosby (NICS), Jundt (LONI)
ASTAs Started in April 2009 – TRAC

#  | PI name         | Field             | AUS staff
1  | Batista/LANL    | Materials Science | Liu (TACC)
2  | Schnetter/LSU   | Astrophysics      | Mahmoodi (PSC), Kufrin, Liu (NCSA), Pfeiffer (SDSC)
3  | Helly/SIO-UCSD  | Climate Science   | Cheeseman (Purdue), Jundt (LONI)
4  | Liu/UKY         | QCD               | Reddy (PSC), Fahey (NICS)
5  | Smith/Gtech     | CFD               | TBD upon PI's response
6  | Wheeler/UT      | CFD               | Khamra (TACC)
7  | Cen/Princeton   | Astrophysics      | Brook (NICS), Kharma (TACC), Chourasia (SDSC)
8  | Latour/Clemson  | BioChem           | TBD upon PI's response
9  | Apte/Oregon St. | CFD               | Rosales (TACC)
10 | Sandholm/CMU    | CS                | Welling (PSC)

Total number of active ASTAs: ~40
ASTA Update – PI: Durisen, IU, Astrophysics
AUS staff: Henschel, Berry (IU)
- Legacy OpenMP-parallel code benchmarked on three Altix systems (PSC, NCSA, ZIH) – with markedly different performance on each
- Optimized the subroutine that calculates the gravitational potential – 1.8X speedup
- Simulations generate several TBs of data, which is then analyzed interactively using IDL
- Transferring this amount of data to IU via traditional methods (ftp, scp, etc.) is extremely time consuming and tedious
- By mounting IU's Data Capacitor on Pople at PSC, the user can write data directly to IU and access it from servers in their department; files appear locally as they are generated by the simulation
- Extensive profiling and optimization of I/O performance on local scratch vs. the Data Capacitor eventually yielded a ~30% I/O speedup
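The pattern above – writing each output file straight to a mounted shared filesystem so it is visible at the home site as soon as it is closed, instead of staging TBs with scp afterwards – can be sketched as follows. This is not the actual simulation code; the mount point is a hypothetical placeholder, with a local fallback so the sketch runs anywhere.

```python
# Sketch: write simulation output directly to a mounted shared filesystem
# (e.g. a Lustre mount such as the Data Capacitor) rather than copying
# everything after the run. MOUNT is a made-up path for illustration.
import os
import tempfile

MOUNT = "/N/dc/scratch/myproject"  # hypothetical Data Capacitor mount point
outdir = MOUNT if os.path.isdir(MOUNT) else tempfile.mkdtemp()

def write_step(step, data):
    """Write one timestep; once the file is closed, analysis servers at the
    home site can open it immediately - no bulk transfer step needed."""
    path = os.path.join(outdir, f"step_{step:05d}.dat")
    with open(path, "w") as f:
        f.write(data)
    return path

paths = [write_step(i, f"density field for step {i}\n") for i in range(3)]
print([os.path.basename(p) for p in paths])
```

In production the only change is the output directory; the simulation's write pattern stays the same, which is why the remaining work on this ASTA was profiling local scratch against the mounted filesystem.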
ASTA Update – PI: Scheraga, Cornell, Biophysics
AUS staff: Blood, Mahmoodi (PSC)
- Identified a serial bottleneck and a load-imbalance problem that limited parallel scaling
- Eliminated the serial bottleneck and restructured the code to remove the imbalance
- The resulting code performs 4 times faster for large systems
- Optimization never ends: the new code's computation/communication balance has changed, so further profiling is ongoing with CrayPAT and TAU
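To illustrate the load-imbalance part of this work (this is a toy sketch, not the Scheraga group's code): when per-item costs are uneven, a naive contiguous split can leave one rank carrying most of the work, while a greedy largest-first assignment evens out the load. The parallel runtime is set by the most loaded rank, so the maximum per-rank cost is the number to minimize.

```python
# Sketch of static load balancing: contiguous split vs. greedy assignment.
def contiguous_split(costs, nranks):
    """Naive split: equal-length contiguous chunks, ignoring item cost."""
    chunk = (len(costs) + nranks - 1) // nranks
    return [costs[i * chunk:(i + 1) * chunk] for i in range(nranks)]

def balanced_split(costs, nranks):
    """Greedy: hand out the most expensive items first, always to the
    currently lightest-loaded rank."""
    bins = [[] for _ in range(nranks)]
    loads = [0] * nranks
    for c in sorted(costs, reverse=True):
        i = loads.index(min(loads))
        bins[i].append(c)
        loads[i] += c
    return bins

costs = [9, 1, 1, 1, 8, 1, 1, 1]  # uneven per-item cost
naive = max(sum(b) for b in contiguous_split(costs, 4))
bal = max(sum(b) for b in balanced_split(costs, 4))
print(naive, bal)  # the busiest rank's load drops from 10 to 9
```

In a real MD code the "costs" would be measured (e.g. with CrayPAT or TAU) rather than assumed, which is why the slide notes that profiling continues after each restructuring.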
ASTA Update – PI: Van de Walle, UCSB, Condensed Matter Physics
AUS staff: Liu (TACC), Vanmoer (NCSA)
- Main effort was to identify performance issues in VASP
- Identified the one routine that needed lower-level (-O1) compilation while the rest used -O3; this gave a ~10% performance improvement
- MKL on Ranger had SMP enabled with a default OMP_NUM_THREADS of 4, causing overhead; fixed with the proper wayness and thread count
- Proper setting of NPAR (which determines process grouping for band diagonalization and FFT) showed a 3-4x speedup
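As a rough sketch of the two runtime fixes above (illustrative values only, not the actual Ranger job scripts): the MKL threading overhead is addressed in the environment, and NPAR is a tag in VASP's INCAR input file.

```shell
# Sketch: runtime settings for a VASP run on a Ranger-like system.
# Values are illustrative, not taken from the ASTA work itself.
export OMP_NUM_THREADS=1   # keep MKL's SMP layer from oversubscribing cores

# INCAR fragment -- NPAR controls the process grouping used for band
# diagonalization and FFTs; a common starting point is on the order of
# the square root of the MPI process count, then tune from there:
#   NPAR = 16              # e.g. for a 256-process run

ibrun vasp                 # site MPI launch wrapper (placeholder command)
```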
Advanced Support Projects
Two projects ongoing:
- Benchmarking molecular dynamics codes
- Benchmarking materials science codes
Other potential projects we are looking into:
- Multi-core performance analysis
- Usage-based performance/profiling tools
Comparison of MD Benchmark on TeraGrid Machines at Different Parallel Efficiencies
Advanced Support EOT
- Advanced HPC classes at various RP sites
- TG09:
  - AUS staff participating in organizing TG09 and reviewing papers
  - AUS staff will present papers and tutorials at TG09
  - Joint US-AUS-XS working group meeting at TG09