1 NERSC User Group Business Meeting, June 3, 2002 High Performance Computing Research Juan Meza Department Head High Performance Computing Research.

2 Computational Research in Balance with the NERSC Center
- An organizational balance between the production facility and research activities is essential to the success of both
- The NERSC Center benefits directly from DOE-funded computer science research projects
- The requirements of the NERSC Center stimulate computer science research and keep it closely relevant to the DOE mission

3 National Energy Research Scientific Computing (NERSC) Division [org chart]

Division Director: Horst Simon
Division Deputy Director: William Kramer
Division Administrator & Financial Manager: William Fortney
Chief Technologist: David Bailey
Center for Bioinformatics & Comp. Genomics: Manfred Zorn

High Performance Computing Department (William Kramer, Department Head)
- Advanced Systems: Tammy Welcome
- Computational Systems: Jim Craw
- Computer Operations Networking Support: William Harris
- Networking & Security: Howard Walter
- User Services: Francesca Verdier
- HENP Computing: David Quarrie
- Mass Storage: Nancy Meyer

High Performance Computing Research Department (Juan Meza, Department Head)
- Applied Numerical Algorithms: Phil Colella
- Center for Computational Science & Engr.: John Bell
- Future Technologies: Brent Gorda
- Imaging & Collaborative Computing: Bahram Parvin
- Scientific Computing: Esmond Ng
- Visualization: Wes Bethel
- Scientific Data Management: Arie Shoshani

Distributed Systems Department (William Johnston, Department Head; Deb Agarwal, Deputy)
- Collaboratories: Deb Agarwal
- Data Intensive Dist. Computing: Brian Tierney
- Distributed Security Research: Mary Thompson
- Networking: William Johnston (acting)
- Grid Technologies: Keith Jackson

4 Future Technology Group
- Mission: investigate, understand, and recommend technologies of importance to the NERSC Center and its user base in the 3-5 year timeframe
- Checkpoint/Restart to enable production computing on Linux
  - Target OS is Linux; user- or system-administrator-initiated
  - Specifically aimed at parallel MPI jobs
  - Part of the Scalable Systems Software SciDAC project
- InfiniBand study
  - IB is the next "big thing" in system-area networking
  - Switched fabric, OS bypass, one-sided messaging, self-healing
  - IPv6 addressing
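The checkpoint/restart effort above targets transparent, system-level checkpointing of parallel MPI jobs under Linux. As a rough illustration of the underlying idea only, here is a minimal *application-level* checkpoint/restart loop; the `ckpt.pkl` filename and the toy accumulation loop are hypothetical, not part of the actual project:

```python
import os
import pickle
import tempfile

# Hypothetical checkpoint location; a real MPI job would write per-rank
# state to a parallel file system instead.
CKPT = os.path.join(tempfile.gettempdir(), "ckpt.pkl")

def save_checkpoint(state, path=CKPT):
    # Write to a scratch file, then atomically rename, so an interrupted
    # write never leaves a corrupt checkpoint behind.
    tmp = path + ".tmp"
    with open(tmp, "wb") as f:
        pickle.dump(state, f)
    os.replace(tmp, path)

def load_checkpoint(path=CKPT):
    # Resume from the last checkpoint if one exists, else start fresh.
    if os.path.exists(path):
        with open(path, "rb") as f:
            return pickle.load(f)
    return {"step": 0, "total": 0.0}

if os.path.exists(CKPT):
    os.remove(CKPT)                      # start clean for this demo

state = load_checkpoint()
for step in range(state["step"], 10):    # restartable work loop
    state["total"] += step
    state["step"] = step + 1
    save_checkpoint(state)               # checkpoint after every step
```

If the process is killed mid-loop, rerunning the script resumes from the last completed step rather than from zero, which is the behavior the kernel-level work provides without any application changes.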

5 Big Data and Remote Visualization: Visapult
- Motivation: remote, interactive visualization of large scientific data over a wide-area network
- Framework and application for remote direct volume visualization of large structured-mesh data
- Visapult was the winner of the SC2001 Bandwidth Challenge

6 Scientific Computing Group
- Computational materials science: electronic structure calculations, molecular dynamics
- Environmental and earth sciences: climate modeling, groundwater flow simulation, modeling of the earth
- Computational physics: cosmic microwave background radiation, high-energy neutrino detection, accelerator modeling
- Numerical linear algebra: eigen/singular value computations, sparse matrix computations, extended-precision basic linear algebra subprograms, environments and tools
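The extended-precision BLAS work listed above addresses the rounding error that ordinary floating-point accumulation incurs. Compensated (Kahan) summation is a small, self-contained illustration of the kind of technique involved; this sketch is illustrative, not the group's actual code:

```python
def kahan_sum(values):
    """Compensated summation: carries the low-order bits that a naive
    running sum discards at each addition."""
    total = 0.0
    c = 0.0                    # running compensation term
    for v in values:
        y = v - c              # apply the correction from the last step
        t = total + y
        c = (t - total) - y    # recover what this addition just lost
        total = t
    return total

# Adding many tiny terms to 1.0: each term is below half an ulp of 1.0,
# so a naive sum drops every one of them; the compensated sum does not.
vals = [1.0] + [1e-16] * 10
naive = sum(vals)              # stays at exactly 1.0
better = kahan_sum(vals)       # close to the true value 1 + 1e-15
```

The extended-precision BLAS achieve a similar effect inside kernels such as dot products by accumulating in higher internal precision.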

7 Advanced Computational Testing and Simulation (ACTS)
- Make ACTS tools available on NERSC platforms
- Enable large-scale scientific applications
- Extended support for experimental software
- Perform independent evaluation of tools
- Provide consistent application interfaces
- Coordinate efforts with other supercomputing centers
- Educate and train
- Provide technical support
- Maintain the ACTS information center

8 Use of ACTS Tools
- Scattering in a quantum system of three charged particles (Rescigno, Baertschy, Isaacs, and McCurdy, Dec. 24, 1999), using SuperLU
- Cosmic microwave background analysis, BOOMERanG collaboration, MADCAP code (Apr. 27, 2000), using ScaLAPACK
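SuperLU, used in the scattering calculation above, is a direct solver for sparse linear systems, and it is the backend SciPy's sparse LU solve dispatches to. A tiny system gives the flavor of how such a solver is invoked; the tridiagonal test matrix here is a stand-in, not the actual scattering matrix, and SciPy is assumed to be available:

```python
import numpy as np
from scipy.sparse import csc_matrix
from scipy.sparse.linalg import spsolve

# Small 1-D Poisson (tridiagonal) system standing in for the large
# sparse systems arising in the three-body scattering calculation.
n = 5
A = csc_matrix(
    np.diag(2.0 * np.ones(n))
    + np.diag(-1.0 * np.ones(n - 1), 1)
    + np.diag(-1.0 * np.ones(n - 1), -1)
)
b = np.ones(n)

x = spsolve(A, b)                      # sparse direct solve via SuperLU
residual = np.linalg.norm(A @ x - b)   # should be at rounding level
```

At the scales on this slide the matrix comes from a PDE discretization with millions of unknowns, but the calling pattern (assemble sparse matrix, factor, solve) is the same.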

9 Vision
- Listening mode
- Some observations (after 2 months):
  - Clear that NERSC is doing world-class research
  - Leaders in the application of modeling and simulation to science
  - Good balance of applications, computer science, and algorithms that benefits the entire community
- Challenges:
  - Delivering on the SciDAC goals
  - Responding to the Earth Simulator: advanced architecture research
  - Maintaining a balance that will continue the successful track record