High Energy and Nuclear Physics Collaborations and Links
Stu Loken, Berkeley Lab, HENP Field Representative

SciDAC Supernova Science Center (UCSC, LANL, LLNL, U. Arizona)
Primary Goal: A full understanding, achieved through numerical simulation, of how supernovae of all types explode and how the elements have been created in nature, together with comparison of these results against astronomical observations and the abundance pattern seen in the Sun.

Major Challenges
- For gravitational collapse supernovae – three-dimensional hydrodynamics coupled to radiation transport, including regions that are optically gray
- For thermonuclear supernovae – turbulent combustion at low Prandtl number and very high Rayleigh number
- For nucleosynthesis – standardized nuclear data in a machine-friendly format
- Computational – optimization of codes on parallel computers; manipulation and visualization of large data sets; development of novel approaches to radiation transport and hydrodynamics on a grid
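For context (a reminder added here, not text from the slide), the dimensionless numbers named above have their standard definitions:

\[
\mathrm{Pr} = \frac{\nu}{\kappa},
\qquad
\mathrm{Ra} = \frac{g\,\beta\,\Delta T\,L^{3}}{\nu\,\kappa},
\]

where \(\nu\) is the kinematic viscosity, \(\kappa\) the thermal diffusivity, \(g\) the gravitational acceleration, \(\beta\) the thermal expansion coefficient, \(\Delta T\) the temperature contrast, and \(L\) the characteristic length scale. Roughly speaking, degenerate electrons in white-dwarf matter conduct heat very efficiently, so \(\kappa \gg \nu\) and \(\mathrm{Pr} \ll 1\), while the large length scales push \(\mathrm{Ra}\) to extremely high values.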

First three-dimensional calculation of a core-collapse supernova. The figure shows iso-velocity contours (1000 km/s) 60 ms after core bounce in a collapsing massive star, calculated by Fryer and Warren at LANL using SPH (300,000 particles). The resolution is poor and the neutrinos were treated artificially (trapped or freely streaming, with no gray region), but such calculations will be used to guide our further code development. The box is 1000 km across.
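As an illustration of the SPH method mentioned above (a minimal Python sketch, not code from the Fryer and Warren calculation; the function names are ours), the density at each particle is a kernel-weighted sum over neighboring particles:

```python
import numpy as np

def w_cubic_spline(r, h):
    """Standard 3-D cubic-spline SPH kernel with support radius 2h."""
    q = np.asarray(r, dtype=float) / h
    sigma = 1.0 / (np.pi * h**3)                  # 3-D normalization constant
    w = np.zeros_like(q)
    inner = q < 1.0
    outer = (q >= 1.0) & (q < 2.0)
    w[inner] = 1.0 - 1.5 * q[inner]**2 + 0.75 * q[inner]**3
    w[outer] = 0.25 * (2.0 - q[outer])**3
    return sigma * w

def sph_density(pos, mass, h):
    """Brute-force SPH density estimate rho_i = sum_j m_j W(|r_i - r_j|, h).

    pos  : (N, 3) array of particle positions
    mass : (N,)   array of particle masses
    h    : smoothing length (taken constant here for simplicity)
    """
    rho = np.empty(len(mass))
    for i in range(len(mass)):
        r = np.linalg.norm(pos - pos[i], axis=1)  # distances from particle i
        rho[i] = np.sum(mass * w_cubic_spline(r, h))
    return rho

# Example: density of 500 equal-mass particles scattered in a unit box.
rng = np.random.default_rng(42)
pos = rng.random((500, 3))
mass = np.full(500, 1.0 / 500)
print(sph_density(pos, mass, h=0.1)[:5])
```

Production SPH codes replace this O(N^2) neighbor loop with tree or cell lists and evolve a per-particle smoothing length, but the density estimate itself has this form.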

Nucleosynthesis in a 25 solar mass supernova compared with abundances in the sun. Abundances for over 2000 isotopes of the elements from hydrogen through polonium were followed in each of approximately 1000 zones throughout the life of the star and its simulated explosion as a supernova. Our library will eventually include 300 such models. A production factor of 20 for every isotope would mean that every isotope in the sun could be created if 1 gram in 20 of the primordial sun had been ejected by 25 solar mass supernovae like this one. Actually making the solar abundances requires many stars of different masses.
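To spell out the arithmetic behind the last two sentences (assuming the usual definition of the production factor as an isotope's mass fraction in the ejecta relative to its solar value):

\[
P_i \equiv \frac{X_i^{\mathrm{ejecta}}}{X_{i,\odot}},
\qquad
f\,P\,X_{i,\odot} = X_{i,\odot} \;\Rightarrow\; f = \frac{1}{P}.
\]

If every isotope had the same production factor \(P \approx 20\), then mixing a mass fraction \(f = 1/20\) of such ejecta into otherwise unenriched gas would reproduce the solar pattern, which is the sense in which one gram in twenty suffices.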

General Approach
- Develop an ensemble of appropriate codes – We are exploring several approaches to hydrodynamics, at both high and low Mach numbers. Monte Carlo transport will serve to test and validate the other approaches (a toy sketch follows below).
- Obtain short-term results for guidance – Useful results can be obtained in 3D using a Lagrangian, particle-based code (SPH) and rudimentary neutrino transport. These can guide future work.
- Work with in-house experts in computer science – There are experts in radiation hydrodynamics, e.g., Monte Carlo, at LANL. The University of Arizona Center for Integrative Modeling and Simulation and the High Performance Distributed Computing Laboratory will help with code development and visualization.
- Work with other SciDAC centers – see the following slides.
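The toy sketch referred to above: a minimal grey Monte Carlo packet transport through a 1-D slab, of the kind used to cross-check deterministic transport schemes. It is an illustration only, not one of the project codes; the function name and parameters are hypothetical.

```python
import numpy as np

def mc_escape_fraction(tau_total, albedo, n_packets=100_000, seed=0):
    """Toy grey Monte Carlo transport through a 1-D slab of optical depth tau_total.

    Packets enter at tau = 0 heading inward.  At each interaction they scatter
    isotropically with probability `albedo`, otherwise they are absorbed.
    Returns the fraction escaping through the far side (tau >= tau_total).
    """
    rng = np.random.default_rng(seed)
    escaped = 0
    for _ in range(n_packets):
        tau, mu = 0.0, 1.0                        # optical-depth position, direction cosine
        while True:
            tau += mu * (-np.log(rng.random()))   # fly to the next interaction point
            if tau >= tau_total:                  # escaped out the far boundary
                escaped += 1
                break
            if tau <= 0.0:                        # back-scattered out the near boundary
                break
            if rng.random() > albedo:             # absorbed
                break
            mu = 2.0 * rng.random() - 1.0         # isotropic scattering
    return escaped / n_packets

# e.g. mc_escape_fraction(5.0, albedo=0.9) gives a transmission fraction that
# can be compared against a deterministic solution of the same slab problem.
```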

Other Related SciDAC Activities
- Terascale Supernova Initiative – We will make our presupernova models and nuclear data libraries publicly available, and we plan joint meetings with the TSI teams.
- High Performance Data Grid Toolkit and DOE Science Grid – These national collaboratories and networking centers, working together with the computer scientists on our team at Arizona, can help us optimize our codes for large-scale, distributed computing environments (Grid computing). The PIs of these centers have ongoing interactions with our team members at the HPDC laboratory in Arizona.
- Algorithmic and Software Framework for Applied Partial Differential Equations (APDEC) Center – Phil Colella is a co-author of one of our main codes (PPM). We are interested in learning new techniques, especially for following low Mach number flows.

Other Related SciDAC Activities (continued)
- Terascale High Fidelity Simulations of Turbulent Combustion with Detailed Chemistry – Type Ia (thermonuclear) supernovae are prime examples of turbulent combustion. We have worked for years with experts at Sandia's combustion center in Livermore.
- Scalable Systems Software Center and the Center for Component Technology for Terascale Simulation Software – We will collaborate with experts at these two centers to make our codes scalable and component-based, and to run them efficiently in Grid computing environments.
- Particle Physics Data Grid (PPDG) – Grid-enabled tools for data-intensive requirements; also possible common interest in Monte Carlo and other particle transport schemes. Discussions have begun with Richard Mount.

ADVANCED COMPUTING FOR 21ST CENTURY ACCELERATOR SCIENCE & TECHNOLOGY
Accelerators are crucial to scientific discoveries in high energy physics, nuclear physics, materials science, and biological science.

ASE Partnerships in Applied Math & Comp. Sci.
The success of the ASE will require close collaboration with applied mathematicians and computer scientists to enable the development of high-performance software components for terascale platforms.
Collaborating SciDAC Integrated Software Infrastructure Centers:
- TOPS – Eigensolvers, Linear Solvers
- TSTT – Mesh Generation & Adaptive Refinement
- CCTTSS – Code Components & Interoperability
- APDEC – Parallel Solvers on Adaptive Grids
- PERC – Load Balancing & Communication
Collaborating National Labs & Universities:
- NERSC – Eigensolvers, Linear Solvers
- Stanford – Linear Algebra, Numerical Algorithms
- UC Davis – Parallel Visualization, Multi-resolution Techniques
- LANL – Statistical Methods for Computer Model Evaluation

- FNAL, BNL – High Intensity Beams in Circular Machines
- UCLA, USC, Tech-X – Plasma-Based Accelerator Modeling
- UC Davis – Particle & Mesh Visualization
- Stanford, NERSC – Parallel Linear Solvers & Eigensolvers
- LBNL – Parallel Beam Dynamics Simulation
- SLAC – Large-Scale Electromagnetic Modeling
- SNL – Mesh Generation & Refinement
- U. Maryland – Lie Methods in Accelerator Physics
- LANL – High Intensity Linacs, Computer Model Evaluation

\[
M = e^{:f_2:}\, e^{:f_3:}\, e^{:f_4:} \cdots,
\qquad
N = A^{-1} M A
\]
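For readers unfamiliar with the Lie-algebraic notation in the map above (standard usage in accelerator map methods, though not defined on the slide): the operator \(:f:\) acts on a phase-space function \(g\) by Poisson bracket, and \(e^{:f:}\) is the symplectic map it generates,

\[
:f:\,g \equiv [f, g],
\qquad
e^{:f:} g = g + [f,g] + \tfrac{1}{2!}\,[f,[f,g]] + \cdots,
\]

so \(M\) factors into maps generated by homogeneous polynomials \(f_n\) of successive degree in the phase-space variables, and \(N = A^{-1} M A\) is the same map expressed in normal-form coordinates via the transformation \(A\).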

Collaborations and Links

ISIC      Acc. Sim / QCD / TSI / SN Res.
TOPS      ** ? *
CCTTSS    ** *
TSTT      ** *
APDEC     ***
SSS       *
PERC      ** *
SDMC      ***

More Possible Links

Link         Acc. Sim / QCD / TSI / SN Sci.
PPDG         ** *
SAM          *
Sci Grid     *
Networks     **
Combustion   **
Viz          ***