TeraScale Supernova Initiative: A Networker's Challenge
An 11-Institution, 21-Investigator, 34-Person Interdisciplinary Effort

Presentation transcript:

TeraScale Supernova Initiative: A Networker's Challenge. An 11-institution, 21-investigator, 34-person interdisciplinary effort to:
- ascertain the core-collapse supernova mechanism(s);
- understand supernova phenomenology, e.g., (1) element synthesis and (2) neutrino, gravitational wave, and gamma-ray signatures;
- provide a theoretical foundation in support of Office of Science (OS) experimental facilities (RHIC, SNO, RIA, NUSEL);
- develop enabling technologies of relevance to many applications, e.g., 3D, multifrequency, precision radiation transport;
- serve as a testbed for the development and integration of technologies in the simulation "pipeline", e.g., data management, networking, data analysis, and visualization.
Explosions of massive stars. Relevance: element production, cosmic laboratories, driving application. With ISIC and other collaborators: 77 people from 24 institutions involved.

We need the full neutrino distribution, which means we need its spectrum and its angular distribution, which means we need a Boltzmann solution. Add fluid instabilities, rotation, and magnetic fields, and the result is a 6D RMHD problem (see the dimension count below). And we need these to few-percent accuracy!
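Where the "6D" comes from is standard kinetic-theory bookkeeping (our gloss; the slide does not spell it out): the neutrino distribution function depends on three spatial coordinates, the neutrino energy, and two propagation angles, all evolved in time and coupled to the magnetohydrodynamics of the stellar fluid:

\[
f = f\bigl(t;\ \underbrace{r,\ \theta,\ \phi}_{\text{3 space}};\ \underbrace{E}_{\text{1 energy}};\ \underbrace{\mu,\ \phi_p}_{\text{2 angles}}\bigr).
\]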

Symmetry cases: spherical symmetry, axisymmetry, no symmetry. Dominant computation: nonlinear, integro-partial differential equations for the radiation distribution functions. Example: the Boltzmann transport equation for spherical symmetry (sketched below).
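The equation itself appeared on the slide as a figure. As a minimal sketch (the standard static-background form, omitting the velocity-dependent and energy-coupling terms a production solver carries), with distribution function f(t, r, mu, E) and mu the cosine of the propagation angle:

\[
\frac{1}{c}\frac{\partial f}{\partial t}
+ \mu\,\frac{\partial f}{\partial r}
+ \frac{1-\mu^{2}}{r}\,\frac{\partial f}{\partial \mu}
= \frac{1}{c}\left(\frac{\delta f}{\delta t}\right)_{\!\text{collisions}},
\]

where the collision term on the right-hand side contains the integrals over angle and energy that make the system integro-partial differential.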

Data management. A 3D hydrodynamics run with
- 5 variables (density, entropy, three fluid velocities),
- a 1024 x 1024 x 1024 Cartesian grid, and
- 1000 time steps
yields a 43-terabyte dataset (checked in the sketch below), and that is "the flea on the tail on the dog...": the multidimensional neutrino data are a 13-petabyte dataset, with ~3 petabytes arriving in weeks to months on a petaflop (PF) platform.
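The 43 TB figure follows directly from the run parameters, assuming one 8-byte double-precision value per variable per cell per saved step (the precision is our assumption; it is not stated on the slide):

```python
# Back-of-the-envelope size of the 3D hydrodynamics dataset described above.
n_vars = 5            # density, entropy, three fluid velocity components
n_cells = 1024 ** 3   # 1024 x 1024 x 1024 Cartesian grid
n_steps = 1000        # saved time steps
bytes_per_value = 8   # assumes double precision (our assumption)

total_bytes = n_vars * n_cells * n_steps * bytes_per_value
print(f"{total_bytes / 1e12:.0f} TB")  # -> 43 TB
```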

Networking needs:
- Raw bandwidth: what about the radiation field data? 3 PB!
- Bulk data transfer.
- Collaborative visualization: interactive visualization and real-time collaboration.
These require end-to-end dedicated paths/bandwidth, on demand, and protocols that provide this capability; no existing protocol delivers 10 Gbps throughput with stable control (work with Nagi Rao). A rough transfer-time estimate follows below.
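To see why the raw-bandwidth number is daunting, a rough estimate (our illustration, assuming a fully utilized dedicated path): even at a sustained 10 Gbps, moving 3 PB takes roughly a month.

```python
# Rough transfer time for the 3 PB radiation field dataset.
size_bits = 3e15 * 8   # 3 PB expressed in bits
rate_bps = 10e9        # sustained 10 Gbps dedicated path (assumed)

seconds = size_bits / rate_bps
print(f"{seconds / 86400:.0f} days")  # -> 28 days
```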

Addressing bulk data transfer needs: Logistical Networking. A lightweight, low-level, deployable solution... and a new paradigm:
- Integrate storage and networking.
- Multi-source, multi-stream transfers (Atchley, Beck, and Moore 2003; a generic sketch of the idea follows below).
Data transfer rates in the Mbps range using TCP/IP! The limit is set by the ORNL firewall; greater rates are expected outside the firewall and with other protocols (e.g., SABUL). Direct impact on TSI's ability to do work!
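For intuition, here is a generic multi-source, multi-stream download in the spirit described above. This is an illustrative sketch only, not the LoCI/IBP toolkit API; the depot URLs, chunk size, and helper names are hypothetical.

```python
# Sketch: pull one file from several replica "depots" over parallel streams.
# NOT the Logistical Networking (LoCI/IBP) API; depot URLs are hypothetical.
import concurrent.futures
import urllib.request

DEPOTS = [  # hypothetical depots, each holding a replica of the same file
    "http://depot-a.example.org/tsi/run042.dat",
    "http://depot-b.example.org/tsi/run042.dat",
]
CHUNK = 64 * 1024 * 1024  # 64 MiB per HTTP range request

def fetch_range(job):
    """Fetch one byte range, round-robining requests across depots."""
    index, (start, end) = job
    depot = DEPOTS[index % len(DEPOTS)]
    req = urllib.request.Request(depot, headers={"Range": f"bytes={start}-{end}"})
    with urllib.request.urlopen(req) as resp:
        return start, resp.read()

def download(total_size, out_path, streams=8):
    """Download a file of known size using several concurrent streams."""
    jobs = list(enumerate(
        (off, min(off + CHUNK, total_size) - 1)
        for off in range(0, total_size, CHUNK)
    ))
    with open(out_path, "wb") as out, \
         concurrent.futures.ThreadPoolExecutor(max_workers=streams) as pool:
        for start, data in pool.map(fetch_range, jobs):
            out.seek(start)  # write each chunk at its own offset
            out.write(data)
```

Striping requests across multiple sources and streams lets a transfer ride out any single slow path, which is the property the slide is advertising.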

- Without putting in place the needed computational science infrastructure, our science will simply be inaccessible in the future.
- Significant progress has been made in linear solvers, performance analysis and optimization, data management and analysis, networking, and visualization.
- In particular, Logistical Networking has provided an easily deployable solution to our current bulk data transfer needs and has had a significant impact on TSI's current ability to do science.
- TSI's future data management and networking needs are daunting: within the next two years, TSI will generate hundreds of terabytes of simulation data per simulation. What then?
- Meeting these needs will require every new idea. Investment now in networking technologies will allow us to meet these needs.