Algorithms for Lattice Field Theory at Extreme Scales
Rich Brower 1*, Ron Babich 1, James Brannick 2, Mike Clark 3, Saul Cohen 1, Balint Joo 4, Tony Kennedy 5, James Osborn 6, Claudio Rebbi 1
1 Boston University, 2 Penn State, 3 Harvard-Smithsonian, 4 Jefferson Lab., 5 U. of Edinburgh, 6 Argonne Lab. (*presenter)

The exascale era promises to dramatically expand the ability of lattice field theory to investigate the multiple scales of nuclear physics, the quark-gluon plasma, and possible dynamics beyond the standard model. Increasingly complex scale-aware QCD algorithms pose a challenge to software engineering and to the co-design of the heterogeneous architectures that must support them. At present the multi-GPU cluster offers a useful preview of this challenge, at the level of hundreds of cores per node with a relatively low-bandwidth interconnect. New algorithms developed to meet this challenging architecture include communication reduction by (Schwarz) domain decomposition, multi-precision arithmetic, and data compression with on-the-fly local reconstruction. The QUDA library [4], developed at Boston University, is being used as the software platform for these early investigations.

Lattice Field Theory Core Algorithms

The core algorithms are:
1. Sparse matrix solvers for the quark propagating in a "turbulent" chromo-electromagnetic background field.
2. Symplectic integrators for the Hamiltonian evolution that produces a stochastic ensemble of these fields.

Current state-of-the-art QCD algorithms exploit the simplicity of a uniform space-time hypercubic lattice mapped onto a homogeneous target architecture to achieve nearly ideal scaling. Nonetheless, this single-grid paradigm is very likely to be modified substantially at extreme scales: neither the lattice physics nor the computer hardware is intrinsically single-scaled.

Multi-scale Challenge

In QCD, for example, uniquely non-perturbative quantum effects spontaneously break conformal and chiral symmetry, giving rise to a ratio of length scales m_proton / m_pi ≈ 7 that only recent advances in simulation are just beginning to fully resolve. As a consequence, the most efficient Dirac solvers are just now becoming multi-scale as well. Future advances will reveal more opportunities for multi-scale algorithms.

[Figure: Visualization [1] of the scales in a gluonic ensemble: a (lattice) << 1/m_proton << 1/m_pi << L (box).]

Heterogeneous Computer Architectures

At the same time, hardware suited to exascale performance is expected to become increasingly heterogeneous, with O(1000) cores per node coupled to hierarchical networks -- the GPU cluster being an early precursor of this trend.

Adaptive Multigrid Inverter

Adaptive multigrid automatically discovers the near-null space in order to construct the coarse-grid operator. Applied to the Wilson-clover Dirac inverter on a 32^3 x 256 lattice, it outperforms single-grid methods by 20x at the lightest quark mass [2]. Extensions of adaptive multigrid are under development for Domain Wall and Staggered fermions, as well as for the Hamiltonian evolution used to generate lattice ensembles.

[Figure: The multigrid V-cycle: smoothing on the fine grid, restriction to a smaller coarse grid, and prolongation (interpolation) back to the fine grid.]
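To make the V-cycle described above concrete, here is a minimal two-grid sketch in Python/NumPy. It is only an orientation aid, not the QUDA or Wilson-clover implementation: the 1-D Laplacian stand-in, the weighted-Jacobi smoother, the piecewise-constant (aggregation-style) prolongation, and the function names (laplacian_1d, jacobi_smooth, v_cycle) are all illustrative assumptions. Adaptive multigrid would instead build the prolongation from numerically computed near-null-space vectors of the Dirac operator.

```python
# Illustrative two-level V-cycle (NOT the QUDA implementation): pre-smooth,
# restrict the residual, solve the coarse-grid problem, prolongate the
# correction, post-smooth.
import numpy as np

def laplacian_1d(n):
    """Simple 1-D Laplacian as a stand-in for the (much richer) Dirac operator."""
    return 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

def jacobi_smooth(A, x, b, sweeps=2, omega=0.8):
    """Weighted-Jacobi smoother: damps the high-frequency error components."""
    D_inv = 1.0 / np.diag(A)
    for _ in range(sweeps):
        x = x + omega * D_inv * (b - A @ x)
    return x

def v_cycle(A, x, b, P):
    """One two-grid V-cycle; P is the prolongation, P^T the restriction."""
    x = jacobi_smooth(A, x, b)                      # pre-smoothing on the fine grid
    r_coarse = P.T @ (b - A @ x)                    # restrict the residual
    A_coarse = P.T @ A @ P                          # Galerkin coarse-grid operator
    e_coarse = np.linalg.solve(A_coarse, r_coarse)  # (small) coarse-grid solve
    x = x + P @ e_coarse                            # prolongate the correction
    return jacobi_smooth(A, x, b)                   # post-smoothing

# Piecewise-constant prolongation from n/2 coarse points to n fine points.
n = 64
P = np.kron(np.eye(n // 2), np.ones((2, 1)))
A, b, x = laplacian_1d(n), np.ones(n), np.zeros(n)
for _ in range(10):
    x = v_cycle(A, x, b, P)
print("residual norm:", np.linalg.norm(b - A @ x))
```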
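In the same spirit, the multi-precision arithmetic mentioned in the abstract can be illustrated by defect-correction iterative refinement: the bulk of the work runs in low precision and the residual is corrected in full precision. Again this is a hedged NumPy sketch under assumed names (solve_low_precision, mixed_precision_solve) and a toy test matrix, not QUDA's interface; in practice the low-precision work sits inside a Krylov solver rather than a direct solve, and the payoff on bandwidth-limited GPUs is that the inner iterations move half (or a quarter) of the data of a double-precision solve.

```python
# Sketch of mixed-precision defect correction (iterative refinement):
# inner solve in float32, residual and correction accumulated in float64.
import numpy as np

def solve_low_precision(A, b):
    """Stand-in 'inner solve' done entirely in float32 (e.g. on the GPU)."""
    A32, b32 = A.astype(np.float32), b.astype(np.float32)
    return np.linalg.solve(A32, b32).astype(np.float64)

def mixed_precision_solve(A, b, tol=1e-12, max_iter=20):
    x = np.zeros_like(b)
    for _ in range(max_iter):
        r = b - A @ x                               # residual in float64
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        x = x + solve_low_precision(A, r)           # cheap low-precision correction
    return x

rng = np.random.default_rng(0)
A = 2.0 * np.eye(100) + 0.01 * rng.standard_normal((100, 100))  # well-conditioned toy matrix
b = rng.standard_normal(100)
x = mixed_precision_solve(A, b)
print("relative residual:", np.linalg.norm(b - A @ x) / np.linalg.norm(b))
```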
Hamiltonian Integrators

The Force Gradient integrator is an optimized fourth-order, multi-time-step algorithm for Hybrid Monte Carlo (HMC) sampling of the gauge field ensemble. The error in the true Hamiltonian, plotted as a function of step size, demonstrates its superiority for light quark masses [3].

GPU Testbed and Future Directions

As a test of concept, the Wilson multigrid inverter combined with GPU technology is estimated to reduce the cost per Dirac solve by O(100).

References
[1] Derek Leinweber.
[2] R. Babich, J. Brannick, R. Brower, M. Clark, T. Manteuffel, S. McCormick, J. Osborn, C. Rebbi, arXiv:
[3] A. D. Kennedy, M. A. Clark, P. J. Silva, arXiv:
[4] M. A. Clark, R. Babich, K. Barros, R. C. Brower, C. Rebbi, arXiv: v2