Electron Ion Collider
New aspects of EIC experiment instrumentation and computing, as well as their possible impact on and context in society
(B) COMPUTING
Trends in NP Computing

Don Geesaman (ANL): “It will be joint progress of theory and experiment that moves us forward, not in one side alone.”

Martin Savage (INT): “The next decade will be looked back upon as a truly astonishing period in NP and in our understanding of fundamental aspects of nature. This will be made possible by advances in scientific computing (…)”

Exascale 2021 (ASCR): The Department’s Exascale Computing Initiative intends to accelerate delivery of at least one exascale-capable system in 2021.
Computing Trends and EIC Computing

- Think out of the box: The way physics analyses are done has been largely shaped by the kinds of computing available so far. Computing will grow in very different ways in the future, driven by very different factors than in the past (e.g., the Exascale Computing Initiative). This is a unique opportunity for NP to think about new possibilities and paradigms that can and should arise (e.g., online calibrations and analysis).
- Future compatibility of hardware and software: The most powerful future computers will likely be very different from the kind of computers currently used in NP. This calls for a modular design, with structures robust against likely changes in the computing environment, so that changes in the underlying code can be handled without an entire overhaul of the structure (see the sketch after this list).
- User-centered design for enhancing scientific productivity: Engage the wider community of physicists, whose primary interest is not computing, in software design, in order to understand the user requirements first and foremost and to make design decisions largely based on those requirements.
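A minimal sketch of what such modularity could look like in practice, assuming a Python-based framework; the names (ReconstructionBackend, CPUBackend, cluster_hits, reconstruct) are purely illustrative and not part of any existing EIC software:

```python
from abc import ABC, abstractmethod

class ReconstructionBackend(ABC):
    """Hardware-agnostic interface for one reconstruction step.
    Analysis code depends only on this interface, so CPU, GPU, or
    future exascale implementations can be swapped in without an
    overhaul of the calling code."""

    @abstractmethod
    def cluster_hits(self, hits):
        """Group raw detector hits into clusters."""

class CPUBackend(ReconstructionBackend):
    """Simple, portable reference implementation."""
    def cluster_hits(self, hits):
        # toy stand-in for a real clustering algorithm
        return sorted(hits)

def reconstruct(events, backend):
    """User-facing entry point; unaware of the underlying hardware."""
    return [backend.cluster_hits(hits) for hits in events]

# Selecting a backend is a configuration choice, not a code rewrite.
print(reconstruct([[3, 1, 2], [5, 4]], CPUBackend()))
```

The design choice here is that only the backend implementations need to track changes in the computing environment; the analysis-facing code stays stable.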
Implications of Exascale Computing

In the era of Exascale Computing, petascale-capable systems will be available at the beamline and would allow for an unprecedented integration of detector components and computation. Such a computer-detector integration would require fundamentally different algorithms, but would eliminate at least some of the constraints of off-detector computing that result in physics trade-offs. Petascale computing at the beamline would facilitate a computing model that extends the approach currently being pursued by LHCb, which relies on machine learning at the trigger level and on computer-detector integration to deliver analysis-ready data from the DAQ system, i.e., online calibrations, event reconstruction, and physics analysis in real time (a toy sketch of such a data flow is given below). A similar approach would allow accelerator operations to use simultaneous simulations and deep learning over operational parameters to tune the machine for performance.
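The following is only a toy sketch of that data flow, assuming a Python-style streaming pipeline; daq_stream, ml_trigger, and online_calibration are hypothetical placeholders, and a real system would run trained models on dedicated hardware near the detector rather than the trivial cut shown here:

```python
import random

def daq_stream(n_events):
    """Stand-in for the DAQ: yields raw events in a made-up format."""
    for i in range(n_events):
        yield {"id": i, "raw_energy": random.uniform(0.0, 100.0)}

def ml_trigger(event):
    """Placeholder for a trained classifier deciding, in real time,
    whether an event is worth keeping; here just a toy threshold."""
    return event["raw_energy"] > 20.0

def online_calibration(event, gain=1.02, offset=-0.5):
    """Apply calibration constants derived online rather than weeks later."""
    event["energy"] = gain * event["raw_energy"] + offset
    return event

def analysis_ready(stream):
    """Chain trigger decision and calibration so that what leaves the
    DAQ system is already usable for physics analysis."""
    for event in stream:
        if ml_trigger(event):
            yield online_calibration(event)

for event in analysis_ready(daq_stream(1000)):
    pass  # fill histograms, run the physics analysis in real time
```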
Benefits to Large Scale Computing

Novel computer architectures, first realized by Lattice QCD theorists in the 2000-2005 period through a collaboration of Columbia, IBM, and RBRC, have allowed U.S. computer manufacturers to gain world leadership in capability computing. The expertise in using field-programmable gate arrays and graphics cards for the low-cost solution of extremely CPU-intensive, repetitive computations was first applied in the Jefferson Lab Lattice QCD calculations. The combination of graphics processing units (GPUs) and algorithm development has established computational speedups with gains of 4 to 11. This approach is now applied on leadership GPU systems such as DOE's Titan (ORNL) and NSF's Blue Waters (NCSA, University of Illinois). The effort to boost computing capabilities will continue with the Exascale Computing Initiative and will enable an era of QCD calculations with high precision and high accuracy. Past efforts in lattice QCD in collaboration with industry have driven the development of new computing paradigms that benefit large-scale computation. These capabilities underpin many important scientific challenges, e.g., studying climate and heat transport over the Earth. The EIC will be the facility of the high-precision QCD era and the first NP facility in the era of Exascale Computing. This will affect the interplay of experiment, simulations, and theory profoundly and result in a new computing paradigm that can be applied to other fields of science and industry.
Towards a computing vision for the EIC

- Extremely broad science program: strong interplay between theory and experiment
- Lessons learned from the LHC: computing is central to the success of the scientific goals; the complexity of the analysis ecosystem limits the time spent on physics analysis; strong role of deep learning
- Era of Exascale Computing: changing the paradigm for I/O, storage, and computation; high-precision QCD calculations (MC, Lattice QCD)
- Computing requirements: integration of DAQ, analysis, and theory; seamless data processing from the DAQ and trigger system to data analysis using artificial intelligence; a flexible, modular analysis ecosystem (see the sketch below)
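As one illustration of the last requirement, a minimal sketch of a flexible, modular analysis chain in Python; the stage names (decode, calibrate, reconstruct) and the list-as-configuration pattern are assumptions for illustration, not an existing EIC design:

```python
# Toy sketch: processing stages are small, independent callables that
# can be reordered or replaced through configuration alone.

def decode(event):
    """Turn DAQ/trigger output into a structured event (toy format)."""
    return {"hits": event}

def calibrate(event):
    """Apply calibrations; online, these could be AI-derived constants."""
    event["hits"] = [h * 1.01 for h in event["hits"]]
    return event

def reconstruct(event):
    """Build a simple physics quantity from the calibrated hits."""
    event["total_energy"] = sum(event["hits"])
    return event

# The chain itself is data, so adding or swapping a stage does not
# require changes to the rest of the framework.
PIPELINE = [decode, calibrate, reconstruct]

def process(raw_events, pipeline=PIPELINE):
    for event in raw_events:
        for stage in pipeline:
            event = stage(event)
        yield event

results = list(process([[1.0, 2.0], [3.0]]))
```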