Electron Ion Collider: New aspects of EIC experiment instrumentation and computing, as well as their possible impact on and context in society. (B) COMPUTING

Trends in NP Computing
Don Geesaman (ANL): "It will be joint progress of theory and experiment that moves us forward, not in one side alone."
Martin Savage (INT): "The next decade will be looked back upon as a truly astonishing period in NP and in our understanding of fundamental aspects of nature. This will be made possible by advances in scientific computing (...)"
Exascale 2021 (ASCR): The Department's Exascale Computing Initiative intends to accelerate delivery of at least one exascale-capable system in 2021.

Computing Trends and EIC Computing
Think out of the box. The way physics analyses are done has been largely shaped by the kinds of computing that have been available so far. Computing is beginning to grow in very different ways, driven by very different forces than in the past (e.g., the Exascale Computing Initiative). This is a unique opportunity for NP to think about new possibilities and paradigms that can and should arise (e.g., online calibrations and analysis).
Future compatibility of hardware and software. The most powerful future computers will likely be very different from the kind of computers currently used in NP. This calls for a modular design, with structures robust against likely changes in the computing environment, so that changes in the underlying code can be handled without an overhaul of the entire structure (a minimal sketch of such a seam follows below).
User-centered design for enhancing scientific productivity. Engage the wider community of physicists, whose primary interest is not computing, in software design: understand the user requirements first and foremost, and base design decisions largely on those requirements.
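A minimal sketch of the kind of modular seam meant above, with every name hypothetical: the physics code talks only to an abstract interface, so the backend (a laptop, a farm, a future exascale allocation) can be swapped without touching the algorithm.

```python
from abc import ABC, abstractmethod

class ComputeBackend(ABC):
    """Abstract seam between physics code and the computing environment."""

    @abstractmethod
    def map(self, func, events):
        """Apply `func` to every event, however this backend parallelizes."""

class LocalBackend(ComputeBackend):
    def map(self, func, events):
        return [func(e) for e in events]      # serial reference implementation

class PoolBackend(ComputeBackend):
    def __init__(self, workers=8):
        from multiprocessing import Pool
        self._pool = Pool(workers)

    def map(self, func, events):
        return self._pool.map(func, events)   # same contract, different hardware

def reconstruct(event):
    # placeholder physics: the algorithm never learns which backend runs it
    return sum(hit["energy"] for hit in event["hits"])

backend = LocalBackend()                      # swap in PoolBackend() on a farm
energies = backend.map(reconstruct, [{"hits": [{"energy": 1.2}, {"energy": 0.4}]}])
print(energies)
```

Only the one-line backend choice changes when the computing environment changes; the reconstruction code is untouched.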

Implications of Exascale Computing
In the era of exascale computing, petascale-capable systems will be available at the beamline, allowing an unprecedented integration of detector components and computation. Such computer-detector integration would require fundamentally different algorithms, but it would eliminate at least some of the constraints of off-detector computing that force physics trade-offs. Petascale computing at the beamline would support a computing model that extends the work now under way at LHCb, which relies on machine learning at the trigger level and on computer-detector integration to deliver analysis-ready data from the DAQ system, i.e., online calibrations, event reconstruction, and physics analysis in real time (a toy sketch of this streaming model follows below). A similar approach would allow accelerator operations to run simultaneous simulations and deep learning over operational parameters to tune the machine for performance.
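A toy sketch of the streaming model described above, with all names and constants hypothetical: raw detector data is calibrated and filtered by a learned classifier as it arrives, so what leaves the DAQ system is already analysis-ready.

```python
import random

def read_stream(n_events):
    """Stand-in for a streaming-readout front end."""
    for _ in range(n_events):
        yield {"adc": [random.gauss(100, 15) for _ in range(64)]}

def calibrate(event, gain=0.25, pedestal=100.0):
    # online calibration: in a real system the constants would be
    # updated continuously, in situ, rather than fixed offline
    event["energy"] = [gain * (a - pedestal) for a in event["adc"]]
    return event

def ml_trigger(event, threshold=2.0):
    # stand-in for a trained classifier running at the trigger level;
    # a simple summary statistic plays the role of the model score here
    score = sum(event["energy"]) / len(event["energy"])
    return score > threshold

analysis_ready = (calibrate(e) for e in read_stream(10_000))
selected = [e for e in analysis_ready if ml_trigger(e)]
print(f"{len(selected)} events delivered analysis-ready")
```

The point of the sketch is the ordering: calibration and selection happen in the stream, not in a later offline pass, which is what removes the off-detector trade-offs mentioned above.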

Benefits to Large-Scale Computing
Novel computer architectures, first realized by lattice QCD theorists in 2000-2005 through a collaboration of Columbia, IBM, and RBRC, helped U.S. computer manufacturers gain world leadership in capability computing. Expertise in using field-programmable gate arrays and graphics cards for low-cost solutions to extremely CPU-intensive, repetitive computations was first put into practice in the Jefferson Lab lattice QCD calculations. The combination of graphics processing units (GPUs) and algorithm development has delivered computational speedups with gains of 4 to 11 (a toy benchmark illustrating how such gains are measured follows below). This is now applied on leadership GPU systems such as DOE's Titan (ORNL) and NSF's Blue Waters (NCSA, University of Illinois). The effort to boost computing capabilities will continue with the Exascale Computing Initiative and will enable an era of QCD calculations with high precision and high accuracy. Past efforts in lattice QCD, in collaboration with industry, have driven the development of new computing paradigms that benefit large-scale computation; these capabilities underpin many important scientific challenges, e.g., studying climate and heat transport over the Earth. The EIC will be the facility of the era of high-precision QCD and the first NP facility of the era of exascale computing. This will profoundly affect the interplay of experiment, simulation, and theory, and will result in a new computing paradigm that can be applied to other fields of science and industry.
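A toy benchmark sketch, assuming CuPy and a CUDA-capable GPU are available, showing how gains like the quoted 4-to-11x are typically measured: the same repetitive linear-algebra kernel, the pattern at the heart of iterative lattice QCD solvers, is timed on CPU and GPU.

```python
import time
import numpy as np

try:
    import cupy as cp   # assumption: CUDA GPU present; otherwise only the CPU path runs
except ImportError:
    cp = None

def bench(xp, sync=lambda: None, n=4096, reps=20):
    a = xp.random.rand(n, n).astype(xp.float32)
    v = xp.random.rand(n).astype(xp.float32)
    sync()
    t0 = time.perf_counter()
    for _ in range(reps):
        v = a @ v                      # repeated matrix-vector application,
        v /= xp.linalg.norm(v)         # as in a Krylov-type iterative solver
    sync()                             # GPU work is async; wait before stopping the clock
    return time.perf_counter() - t0

cpu = bench(np)
print(f"CPU: {cpu:.3f} s")
if cp is not None:
    bench(cp, sync=cp.cuda.Device(0).synchronize)           # warm-up (JIT, allocation)
    gpu = bench(cp, sync=cp.cuda.Device(0).synchronize)
    print(f"GPU: {gpu:.3f} s  (speedup ~{cpu / gpu:.1f}x)")
```

The measured ratio depends strongly on problem size and hardware; the 4-to-11 range cited above refers to the production lattice QCD codes, not to this toy kernel.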

Towards a computing vision for the EIC
- Extremely broad science program
- Strong interplay between theory and experiment
Lessons learned from the LHC:
- Computing is central to the success of the scientific goals
- Complexity of the analysis ecosystem limits time for physics analysis
- Strong role of deep learning
Era of exascale computing:
- Changing the paradigm for I/O, storage, and computation
- High-precision QCD calculations (MC, lattice QCD)
Computing requirements:
- Integration of DAQ, analysis, and theory
- Seamless data processing from the DAQ and trigger system to data analysis, using artificial intelligence
- Flexible, modular analysis ecosystem (one possible reading is sketched below)
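One possible reading of the "flexible, modular analysis ecosystem" requirement, as a sketch with every name hypothetical: analyses are declared as small, registered stages, so the whole analysis is data that a physicist can reorder, version, and share without touching framework internals.

```python
REGISTRY = {}

def stage(name):
    """Register a function as a named, reusable analysis stage."""
    def register(func):
        REGISTRY[name] = func
        return func
    return register

@stage("select")
def select(events):
    return [e for e in events if e["q2"] > 1.0]    # toy DIS-style kinematic cut

@stage("histogram")
def histogram(events):
    print(f"filling histograms for {len(events)} events")
    return events

# the analysis itself is just a declarative list of stage names
pipeline = ["select", "histogram"]

events = [{"q2": 0.5}, {"q2": 2.3}, {"q2": 4.1}]
for name in pipeline:
    events = REGISTRY[name](events)
```

Because stages share one simple contract (events in, events out), the same registry could hold selection, calibration, reconstruction, or ML-inference steps, which is how the integration of DAQ, analysis, and theory listed above could be expressed in practice.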