Computational Requirements for NP
Robert Edwards, Jefferson Lab

Cold and Hot QCD
Generate “snapshots” of gluon fields
–Integrate a D=4+1 version of Hamilton’s eqs.
–Expensive part: builds in all quantum fluctuations
Correlation functions computed over the “snapshots”
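For illustration only, here is a minimal sketch of the leapfrog integration of Hamilton’s equations in the fictitious (4+1-th) simulation-time direction, as typically used to generate field “snapshots”. It is applied to a toy Gaussian field, not an actual lattice gauge field, and all names (`leapfrog`, `grad_S`) are hypothetical rather than taken from any LQCD code.

```python
import numpy as np

def leapfrog(phi, mom, grad_S, n_steps, dt):
    """One molecular-dynamics trajectory: phi is the field, mom its
    conjugate momentum, grad_S the force derived from the action."""
    mom = mom - 0.5 * dt * grad_S(phi)      # opening half-step in momentum
    for _ in range(n_steps - 1):
        phi = phi + dt * mom                # full step in the field
        mom = mom - dt * grad_S(phi)        # full step in momentum
    phi = phi + dt * mom
    mom = mom - 0.5 * dt * grad_S(phi)      # closing half-step in momentum
    return phi, mom

# Toy example: free-field action S = 0.5 * sum(phi^2), so grad S = phi.
rng = np.random.default_rng(0)
phi = rng.normal(size=16)
mom = rng.normal(size=16)
phi_new, mom_new = leapfrog(phi, mom, lambda p: p, n_steps=20, dt=0.05)
```

In a real calculation this integration is wrapped in an accept/reject step and the force term involves the full gauge action and fermion determinant, which is where the expense of building in the quantum fluctuations comes from.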

Nuclear Structure + Reactions
“Ab-initio” methods
–Green’s function MC, No-core Shell Model(s), Coupled-cluster
Density-functional method

Nuclear Astrophysics
The works!
–QCD + QED + Weak (neutrinos) + Gravity (including General Relativity)
–Structure, reactions, detailed 3D radiation hydrodynamics, time evolution…
FLASH: thermonuclear supernovae
Core-collapse supernovae

Leadership resources essential

The “wide world” of performance improvement: LQCD as representative
Time-consumer: solving the Dirac equation (a large sparse linear system)
Heterogeneous system (CPU + GPU – TitanDev)
Increased performance: domain-decomposition methods (not bulk synchronous)
Example of path forward: collaboration with applied math/CS & in this case also industry!
100 TF/s on 768 GPUs
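As a rough sketch of what “solving a large sparse linear system” means here, the snippet below runs a generic Krylov (conjugate-gradient) solve on a stand-in sparse matrix. The matrix is an assumption chosen only to be symmetric positive definite; it is not the Dirac operator, and the production codes on TitanDev use preconditioned, domain-decomposed variants of such solvers on GPUs.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

# Stand-in sparse operator: tridiagonal, diagonally dominant, SPD.
n = 10_000
A = diags([4.0 * np.ones(n), -1.0 * np.ones(n - 1), -1.0 * np.ones(n - 1)],
          offsets=[0, -1, 1], format="csr")
b = np.ones(n)

# Iterative Krylov solve of A x = b.
x, info = cg(A, b)
assert info == 0  # 0 means the solver converged
```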

Trends
Architectures going “wide” – not (as) fast
A single narrative (NVIDIA, Intel, etc.) – only the degree of wideness differs
These new architectures open up new opportunities for parallelism
Bulk synchronization is bad; new programming models needed
Managing large simulations will be a challenge – solutions may well be application specific

Available cycles in US
LQCD (Cold + Hot)
–USQCD facilities (funded by HEP and NP): 300M core-hrs (clusters – capacity), plus 4.7M GPU-hrs → ~140M core-hrs; roughly half to NP ≈ 220M
–[NP] ALCC + INCITE + NERSC + NSF + Regional ≈ 200M [capability]
–Total to NP ≈ 420M core-hrs
Nuclear structure + reactions
–ALCC + INCITE → 12 + 60(+)M ≈ 75M
Nuclear astrophysics
–INCITE ≈ 150M
Across NP (LQCD + structure + astro)
–~650M core-hrs

Computing Requirements
Rough conversion factor estimated from LQCD calculations
–1 B core-hrs ≈ 0.1 PF-yr
Compare to resource requirements in the NP Exascale report
Where do we need to be to deliver on NSAC milestones?
Where does Moore’s Law get us?
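The arithmetic behind that rule of thumb can be sketched as below. The sustained per-core rate is an assumption picked so that 1B core-hrs comes out near the slide’s ~0.1 PF-yr; the true factor depends on the machine and the code’s sustained efficiency.

```python
HOURS_PER_YEAR = 8766.0   # average hours per year
PFLOPS = 1.0e15           # flop/s in a petaflop/s

def core_hours_to_pf_years(core_hours, sustained_flops_per_core=0.9e9):
    """Convert core-hours to sustained PF-years under an assumed
    ~0.9 Gflop/s sustained rate per core."""
    sustained_flops = core_hours * sustained_flops_per_core / HOURS_PER_YEAR
    return sustained_flops / PFLOPS

print(core_hours_to_pf_years(1.0e9))   # ~0.1 PF-yr, the slide's conversion
print(core_hours_to_pf_years(650e6))   # ~0.07 PF-yr for current NP usage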

Timelines: Cold QCD

Hot & Dense QCD
[Chart: Properties of dense QCD – desired trajectory vs. flat trajectory (Moore’s law)]

Nuclear Structure and Reactions
[Charts: Nuclear Reactions; Microscopic Theory of Fission – desired trajectory vs. flat trajectory (Moore’s law)]

Nuclear Astrophysics
[Charts: Solar modeling; Core-collapse supernovae – desired trajectory vs. flat trajectory (Moore’s law)]

How to deliver on science?
Clearly, more computational resources are essential to deliver on the science
Moore’s law does not get us there
However, it is unrealistic to expect funding agencies to solve the order-of-magnitude flops problem
Solution:
–Algorithms tied to emerging architectures
–People
Consider a Cold QCD milestone
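A small projection illustrates the point. Assuming a fixed budget and performance per dollar doubling every two years (the classic Moore’s-law cadence, which has in fact been slowing), and starting from the ~0.07 PF-yr implied by the earlier conversion sketch:

```python
def moores_law_projection(start_pf_yr, years, doubling_time_yr=2.0):
    """Capability at fixed budget, assuming performance per dollar
    doubles every doubling_time_yr years (an assumption)."""
    return start_pf_yr * 2.0 ** (years / doubling_time_yr)

# Six years of flat-budget Moore's-law growth: roughly 8x, not the
# orders of magnitude required by the desired trajectories.
print(moores_law_projection(0.07, years=6))   # ~0.56 PF-yr
```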

2018 NSAC milestone: exotic mesons
JLab 12 GeV turns on

Delivering on science
Increased levels of HPC (cycles)
Improve algorithms & extract more performance
Partnership with CS & applied math: SciDAC
Partnerships with vendors
Partnerships with Leadership Computing Facilities
Increased level of HPC staffing

Funding profiles for NP Theory computations
Computation (cycles)
–DOE & NSF Centers [increase ~5x in 2013]
–USQCD facilities
Software infrastructure, interdisciplinary support
–DOE: SciDAC, Topical Centers
–NSF: MRI, PIF, former PetaApps
Base funding/leveraging
–Leverage off university as well as lab positions
–Bridge/joint positions with labs – can be (and have been!) targeted for computations
In particular, concern over software/interdisciplinary support
–Support for junior positions
–Support for work beyond (outside) base funding

SciDAC, topical centers, …
SciDAC:
– : ~$3M/yr
– : ~$3M/yr
–2012: ~$3M/yr
–2013: $1M
–2014: ??
DOE Topical centers:
– : ~$1M/yr

Answers for the Tribble Committee
What is the minimum level of support needed to maintain a viable program in computational nuclear physics?
At the minimum, a vibrant and healthy program of computational nuclear physics should advance nuclear theory, enhance our understanding of experimental results, and be competitive internationally. A vibrant program will support the achievement of NSAC milestones and deliver a compelling case for continued support of the Nuclear Physics program within the US.
What workforce is needed to maintain a viable program? What will it require to take the community to the exascale era?
Computational nuclear physics bridges many areas of science, and as such the expertise of a broad range of individuals – physicists, computer scientists, applied mathematicians, as well as students – is vital to the success of the program. As computational environments become more diverse, it is crucial that the workforce include individuals with the skills to master these challenges. This workforce will follow the emerging trends in these architectures and will develop the algorithms and software infrastructure needed to exploit them and advance the overall program. Interdisciplinary programs such as SciDAC will become even more crucial in the future to help foster such collaborations.

Will others solve this for us?
Japan:
–“Origins of the Cosmos” is one of 5 major fields for the K computer at RIKEN [LQCD, nuclear structure (mostly Cold QCD) and cosmology]
Large HPC efforts in Japan (RIKEN), China and Europe
–LQCD: significant efforts in hadron and nuclear spectroscopy as well as nuclear structure. These groups have significant experience in all aspects of the calculations. Research in support of large experimental facilities: J-PARC (Japan), BES (Beijing), GSI (Germany), LHC (CERN)
–Nuclear structure + reactions: significant efforts internationally. US well organized.
–Nuclear astrophysics: significant efforts internationally. US well organized.