HEP and NP SciDAC projects: Key ideas presented in the SciDAC II white papers
Robert D. Ryne


Projects
–Lattice QCD
–Supernova modeling
–Particle Physics Data Grid
–Accelerator modeling

National Computational Infrastructure for Lattice Gauge Theory
“The SciDAC Program has been enormously successful in producing community codes for terascale computing”
“It has done so by providing the support needed to enable apps scientists to work on such codes, and by encouraging collaborations among apps scientists, applied mathematicians and computer scientists in their development”
–“we strongly recommend that this very successful approach be continued”
“…it is important that these codes continue to evolve, since once codes become stagnant, they quickly become obsolete.”
“It is also important that codes be properly maintained and ported to new platforms, and that appropriate support be provided for the code users.”
“…we strongly recommend that DOE provide hardware with the capability and capacity to meet the needs of the SciDAC apps areas, either as part of an extension of the SciDAC Program, or as part of a separate program.”

National Computational Infrastructure for Lattice Gauge Theory, cont.
“While recognizing that the bulk of computing resources for areas other than LQCD will be provided by commercial supercomputers located at DOE centers, [the LQCD project] has clearly demonstrated that for our field designing hardware and software that specifically takes into account the structure of the computation is highly advantageous… We expect this approach to be useful in some other fields, and we urge that work in this direction by us and by others be supported in an extended SciDAC program.”
Future plans include:
–Continued software development of the QCD API
–Collaboration w/ TOPS on a multigrid algorithm for Dirac operator inversion (see the sketch after this slide)
–Build a unified user environment; run the BNL, FNAL, and JLab compute facilities as a single meta-facility (will use grid tools developed by groups such as PPDG)
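The multigrid item above refers to replacing part of the cost of Dirac operator inversion with corrections computed on coarser grids. The sketch below shows the bare two-grid structure (smooth, restrict, coarse-solve, prolong, smooth) on a 1D model problem in Python; it is only an illustration of the algorithm and assumes nothing about the actual QCD API, the TOPS solvers, or the gauge-field-dependent Dirac matrix used by the LQCD project.

```python
# Two-grid V-cycle for a 1D Poisson model problem.
# Illustrative stand-in for multigrid Dirac inversion; not SciDAC/LQCD code.
import numpy as np

def jacobi(A, x, b, iters=3, omega=2.0 / 3.0):
    """Weighted-Jacobi smoother: damps high-frequency error components."""
    d = np.diag(A)
    for _ in range(iters):
        x = x + omega * (b - A @ x) / d
    return x

def two_grid(A, x, b, P, R):
    """One cycle: pre-smooth, coarse-grid correction, post-smooth."""
    x = jacobi(A, x, b)                      # pre-smoothing
    r = b - A @ x                            # fine-grid residual
    Ac = R @ A @ P                           # Galerkin coarse operator
    ec = np.linalg.solve(Ac, R @ r)          # exact solve on the small coarse grid
    x = x + P @ ec                           # prolong the correction
    return jacobi(A, x, b)                   # post-smoothing

# Fine-grid 1D Laplacian (Dirichlet boundaries) as the model operator.
n = 63
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

# Linear-interpolation prolongation P and full-weighting restriction R = 0.5 * P^T.
nc = (n - 1) // 2
P = np.zeros((n, nc))
for j in range(nc):
    P[2 * j, j] = 0.5
    P[2 * j + 1, j] = 1.0
    P[2 * j + 2, j] = 0.5
R = 0.5 * P.T

b = np.random.default_rng(0).normal(size=n)
x = np.zeros(n)
for _ in range(10):
    x = two_grid(A, x, b, P, R)
print("residual norm:", np.linalg.norm(b - A @ x))
```

In production multigrid solvers for lattice QCD the coarse space is typically built adaptively from near-null vectors and the coarse solve is itself iterative, but the overall control flow matches this sketch.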

Supernova Modeling: TSI recommendations
“We strongly recommend a 5-year SciDAC follow-on program”
SAPP/ISIC comments:
–“We envision that our primary mode of multidisciplinary collaboration will be through SAPP.” [lift the 25% SAPP funding cap]
–“…we also envision significant continued interaction with the ISICs.”
–“successes…often the result of combined efforts of SAPP & ISIC teams.”
–“We imagine SAPP & ISICs as equal partners in next SciDAC round.”
“The next generation of SciDAC should focus on apps that can benefit from computational capabilities at the 100+ TFLOP level”
Verification & Validation (V&V) comments (see the sketch after this slide):
–“DOE/SC should demand rigorous V&V from its application teams”
–“demand that work that has the imprimatur of SciDAC funding should manifest and maintain rigorous and continuing V&V efforts, including publication of results of V&V test problems… insists that apps teams publish numerical details sufficient to allow replication of the results by other researchers…”
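As a concrete illustration of the kind of V&V test problem the quote asks teams to publish, the sketch below checks that a centered finite difference reproduces its theoretical second-order convergence and prints the numbers (step sizes, errors, observed orders) someone else would need to replicate the check. It is a generic example, not drawn from the TSI codes or test suite.

```python
# Minimal verification test: confirm the expected convergence order of a
# centered difference and print the data needed to replicate the check.
# Generic illustration only; not part of the TSI/SNSC codes.
import numpy as np

def centered_diff_error(h, x0=1.0):
    """Absolute error of the centered difference for d/dx sin(x) at x0."""
    approx = (np.sin(x0 + h) - np.sin(x0 - h)) / (2.0 * h)
    return abs(approx - np.cos(x0))

steps = [0.1 / 2**k for k in range(5)]
errors = [centered_diff_error(h) for h in steps]
orders = [np.log2(errors[k] / errors[k + 1]) for k in range(len(errors) - 1)]

for h, e in zip(steps, errors):
    print(f"h = {h:.6f}   error = {e:.3e}")
print("observed orders:", [f"{p:.3f}" for p in orders])

# The scheme is second order, so the observed orders should be close to 2.
assert all(abs(p - 2.0) < 0.1 for p in orders), "convergence-order check failed"
```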

Supernova Modeling: TSI recommendations, cont.
“…in order for the apps teams to fully leverage the efforts of the ISIC teams, the apps must have sufficient embedded personnel”
Software development by the ISICs should be carried out in close collaboration w/ the applications. “We have had limited success in incorporating products that were developed and then ‘thrown over the fence.’”
Take a long-term view of software developed by apps, SAPP, and ISICs -- beginning with R&D and continuing through deployment and maintenance.
“…we must establish an end-to-end computational science infrastructure to enable the scientific workflow, from simulation data generation and storage, data movement, data analysis, and scientific visualization” (a minimal workflow sketch follows)
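To make the end-to-end point concrete, here is a minimal sketch of a workflow driver chaining the four stages named in the quote: generation/storage, data movement, analysis, and visualization. All stage names, paths, and file formats are hypothetical placeholders, not any project's actual infrastructure.

```python
# Hypothetical end-to-end workflow driver: simulate -> archive -> analyze ->
# visualize. Stage names, paths, and file formats are placeholders only.
from pathlib import Path

def generate(run_dir: Path) -> Path:
    """Stand-in for a simulation run that writes a raw snapshot."""
    run_dir.mkdir(parents=True, exist_ok=True)
    raw = run_dir / "snapshot_000.dat"
    raw.write_text("t=0.0 rho_max=1.7\n")   # placeholder payload
    return raw

def move(src: Path, archive_dir: Path) -> Path:
    """Stand-in for data movement to an archive or analysis site."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    dst = archive_dir / src.name
    dst.write_bytes(src.read_bytes())
    return dst

def analyze(data: Path) -> dict:
    """Stand-in for post-processing that derives summary quantities."""
    fields = dict(tok.split("=") for tok in data.read_text().split())
    return {"rho_max": float(fields["rho_max"])}

def visualize(result: dict, report: Path) -> None:
    """Stand-in for the visualization/reporting step."""
    report.write_text(f"peak density: {result['rho_max']:.3f}\n")

if __name__ == "__main__":
    raw = generate(Path("scratch/run_001"))
    archived = move(raw, Path("archive/run_001"))
    summary = analyze(archived)
    visualize(summary, Path("archive/run_001/report.txt"))
    print("workflow complete:", summary)
```

In practice each stage would be a batch job or a wide-area transfer rather than a local function call; the point of the quote is that the hand-offs between stages need to be engineered rather than improvised.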

Supernova Modeling: SNSC summary
“Having the support of people w/ computational expertise is essential to our (and SciDAC’s) goals”
“The collab works best when there is a sustained effort in which the computer scientists become involved w/ the science goals (joint pubs a good indicator). Geographical proximity also helps.”
“There is a real need in the community for a special breed of computational *scientist* …trained in the intricacies of writing, optimizing, maintaining, and running large codes.”
“One of the major goals of SciDAC II should be the training of this next generation of specialists.”
“As the codes mature… ISICs in optimization, data mgmt, and visualization (sorely needed and still absent) will also be beneficial”

HEP Collaboratories Future
HEP and ASCR have jointly funded the Particle Physics Data Grid program (PPDG)
PPDG works closely with GriPhyN and iVDGL (NSF-funded) to create a working Grid environment for HEP and NP
Among the results are Grid3 (late 2004), the start of the Open Science Grid (now), and ongoing vigorous CS-HENP collaboration
PPDG-OSG already supports massive simulations for the LHC (simulation is not I/O intensive and is run by small teams)
Major challenges must be addressed to support data analysis (I/O intensive, many users)
Grids are central to the future of HENP computing
Strong encouragement from NSF MPS to propose continued work building on the successes of PPDG/GriPhyN/iVDGL.

Accelerator Science and Technology (AST) project
Excerpts from the draft white paper:
–“SciDAC has been an incredibly successful program. A key reason is that it encouraged collaboration and supported the formation of [multidisciplinary teams].”
–“While the SciDAC AST focused mainly on terascale problems, under the new project the focus will range from the terascale to the petascale depending on the problem”
–“Given the time required to develop major software packages as found under SciDAC, we are proposing an initial 5-year project duration…”
–“…just as SciDAC itself is a coordinated multi-program office activity, so too is the project that we have proposed in this white paper – a coordinated, multi-program office accelerator modeling initiative that builds upon the success of the SciDAC AST project.”

Accelerator Science and Technology (AST) project, cont.
Future management structure
–Through community discussions and meetings with DOE/SC Associate Directors & program managers, a follow-on to the AST project is being formulated.
–Coordination of accelerator modeling activities across the offices of DOE/SC is essential to make the most efficient use of SC resources
–Details are being worked out (e.g., one or multiple proposals)
–The community will formulate a science-driven plan; coordination details will be decided by the ADs and their Program Mgrs.