1 DOE Office of Science www.science.doe.gov/scidac October 2003
SciDAC: Scientific Discovery through Advanced Computing
Alan J. Laub

2 Introduction
SciDAC is a $60+M/yr pilot program for a “new way of doing science”:
- first Federal program to support and enable “CSE” and (terascale) computational modeling and simulation as the third pillar of science (relevant to the DOE mission), along with theory and experiment
- spans the entire Office of Science (ASCR, BES, BER, FES, HEP, NP)
- involves all DOE labs and many universities
- builds on 50 years of DOE leadership in computation and mathematical software (EISPACK, LINPACK, LAPACK, ScaLAPACK, etc.)
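To make that software lineage concrete: the libraries named above provide dense linear algebra kernels still in everyday use. A minimal sketch, assuming a Python environment with NumPy and SciPy (whose linear algebra wraps LAPACK; the slide names the libraries but shows no code), of calling LAPACK's dgesv to solve a small dense system:

```python
# Minimal sketch: solve A x = b with LAPACK's dgesv, reached here through
# SciPy's LAPACK wrappers. Illustrative only; not code from the slides.
import numpy as np
from scipy.linalg import lapack

a = np.array([[4.0, 1.0],
              [1.0, 3.0]])           # coefficient matrix A
b = np.array([1.0, 2.0])             # right-hand side b

lu, piv, x, info = lapack.dgesv(a, b)  # LU-factorize and solve in one call
assert info == 0                       # info == 0 signals success
print(x)                               # [1/11, 7/11] ~ [0.0909, 0.6364]
```

NumPy's np.linalg.solve dispatches to the same LAPACK routine under the hood, which is part of why the slide can claim a 50-year lineage from EISPACK to ScaLAPACK.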

3 Addressing the Performance Gap through Software
[Chart: peak performance vs. real performance (teraflops, 1996 onward), showing a widening "performance gap"]
Peak performance is skyrocketing:
- in the 1990s, peak performance increased 100x; in the 2000s, it will increase 1000x
But...
- efficiency for many science applications declined from 40-50% on the vector supercomputers of the 1990s to as little as 5-10% on the parallel supercomputers of today
Need research on...
- mathematical methods and algorithms that achieve high performance on a single processor and scale to thousands of processors
- more efficient programming models for massively parallel supercomputers
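Since "efficiency" here is simply sustained performance divided by peak, a tiny worked example makes the slide's 40-50% versus 5-10% contrast concrete (the machine numbers below are illustrative assumptions, not SciDAC measurements):

```python
# Worked example of the efficiency figures on this slide. The sustained
# and peak numbers are illustrative assumptions, not measured data.
def efficiency(sustained_gflops: float, peak_gflops: float) -> float:
    """Fraction of peak performance an application actually achieves."""
    return sustained_gflops / peak_gflops

# 1990s vector supercomputer: science codes often reached 40-50% of peak.
print(f"vector era:   {efficiency(sustained_gflops=4.5, peak_gflops=10.0):.0%}")

# Early-2000s parallel machine: far higher peak, but often only 5-10% used.
print(f"parallel era: {efficiency(sustained_gflops=300.0, peak_gflops=5000.0):.0%}")
```

The second machine still runs the application faster in absolute terms, but an ever-larger share of its theoretical capability goes unused without better algorithms and programming models, which is the gap the slide's chart depicts.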

4 It’s Not Only Hardware!
Updated version of a chart appearing in “Grand Challenges: High Performance Computing and Communications”, OSTP Committee on Physical, Mathematical and Engineering Sciences, 1992.

5 SciDAC Goals
an INTEGRATED program to:
(1) create a new generation of scientific simulation codes that takes full advantage of the extraordinary capabilities of terascale computers
(2) create the mathematical and computing systems software to enable scientific simulation codes to effectively and efficiently use terascale computers
(3) create a collaboratory software environment to enable geographically distributed scientists to work effectively together as a team and to facilitate remote access, through appropriate hardware and middleware infrastructure, to facilities, data, and human resources
with the ultimate goal of advancing fundamental research in science central to the DOE mission

6 Initial Awards Focus on Software
Scientific Applications
- Climate Simulation
- Computational Chemistry
- Fusion - 5 projects
- High Energy/Nuclear Physics (incl. Astrophysics) - 5 projects
Collaboratories
- four projects
Middleware and Network Research
- six projects
Computer Science (4 ISICs)
- Scalable Systems Software
- Common Component Architecture
- Performance Science and Engineering
- Scientific Data Management
Applied Mathematics (3 ISICs)
- PDE Solvers/Libraries
- Structured Grids / AMR (see the sketch below)
- Unstructured Grids
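As referenced in the Applied Mathematics list above, here is a minimal sketch of the kind of structured-grid kernel those math ISICs target: a Jacobi relaxation for the 2-D Poisson equation on a uniform mesh. It is purely illustrative (grid size, sweep count, and right-hand side are assumptions), not code from any SciDAC project:

```python
# Minimal sketch: Jacobi relaxation for -laplacian(u) = f on the unit
# square with zero Dirichlet boundaries. Illustrative of a structured-grid
# kernel; grid size and sweep count are arbitrary assumptions.
import numpy as np

n = 64                          # interior grid points per side
h = 1.0 / (n + 1)               # mesh spacing
f = np.ones((n + 2, n + 2))     # right-hand side (boundary frame included)
u = np.zeros_like(f)            # initial guess; boundary rows/cols stay 0

for sweep in range(500):        # fixed sweep count keeps the sketch short
    # Each interior point becomes the average of its four neighbors plus a
    # scaled source term; NumPy slicing makes this one vectorized update.
    u[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                            u[1:-1, :-2] + u[1:-1, 2:] +
                            h * h * f[1:-1, 1:-1])

print(f"max of u after 500 sweeps: {u.max():.4f}")
```

Solvers of the kind these ISICs build layer adaptive mesh refinement, faster iterations (e.g., multigrid), and parallel domain decomposition on top of kernels like this; the sketch only shows the grid structure they share.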

7 Where Did the Money Go?
In the first year, $57M was awarded (via 51 projects ranging from as little as $50K to as much as several million dollars) in the following ways:
- about one third each to ISICs, scientific applications, and collaboratories/middleware/networks
- about one third to BES, BER, FES, HEP, and NP, and about two thirds to ASCR
- slightly over one half of awards to DOE labs, with the balance to universities and other research institutions

8 CSE is Team-Oriented
- successful CSE usually requires teams with members and/or expertise from at least mathematics, computer science, and (several) application areas
- language and culture differences
- usual reward structures focus on the individual
- incompatible with traditional academia
- SciDAC will help break down barriers and lead by example; DOE labs are a critical asset for early success

9 Organizational Benefits
- benefits of “team-based science” are both technical and sociological
- synergistic benefits derived from interdisciplinary interactions; application scientists can now pursue more diverse and in-depth scientific explorations (e.g., the Community Climate System Model)
- SciDAC teams with membership across DOE labs and with academia have enhanced cooperation across labs, thereby increasing the overall performance of DOE/SC

10 Successful Launch of Program
- SciDAC under way for over two years
- first PI meeting January 2002 in Washington, DC; theme: introduction to the integrated SciDAC program; initiation of team building
- second annual PI meeting held March 10-11, 2003 in Napa, Calif.; theme: assessing SciDAC progress
- the SciDAC concept is working; a cultural change is emerging
- new scientific results that would not otherwise have been possible

11 Examples of Early Success
- Steve Jardin (PPPL): “… [SciDAC] is a significant factor in our productivity, comparable to that obtained by going to the next-generation computer.”
- Tony Mezzacappa (ORNL): “The SciDAC Program is making possible a whole new class of supernova simulations. I could never go back to single-investigator research.”
- Rob Ryne (LBNL): SciDAC algorithmic advancements and visualization in accelerator design enable us to “… optimize designs to reduce costs and risks and help ensure project success.”

12 Updated Overview of SciDAC
almost 80 “two-pagers” now available on the SciDAC website, divided into:
- Basic Energy Sciences (BES)
- Biological and Environmental Research (BER)
- Fusion Energy Sciences (FES)
- High-Energy and Nuclear Physics (HEP, NP)
- Advanced Scientific Computing Research (ASCR)
  - CS ISICs (Integrated Software Infrastructure Centers)
  - Math ISICs
  - Collaboratories
  - Networking and Middleware

13 Future SciDAC Issues
additional high-end computing and network resources
- initial SciDAC focus is on software, but new hardware is needed now
- U.S. response to the Japanese Earth Simulator?
- potential synergistic partnerships leveraging the success of the SciDAC model (e.g., ITER decision and FSP)
- both capability and capacity computing needs are evolving rapidly
- NSTC (OSTP) HEC Revitalization Task Force (HECRTF)
limited architectural options available in the U.S. today
- science and engineering needs require architectural diversity
- math and CS research will play a key role
- topical or focused computing can be a cost-effective way of providing extra computing resources
expansion of the SciDAC program
- many important SC research areas (e.g., visualization, functional genomics/proteomics) are not yet formally included in SciDAC; computational nanoscience / materials science now included as part of the Nanoscale Science Research Centers