ASCR OVERVIEW
Daniel Hitchcock, Acting Associate Director, Advanced Scientific Computing Research (ASCR)
BERAC, October 7, 2011

Advanced Scientific Computing Research
Delivering world-leading computational and networking capabilities to extend the frontiers of science and technology

The Scientific Challenges:
 Deliver next-generation scientific applications using today's petascale computers.
 Discover, develop, and deploy tomorrow's exascale computing and networking capabilities.
 Develop, in partnership with U.S. industry, next-generation computing hardware and tools for science.
 Discover new applied mathematics and computer science for the ultra-low-power, multicore computing future.
 Provide technological innovations for U.S. leadership in information technology to advance competitiveness.

FY 2012 Highlights:
 Research in uncertainty quantification for drawing predictive results from simulation.
 Co-design centers to deliver next-generation scientific applications by coupling application development with the formulation of computer hardware architectures and system software.
 Investments in U.S. industry to address critical challenges in hardware and technologies on the path to exascale.
 Installation of a 10-petaflop, low-power IBM Blue Gene/Q at the Argonne Leadership Computing Facility and a hybrid, multi-core prototype computer at the Oak Ridge Leadership Computing Facility.

ASCR Facilities
 Providing the Facility: High-End and Leadership Computing
 Investing in the Future: Research and Evaluation Prototypes
 Linking it all together: Energy Sciences Network (ESnet)

Energy Sciences Network
[Map: ESnet backbone linking NERSC / LBNL, ALCF / ANL, and OLCF / ORNL through hubs at Sunnyvale, Chicago, Nashville, and NYC; 100 Gbps ANI Testbed (ARRA, 2009 award)]

ASCR Research
Substantial innovation is needed to provide essential system and application functionality in a timeframe consistent with the anticipated availability of hardware. Provide forefront research knowledge and foundational tools:
 Applied Mathematics
 Computer Science
 SciDAC
 Next-Generation Networking for Science

Looking Backward and Forward
[Timeline: from SciDAC Centers for Enabling Technology and Institutes, Science Application Partnerships, and the Hopper, Jaguar, and Intrepid systems, toward Many-Core, Energy-Aware, X-Stack, and Co-Design research, SciDAC Institutes, Strategic ASCR – SC Office Partnerships, and the Titan and Mira systems (and beyond: ???)]

SciDAC Institutes
Goals & Objectives:
 Deliver tools and resources to lower the barriers to effectively using state-of-the-art computational systems;
 Create mechanisms to address computational grand challenges across different science application areas;
 Incorporate basic research results from Applied Mathematics and Computer Science into computational science challenge areas, and demonstrate that value;
 Grow the Nation's computational science research community.
Awards: up to $13M/year over 5 years, available to support 1-5 Institutes.
Eligible applicants: DOE national laboratories, universities, industry, and other organizations.
Expected outcome: Institutes that cover a significant portion of DOE computational science needs on current and emerging computational systems.
Timeline:
 Solicitations opened: February 23, 2011
 Letters of Intent: March 30, 2011
 Solicitations closed: May 2, 2011
 First awards: end of FY2011
 Answers to Inquiries:

Scientific Discovery through Advanced Computing (SciDAC) Institutes – FY11 Awards

FASTMath – Frameworks, Algorithms, and Scalable Technologies for Mathematics. Director: Lori Diachin, LLNL
 Topic areas: structured & unstructured mesh tools, linear & nonlinear solvers, eigensolvers, particle methods, time integration, differential variational inequalities
 Partners: Lawrence Livermore National Lab*, Argonne National Laboratory, Lawrence Berkeley National Lab, Sandia National Laboratories, Rensselaer Polytechnic Institute, University of Texas at Austin

QUEST – Quantification of Uncertainty in Extreme Scale Computations. Director: Habib N. Najm, SNL
 Topic areas: forward uncertainty propagation, reduced stochastic representations, inverse problems, experimental design & model validation, fault tolerance
 Partners: Sandia National Laboratories*, Los Alamos National Laboratory, Johns Hopkins University, Massachusetts Institute of Technology, University of Southern California, University of Maryland

SUPER – Institute for Sustained Performance, Energy and Resilience. Director: Robert F. Lucas, USC
 Topic areas: performance engineering (including modeling & auto-tuning), energy efficiency, resilience & optimization
 Partners: University of Southern California*, Argonne National Laboratory, Lawrence Berkeley National Lab, Lawrence Livermore National Lab, Oak Ridge National Laboratory, University of California at San Diego, University of North Carolina, University of Oregon, University of Tennessee at Knoxville, University of Utah

(* lead institution)

Strategic ASCR – SC Program Partnerships
Goals and objectives: partner with SC programs to combine the best applied mathematics, computer science, and networking with SC program expertise to enable strategic advances in program missions.
Five FOAs out before September 30, 2011:
– Fusion Energy Sciences: topics in edge plasma physics, multiscale integrated modeling, and materials science
– High Energy Physics: topics in the cosmological frontier, lattice gauge theory QCD, and accelerator modeling and simulation
– Nuclear Physics: topics in low- and medium-energy nuclear physics and heavy-ion collider physics
– Biological and Environmental Research: topics in the dynamics of atmospheres, oceans, and ice sheets that support Earth system modeling
– Basic Energy Sciences: topics in theoretical chemistry and materials science, especially excited states of atoms and molecules, and materials electron correlation

The Future is about Energy-Efficient Computing
 At $1M per MW-year, energy costs are substantial.
 A 1-petaflop system in 2010 uses about 3 MW.
 A 1-exaflop system in 2018 would need about 200 MW with "usual" scaling.
 1 exaflop in 2018 at 20 MW is the target.
[Chart: projected system power over time, contrasting the "usual scaling" trend with the 20 MW goal]
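These numbers are easy to check. A minimal back-of-the-envelope sketch in Python (the $1M per MW-year rate and the system powers are taken from the slide above):

    # Annual electricity cost of HPC systems at ~$1M per MW-year.
    COST_PER_MW_YEAR = 1_000_000  # dollars

    systems = {
        "1 PF in 2010": 3,                     # MW
        "1 EF in 2018, 'usual' scaling": 200,  # MW
        "1 EF in 2018, target": 20,            # MW
    }

    for name, mw in systems.items():
        cost_millions = mw * COST_PER_MW_YEAR / 1e6
        print(f"{name}: {mw} MW -> ${cost_millions:.0f}M per year")

Run as written, this prints $3M, $200M, and $20M per year: the 10x efficiency target is the difference between a $200M and a $20M annual power bill.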

The Fundamental Issue: Where Does the Energy (and Time) Go?
[Chart: energy and time cost per operation, rising from on-chip / CMP communication, to intranode / SMP communication, to internode / MPI communication]

Memory Technology: Bandwidth Costs Power
[Chart: memory bandwidth versus power consumption]

Approaches
 Locality, locality, locality!
 Billion-way concurrency;
 Uncertainty quantification (UQ), including hardware variability;
 Flops are nearly free while data movement is expensive, so (see the sketch after this list):
 – Remap multi-physics to put as much work per location on the same die;
 – Include embedded UQ to increase concurrency;
 – Include data analysis, if you can, for more concurrency;
 – Trigger output so that only important data moves off the machine;
 – Reformulate to trade flops for memory use.
 Wise use of silicon area.
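To make the last point concrete, here is a toy sketch of trading flops for memory (illustrative only; the kernel and problem size are hypothetical, not from any ASCR application):

    import math

    N = 10_000_000  # hypothetical problem size

    # Memory-heavy formulation: precompute and store every value
    # (~80 MB of floats that must then be streamed from memory):
    #   table = [math.sin(i * 1e-3) for i in range(N)]
    #   def value(i): return table[i]

    # Flops-heavy formulation: store nothing, recompute on demand.
    # When data movement dominates the energy budget, the extra
    # arithmetic can cost less than fetching a large table.
    def value(i):
        return math.sin(i * 1e-3)  # recomputed on each call

    total = sum(value(i) for i in range(0, N, 1000))
    print(f"sample sum: {total:.4f}")

Both formulations give the same answer; the reformulation shifts cost from memory traffic to arithmetic, which is the direction these hardware trends reward.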

Co-Design
Goals & Objectives:
 Understand how to allocate complexity between hardware, systems software, libraries, and applications;
 Modify application designs at all levels;
 Understand reformulation as well as reimplementation tradeoffs;
 Explore uncertainty quantification, in-line data analysis, and resilience in applications;
 Co-adapt applications to new programming models and perhaps languages;
 Understand the impact of massively multithreaded nodes and new ultra-lightweight operating systems.
Awards: June 2011.
Expected outcome: understanding, guidance for future applications, and application readiness.

Three Exascale Co-Design Centers Awarded

ExMatEx – Exascale Co-Design Center for Materials in Extreme Environments. Director: Timothy Germann (LANL)
 National labs: LANL, LLNL, SNL
 University & industry partners: Stanford, CalTech

CESAR – Center for Exascale Simulation of Advanced Reactors. Director: Robert Rosner (ANL)
 National labs: ANL, PNNL, LANL
 University & industry partners: Studsvik, TAMU, Rice, U Chicago, IBM, TerraPower, General Atomic, Areva

CECDC – Combustion Exascale Co-Design Center. Director: Jacqueline Chen (SNL)
 National labs: SNL, LBNL, ORNL, LLNL, NREL
 University & industry partners: Stanford, GA Tech, Rutgers, UT Austin, Utah

Future of Data-Driven Science
 All of these hardware trends impact data-driven science (in many cases more than compute-intensive science);
 Data volumes from instruments are still doubling steadily, because detectors follow the CMOS feature-size path;
 100 gigabit-per-second-per-lambda networks are on the horizon;
 Disk read and write rates will fall further behind processors and memory;
 Significant hardware infrastructure is needed to support all of this, and it probably will not be replicated at users' home institutions (i.e., launching a petabyte file transfer at a user's laptop is not friendly; see the sketch below).
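For a sense of scale, a minimal sketch (the 100 Gbps figure is from the slide; the 1 Gbps laptop/campus link is an assumed typical rate):

    # Idealized time to move a 1 PB dataset at different line rates
    # (perfect utilization; real transfers do worse).
    PETABYTE_BITS = 1e15 * 8

    links = [
        ("1 Gbps campus link to a laptop (assumed)", 1e9),
        ("100 Gbps per-lambda backbone", 100e9),
    ]

    for label, bits_per_second in links:
        days = PETABYTE_BITS / bits_per_second / 86_400
        print(f"{label}: {days:.1f} days")

Even under ideal conditions the laptop-bound transfer takes about 93 days versus roughly one day on a 100 Gbps path, which is why the data infrastructure has to live next to the network and the facilities rather than at the user's desk.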

ASCR-BES Workshop on Data
"Data and Communications in Basic Energy Sciences: Creating a Pathway for Scientific Discovery," October 24-25, 2011, Bethesda, MD
Co-chairs: Peter Nugent (NERSC) and J. Michael Simonson (SNS)
Goals & Objectives:
 Identify and review the status, successes, and shortcomings of current data (including analysis and visualization) and communication pathways for scientific discovery in the basic energy sciences;
 Ascertain the knowledge, methods, and tools needed to mitigate present and projected data and communication shortcomings;
 Consider opportunities and challenges related to data and communications when combining techniques (with different data streams) in single experiments;
 Identify research areas in data and communications needed to underpin advances in the basic energy sciences in the next ten years;
 Create the foundation for information exchanges and collaborations among ASCR- and BES-supported researchers, BES scientific user facilities, and ASCR computing and networking facilities.

ASCR at a Glance
Office of Advanced Scientific Computing Research:
 Associate Director: Daniel Hitchcock (acting)
 Research Division Director: William Harrod
 Facilities Division Director: Daniel Hitchcock
Relevant websites:
 ASCR: science.energy.gov/ascr/
 ASCR Workshops and Conferences: science.energy.gov/ascr/news-and-resources/workshops-and-conferences/
 SciDAC:
 INCITE: science.energy.gov/ascr/facilities/incite/
 Exascale Software:
 DOE Grants and Contracts info: science.doe.gov/grants/

Computational Science Graduate Fellowship
Funded by the Department of Energy's Office of Science and National Nuclear Security Administration, the DOE CSGF trains scientists to meet U.S. workforce needs and helps to create a nationwide interdisciplinary community.
Some talks from the 2011 Fellows Conference:
– "The Materials Genome: An online database for the design of new materials for clean energy and beyond"
– "Understanding electromagnetic fluctuations in microstructured geometries"
– "Computational science meets materials chemistry: a bilingual investigation of surface effects in nanoscale systems"

NERSC
 Hopper in production since April 2011
 1.28 PFlop/s peak performance Cray XE6
 Over a billion core-hours to science per year
 Gemini high-performance resilient interconnect
 BES NERSC Requirements Workshop report:
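As a sanity check on the core-hours claim, a minimal sketch (Hopper's 153,216 compute cores are its published configuration and the 85% availability is an assumption; neither number is on the slide):

    # Rough core-hours delivered per year by Hopper.
    cores = 153_216              # published Hopper core count (assumption here)
    hours_per_year = 365 * 24    # 8,760
    availability = 0.85          # assumed fraction of time delivered to science

    core_hours = cores * hours_per_year * availability
    print(f"~{core_hours / 1e9:.2f} billion core-hours per year")  # ~1.14

which is consistent with the slide's "over a billion core-hours to science per year."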

INCITE
 The Cray XT5 ("Jaguar") at ORNL and the IBM Blue Gene/P ("Intrepid") at ANL will provide ~2.3 billion processor hours in FY12 to address science and engineering problems that defy traditional methods of theory and experiment and that require the most advanced computational power.
 Peer-reviewed projects are chosen to advance science, speed innovation, and strengthen industrial competitiveness.
 Demand for these machines has grown each year, requiring upgrades of both: in FY 2012, the Argonne LCF will be upgraded with a 10-petaflop IBM Blue Gene/Q, and the Oak Ridge LCF will continue site preparations for a system expected in FY 2013 that will be 5-10 times more capable than the Cray XT5.
[Images: INCITE applications at the leadership computing facilities: energy storage materials, nuclear reactor simulation, turbulence]

Argonne Leadership Computing Facility
~20 times the computing power in only 20% more space:

Intrepid (IBM Blue Gene/P), 557 TF peak:
 40 racks
 40,960 nodes (quad-core), 163,840 processors (3.4 GF peak)
 80 TB RAM
 8 PB storage capacity, 88 GB/s storage rate
 1.2 MW power, air-cooled

Mira (IBM Blue Gene/Q), 10 PF peak:
 48 racks (just one row more than Intrepid)
 49,152 nodes (16-core), 786,432 processors (205 GF peak)
 786 TB RAM
 28.8 PB storage capacity, 240 GB/s storage rate
 Estimated 3-8 MW projected power, water-cooled

Mira vs. Intrepid: more racks (48 vs. 40), more CPUs per node (x4), faster CPUs (x60), more RAM per CPU (x2), more storage (x3.6), faster storage (x2.7).
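The quoted 10 PF peak follows directly from the node count, as a quick sketch shows (numbers from the slide; the 205 GF figure is read here as the per-node peak, since 49,152 nodes x 205 GF reproduces the quoted total, and 1,024 nodes per rack is implied by 49,152 nodes in 48 racks):

    # Mira peak performance from node count and per-node peak.
    racks = 48
    nodes_per_rack = 49_152 // 48  # 1,024
    gf_per_node = 205              # per-node peak, as read from the slide

    peak_pf = racks * nodes_per_rack * gf_per_node / 1e6  # GF -> PF
    print(f"Mira peak: ~{peak_pf:.1f} PF")  # ~10.1 PF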

Oak Ridge Leadership Computing Facility
OLCF-3:
 Cray-NVIDIA system, ~10-20 PFlop/s peak
 Hybrid x86/accelerator architecture
 Liquid-cooled, highly power-efficient
 Builds on the DARPA HPCS investment in Cray
 Prepares applications for the path to exascale computing
 CD-1 approved December 2009; Lehman CD-2/3b review scheduled for August 2010
Science application readiness:
 Application readiness review, August 2010
 Science teams port, optimize, and develop new modeling approaches
 Early access to a development machine

ESnet
 BER 2010 Requirements Workshop: Net-Req-Workshop-2010-Final-Report.pdf
 Electronic Collaboration Tools
 Network Performance Tools: fasterdata.es.net
 Virtual Circuits