Lawrence Berkeley National Laboratory Kathy Yelick Associate Laboratory Director for Computing Sciences

Berkeley Lab Changes Science: 13 Nobel Prizes, 4,200 employees, $820M in funding, operated by UC, 1,000 students, 250 faculty.

Berkeley Lab User Facilities Enable the SC Community to Discover, Learn, and Create: NERSC (5,608 users), the Advanced Light Source (2,443 users), the Joint Genome Institute (1,000 users), the Molecular Foundry & NCEM (616 users), and the Energy Sciences Network (240 petabytes/yr). All numbers from FY2014.

Scientific Areas at Berkeley Lab: Energy Sciences, Biology and the Environment, Computing, Physical Sciences, and Operations.

Computing Sciences Area at Berkeley Lab (~$150M in funding, including science engagement and support): the Computational Research Division (applied math, computer science, data science, and computational science), the ESnet facility, and the NERSC facility.

Computing Sciences Vision: achieve transformational, breakthrough impacts in scientific domains through the discovery and use of advanced computational methods and systems, and make them accessible to the broad science community.

Disruptions in Computing Over 10 Years Redefine Best-in-Class Facilities: EXASCALE (computing performance growth stalls twice), MATH (new science problems require new mathematics), and DATA (experiment and observation will require advanced computing). Engaged in new science domains and questions.

Exascale Goals: Broad Science Impact. Co-design systems for simulation and analysis; a portable and productive exascale software stack; a suite of AMR applications including climate and combustion; materials and chemistry applications using sparse and real-space methods. NERSC deploys the first exascale system capable of running diverse simulation and data workloads in 2024; applications achieve 500x current NERSC capability by 2025.

Data Growth is Outpacing Computing Growth (graph based on average growth rates).
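The gap the graph points to comes from compounding at different rates. The short sketch below illustrates the effect with assumed doubling times; these numbers are illustrative placeholders, not values taken from the slide's graph.

```python
def growth_over(years: float, doubling_time_years: float) -> float:
    """Growth factor for a quantity that doubles every doubling_time_years."""
    return 2.0 ** (years / doubling_time_years)

# Assumed doubling times, for illustration only.
data_growth = growth_over(10, doubling_time_years=1.0)     # e.g. detector/sequencer output
compute_growth = growth_over(10, doubling_time_years=2.0)  # e.g. Moore's-law-like pace

print(f"Over 10 years: data x{data_growth:.0f}, computing x{compute_growth:.0f}")
# Over 10 years: data x1024, computing x32
```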

High-end computing has focused on simulation. [Diagram: computing in relation to experimentation, theory, simulation, and data analysis.]

Data analysis is equally important in science: growth in sequencers, CCDs, sensors, etc., is driving data analysis alongside experimentation, theory, and simulation.

“Superfacility” for Experimental Science: data systems, services, and research connecting MS-DESI, ALS, LHC, JGI, APS, LCLS, and other data-producing sources.

Light Source: Envisioning a “Super Facility.” Data collection with robots and high-resolution detectors (GISAXS); on-the-fly calibration and processing; transfer to NERSC; FFT plus mask data from the experiment; analysis and modeling at NERSC, ORNL, or ANL (HipGISAXS simulation and HipRMC fitting with autotuning: start with a random system, move a particle at random, FFT, compare); SPOT Suite provides real-time access via a web portal. A sketch of the fitting loop follows below.
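The HipRMC fitting step above describes a reverse Monte Carlo loop: start with a random system, move a particle at random, take an FFT, and compare against the measured pattern. The sketch below is a minimal, hypothetical illustration of that loop, not the actual HipRMC code; the lattice representation, error metric, and acceptance rule are assumptions made for the example.

```python
import numpy as np

def rmc_fit_sketch(target_fft_mag, size=64, steps=10_000, temperature=0.01, seed=0):
    """Minimal reverse Monte Carlo sketch (not HipRMC): start from a random
    binary system, move one site at random, recompute the FFT, and keep the
    move when it brings the simulated scattering closer to the target."""
    rng = np.random.default_rng(seed)
    system = rng.integers(0, 2, size=(size, size)).astype(float)  # random starting system
    error = np.sum((np.abs(np.fft.fft2(system)) - target_fft_mag) ** 2)

    for _ in range(steps):
        i, j = rng.integers(0, size, 2)      # pick a site at random
        system[i, j] = 1.0 - system[i, j]    # propose a move (flip the site)

        new_error = np.sum((np.abs(np.fft.fft2(system)) - target_fft_mag) ** 2)
        # Accept improvements outright; accept worse moves with a small probability.
        if new_error < error or rng.random() < np.exp((error - new_error) / temperature):
            error = new_error
        else:
            system[i, j] = 1.0 - system[i, j]  # reject: undo the move
    return system, error
```

In practice the loop runs on accelerated systems and is autotuned, as the slide notes; this sketch only conveys the structure of the algorithm.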

Interdisciplinary Data Science at Berkeley Lab: PTF supernova search, Planck CMB, ALICE and ATLAS at the LHC, the Materials Project (MGI), carbon flux sensors, Daya Bay neutrinos, the ALS and LCLS light sources, bio-imaging, RadMap sensors, JGI, OpenMSI mass spectrometry, and KBase. Sponsors include HEP, NP, BES, BER, EERE, NASA, DHS, ASCR, NERSC, ESnet, JGI, ALS, and LDRD.

CRT: A Facility for Discoveries with Computing. Roadmap milestones include: CRT substantial completion and staff move-in; NERSC-8 Cori Phase I and Phase II; CRT power upgrades from the initial 12.5 MW to 20 MW and then 40 MW; NERSC petaflop systems; NERSC-10, capable exascale for broad science; ESnet testbed upgrade, ESnet-6 upgrade begins, ESnet-6 deployed; NRP complete.

Computational Research and Theory (CRT): A Building Designed for Exascale Systems. Substantial occupancy in Spring 2015: a 4-story, 76,000 sq ft facility with 270 offices in a collaborative setting (two 20,000 sq ft office floors), a 21,000 sq ft HPC floor, a $143M UC building, and $19.9M of DOE computing infrastructure; come to the ribbon cutting on November 12th. Energy efficient: year-round air cooling, PUE < 1.1, LEED Gold design. Power: 42 MW to the building, 12.5 MW provisioned, low-cost WAPA power, with additional power needed for future systems. In 2016, NERSC-8 (“Cori”) will move SC users toward exascale, offering 10x NERSC-6 on key applications and 15x more energy efficiency.
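PUE (power usage effectiveness) is the ratio of total facility power to IT equipment power, so the PUE < 1.1 target bounds the non-computing overhead. The snippet below is a simple illustrative check using the 12.5 MW provisioned figure from the slide; the calculation itself is standard, but treating the full 12.5 MW as IT load is an assumption made for the example.

```python
def total_facility_power_mw(it_power_mw: float, pue: float) -> float:
    """PUE = total facility power / IT equipment power, so total = IT load * PUE."""
    return it_power_mw * pue

it_load_mw = 12.5                      # provisioned computing load (from the slide)
total_mw = total_facility_power_mw(it_load_mw, pue=1.1)
overhead_mw = total_mw - it_load_mw    # cooling, power distribution, etc.
print(f"At PUE 1.1: total {total_mw:.2f} MW, overhead {overhead_mw:.2f} MW")
# At PUE 1.1: total 13.75 MW, overhead 1.25 MW
```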

Goal: to enable new modes of scientific discovery. Growth in data (detectors, robotics, sensors), new analysis methods (image analysis, machine learning), and new science processes (combining observation and simulation, re-use and re-analysis) together drive scientific discovery.