Computer Science in UNEDF

George Fann, Oak Ridge National Laboratory
Rusty Lusk, Argonne National Laboratory
Jorge Moré, Argonne National Laboratory
Esmond Ng, Lawrence Berkeley National Laboratory
Ken Roche, Oak Ridge National Laboratory
Masha Sosonkina, Ames Laboratory

Goals of the Computer Science Efforts in the UNEDF SciDAC

– Work with specific physicists in the project to improve particular UNEDF component codes
– Create methods and software that are useful to other codes in UNEDF
– Create methods and software that may be useful to other high-performance scientific applications

The Collaborations

Large-Scale Eigenvalue Calculations
– James Vary (Iowa State), Esmond Ng, Chao Yang (LBNL)
Asynchronous Dynamic Load-Balancing Library (see the sketch after this list)
– Steven Pieper (ANL), Rusty Lusk (ANL)
Multiresolution Methods for Nuclear DFT-Based Calculations
– Witold Nazarewicz (UT), David Dean (ORNL/UT), George Fann (ORNL)
Coupled-Cluster Expansions
– David Dean (ORNL/UT), Ken Roche (ORNL)
Performance Analysis and Optimization
– Witold Nazarewicz (UT), Jorge Moré, Boyana Norris, Jason Sarich (ANL)
Implementation of a scalable 3D-lattice solver for nuclear static and time-dependent Density Functional Theory (DFT)
– Aurel Bulgac (UW), Ken Roche (ORNL)
Usability and Automatic Deployment
– James Vary (Iowa State U.), Masha Sosonkina (Ames Lab)
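
ADLB lets application processes put work units into, and get them from, a shared pool, hiding the underlying MPI message passing. As a rough illustration only, the following minimal C/MPI sketch shows the manager/worker work-pool pattern that such a library automates; the message tags, the integer work encoding, and do_work() are hypothetical stand-ins, not the ADLB API.

    /* Minimal sketch of the manager/worker work-pool pattern that an
       asynchronous load-balancing layer such as ADLB automates.
       Tags, the integer work encoding, and do_work() are hypothetical. */
    #include <mpi.h>

    #define TAG_REQUEST 1   /* worker asks the manager for work    */
    #define TAG_WORK    2   /* manager replies with one work unit  */
    #define TAG_DONE    3   /* manager replies: no work left, stop */

    static void do_work(int item) { (void)item; /* stand-in for a real kernel */ }

    int main(int argc, char **argv)
    {
        int rank, size;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        if (rank == 0) {
            /* Manager: hand out work units on request, then shut workers down. */
            int next = 0, n_items = 1000, closed = 0, dummy;
            MPI_Status st;
            while (closed < size - 1) {
                MPI_Recv(&dummy, 1, MPI_INT, MPI_ANY_SOURCE, TAG_REQUEST,
                         MPI_COMM_WORLD, &st);
                if (next < n_items) {
                    MPI_Send(&next, 1, MPI_INT, st.MPI_SOURCE, TAG_WORK,
                             MPI_COMM_WORLD);
                    next++;
                } else {
                    MPI_Send(&next, 1, MPI_INT, st.MPI_SOURCE, TAG_DONE,
                             MPI_COMM_WORLD);
                    closed++;
                }
            }
        } else {
            /* Worker: request a work unit, compute, repeat until told to stop. */
            int item;
            MPI_Status st;
            for (;;) {
                MPI_Send(&rank, 1, MPI_INT, 0, TAG_REQUEST, MPI_COMM_WORLD);
                MPI_Recv(&item, 1, MPI_INT, 0, MPI_ANY_TAG, MPI_COMM_WORLD, &st);
                if (st.MPI_TAG == TAG_DONE) break;
                do_work(item);
            }
        }
        MPI_Finalize();
        return 0;
    }

The library generalizes this idea: the pool is distributed over dedicated server processes and work units carry types and priorities, so application code such as GFMC never performs explicit message passing to balance its load.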

Where the CS Collaborations Fit In

Physicists        Computer Scientists
Dean              Roche
Pieper            Lusk
Vary              Ng/Sosonkina
Nazarewicz        Fann
Nazarewicz        Moré/Norris
Bulgac            Roche

INCITE Award

Title: Computational Nuclear Structure
Awardees (all in UNEDF):
– David Dean, Oak Ridge National Laboratory
– James Vary, Iowa State University
– Witold Nazarewicz, University of Tennessee and ORNL
– Steven Pieper, Argonne National Laboratory
Computer resources awarded:
– 10,000,000 processor-hours on Argonne's IBM Blue Gene/P
– 7,500,000 processor-hours on Oak Ridge's Cray XT4
Much of the computer science work described here is targeted at scaling up the relevant codes to take advantage of the power of these leadership-class machines.

Testimonials

"We have made considerable progress and much more is in the pipeline. These collaborations between physicists and computer scientists have been critical to making rapid progress."
-- James Vary, Iowa State

"I think all of us are working as a team very well on this project, which ultimately will enable the GFMC program to use 10,000's of processors. Without this collaboration, I see no way that the GFMC program would be able to effectively use the next generation of supercomputers. I also think that my needs are helping to define a package, being written by computer-science professionals, that will be useful to others."
-- Steve Pieper, Argonne National Laboratory

"In general, there is a great level of satisfaction with the collaboration with our CS colleagues."
-- Witold Nazarewicz, Oak Ridge

End of CS Overview