
Cyber-Infrastructure for Materials Simulation at the Petascale Trends/opportunities Recommendations for CI support David Ceperley National Center for Supercomputing Applications Department of Physics Materials Computation Center University of Illinois Urbana-Champaign N.B. All views expressed are my own.

What will happen during the next 5 years? What will the performance be used for?
–CPU power to increase by >30×; memory increases will be similar
–Supercomputer performance on the desktop
–Some methods (e.g. simulations) are not communication/memory bound; they will become more useful
–Community is large and diverse, spanning many areas
–Large movement to local clusters and away from large centers
–Impact of new computing trends: consumer devices, grids, data mining, …

“Nano-scale” systems
–Reductionist approach to “nano-scale” systems
–Unity of chemistry, condensed matter physics, materials science, biophysics, and nanotechnology at the nano-scale
–All are described by quantum mechanics for the electrons and statistical mechanics for the ions
–The computational problem is well posed but very hard
–Rapid progress is possible for calculations of real materials
What are the challenges for the next decade? How will the petaflop computers be used? Four main predictable directions:

1. Quest for accuracy in condensed matter simulations
–Hard-sphere MD/MC ~1953
–Empirical potentials (e.g. Lennard-Jones) ~1960
–Local density functional theory ~1985
–Correlated electronic structure methods ~2000
“Chemical accuracy” is difficult to achieve.

Typical accuracy today (systematic error) is 1000 K for ab initio simulations; accuracy needs to be 100 K to predict room-temperature phenomena. The simulation approach needs only 100× the current resources, if systematic errors are under control and efficiency is maintained.
Example problem achievable within 5 years:
–direct (ab initio) simulation of liquid water from electrons and nuclei, with accuracy much better than the current 50 C
Problem that could be solvable:
–simulation of strongly correlated electronic systems such as the copper oxides
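One way to rationalize the 100× figure is to assume the error falls as the inverse square root of computational cost, as it does for stochastic (Monte Carlo) methods; that scaling law is an illustrative assumption of mine, not stated on the slide:

```python
# Assumed scaling: error ~ cost**(-1/2), typical of Monte Carlo sampling.
# Then reducing the error from 1000 K to 100 K costs (1000/100)**2 = 100x.
def cost_factor(error_now: float, error_target: float) -> float:
    """Multiple of current resources needed to reach error_target."""
    return (error_now / error_target) ** 2

print(cost_factor(1000.0, 100.0))  # -> 100.0
```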

2. Larger Systems
–Computational complexity is similar across simulation methods, ranging from O(1) to O(N), but only some methods are ready to scale up.
–Simulations are really 4D: both space and time need to be scaled. A 10⁴ increase in CPU means a 10-fold increase in length and time scales.
–Go from 2 nm to 20 nm by the end of the decade at high accuracy.
–Many important physical applications.
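The 4D scaling argument above (a 10⁴ CPU increase buying a 10-fold gain in linear scale) is just a fourth root; a minimal sketch:

```python
# Simulations scale in 4 dimensions (3 space + 1 time), so a factor-F
# increase in CPU buys roughly F**(1/4) in linear length/time scale.
def length_time_gain(cpu_factor: float, dims: int = 4) -> float:
    return cpu_factor ** (1.0 / dims)

print(length_time_gain(1e4))  # ~10: e.g. from 2 nm up to 20 nm
```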

3. More Systems
–Parameter studies are a very promising use of petaflop resources. Typical example: materials design.
–Combinatorics leads to a very large number of possible compounds to search [>92^k for k components].
–Needs accurate QM calculations, statistical mechanics, multiscale methods, easily accessible experimental data, …
–Interdisciplinary!
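To get a feel for the combinatorics, one can count the distinct k-element combinations among the ~92 natural elements; composition ratios and crystal structures each enlarge the search space further. A toy count (mine, not from the slide):

```python
from math import comb

ELEMENTS = 92  # roughly the number of naturally occurring elements

# Distinct k-element systems, ignoring composition and structure.
for k in (2, 3, 4):
    print(f"{k}-component systems: {comb(ELEMENTS, k)}")
# 2-component systems: 4186
# 4-component systems: 2794155
```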

What is the most stable binary alloy? What about 4 components? Each square represents a PhD in 1980

4. Multiscale
–How to do it without losing accuracy? (QMC/DFT, DFT-MD, SE-MD, FE)
–How to make it parallel? (Load balancing with different methods)
Example: simulation of chemical reactions in solution. The challenge is to integrate what is happening at the microscopic quantum level with the mesoscopic classical level.
Lots of software and interdisciplinary work is needed; there has been important recent progress.

How do we achieve our potential use of computer technology in research and education? How do we make the best use of existing resources? What are the problems?

Why are some groups more successful than the US materials community?
–Europeans (VASP, ABINIT, …), quantum chemistry, lattice gauge theory, applied math, …
CI does not fit into the professional career path.
Software is expensive:
–We need long-term, carefully chosen projects.
–Unlike research, the effort is wasted unless the software is documented, maintained, and used.
Big opportunities: my impression is that the state of software in our field is low; we could be doing research more efficiently.
Basic condensed matter software is needed in education.

3 legs of CI
1. Research: new algorithms have led to greater advances in efficiency than hardware has; many new methods are appearing.
2. Deployment and maintenance: new efficient codes don’t just happen.
3. Education: we need to bring more people into the CI game.
Funding for the 3 legs should be separate, since they have different aims.

Dan Reed’s observations

Software/infrastructure development
–Support development of tested methodology, including user documentation, training, and maintenance.
–Market-based approach to deciding what software we need.
–Yearly open competitions for small (1 PDRA) grants for developing and maintaining CI indefinitely.
–Standing panel to rank proposals based on expected impact within 5 years. Institutional memory in the panel is important.
–Key factors in the review should be the experience of actual users of the software, experts in the methodology, and the measurable scientific impact of the code.
–Not research, but deployment and maintenance.

Education in Computational Science
Need for ongoing specialized training: workshops, tutorials, courses
–Parallel computing, optimization
–Numerical libraries and algorithms
–Languages, code development tools
Develop a computational culture and community:
–Need to refill the pipeline of algorithm experts.
–Falls in between NSF directorates.
–A meeting place for scientists of different disciplines having similar problems (like KITP?).
–Reach a wider world through the web.
–Explore new ways of sustaining groups.
Large payoff for relatively low investment.

Inter-aspects
Interdisciplinary teams are needed:
–especially CS & applications, and applied math & applications (performance analysis, best-practice software development)
–disciplines sharing the same problem
–international teams, …

Databases for materials?
–We need vetted benchmarks with various theoretical and experimental data.
Storage of all the outputs?
–What is the balance between computation and storage?
–Computed data are perishable, in that the cost to regenerate them decreases each year, and improvements in accuracy mean newer data are more reliable.
–Useful in connection with published reports for testing codes and methods.
–Expanding role of journals? The new journal “Computational Science and Discovery” has this as its aim.
–Need to handle “drinking from the firehose.” This could be handled by an XML-based data structure (standards) to store inputs and outputs. (A Zurich meeting next month will address this.)
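As a sketch of what an XML-based simulation record might look like, here is a minimal example using Python's standard library; every tag name, attribute, and value below is hypothetical, not an existing standard:

```python
import xml.etree.ElementTree as ET

# Hypothetical schema: tag/attribute names are illustrative only.
run = ET.Element("simulation")
ET.SubElement(run, "code", name="example-dft-code", version="1.0")
inp = ET.SubElement(run, "input")
ET.SubElement(inp, "material").text = "H2O"
ET.SubElement(inp, "method").text = "DFT-MD"
out = ET.SubElement(run, "output")
ET.SubElement(out, "total_energy", units="eV").text = "-468.3"  # made-up value

xml_text = ET.tostring(run, encoding="unicode")
print(xml_text)

# Round-trip: a later code (or a journal archive) can parse the record back.
record = ET.fromstring(xml_text)
print(record.find("input/method").text)  # -> DFT-MD
```

Keeping inputs and outputs in one record is what makes perishable computed data manageable: if the output goes stale, the input block alone is enough to regenerate it with a newer, more accurate code.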

Computational funding modes
1. Large collaborations (medium ITRs): needed for multidisciplinary/large projects
2. Algorithmic research (small ITRs): fits into the “scientific/academic” culture
3. Cycle providers (NSF centers & local clusters): a “time machine” for groups that lack their own cluster or have special needs
4. Coupled research/development/CPU grants for the petascale machines
5. Software/infrastructure development
6. Education in CI — unmet opportunities, at risk