The Astrophysical MUltiscale Software Environment (AMUSE)
P-I: Portegies Zwart
Co-Is: Nelemans, Pols, O’Nuallain, Spaans
Adv.: Langer, Tolstoy, Hut, Ercolano, de Grijs, Mellema, Spurzem, Bischof, Quillen

The objectives of AMUSE
- More science with existing software
- Combine existing astrophysical codes
- This is a technical problem
- It is technically feasible
- Impression of how it works

Existing codes
Excellent single-physics codes exist:
- hydro
- gravity
- radiation
- stellar evolution
All are written in different languages, in different formats, and for different architectures. A homogeneous environment is needed for utilizing these resources.

More science with existing code
The Universe is multi-physics... Scientific objectives:
- dense stellar systems (hydro + gravity + stellar evolution)
- evolution of galactic environments, star formation, AGN, ... (hydro + gravity + radiation)
- planet formation (hydro + gravity + radiation)
- galaxy formation and interaction (gravity + hydro + radiation + stellar evolution)
Single-physics software solutions exist; the aim is to combine existing codes.

This is a technical problem
- No new physics is needed
- Combining codes requires understanding of how software and computer hardware interact
- Developing a useful toolbox requires professional engineering
- It requires substantial manpower

It is technically feasible
Developing new code is not optimal because
- it is a time-consuming task
- large codes tend to become unmanageable
- initial assumptions tend to require redesign at a late stage in the development process
Combining existing codes via wrappers has been tried, and it works. We propose a homogeneous software framework, à la Numerical Recipes.

Architecture (slide diagram): a flow-control layer (scripting language) sits on top of an interface layer (scripting and high-level languages), which connects to physics modules for gas dynamics, radiative transport, stellar evolution, and stellar dynamics. Example module implementations: smoothed-particle hydrodynamics, Metropolis-Hastings Monte Carlo, Henyey multi-shell stellar evolution, and 4th-order Hermite block-timestep N-body.
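As a rough illustration of this layering, the flow-control script below couples a gravity module and a stellar-evolution module through a common interface. The class names, the evolve() method, and the data layout are assumptions made for this sketch, not the actual AMUSE API.

```python
# Illustrative sketch only: GravityModule and StellarEvolutionModule are
# hypothetical stand-ins for interface-layer wrappers around, e.g., a
# Hermite N-body code and a Henyey stellar-evolution code.

class GravityModule:
    """Hypothetical wrapper around a compiled stellar-dynamics code."""
    def __init__(self, particles):
        self.particles = particles   # masses, positions, velocities
        self.time = 0.0

    def evolve(self, t_end):
        # The real wrapper would forward this call to the low-level
        # (C/C++/Fortran) integrator; here it is only a placeholder.
        self.time = t_end


class StellarEvolutionModule:
    """Hypothetical wrapper around a stellar-evolution code."""
    def __init__(self, particles):
        self.particles = particles
        self.time = 0.0

    def evolve(self, t_end):
        self.time = t_end            # placeholder for the real solver


def evolve_coupled(particles, t_end, dt):
    """Flow-control layer: step both modules in lockstep, exchanging data."""
    gravity = GravityModule(particles)
    evolution = StellarEvolutionModule(particles)
    t = 0.0
    while t < t_end:
        t += dt
        evolution.evolve(t)                       # update stellar masses, radii
        gravity.particles = evolution.particles   # push new masses to dynamics
        gravity.evolve(t)                         # advance the N-body dynamics
    return gravity.particles
```

The point of the layering is that the coupling loop does not change when one module implementation (e.g. the Hermite N-body code) is swapped for another, as long as both expose the same interface.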

Limitations and Merits
- Limited to problems whose physics can be expressed through module coupling (separable time scales)
- Low- and high-level use is possible
- The radiative transfer (and stellar evolution) module links to the VO (through e.g. ‘spiegel’ and ‘partiview’): dust and stellar continuum, atomic and molecular lines; ELT, JWST, ALMA, Herschel

Impression of how it works
A) install
B) suite of test applications
C) design your own multi-physics problem
D) write script
E) run
F) analyze data
G) download package from website
H) write Nature paper
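A hedged sketch of steps D–F: a user script sets up a toy cluster, evolves it with the evolve_coupled() driver from the previous sketch, and does a quick sanity check on the output. The plummer_model helper and the code units are invented for the illustration.

```python
import numpy as np

def plummer_model(n, seed=0):
    """Hypothetical helper: random masses, positions, velocities for n stars."""
    rng = np.random.default_rng(seed)
    return {
        "mass": np.full(n, 1.0 / n),
        "position": rng.normal(size=(n, 3)),
        "velocity": rng.normal(scale=0.1, size=(n, 3)),
    }

# Step D: write a script (set up a young, dense cluster)
cluster = plummer_model(n=1000)

# Step E: run -- evolve_coupled() is the coupling sketch shown earlier;
# times are in arbitrary code units
final = evolve_coupled(cluster, t_end=10.0, dt=0.1)

# Step F: analyze the data, e.g. check that total mass is conserved
print("total mass:", final["mass"].sum())
```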

Design/Performance
- An AMUSE module must be written in a language with a Foreign Function Interface (C, C++, Fortran, as well as high-level languages like C#, Java, Haskell).
- Low-level applications are optimized.
- The top level uses a scripting language. These are slow, but handle only I/O, the GUI, and the call sequence.
- The top level can run in parallel (using MPI, GRID technology); data exchange goes through HDF.
- The low level can run in parallel or on dedicated hardware (e.g. GRAPE or GPU for direct N-body).
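The foreign-function-interface requirement can be sketched with ctypes: a compiled low-level kernel is loaded and called from the Python top level. The library name, function name, and signature below are hypothetical, chosen only to show the pattern.

```python
# Sketch of the foreign-function-interface pattern.  Assumes a low-level
# gravity kernel compiled to a shared library 'libgrav.so' that exposes
#   void accelerations(int n, const double *pos, const double *mass,
#                      double softening, double *acc);
# -- the library name and signature are hypothetical.
import ctypes
import numpy as np

lib = ctypes.CDLL("./libgrav.so")
double_array = np.ctypeslib.ndpointer(dtype=np.float64, flags="C_CONTIGUOUS")
lib.accelerations.argtypes = [ctypes.c_int, double_array, double_array,
                              ctypes.c_double, double_array]
lib.accelerations.restype = None

def accelerations(pos, mass, softening=0.01):
    """Top-level wrapper: hand contiguous arrays to the compiled kernel."""
    n = len(mass)
    acc = np.zeros((n, 3))
    lib.accelerations(n,
                      np.ascontiguousarray(pos, dtype=np.float64),
                      np.ascontiguousarray(mass, dtype=np.float64),
                      softening, acc)
    return acc
```

Data exchange between parallel top-level processes would then go through HDF files (e.g. via h5py); that part is not shown here.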

Initial Applications
- A young and dense star cluster
- Evolution of gas and stars near a black hole in a galactic nucleus
- Dynamics of embryonic planets in a debris disk

Relation to other projects
Different concept, but with similar scientific objectives/physics:
- FLASH
- Gadget
- Starlab
Comparable in setup, but with different scientific objectives:
- atmosphere/ocean/tectonic simulations by NASA
- molecular dynamics

QUESTIONS?

Management/development plan
- Programmers work under the daily supervision of the software engineer and the PI.
- Regular interaction with the postdoc, who safeguards the scientific objectives.

The cost
- 6 years of programming effort (3 × 2 years?)
- 2 years of software engineering
- 2 years of postdoc
- travel, web services, hardware, etc.
Total cost: 640 kEuro
NOVA request: 500 kEuro