The Mapper project receives funding from the EC's Seventh Framework Programme (FP7/2007-2013) under grant agreement n° RI-261507. Distributed Multiscale Computing, the MAPPER project.


1 Distributed Multiscale Computing, the MAPPER project
Alfons Hoekstra, Krzysztof Kurowski, Carles Bona

2 Our Vision
Distributed Multiscale Computing...
- strongly science driven, application pull
- ...on existing and emerging European e-Infrastructures, exploiting as much as possible services and software developed in earlier (EU-funded) projects
- strongly technology driven, technology push

3 Nature is Multiscale
Natural processes are multiscale:
- a single H2O molecule
- a large collection of H2O molecules, forming H-bonds
- a fluid called water, and, in solid form, ice

4 Multi-Scale Modeling: the Scale Separation Map
Nature acts on all the scales. We set the scales, and then decompose the multiscale system into single-scale sub-systems and their mutual coupling.
[Figure: Scale Separation Map, spatial scale (Δx to L) vs. temporal scale (Δt to T)]
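The Scale Separation Map can be made concrete with a small sketch: each single-scale submodel occupies a box in (spatial, temporal) scale space, and two submodels are scale-separated on an axis when their boxes do not overlap there. All names and numbers below are illustrative, not taken from MAPPER.

```python
from dataclasses import dataclass

@dataclass
class Submodel:
    name: str
    dx: float  # smallest resolved spatial scale (m)
    L: float   # spatial extent (m)
    dt: float  # smallest resolved temporal scale (s)
    T: float   # temporal extent (s)

def overlaps(a_lo, a_hi, b_lo, b_hi):
    """True when two scale intervals overlap."""
    return a_lo < b_hi and b_lo < a_hi

def scale_separated(a: Submodel, b: Submodel) -> bool:
    """Separated if disjoint on at least one axis of the map."""
    return (not overlaps(a.dx, a.L, b.dx, b.L)
            or not overlaps(a.dt, a.T, b.dt, b.T))

micro = Submodel("molecular", dx=1e-10, L=1e-8, dt=1e-15, T=1e-12)
macro = Submodel("continuum", dx=1e-4, L=1.0, dt=1e-3, T=1e2)
print(scale_separated(micro, macro))  # True: disjoint on both axes
```

On the map, scale-separated submodels appear as disjoint boxes; overlapping boxes signal submodels that cannot simply hand data to each other once but must be coupled more tightly.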

5 From a Multi-Scale System to many Single-Scale Systems
- identify the relevant scales
- design specific models which solve each scale
- couple the subsystems using a coupling method
[Figure: Scale Separation Map, spatial scale (Δx to L) vs. temporal scale (Δt to T)]

6 Why multiscale models?
There is simply no hope of computationally tracking complex natural processes at their finest spatio-temporal scales, even with the ongoing growth in computational power.

7 Minimal demand for multiscale methods

8 But what about multiscale computing?
- Inherently hybrid models are best serviced by different types of computing environments.
- When simulated in three dimensions, they usually require large-scale computing capabilities.
- Such large-scale hybrid models require a distributed computing ecosystem, where parts of the multiscale model are executed on the most appropriate computing resource.
Distributed Multiscale Computing

9 Two Multiscale Computing paradigms
Loosely coupled:
- one single-scale model provides input to another
- single-scale models are executed once
- workflows
Tightly coupled:
- single-scale models call each other in an iterative loop
- single-scale models may execute many times
- dedicated coupling libraries are needed
[Figure: two Scale Separation Maps, spatial scale (Δx to L) vs. temporal scale (Δt to T)]

10 MAPPER: Multiscale APPlications on European e-infRastructures (proposal number )
Project Overview. Partners:
- University of Amsterdam
- Max-Planck Gesellschaft zur Foerderung der Wissenschaften E.V.
- University of Ulster
- Poznan Supercomputing and Networking Centre
- Akademia Gorniczo-Hutnicza im. Stanislawa Staszica w Krakowie
- Ludwig-Maximilians-Universität München
- University of Geneva
- Chalmers Tekniska Högskola
- University College London

11 Motivation: user needs
VPH, fusion, computational biology, material science, engineering: all have Distributed Multiscale Computing needs.

12 Ambition
- Develop computational strategies, software and services for distributed multiscale simulations across disciplines, exploiting existing and evolving European e-infrastructure.
- Deploy a computational science infrastructure.
- Deliver high-quality components aimed at large-scale, heterogeneous, high-performance, multi-disciplinary multiscale computing.
- Advance the state of the art in high-performance computing on e-infrastructures; enable distributed execution of multiscale models across e-infrastructures.

13 MAPPER Roadmap
- October 1, 2010: start of project
- Fast track deployment (first year of project): loosely and tightly coupled distributed multiscale simulations can be executed.
- Deep track deployment (second and third year): more demanding loosely and tightly coupled distributed multiscale simulations can be executed; programming and access tools available; interoperability available.

14 Highlights year 1
Seven applications from five scientific domains brought under a common generic multiscale computing framework:
- virtual physiological human
- fusion
- hydrology
- nano material science
- computational biology
Framework pipeline: SSM → coupling topology → (x)MML → task graph → scheduling.
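The last steps of that pipeline can be sketched in miniature: given a coupling topology between submodels, build a task graph and derive a scheduling order by topological sort. This is a toy illustration, not (x)MML or MAPPER's scheduler; the submodel names and couplings are invented, and it covers only acyclic (loosely coupled) topologies.

```python
from collections import defaultdict, deque

# coupling topology: (producer, consumer) edges between submodels
couplings = [("bloodflow", "wall"), ("bloodflow", "transport"),
             ("wall", "growth"), ("transport", "growth")]

def task_order(couplings):
    """Kahn's algorithm: a valid execution order for a DAG of submodels."""
    succ = defaultdict(list)
    indeg = defaultdict(int)
    nodes = set()
    for producer, consumer in couplings:
        succ[producer].append(consumer)
        indeg[consumer] += 1
        nodes.update((producer, consumer))
    ready = deque(sorted(n for n in nodes if indeg[n] == 0))
    order = []
    while ready:
        n = ready.popleft()
        order.append(n)
        for m in succ[n]:          # releasing n unblocks its consumers
            indeg[m] -= 1
            if indeg[m] == 0:
                ready.append(m)
    return order

print(task_order(couplings))  # ['bloodflow', 'wall', 'transport', 'growth']
```

In a real setting the scheduler would additionally map each task in this order onto the most appropriate resource, which is where the e-infrastructure middleware enters.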

15 Highlights year 1
- Registration of MML metadata: submodels and scales
- Application composition: from MML to executable experiment
- Execution of experiment using interoperability layer on e-infrastructure
- Result management

16 MAPPER Taskforce
Timeline: MoU signed; taskforce established; 1st evaluation; 1st EU review; two apps selected on the MAPPER e-Infrastructure (EGI and PRACE resources, Tier-0/Tier-1/Tier-2).
- Joint taskforce between EGI, MAPPER, and PRACE
- Collaborate with EGI and PRACE to introduce new capabilities and policies onto e-Infrastructures thanks to QosCosGrid (QCG) and interoperability tools
- Deliver new application tools, problem solving environments and services to meet end-users' needs
- Work closely with various end-user communities (involved directly in MAPPER) to perform distributed multiscale simulations and complex experiments

17 e-Infrastructure
Services offered (UNICORE, gLite):
- interactive access
- data management
- job execution
- post-processing, e.g. visualization
New capabilities (will hopefully be available thanks to MAPPER):
- advance reservation
- co-allocation
- cross-cluster coupling library and services (MUSCLE)
- cross-cluster MPI (QCG-OMPI)
- workflow management and execution (parallel jobs as part of workflows); ongoing discussions with SHIWA
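Advance reservation with co-allocation can be illustrated with a hedged sketch: reserve the same time window on several clusters, or on none at all, so that a cross-cluster run (e.g. MUSCLE-coupled submodels) can start simultaneously everywhere. This is illustrative all-or-nothing logic only, not the QCG API; the class and site names are invented.

```python
class Cluster:
    def __init__(self, name):
        self.name = name
        self.reservations = []  # list of (start, end) time windows

    def is_free(self, start, end):
        # free if the requested window overlaps no existing reservation
        return all(end <= s or e <= start for s, e in self.reservations)

    def reserve(self, start, end):
        self.reservations.append((start, end))

def co_allocate(clusters, start, end):
    """All-or-nothing reservation of [start, end) on every cluster."""
    if all(c.is_free(start, end) for c in clusters):
        for c in clusters:
            c.reserve(start, end)
        return True
    return False  # at least one cluster is busy; reserve nowhere

a, b = Cluster("siteA"), Cluster("siteB")
a.reserve(10, 20)                   # existing local job on siteA
print(co_allocate([a, b], 15, 25))  # False: overlaps siteA's job
print(co_allocate([a, b], 20, 30))  # True: window free everywhere
```

The all-or-nothing check is the essential point: without it, a tightly coupled simulation could hold resources on one cluster while waiting indefinitely for the other.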

18 Fast vs. Deep tracks components

19 MAPPER software integration

20 Interoperability and standards

21 Interoperability and efficiency
Example benchmarks of remote job submission using the OGF SAGA test suite and OGF BES interfaces supported by different middleware. Note that the QCG middleware also supports additional features, in particular advance reservation and co-allocation, to synchronize in time the access to distributed computing resources.

22 MAPPER e-Infrastructure (1st review, November 2011)
[Map: participating sites UCL, Cyfronet, SARA, CINECA, PSNC, and LRZ/LMU, annotated with available core counts: 10k+, 10k+, 512 (no limit), 124 (no limit)]

23 DMC demonstrations
Two application scenarios operational:
- loosely coupled DMC
- tightly coupled DMC

24 Monitoring
Provides information about the availability and functionality of MAPPER services:
- Nagios, hosted at LRZ
- real-time service status
- failure notification
- statistics
- integration with EGI and PRACE monitoring

25 Policy documents
Policy document, and dissemination of its content: "Grid Computing – The Next Decade".

26 Invitation
MAPPER live demonstration, 15:30 – 16:00 today, ground floor FMI, next to the EGI (MNM) team.