HPC use case: Glaciology/Ice sheet modelling and supporting scientific workflows
Helmut Neukirchen, helmut@hi.is, University of Iceland
Using slides from Shahbaz Memon, Dorothée Vallot and Thomas Zwinger

About me
Helmut Neukirchen: Professor of Computer Science and Software Engineering, University of Iceland.
Software engineering for distributed systems, e.g. HPC/eScience such as scientific workflows.
Collaboration with Jülich Supercomputing Centre, Germany; Morris Riedel comes to the University of Iceland every year as a guest professor.
Being an associated partner of NeIC is worthwhile, just like for the partners in Estonia.
Organised an HPC workshop in Iceland this August, supported by NeIC; a success according to the participant evaluation.

Context of use case
NordForsk-funded Nordic Centre of Excellence eSTICC (eScience Tools for Investigating Climate Change at High Northern Latitudes):
Climate modelling (millions of CPU hours needed in Norway),
Ice sheet modelling (to predict sea-level rise).
2 senior researchers: Helmut Neukirchen (scientific workflows, University of Iceland), Thomas Zwinger (glaciology, CSC, Finland).
2 PhD students: Shahbaz Memon (scientific workflows, University of Iceland and Jülich Supercomputing Centre, Germany), Dorothée Vallot (glaciology, Uppsala University, Sweden).

Use case: Glacier flow and calving
Ice deformation and sliding: modelled as a continuous process (finite element method).
Calving: modelled as a discrete process (discrete element model).

Elmer/Ice
Elmer: Finite Element Method (FEM) software; Elmer/Ice is an add-on for modelling ice sheet flow.
Mainly developed by Thomas Zwinger at CSC; open source; parallel processing.
In FEM, finite elements may have different sizes (and even shapes).
Adaptive mesh of finite elements: fine-grained where a lot of stress occurs, coarse where stress is minor. This helps to save CPU time.
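
As an illustration of the "parallel processing" point, the sketch below drives a partitioned Elmer run from Python. The mesh directory glacier_mesh, the case file ice_flow.sif and the partition count are hypothetical, and the exact ElmerGrid flags vary between Elmer versions; treat this as a sketch, not the project's actual setup.

```python
# Minimal sketch of launching a parallel Elmer/Ice run (hypothetical case names;
# assumes Elmer's command-line tools are on PATH; flags vary by Elmer version).
import subprocess

N_PARTS = 16  # number of mesh partitions = number of MPI ranks

# Partition an existing Elmer mesh directory with Metis.
subprocess.run(["ElmerGrid", "2", "2", "glacier_mesh", "-metis", str(N_PARTS)],
               check=True)

# Run the solver in parallel; the case is described by a Solver Input File (SIF).
subprocess.run(["mpirun", "-np", str(N_PARTS), "ElmerSolver_mpi", "ice_flow.sif"],
               check=True)
```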

Helsinki Discrete Element Model (HiDEM) for ice calving
Particle-based model of calving [Jan Åström et al.: A particle-based simulation model for glacier dynamics, The Cryosphere, 2013].
Parallel processing; closed source, only available on the CSC cluster Sisu.
All particles are of the same size.
High spatial resolution (on the scale of the cracks) and high temporal resolution (spatial scale divided by the speed of sound): computationally very expensive.
The glacier is divided into discrete particles, an initially dense package with random properties to simulate cracks and flaws in the ice.
Elastic fracture strain ε_b; change of stability between sub- and super-critical states.
Iterations over time (time step < 10^-4 s).
[Figure: cross-section of the glacier front, showing the ice, the water and the water depth.]
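
HiDEM itself is closed source, but the scheme the slide outlines (equal-size particles, bonds that fracture at strain ε_b, explicit steps below 10^-4 s) can be sketched generically. Everything below (the names, the placeholder force routine, the constants) is illustrative, not HiDEM code.

```python
# Generic discrete-element time-stepping sketch (illustrative, not HiDEM code).
import numpy as np

N = 1_000          # number of particles (a real run uses vastly more)
DT = 1e-4          # time step in seconds (slide: < 10^-4 s)
EPS_B = 1e-3       # elastic fracture strain threshold (made-up value)

pos = np.random.rand(N, 3)    # random initial packing: models cracks and flaws
vel = np.zeros((N, 3))
mass = np.ones(N)             # equal-size particles -> equal mass

def bond_forces(pos: np.ndarray) -> np.ndarray:
    """Placeholder for the pairwise elastic bond/contact forces."""
    return np.zeros_like(pos)

for step in range(10_000):                # explicit integration over tiny steps
    f = bond_forces(pos)
    vel += DT * f / mass[:, None]
    pos += DT * vel
    # A full model would test bond strains against EPS_B here and break bonds,
    # switching the local state from sub- to super-critical (fracture).
```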

Scientific workflow: coupling FEM & DEM
Shared preprocessing (serial),
Generate mesh (serial),
Loop until the number of observations is reached:
  ElmerSolver (parallel, on cluster 1),
  Elmer -> Particle conversion (serial),
  Particle calving (parallel, on cluster 2),
  Particle -> Elmer conversion (serial).
Originally a 2000-line shell script; now driven by the workflow engine of the UNICORE middleware. A sketch of the loop follows below.
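
The same loop, rendered as a minimal Python sketch. The real implementation is a UNICORE workflow (originally a ~2000-line shell script); every command name below is a placeholder, so the sketch only prints the steps instead of executing them.

```python
# Hypothetical rendering of the FEM/DEM coupling loop (placeholder commands).
import subprocess

DRY_RUN = True  # only print steps; flip in a real, configured environment

def run(cmd: list[str]) -> None:
    """Execute one workflow step (printed only in this sketch)."""
    print("step:", " ".join(cmd))
    if not DRY_RUN:
        subprocess.run(cmd, check=True)

run(["preprocess"])                                # shared preprocessing (serial)
run(["generate_mesh"])                             # mesh generation (serial)

NUM_OBSERVATIONS = 100                             # one iteration per observation
for _ in range(NUM_OBSERVATIONS):
    run(["mpirun", "ElmerSolver_mpi", "flow.sif"]) # FEM flow (parallel, cluster 1)
    run(["elmer_to_particle"])                     # FEM -> DEM conversion (serial)
    run(["mpirun", "hidem", "calving.in"])         # DEM calving (parallel, cluster 2)
    run(["particle_to_elmer"])                     # DEM -> FEM conversion (serial)
```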

Use case: Kronebreen glacier, Spitsbergen, Svalbard
One of the largest glacier streams on Svalbard, draining about 690 km²; average speed: 2 m/d.
Status:
Models coupled and executed via shell script by Dorothée Vallot.
Many iterations desirable: comparing the model with reality, which was observed every 11 days for 3 years.
5 000 CPU h FEM + 20 000 CPU h DEM for every iteration.
Abstract scientific workflows developed by Shahbaz Memon without any allocated CPU time; tested using local runs with dummy calving (see the stub sketch below).
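
How can a workflow be tested without any allocated CPU time? By substituting the expensive HiDEM step with a cheap stand-in. The stub below is a guess at what such a dummy calving step might look like; the file names and the pass-through behaviour are assumptions, not the actual test harness.

```python
# Hypothetical dummy stand-in for the HiDEM calving step (assumed behaviour:
# pass the glacier-front geometry through unchanged, so the surrounding
# workflow logic can be exercised locally without any CPU allocation).
import shutil
import sys

def dummy_calving(infile: str, outfile: str) -> None:
    """Pretend to calve: copy the front geometry through unmodified."""
    shutil.copyfile(infile, outfile)

if __name__ == "__main__":
    dummy_calving(sys.argv[1], sys.argv[2])  # e.g. front_in.dat front_out.dat
```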

Status and conclusion
UNICORE middleware for workflows operational at CSC since last week; Elmer/Ice and HiDEM were already installed at CSC.
Now waiting for a CPU cycle allocation at CSC: 200 000 h applied for via Dellingr last Thursday. This would allow a few iterations of the coupled ice sheet models for Kronebreen (see the check below), needed both by Dorothée Vallot to finish her PhD on modelling Kronebreen and by Shahbaz Memon to evaluate his workflow abstraction.
Outlook: a generic one-click ice sheet simulation becomes possible for glaciologists via Shahbaz's workflow; this may create a need for even more CPU hours.
Note: in addition to the glaciologists in Uppsala and Helsinki, glaciologists at the University of Iceland also run extensive simulations (coupled climate/ice sheet). They use a cluster of the Danish Meteorological Office (located at the Icelandic Met Office), but also the Icelandic HPC cluster Garpur for smaller trial runs.
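
A quick back-of-the-envelope check, using only the CPU-hour figures from the Kronebreen slide, shows what "a few iterations" means here:

```python
# Allocation check with the figures from the slides.
FEM_HOURS = 5_000      # CPU h per iteration (Elmer/Ice)
DEM_HOURS = 20_000     # CPU h per iteration (HiDEM)
ALLOCATION = 200_000   # CPU h applied for via Dellingr

print(ALLOCATION // (FEM_HOURS + DEM_HOURS))  # -> 8 coupled iterations at most
```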