
Energy efficient SCalable Algorithms for weather Prediction at Exascale (ESCAPE)
www.hpc-escape.eu
Peter Bauer

ESCAPE key objectives
- Define fundamental algorithm building blocks ("Weather & Climate Dwarfs") to co-design, advance, benchmark and efficiently run the next generation of NWP and climate models on energy-efficient, heterogeneous HPC architectures.
- Combine frontier research on algorithm development and extreme-scale, high-performance computing applications with novel hardware technology, to create a flexible and sustainable weather and climate prediction system.
- Foster the future design of Earth-system models and the commercialisation of weather-dependent innovative products and services in Europe by enabling open-source technology.
- Pair world-leading NWP with innovative HPC solutions.

ESCAPE European impact map
[map figure: country counts per category: 34, 7, 11 and 16 countries]

ESCAPE partners & expertise
[partner map: each partner labelled with its expertise]
Expertise across the consortium includes: global and regional operational NWP; petaflop and teraflop data centres; numerical methods; parallel computing; performance modelling; physical processes; and key technology development.

Traditional science workflow [Schulthess 2015]

Future science workflow [Schulthess 2015]
[diagram legend: science-specific code vs. generic code]

Energy efficiency: aiming at minimizing Watts per forecast
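The "Watts per forecast" idea can be made concrete as energy-to-solution. A minimal sketch, with hypothetical numbers (the 2 MW system and 1-hour production time are assumptions, not ESCAPE figures):

```python
def joules_per_forecast(avg_power_watts, time_to_solution_s):
    """Energy-to-solution for one forecast: energy (J) = power (W) * time (s)."""
    return avg_power_watts * time_to_solution_s

# Hypothetical numbers: a 2 MW system producing one forecast in 1 hour.
energy_j = joules_per_forecast(2.0e6, 3600.0)
print(energy_j)  # 7200000000.0 J, i.e. 2 MWh per forecast
```

Lowering either the average power draw or the time-to-solution improves the metric, which is why both hardware choice and algorithm redesign enter the picture.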

Model development: ESCAPE
Energy efficient SCalable Algorithms for weather Prediction at Exascale (hpc-escape.eu)
Disassemble the global NWP model into key components (= dwarfs), extract and redesign them, optimize for energy efficiency on new hardware, then reassemble the global NWP model.

Computational cost
[chart comparing: global spectral, regional spectral, regional Eulerian]

What's wrong with weather and climate codes?
- Advection: halo communication, scalability
- Transforms: communication bandwidth, memory
- Physics: expensive calculations, scalable
- Ocean waves: expensive calculations, load balancing
- 3D solver: iterations, memory, communication bandwidth
- No. 1: sequential in time steps (10 d x 24 h x 3600 s / 450 s = 1920 steps at 9 km)

Dwarfs:
- Spectral transforms (FT/LT and bi-FT): very memory and communication bandwidth intensive, possibly limited scalability
- 2- and 3-dimensional elliptic solver: compute and communication latency intensive, possibly limited scalability
- Semi-Lagrangian advection: communication intensive, possibly limited scalability
- Flux-form finite-volume advection: local communication, latency intensive, limited scalability
- Cloud physics parameterization: expensive computations, independent vertical columns, scalable
- Radiation parameterization: expensive computations in spatial, temporal and wavenumber space, scalable
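Why semi-Lagrangian advection is communication intensive can be seen from a minimal 1D sketch (pure Python, periodic boundaries; an illustration, not ESCAPE code): each arrival point interpolates the field at an upstream departure point, which on a distributed grid may lie in a neighbour's subdomain and so requires halo exchange.

```python
def semi_lagrangian_step(field, velocity, dt, dx):
    """One 1D semi-Lagrangian advection step with periodic boundaries.

    For each grid point, trace the departure point upstream and linearly
    interpolate the field there. On a distributed grid the departure point
    can fall outside the local subdomain -- that lookup is the
    communication-heavy part of the dwarf.
    """
    n = len(field)
    new = [0.0] * n
    for i in range(n):
        x_dep = (i - velocity * dt / dx) % n  # departure point, in grid indices
        j = int(x_dep)
        frac = x_dep - j
        new[i] = (1.0 - frac) * field[j] + frac * field[(j + 1) % n]
    return new

# Advect a unit bump one cell downstream (velocity * dt / dx = 1).
u0 = [0.0, 0.0, 1.0, 0.0]
u1 = semi_lagrangian_step(u0, velocity=1.0, dt=1.0, dx=1.0)
print(u1)  # [0.0, 0.0, 0.0, 1.0]
```

Unlike Eulerian schemes, the step is stable for large Courant numbers, which is what allows the long 450 s time steps counted above.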

Dwarfs & adaptation

Separation of concerns: Atlas

Atlas support of grids
A Grid object is a collection of grid points (structured or unstructured), for global or regional model set-up, with hierarchical interpretations.
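The grid-object idea can be sketched as follows. This is an illustrative Python sketch only, not the real Atlas API (the `Grid` class and its methods here are hypothetical): a grid is always usable as a flat collection of points, and a structured interpretation is layered on top when available.

```python
class Grid:
    """Hypothetical grid abstraction (NOT the Atlas C++/Fortran API):
    a flat collection of (lon, lat) points, optionally with a
    structured view given as points-per-latitude-row."""

    def __init__(self, lonlats, nx_per_row=None):
        self.lonlats = lonlats        # unstructured view: flat point list
        self.nx_per_row = nx_per_row  # structured view, if the grid has one

    def size(self):
        return len(self.lonlats)

    def is_structured(self):
        return self.nx_per_row is not None

# A tiny reduced-grid-like example: fewer points in rows near the poles.
rows = [2, 4, 4, 2]
pts = [(360.0 * i / n, 90.0 - 180.0 * (r + 0.5) / len(rows))
       for r, n in enumerate(rows) for i in range(n)]
grid = Grid(pts, nx_per_row=rows)
print(grid.size(), grid.is_structured())  # 12 True
```

The separation of concerns is the point: numerics written against the point collection keep working whether or not a structured interpretation exists.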

Atlas & GridTools
GridTools library and interfaces; Atlas data structures & functionalities.

Algorithmic flexibility: path and control volume

Algorithmic flexibility: spectral and grid-point

Dwarf performance
[performance chart: scaling as more fields are added]

Porting of the LAITRI dwarf
LAITRI performs the interpolations for the semi-Lagrangian advection scheme. The settings needed to achieve good performance: correct use of OpenMP (e.g. ensuring correct 'first touch' of data), suitable compiler data-alignment directives to obtain good vectorisation, and prudent setting of runtime variables such as KMP_AFFINITY and KMP_HW_SUBSET. Aim: integrate these settings in Atlas, away from the science code!
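A sketch of what "prudent setting of runtime variables" might look like in a launcher script. KMP_AFFINITY and KMP_HW_SUBSET are Intel OpenMP runtime variables (named on the slide); the specific values and the dwarf binary name here are assumptions for illustration:

```python
import os

# Hypothetical launcher environment for a dwarf run.
env = dict(os.environ)
env["OMP_NUM_THREADS"] = "4"                      # threads per task (example value)
env["KMP_AFFINITY"] = "compact,granularity=fine"  # pin threads (Intel OpenMP runtime)
env["KMP_HW_SUBSET"] = "1s,4c,1t"                 # 1 socket, 4 cores, 1 HW thread/core
# subprocess.run(["./laitri_dwarf"], env=env)     # binary name is hypothetical
print(env["KMP_AFFINITY"])
```

Keeping such settings in the launcher (and ultimately inside Atlas) rather than in the science code is exactly the separation of concerns the slide argues for.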

Weather & Climate in H2020
[project landscape diagram; roles include:]
- Reference applications and high-impact application demonstrators
- Dissemination across the community; showcase of new technology
- Co-design of hardware & programming models
- Ingestion of dwarfs in full-scale Earth-system models; dissemination platform for dwarfs
- Evaluation against alternative programming models
- Performance assessment and optimization tools; feedback on tool applicability and value
- PantaRhei: finite-volume fully compressible core, global NWP implementation

ESCAPE Staff

The 10-year production challenge
Workflow: data acquisition → single model run / multi-model run (scenarios) → product generation → dissemination (RMDCN, Internet, web services) → archive (Data Handling System, Climate Data Store).
Observations shared worldwide: O(GB). Dissemination to 300 destinations (national met services and commercial customers): O(TB).
No. 1 goal: global 1 km simulations at >1 SYPD. Computing and data challenges grow by a factor of 100-1000.
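The quoted factor of 100-1000 is consistent with back-of-envelope resolution scaling (a standard CFL argument, not a figure from the slide): going from 9 km to 1 km multiplies the horizontal columns by the square of the resolution ratio and shrinks the time step roughly linearly with grid spacing.

```python
# Back-of-envelope cost scaling for a 9 km -> 1 km resolution change.
resolution_ratio = 9.0 / 1.0            # grid spacing ratio
column_factor = resolution_ratio ** 2   # ~81x more horizontal grid columns
step_factor = resolution_ratio          # ~9x more (shorter) time steps, via CFL
print(column_factor * step_factor)      # 729.0 -- inside the quoted 100-1000x range
```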

The 10-year technical & programmatic challenge
Top-3 computing bottlenecks: memory bandwidth, memory size/layout, network latency... but not FLOPS! [G. Kalbe, 2017, European HPC, timeline overview]

The 10-year impact challenge
[the NWP value chain, after J. Lazo 2016]
Observations (satellites, radar, ground stations) + computing → nowcasting, weather forecasting, climate prediction, reanalyses → direct data links, Internet, TV, radio, other → threat, impact, probability, reliability → warning, protection, buying/selling, investing.