Meteo-GRID: World-wide Local Weather Forecasts by GRID Computing
Deutscher Wetterdienst, PO Box 10 04 65, D-63004 Offenbach am Main, Germany

Presentation transcript:

Meteo-GRID: World-wide Local Weather Forecasts by GRID Computing
Deutscher Wetterdienst, PO Box 10 04 65, D-63004 Offenbach am Main, Germany
Claus-Jürgen Lenz, Detlev Majewski, G.-R. Hoffmann

Contents:
- Introduction to EUROGRID and Meteo-GRID
- Detailed description of Meteo-GRID: computational requirements, status of work
- Demonstration example

EUROGRID: Application Testbed for European GRID Computing
Pallas GmbH (Gesellschaft für Parallele Anwendungen und Systeme mbH), Hermülheimer Straße 10, D-50321 Brühl, Germany
Volume: 33 person years; 2 million Euro funding by the European Commission (Grant No. IST-...); funding period: Nov. 2000 - Oct. 2003

EUROGRID Vision: Build a European GRID infrastructure that gives users seamless, secure access to High Performance Computing resources and that advances computational science in Europe

EUROGRID Goals
- Integrate resources of leading European HPC centres into a European HPC GRID
- Develop new software components for GRID computing
- Demonstrate the Application Service Provider (ASP) model for HPC access (HPC portal) for different applications
- Contribute to international GRID development

Structure of the Work
- Application GRIDs: application-specific interfaces, evaluation of GRID solutions (Bio-GRID, Meteo-GRID, CAE-GRID)
- HPC GRID Infrastructure: connect HPC centres using UNICORE technology
- Development and integration of new software components
- Dissemination and exploitation

European Testbed for GRID Applications
Bio-GRID: Operate a GRID for biomolecular simulations; develop interfaces to existing biological and chemical codes.
Meteo-GRID: Develop a relocatable version of DWD's weather prediction model; goal: weather prediction-on-demand as an ASP solution.
CAE-GRID: Coupled simulations of aircraft; HPC portals for EADS engineers and for engineers at DaimlerChrysler and partners; develop GRID technology for computing cost estimates and billing.
HPC Research GRID: Demonstrate a European HPC GRID testbed; develop new GRID applications; enable sharing of competence and know-how; agree on security standards, certification, access policies, ...
Technology Development: Build on the functionality of UNICORE and extend it to provide the middleware necessary for the domain-specific GRIDs: efficient data transfer, resource brokerage, ASP services, application coupling, interactive access.

EUROGRID Partners
HPC Centres: CSCS Manno (CH), FZ Jülich (D), ICM Warsaw (PL), IDRIS Paris (F), Univ. Bergen (N), Univ. Manchester (UK)
Users: Deutscher Wetterdienst, EADS, T-Systems (Assistant Partner)
Integration: Pallas (Project Coordinator), Fecit (Assistant Partner)

Goal of Meteo-GRID: To provide high-resolution short-range weather forecasts with the relocatable nonhydrostatic Lokal-Modell (LM) of DWD for any desired region in the world

Meteo-GRID: Develop a relocatable version of DWD's weather prediction model; weather prediction-on-demand as an ASP solution

Meteo-GRID: Meteorological Portal (Hoffmann, DWD)

Meteo-GRID: Potential Users
- Other meteorological services
- Weather service providers (commercial application)
- Individuals via the Internet (weather forecast on demand)
- Individuals via mobile telephones (WAP services)
(Hoffmann, DWD)

What's special about Meteo-GRID? (1)
- Real-time weather forecasting is a time-critical task: a 48-h forecast must be completed in less than 60 minutes.
- LM is a large MPP code of about ... lines of code, written in Fortran95 and using MPI for message passing.
- Weather forecasting is computationally expensive: ~4000 Flop per grid point and time step, i.e. ~15 TFlop for a 48-h forecast (160 x 160 x 35 grid points, grid resolution ~7 km), or ~3000 s at a sustained speed of 5 GFlop/s.
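The quoted totals follow directly from the grid size and the per-point cost; a minimal Python sketch reproduces them, assuming a ~40 s time step (inferred from the stated totals, not given on the slide):

```python
# Back-of-the-envelope check of the LM cost figures quoted above.
nx, ny, nz = 160, 160, 35          # grid points at ~7 km resolution
flop_per_point_step = 4000.0       # ~4000 Flop per grid point and time step
dt = 40.0                          # assumed time step [s] (not on the slide)
forecast_hours = 48

steps = forecast_hours * 3600 / dt                       # 4320 time steps
total_flop = nx * ny * nz * flop_per_point_step * steps  # ~1.5e13 Flop

sustained = 5e9                                          # 5 GFlop/s sustained
print(f"total work: {total_flop / 1e12:.1f} TFlop")      # ~15 TFlop
print(f"runtime   : {total_flop / sustained:.0f} s")     # ~3000 s < 60 min
```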

CPU requirements of LM

What's special about Meteo-GRID? (2)
- Weather forecasting requires high bandwidth for data transfer: forecast data (at hourly intervals) amount to (48+1) x 20 MByte = ~1 GByte; transfer in less than 1 hour requires ~2.4 Mbit/s.
- Weather has a large social and economic impact worldwide (storms, floods, snow, freezing rain, ...).
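The bandwidth figure follows from the stated sizes; a minimal sketch of the arithmetic:

```python
# Reproducing the data-transfer estimate: 49 hourly output files of
# ~20 MByte each must reach the user within the one-hour window.
files = 48 + 1                     # hourly forecast data, including t = 0
mbyte_per_file = 20
total_mbyte = files * mbyte_per_file          # 980 MByte, i.e. ~1 GByte

seconds = 3600                                # less than one hour
mbit_per_s = total_mbyte * 8 / seconds
print(f"volume  : {total_mbyte / 1024:.2f} GByte")
print(f"required: {mbit_per_s:.1f} Mbit/s")   # ~2.2-2.4 Mbit/s
```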

Damage following cyclone Lothar in southwestern Germany (26 Dec 1999)

Flood on the Vistula river, summer 2001; coastal storm at Hamburg and on the North Sea; blizzard in New York

Tasks of Meteo-GRID (1)
Selection of
- model domain,
- grid resolution,
- forecast date,
- forecast range and
- forecast products
using a Graphical User Interface (GUI); a sketch of such a request follows below.
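A hypothetical sketch of the request such a GUI might assemble; every field name and value here is illustrative, not the actual Meteo-GRID interface:

```python
# Hypothetical Meteo-GRID forecast request as assembled by the GUI.
request = {
    "domain": {                       # model domain corners [degrees]
        "lat_min": 47.0, "lat_max": 55.0,
        "lon_min": 5.0,  "lon_max": 16.0,
    },
    "resolution_km": 7.0,             # grid resolution
    "forecast_date": "2003-06-15T00:00Z",
    "forecast_range_h": 48,           # forecast range in hours
    "products": ["T_2M", "TOT_PREC", "U_10M", "V_10M"],  # example fields
}
```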

Meteo-GRID GUI (1) Nellari, Ballabio (CSCS Manno)

Meteo-GRID GUI (2) Nellari, Ballabio (CSCS Manno)

Tasks of Meteo-GRID (2)
Derivation of topographical data for the selected model domain from high-resolution (1 km x 1 km) GIS data sets at DWD
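As a rough illustration of this derivation, the sketch below block-averages a 1 km field onto a ~7 km model grid; the actual TOPO processing at DWD is more elaborate (e.g. dominant classes rather than means for soil type), so treat this as an assumption-laden sketch:

```python
import numpy as np

def block_mean(field_1km: np.ndarray, factor: int = 7) -> np.ndarray:
    """Average factor x factor blocks of a high-resolution field."""
    ny, nx = field_1km.shape
    ny, nx = ny - ny % factor, nx - nx % factor   # trim to a multiple
    trimmed = field_1km[:ny, :nx]
    return trimmed.reshape(ny // factor, factor,
                           nx // factor, factor).mean(axis=(1, 3))

orography_1km = np.random.rand(1120, 1120) * 3000.0  # dummy heights [m]
orography_lm = block_mean(orography_1km)             # 160 x 160 model grid
print(orography_lm.shape)
```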

Examples for TOPO applications (1)

Examples for TOPO applications (2): soil types (water, peat, clay, loamy clay, loam, loamy sand, sand, rock/concrete, ice/glacier, undefined)

Examples for TOPO applications (3)

Tasks of Meteo-GRID (3)
Derivation of
- an initial data set and
- lateral boundary data sets
for LM from data of the global model GME of DWD (Oracle data base)

GME model grid and LM domain

Tasks of Meteo-GRID (4)
- The LM forecast run is performed on any supercomputer available in EUROGRID using UNICORE technology.
- Forecast data (GRIB code) are returned to the user via UNICORE and the Internet, OR ...

Tasks of Meteo-GRID (5)
OR ...
- Visualization of LM forecasts (1- to 5-dimensional graphics) on the HPC computer and subsequent return of image files to the user via UNICORE and the Internet
- Verification and validation of LM forecasts for any region worldwide

Information and data flow in Meteo-GRID (1)
1. Set-up of the LM domain: via the GUI the user selects domain corners, grid resolution, forecast date, forecast range and forecast products. From the global topographical data set (GIS, ~7 GByte), a topographical data set for the LM domain (1-5 MByte) is calculated at DWD on an SGI Origin 2000 or IBM RS/6000 SP (wallclock time of the order of minutes).

Information and data flow in Meteo-GRID (2)
2. Define forecast date and range: hourly initial and lateral boundary data sets on the GME grid (~50 MByte) are taken from the GME data base (Oracle); the GME results covering the LM domain are extracted at DWD (SGI Origin 2000 or IBM RS/6000 SP).
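A minimal sketch of the extraction idea: keep only GME grid points inside (a margin around) the LM domain before shipping the boundary data. Dummy flat arrays stand in for GME's real icosahedral grid; all names and sizes here are illustrative:

```python
import numpy as np

def points_in_domain(lat, lon, lat_min, lat_max, lon_min, lon_max, margin=1.0):
    """Boolean mask of grid points inside the LM domain plus a margin [deg]."""
    return ((lat >= lat_min - margin) & (lat <= lat_max + margin) &
            (lon >= lon_min - margin) & (lon <= lon_max + margin))

lat = np.random.uniform(-90, 90, 163842)    # dummy GME point latitudes
lon = np.random.uniform(-180, 180, 163842)  # dummy GME point longitudes
mask = points_in_domain(lat, lon, 47.0, 55.0, 5.0, 16.0)
print(f"{mask.sum()} of {mask.size} GME points cover the LM domain")
```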

Information and data flow in Meteo-GRID (3)
3. Perform the LM forecast on a EUROGRID HPC and send forecast data to the user: the topographical data set and the initial and lateral boundary data sets on the GME grid (~50 MByte) are transferred to the HPC; GME2LM interpolates the GME results to the LM grid, yielding the initial and hourly lateral boundary data sets on the LM grid (~... GByte); LM calculates the weather forecast; the LM forecast data are returned to the user and/or visualised.

For more information ...
about DWD: ...
about Pallas GmbH: ...
about UNICORE: ...
about EUROGRID: ...