SEE-GRID-SCI
WRF-ARW application: Overview

Davor Davidovic, Rudjer Boskovic Institute, Zagreb, Croatia
PSC03, January 2009, Bucharest
www.see-grid-sci.eu

The SEE-GRID-SCI initiative is co-funded by the European Commission under the FP7 Research Infrastructures contract no.

WRF-ARW partners

Partners involved in WRF-ARW grid deployment:
- Rudjer Boskovic Institute (RBI)
- Department of Geophysics, University of Zagreb, Croatia (AMGI)
- Faculty of Graphical Art, University of Zagreb, Croatia (FGA)
- University of Banja Luka, Faculty of Electrical Engineering (UoBL)
- Federal Hydro-Meteorological Institute, Sarajevo, BiH (FHMI)
- Hydro-Meteorological Institute of Republika Srpska, BiH (RHMI)
- Georgian Research and Educational Networking Association (GRENA)

Objectives

Technical objectives:
- Port the WRF-ARW model to the grid, including pre-processing and post-processing
- Run the model at very fine resolution
- Develop a (quasi-)operational deterministic forecast chain over the Balkans, focused on the needs of Bosnia and Herzegovina
- Increase the user community

Partners' roles in the project

RBI
- Coordination and leading application development
- Porting the model to the grid, fine-tuning
AMGI
- Model development and testing
FGA
- Investigating post-processing possibilities (visualization)
UoBL
- Porting the model to the grid and fine-tuning
FHMI
- Development and testing
RHMI
- Testing, know-how of using the model; deployment of the operational model
GRENA
- Testing, know-how of using the model

Progress

- Dynamic application libraries and binaries have been built – WRF version (supporting both OpenMP and MPI), using the ifort compiler version 9.1
- Deployed the application on UI nodes (home.irb.hr) and on the storage element (grid2.irb.hr)
- Started the model locally on local clusters using MPICH (a build-and-run sketch follows this list)
- Started the model on the grid using one WN (on one CPU, 64-bit CE)
- Tried first runs on the grid using 4 CPUs
- Started working on the script for running the model on the grid (automation of submitting the model and collecting data from a remote SE to the CE)
- Organized a training event for meteorologists (grid basics and the WRF model)
- Built and ran the pre-processing (WPS) on the grid
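
The build and local-run steps above can be summarized as follows. This is a minimal sketch, assuming a standard WRF source tree (with its configure and compile scripts) and an MPICH/ifort 9.1 environment; the directory names, the configure selection and the process count are illustrative, not the exact commands used on home.irb.hr.

#!/bin/bash
# Minimal sketch of building WRF-ARW with ifort and testing it locally with MPICH.
# Paths, the configure selection and process counts are illustrative.

cd WRFV3                          # WRF source tree; directory name depends on the WRF version
./clean -a
./configure                       # select the Linux/ifort dmpar or dm+sm option when prompted
./compile em_real >& compile.log  # produces run/wrf.exe, run/real.exe, ...

# Local test run on the cluster with MPICH (4 processes), as on this slide
cd run
mpirun -np 4 ./real.exe           # create initial and boundary condition files
mpirun -np 4 ./wrf.exe            # run the forecast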

FHMI progress

- Compiled the WRF-ARW model (version ) with the PGI Fortran compiler
- Ran the model in a cluster environment with 4 CPUs using MPICH2, and on one node of the SEE-GRID-SCI site at the Faculty of Electrical Engineering, University of Sarajevo (BA-03-ETFSA)
- Working on running the WRF-ARW model with the MPICH2 library on all nodes of the BA-03-ETFSA site (see the MPICH2 sketch below)
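
For the step still in progress (spreading the MPICH2 run over all BA-03-ETFSA nodes), the commands could look like the sketch below. It assumes the MPD process manager shipped with MPICH2 at the time; the host file, host count and rank count are illustrative.

#!/bin/bash
# Sketch of extending the single-node MPICH2 run to several worker nodes.
# mpd.hosts is assumed to contain one BA-03-ETFSA worker-node hostname per line.

mpiexec -n 4 ./wrf.exe                            # current status: 4 CPUs on one node

mpdboot -n 4 -f mpd.hosts                         # start an MPD ring on 4 nodes
mpiexec -machinefile mpd.hosts -n 16 ./wrf.exe    # goal: ranks spread over all nodes
mpdallexit                                        # shut the ring down afterwards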

RHMI progress

- RHMI Banja Luka and FHMI Sarajevo first installed the WRF-ARW numerical weather model in October 2008
- RHMI BL is currently testing the model (WPS, ARWpost, GrADS, ...) and providing equipment support for the operational weather forecast model
- They plan to run other models as well, such as WRF/NMM and Eta
- Next step: run the WRF-ARW model on the grid using the cluster at UoBL

Action points until the next PSC

- Finish porting the binaries and static input data (geo_data) to the grid storage elements – M11
- Finish the scripts for automating the model start on the grid infrastructure (collecting all data and binaries) – M11 (a sketch of such a wrapper script follows this list)
- Tune the application to start on multiple WNs (using MPI) – M10
- Finish the research mode of the WRF-ARW application (ready for research use by M12) – M12
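
As an illustration of the second action point, a job wrapper of the kind being worked on could stage data in from a storage element, run the model, and register the output back, roughly as sketched below. The VO name, LFN layout and archive names are assumptions made for the example (the actual layout is exactly what discussion points 2 and 4 on the next slide are about); only standard gLite data-management commands (lcg-cp, lcg-cr) are used.

#!/bin/bash
# Sketch of a grid job wrapper for WRF-ARW: stage in, run, stage out.
# VO name, LFN paths and archive names below are hypothetical placeholders.

VO=meteo.see-grid-sci.eu                      # assumed VO name
LFN_BASE=lfn:/grid/$VO/wrf-arw                # hypothetical LFC directory

# Stage in: pre-built binaries and static geo data from the SE to the WN
lcg-cp --vo $VO $LFN_BASE/bin/wrf_bin.tar.gz  file:$PWD/wrf_bin.tar.gz
lcg-cp --vo $VO $LFN_BASE/geo/geo_data.tar.gz file:$PWD/geo_data.tar.gz
tar xzf wrf_bin.tar.gz
tar xzf geo_data.tar.gz

# Run the model (the number of processes would match the JDL request)
NP=${WRF_NP:-4}
mpirun -np "$NP" ./real.exe
mpirun -np "$NP" ./wrf.exe

# Stage out: register the forecast files on the default SE under a unique run id
RUN_ID=$(date +%Y%m%d%H%M)
for f in wrfout_d0*; do
    lcg-cr --vo $VO -l "$LFN_BASE/output/$RUN_ID/$f" "file:$PWD/$f"
done

Such a wrapper would be submitted through a JDL that requests an MPI job type and a node count matching the MPI-tuning action point; the exact attributes depend on the site configuration.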

Discussion

1. Final decision on which binaries to use: 32-bit or 64-bit, Intel (free license) or PGI
2. File scheme on storage: physical organization of all WRF binaries, other files and libraries; which storage elements to use (which are best for us)
3. Should we pre-process geo_data (geogrid.exe in WPS)? Should every partner prepare geo_data for its own area of interest and store it in the same place, for better data sharing? (see the geogrid sketch below)
4. Uniform naming for all files and data (the same for every partner, to avoid collisions of data and files)?
5. Dedicated CE elements for WRF – inspection?
6. Functionality of the scripts for starting the model:
   - Copy to/from SE elements directly or via the UI; changing parameters – namelist.input, met_data
   - WPS and ARW – one script or two?
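
As an illustration of discussion point 3, preparing geo_data once means running geogrid.exe from WPS over an agreed domain and uploading the resulting geo_em files to a shared SE location. The fragment below is a minimal sketch; the domain size, projection, resolution and geography data path are illustrative values, not an agreed SEE-GRID-SCI configuration.

#!/bin/bash
# Sketch of a one-off geo_data preparation step with geogrid.exe (WPS).
# All domain values and the geography data path are illustrative.

cat > namelist.wps <<'EOF'
&share
 wrf_core = 'ARW',
 max_dom  = 1,
/
&geogrid
 e_we           = 100,
 e_sn           = 100,
 dx             = 10000,
 dy             = 10000,
 map_proj       = 'lambert',
 ref_lat        = 44.0,
 ref_lon        = 18.0,
 truelat1       = 43.0,
 truelat2       = 45.0,
 stand_lon      = 18.0,
 geog_data_res  = '2m',
 geog_data_path = './geog'
/
EOF

./geogrid.exe    # writes geo_em.d01.nc, which partners could then upload to the shared SE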

Discussion (continued)

7. Allocation of work among partners:
   - Writing the scripts for starting the model on the grid
   - Preparation of input geo_data (met_data) and uploading it to an SE
   - Testing for the best-performing worker nodes/CEs (e.g. fast WN interconnect)