The EPIKH Project (Exchange Programme to advance e-Infrastructure Know-How)

Porting of Regional Climate Model (RegCM4.0) on EUIndia Grid
Sridhara Nayak, Dr. M. Mandal, Suman Maity
Indian Institute of Technology Kharagpur
EPIKH Workshop, Kolkata

Outline
- Scientific description of RegCM4.0
- Technical details before Grid
- Technical details after Grid
- Results
- Future plans
- Summary
- References

Scientific description of RegCM4.0
RegCM4.0 is:
- The latest version of RegCM
- Developed by the International Centre for Theoretical Physics (ICTP), Trieste, Italy
- A hydrostatic, σ-coordinate, primitive-equation model
- Written in Fortran 95
- Composed of four components: Terrain, ICBC, RegCM, and Postprocessor
- Terrain and ICBC are the two components of the RegCM preprocessor

Technical details before Grid

Operating System (OS)
- Any suitable version of Linux/UNIX

Compilers
The model can be built with the following compilers:
1. gfortran / sunf90
   - General Fortran 95 compilers available on most Linux distributions
   - The gfortran version should be > 4.2
   - These compilers are not sufficient to compile all modules of the model (serial part only)

Compilers contd…

Technical details before Grid (contd.)

2. PGI compiler (pgf90/pgcc)
   - Portland Group Fortran compiler
   - It is not freely available
   - The version should be >=
3. Intel Fortran Compiler (ifort/icc)
   - A very popular Fortran compiler
   - It is freely available
   - For our case this is the most efficient one
Python
   - Required for some utility tools of the model
   - The version should be >= 2.5 (a quick version check is sketched below)
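Before building, it is worth verifying that the installed tools meet the thresholds above. A small check sketch (assuming the compilers are already on the PATH; these commands are illustrative and not taken from the slides):

# Check the prerequisites listed above
gfortran --version   # should be > 4.2 for the serial build
ifort --version      # Intel Fortran compiler, if available
python -V            # should be >= 2.5 for the utility tools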

Libraries
1. NetCDF libraries with HDF5 and zlib or szlib
   - Input/output of RegCM4.0 is stored in NetCDF format
   - Includes data compression through the HDF5 file format and a compression library (zlib or szlib)
2. Message Passing Interface (MPI)
   - RegCM4.0 supports parallel execution through the MPI interface
   - Provides libraries, compiler wrappers and programs to execute parallel code
   - Generally, OpenMPI is used (a build sketch is given below)
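For a local installation these prerequisites come together at configure time. A minimal build sketch, assuming the autotools-based configure script of RegCM4; all paths, flags and the install prefix below are assumptions, not taken from the slides:

# Illustrative environment for building RegCM4 against NetCDF/HDF5 and OpenMPI
export NETCDF=/opt/netcdf                 # hypothetical NetCDF install prefix
export PATH=/opt/openmpi/bin:$NETCDF/bin:$PATH
export FC=ifort CC=icc MPIFC=mpif90       # Intel compilers, as preferred on the previous slide
./configure --prefix=$HOME/regcm4         # exact configure options vary by RegCM4 release
make
make install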

Technical details after Grid
- We are interested in installing RegCM4.1 on the Grid infrastructure
- The technical issues could not be solved in this short span
- RegCM4.0 is already available on one of the computing elements (the Briareo CE) of the EUIndia Grid
- An attempt has been made to run RegCM4.0 for a small domain using those executables
- A RegCM run has 4 basic steps (sketched as local commands below):
  - Domain setup (Terrain, Land Cover, SST)
  - Generation of initial and boundary conditions (ICBC)
  - Model run
  - Post-processing
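The same four steps can be exercised by hand before wrapping them for the Grid. A command-line sketch, using the binary locations that appear in the hook scripts on the later slides; the MPI launch line and the post-processing remark are assumptions, not taken from the slides:

# 1. Domain setup (Terrain, Land Cover, SST)
/opt/exp_soft/regcm/terrain regcm.in
/opt/exp_soft/regcm/sst regcm.in
# 2. Generation of initial and boundary conditions
/opt/exp_soft/regcm/icbc regcm.in
# 3. Model run (parallel executable; process count is illustrative)
mpirun -np 4 /opt/exp_soft/regcm/regcmMPI regcm.in
# 4. Post-processing of the NetCDF output (tooling depends on the local setup)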

Technical details after Grid: regcm.in

! India
&dimparam
 iy = 60, jx = 64, kz = 18, nsg = 1,
/
&geoparam
 iproj = 'LAMCON', ds = 60.0, ptop = 5.0,
 clat = 16.00, clon = 75.00, plat = 20.00, plon = 80.00,
 truelatl = 2.0, truelath = 30.0,
/
&terrainparam
 domname = 'India', itype_in = 1, ntypec = 10, ntypec_s = 10,
 ifanal = .true., smthbdy = .false., lakadj = .false.,
 fudge_lnd = .false., fudge_lnd_s = .false.,
 fudge_tex = .false., fudge_tex_s = .false.,
 ntex = 17, h2opct = 75.,
 dirter = './Input', inpter = './data',
/
&ioparam
 igrads = 1, ibigend = 1, ibyte = 4, iotyp = 1,
/
&debugparam
 debug_level = 1, dbgfrq = 3,
/
&boundaryparam
 nspgx = 12, nspgd = 12,
/
&modesparam
 nsplit = 2,
/
&lakemodparam
 lkpts = 10,
/
&globtparam
 ssttyp = 'GISST', dattyp = 'NNRP1', ehso4 = .false.,
 globidate1 = , globidate2 = ,
 dirglob = './Input', inpglob = './data',
/
&lsmparam
 lsmtyp = 'BATS'
/
&globwindow
 lat0 = 0.0
 lat1 = 0.0
 lon0 = 0.0
 lon1 = 0.0
/

regcm.in contd…

Technical details after Grid: regcm.in (contd.)

&outparam
 ifsave = .true., savfrq = 48.,
 iftape = .true., tapfrq = 6.,
 ifrad = .true., radisp = 6.,
 ifbat = .true., ifsub = .true., batfrq = 3.,
 ifchem = .true., chemfrq = 6.,
 dirout = './output',
/
&physicsparam
 iboudy = 5, ibltyp = 1, icup = 4, igcc = 1, ipptls = 1,
 iocnflx = 2, ipgf = 0, iemiss = 0, lakemod = 0, ichem = 0,
/
&subexparam
 ncld = 1, fcmax = 0.80,
 qck1land = .250E-03, qck1oce = .250E-03,
 gulland = 0.4, guloce = 0.4,
 rhmax = 1.01, rh0oce = 0.90, rh0land = 0.80,
 tc0 = 238.0, cevap = .100E-02, caccr = 3.000,
 cllwcv = 0.3E-3, clfrcvmax = 0.25,
/
&grellparam
/
&emanparam
 elcrit = D0, coeffr = 1.0D0,
/
&chemparam
 idirect = 1, inpchtrname = '', inpchtrsol = .00,
 inpchtrdpv = .00000, .00000, inpdustbsiz = .00, .00,
/
&aerosolparam
 aertyp = 'AER00D1'
 ntr = 10, nbin = 4,
/
&restartparam
 ifrest = .false., idate0 = , idate1 = , idate2 = ,
 nslice = 120,
/
&timeparam
 radfrq = 30., abemh = 18., abatm = 540., dt = 180., ibdyfrq = 6,
/

Technical details after Grid: mpi-start-wrapper.sh

#!/bin/bash

# Pull in the arguments.
#MY_EXECUTABLE=`pwd`/$1
MY_EXECUTABLE=$1
MPI_FLAVOR=$2

# Convert flavor to lowercase in order to pass it to mpi-start.
MPI_FLAVOR_LOWER=`echo $MPI_FLAVOR | tr '[:upper:]' '[:lower:]'`

# Pull out the correct paths for the requested flavor.
eval MPI_PATH=`printenv MPI_${MPI_FLAVOR}_PATH`

# Ensure the prefix is correctly set. Don't rely on the defaults.
eval I2G_${MPI_FLAVOR}_PREFIX=$MPI_PATH
export I2G_${MPI_FLAVOR}_PREFIX

# Touch the executable. It must exist for the shared file system check.
# If it does not, then mpi-start may try to distribute the executable
# (while it shouldn't do that).
#touch $MY_EXECUTABLE

# Setup for mpi-start.
export I2G_MPI_APPLICATION=$MY_EXECUTABLE
export I2G_MPI_APPLICATION_ARGS=
export I2G_MPI_TYPE=$MPI_FLAVOR_LOWER
export I2G_MPI_PRE_RUN_HOOK=mpi-hooks.sh
export I2G_MPI_POST_RUN_HOOK=mpi-hooks.sh

# If these are set then you will get more debugging information.
export I2G_MPI_START_VERBOSE=1
#export I2G_MPI_START_DEBUG=1

# Invoke mpi-start.
$I2G_MPI_START
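On the worker node the batch system runs this wrapper with the two values given in the JDL Arguments attribute shown later; the call below is only illustrative, it is not typed by hand on the Grid:

./mpi-start-wrapper.sh /opt/exp_soft/regcm/regcmMPI OPENMPI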

Technical details after Grid: mpi-hooks.sh

#!/bin/sh

# This function will be called before the execution of MPI executable.
#
pre_run_hook () {

  # STEP 0
  # Create the directories
  echo "Create necessary directories"
  mypwd=`pwd`
  cmd="mkdir RCM4simulation RCM4simulation/output RCM4simulation/Input RCM4simulation/DATA"
  echo $cmd
  $cmd
  if [ ! $? -eq 0 ]; then
    echo "Error creating directories. Exiting..."
    exit 1
  fi
  # Everything's OK.
  echo "Successfully created the dirs"

  # STEP 1
  # Transfer Input Data Files from SE to WN.
  echo "Downloading Input Data Files from SE"
  # Actually transfer the data.
  cmd="cd RCM4simulation"
  $cmd
  cmd="lcg-cp --vo euindia lfn:/grid/euindia/suman/data.tar.bz2 file:/$mypwd/RCM4simulation/data.tar.bz2"
  mypwd=`pwd`
  #cmd="cp /tmp/data.tar.bz2 $mypwd"
  echo $cmd
  $cmd
  if [ ! $? -eq 0 ]; then
    echo "Error transferring data files. Exiting..."
    exit 1
  fi
  # Everything's OK.
  echo "Successfully downloaded the files"

mpi-hooks.sh contd…

Technical details after Grid: mpi-hooks.sh (contd.)

  # STEP 2
  # Untar the data files
  echo "Untar the Data Files"
  cmd="tar xjf data.tar.bz2"
  echo $cmd
  $cmd
  if [ ! $? -eq 0 ]; then
    echo "Error untarring data files. Exiting..."
    exit 1
  fi
  cmd="lcg-cp --vo euindia lfn:/grid/euindia/suman/regcm.in file:/$mypwd/regcm.in"
  # cmd="cp /tmp/regcm.in $mypwd"
  echo $cmd
  $cmd
  if [ ! $? -eq 0 ]; then
    echo "Error copying regcm.in file. Exiting..."
    exit 1
  fi
  # Everything's OK.
  echo "Successfully untarred input files"

  # STEP 3
  # PRE Processing tools.
  echo "Running some PRE-Processing: terrain"
  cmd="/opt/exp_soft/regcm/terrain regcm.in"
  echo $cmd
  $cmd
  if [ ! $? -eq 0 ]; then
    echo "Error running terrain. Exiting..."
    exit 1
  fi
  # Everything's OK.
  echo "Successfully ran the Terrain pre-processing tool"

mpi-hooks.sh contd…

Technical details after Grid: mpi-hooks.sh (contd.)

  # STEP 4
  # PRE Processing tools.
  echo "Running some PRE-Processing: sst"
  cmd="/opt/exp_soft/regcm/sst regcm.in"
  echo $cmd
  $cmd
  if [ ! $? -eq 0 ]; then
    echo "Error running sst. Exiting..."
    exit 1
  fi
  # Everything's OK.
  echo "Successfully ran the sst pre-processing tool"

  # STEP 5
  # PRE Processing tools.
  echo "Running some PRE-Processing: icbc"
  cmd="/opt/exp_soft/regcm/icbc regcm.in"
  echo $cmd
  $cmd
  if [ ! $? -eq 0 ]; then
    echo "Error running icbc. Exiting..."
    exit 1
  fi
  # Everything's OK.
  echo "Successfully ran the icbc pre-processing tool"

  return 0
}

# This function will be called after the execution of MPI executable.
# A typical case for this is to upload the results to a storage element.
post_run_hook () {
  echo "Executing post hook."
  echo "Finished the post hook."
  return 0
}
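The post_run_hook above only prints messages. A typical completion would archive the model output and register it on a Storage Element; a minimal sketch of that idea is shown below, where the archive name and the LFN destination are hypothetical and not taken from the slides:

post_run_hook () {
  echo "Executing post hook."
  # Pack the output produced by the model run (directory created in STEP 0)
  tar cjf output.tar.bz2 RCM4simulation/output
  # Copy and register the archive on the Grid (LFN path is hypothetical)
  lcg-cr --vo euindia -l lfn:/grid/euindia/suman/output.tar.bz2 file:$PWD/output.tar.bz2
  if [ ! $? -eq 0 ]; then
    echo "Error uploading results."
    return 1
  fi
  echo "Finished the post hook."
  return 0
}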

Technical details after Grid: regcm.jdl

[
  JobType = "Normal";
  CERequirements = "smpgranularity==4";
  CpuNumber = 4;
  Executable = "mpi-start-wrapper.sh";
  Arguments = "/opt/exp_soft/regcm/regcmMPI OPENMPI";
  StdOutput = "standard.out";
  StdError = "standard.err";
  InputSandbox = {"mpi-start-wrapper.sh", "mpi-hooks.sh"};
  OutputSandbox = {"standard.err", "standard.out"};
  OutputSandboxBaseDestUri = "gsiftp://briareo.grid.elettra.trieste.it/tmp";
  Requirements = (other.GlueCEUniqueID == "briareo.grid.elettra.trieste.it:8443/cream-pbs-iblade");
]
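The slides do not show the submission step itself; with a JDL like this one, the usual gLite WMS sequence would look as follows (a sketch; the job-identifier file name is illustrative):

# Create a VOMS proxy for the euindia VO
voms-proxy-init --voms euindia
# Submit the job through the WMS, saving the job identifier to a file
glite-wms-job-submit -a -o jobid.txt regcm.jdl
# Monitor the job and, once it is Done, retrieve the output sandbox
glite-wms-job-status -i jobid.txt
glite-wms-job-output -i jobid.txt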

Results

- Domain chosen: Central India (58°E to 92°E, 0°N to 32°N)
- Grid resolution taken: 60 km

Status of the four steps:
- Domain setup (Terrain, Land Cover, SST): done
- Generation of initial and boundary conditions (ICBC): done
- Model run: not done
- Post-processing: not done

- Domain setup and generation of the ICBC were completed successfully
- The next two steps were not completed due to some missing information in regcm.in (the parameter file)
- This problem can be sorted out and the model can then be run at our home institution

Future plans

- RegCM4.0 will be implemented over the Indian region (30°E to …°E and 15°S to 45°N)
- Grid resolution of 30 km
- Duration of simulation is ~40 years
- RegCM4.0 with the CLM3.5 coupler will be used
- Intel compilers (ifort) will be chosen

Summary

- RegCM4.0 is already available on one of the computing elements (the Briareo CE) of the EUIndia Grid
- The JDL file and shell scripts have been successfully written and tested for the domain setup and for generating the ICBC
- The present problems can be sorted out so that the model runs successfully
- The model can then be implemented over our future domain

References

- MPI Job Submission on Briareo
- RegCM: Installing and configuring the RegCM package
- Gridseed | Tutorials – RegCM
- Simple Job Cycle: an explanation of how to use the job management commands to prepare and submit a simple job, monitor its status and retrieve the output
- RegCM4 Tutorial, part 2: running a simulation
  lab.org/gf/project/regcm/wiki/?pagename=How+to+setup+and+run+a+simple+RegCM4+simulation

Thank You
Any Questions?