MPI Applications in the Interactive European Grid. Dr. Isabel Campos Plasencia, Instituto de Física de Cantabria, IFCA (Santander), Consejo Superior de Investigaciones Científicas (CSIC). EGEE'07, Budapest, 1st-5th October 2007

Computational Chemistry: GROMACS
GROMACS has been installed in the i2g testbed
 Together with the fftw3 libraries
 In the generally accessible application software directory VO_ICOMPCHEM_SW_DIR
 We have used Open MPI as the MPI implementation, with the GNU compilers

JDL job script:
JobType = "Parallel";
SubJobType = "OpenMPI";
NodeNumber = 4;
VirtualOrganisation = "icompchem";
Executable = "mdrun";
Arguments = "-v -s full -e full -o full -c after_full -g flog -N 4";
StdOutput = "std.out";
StdError = "std.err";
InputSandbox = {"speptide.top","after_pr.gro","full.mdp","gromacs_hooks.sh"};
OutputSandbox = {"std.out","std.err"};
Environment = {"I2G_MPI_PRE_RUN_HOOK=./gromacs_hooks.sh","I2G_MPI_POST_RUN_HOOK=./gromacs_hooks.sh"};
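As a hedged illustration (the client commands are not shown on the slide): a JDL like this would typically be saved to a file and submitted through the int.eu.grid CrossBroker user interface. The command names below (voms-proxy-init, i2g-job-submit, i2g-job-status, i2g-job-get-output) are assumed from the int.eu.grid user tutorials and may differ on a given UI.

# Sketch of a typical submission session from an int.eu.grid User Interface
# (assumes the i2g-* CrossBroker client tools and membership in the icompchem VO)
voms-proxy-init -voms icompchem        # create a VOMS proxy for the VO
i2g-job-submit gromacs.jdl             # submit the JDL above; prints a job identifier
i2g-job-status <jobID>                 # poll until the job reaches the Done state
i2g-job-get-output <jobID>             # retrieve std.out / std.err (the OutputSandbox)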

Computational Chemistry: GROMACS, mpi-start hooks (gromacs_hooks.sh):

export OUTPUT_ARCHIVE=output.tar.gz
export OUTPUT_HOST=se.i2g.cesga.es
export OUTPUT_SE=lfn:/grid/icompchem/test
export OUTPUT_VO=icompchem

pre_run_hook () {
  ### Here come the pre-mpirun actions of GROMACS
  export PATH=$PATH:/$VO_ICOMPCHEM_SW_DIR/gromacs-3.3/bin
  grompp -v -f full -o full -c after_pr -p speptide -np 4
}

post_run_hook () {
  echo "post_run_hook called"
  echo "pack the data and bring it to a Storage Element"
  tar cvzf $OUTPUT_ARCHIVE *
  lcg-cr --vo $OUTPUT_VO -d $OUTPUT_HOST -l $OUTPUT_SE/$OUTPUT_ARCHIVE file://$PWD/$OUTPUT_ARCHIVE
  return 0
}
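For context: mpi-start sources the file named in I2G_MPI_PRE_RUN_HOOK / I2G_MPI_POST_RUN_HOOK and calls the pre_run_hook / post_run_hook functions it defines, around the actual mpirun invocation. The fragment below is only a conceptual sketch of that control flow, not the real mpi-start code.

# Conceptual sketch of the mpi-start hook mechanism (assumption, not the actual implementation)
if [ -n "$I2G_MPI_PRE_RUN_HOOK" ]; then
  . "$I2G_MPI_PRE_RUN_HOOK"     # defines pre_run_hook
  pre_run_hook || exit 1        # e.g. run grompp to build the GROMACS run input
fi
# ... mpi-start launches the parallel job here, roughly: mpirun -np $NP mdrun <args> ...
if [ -n "$I2G_MPI_POST_RUN_HOOK" ]; then
  . "$I2G_MPI_POST_RUN_HOOK"    # defines post_run_hook
  post_run_hook                 # e.g. archive the results and lcg-cr them to the Storage Element
fi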

Some MPI Applications tested: Telluride
 Telluride is an MPI-based code used to simulate many problems in the areas of solidification, fluid flow, heat transfer, phase transformations and mechanical deformation.
 The Group of Research in Oceanography and Coasts of the University of Cantabria is using this code for landscape design of harbours on the North coast of Spain.
 Portability analysis regarding: the Intel Fortran compiler ifc (F90), the associated libraries, and Open MPI itself.

Some MPI Applications tested: Telluride JDL job script

Executable = "truchas";
Arguments = "broken_dam.inp";
JobType = "openmpi";
NodeNumber = 8;
StdOutput = "std.out";
StdError = "std.err";
InputSandbox = {"my_scripts.sh","gioc","broken_dam.inp"};
OutputSandbox = {"std.out","std.err"};
Environment = {"I2G_MPI_PRE_RUN_HOOK=./my_scripts.sh","I2G_MPI_POST_RUN_HOOK=./my_scripts.sh"};

Lattice QCD
 Lattice quantum chromodynamics (lattice QCD) is a theory of quarks and gluons formulated on a space-time lattice, i.e. a discretized version of QCD.
 Analytic solutions in QCD are hard or impossible due to the nature of the forces involved.
 Most importantly, lattice QCD provides the framework for the investigation of non-perturbative phenomena such as confinement and quark-gluon plasma formation, which are intractable by means of analytic theories.
 Lattice QCD has already made contact with experiments in various fields with good results: calculations of quark masses and decay constants.

Lattice QCD
 DD-HMC algorithm for two-flavour lattice QCD. Numerical simulations of lattice QCD are still limited to light-quark masses significantly larger than their physical values. The use of efficient simulation techniques can make an important difference in this competitive field, and there has consequently been a continuous effort to improve the simulation algorithms. The DD-HMC algorithm combines domain-decomposition ideas with the Hybrid Monte Carlo algorithm.
 Low-latency interconnects required (~10 microseconds: Infiniband or Myrinet).
 Competitive simulations run for several weeks, or months, on clusters of 32-64 processors.
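As background not shown on the slide: the standard Hybrid Monte Carlo construction samples gauge configurations U with weight e^{-S[U]} by introducing conjugate momenta \pi and evolving a fictitious Hamiltonian system,

\[
H(\pi, U) = \frac{1}{2}\sum_{x,\mu} \pi_{x,\mu}^{2} + S[U],
\qquad
P(U) \propto \int \mathcal{D}\pi \; e^{-H(\pi,U)} \propto e^{-S[U]}.
\]

Roughly speaking, the DD-HMC variant splits the lattice into blocks (domains) so that most of the molecular-dynamics force computation stays local to a block; this reduces the inter-node communication per update and is why the algorithm runs competitively on commodity clusters with low-latency networks.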

MPI in farm mode: Particle trajectories in Fusion devices
 Stellarator TJ-II (Madrid): magnetic confinement device for investigating plasma properties; national research infrastructure at CIEMAT.
 Visualization of TJ-II using computational tools (OpenGL, Fox toolkit).
[Slide figures: schema of the TJ-II design and a computed visualization of TJ-II]

Computational details
 The plasma is analyzed as a many-body system consisting of N particles.
 The application visualizes the behaviour of the plasma inside a fusion device.
 Applicability: stellarator design (vacuum chamber damage, coil designs, etc.).
 Inputs: geometry of the vacuum chamber; magnetic field in the environment; initial number, position, direction and velocity of the particles; possibility of collisions between particles; density of particles inside the device.
 Solves a set of stochastic differential equations with a Runge-Kutta method (a generic form is sketched below).
 Outputs: trajectories of the particles; averages of relevant magnitudes (densities, temperatures, ...).
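The slide does not give the equations themselves; purely as an illustration, a single-particle trajectory in this kind of model typically obeys an Itô stochastic differential equation of the generic form

\[
d\mathbf{x}_t = \mathbf{a}(\mathbf{x}_t, t)\,dt + \mathbf{b}(\mathbf{x}_t, t)\,d\mathbf{W}_t,
\]

where the drift a contains the deterministic dynamics (here, motion in the TJ-II magnetic field), the diffusion term b dW_t models the random component (e.g. collisions), and the equation is integrated numerically with a stochastic Runge-Kutta scheme, with the N independent particles split across the MPI processes.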

Porting the application to int.eu.grid
 Spread the calculation over hundreds of Worker Nodes on the Grid, using MPI, to increase the number of particles simulated.
 Introduce remote visualization tools.
 Interactive steering.
 Design of a Grid collaborative environment for fusion device design.
 N particles are distributed among P processes with MPI; the particle trajectories are displayed graphically and the simulation can be steered interactively.

Application startup
 The N particles are divided among the P processors (N/P particles each).
 The master process (P0) uses mpi-start to distribute the input to the child processes (P1, P2, P3, ...).
 Input: number of particles, simulation properties (file), magnetic field background (retrieved from a Storage Element, http, ...).
[Slide figure: master process P0 distributing the input and the magnetic field background to child processes P1-P3]

Support to Environmental Applications: IMS Model Suite (IISAS)
Modelling the dispersion of pollutants in the atmosphere
 Studies the movement of individual, independent particles.
 The term "particle" denotes any air pollutant or substance (or multiple substances) in the volume of air under consideration.
 The particles travel with the wind; the particle trajectory and composition reflect natural phenomena such as turbulent diffusion, dry deposition, wet deposition caused by rain, and radioactive decay (a generic sketch of the removal step follows below).
 Work done: Grid-enabled batch sequential and MPI versions; Migrating Desktop (MD) plugin; MPI version integrated into Migrating Desktop.
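The slides do not describe the numerical scheme; purely as an illustration of how such removal processes are commonly handled in Lagrangian dispersion codes (an assumption, not a description of the IMS Model Suite internals), the mass carried by each particle can be reduced multiplicatively at every time step:

\[
m_{n+1} = m_n \, \exp\!\left[-\left(\lambda_{\mathrm{decay}} + \lambda_{\mathrm{dry}} + \lambda_{\mathrm{wet}}\right)\Delta t\right],
\]

where lambda_decay is the radioactive decay constant and lambda_dry, lambda_wet are effective dry- and wet-deposition rates. Because the particles are independent, this kind of model parallelizes naturally across MPI processes, which matches the batch sequential and MPI versions mentioned above.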