
SEE-GRID-SCI REFS application: NOA
The SEE-GRID-SCI initiative is co-funded by the European Commission under the FP7 Research Infrastructures contract.
PSC03, January 2009, Bucharest
Kotroni V., Floros V., Lagouvardos K., National Observatory of Athens

Slide 2: Objectives

Technical objectives:
- Develop two operational regional ensemble model chains, based on BOLAM and MM5, using initial and boundary conditions from the global ensemble model members of NCEP (USA).
- Develop, together with SEWA and the Hydrometeorological Institute of Montenegro, the post-processing that will gather all ensemble members and produce a super-ensemble.

Slide 3: REFS – Progress

- Developed a gLite UI on which we installed MPICH 1.2.7, PGI 5.2 and GANGA 5.
- Compiled the BOLAM and MM5 models on the UI.
- Developed the scripting for the model workflow (see the sketch below):
  - downloading the initial and boundary conditions from NCEP;
  - decoding and preprocessing these data (creating the initial and boundary conditions for the models);
  - running the model.
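To make the three workflow steps concrete, here is a minimal Python sketch of such a driver script; the NCEP URL, the file-name pattern and the preprocess.sh/run_model.sh helpers are hypothetical placeholders, not the actual NOA scripts.

```python
#!/usr/bin/env python
# Minimal sketch of the daily pre-run workflow: fetch the NCEP initial and
# boundary conditions, preprocess them, run the model. All host names,
# file names and helper scripts here are illustrative assumptions.
import subprocess
import sys

NCEP_URL = "ftp://nomads.ncep.noaa.gov/ensemble"  # hypothetical data source
MEMBERS = range(1, 11)                            # the 10 ensemble members

def run(cmd):
    """Run one workflow step, aborting on failure."""
    print("+ " + " ".join(cmd))
    if subprocess.call(cmd) != 0:
        sys.exit("step failed: %s" % cmd[0])

for m in MEMBERS:
    grib = "gens_member_%02d.grb" % m
    # 1. download the initial/boundary conditions for this member
    run(["wget", "-q", "-O", grib, "%s/%s" % (NCEP_URL, grib)])
    # 2. decode and preprocess (create the model ICs/BCs)
    run(["./preprocess.sh", grib])
    # 3. run the model for this member
    run(["./run_model.sh", str(m)])
```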

Slide 4: REFS – Progress

Developed the REFS execution workflow for the BOLAM and MM5 models.

Local job management: handled by a Python module that uses GANGA to submit a set of jobs to the grid. It prepares a compound grid job made up of a number of subjobs, one for each ensemble member to be run, and (a) prepares the ensemble members, (b) submits the jobs to the grid, and (c) periodically monitors the execution until all members complete, either successfully or unsuccessfully.

Execution on the grid: when a job arrives on the WN it executes the LILO (lead-in/lead-out) script, which is responsible for (a) preparing the environment for the model, (b) retrieving the model binaries, (c) unpacking them, (d) setting the appropriate environment variables, (e) executing the model workflow and (f) finally storing the results on the LFC.
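The slides describe the job-management module only at this level; below is a minimal sketch of what it could look like with GANGA's public API, run inside a GANGA session (e.g. `ganga submit_refs.py`). The LILO script and the 10 ensemble members come from the slides; concrete names such as lilo.sh are assumptions.

```python
# Sketch of the GANGA-based ensemble submission. Inside a GANGA session,
# Job, Executable, File, ArgSplitter and LCG are predefined GPI objects.
import time

j = Job(name="REFS-ensemble")
j.application = Executable(exe=File("lilo.sh"))   # the lead-in/lead-out script
j.backend = LCG()                                 # gLite/LCG grid backend
# one subjob per ensemble member, passing the member number as an argument
j.splitter = ArgSplitter(args=[[str(m)] for m in range(1, 11)])
j.submit()

# periodically monitor until all members complete, successfully or not
while j.status not in ("completed", "failed", "killed"):
    done = sum(1 for s in j.subjobs if s.status in ("completed", "failed"))
    print("%d/%d members finished" % (done, len(j.subjobs)))
    time.sleep(300)
print("final status: %s" % j.status)
```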

Slide 5: REFS – Progress

- This procedure has been developed for both the BOLAM and MM5 models.
- The SEE-GRID-SCI infrastructure is used, namely the HellasGrid sites for the moment.
- Systematic testing of the response of the SEE-GRID VO infrastructure to daily runs of the whole BOLAM regional ensemble prediction workflow: if the initial and boundary conditions are completely FTP'ed from NCEP (USA), all 10 ensemble members finish successfully.
- Testing of the response of the SEE-GRID VO infrastructure to daily runs of the whole MM5 regional ensemble prediction workflow: even if the initial and boundary conditions are completely FTP'ed from NCEP (USA), it is quite rare that all 10 ensemble members finish successfully.

Slide 6: REFS – Progress

- A user manual, "Regional Ensemble Forecast System (REFS) Users Manual", was prepared and distributed to the partners for comments two days ago.
- A proposal for the Meteo applications storage and file schema has also been prepared and discussed during the 1st EVO meeting (with a contribution by B. Marovic).
- On 12/1/09 the LFC space for the REFS applications was prepared, and the update scripts that generate the daily output folders were activated. These scripts are run by a cron job on the UI at NOA (see the sketch below).
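A minimal sketch of what such a daily folder-creation script could look like, using the standard LFC client command lfc-mkdir; the crontab line, script path, LFC host, date format and region directory are assumptions.

```python
#!/usr/bin/env python
# Sketch of the daily LFC folder-creation script run from cron on the UI,
# e.g. with a crontab entry such as:
#   30 2 * * * /home/refs/bin/make_lfc_dirs.py
# The LFC host, date format and region directory are assumptions.
import os
import subprocess
import time

os.environ["LFC_HOST"] = "lfc.hellasgrid.gr"  # hypothetical LFC host
BASE = "/grid/meteo.see-grid-sci.eu/REFS/data/SEEurope"
today = time.strftime("%Y%m%d")               # date format assumed

for sub in ("rawoutput", "failed", "artifacts"):
    # -p creates any missing parent directories, like mkdir -p
    subprocess.call(["lfc-mkdir", "-p", "%s/%s/%s" % (BASE, sub, today)])
```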

Slide 7: REFS – Progress

Proposal for the Meteo applications storage scheme. The LFC will be used for storing:

- Various data files produced by the models:
  - raw output (the output produced daily by each ensemble member of a model),
  - post-processed output, and
  - intermediate results from failed model executions.
- Software packages, containing:
  - the model binary executables,
  - additional required scripts and auxiliary data.

These packages are fetched from the LFC to the WN before the model execution (a lead-in sketch follows).
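A minimal sketch of that lead-in step on the worker node, assuming the standard lcg-cp data-management command; the LFN and package name below are hypothetical.

```python
#!/usr/bin/env python
# Sketch of the LILO lead-in phase on the WN: fetch the model software
# package from grid storage and unpack it. The LFN below is hypothetical.
import os
import subprocess
import sys

VO = "meteo.see-grid-sci.eu"
LFN = "lfn:/grid/%s/REFS/shared/software/mm5/mm5-package.tgz" % VO
DEST = "file://%s/package.tgz" % os.getcwd()

# copy the package from grid storage (resolved via the LFC) to the WN
if subprocess.call(["lcg-cp", "--vo", VO, LFN, DEST]) != 0:
    sys.exit("failed to fetch the software package from the LFC")

# unpack the model binaries, scripts and auxiliary data
subprocess.call(["tar", "xzf", "package.tgz"])
```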

Slide 8: REFS – Progress

Proposed LFC directory layout (placeholder fields in angle brackets):

/grid
  /meteo.see-grid-sci.eu
    /REFS
      /data
        /<region>
          /rawoutput
            /<date>
              <site>-<model>-RAW-<date>-<member>.tgz
          /failed
            /<date>
              <site>-<model>-FAIL-<date>-<member>.tgz
          /artifacts
            /<date>
              <site>-<model>-<date>.tgz
      /shared
        /software
          /<model>
            <package>-<version>

Example: the output produced by the 7th member of the MM5 ensemble forecast run of 27 November 2008 is named NOA-MM5-RAW-<date>-<member>.tgz and stored under the directory /grid/meteo.see-grid-sci.eu/REFS/data/SEEurope/rawoutput/.
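Under this scheme, registering a finished member's raw output could look like the following sketch, using the standard lcg-cr command; the date format, the exact folder layout and the local file path are assumptions.

```python
#!/usr/bin/env python
# Sketch: copy one ensemble member's raw output to grid storage and
# register it in the LFC under the proposed REFS scheme. The date format
# and the local file path are illustrative assumptions.
import subprocess
import sys

VO = "meteo.see-grid-sci.eu"

def register_raw_output(local_file, site, model, date, member):
    """Register local_file as <site>-<model>-RAW-<date>-<member>.tgz."""
    name = "%s-%s-RAW-%s-%s.tgz" % (site, model, date, member)
    lfn = "lfn:/grid/%s/REFS/data/SEEurope/rawoutput/%s/%s" % (VO, date, name)
    if subprocess.call(["lcg-cr", "--vo", VO, "-l", lfn,
                        "file://%s" % local_file]) != 0:
        sys.exit("registration failed for %s" % name)
    return lfn

# e.g. the 7th MM5 member of the 27 November 2008 run
register_raw_output("/tmp/NOA-MM5-raw.tgz", "NOA", "MM5", "20081127", "07")
```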

Slide 9: REFS – Problems

(1) Even if the initial and boundary conditions for MM5 are completely FTP'ed from NCEP (USA), it is quite rare that all 10 ensemble members finish successfully. We are therefore working on different compilations of MPI, and consequently of MM5, in order to resolve the problem.

(2) The NOMADS servers from which we FTP the initial and boundary conditions are not very stable; sometimes the data are not available. A high-availability server has been created at NCEP that provides the needed data, but in a new format (GRIB2 instead of GRIB1), so some rescripting is needed (a possible conversion step is sketched below).
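One way the rescripting could absorb the format change without touching the downstream GRIB1 decoders is to convert the files on arrival, for example with NCEP's cnvgrib utility; the sketch below assumes cnvgrib is installed on the UI and that the downloaded files carry a .grb2 suffix.

```python
#!/usr/bin/env python
# Sketch: convert freshly downloaded GRIB2 files to GRIB1 so the existing
# GRIB1 preprocessing scripts keep working. Assumes NCEP's cnvgrib tool
# is installed; the .grb2 naming convention is an assumption.
import glob
import subprocess
import sys

for grib2 in glob.glob("*.grb2"):
    grib1 = grib2[:-5] + ".grb"
    # cnvgrib -g21 converts GRIB2 input to GRIB1 output
    if subprocess.call(["cnvgrib", "-g21", grib2, grib1]) != 0:
        sys.exit("conversion failed for %s" % grib2)
```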

Slide 10: Activities and Time-Plan

Up to M12:
- improve the scripts for BOLAM and MM5;
- use the new GRIB2 initial data from NCEP;
- systematically check the pre-operational runs of these two models on the grid;
- start working on the post-processing of all four model outputs towards the super-ensemble.