
Basic Grid Job Submission Alessandra Forti 28 March 2006

Outline
Grid components
Job submission
Documentation

Grid Components: RB and LB
RB = Resource Broker: the equivalent of a batch server. Jobs land on an RB, which decides, depending on the user's specifications, where to send them.
LB = Logging and Bookkeeping: handles the information about the jobs submitted through a specific RB. Normally an RB and an LB reside on the same machine, but this is not necessary: a user can submit jobs to an RB and log them on an LB at another site.

Grid Components: UI and CE
UI = User Interface: the front end where the grid clients accessible to the users reside. It has login access and can be located anywhere; a laptop with the UI software on it can access grid resources.
CE = Computing Element: the gateway to a local batch system. It handles the final authentication and authorization to access the local batch system. It can be on the same machine as the batch server, but this is not required.

Grid Components: SE
SE = Storage Element: the gateway to the data. In its simplest form it is basically a GridFTP server, but this type is now considered obsolete. It handles authentication and authorization to access the local data.
SRM = Storage Resource Manager: a protocol designed to hide from the user the implementation actually used as the backend. Backends are now more sophisticated: they have storage management tools and will support access policies based on the DN of the certificate rather than simple Unix IDs. An SRM-based system is the current form of an SE.

Grid Components: IS
IS = Information System: each site publishes information about its available resources in the IS. The IS has a hierarchical structure (top-level BDII, site BDII, service GIIS). The user only sees the top-level BDII, and this is what the RB, WN and UI see as well. Generic top-level BDIIs all contain the same information (they are replicas). However, a VO might want to run its own BDII containing only information about the resources open to it; this gives the VO control to select good sites in a way that is transparent to the user.
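If you want to see what a top-level BDII actually publishes, it can be queried directly with an LDAP search. This is only a sketch: the BDII host name below is an example, so substitute the one configured on your UI, and the Glue attributes you can ask for depend on what the sites publish.

ldapsearch -x -H ldap://lcg-bdii.cern.ch:2170 -b "o=grid" '(objectClass=GlueCE)' GlueCEUniqueID GlueCEStateFreeCPUs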

Workload Management

Job Submission: first step, log in to a UI
A desktop in the department, your laptop, lxplus at CERN... if correctly configured they should all be equivalent. You need your certificate and key in $HOME/.globus; if they have the wrong permissions the software will complain. This is to remind users to protect their certificates.

grid-proxy-init
Your identity: /C=UK/O=eScience/OU=Manchester/L=HEP/CN=alessandra forti
Enter GRID pass phrase for this identity
Creating proxy Done
Your proxy is valid until: Wed Mar 29 02:00:
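Once the proxy exists you can check how long it remains valid with grid-proxy-info; if the default 12 hours are not enough for a long job, grid-proxy-init also accepts a lifetime in hours and minutes (the option below is the standard Globus one, given here as a reminder rather than taken from the slides):

grid-proxy-info
grid-proxy-init -valid 24:00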

Job submission: the JDL language
JDL = Job Description Language. To submit a job you need to write what is called a JDL file, in which you specify the type of resources the job needs and what files the job needs to find. Unfortunately not everything can be specified: only what can be retrieved from the IS and the catalogs, and sometimes not even that if the RB can't handle it. The most simple JDL is:

cat testJob.jdl
Executable = "test.sh";
StdOutput = "testJob.out";
StdError = "testJob.err";
InputSandbox = {"./test.sh"};
OutputSandbox = {"testJob.out","testJob.err"};
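The test.sh named in the sandbox is simply the script you want to run on the worker node. The original script is not reproduced in the slides, so the following is only a hypothetical example of what it could contain:

#!/bin/sh
# Hypothetical payload: report where the job landed and which local account it runs under
echo "Running on host: $(hostname)"
echo "Mapped to local account: $(id -un)"
date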

How to list resources for a job

edg-job-list-match --vo dteam testJob.jdl
Selected Virtual Organisation name (from --vo option): dteam
Connecting to host lcgrb01.gridpp.rl.ac.uk, port 7772
********************************************************************
COMPUTING ELEMENT IDs LIST
The following CE(s) matching your job requirements have been found:

*CEId*
ce01.tier2.hep.manchester.ac.uk:2119/jobmanager-pbs-dteam
********************************************************************
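To narrow down the match (and therefore where the RB will send the job), a Requirements expression can be added to the JDL. As a sketch, the following would restrict the job to the CE found above; the exact Glue attribute names you can use depend on what the sites publish in the IS:

Requirements = other.GlueCEUniqueID == "ce01.tier2.hep.manchester.ac.uk:2119/jobmanager-pbs-dteam";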

Submit a Job

edg-job-submit --vo dteam testJob.jdl
Selected Virtual Organisation name (from --vo option): dteam
Connecting to host lcgrb01.gridpp.rl.ac.uk, port 7772
Logging to host lcgrb01.gridpp.rl.ac.uk, port 9002
******************************************************************************
JOB SUBMIT OUTCOME
The job has been successfully submitted to the Network Server.
Use edg-job-status command to check job current status.
Your job identifier (edg_jobId) is:
-
******************************************************************************
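Every later command needs the job identifier, so it is convenient to have it written to a file at submission time. Assuming the -o/--output option of the EDG commands (not shown in the slides), this would look like:

edg-job-submit -o myjobs.txt --vo dteam testJob.jdl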

Check the job status

edg-job-status
*************************************************************
BOOKKEEPING INFORMATION:
Status info for the Job :
Current Status: Scheduled
Status Reason: Job successfully submitted to Globus
Destination: lcg-ce.ecm.ub.es:2119/jobmanager-pbs-dteam
reached on: Tue Mar 28 13:16:
*************************************************************
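If the identifier was saved to a file as in the submission sketch above, the status command can read it back from there instead of taking the identifier on the command line (again assuming the -i/--input option):

edg-job-status -i myjobs.txt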

Cancel a Job

edg-job-cancel
Are you sure you want to remove specified job(s)? [y/n]n : y
=============== edg-job-cancel Success ================
The cancellation request has been successfully submitted for the following job(s):
-
========================================================

This command only works for jobs that have already been scheduled or are already running.

Check status of all jobs

edg-job-status --all --vo dteam
Selected Virtual Organisation name (from --vo option): dteam
Retrieving Information from LB server lcgrb01.gridpp.rl.ac.uk:9000
Please wait: this operation could take some seconds.
*************************************************************
BOOKKEEPING INFORMATION:
Status info for the Job :
Current Status: Scheduled
Status Reason: Job successfully submitted to Globus
Destination: ce01.tier2.hep.manchester.ac.uk:2119/jobmanager-pbs-dteam
reached on: Tue Mar 28 13:32:
*************************************************************
BOOKKEEPING INFORMATION:
Status info for the Job :
Current Status: Running
Status Reason: Job successfully submitted to Globus
Destination: grid-ce.physik.uni-wuppertal.de:2119/jobmanager-lcgpbs-large
reached on: Tue Mar 28 13:44:
*************************************************************

Retrieve the output

edg-job-get-output
Retrieving files from host: lcgrb01.gridpp.rl.ac.uk ( for )
********************************************************************
JOB GET OUTPUT OUTCOME
Output sandbox files for the job:
-
have been successfully retrieved and stored in the directory:
/tmp/jobOutput/aforti_pCsb-7yWvzSGHfr_jCqTqQ
********************************************************************
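By default the sandbox is stored under /tmp/jobOutput as shown above; a directory of your choice can be requested instead (a sketch, assuming the --dir option of edg-job-get-output and the job-id file from the earlier sketches):

edg-job-get-output --dir $HOME/gridoutput -i myjobs.txt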

The Output

ls -la /tmp/jobOutput/aforti_pCsb-7yWvzSGHfr_jCqTqQ
total 32
drwxrwxr-x 2 aforti aforti 4096 Mar 28 14:49 ./
-rw-rw-r-- 1 aforti aforti    0 Mar 28 14:49 testJob.err
-rw-rw-r-- 1 aforti aforti      Mar 28 14:49 testJob.out

Documentation
LCG main page
Latest version of the LCG User Manual