Launching ATLAS Jobs to either the US-ATLAS or EDG Grids using GRAPPA
Jerry Gieraltowski – August 30, 2002

Goal: Use GRAPPA to launch a job to one or more grid servers which may be part of the US-ATLAS grid or the EDG Testbed grid. How are these grids different?

Comparing the US-ATLAS grid and the EDG Testbed

US-ATLAS: Jobs destined for execution on a site in the US-ATLAS grid would be launched by GRAPPA as a "globus" job (e.g. globus-job-run) to a gatekeeper server at that site. The job would execute on the gatekeeper or be farmed out to other Globus client servers at the site.

EDG Testbed: Jobs destined for execution on a site in the EDG Testbed would be launched by GRAPPA as a "globus" job (e.g. globus-job-run) to a UI (User Interface) server at that site. Once the job reaches that server, a JDL script would be created for the job, and the job would subsequently be submitted to an EDG Resource Broker for execution on some node in the Testbed.
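A rough sketch of the two launch paths in shell terms. The host names, jobmanager names, and script names are placeholders for illustration, not GRAPPA's actual configuration, and the wrapper scripts are assumed to be already staged on the remote hosts:

    # US-ATLAS path: run the job directly against the site gatekeeper
    # (hypothetical contact string and wrapper script)
    globus-job-run atlas-gk.example.edu/jobmanager-pbs /bin/sh run_atlfast.sh

    # EDG Testbed path: land on the UI host first; a helper script there
    # builds a JDL file and hands the job to the Resource Broker
    # (hypothetical UI host and helper script)
    globus-job-run edg-ui.example.org/jobmanager-fork \
        /bin/sh submit_via_rb.sh run_atlfast.sh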

Minimum Server Functionality Needed?

Minimum functionality needed on both the US-ATLAS server and the EDG Testbed UI server:
- Operating System: Linux – RedHat 6.2/7.2 (prefer 7.2)
- Grid Middleware: Globus 2.0 gatekeeper, PACMAN, GRAPPA; EDG only: Release 1.2.x
- Java Software: jdk1.3.x [j2sdk1.4.0 will NOT work!]
- Application Software: boxed (for now) versions of "released" ATLAS software (3.0.1 or higher)
- User Certificate:
  - EDG Testbed: DOESciencegrid certificate and registration in the ATLAS VO
  - US-ATLAS: DOESciencegrid certificate and grid-mapfile entry
- Signing Policies: union of the US-ATLAS and EDG signing policies
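For the US-ATLAS case, the grid-mapfile entry referred to above is a single line mapping the user's certificate subject to a local account. A purely illustrative entry (the DN and account name below are made up):

    # /etc/grid-security/grid-mapfile (illustrative entry only)
    "/O=doesciencegrid.org/OU=People/CN=Some User 12345" usatlas1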

Changes Needed to GRAPPA Functionality?

The basic functionality provided today by the Athena notebook within GRAPPA will be relatively unchanged for the boxed version of "atlfast". The local and AFS versions of "atlfast", and all versions of "atlsim", will be addressed at a later date. The following changes are expected to be "relatively" easy to incorporate:
- Make the executable work for either RedHat 6.2 or RedHat 7.2.
- If the host that GRAPPA sends the job to is an EDG node, create an appropriate JDL script which will submit the job and wait until a state of "OutputReady" is obtained before exiting (see the sketch below).
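A minimal sketch of that JDL-creation-and-wait step, assuming the EDG 1.x command-line tools on the UI node. The dg-job-* command names, the JDL contents, the file names, and the job-ID parsing are assumptions for illustration, not GRAPPA's actual code:

    #!/bin/sh
    # Build a minimal JDL file for the boxed atlfast job (file names are
    # placeholders), submit it to the Resource Broker, wait for
    # "OutputReady", then fetch the output sandbox.
    cat > atlfast.jdl <<'EOF'
    Executable    = "run_atlfast.sh";
    StdOutput     = "atlfast.out";
    StdError      = "atlfast.err";
    InputSandbox  = {"run_atlfast.sh", "jobOptions.txt"};
    OutputSandbox = {"atlfast.out", "atlfast.err"};
    EOF

    # dg-job-submit prints a job identifier of the form https://... ;
    # the grep below is one (assumed) way to pick it out.
    JOBID=$(dg-job-submit atlfast.jdl | grep -o 'https://[^[:space:]]*')
    [ -n "$JOBID" ] || { echo "submission failed" >&2; exit 1; }

    # Poll the broker until the output is ready, then retrieve it.
    while ! dg-job-status "$JOBID" | grep -q 'OutputReady'; do
        sleep 60
    done
    dg-job-get-output "$JOBID"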

Any Other Changes Needed?

Initial testing has shown that what should have been "relatively easy" is never true in reality!
- GRAPPA's application notebook code has been written in several languages: bash, csh, python, perl. You must be careful that the change you are making is appropriate for that language.
- The code MUST be more aware of, and tolerant of, error conditions. You cannot assume that every server has defined its environment variables in the same way – or has defined them at all (see the check below).
- Problems with the EDG job submission service in Release 1.2.x allow you to submit and complete a job but NOT retrieve any of its output (Bugzilla bug #551, reported Aug. 22 by Jerry Gieraltowski).
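The environment-variable point amounts to adding defensive checks at the top of each wrapper script. A small sketch; the variable names below are only examples, not the exact variables GRAPPA relies on:

    #!/bin/sh
    # Fail early, with a clear message, if the expected environment is
    # missing, rather than assuming every server defines it identically.
    for var in ATLAS_ROOT ATLAS_RELEASE; do
        eval "val=\${$var:-}"
        if [ -z "$val" ]; then
            echo "ERROR: required environment variable $var is not set" >&2
            exit 1
        fi
    done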

Where Am I Today?

[Special thanks to Shava Smallen and Dan Engh for their patience and consultation!]
- The ANL server – atlas10.hep.anl.gov – has been loaded with the EDG UI Release rpms. It has also been configured as a Globus gatekeeper.
- I have created a local copy on atlas14.hep.anl.gov of the input files needed to run the boxed version, and have modified them to be somewhat flexible with respect to RedHat releases and environment variable settings.
- I have almost completed the JDL-creation script used when the host server is recognized as an EDG node.
- I have submitted several trial scripts to the CERN UI server (testbed010.cern.ch) using a DOESciencegrid certificate. The jobs complete, but I still cannot get back any output.