The ATLAS software in the Grid
Alessandro De Salvo <Alessandro.DeSalvo@roma1.infn.it>


The ATLAS software in the Grid
Alessandro De Salvo <Alessandro.DeSalvo@roma1.infn.it>
20-02-2008

Outline
- Overview
- The Experiment Software Structure in the Grid
- Using the ATLAS software from a Grid node
- Selecting software releases
- Using the Installation System for EGEE
- Documentation & contacts

A. De Salvo – 20 Apr 2008

The ATLAS software in the Grid: overview
- The Grid nodes are installed with standard distribution kits, using pacman from the central caches or from one of the mirrors
- Same installation as you would have on your local machine:

      pacman -get am-CERN:13.0.40

- After the installation step, each site is validated using KitValidation
- Only if the site passes the KV tests is it considered validated
- When a site is validated for a given release number, the relevant tag is published to the Information System
  - In EGEE the tag is published on the CE (we'll see later how to use it)
  - In OSG this corresponds to publishing the release number and the installation path to the information system

The Experiment Software Area
- The ATLAS software in the Grid is installed in the Experiment Software Area reserved for ATLAS
- This is a disk area shared among the Worker Nodes via a shared filesystem (NFS, AFS, GPFS, PANFS, ...)
- The user may access the software from a WN through variables defined at runtime on the WN:
  - EGEE: $VO_ATLAS_SW_DIR
  - OSG:  $OSG_APP/atlas
- Example (CERN): VO_ATLAS_SW_DIR=/afs/cern.ch/project/gd/apps/atlas/slc3
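The two environment variables above can be wrapped in one middleware-independent helper; a minimal sketch, assuming nothing beyond the variables named on this slide (the function name atlas_sw_dir is illustrative):

```shell
#!/bin/sh
# Sketch: resolve the ATLAS Experiment Software Area on either middleware.
# The function name is hypothetical; the variables are the ones listed above.
atlas_sw_dir() {
    if [ -n "$VO_ATLAS_SW_DIR" ]; then
        echo "$VO_ATLAS_SW_DIR"     # EGEE worker node
    elif [ -n "$OSG_APP" ]; then
        echo "$OSG_APP/atlas"       # OSG worker node
    else
        return 1                    # not on a Grid worker node
    fi
}

if dir=$(atlas_sw_dir); then
    echo "ATLAS software area: $dir"
else
    echo "neither VO_ATLAS_SW_DIR nor OSG_APP is set" >&2
fi
```

Checking EGEE first is an arbitrary choice; a node normally defines only one of the two variables.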

The structure of the software installations in EGEE
- Each release has a separate entry point: $VO_ATLAS_SW_DIR/software/<release_number>
  - Example: $VO_ATLAS_SW_DIR/software/13.0.40
- The entry point (logical installation) is a displaced installation of the physical release
- The physical area is located in a different place and may host multiple releases sharing the same disk area
- There are different areas for production, development and nightly releases:
  - Production:  $VO_ATLAS_SW_DIR/prod/releases
  - Development: $VO_ATLAS_SW_DIR/dev/releases
  - Nightlies:   $VO_ATLAS_SW_DIR/nightlies
- Users should never use the physical releases directly, as they could change location at any time; always use the release entry point under $VO_ATLAS_SW_DIR
- Once the release has been set up from the logical release, the variable SITEROOT points to the physical installation area
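The logical/physical split above can be illustrated with a tiny helper; a sketch, assuming only the path layout shown on this slide (the helper name is invented):

```shell
#!/bin/sh
# Sketch: build the logical entry point of a release, per the layout above.
# release_entry_point is an illustrative name, not an ATLAS tool.
release_entry_point() {
    echo "$VO_ATLAS_SW_DIR/software/$1"
}

# On a real worker node one would then do:
#   . "$(release_entry_point 13.0.40)/setup.sh"
#   echo "$SITEROOT"   # now points at the physical area, e.g. under prod/releases
echo "entry point: $(release_entry_point 13.0.40)"
```

Only the entry point is stable; SITEROOT is derived at setup time precisely so that the physical area can move without breaking user jobs.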

Using the main ATLAS releases in EGEE
- Set up the runtime environment
  - For releases < 13.0.30:

        source $VO_ATLAS_SW_DIR/software/<rel_num>/setup.sh
        cd $SITEROOT/AtlasOffline/<rel_num>/AtlasOfflineRunTime/cmt
        source setup.sh
        cd -

  - For more recent releases (> 13.0.30):

        source $VO_ATLAS_SW_DIR/software/<rel_num>/setup.sh

- Run athena as usual:

      athena <jobOption>
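The two procedures can be combined in a wrapper that picks the right sequence from the release number; a sketch, assuming a POSIX shell, with illustrative helper names and the boundary release 13.0.30 itself assumed to use the single-step setup:

```shell
#!/bin/sh
# Sketch: choose the setup sequence above based on the release number.
# is_new_style succeeds for releases >= 13.0.30 (single-step setup applies);
# treating 13.0.30 itself as new-style is an assumption.
is_new_style() {
    # numeric, field-by-field comparison of dotted release numbers
    [ "$(printf '%s\n13.0.30\n' "$1" | sort -t. -k1,1n -k2,2n -k3,3n | head -1)" = "13.0.30" ]
}

setup_release() {
    rel=$1
    . "$VO_ATLAS_SW_DIR/software/$rel/setup.sh"
    if ! is_new_style "$rel"; then
        # older releases need the extra RunTime setup step
        cd "$SITEROOT/AtlasOffline/$rel/AtlasOfflineRunTime/cmt" && . ./setup.sh
        cd - >/dev/null
    fi
}
```

Usage on a worker node would be `setup_release 13.0.40` followed by the usual `athena <jobOption>`.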

Simulating a Grid run: running Athena HelloWorld at Roma1
- From a practical point of view, running within the Grid environment and in local mode is equivalent, provided that:
  - the env var $VO_ATLAS_SW_DIR is set
  - we have access to the Experiment Software Area
- Let's simulate a Grid environment and run an Athena HelloWorld, using release 13.0.40
- Login to the Roma1 Tier2 (atlas-ui.roma1.infn.it) and use the following commands:

      # Set the Experiment Software Area
      export VO_ATLAS_SW_DIR=/opt/exp_soft/atlas
      # Setup the release
      source $VO_ATLAS_SW_DIR/software/13.0.40/setup.sh
      cd $SITEROOT/AtlasOffline/13.0.40/AtlasOfflineRunTime/cmt
      source setup.sh
      cd -
      # Start Athena
      athena AthExHelloWorld/HelloWorldOptions.py

Using patch releases
- Patch releases are shipped separately, after the main release is built
  - Main release: 13.0.40
  - Patch releases: 13.0.40.1, 13.0.40.2
- Each patch release:
  - shares the same physical area as the main release
  - shares the same release entry point as the main release
- To use a patch release the user has to:
  - set up the main release
  - set up the patch release from the AtlasProduction or AtlasPoint1 package

Using patch releases (2)
- Example: setting up AtlasProduction 13.0.40.2

      # Setup the main release
      source $VO_ATLAS_SW_DIR/software/13.0.40/setup.sh
      # Setup the patch release
      unset CMTPATH
      cd $SITEROOT/AtlasProduction/13.0.40.2/AtlasProductionRunTime/cmt
      source setup.sh
      cd -

- For recent releases (> 13.0.30) the main and patch releases can be set up in a single step:

      # Setup the patch release
      source $VO_ATLAS_SW_DIR/software/13.0.40/setup.sh -tag=13.0.40.2,AtlasProduction,runtime

Compiling user code
- User code may be compiled against an installed kit during a Grid job
- To compile user code:
  - create a test area using the create-cmthome.sh script from https://twiki.cern.ch/twiki/bin/view/Atlas/UseAtlasSoftwareProjectsKit
  - ... or use a simple requirements file
  - compile your code in your test area ($HOME/testarea)

      $> cat requirements
      set CMTSITE STANDALONE
      macro ATLAS_DIST_AREA ${SITEROOT}
      apply_tag projectArea
      macro SITE_PROJECT_AREA ${SITEROOT}
      macro EXTERNAL_PROJECT_AREA ${SITEROOT}
      apply_tag opt
      macro ATLAS_TEST_AREA ${HOME}/testarea
      use AtlasLogin AtlasLogin-* $(ATLAS_DIST_AREA)

      $> source $VO_ATLAS_SW_DIR/software/13.0.40/setup.sh
      $> cmt config
      $> source setup.sh -tag=AtlasOffline,13.0.40,opt,oneTest,runtime

Which releases do we expect to find in the Grid?
- The release installation in the Grid is centrally managed
  - Production releases are automatically pushed to the sites
  - Obsolete releases are automatically removed when they are no longer required for production
- Analysis users also play a fundamental role here: releases used for analysis need to be pinned, even if they are considered obsolete from the production point of view
- The ATLAS Installation System can manage user requests for installation, testing and removal of releases
- What may be installed in the Grid:
  - production releases (>= 11.0.X)
  - (some) development releases
  - nightly releases (also with automatic deployment)
  - production patches
  - Point1 patches
- Installation architectures
  - All the nodes are currently installed with SLC3, gcc323, 32-bit releases, independent of the underlying architecture
  - 64-bit releases are not yet officially released
  - SLC3 software must be used until the majority of the nodes in the Grid have been updated to SL4 or a compatible OS
  - SLC4 nodes may run SLC3 software, while the reverse is not true

Selecting sites with the required software release
- For each release installed, a VO software tag is published: VO-atlas-<project>-<rel_num>
  - Examples: VO-atlas-production-13.0.40, VO-atlas-production-13.0.40.2
- The VO software tags may be used in the requirements of your jobs to select the sites and resources where a given software release is available and working:

      Requirements = (Member("VO-atlas-production-13.0.40",
                             other.GlueHostApplicationSoftwareRunTimeEnvironment));

- In case of failures of a release at a site:
  - try to identify the problem
  - open a ticket in GGUS (http://www.ggus.org), possibly specifying the application you were running, the software release, the site and the node where you had the problem
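The Requirements expression above would normally sit inside a complete job description; a minimal hypothetical JDL sketch, in which the executable and sandbox file names (run_athena.sh, job.out, job.err) are invented for illustration:

```
// Minimal JDL sketch; run_athena.sh is a hypothetical user wrapper script
Executable    = "run_athena.sh";
StdOutput     = "job.out";
StdError      = "job.err";
InputSandbox  = {"run_athena.sh"};
OutputSandbox = {"job.out", "job.err"};
Requirements  = Member("VO-atlas-production-13.0.40",
                       other.GlueHostApplicationSoftwareRunTimeEnvironment);
```

With this requirement the workload management system only matches CEs that publish the VO-atlas-production-13.0.40 tag.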

Checking if a release is actually present at the sites
- Sometimes sites still publish tags even if the release is missing
- To check whether a release is actually present at a site:
  - Easy approach: check for the presence of the $VO_ATLAS_SW_DIR/software/<rel_num>/setup.sh file

        #!/bin/sh
        if [ ! -s $VO_ATLAS_SW_DIR/software/13.0.40/setup.sh ] ; then
          echo "Release not found!!!"
        fi

  - Safer approach: check the return code of "source $VO_ATLAS_SW_DIR/software/<rel_num>/setup.sh"

        #!/bin/sh
        source $VO_ATLAS_SW_DIR/software/13.0.40/setup.sh
        if [ $? -ne 0 ] ; then
          echo "Release not found!!!"
        fi
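The two checks above can be folded into a single helper; a sketch with an illustrative function name, sourcing the setup in a subshell so that a broken setup.sh cannot abort the calling job:

```shell
#!/bin/sh
# Sketch: succeed only if the release setup file exists AND sources cleanly.
# release_present is an illustrative name, not an ATLAS tool.
release_present() {
    setup="$VO_ATLAS_SW_DIR/software/$1/setup.sh"
    [ -s "$setup" ] || return 1        # easy check: file exists and is non-empty
    ( . "$setup" ) >/dev/null 2>&1     # safer check: it also sources without error
}

if release_present 13.0.40; then
    echo "release found"
else
    echo "Release not found!!!"
fi
```

The subshell is the important design choice: under a POSIX shell a failing `.` would otherwise terminate the whole script.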

The ATLAS Installation System for LCG/EGEE
- The software installations in EGEE are operated via the Installation System: https://atlas-install.roma1.infn.it/atlas_install
- Every user may browse the online status of the installations in LCG/EGEE

The ATLAS Installation System for LCG/EGEE (2)
- On-demand installation or maintenance requests may also be submitted: https://atlas-install.roma1.infn.it/atlas_install/protected/req.php
- The page is shown only if you have a valid personal certificate imported in your browser

The ATLAS Installation System for LCG/EGEE (3)
- The Installation System can also give you the following information:
  - information on the releases (Release Matrix: https://atlas-install.roma1.infn.it/atlas_install/protected/rel.php): release name, architecture, project name, installation paths, tags, ...
  - snapshots of the current tags in the Grid (Tags Matrix: https://atlas-install.roma1.infn.it/atlas_install/protected/tags.php)

The ATLAS Installation System for LCG/EGEE (4)
- Users may pin releases to prevent the Installation System from removing them
  - for example, if you need a specific release for your analysis at a minimum number of sites and want to keep it even after it has been declared obsolete
- https://atlas-install.roma1.infn.it/atlas_install/protected/pin.php

The ATLAS Installation System for LCG/EGEE (5)
- Exercise:
  - login to the Roma1 Tier2 (atlas-ui.roma1.infn.it) and set up release 13.0.40 using the logical area
  - discover the physical location of release 13.0.40
  - compare the logical and physical paths you found with what you get from the Release Matrix page in the Installation System
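The exercise steps can be sketched as a small script; it is only meaningful on a node where the release is installed, and the fallback message is illustrative:

```shell
#!/bin/sh
# Sketch of the exercise: compare the logical entry point with the physical
# location that SITEROOT reveals after setup.
export VO_ATLAS_SW_DIR=${VO_ATLAS_SW_DIR:-/opt/exp_soft/atlas}
logical="$VO_ATLAS_SW_DIR/software/13.0.40"
if [ -s "$logical/setup.sh" ]; then
    . "$logical/setup.sh"          # defines SITEROOT
    echo "logical:  $logical"
    echo "physical: $SITEROOT"     # compare with the Release Matrix page
else
    echo "release 13.0.40 is not installed on this node" >&2
fi
```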

Documentation and contacts
- Using the Distribution Kits: https://twiki.cern.ch/twiki/bin/view/Atlas/UseAtlasSoftwareProjectsKit
- In case of failures of a release at a site:
  - try to identify the problem
  - open a ticket in GGUS (http://www.ggus.org), possibly specifying the application you were running, the software release, the site and the node where you had the problem
  - for site-specific issues (EGEE only): the ATLAS Installation Team, atlas-grid-install@cern.ch
- The ATLAS Installation System:
  - main page: https://atlas-install.roma1.infn.it/atlas_install
  - on-demand installation and test requests: https://atlas-install.roma1.infn.it/atlas_install/protected/req.php
  - release pinning: https://atlas-install.roma1.infn.it/atlas_install/protected/pin.php
  - overview and status of the installed releases: https://atlas-install.roma1.infn.it/atlas_install/protected/rel.php
  - overview of the tags at the sites: https://atlas-install.roma1.infn.it/atlas_install/protected/tags.php