ATLAS Software Distribution


Grigori Rybkine, Royal Holloway, University of London, on behalf of ATLAS

Overview
- ATHENA
- Configuration Management
- ATLAS Releases and Distribution Kit
- Distribution Kit Use Cases
- Distribution Kit Packaging and Tools
- Distribution Kit Repositories
- Distribution Kit and RPMs
- Package Dependencies
- Problems
- Used from AA/SPI
- Distribution Kit Statistics

ATHENA
- ATLAS offline software is based on the ATHENA framework: http://atlas.web.cern.ch/Atlas/GROUPS/SOFTWARE/OO/architecture/index.html
- ATHENA is based on the GAUDI data processing applications framework, originally developed by LHCb but now a joint development project: http://proj-gaudi.web.cern.ch/proj-gaudi/welcome.htm

Configuration Management
- ATLAS uses the Configuration Management Tool (CMT), http://www.cmtsite.org, to manage the configuration and building of its software and to set up the user environment
- CMT supports the decomposition of the software into packages, and into groups of packages called projects
- external packages are interfaced to CMT by defining a glue package, where the configuration specifications for the external package are detailed; in particular, the glue package specifies how the external package is packed
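As a rough illustration of the glue-package idea, a CMT requirements file for an external package might look like the sketch below. The package name, version, paths and macro names are hypothetical, not taken from the actual ATLAS glue packages:

```
# hypothetical glue package interfacing an external CLHEP installation to CMT
package AtlasCLHEP

use AtlasPolicy AtlasPolicy-*

# where the external package is installed (illustrative path)
macro CLHEP_home "/afs/cern.ch/sw/lcg/external/clhep/1.9.2.3/$(CMTCONFIG)"

# compiler and linker configuration exported to client packages
include_dirs "$(CLHEP_home)/include"
macro CLHEP_linkopts "-L$(CLHEP_home)/lib -lCLHEP"

# a macro of this kind could tell the kit-building scripts which
# directories of the external package to pack into the tarball
macro CLHEP_export_paths "$(CLHEP_home)/lib $(CLHEP_home)/include"
```

Client packages would then simply `use AtlasCLHEP` and inherit the include and link options, while the kit-building scripts read the packing macros to decide what goes into the external-package tarball.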

ATLAS Releases and Distribution Kit
- http://atlas-computing.web.cern.ch/atlas-computing/projects/releases/releases.php
- the release used to be monolithic, sub-divided into CMT packages; it is now split into CMT projects: AtlasCore, AtlasEvent, AtlasTrigger, AtlasSimulation, etc.
- releases are currently built regularly for i686, Scientific Linux CERN 3, gcc 3.2.3, in dbg and opt modes
- after a release has been built, a distribution kit is made

Distribution Kit Use Cases
- running the application (on the grid, at a remote centre, on a disconnected laptop, etc.)
- software development (at a remote centre, on a disconnected laptop, etc.)
- software installations at CERN from the distribution kit (so that all use the same distribution)
- using offline software alongside the TDAQ and HLT software at Point 1 (the location of the ATLAS detector and the trigger computing farms)

Distribution Kit Packaging and Tools
- built with a suite of shell and Python scripts: http://atlas-sw.cern.ch/cgi-bin/viewcvs-atlas.cgi/offline/Deployment
- comprises compressed tarballs and "pacman" meta-data files (where to fetch from, dependencies, pre- and post-install scripts, etc.) for Pacman, http://physics.bu.edu/pacman
- accessible via HTTP from the repositories
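The kit-building scripts themselves are not shown in the slides; the fragment below is only a minimal sketch of the two artifacts they produce per package, a compressed tarball and an accompanying Pacman metadata file. All names, URLs and metadata field names here are illustrative assumptions, not copied from the real kit:

```shell
#!/bin/sh
# Sketch: produce a kit tarball plus Pacman metadata for one package.
# Package name, version, URL and metadata fields are hypothetical.
set -e

PKG=AtlasExamplePkg
VER=11.3.0
STAGE=./stage/$PKG

# stage the files that belong in the kit (here just a placeholder)
mkdir -p "$STAGE/InstallArea"
echo "demo payload" > "$STAGE/InstallArea/example.so"

# the compressed tarball for this package
tar -czf "${PKG}-${VER}.tar.gz" -C ./stage "$PKG"

# the Pacman metadata: where to fetch from, dependencies, post-install
# step (field names follow Pacman's general style, hedged)
cat > "${PKG}-${VER}.pacman" <<EOF
description ('${PKG} ${VER} (opt)')
package ('AtlasDependencyPkg-${VER}')
downloadUntar ('http://example.cern.ch/kits/${PKG}-${VER}.tar.gz')
shell ('echo post-install configuration step')
EOF

ls "${PKG}-${VER}.tar.gz" "${PKG}-${VER}.pacman"
```

Serving the resulting `.tar.gz` and `.pacman` files from a plain web server is what makes the repositories in the next slide accessible via HTTP.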

Distribution Kit Repositories
- monolithic release kits:
  http://atlas.web.cern.ch/Atlas/GROUPS/SOFTWARE/OO/pacman/cache/ (RedHat 7.3)
  http://atlas.web.cern.ch/Atlas/GROUPS/SOFTWARE/OO/pacman/slc3/cache/ (SLC 3.0.x)
- project-based release kits:
  http://atlas.web.cern.ch/Atlas/GROUPS/SOFTWARE/OO/pacman/projects/cache/ (SLC 3.0.x)

Distribution Kit Tarball Types
- monolithic releases:
  for each CMT package, 1 tarball comprising the binaries (optimised) and the platform-independent files, including header files but excluding the other source code files and documentation
- project-based releases:
  for each CMT project, separate tarballs containing the binaries (optimised), the binaries (debug), the source code excluding header files, the documentation, and the other platform-independent files
  for each external package (contents specified in the glue packages with special macros): 1 tarball containing the binaries (optimised) and platform-independent files, and, if debug binaries are available, 1 tarball containing them

Distribution Kit and RPMs
- packaging as RPMs is also being considered; RPMs may be more suitable to
  meet the requirements for software installation at Point 1 (the location of the ATLAS detector and the trigger computing farms)
  meet requirements from some grid sites

Package Dependencies
- at distribution kit build time:
  dependencies of CMT packages (monolithic releases) or of CMT projects (project-based releases) on each other, as well as on the external packages, are specified based on CMT
  dependencies of the separate tarballs for each CMT project (project-based releases) are specified as:
    project opt -> project noarch + project dependencies opt + CMT
    project src -> project noarch
    project dbg -> project src + project dependencies dbg + CMT
    project doc, noarch (no dependencies)
- at installation time:
  dependencies are resolved by Pacman
  Pacman also simplifies installation updates
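The "project opt" dependency pattern above could be expressed in Pacman metadata roughly as in the following fragment. The project names, version, URL and field syntax are illustrative guesses in the general style of Pacman files, not the actual kit metadata:

```
# hypothetical AtlasEvent-11.3.0-opt.pacman
description ('AtlasEvent 11.3.0 optimised binaries')
package ('AtlasEvent-11.3.0-noarch')   # project noarch part
package ('AtlasCore-11.3.0-opt')       # dependency project, opt tarball
package ('CMT-v1r18p20050501')         # CMT itself
downloadUntar ('http://example.cern.ch/kits/AtlasEvent-11.3.0-opt.tar.gz')
```

When a user asks Pacman for the opt kit of one project, the `package` entries let it pull in the noarch part, the opt kits of the dependency projects and CMT transitively, which is what keeps the separate tarballs installable in any order.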

Problems (1/2)
- have had some issues with Pacman (e.g. installation failures on NFS-mounted partitions, execution speed, diagnostic messages); several problems and feature requests in the past have had a good response from the author, who is working on ATLAS
- problems related to the packaging of external software (some files missing from the kit)
- some non-ATLAS-specific libraries missing at grid or remote sites: e.g. libshift (castor-devel) is part of the kit, but libX11 (XFree86-devel), libnsl, libcrypt, libdl (glibc-devel), libg2c (gcc-g77-ssa) are not

Problems (2/2)
- requirement from some grid sites to install the software as RPMs
- running and development on platforms other than SLC3 (and SL3, RHEL3)
- no mapping of the LCG and External software onto the offline kits (debug, src, etc.)

Used from AA/SPI
- LCG and External software installations on AFS (when building the distribution kit)
- tarballs of LCG and External software (by those rebuilding ATLAS software on other platforms)
- if the LCG (and External) software were provided packaged (as tarballs and RPMs) so that the headers, optimised and debug builds, source code and documentation for a platform were not all in one piece, we could use those packages

Distribution Kit Statistics
- a typical distribution kit of the offline software (optimised mode) consists of 18 ATLAS project packages, 2 GAUDI packages, the LCGCMT package, the CMT package and ~50 external packages
- size of the release 11.3.0 installation (opt): 4.4G total
  ATLAS software: 2G
  ATLAS data: 1.1G
  external: 1.3G
- more information is available at https://uimon.cern.ch/twiki/bin/view/Atlas/InstallingAtlasSoftware