Evolution of the ATLAS Nightly Build System
A. Undrus, on behalf of the ATLAS Collaboration
Brookhaven National Laboratory, USA

Abstract
Over more than 10 years of development the ATLAS Nightly Build System has evolved into a factory for automatic release production and grid distribution. The numerous branches of ATLAS releases provide vast opportunities for testing new packages, verifying patches to existing software, and migrating to new platforms and compilers for ATLAS code, which currently comprises ~2200 packages with 4 million lines of C++ and 1.4 million lines of Python written by ~1000 developers. The nightly releases lead up to the stable releases used for data processing and analysis worldwide. The ATLAS Nightly System is managed by the NICOS control tool [1] on the ATLAS Build Farm. The ATN testing framework [2] runs unit and integration tests on the nightly releases.

ATLAS nightlies in numbers
- Number of branches: 50
- Total number of platforms in all branches: 70
- Led up to 379 stable releases in 2011
- Nightlies computing farm: 50 8-processor nodes, GB RAM; build parallelism via distcc and CMT [3] (sketched below)
- Time to rebuild a nightly release: 10 hours
- Number of ATN tests: ~450
- Additional time to complete ATN tests: 5 hours
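The farm's throughput comes from layering package-level parallelism across the build nodes on top of distcc's distributed compilation within each build. The sketch below illustrates that layering only; the package names, paths, and orchestration code are hypothetical stand-ins, not NICOS internals.

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

# Hypothetical leaf packages and checkout layout -- not real NICOS paths.
PACKAGES = ["Control/AthenaKernel", "Event/EventInfo"]

def build(package):
    """Run make for one package, letting distcc fan the compilations
    out over the farm (file-level parallelism within the package)."""
    return subprocess.call(
        ["make", "-j8", "CC=distcc gcc", "CXX=distcc g++"],
        cwd=f"/build/nightly/{package}/cmt",
    )

# Package-level parallelism across the 8-processor nodes, layered on top
# of distcc's distributed compilation inside each make invocation.
with ThreadPoolExecutor(max_workers=8) as pool:
    codes = list(pool.map(build, PACKAGES))

failed = [p for p, rc in zip(PACKAGES, codes) if rc != 0]
print("build failures:", failed or "none")
```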


Nightly Framework
ATLAS collaborative tools connections:
- CMT [3] code management and build tool
- Tag Collector [4], a web-based tool for managing the tags of packages in a release
- ATLAS SVN code repository
- ATLAS metadata database (AMI) [5] for storage of NICOS configurations for the different nightly branches
- Download availability with ATLAS distribution kit tools [6]
- Worldwide access on the CERN AFS or CernVM-FS [7] distributed file systems
- ATN [2] and RTT [8] testing frameworks
- ATLAS Installation System [9]
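The poster states that per-branch NICOS configurations are stored in the AMI database [5] but does not show their schema. The following is a hypothetical sketch, with invented field names, of the kind of information such a branch configuration ties together:

```python
# Hypothetical sketch of a per-branch NICOS configuration; the poster says
# these are stored in AMI [5], but every field name below is invented here.
NIGHTLY_BRANCH = {
    "name": "dev",                                   # one of the ~50 branches
    "platforms": ["x86_64-slc5-gcc43-opt",
                  "x86_64-slc5-gcc43-dbg"],          # assumed platform tags
    "source": {"repository": "ATLAS SVN",
               "tags_from": "Tag Collector"},        # [4]
    "build": {"tool": "CMT", "distcc": True},        # [3]
    "tests": {"framework": "ATN", "count": 450},     # [2]
    "publish": ["distribution kits", "CERN AFS", "CernVM-FS"],
}

print(f"branch {NIGHTLY_BRANCH['name']}: "
      f"{len(NIGHTLY_BRANCH['platforms'])} platforms")
```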
Growth Stages of the Nightly System
[Figure: Nightly System growth graphs]

Establishment of the System
- Bug-fix and development branches only
- A few tens of ATN tests
- ~800 leaf packages
- No parallelism in builds and tests

Accelerating Growth
- 30 nightly branches
- 300 ATN tests, high success rate
- ~2000 leaf packages
- Build parallelism
- Sophisticated software validation

Since 2010: Refinement of the System
- Stable number of packages and tests
- High parallelism in builds and tests
- 98% reliability

Wide assortment of nightly branches of different scopes and purposes:
- Migration branches: isolated development environment
- Personal branches: for bleeding-edge development
- Analysis branches: code collections for physicists (for use with stable releases)
- Patch branches: overrides to stable releases
- Experimental branches: testing new operating systems and compilers
- Integration branches: current code status

References
[1] A. Undrus, CHEP'03, La Jolla, USA, 2003, eConf C, TUJT006 [hep-ex/]; F. Luehring et al., 2010 J. Phys.: Conf. Ser.
[2] A. Undrus, CHEP'04, Interlaken, 2004, conference proceedings, p. 521
[3] C. Arnault, CHEP'01, Beijing, 2001
[4]
[5] S. Albrand et al., 2010 J. Phys.: Conf. Ser.
[6] C. Arnault, CHEP'04, Interlaken, 2004; G. Rybkine, "ATLAS software packaging", to be published in CHEP 2012 proceedings
[7] J. Blomer et al., 2011 J. Phys.: Conf. Ser.
[8] B. Simmons et al., 2010 J. Phys.: Conf. Ser.
[9] A. De Salvo et al., 2008 J. Phys.: Conf. Ser.

[Diagram: NICOS framework. Nightly configuration (AMI DB) and code management (Tag Collector, SVN repository) feed the nightly release CMT build; ATN testing and external testing (ATLAS Run Time Tester (RTT), validation in the ATLAS Installation System) follow; distribution goes out via kits, CERN AFS, CernVM-FS, and the ATLAS Tier centers, with results reported on the NICOS web pages.]
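To make the diagram's flow concrete, here is a minimal runnable sketch of the stages in order. None of these functions are real NICOS interfaces; each body is a placeholder print standing in for the stage named in its comment.

```python
# Minimal sketch of the nightly flow in the diagram above; all functions
# here are hypothetical stand-ins, not NICOS APIs.

def read_configuration_from_ami(branch):
    """NIGHTLY CONFIGURATION: per-branch settings kept in AMI [5]."""
    print(f"reading NICOS configuration for '{branch}' from AMI")
    return {"branch": branch}

def checkout(cfg):
    """CODE MANAGEMENT: package tags from Tag Collector, code from SVN."""
    print("collecting package tags and checking out from SVN")

def build(cfg):
    """CMT build of the nightly release, with distcc parallelism."""
    print("building nightly release with CMT")
    return f"nightly-{cfg['branch']}"

def test(release):
    """ATN unit/integration tests, then hand-off to external RTT testing
    and validation in the ATLAS Installation System."""
    print(f"running ATN tests on {release}; triggering RTT")

def distribute(release):
    """DISTRIBUTION: kits, CERN AFS, CernVM-FS, ATLAS Tier centers,
    with status reported on the NICOS web pages."""
    print(f"publishing {release} kits and status pages")

if __name__ == "__main__":
    cfg = read_configuration_from_ami("dev")
    checkout(cfg)
    release = build(cfg)
    test(release)
    distribute(release)
```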