

Comments on SPI

General remarks
Essentially all goals set out in the RTAG report have been achieved. However, the roles defined there (Section 9) have not been centralized, leading to duplication of effort and to several people having problems learning and adapting tools consistently.
Good overall perception and take-up of the service, in particular the Savannah portal, the external packages and the QMTest framework.
Not clear what the interaction with the other LCG areas is; SPI should provide a service to all.

CVS service
Encourage moving to the IT central service ASAP, if the test is satisfactory; otherwise, make IT satisfy the requirements.
Use centrally maintained tools.

Savannah portal
Very well received; impressive take-up.
Concerns about scalability: can it scale to ~50 contributions/day/project (cf. root-talk)?
Avoid future divergence of the CERN customisation from mainstream Savannah through close collaboration with the authors (looks promising).

Documentation
Very impressive and complete web site.
The workbook will require continuous maintenance to be kept up to date; a documenter is needed.
The quality (and existence) of doxygen comments should be part of the QA tests.
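A doxygen check of the kind suggested above could start as small as a script that counts documented declarations. A minimal sketch in Python; the regex and the `///`/`*/` heuristic are illustrative assumptions, not an existing SPI tool:

```python
import re

def doxygen_class_coverage(source):
    """Count how many class/struct declarations in a C++ source string
    are preceded by a Doxygen comment (a '///' line or the end of a
    '/** ... */' block). Returns (documented, total)."""
    lines = source.splitlines()
    decl = re.compile(r'^\s*(class|struct)\s+\w+')
    documented = total = 0
    for i, line in enumerate(lines):
        if decl.match(line):
            total += 1
            # Look at the nearest non-blank line above the declaration.
            prev = next((l.strip() for l in reversed(lines[:i]) if l.strip()), '')
            if prev.startswith('///') or prev.endswith('*/'):
                documented += 1
    return documented, total
```

A nightly QA job could run such a count over every project's headers and flag packages whose coverage drops.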

External software
Recognise the professional quality of the external software repository.
Require a more transparent decision-making process for the provision and maintenance of external software:
- well-identified "owners": who requested/needs/uses a given package
- a documented procedure for interacting with users and authors to report and follow up bug fixes
Management of versions and dependencies via the Architects Forum:
- approval of new external package dependencies
- agreement on version changes; ideally freeze versions for ~6 months, except for well-justified critical bug fixes
A policy is needed for handling the differing structures and conventions of external packages in a consistent way (the structure of the package itself should not be modified).
Suggest making the compilation/installation scripts available, as well as the installation logs.
Suggest simple QA tests for external packages, e.g. to spot configuration changes in new versions.
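Such a QA test for external packages need not be elaborate. For example, comparing the installation manifests of two versions flags files that appeared or disappeared, which often betrays an upstream configuration change; a hypothetical sketch, not part of the SPI toolkit:

```python
def manifest_diff(old_files, new_files):
    """Compare installation manifests (iterables of relative paths) of
    two versions of an external package; unexpected additions or
    removals hint at configuration changes in the new version."""
    old, new = set(old_files), set(new_files)
    return {"added": sorted(new - old), "removed": sorted(old - new)}
```

Running this against the stored installation logs of the previous version would make version upgrades reviewable at a glance.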

QA
The QMTest framework is well received.
Policies vs. tools: is it time to put more emphasis on tools that facilitate compliance?
As projects mature and are deployed, a person centrally responsible for QA is needed to chase projects into improving compliance.
QA should be fully automated ("nightly QA").
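The "nightly QA" idea amounts to applying every policy check to every project automatically and publishing the result. A minimal sketch of the aggregation step; the check names and callables are placeholders, not real SPI policies:

```python
def run_nightly_qa(projects, checks):
    """Apply every QA policy check to every project.

    'checks' maps a check name to a callable taking a project name and
    returning True when the project complies. The nested dict returned
    is the raw material for a nightly QA report page.
    """
    return {proj: {name: bool(check(proj)) for name, check in checks.items()}
            for proj in projects}
```

The point of the sketch is that once checks are callables, chasing compliance becomes a reporting problem rather than a manual one.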

Build system and infrastructure (1)
Concern about the long-term maintenance of NICOS: is there an institutional commitment?
Role of a central librarian: there should be more centralised services across projects:
- common build settings
- common release procedures
- sharing of build and release tools
- these should not be developed within projects (cf. POOL)
Some of the problems developers perceive when configuring *any* build tool may be mitigated by delegating to a common librarian.

Build system and infrastructure (2)
Is the nightly build model correct? POOL seems to prefer very frequent internal releases, which facilitates experiment integration.
Developers need simple tools to test compilations on other platforms without waiting for the nightly build.
How thorough are the nightly build tests?
- At least compilation should be run on all platforms, including Windows (a build service is needed).
- Who checks the output?
Encourage a procedure in which pre-releases are exposed to users before the official release is frozen.

Build system and infrastructure (3)
Acknowledge the problems with SCRAM build functionality. It would be a big advantage if external contribution to LCG software development did not require installing non-standard tools.
Support the autoconf/automake investigation. What is the foreseen effort to provide complete functionality?
Suggestions for improvements:
- e.g. abandon the recursive makefile
- discuss with C. Arnault, S. Ashby and F. Rademakers to validate the ideas
A careful study is needed: what additional functionality do SCRAM and CMT provide?
The long-term strategy is not clear:
- do not use several tools (autoconf/automake + SCRAM + CMT)
- will something other than autoconf always be needed to configure the environment?
Does it make sense to hold a SCRAM course just yet?
Welcome the acknowledgement of the importance of CMT in ATLAS and LHCb.

Distribution
More interaction is needed with the LCG grid deployment area to develop common distribution tools (pacman?).
A tool is needed to generate the corresponding (pacman) configuration (this exists in CMT).
Concern about the granularity of the existing distribution: customizable installation/distribution kits are needed for specific components/platforms.