LCG software quality. Massimo Lamanna, CERN LCG/SPI. 19 March 2003.

Quality Assurance
Improve the quality of the different components with respect to functionality and performance. Help the different software components to be “accepted”, i.e. to provide those features (appropriate interface, effective documentation, handy examples, easy installation kits) which make a software tool a success in HEP. Build software which can be effectively deployed in a variety of environments (different Regional Centres and desktop installations, different operating systems, compilers, etc.) and maintained for a number of years.

QA in LCG
Phase 1: QA will listen to the developers to ease the development of the first packages, gathering valuable input for the SPI project. In parallel, the testing and documentation infrastructure in SPI will be followed in detail; wherever possible this infrastructure will also be used for QA.
Phase 2: QA will focus on the validation of the final products (and of the procedure used to arrive at them), exploring the tools and metrics best suited to improve the quality standards.
There is no precise plan to improve on some parts of the process (requirements gathering, formal Use Cases verification, ...). All inputs are most welcome!

Quality Assurance (cont'd)
In the first phase, the expected problems stem from the fact that the whole infrastructure is immature and too much pressure is expected on the different teams, all focused on delivering the first betas of their packages.
In the longer term, the best approach to guarantee high quality standards without too much overhead on the developers is not yet clear. If the procedure is too heavy, it will simply be refused; this is expected in particular given the features of the HEP environment.
Many of the QA procedures do not translate into a set of predefined procedures and automatic programs; nevertheless we might benefit from some automatic evaluation tools (software metrics). This field is open for investigation.
There is a general suggestion to use multiple platforms and compilers, at least to build the code and perform basic testing; to be sensible this requires automated procedures.

QA summary
There is basically only one requirement: the LCG software should be of a quality comparable to or better than what is normally achieved in the experiments' software. The users will judge in the end...
Note that only a small fraction of these users simply use the executables provided; in most cases they will modify them. Typical usage patterns are:
–The user gets some kind of example program, typically the result of somebody else's activity, and modifies it...
–The starting point is a piece of C++ using the contributed library. The user takes the code as the only documentation and uses the library in a new class or program.
–A user needs/wants to modify code written by somebody else because it stops program execution (or is suspected to slow it down). Error or execution dumps are used to look up the reference documentation and figure out where the problem is.

QA to POOL/(SEAL)/((PI)) but also to SPI
Disclaimer:
–The fact that practically only POOL is mentioned in the examples (good and bad…) reflects the close and constructive relationship with a relatively mature project and does not imply any final judgment.
Liaison for requirements:
–Very good with POOL
–Very encouraging with SEAL
–This naturally produces a sort of QA for SPI
Analyze the usage of the different SPI tools:
–Used?
–Not used, for good reasons?

Savannah portal
Many projects use the Savannah portal right from the start and look happy with it:
–Good user feedback
–Over 40 projects in by now…
Most of the requirements (fine tuning, helpful functionality) are best dealt with within the Savannah service in SPI.
Problems:
–Difficult to re-inject patches and improvements into the Savannah repository
–Adding too much functionality can end up in a long-term maintenance headache
Hosted projects: infrastructure 7, LCG Application Area 7, CMS 4, LHCb 2, LCG Grid deployment 1, HepPackages 17, ATLAS 4.

Savannah portal
Example (requirement from projects):
–Missing advanced statistics tools (for example from the Bug Tracker)
Solutions:
–SPI has an example script (python + MySQLdb) to interrogate the Savannah database back-end and provide statistics
–More elegant and probably much better for maintenance
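A minimal sketch of what such a statistics script could look like (python + MySQLdb, in the spirit of the SPI example). The table and column names (bug, date, close_date, submitted_by, status), the connection parameters, and GROUP_ID are illustrative assumptions, not the actual Savannah schema; the output format mirrors the savannah_bugstat examples on the next slide.

import MySQLdb   # MySQL-python bindings

GROUP_ID = 42    # hypothetical Savannah group id of the project

db = MySQLdb.connect(host="savannah-db", user="reader", db="savannah")
cur = db.cursor()
cur.execute("SELECT date, close_date, submitted_by, status "
            "FROM bug WHERE group_id = %s", (GROUP_ID,))
rows = cur.fetchall()

users = {}
hours = []       # time-to-solve of each closed bug, in hours
for date, close_date, submitted_by, status in rows:
    users[submitted_by] = 1
    if status == "Closed" and close_date:
        hours.append((close_date - date) / 3600.0)  # epoch seconds -> hours

print("Number of closed bugs: %d (over %d submitted by %d users)"
      % (len(hours), len(rows), len(users)))
if hours:
    print("Mean time to solve: %.0f hours" % (sum(hours) / len(hours)))
    print("Max/Min to solve: %.0f / %.0f hours" % (max(hours), min(hours)))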

Savannah BugTracker usage
Unfortunately both the activity and the user base look low. The statistics are low for all hosted projects (LCG and LHC-experiment-specific projects) and numbers are similar across most of the projects.
savannah_bugstat example for project ORCA:
Number of closed bugs: 33 (over 52 submitted by 6 users)
Mean time to solve: 265 hours ( )
Max/Min to solve: 2038 / 0 hours
savannah_bugstat example for project POOL:
Number of closed bugs: 25 (over 35 submitted by 11 users)
Mean time to solve: 147 hours ( )
Max/Min to solve: 1404 / 0 hours

CVS structure and directory structure
CVS structure defined (months ago):
–Basically followed in POOL
–Smoothly evolving, without revolutionary changes
bin/lib structure being fixed (changed):
–Requirements understood by all parties
–Positive attitude of the projects involved (SPI, POOL, SEAL)
–Hope for a fast convergence (meeting on Monday, decision on Wednesday… of the same week)

Testing
Main tool provided: CPPUnit
–Same family as JUnit, PyUnit, …
Very good documentation and expertise from SPI.
Still not much used by developers:
–IMHO, the quality of the tool (actually very good) does not play a role here
–The fact that testing lags way behind must change
–Sensible tests are absolutely needed to attack more OS/compiler/architecture combinations!
–Testing is not a cosmetic activity: it should go together with the development!!! (common sense, eXtreme Programming, …) See the sketch below.
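Since CPPUnit belongs to the same xUnit family as PyUnit, a minimal PyUnit (unittest) test illustrates the pattern; the same fixture/assertion structure applies to CPPUnit in C++. The Stack class under test is a toy example, not LCG code.

import unittest

class Stack:
    def __init__(self):
        self.items = []
    def push(self, x):
        self.items.append(x)
    def pop(self):
        return self.items.pop()

class StackTest(unittest.TestCase):
    def setUp(self):
        # fixture: a fresh object before every test method
        self.stack = Stack()

    def test_push_then_pop_returns_last_item(self):
        self.stack.push(1)
        self.stack.push(2)
        self.assertEqual(self.stack.pop(), 2)

    def test_pop_on_empty_stack_raises(self):
        self.assertRaises(IndexError, self.stack.pop)

if __name__ == '__main__':
    unittest.main()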

Automatic test execution and miscellaneous tests
Tool selected: OVAL
Main modification:
–Define responsibilities between SCRAM and OVAL
–SCRAM builds, OVAL executes
It can be used to steer existing tests, CPPUnit tests, scripts executing commands, valgrind (memory check):
–Working on POOL 0_3_2
–Broken in 0_4_0

import os

class MyError(Exception):
    pass

rc = os.system(cmd)      # cmd: the test command prepared by OVAL
if rc != 0:
    raise MyError(cmd)   # non-zero exit status: report the failing command
print('[OVAL] OK')       # success marker looked for in the test output

Code reviews
POOL itself is enforcing code reviews (based on the SPI rules). Presently the scheme is that 2 developers get somebody else's code and submit remarks (the ~10 most outstanding ones) to the code developers, who reply (accepting the suggestions or explaining why they are refused).
Definitely a practice to be encouraged in all projects.

Documentation
POOL Workbook adequate and improving.
Examples needed (POOL has some).
Doxygen:
–Every day a “snapshot” version is created, put on the web, and checked for packages with too many warnings
–Mail is sent to the developers (with explanations of the origin of the errors)
–Slowly improving (see next transparencies)
LXR:
–OK, useful
–Ideal to build on it (code lines → html links)
A sketch of the daily warning check follows.
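A minimal sketch of how the per-package warning counts could be produced each day; the log location (<package>/doc/doxygen.log) and the one-directory-per-package layout are assumptions for illustration, not the actual SPI setup.

import glob
import os

# Count Doxygen warnings per package, assuming each package keeps its
# Doxygen build log in <package>/doc/doxygen.log (illustrative layout).
for log in sorted(glob.glob("*/doc/doxygen.log")):
    package = log.split(os.sep)[0]
    with open(log) as f:
        n = sum(1 for line in f if "warning" in line.lower())
    print("%-20s: %3d warning messages" % (package, n))
    # a real service would also mail the package developers when the
    # count grows, as described above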

Doxygen
Doxygen is very popular because it can create the alibi “but yes, we have documentation!!!” Even this requires discipline!
POOL_0_3_:
Common: 20 warning messages
DataSvc: 42 warning messages
FileCatalog: 32 warning messages
MySQLCatalog: 5 warning messages
MySQLCollection: 1 warning message
POOLCore: 10 warning messages
PersistencySvc: 21 warning messages
Reflection: 45 warning messages
ReflectionBuilder: 13 warning messages
RootStorageSvc: 2 warning messages
StorageSvc: 62 warning messages
Tests: 4 warning messages
XMLCatalog: 3 warning messages

Slowly improving
POOL_0_4_:
Common: 20 warning messages
DataSvc: 42 warning messages
FileCatalog: 31 warning messages
MySQLCatalog: 6 warning messages
MySQLCollection: 1 warning message
POOLCore: 10 warning messages
PersistencySvc: 20 warning messages
Reflection: 46 warning messages
ReflectionBuilder: 13 warning messages
RootStorageSvc: 2 warning messages
StorageSvc: 63 warning messages
Tests: 4 warning messages
XMLCatalog: 3 warning messages
Developers focused on functionality… Note that the doc is usable anyway!!!

User Doc → Use Cases
Presently (start-up projects), examples are naturally very simple:
–the simple ones are often the most useful
For all projects it is mandatory to verify use cases by interacting directly with the experiments' code! The experiments' role is essential!
–Promote the creation of a user base
–Bug finding/fixing
–Impact on the evolution (priority, functionality, …)
It has to be started ASAP.

Summary (1)
Interesting job!
–In perspective, interaction with all LCG projects
–Because it is difficult?
The SPI QA has to do with:
–The usability of the proposed tools
–Flexibility to adapt

Summary (2)
The LCG QA has to do with:
–Cohesive behavior across projects
–Being ready to compromise… Homogeneity will be a major added value of LCG software
–Implementation of good practices
–Trying to improve the developer culture: out of the permanent crisis mode
Long-term perspective:
–Don't be locked to a specific piece of software forever
–Concentrate on the exciting part and do not fight over dull details

Outlook
Automation:
–To be improved!
–To be built on top of the nightly build system
–Essential for regression and interoperability
–Also to monitor the “SPI-compliance” of all projects
Better metrics usage:
–Now just a skeleton: complexity = n. of lines (see the sketch below)
–Logiscope (v5.1 installed under pttools on Monday)
–Absolutely needed to go for “hot spots” if we do not want to prepare zillions of tests
In our software! 3rd party or legacy software (e.g. SEAL)
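The “complexity = n. of lines” skeleton mentioned above could be as simple as the following sketch (the src/ layout and the .cpp suffix are assumptions); anything more serious is exactly what a tool like Logiscope is for.

import glob

def loc(path):
    # crude size metric: non-blank, non-comment lines
    with open(path) as f:
        return sum(1 for line in f
                   if line.strip() and not line.strip().startswith("//"))

sizes = [(loc(p), p) for p in glob.glob("src/**/*.cpp", recursive=True)]
for n, path in sorted(sizes, reverse=True)[:10]:
    # the ten biggest files are the first "hot spot" candidates
    print("%6d  %s" % (n, path))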