1
LCG Software Quality Assurance
Massimo Lamanna, Jakub Moscicki (CERN LCG/SPI), 9 September 2003
2
Contents
- Overview of LCG QA: goals and methods
- POOL 1: Quality Assurance Report
- SEAL 1: Progress Snapshot
- Summary
3
QA Goals
The main goal of the QA activity is to help LCG projects assess and improve the quality of their software and procedures:
- provide tools to collect useful software metrics which help assess software quality;
- provide monitoring tools to see the evolution of the projects;
- verify that the project setup is correct and compliant with LCG Policies.
4
How we want to do it
- Clear rules and a checklist of assessed items are available to the projects in advance.
- QA Reports are generated automatically by the tools and contain data and statistics about the project, so the reader can easily track the project's evolution.
- Anybody who is interested may see (and generate) the report at any time, for any release of any project.
5
Caveat
SPI provides QA tools to spot potential problems...
...but SPI does not change the projects: responsibility for software quality and for compliance with LCG procedures/policies lies with the projects.
For QA to be successful, active collaboration from the projects is required.
6
QA Checklist (please wait for the announcement)
- Build the release
- Statistics: Test Inventory, Documentation Inventory, Examples Inventory, Savannah Statistics, Code Inventory
- Rule Checker
- Logiscope
- LCG Policies: configuration of a build system, CVS directory structure
- Software Practices: peer review
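The checklist above lends itself to being kept as plain data that the reporting tools iterate over. The following is a minimal sketch of that idea in Python; the rendering function and status values are illustrative assumptions, not the actual SPI implementation.

```python
# Minimal sketch: the QA checklist kept as plain data driving one section of
# an HTML report.  Item names come from the slide; the function and the
# status values are illustrative assumptions, not the actual SPI tool.

CHECKLIST = [
    "Build the release",
    "Test Inventory",
    "Documentation Inventory",
    "Examples Inventory",
    "Savannah Statistics",
    "Code Inventory",
    "Rule Checker",
    "Logiscope",
    "LCG Policies: configuration of a build system",
    "LCG Policies: CVS directory structure",
    "Software Practices: peer review",
]

def render_checklist_html(results):
    """results maps a checklist item to a status string such as 'ok' or 'fail'."""
    rows = []
    for item in CHECKLIST:
        status = results.get(item, "not checked")
        rows.append("<tr><td>%s</td><td>%s</td></tr>" % (item, status))
    return "<table border='1'>\n%s\n</table>" % "\n".join(rows)

if __name__ == "__main__":
    print(render_checklist_html({"Build the release": "ok", "Rule Checker": "fail"}))
```

Keeping the items as data means the same list can feed the HTML report, the cross-report statistics and the compliance summary without duplication.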
7
Tools
- Command line tools will be made available soon in the standard LCG App environment.
- Automatic generation of reports (HTML).
- Cross-report statistics, to see the evolution of the projects: pickled data plus gnuplot-based graphics.
- Simple common Python API front-end for the different output formats (a minimal sketch follows).
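A minimal sketch of what such a Python front-end might look like, assuming the per-release statistics are stored as pickled dictionaries; the file names, dictionary keys and function names are illustrative assumptions, not the actual SPI API.

```python
# Sketch of a cross-report statistics front-end (illustrative names only):
# per-release QA statistics are kept as pickled dicts and turned into a
# gnuplot data file plus a small gnuplot script.
import pickle

def load_release_stats(paths):
    """Load one pickled dict per release, e.g. {'release': '1.2.0', 'tests': 80}."""
    stats = []
    for path in paths:
        with open(path, "rb") as f:
            stats.append(pickle.load(f))
    return stats

def write_gnuplot(stats, datafile="evolution.dat", plotfile="evolution.gp"):
    """Write the data and a gnuplot script plotting the number of tests per release."""
    with open(datafile, "w") as data:
        for i, s in enumerate(stats):
            # column 1: release index, column 2: number of tests
            data.write("%d %d\n" % (i, s["tests"]))
    with open(plotfile, "w") as plot:
        tics = ", ".join("'%s' %d" % (s["release"], i) for i, s in enumerate(stats))
        plot.write("set xtics (%s)\n" % tics)
        plot.write("set ylabel 'number of tests'\n")
        plot.write("plot '%s' using 1:2 with linespoints title 'tests'\n" % datafile)

if __name__ == "__main__":
    # Hypothetical pickle files, one per POOL release.
    write_gnuplot(load_release_stats(["pool-1.1.0.pkl", "pool-1.2.0.pkl", "pool-1.2.1.pkl"]))
```

Storing plain pickled dictionaries keeps one common data source for both the HTML output and the gnuplot graphics.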
8
Activities: Apr-Sept 2003
Manual / semi-automatic reports:
- POOL QA: the POOL 1 report is available. Contents: QA reviews for 0.4.0, 1.0.0, 1.1.0 and 1.2.0; user feedback (Atlas, CMS); general comments and recommendations.
- SEAL QA: in progress.
- Development / integration of automatic tools to be finished: Fall 2003.
9
Activities: Apr-Sept 2003
Evaluation of tools: Rule Checker, Logiscope
- Rule Checker: LCG Coding Rules vs existing activities (Atlas Rules)
- Logiscope: test coverage, SLOC (a rough SLOC-counting sketch follows this list)
- Valgrind
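SLOC is the simplest metric in this list; as a rough illustration only (not the Logiscope or SPI implementation), non-blank, non-comment C++ lines under a checkout can be counted like this:

```python
# Rough SLOC counter for C++ sources (illustrative; real tools such as
# Logiscope use much more elaborate definitions of a "source line").
import os

CPP_EXTENSIONS = (".h", ".hpp", ".cpp", ".cxx", ".cc", ".icc")

def sloc_of_file(path):
    """Count non-blank lines that are not pure '//' comment lines."""
    count = 0
    with open(path, "r", errors="ignore") as f:
        for line in f:
            stripped = line.strip()
            if stripped and not stripped.startswith("//"):
                count += 1
    return count

def sloc_of_tree(root):
    """Sum the SLOC of every C++ file below 'root' (e.g. a CVS checkout)."""
    total = 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if name.endswith(CPP_EXTENSIONS):
                total += sloc_of_file(os.path.join(dirpath, name))
    return total

if __name__ == "__main__":
    print(sloc_of_tree("."))
```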
10
POOL 1 QA Reviews: Highlights
- Large number of tests added rapidly: initial (0.4.0): 46; final (1.2.0): 80
- Number of open bugs: roughly 5 on average, with a weekly maximum of 20; fixed quickly
- LCG compliance: QMTest not fully integrated; custom build system
11
POOL Test Inventory: tests per package, releases 1.0.0 / 1.1.0 / 1.2.0 / 1.2.1
Packages covered (per-release counts where recorded): AttributeList (2, 4), Collection (1), DataSvc, EDGCatalog (3), FileCatalog, MySQLCatalog (6), MySQLCollection, POOLCore, XMLCatalog (5), RootCollection
12
POOL Test Inventory           1.0.0   1.1.0   1.2.0   1.2.1
Unit tests                       26      32      34       -
Integration / System tests       20      35      46      47
Total                             -      67      80      81
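As a small worked check of the table, each total is simply the sum of the unit and integration/system counts for that release; a throwaway Python sketch using the numbers above:

```python
# Per-release test counts from the table above (releases where both
# categories are shown); the totals are just their sums.
INVENTORY = {
    "1.1.0": {"unit": 32, "integration_system": 35},   # total 67
    "1.2.0": {"unit": 34, "integration_system": 46},   # total 80
}

for release in sorted(INVENTORY):
    counts = INVENTORY[release]
    total = sum(counts.values())
    print("%s: %d unit + %d integration/system = %d tests"
          % (release, counts["unit"], counts["integration_system"], total))
```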
13
Web Information
14
POOL 1 QA
15
Summary
16
Main problems in LCG Apps
Main identified problems:
- Common build configuration: POOL and SEAL use different approaches.
- Documentation: apart from reference manuals, more written documentation is needed (workbook, user manuals, architecture and design overview documents).
- Links with users should be tighter.
17
Outlook
Completion of the automatic reporting system and setup of the QA information page:
- clear rules and checklist
- open, reproducible procedure to generate QA reports
- QA Web Page