1. An evaluation of tools for static checking of C++ code
E. Arderiu Ribera, G. Cosmo, S. M. Fisher, S. Paoli, M. Stavrianakou
CHEP2000, Padova, 10.02.2000

2. Origin, purpose and scope
– SPIDER-CCU project (CERN IT-IPT, LHC experiments, IT projects): define a common C++ coding standard and a tool to automatically check code against it
– SPIDER C++ Coding Standard: 108 rules for naming, coding and style
– Tool evaluation: scope limited to rule-checking functionality
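The 108 SPIDER rules themselves are not reproduced on the slides. As a purely illustrative sketch, the following shows the kind of naming rule such a standard typically contains and that a checker can enforce mechanically; the "m_ prefix for private data members" convention is an assumption for illustration, not a verbatim SPIDER rule:

```cpp
// Illustrative only: a hypothetical naming rule of the kind a static
// rule checker enforces, e.g. "private data members carry an m_ prefix".
// (Not claimed to be a verbatim SPIDER rule.)
class Track {
public:
    explicit Track(double momentum) : m_momentum(momentum) {}
    double momentum() const { return m_momentum; }
private:
    double m_momentum;   // conforms to the hypothetical naming rule
    // double momentum;  // a checker would flag this: missing prefix,
    //                      and it would shadow the accessor's name
};
```

A tool that parses the code (rather than pattern-matching text) can apply such a rule only to private data members while leaving local variables and parameters alone, which is why parser quality dominated the evaluation.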

3. Approach and tool selection
– Involve potential users of the tool in:
  – the definition of evaluation criteria
  – the planning of the evaluation
  – the actual technical evaluations
– Take time and resource constraints into account
– Preselect tools on the basis of technical merit

4. Evaluated tools
– CodeCheck 8.01 B1 (Abraxas)
– QA C++ 3.1 (Programming Research Ltd.)
– CodeWizard 3.0 (Parasoft)
– Logiscope RuleChecker (Concerto/AuditC++) 3.5 (CS Verilog S.A.)
– TestBed 5.8.4 (LDRA Ltd.)

5. Evaluation environment
Evaluate on real and representative HEP C++ code:
– GEANT4 toolkit
– ATLAS “Event” package
– ATLAS “classlib” utility library
chosen because of:
– complexity
– extensive use of the STL
– variety of style and expertise
– familiarity to members of the evaluation team

6. Evaluation criteria
Technical:
– coverage of the standard
– addition of customised checks
– other relevant configured checks
– support of the ANSI C++ standard
– support of template libraries and the STL
– robustness
– reliability
– usability
– customisability
– performance

7. Evaluation criteria (cont’d)
Operational:
– installation, deployment and upgrade of a centrally supported tool
Managerial:
– licensing, maintenance costs, vendor information
Other:
– quality and quantity of documentation (electronic, paper, WWW)
– quality of available support

8. Evaluation results: CodeCheck
– limitations in parsing real code that makes extensive use of the STL (no enhancements foreseen)
– cumbersome customisation and implementation of new rules
– excluded from further evaluation

9. Evaluation results: TestBed
– limitations in parsing complex code
– limited number of built-in rules; no possibility of adding new rules
– excluded from further evaluation

10. Evaluation results: Logiscope RuleChecker
– simple, easy to use, fast
– limited number of built-in rules
– limited possibility of adding new rules
– flexibility in report generation and quality limited by the proprietary rule language (CQL)
– excluded from further evaluation

11. Evaluation results: CodeWizard
– at least 71 checks implemented, including most of the items from S. Meyers’ “Effective C++” and “More Effective C++”
– configurable to cover 71% of the SPIDER standard
– customisable in terms of rule selection and of code inclusion/exclusion
– able to parse ANSI C++ with the STL
– possibility of using RuleWizard to add customised checks, but not yet usable owing to poor documentation
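The slides do not list the individual checks; as one representative example, a well-known “Effective C++” item that such tools implement is the rule that polymorphic base classes need virtual destructors. A minimal sketch (class names invented for illustration):

```cpp
// Sketch of a classic "Effective C++"-style check: a base class with
// virtual functions should declare a virtual destructor, otherwise
// deleting a derived object through a base pointer is undefined behaviour.
class Shape {
public:
    virtual double area() const = 0;
    virtual ~Shape() {}   // omit 'virtual' here and a checker flags the class
};

class Circle : public Shape {
public:
    explicit Circle(double r) : m_radius(r) {}
    double area() const { return 3.14159265358979 * m_radius * m_radius; }
private:
    double m_radius;
};
```

The error is silent at compile time with an ordinary compiler, which is exactly why it is a good candidate for a dedicated static check.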

12. Evaluation results: CodeWizard (cont’d)
– reports in graphical and ASCII format, but not customisable
– information for headers and libraries necessary: straightforward to supply via the makefile, but parsing and reporting are repeated on each run
– performance equivalent to the compiler
– fully evaluated

13. Evaluation results: QA C++
– at least 500 checks implemented, including ISO C++
– configurable to cover 65% of the SPIDER standard
– customisable in terms of rule selection and of code inclusion/exclusion
– full STL support foreseen for the next release; partial analysis possible now via STL stubs provided by the company
– easy to learn and use, robust
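The vendor’s actual STL stubs are not shown on the slides. To illustrate the idea only: a stub is a stripped-down stand-in exposing just enough of a container’s interface for the checker’s parser to type-check client code without digesting the full template implementation. The toy below is an assumption-laden sketch (the fixed capacity and inline bodies exist only to keep the example self-contained; real stubs may be declaration-only):

```cpp
// Toy illustration of an "STL stub": a minimal substitute for std::vector
// with just enough interface for a rule checker to type-check client code.
// (Illustrative only; not the vendor's actual stubs.)
namespace stub {
    template <class T>
    class vector {
    public:
        vector() : m_size(0) {}
        void push_back(const T& x) { m_data[m_size++] = x; }
        unsigned int size() const { return m_size; }
        T& operator[](unsigned int i) { return m_data[i]; }
    private:
        T m_data[64];          // fixed capacity keeps the toy self-contained
        unsigned int m_size;
    };
}
```

Client code written against this interface parses and type-checks the same way it would against the real header, which is what makes partial analysis possible before full STL support arrives.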

14. Evaluation results: QA C++ (cont’d)
– information for headers and libraries necessary; single parsing and caching of headers possible
– makefile integration non-trivial
– powerful GUI and command line, largely interchangeable
– high-quality, customisable reports
– performance a factor of 2 slower than the compiler
– fully evaluated, BUT a completely new version (full ANSI C++ compliance, new parser) was not available at the time of the evaluation

15. Conclusions
Evaluation process:
– suited to the goals; pragmatic and efficient
– user involvement, careful definition of evaluation criteria and detailed planning are essential
Evaluation results:
– of the five tools considered, two, CodeWizard and QA C++, were preselected on technical merit and fully evaluated
– the final choice will depend on the weight given to the various features, relative cost, the needs of the institutes concerned, and the development of promising new tools (e.g. the Together/Enterprise CASE tool and a tool by ITC-IRST and the ALICE experiment)

