EMI INFSO-RI-261611
Software Metric Definitions, Reports and Analysis in EMI
Authors: Eamonn Kenny (TCD), Gianni Pucciani (CERN)
Date: Tuesday 12th April


Why do we need metrics?
Measure the quality and quantity of the software. Encourage conformance between the four middleware groups. Highlight previously overlooked problems (e.g. by using static analysers).

Why do we need metrics?
To measure:
- Performance: e.g. how long does it take to fix a bug?
- Support: e.g. how many bugs have been fixed?
- Complexity: e.g. how complex is a component? For example, cyclomatic complexity and SLOC.
EMI is funded to provide quality software and to measure its quality. With metrics one can see trends, understand what needs to be improved, and see how many defects a component has.
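To make the two complexity measures just mentioned concrete, here is a small illustration; the function is invented for the example and is not EMI code.

```python
def classify_bug(priority, age_days):
    """Toy function, invented to illustrate cyclomatic complexity and SLOC."""
    if priority == "immediate":                 # decision point 1
        return "handle now"
    if priority == "high" and age_days > 14:    # decision points 2 and 3 ('and' adds a branch)
        return "escalate"
    if age_days > 30:                           # decision point 4
        return "untouched backlog"
    return "normal queue"

# Cyclomatic complexity = decision points + 1 = 4 + 1 = 5
# SLOC (code lines only, ignoring the docstring and comments) = 8
```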

Metrics Role in EMI SA2
[Diagram: how the SA2.3 Metrics task relates to the other SA2 tasks (SA2.1 Coordination with a PEB link, SA2.2 QA Plan, SA2.4 Tools, SA2.5 QA Review), the Area Leaders and Product Teams (surveyed for their top 3 metrics and existing middleware metrics), SA1 Quality Control, and the weekly EMT meetings of SA2, SA1 and JRA1; it covers metric definition, verification, evaluation, definitions and analysis, and report generation.]

Model for Metric Collection
Practical metrics, not a theoretical set! Metric collection is governed by the end goals of the project: an end goal suggests a question, and the question leads to a metric.

Metric Template
- Metrics Id (e.g. PriorityBugs)
- Name
- Description
- Measurement
- Calculation (mathematical formula)
- Input(s) & Units (e.g. time range in days)
- Output(s) & Units (e.g. average time in hours)
- Scope
- Thresholds/Target Value
- Tools
- Availability (per-middleware availability)
- Goals
- Quality factor (follows the McCall factors)
- Risks
- Special Notes
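As an illustration of how the template is filled in, a sketch for the PriorityBugs metric follows; only the field names come from the template above, every value is hypothetical.

```python
# Hypothetical instantiation of the metric template; every value is illustrative only.
priority_bugs_metric = {
    "metric_id": "PriorityBugs",
    "name": "Open Priority Bugs",
    "description": "Open bugs with priority High or Immediate, per product team",
    "measurement": "Count bugs in state Open whose priority is High or Immediate",
    "calculation": "N = |{b : b.state == 'Open' and b.priority in ('High', 'Immediate')}|",
    "inputs_units": "time range in days",
    "outputs_units": "number of bugs (dimensionless)",
    "scope": "per product team, per middleware",
    "thresholds_target": "example target: no open Immediate bugs at release time",
    "tools": "daily bug-tracker snapshots (Bugzilla, RT, Savannah, SourceForge)",
    "availability": "per middleware",
    "goals": "reduce turnaround time for high-priority defects",
    "quality_factor": "maintainability (McCall)",
    "risks": "priorities may be assigned inconsistently across trackers",
    "special_notes": "",
}
```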

Metrics Description Example
[Example metric description shown as an image on the original slide.]

Metric Categories
Areas of coverage and the types of metrics in each:
- Process Management (bug-tracking related): priority, severity, open/closed bugs, state changes in bugs, improving turnaround times.
- External: Quality in Use (bug-tracking/EGI): 3rd level GGUS related metrics (KPIs for SA1).
- Product Team software, static analysers: language-specific analysers, SLOC, CLOC, cyclomatic complexity, etc.
- Product Team software, testing, platforms, etc.: unit tests, supported platforms, bug density.
- Optional add-on tools: Valgrind/Helgrind (memory leak and thread checking).

Bug-tracking Process
Suggested metrics on the bug workflow:
- Open Priority Bugs
- Average Time To Close a Bug (PriorityBugs, BugSeverityDistribution)
- Time To Fix a Bug (SA1-QC)
- Untouched Open Bugs (DSA 1.1 requirement)
[Workflow diagram: a new bug is Open; if accepted it moves to Accepted and then Fixed, otherwise it is Rejected; once fixed, if it can be tested and the test is successful it is Closed, and if it cannot be tested it is marked Not Tested.]
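A minimal sketch of how "Average Time To Close a Bug" could be computed from tracker snapshots; the record layout and the values are invented, this is not the EMI implementation.

```python
from datetime import datetime

# Hypothetical bug records; in EMI these would come from the daily bug-tracker snapshots.
bugs = [
    {"id": 101, "opened": "2011-02-01", "closed": "2011-02-10", "priority": "high"},
    {"id": 102, "opened": "2011-02-03", "closed": "2011-03-01", "priority": "immediate"},
    {"id": 103, "opened": "2011-02-20", "closed": None, "priority": "high"},  # still open
]

def average_time_to_close(bugs, priorities=("high", "immediate")):
    """Average number of days between opening and closing, over closed bugs only."""
    durations = []
    for bug in bugs:
        if bug["closed"] is None or bug["priority"] not in priorities:
            continue
        opened = datetime.strptime(bug["opened"], "%Y-%m-%d")
        closed = datetime.strptime(bug["closed"], "%Y-%m-%d")
        durations.append((closed - opened).days)
    return sum(durations) / len(durations) if durations else 0.0

print(average_time_to_close(bugs))  # (9 + 26) / 2 = 17.5 days
```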

Handling Multiple Bug-Trackers
Each middleware has its own bug tracker: ARC uses Bugzilla, dCache uses Request Tracker (RT), gLite uses Savannah and UNICORE uses SourceForge. The multiple bug-trackers are accessed through one common interface: a snapshot of each tracker is taken daily, a middleware bug-mapping step (which needs periodic intervention) aligns the data, and the result is exposed as XML to ETICS (which can be run at any time).
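A rough sketch of what such a common interface could look like; the adapter functions and field names below are hypothetical and are not the actual EMI/ETICS schema.

```python
# Hypothetical normalisation layer over the four trackers; all field names are illustrative.
COMMON_FIELDS = ("id", "summary", "status", "priority", "opened", "closed")

def from_bugzilla(record):
    """Map an ARC Bugzilla record (illustrative field names) to the common schema."""
    return {
        "id": record["bug_id"],
        "summary": record["short_desc"],
        "status": record["bug_status"].lower(),
        "priority": record["priority"].lower(),
        "opened": record["creation_ts"],
        "closed": record.get("closed_ts"),
    }

def from_savannah(record):
    """Map a gLite Savannah record (illustrative field names) to the common schema."""
    return {
        "id": record["item_id"],
        "summary": record["summary"],
        "status": record["status"].lower(),
        "priority": record["priority"].lower(),
        "opened": record["submitted_on"],
        "closed": record.get("closed_on"),
    }

# Similar adapters would cover dCache (Request Tracker) and UNICORE (SourceForge);
# a daily snapshot job runs every adapter and writes one merged XML file for ETICS.
```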

SQAP/Bug-tracking Metrics
Each metric is tracked by whether it is designed and implemented, and by its recipient.
For the EMT and weekly reports:
- Open Priority Bugs (High/Immediate)
- Successful Builds metric
- Open Untouched Bugs (> 14 days)
- Fixed Bugs (SA1-QC)
For periodic reports, deliverables and Product Team reports:
- Priority Bugs (High/Immediate)
- Bug Severity Distribution
- Backlog Management Index
- Integration test effectiveness metric
- Delay on release schedule metric
- Up-to-date documentation metric

Fixed Bugs (SA1-QC)
Required for assessing the number of regression tests produced in association with each fixed bug.

Open Untouched Bugs (weekly EMT)
Report from 10th March 2011. Inter-quartile ranges are used in the visualization.
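A small sketch of the interquartile-range summary behind such a visualization; the bug ages are invented for illustration.

```python
# Ages (in days) of currently untouched open bugs for one product team; values are invented.
ages = [3, 5, 8, 12, 15, 21, 30, 45, 60, 90]

def quartiles(values):
    """Return (Q1, median, Q3) using linear interpolation between sorted values."""
    data = sorted(values)

    def percentile(p):
        k = (len(data) - 1) * p
        lower = int(k)
        upper = min(lower + 1, len(data) - 1)
        return data[lower] + (data[upper] - data[lower]) * (k - lower)

    return percentile(0.25), percentile(0.50), percentile(0.75)

q1, median, q3 = quartiles(ages)
print(q1, median, q3, q3 - q1)  # 9.0 18.0 41.25 32.25 -> the IQR summarises the spread of bug ages
```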

Successful Builds Metric (for EMT)
Highlights:
- product teams having failures due to their own internal issues
- product teams causing other product teams to fail
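A rough sketch of how build results could be split into those two categories; the build records and field names are hypothetical, the real data comes from the ETICS build reports.

```python
# Hypothetical nightly build records; the real data comes from the ETICS build reports.
builds = [
    {"pt": "team-A", "ok": False, "failed_dependency": None},       # failed on its own code
    {"pt": "team-B", "ok": False, "failed_dependency": "team-A"},   # failed because of another PT
    {"pt": "team-C", "ok": True,  "failed_dependency": None},
]

own_failures = [b["pt"] for b in builds
                if not b["ok"] and b["failed_dependency"] is None]
caused_by_other_pts = [(b["pt"], b["failed_dependency"]) for b in builds
                       if not b["ok"] and b["failed_dependency"] is not None]

print("Failures due to the PT's own internal issues:", own_failures)    # ['team-A']
print("Failures caused by another product team:", caused_by_other_pts)  # [('team-B', 'team-A')]
```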

Backlog Management Index
Tracks whether product teams have an increasing or decreasing backlog of bugs or defects.
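A common definition of the Backlog Management Index is the number of bugs closed in a period divided by the number of bugs opened in the same period, times 100: above 100 the backlog shrinks, below 100 it grows. The sketch below assumes this standard definition, which may differ from the exact EMI formula; the monthly figures are invented.

```python
def backlog_management_index(opened, closed):
    """BMI = (bugs closed in the period / bugs opened in the period) * 100.
    BMI > 100: the backlog is shrinking; BMI < 100: the backlog is growing."""
    return 100.0 * closed / opened if opened else float("inf")

# Illustrative monthly figures for one product team (invented numbers):
print(backlog_management_index(opened=40, closed=50))  # 125.0 -> backlog decreasing
print(backlog_management_index(opened=40, closed=30))  # 75.0  -> backlog increasing
```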

Product-related Metrics
Each metric is tracked by whether it is designed and implemented; the recipients are reports, deliverables and periodic PT reports.
- Unit test coverage metric
- Number of supported platforms
- Total bug density
- Bug density per release
- Cyclomatic complexity
- C/C++ metrics: CCCC, cppcheck
- Java metrics: FindBugs, PMD, Checkstyle
- Python metrics: pylint
- Code commenting metrics

Static Analysers: Java – FindBugs (PT)
Project overview with results per product team, plus finer-grained reporting for each product team.

Source Lines of Code
Source lines of code (SLOC) is particularly important when assessing the bug density distribution (i.e. open bugs per line of code).
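A minimal sketch of the bug-density calculation implied above, normalised per thousand lines of code; the normalisation to KSLOC and the figures are assumptions for illustration, the slide only states bugs open per lines of code.

```python
def bug_density_per_ksloc(open_bugs, sloc):
    """Open bugs per thousand source lines of code (KSLOC)."""
    return 1000.0 * open_bugs / sloc if sloc else 0.0

# Invented figures for two hypothetical components:
print(bug_density_per_ksloc(open_bugs=12, sloc=48000))  # 0.25 bugs per KSLOC
print(bug_density_per_ksloc(open_bugs=12, sloc=6000))   # 2.0 bugs per KSLOC -> same bug count, denser
```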

More Static Analysers
[Analyser output shown as images on the original slide.]

Current Status
The metrics are useful and we defined them for the whole of EMI:
- There is now a common framework for producing reports for the EMT, deliverables and Product Teams (PTs).
- The reporting structure can be interpreted at the project level, the activity level and the Product Team level.
- Static analysers produce reports highlighting problems previously not seen by developers.

Conclusions
We now have an objective way to measure performance, complexity and support. EMI can provide quality software and measure its quality. With the current metric reports one can see trends, understand what needs to be improved, and see how many defects a component has.

Future Work
A few important metrics are currently being implemented. The metrics will be reassessed and evaluated after the EMI-1 release.