Software Metrics and Measurements Supannika Koolmanojwong CS510
Outline General Concepts about Metrics Example of Metrics Agile Metrics Metrics from Empirical Data
Measurements in daily life
Why do we measure?
Objectives of software measurement “You cannot control what you cannot measure.” – Tom DeMarco “Not everything that counts can be counted, and not everything that can be counted counts.” – Albert Einstein
Software Metrics Numerical data related to software development Strongly support software project management activities Can be directly observable quantities or can be derived from them
A simplified measurement information model: information needs (objectives, control) drive measurements of process and work-product attributes; the measurement results feed decisions and actions on the products. Ref: Ebert and Dumke 2007
How are software measurements used? Understand and communicate Specify and achieve objectives Identify and resolve problems Decide and improve
Measurement Standards How to do: ISO/IEC 12207 Software Life Cycle Processes; ISO/IEC 15288 System Life Cycle Processes; SWEBOK Software Engineering Body of Knowledge; PMBOK Project Management Body of Knowledge How to do better: CMMI Capability Maturity Model Integration; ISO/IEC 15504 Software Process Capability Determination; ISO 9001 Quality Management System; ISO/IEC 9126 Software Product Quality; TL 9000, AS 9100, etc. How to measure what you are doing: ISO/IEC 15939:2002 Software Measurement Process
Ground rules for metrics Metrics must be Understandable to be useful Economical Field tested Highly leveraged Timely Must give proper incentives for process improvement Evenly spaced throughout all phases of development Useful at multiple levels http://www.stsc.hill.af.mil/resources/tech_docs/gsam3/chap13.pdf
Measurements for Senior Management Easy and reliable visibility of business performance Forecasts and indicators where action is needed Drill-down into underlying information and commitments Flexible resource refocus
Measurements for Project Management Immediate project reviews Status and forecasts for quality, schedule, and budget Follow-up action points Report based on consistent raw data
Project management supporting metrics Planning - Metrics serve as a basis for cost estimating, training planning, resource planning, scheduling, and budgeting. Organizing - Size and schedule metrics influence a project's organization. Controlling - Metrics are used to track the status of software development activities and check compliance with plans. Improving - Metrics identify where process improvement efforts should be concentrated and measure the effects of those efforts.
Measurements for Engineers Immediate access to team planning and progress Get visibility into own performance and how it can be improved Indicators that show weak spots in deliverables Focus energy on software development
The E4 measurement process: 1. Establish (objectives, needs), 2. Extract, 3. Evaluate, 4. Execute (decisions, re-direction, updated plans), operating within the business process environment and its resources. Ref: Ebert and Dumke 2007
Aggregation of information Enterprise - cash flow, shareholder value, operations cost; Division - cost reduction, sales, margins, customer service; Product Line/Department - sales, cost reduction, innovative products, level of customization; Projects - cycle time, quality, cost, productivity, customer satisfaction, resources, skills
SMART goals Specific - precise Measurable - tangible Accountable – in line with individual responsibilities Realistic - achievable Timely – suitable for the current needs
What do you want to measure? Processes Software-related activities Products Artifacts, deliverables, documents Resources The items which are inputs to the process
Components of software measurements
Example of Metrics Progress / Effort / Cost Indicator Earned value management Requirements / Code Churn Defect-related metrics Test-related metrics
Size How big is the healthcare.gov website? http://www.informationisbeautiful.net/visualizations/million-lines-of-code/
Size Earth System Modeling Framework Project http://www.earthsystemmodeling.org/metrics/sloc.shtml
Progress Indicator
Effort Indicator
Cost Indicator
Earned value management Planned Value (PV) or Budgeted Cost of Work Scheduled (BCWS) Earned Value (EV) or Budgeted Cost of Work Performed (BCWP) http://en.wikipedia.org/wiki/Earned_value_management
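A minimal sketch (not from the slides) of the standard derived EVM indicators, assuming PV, EV, and actual cost (AC, also called ACWP) are already known for a reporting date; the figures below are invented:

```python
# Sketch: standard earned-value indicators from PV, EV, and actual cost (AC).
# The example figures are made up for illustration.

def evm_indicators(pv, ev, ac):
    """Return schedule/cost variances and performance indices."""
    return {
        "SV": ev - pv,   # schedule variance (negative = behind schedule)
        "CV": ev - ac,   # cost variance (negative = over budget)
        "SPI": ev / pv,  # schedule performance index
        "CPI": ev / ac,  # cost performance index
    }

print(evm_indicators(pv=100_000, ev=80_000, ac=90_000))
# SV = -20000, CV = -10000, SPI = 0.8, CPI ≈ 0.89
```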
Burndown Chart http://en.wikipedia.org/wiki/Burn_down_chart
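As a rough illustration (not from the referenced page), the data behind a burndown chart is just the remaining work recorded each day, compared against an ideal straight-line burn; the numbers below are invented:

```python
# Sketch: data behind a sprint burndown chart (invented numbers).
# remaining[i] = story points still open at the end of day i.
sprint_days = 10
total_points = 40
remaining = [40, 38, 36, 36, 30, 27, 22, 18, 10, 4, 0]  # days 0..10

ideal = [total_points - total_points * d / sprint_days for d in range(sprint_days + 1)]

for day, (actual, planned) in enumerate(zip(remaining, ideal)):
    flag = "behind" if actual > planned else "on/ahead"
    print(f"day {day:2d}: remaining={actual:5.1f}  ideal={planned:5.1f}  ({flag})")
```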
Requirements Churn/ Requirements Creep/ Requirements Volatility number of changes to system requirements in each phase/week/increment
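A toy sketch of how requirements churn could be tallied per increment, assuming a change log of (requirement id, increment) records; the log below is invented:

```python
# Sketch: requirements churn = number of requirement changes per increment.
from collections import Counter

change_log = [  # invented change records
    ("REQ-12", "Inc1"), ("REQ-07", "Inc1"),
    ("REQ-12", "Inc2"), ("REQ-30", "Inc2"), ("REQ-07", "Inc2"),
    ("REQ-30", "Inc3"),
]

churn_per_increment = Counter(increment for _req, increment in change_log)
print(churn_per_increment)   # Counter({'Inc2': 3, 'Inc1': 2, 'Inc3': 1})
```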
Code Churn Software change history Large / recent changes Total added, modified and deleted LOC Number of times that a binary was edited Number of consecutive edits
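One common way to obtain these numbers is from the version-control history; the sketch below (an assumption, not part of the slides) sums added and deleted lines per file from `git log --numstat`:

```python
# Sketch: total code churn (added + deleted LOC) and edit count per file
# from git history, using "git log --numstat".
import subprocess
from collections import defaultdict

log = subprocess.run(
    ["git", "log", "--numstat", "--pretty=format:"],
    capture_output=True, text=True, check=True,
).stdout

churn = defaultdict(int)   # file -> added + deleted lines
edits = defaultdict(int)   # file -> number of commits touching it
for line in log.splitlines():
    parts = line.split("\t")
    # numstat lines look like "<added>\t<deleted>\t<path>"; binary files show "-".
    if len(parts) == 3 and parts[0].isdigit() and parts[1].isdigit():
        added, deleted, path = int(parts[0]), int(parts[1]), parts[2]
        churn[path] += added + deleted
        edits[path] += 1

for path, total in sorted(churn.items(), key=lambda kv: -kv[1])[:10]:
    print(f"{total:6d} changed lines in {edits[path]:3d} commits  {path}")
```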
Code Complexity Gathered from code itself Multiple complexity values Cyclomatic complexity Fan-In / Fan-Out of functions Lines of Code Weighted methods per class Depth of Inheritance Coupling between objects Number of subclasses Total global variables
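For instance, a rough cyclomatic-complexity estimate for Python source can be obtained by counting decision points in the AST (a simplification of McCabe's definition, shown only as a sketch):

```python
# Sketch: estimate cyclomatic complexity as 1 + number of decision points.
import ast

DECISION_NODES = (ast.If, ast.For, ast.While, ast.And, ast.Or,
                  ast.ExceptHandler, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, DECISION_NODES) for node in ast.walk(tree))

sample = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    for _ in range(x):
        pass
    return "positive"
"""
print(cyclomatic_complexity(sample))   # 4: base 1 + if + elif + for
```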
Code coverage Degree to which the source code is tested Statement coverage Has each node in the program been executed? Branch coverage Has each control structure been evaluated both to true and false?
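A small sketch of collecting statement and branch coverage programmatically, assuming the coverage.py package; the module and entry point names are hypothetical:

```python
# Sketch: measuring statement and branch coverage with coverage.py
# (pip install coverage).
import coverage

cov = coverage.Coverage(branch=True)   # branch=True adds branch coverage
cov.start()

import my_module           # hypothetical module under test
my_module.run_examples()    # hypothetical entry point exercising the code

cov.stop()
cov.save()
cov.report(show_missing=True)   # prints per-file statement/branch coverage
```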
Code Coverage http://www.firstlinesoftware.com/metrics_group2.html
A Code Coverage Report (JUnit): the code coverage tool instruments byte code with extra code to measure which statements are and are not reached; results are reported at line level and package level. http://www.cafeaulait.org/slides/albany/codecoverage/Measuring_JUnit_Code_Coverage.html
Defect reporting metric Can be categorized by Status Remaining / Resolved / Found Defect Sources Requirements / Design / Development Defect found Peer review / unit testing / sanity check Time Defect arrival rate / Defect age
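A small sketch of how such a categorized defect report could be tallied, assuming each defect record carries a status, a source, and a discovery activity (field names and records invented for illustration):

```python
# Sketch: tally defects by status, source, and where they were found.
from collections import Counter

defects = [  # invented defect records
    {"status": "Resolved",  "source": "Requirements", "found_in": "Peer review"},
    {"status": "Remaining", "source": "Design",       "found_in": "Unit testing"},
    {"status": "Remaining", "source": "Development",  "found_in": "Unit testing"},
    {"status": "Resolved",  "source": "Development",  "found_in": "Sanity check"},
]

for field in ("status", "source", "found_in"):
    print(field, dict(Counter(d[field] for d in defects)))
```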
Defect Status
Defect Density
Test Pass Coverage http://www.jrothman.com/Papers/QW96.html
Defect Density
Defect Per LOC
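Defect density is conventionally reported as defects per thousand source lines of code (KSLOC); a one-formula sketch with invented values:

```python
# Sketch: defect density = defects found / size in KSLOC (invented values).
defects_found = 87
size_sloc = 42_500

defect_density = defects_found / (size_sloc / 1000)   # defects per KSLOC
print(f"{defect_density:.2f} defects/KSLOC")           # 2.05 defects/KSLOC
```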
Developer Code Review After 60‒90 minutes, our ability to find defects drops off precipitously http://answers.oreilly.com/topic/2265-best-practices-for-developer-code-review/
As the size of the code under review increases, our ability to find all the defects decreases. Don’t review more than 400 lines of code at a time. http://answers.oreilly.com/topic/2265-best-practices-for-developer-code-review/
Top 6 Agile Metrics Ref: Measuring Agility, Peter Behrens
Velocity = Work Completed per sprint Ref: Measuring Agility, Peter Behrens
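Velocity is typically averaged over recent sprints and used to forecast how many sprints the remaining backlog will take; a rough sketch with invented numbers:

```python
# Sketch: average velocity and a simple completion forecast (invented data).
import math

completed_per_sprint = [23, 27, 25, 30, 26]   # story points finished per sprint
remaining_backlog = 140                       # story points left

velocity = sum(completed_per_sprint) / len(completed_per_sprint)
sprints_left = math.ceil(remaining_backlog / velocity)

print(f"average velocity: {velocity:.1f} points/sprint")   # 26.2
print(f"forecast: about {sprints_left} more sprints")       # 6
```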
Measurements at the organizational level Empirical analysis Change from the top
Richard W. Selby, Northrop Grumman Space Technology, ICSP '09 Title: "Synthesis, Analysis, and Modeling of Large-Scale Mission-Critical Embedded Software Systems"
Measurements for progress vs. predictions (Ref: Ebert and Dumke, 2007)
Project Management - Measurements: effort and budget tracking, requirements status, task status, top 10 risks; Predictions: cost to complete, schedule evolution
Quality Management - Measurements: code stability, open defects, review status and follow-up; Predictions: residual defects, reliability, customer satisfaction
Requirements Management - Measurements: analysis status, specification progress; Predictions: requirements volatility / completeness
Construction - Measurements: status of documents, change requests, review status, design progress of requirements; Predictions: time to complete
Test - Measurements: test progress (defects, coverage, efficiency, stability); Predictions: residual defects, reliability
Transition, deployment - Measurements: field performance (failures, corrections), maintenance effort; Predictions: maintenance effort
Recommended books Practical Software Measurement: Objective Information for Decision Makers by John McGarry, David Card, Cheryl Jones and Beth Layman (Oct 27, 2001) Software Measurement: Establish - Extract - Evaluate – Execute by Christof Ebert, Reiner Dumke (2010)
References
http://sunset.usc.edu/classes/cs577b_2001/metricsguide/metrics.html
Fenton, N.E., Software Metrics: A Rigorous Approach, Chapman and Hall, 1991.
http://www.stsc.hill.af.mil/resources/tech_docs/gsam3/chap13.pdf
Christof Ebert and Reiner Dumke, Software Measurement: Establish, Extract, Evaluate, Execute, Springer, 2007.
http://se.inf.ethz.ch/old/teaching/2010-S/0276/slides/kissling.pdf
Richard W. Selby, Northrop Grumman Space Technology, "Synthesis, Analysis, and Modeling of Large-Scale Mission-Critical Embedded Software Systems", ICSP '09.
Peter Behrens, Measuring Agility.
[Nikora 91] Nikora, Allen P., Error Discovery Rate by Severity Category and Time to Repair Software Failures for Three JPL Flight Projects, Software Product Assurance Section, Jet Propulsion Laboratory, 4800 Oak Grove Drive, Pasadena, CA 91109-8099, November 5, 1991.