Software Metrics and Measurements
Supannika Koolmanojwong
CS577
University of Southern California, Center for Systems and Software Engineering
Outline
- General Concepts about Metrics
- Examples of Metrics
- Agile Metrics
- Case Studies
  – Metrics from Empirical Data – Northrop Grumman
  – Metrics in IT Services – Lockheed Martin
Measurements in daily life

Why do we measure?
Objectives of software measurement
“You cannot control what you cannot measure.” – Tom DeMarco
“Not everything that counts can be counted. Not everything that is counted counts.” – Albert Einstein
Software Metrics
- Numerical data related to software development
- Strongly support software project management activities
- Can be directly observable quantities or derived from them
A simplified measurement information model (Ref: Ebert and Dumke 2007)
Diagram: information needs, objectives, and control drive measurements of attributes of the process and its work products; the measurement results become information products that support decisions and actions.
How are software measurements used?
- Understand and communicate
- Specify and achieve objectives
- Identify and resolve problems
- Decide and improve
Measurement Standards
How to do:
- ISO/IEC Software Life Cycle Processes
- ISO/IEC System Life Cycle Processes
- SWEBOK – Software Engineering Body of Knowledge
- PMBOK – Project Management Body of Knowledge
How to do better:
- CMMI – Capability Maturity Model Integration
- ISO Software Process Capability Determination
Objectives and adaptations:
- ISO 9001 Quality Management System
- ISO/IEC 9126 Software Product Quality
- TL 9000, AS 9100, etc.
How to measure what you are doing:
- ISO/IEC 15939:2002 Software Measurement Process
Ground rules for metrics
Metrics must be:
– Understandable to be useful
– Economical
– Field tested
– Highly leveraged
– Timely
– Able to give proper incentives for process improvement
– Evenly spaced throughout all phases of development
– Useful at multiple levels
Measurements for Senior Management
- Easy and reliable visibility of business performance
- Forecasts and indicators where action is needed
- Drill-down into underlying information and commitments
- Flexible refocusing of resources
Measurements for Project Management
- Immediate project reviews
- Status and forecasts for quality, schedule, and budget
- Follow-up action points
- Reports based on consistent raw data
Metrics for Senior Management
Project management supporting metrics
1. Planning – Metrics serve as a basis for cost estimating, training planning, resource planning, scheduling, and budgeting.
2. Organizing – Size and schedule metrics influence a project's organization.
3. Controlling – Metrics are used to track the status of software development activities and their compliance to plans.
4. Improving – Metrics are used as a tool for process improvement, to identify where improvement efforts should be concentrated, and to measure the effects of those efforts.
Measurements for Engineers
- Immediate access to team planning and progress
- Visibility into one's own performance and how it can be improved
- Indicators that show weak spots in deliverables
- Focus energy on software development
The E4 Measurement Process (Ref: Ebert and Dumke 2007)
Inputs: objectives and needs from the business process; environment and resources.
Steps: 1. Establish, 2. Extract, 3. Evaluate, 4. Execute.
Outputs: decisions, re-direction, updated plans.
Aggregation of information
Measurements are aggregated as they move up the organization:
- Cycle time, quality, cost, productivity, customer satisfaction, resources, skills
- Sales, cost reduction, innovative products, level of customization
- Cost reduction, sales, margins, customer service
- Cash flow, shareholder value, operations cost
SMART goals
- Specific – precise
- Measurable – tangible
- Accountable – in line with individual responsibilities
- Realistic – achievable
- Timely – suitable for the current needs
What do you want to measure?
- Processes – software-related activities
- Products – artifacts, deliverables, documents, capacity
- Resources – the items which are inputs to the process
Components of software measurements
Examples of Metrics
- Progress / effort / cost indicators
- Earned value management
- Requirements / code churn
- Defect-related metrics
- Test-related metrics
- System-related metrics
Size
- How big is the healthcare.gov website?
  – ons/million-lines-of-code/

Size
- Earth System Modeling Framework Project
Progress Indicator

Effort Indicator

Cost Indicator
Earned value management
- Planned Value (PV), or Budgeted Cost of Work Scheduled (BCWS)
- Earned Value (EV), or Budgeted Cost of Work Performed (BCWP)
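To make these concrete, here is a minimal sketch (not part of the original slides) of the standard earned-value indicators, assuming Actual Cost (AC, also called ACWP) is tracked alongside PV and EV; the dollar figures are invented.

```python
# Minimal earned-value sketch using the standard EVM formulas; inputs are invented.
def evm_indicators(pv, ev, ac):
    """Return schedule/cost variances and performance indices."""
    return {
        "SV": ev - pv,    # schedule variance (negative = behind schedule)
        "CV": ev - ac,    # cost variance (negative = over budget)
        "SPI": ev / pv,   # schedule performance index (< 1.0 = behind)
        "CPI": ev / ac,   # cost performance index (< 1.0 = over budget)
    }

if __name__ == "__main__":
    # e.g., $100K of work planned so far, $80K of it earned, $90K actually spent
    print(evm_indicators(pv=100_000, ev=80_000, ac=90_000))
```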
Burndown Chart
Requirements Churn / Requirements Creep / Requirements Volatility
- The number of changes to system requirements in each phase / week / increment
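One common way to turn that count into a rate (a sketch under my own assumptions, not the slide's formula) is to divide the added, modified, and deleted requirements in a period by the baselined total; the function name and counts below are hypothetical.

```python
# Hypothetical sketch: requirements volatility for one increment, computed as
# (added + modified + deleted requirements) / total baselined requirements.
def requirements_volatility(added, modified, deleted, total_baseline):
    if total_baseline == 0:
        return 0.0
    return (added + modified + deleted) / total_baseline

# e.g., 4 added + 6 modified + 2 deleted against a 120-requirement baseline -> 10%
print(f"{requirements_volatility(4, 6, 2, 120):.1%}")
```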
Code Churn
- Software change history
- Large / recent changes
- Total added, modified, and deleted LOC
- Number of times a binary was edited
- Number of consecutive edits
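As a sketch of how the added/deleted LOC totals might be pulled from change history, assuming the project lives in a local git repository (git's --numstat option prints tab-separated added/deleted counts per file); the time window is arbitrary.

```python
# Minimal code-churn sketch: total added/deleted LOC over a period, taken from
# local git history. Assumes git is on PATH and the script runs inside a repo.
import subprocess

def total_churn(since="3 months ago"):
    out = subprocess.run(
        ["git", "log", f"--since={since}", "--numstat", "--pretty=format:"],
        capture_output=True, text=True, check=True,
    ).stdout
    added = deleted = 0
    for line in out.splitlines():
        parts = line.split("\t")
        # numstat lines look like "<added>\t<deleted>\t<path>"; binary files
        # report "-" for the counts and are skipped by the isdigit() checks.
        if len(parts) == 3 and parts[0].isdigit() and parts[1].isdigit():
            added += int(parts[0])
            deleted += int(parts[1])
    return added, deleted

if __name__ == "__main__":
    added, deleted = total_churn()
    print(f"churn over the period: +{added} / -{deleted} LOC")
```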
Code Complexity
Gathered from the code itself; multiple complexity values:
- Cyclomatic complexity
- Fan-in / fan-out of functions
- Lines of code
- Weighted methods per class
- Depth of inheritance
- Coupling between objects
- Number of subclasses
- Total global variables
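To make the first item concrete: McCabe's cyclomatic complexity is E − N + 2P over the control-flow graph, which for structured code reduces to one plus the number of decision points. The sketch below (a rough approximation, not a real metrics tool) counts decision points in Python source with the standard ast module.

```python
# Rough sketch: approximate cyclomatic complexity of Python source as
# 1 + number of decision points, counted via the standard ast module.
import ast

DECISION_NODES = (ast.If, ast.IfExp, ast.For, ast.While,
                  ast.ExceptHandler, ast.And, ast.Or)

def cyclomatic_complexity(source: str) -> int:
    tree = ast.parse(source)
    decisions = sum(isinstance(node, DECISION_NODES) for node in ast.walk(tree))
    return decisions + 1

SAMPLE = """
def grade(score):
    if score >= 90:
        return "A"
    elif score >= 80:
        return "B"
    return "C"
"""
print(cyclomatic_complexity(SAMPLE))   # if + elif = 2 decisions -> complexity 3
```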
Code coverage
The degree to which the source code is exercised by tests.
- Statement coverage – has each statement (node) in the program been executed?
- Branch coverage – has each control structure been evaluated to both true and false?
- 80% coverage for all general code
- 90% coverage for critical software component items
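The percentages themselves are simple ratios; the sketch below shows the arithmetic with invented counts (in practice a coverage tool reports the covered/total figures).

```python
# Coverage arithmetic with invented counts: covered items divided by total items.
def coverage_pct(covered: int, total: int) -> float:
    return 100.0 * covered / total if total else 100.0

statements_covered, statements_total = 412, 500
branches_covered, branches_total = 150, 200

print(f"statement coverage: {coverage_pct(statements_covered, statements_total):.1f}%")  # 82.4%
print(f"branch coverage:    {coverage_pct(branches_covered, branches_total):.1f}%")      # 75.0%
```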
JUnit Code Coverage
A coverage tool instruments byte code with extra code to measure which statements are and are not reached.
Example report views: package-level coverage and line-level coverage.
Defect reporting metrics
Can be categorized by:
– Status: remaining / resolved / found
– Defect source: requirements / design / development
– How the defect was found: peer review / unit testing / sanity check
– Time: defect arrival rate / defect age
Defect Status

Defect Density

Test Pass Coverage

Defect Per LOC
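As a small worked example (counts invented), defect density is commonly reported as defects per KSLOC, i.e. defects divided by thousands of source lines of code:

```python
# Hypothetical example: defect density in defects per KSLOC.
def defect_density(defects: int, sloc: int) -> float:
    return defects / (sloc / 1000.0)

# e.g., 45 defects found in a 30,000-SLOC component -> 1.5 defects/KSLOC
print(defect_density(45, 30_000))
```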
Developer Code Review
- After 60–90 minutes, our ability to find defects drops off precipitously.
- As the size of the code under review increases, our ability to find all the defects decreases. Don't review more than 400 lines of code at a time.
Memory and CPU utilization
- CPU utilization and memory utilization (%)
Network performance
Website / business-process-related metrics
- Number of visitors: new / unique / repeating
- Referrals: where do visitors come from? (blog, search engine)
- Bounce rate: visitors who immediately click the back button
- Top pages, average time on page
Storage and bandwidth
- Bandwidth: rate of data movement per unit time (KB/s, Gbit/s)
- Throughput: work units per unit time (# accesses/s, IO/s, requests/s)
- Throughput applies to both the PRODUCT and the PROCESS.
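A tiny numeric illustration of the distinction (all numbers invented): bandwidth counts data moved per second, throughput counts completed work units per second.

```python
# Invented numbers: the same measurement window yields both a bandwidth and a
# throughput figure, but they answer different questions.
transferred_mb = 600        # data moved during the window (MB)
requests_served = 9_000     # completed requests in the same window
window_s = 120              # window length in seconds

bandwidth_mb_s = transferred_mb / window_s      # 5.0 MB/s
throughput_req_s = requests_served / window_s   # 75 requests/s
print(bandwidth_mb_s, throughput_req_s)
```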
Top 6 Agile Metrics
(Ref: Measuring Agility, Peter Behrens)
Velocity = work completed per sprint
(Ref: Measuring Agility, Peter Behrens)
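A minimal sketch (story-point figures invented) of how velocity is typically computed and then used to forecast how many sprints the remaining backlog will take:

```python
# Velocity as average completed story points per sprint, plus a naive forecast
# of the remaining backlog. All figures are invented.
import math

completed_points = [21, 18, 24, 20]     # points completed in the last four sprints
velocity = sum(completed_points) / len(completed_points)   # 20.75 points/sprint

remaining_backlog = 125                 # story points still in the backlog
sprints_needed = math.ceil(remaining_backlog / velocity)   # 7

print(f"average velocity: {velocity:.2f} points/sprint; "
      f"forecast: {sprints_needed} more sprints")
```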
Case studies
- Metrics from empirical data – Northrop Grumman
- Metrics in IT services – Lockheed Martin
Richard W. Selby, Northrop Grumman Space Technology, ICSP '09: "Synthesis, Analysis, and Modeling of Large-Scale Mission-Critical Embedded Software Systems"
Case studies: Lockheed Martin
Consolidated Information Technology Infrastructure Contract (CITIC)
Customer: Centers for Medicare & Medicaid Services (CMS)
– Day-to-day IT Operations & Maintenance (O&M) functions (high availability, 24/7)
  - Service Desk, Mainframe Support (Tier 1), Mid-Tier Support (Tier 2), Desktop Support (Tier 3), Voice, Data, NOC, etc.
  - Performance, Request, Incident, Configuration, Asset, and Change management
  - Security and Disaster Recovery
– Modernization of the IT infrastructure
Service Measurement Plans & SLAs
Measurements for progress vs. predictions (Ref: Ebert and Dumke, 2007)

Project Management
  For measurements: effort and budget tracking; requirements status; task status; top 10 risks
  For predictions: cost to complete; schedule evolution

Quality Management
  For measurements: code stability; open defects; review status and follow-up
  For predictions: residual defects; reliability; customer satisfaction

Requirements Management
  For measurements: analysis status; specification progress
  For predictions: requirements volatility / completeness

Construction
  For measurements: status of documents; change requests; review status; design progress of requirements
  For predictions: cost to complete; time to complete

Test
  For measurements: test progress (defects, coverage, efficiency, stability)
  For predictions: residual defects; reliability

Transition, deployment
  For measurements: field performance (failures, corrections); maintenance effort
  For predictions: reliability; maintenance effort
Recommended books
- Software Measurement: Establish – Extract – Evaluate – Execute, by Christof Ebert and Reiner Dumke (2010)
- Practical Software Measurement: Objective Information for Decision Makers, by John McGarry, David Card, Cheryl Jones, and Beth Layman (2001)
References
- Fenton, N.E., Software Metrics: A Rigorous Approach, Chapman and Hall.
- Ebert, C., and Dumke, R., Software Measurement: Establish, Extract, Evaluate, Execute, Springer, 2007.
- Selby, R.W., Northrop Grumman Space Technology, "Synthesis, Analysis, and Modeling of Large-Scale Mission-Critical Embedded Software Systems," ICSP '09.
- Behrens, P., Measuring Agility.
- [Nikora 91] Nikora, A.P., Error Discovery Rate by Severity Category and Time to Repair Software Failures for Three JPL Flight Projects, Software Product Assurance Section, Jet Propulsion Laboratory, 4800 Oak Grove Drive, Pasadena, CA, November 5, 1991.