Software Metrics and Measurements
Supannika Koolmanojwong
University of Southern California, Center for Systems and Software Engineering, CS577
Outline
- General concepts about metrics
- Examples of metrics
- Agile metrics
- Case studies: metrics from empirical data (Northrop Grumman); metrics in IT services (Lockheed Martin)
Measurements in daily life
Why do we measure?
The ABCs of software metrics: http://www.teamqualitypro.com/software-metrics/do-you-know-your-abcs-of-software-metrics/
Objectives of software measurement
"You cannot control what you cannot measure." - Tom DeMarco
"Not everything that counts can be counted. Not everything that is counted counts." - Albert Einstein
Software metrics
- Numerical data related to software development
- Strongly support software project management activities
- Can be directly observable quantities or derived from them
A simplified measurement information model (Ebert and Dumke, 2007): information needs and objectives drive the measurement of attributes of processes and work products; measurement results become information products that feed decisions and actions, which in turn control the process.
How are software measurements used?
- Understand and communicate
- Specify and achieve objectives
- Identify and resolve problems
- Decide and improve
Measurement standards (covering objectives, how to do it, and how to do it better):
- ISO/IEC 12207 Software Life Cycle Processes
- ISO/IEC 15288 System Life Cycle Processes
- SWEBOK (Software Engineering Body of Knowledge)
- PMBOK (Project Management Body of Knowledge)
- CMMI (Capability Maturity Model Integration)
- ISO/IEC 15504 Software Process Capability Determination
- ISO 9001 Quality Management System
- ISO/IEC 9126 Software Product Quality
- TL 9000, AS 9100, etc.
ISO/IEC 15939:2002 Software Measurement Process addresses how to measure what you are doing.
Ground rules for metrics
Metrics must be:
- Understandable, to be useful
- Economical
- Field-tested
- Highly leveraged
- Timely
- Structured to give proper incentives for process improvement
- Evenly spaced throughout all phases of development
- Useful at multiple levels
http://www.stsc.hill.af.mil/resources/tech_docs/gsam3/chap13.pdf
Measurements for senior management
- Easy and reliable visibility into business performance
- Forecasts and indicators of where action is needed
- Drill-down into underlying information and commitments
- Flexible refocusing of resources
Measurements for project management
- Immediate project reviews
- Status and forecasts for quality, schedule, and budget
- Follow-up action points
- Reports based on consistent raw data
Metrics for senior management
Project-management activities supported by metrics
1. Planning: metrics serve as a basis for cost estimation, training planning, resource planning, scheduling, and budgeting.
2. Organizing: size and schedule metrics influence a project's organization.
3. Controlling: metrics are used to track software development activities for compliance with plans.
4. Improving: metrics are used as a tool for process improvement, to identify where improvement efforts should be concentrated, and to measure the effects of those efforts.
Measurements for engineers
- Immediate access to team planning and progress
- Visibility into one's own performance and how it can be improved
- Indicators that show weak spots in deliverables
- Freedom to focus energy on software development
Example: customized reports in IBM Rational Team Concert: http://www.ibm.com/developerworks/rational/library/customized-reports-rational-team-concert/
The E4 measurement process (Ebert and Dumke, 2007): starting from business objectives and needs and from the process environment and resources, measurement proceeds through four steps: 1. Establish, 2. Extract, 3. Evaluate, 4. Execute. The outputs are decisions, re-direction, and updated plans.
Aggregation of information: project-level measurements (cycle time, quality, cost, productivity, customer satisfaction, resources, skills) aggregate into product indicators (sales, cost reduction, innovative products, level of customization), then into business-unit indicators (cost reduction, sales, margins, customer service), and finally into enterprise indicators (cash flow, shareholder value, operations cost).
SMART goals
- Specific: precise
- Measurable: tangible
- Accountable: in line with individual responsibilities
- Realistic: achievable
- Timely: suitable for the current needs
What do you want to measure?
- Processes: software-related activities
- Products: artifacts, deliverables, documents, capacity
- Resources: the items that are inputs to the processes
Components of software measurements
Examples of metrics
- Progress, effort, and cost indicators
- Earned value management
- Requirements and code churn
- Defect-related metrics
- Test-related metrics
- System-related metrics
Size
How big is the healthcare.gov website? http://www.informationisbeautiful.net/visualizations/million-lines-of-code/
Size: Earth System Modeling Framework project SLOC counts: http://www.earthsystemmodeling.org/metrics/sloc.shtml
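Size is most often reported as source lines of code (SLOC). As a minimal sketch of what such a count involves (real counters such as cloc or USC's CodeCount handle block comments, strings, and per-language rules far more carefully), the hypothetical Python counter below tallies non-blank, non-comment physical lines:

```python
# Minimal SLOC counter: non-blank, non-comment physical lines.
# Only single-line comments are recognized -- a deliberate simplification.
import os

def sloc(path, comment_prefix="#"):
    count = 0
    with open(path, encoding="utf-8", errors="ignore") as f:
        for line in f:
            stripped = line.strip()
            if stripped and not stripped.startswith(comment_prefix):
                count += 1
    return count

def project_sloc(root, ext=".py"):
    # Walk the tree and sum SLOC over all files with the given extension.
    total = 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if name.endswith(ext):
                total += sloc(os.path.join(dirpath, name))
    return total
```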
Progress indicator
Effort indicator
Cost indicator
Earned value management
- Planned Value (PV), also called Budgeted Cost of Work Scheduled (BCWS)
- Earned Value (EV), also called Budgeted Cost of Work Performed (BCWP)
http://en.wikipedia.org/wiki/Earned_value_management
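Alongside PV and EV, standard earned-value practice also tracks Actual Cost (AC, also called ACWP); the three combine into variances and indices: SV = EV - PV, CV = EV - AC, SPI = EV/PV, CPI = EV/AC. A small illustrative calculation (all figures hypothetical):

```python
# Illustrative earned-value calculation (all figures hypothetical).
pv = 100_000  # Planned Value (BCWS): budgeted cost of work scheduled to date
ev = 80_000   # Earned Value (BCWP): budgeted cost of work actually performed
ac = 90_000   # Actual Cost (ACWP): what the performed work actually cost

sv = ev - pv   # schedule variance: negative means behind schedule
cv = ev - ac   # cost variance: negative means over budget
spi = ev / pv  # schedule performance index: < 1.0 means behind
cpi = ev / ac  # cost performance index: < 1.0 means over budget

print(f"SV={sv:+}, CV={cv:+}, SPI={spi:.2f}, CPI={cpi:.2f}")
# SV=-20000, CV=-10000, SPI=0.80, CPI=0.89
```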
Burndown chart: work remaining versus time: http://en.wikipedia.org/wiki/Burn_down_chart
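A burndown chart plots remaining work against an ideal straight-line trend. A minimal sketch of the underlying data, using a hypothetical 10-day, 40-point sprint:

```python
# Burndown data for a hypothetical sprint: 10 working days, 40 story points.
total_points, days = 40, 10
ideal = [total_points - total_points * d / days for d in range(days + 1)]

# Actual points remaining at the end of each day (illustrative values).
actual = [40, 38, 36, 35, 30, 28, 26, 20, 14, 6, 0]

for day, (i, a) in enumerate(zip(ideal, actual)):
    trend = "behind" if a > i else "on/ahead"
    print(f"day {day:2}: ideal {i:5.1f}, actual {a:3}  ({trend})")
```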
Requirements churn / requirements creep / requirements volatility: the number of changes to system requirements in each phase, week, or increment.
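Definitions vary by organization; one common formulation expresses volatility as the ratio of changed requirements to the baselined total. A sketch with hypothetical numbers:

```python
# One common formulation of requirements volatility (definitions vary):
# changes relative to the size of the requirements baseline.
def requirements_volatility(added, deleted, modified, baseline_total):
    return (added + deleted + modified) / baseline_total

# Hypothetical increment: 120 baseline requirements;
# 6 added, 2 deleted, 10 modified during the increment.
print(f"{requirements_volatility(6, 2, 10, 120):.1%}")  # 15.0%
```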
Code churn
- Software change history
- Large / recent changes
- Total added, modified, and deleted LOC
- Number of times a binary was edited
- Number of consecutive edits
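One way to extract churn from a project's change history is to sum the added and deleted line counts that `git log --numstat` reports per file. A rough sketch (run inside a Git repository; the "top ten" cutoff is illustrative):

```python
# Sketch: total added/deleted LOC per file from git history, using
# `git log --numstat`. Binary files show "-" counts and are skipped.
import subprocess
from collections import defaultdict

out = subprocess.run(
    ["git", "log", "--numstat", "--format="],  # empty format: numstat lines only
    capture_output=True, text=True, check=True,
).stdout

churn = defaultdict(lambda: [0, 0])  # path -> [added, deleted]
for line in out.splitlines():
    parts = line.split("\t")
    if len(parts) == 3 and parts[0].isdigit() and parts[1].isdigit():
        added, deleted, path = int(parts[0]), int(parts[1]), parts[2]
        churn[path][0] += added
        churn[path][1] += deleted

# Ten highest-churn files: a rough "large/recent changes" indicator.
for path, (a, d) in sorted(churn.items(), key=lambda kv: -sum(kv[1]))[:10]:
    print(f"{a + d:6}  (+{a}/-{d})  {path}")
```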
Code complexity: gathered from the code itself; multiple complexity values exist:
- Cyclomatic complexity
- Fan-in / fan-out of functions
- Lines of code
- Weighted methods per class
- Depth of inheritance
- Coupling between objects
- Number of subclasses
- Total global variables
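Cyclomatic complexity counts independent paths through a function (McCabe's E - N + 2P); for structured code it equals the number of branch points plus one. A rough, hypothetical estimator for Python source, counting a simplified set of branching constructs (production tools such as radon or lizard are more thorough):

```python
import ast

# Simplified set of branching constructs; real tools also weight
# boolean operators per operand, comprehensions, and more.
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp)

def cyclomatic_complexity(source):
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, BRANCH_NODES) for node in ast.walk(tree))

code = '''
def classify(x):
    if x < 0:
        return "negative"
    for d in str(x):
        if d == "7":
            return "lucky"
    return "plain"
'''
print(cyclomatic_complexity(code))  # 4: one base path plus three branch points
```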
Code coverage: the degree to which the source code is exercised by tests
- Statement coverage: has each node (statement) in the program been executed?
- Branch coverage: has each control structure been evaluated to both true and false?
Typical targets: 80% coverage for all general code; 90% coverage for critical software components.
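Both measures reduce to a ratio of exercised items to total items. A tiny sketch of statement coverage, assuming the executable/executed line sets come from an instrumenting tool such as coverage.py (the numbers here are hypothetical):

```python
# Statement coverage as a ratio of executed to executable statements.
def statement_coverage(executable_lines, executed_lines):
    return len(executed_lines & executable_lines) / len(executable_lines)

executable = {1, 2, 3, 5, 6, 8, 9, 10}  # lines that could execute (hypothetical)
executed = {1, 2, 3, 5, 6, 8}           # lines the test run actually hit
print(f"{statement_coverage(executable, executed):.0%}")
# 75% -- short of the 80% target for general code
```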
Code coverage report example: http://www.firstlinesoftware.com/metrics_group2.html
JUnit code coverage: the coverage tool instruments bytecode with extra code to record which statements are and are not reached. Reports can be viewed at the package level and at the line level. http://www.cafeaulait.org/slides/albany/codecoverage/Measuring_JUnit_Code_Coverage.html
Defect reporting metrics can be categorized by:
- Status: remaining / resolved / found
- Defect source: requirements / design / development
- Where found: peer review / unit testing / sanity check
- Time: defect arrival rate / defect age
Defect status
Defect density
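Defect density is conventionally reported as defects per thousand lines of code (KLOC). A one-function sketch with hypothetical figures:

```python
# Defect density in defects per KLOC (thousand lines of code);
# the defect count and code size below are hypothetical.
def defect_density(defects, loc):
    return defects / (loc / 1000.0)

print(f"{defect_density(42, 28_000):.2f} defects/KLOC")  # 1.50 defects/KLOC
```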
Test pass coverage: http://www.jrothman.com/Papers/QW96.html
Defect density
Defects per LOC
Developer code review: after 60 to 90 minutes, our ability to find defects drops off precipitously. As the size of the code under review increases, our ability to find all the defects decreases; don't review more than 400 lines of code at a time. http://answers.oreilly.com/topic/2265-best-practices-for-developer-code-review/
Resource utilization: CPU utilization and memory utilization (%). http://docs.aws.amazon.com/AmazonECS/latest/developerguide/viewing_cloudwatch_metrics.html
Network performance: http://www.sqlshack.com/sql-server-network-performance-metrics-important-metrics/
Website / business-process-related metrics
- Number of visitors: new / unique / repeating
- Referrals: where do they come from? (blog, search engine)
- Bounce rate: visitors who immediately click the back button
- Top pages, average time on page
Storage and bandwidth
- Bandwidth: rate of data movement per unit time (KB/s, Gbit/s)
- Throughput: work units per unit time (accesses/s, IO/s, requests/s), measurable for both the product and the process
http://hssl.cs.jhu.edu/~randal/419/lectures/L2.Metrics.pdf
https://itopskanban.wordpress.com/metrics/
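The distinction is between bytes moved and work completed. A small illustrative calculation (all figures hypothetical):

```python
# Bandwidth measures bytes moved per second; throughput measures work
# units (here, requests) completed per second.
bytes_moved = 512 * 1024 * 1024  # 512 MiB transferred
requests_served = 40_000         # completed requests
window_s = 60                    # over a one-minute window

bandwidth_mib_s = bytes_moved / (1024 * 1024) / window_s
throughput_rps = requests_served / window_s
print(f"{bandwidth_mib_s:.1f} MiB/s, {throughput_rps:.0f} requests/s")
# 8.5 MiB/s, 667 requests/s
```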
Top 6 agile metrics (ref: Measuring Agility, Peter Behrens)
Velocity = work completed per sprint (ref: Measuring Agility, Peter Behrens)
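Teams typically average velocity over recent sprints and divide the remaining backlog by it to forecast completion. A sketch with hypothetical numbers:

```python
# Velocity as a planning signal: average completed story points per
# sprint, then a rough forecast of sprints remaining for the backlog.
import math

completed = [21, 25, 19, 24, 23]  # points completed in recent sprints
backlog_remaining = 130           # points left in the backlog

velocity = sum(completed) / len(completed)
sprints_left = math.ceil(backlog_remaining / velocity)
print(f"velocity {velocity:.1f} pts/sprint, ~{sprints_left} sprints remaining")
# velocity 22.4 pts/sprint, ~6 sprints remaining
```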
Case studies
- Metrics from empirical data: Northrop Grumman
- Metrics in IT services: Lockheed Martin
Richard W. Selby, Northrop Grumman Space Technology, ICSP '09: "Synthesis, Analysis, and Modeling of Large-Scale Mission-Critical Embedded Software Systems"
Case study: Lockheed Martin Consolidated Information Technology Infrastructure Contract (CITIC)
Customer: Centers for Medicare & Medicaid Services (CMS)
- Day-to-day IT Operations & Maintenance (O&M) functions (high availability, 24/7): Service Desk, Mainframe Support (Tier 1), Mid-Tier Support (Tier 2), Desktop Support (Tier 3), Voice, Data, NOC, etc.; Performance, Request, Incident, Configuration, Asset, and Change management; Security and Disaster Recovery
- Modernization of the IT infrastructure
http://www.psmsc.com/UsersGroup2012.asp
Service measurement plans & SLAs: http://www.psmsc.com/UsersGroup2012.asp
Measurements for progress vs. predictions (Ebert and Dumke, 2007)
- Project management. For measurements: effort and budget tracking, requirements status, task status, top 10 risks. For predictions: cost to complete, schedule evolution.
- Quality management. For measurements: code stability, open defects, review status and follow-up. For predictions: residual defects, reliability, customer satisfaction.
- Requirements management. For measurements: analysis status, specification progress. For predictions: requirements volatility / completeness.
- Construction. For measurements: status of documents, change requests, review status, design progress of requirements. For predictions: cost to complete, time to complete.
- Test. For measurements: test progress (defects, coverage, efficiency, stability). For predictions: residual defects, reliability.
- Transition and deployment. For measurements: field performance (failures, corrections), maintenance effort. For predictions: reliability, maintenance effort.
Recommended books
- Christof Ebert and Reiner Dumke, Software Measurement: Establish - Extract - Evaluate - Execute, 2010.
- John McGarry, David Card, Cheryl Jones, and Beth Layman, Practical Software Measurement: Objective Information for Decision Makers, 2001.
References
- http://sunset.usc.edu/classes/cs577b_2001/metricsguide/metrics.html
- Fenton, N.E., Software Metrics: A Rigorous Approach, Chapman and Hall, 1991.
- http://www.stsc.hill.af.mil/resources/tech_docs/gsam3/chap13.pdf
- Christof Ebert and Reiner Dumke, Software Measurement: Establish, Extract, Evaluate, Execute, Springer, 2007.
- http://se.inf.ethz.ch/old/teaching/2010-S/0276/slides/kissling.pdf
- Richard W. Selby, Northrop Grumman Space Technology, ICSP '09: "Synthesis, Analysis, and Modeling of Large-Scale Mission-Critical Embedded Software Systems"
- Peter Behrens, Measuring Agility.
- [Nikora 91] Nikora, Allen P., Error Discovery Rate by Severity Category and Time to Repair Software Failures for Three JPL Flight Projects, Software Product Assurance Section, Jet Propulsion Laboratory, Pasadena, CA, November 5, 1991.