Software Metrics and Measurements

1 Software Metrics and Measurements
Supannika Koolmanojwong CS510

2 Outline
General Concepts about Metrics
Examples of Metrics
Agile Metrics
Metrics from Empirical Data

3 Measurements in daily life

4 Why do we measure?

5 Objectives of software measurement
“You cannot control what you cannot measure.” – Tom DeMarco
“Not everything that counts can be counted, and not everything that can be counted counts.” – Albert Einstein

6 Software Metrics
Numerical data related to software development
Strongly support software project management activities
Can be directly observable quantities or derived from them

7 A simplified measurement information model
[Diagram: information needs, objectives, and control attributes drive measurement of process attributes and work products; measurement results feed information products that support decisions and actions.] Ref: Ebert and Dumke 2007

8 How are software measurements used?
Understand and communicate
Specify and achieve objectives
Identify and resolve problems
Decide and improve

9 Measurement Standards
How to do:
ISO/IEC 12207 Software Life Cycle Processes
ISO/IEC 15288 System Life Cycle Processes
SWEBOK Software Engineering Body of Knowledge
PMBOK Project Management Body of Knowledge
How to do better:
CMMI Capability Maturity Model Integration
ISO 15504 Software Process Capability Determination
ISO 9001 Quality Management System
ISO/IEC 9126 Software Product Quality
TL 9000, AS 9100, etc. (objectives adaptations)
How to measure what you are doing:
ISO/IEC 15939:2002 Software Measurement Process

10 Ground rules for metrics
Metrics must be:
Understandable to be useful
Economical
Field tested
Highly leveraged
Timely
Must give proper incentives for process improvement
Evenly spaced throughout all phases of development
Useful at multiple levels

11 Measurements for Senior Management
Easy and reliable visibility into business performance
Forecasts and indicators where action is needed
Drill-down into underlying information and commitments
Flexible refocusing of resources

12 Measurements for Project Management
Immediate project reviews
Status and forecasts for quality, schedule, and budget
Follow-up action points
Reports based on consistent raw data

13 Project management supporting metrics
Planning - Metrics serve as a basis for cost estimating, training planning, resource planning, scheduling, and budgeting.
Organizing - Size and schedule metrics influence a project's organization.
Controlling - Metrics are used to track software development activities and assess compliance with plans.
Improving - Metrics are used as a tool for process improvement, to identify where improvement efforts should be concentrated, and to measure the effects of those efforts.

14 Measurements for Engineers
Immediate access to team planning and progress
Visibility into their own performance and how it can be improved
Indicators that show weak spots in deliverables
Focus energy on software development

15 The E4-Measurement Process
[Diagram: the measurement process cycles through 1. Establish, 2. Extract, 3. Evaluate, 4. Execute, driven by objectives and needs from the business process, environment, and resources, and feeding back decisions, re-direction, and updated plans.] Ref: Ebert and Dumke 2007

16 Aggregation of information
Enterprise - Cash flow, shareholder value, operations cost
Division - Cost reduction, sales, margins, customer service
Product Line/Department - Sales, cost reduction, innovative products, level of customization
Projects - Cycle time, quality, cost, productivity, customer satisfaction, resources, skills

17 SMART goals
Specific - precise
Measurable - tangible
Accountable - in line with individual responsibilities
Realistic - achievable
Timely - suitable for the current needs

18 What do you want to measure?
Processes - software-related activities
Products - artifacts, deliverables, documents
Resources - the items that are inputs to the process

19 Components of software measurements

20 Examples of Metrics
Progress / effort / cost indicators
Earned value management
Requirements / code churn
Defect-related metrics
Test-related metrics

21 Size How big is the healthcare.gov website?

22 Size Earth System Modeling Framework Project

23 Progress Indicator

24 Effort Indicator

25 Cost Indicator

26 Earned value management
Planned Value (PV), or Budgeted Cost of Work Scheduled (BCWS)
Earned Value (EV), or Budgeted Cost of Work Performed (BCWP)
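A minimal sketch of the schedule-side calculations derived from these two quantities (SV and SPI are standard EVM formulas, not shown on the slide; the cost-side formulas also need Actual Cost, which is omitted here):

    # Schedule variance and schedule performance index from PV and EV.
    def schedule_variance(ev: float, pv: float) -> float:
        """SV = EV - PV; negative means behind schedule."""
        return ev - pv

    def schedule_performance_index(ev: float, pv: float) -> float:
        """SPI = EV / PV; below 1.0 means behind schedule."""
        return ev / pv

    print(schedule_variance(40.0, 50.0))           # -10.0
    print(schedule_performance_index(40.0, 50.0))  # 0.8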

27 Burndown Chart
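A burndown chart plots remaining work against time, usually alongside an ideal straight-line burn. A minimal sketch of the underlying series, assuming story points as the unit of work (all numbers illustrative):

    # Remaining story points per day vs. the ideal linear burndown.
    total_points = 40
    sprint_days = 10
    completed_per_day = [0, 4, 4, 0, 6, 5, 3, 4, 6, 5]  # hypothetical

    remaining = total_points
    for day, done in enumerate(completed_per_day, start=1):
        remaining -= done
        ideal = total_points * (1 - day / sprint_days)
        print(f"day {day}: actual {remaining}, ideal {ideal:.1f}")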

28 Requirements Churn / Requirements Creep / Requirements Volatility
Number of changes to system requirements in each phase, week, or increment
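A minimal sketch of tracking that count, assuming change records tagged with the phase in which each change occurred (the record format is hypothetical):

    # Count requirement changes per phase from tagged change records.
    from collections import Counter

    changes = [  # (phase, change description) - hypothetical records
        ("inception", "add REQ-12"),
        ("elaboration", "modify REQ-03"),
        ("elaboration", "delete REQ-07"),
        ("construction", "modify REQ-12"),
    ]

    churn_per_phase = Counter(phase for phase, _ in changes)
    print(churn_per_phase)  # Counter({'elaboration': 2, ...})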

29 Code Churn
Derived from the software change history (large / recent changes)
Total added, modified, and deleted LOC
Number of times a binary was edited
Number of consecutive edits
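One common way to extract churn is from version-control history; a minimal sketch that totals added plus deleted LOC per file from "git log --numstat" output (run inside a git working copy; binary files, which git reports with "-" counts, are skipped):

    # Total added + deleted LOC per file from the git change history.
    import subprocess
    from collections import defaultdict

    out = subprocess.run(
        ["git", "log", "--numstat", "--pretty=format:"],
        capture_output=True, text=True, check=True,
    ).stdout

    churn = defaultdict(int)
    for line in out.splitlines():
        parts = line.split("\t")  # numstat fields: added, deleted, path
        if len(parts) == 3 and parts[0].isdigit() and parts[1].isdigit():
            added, deleted, path = parts
            churn[path] += int(added) + int(deleted)

    for path, loc in sorted(churn.items(), key=lambda kv: -kv[1])[:10]:
        print(f"{loc:6d}  {path}")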

30 Code Complexity
Gathered from the code itself; multiple complexity values:
Cyclomatic complexity
Fan-in / fan-out of functions
Lines of code
Weighted methods per class
Depth of inheritance
Coupling between objects
Number of subclasses
Total global variables
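Cyclomatic complexity can be approximated as 1 plus the number of decision points. A rough sketch using Python's ast module (it counts only the branching constructs listed below, so it is an approximation rather than a full implementation):

    # Approximate cyclomatic complexity: 1 + number of decision points.
    import ast

    BRANCH_NODES = (ast.If, ast.For, ast.While, ast.BoolOp, ast.ExceptHandler)

    def cyclomatic_complexity(source: str) -> int:
        tree = ast.parse(source)
        return 1 + sum(isinstance(n, BRANCH_NODES) for n in ast.walk(tree))

    sample = (
        "def classify(x):\n"
        "    if x < 0:\n"
        "        return 'negative'\n"
        "    if x == 0 or x == 10:\n"
        "        return 'boundary'\n"
        "    return 'positive'\n"
    )
    print(cyclomatic_complexity(sample))  # 4: base 1 + two ifs + one 'or'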

31 Code Coverage
The degree to which the source code is tested
Statement coverage - has each statement in the program been executed?
Branch coverage - has each control structure been evaluated to both true and false?
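The distinction matters because a test suite can execute every statement without ever taking every branch. An illustrative example (function and tests are hypothetical):

    # Statement coverage vs. branch coverage on a toy function.
    def safe_div(a, b):
        result = 0.0
        if b != 0:
            result = a / b
        return result

    # safe_div(6, 3) alone executes every statement (100% statement
    # coverage), but the 'if' is never evaluated to false. Full branch
    # coverage also requires a call such as safe_div(6, 0).
    assert safe_div(6, 3) == 2.0
    assert safe_div(6, 0) == 0.0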

32 Code Coverage

33 Package-Level Coverage
A JUnit code coverage tool instruments byte code with extra code to measure which statements are and are not reached.
[Screenshots: a code coverage report at line level and at package level.]

34 Defect reporting metrics
Can be categorized by:
Status - remaining / resolved / found
Defect source - requirements / design / development
Where found - peer review / unit testing / sanity check
Time - defect arrival rate / defect age
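A minimal sketch of a defect record carrying these categorization axes (field names and values are illustrative):

    # Defect record with the categorization fields from the slide.
    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class Defect:
        status: str    # "remaining" / "resolved"
        source: str    # "requirements" / "design" / "development"
        found_by: str  # "peer review" / "unit testing" / "sanity check"
        found_on: date

    d = Defect("resolved", "design", "peer review", date(2014, 3, 1))
    age_days = (date(2014, 3, 20) - d.found_on).days  # defect age
    print(d.status, d.source, age_days)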

35 Defect Status

36 Defect Density
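Defect density is typically computed as defects found per unit of size; a minimal sketch, assuming size is measured in LOC and reported per KLOC:

    # Defect density in defects per KLOC (thousand lines of code).
    def defect_density(defects_found: int, size_loc: int) -> float:
        return defects_found / (size_loc / 1000)

    print(defect_density(30, 15000))  # 2.0 defects per KLOC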

37 Test Pass Coverage

38 Defect Density

39 Defects per LOC

40 Developer Code Review
After 60-90 minutes, our ability to find defects drops off precipitously.

41 As the size of the code under review increases, our ability to find all the defects decreases. Don’t review more than 400 lines of code at a time.
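A minimal sketch of applying that limit by splitting a set of changed files into review batches of at most 400 changed lines (the file list is hypothetical):

    # Group changed files into review batches of <= 400 changed LOC.
    MAX_REVIEW_LOC = 400

    changed = [("parser.py", 220), ("lexer.py", 150),
               ("ast.py", 310), ("main.py", 90)]  # (file, changed LOC)

    batches, current, loc = [], [], 0
    for name, n in changed:
        if current and loc + n > MAX_REVIEW_LOC:
            batches.append(current)
            current, loc = [], 0
        current.append(name)
        loc += n
    if current:
        batches.append(current)

    print(batches)  # [['parser.py', 'lexer.py'], ['ast.py', 'main.py']]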

42 Top 6 Agile Metrics Ref: Measuring Agility, Peter Behrens

43 Velocity = Work Completed per sprint
Ref: Measuring Agility, Peter Behrens
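A minimal sketch: velocity is the story points completed in each sprint, and a rolling average is often used to forecast capacity (numbers illustrative):

    # Velocity per sprint plus a 3-sprint rolling average.
    completed = [18, 22, 25, 21, 27]  # story points completed per sprint

    for i, v in enumerate(completed, start=1):
        window = completed[max(0, i - 3):i]
        avg = sum(window) / len(window)
        print(f"sprint {i}: velocity {v}, rolling avg {avg:.1f}")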

45 Measurements at the organizational level
Empirical analysis
Change from the top

46 Richard W. Selby, Northrop Grumman Space Technology, ICSP '09: "Synthesis, Analysis, and Modeling of Large-Scale Mission-Critical Embedded Software Systems"

49 Measurements for progress vs. predictions
Project Management - Measurements: effort and budget tracking, requirements status, task status, top 10 risks. Predictions: cost to complete, schedule evolution.
Quality Management - Measurements: code stability, open defects, review status and follow-up. Predictions: residual defects, reliability, customer satisfaction.
Requirements Management - Measurements: analysis status, specification progress. Predictions: requirements volatility / completeness.
Construction - Measurements: status of documents, change requests, review status, design progress of requirements. Predictions: time to complete.
Test - Measurements: test progress (defects, coverage, efficiency, stability). Predictions: residual defects, reliability.
Transition, deployment, maintenance - Measurements: field performance (failures, corrections), maintenance effort. Predictions: maintenance effort.
Ref: Ebert and Dumke, 2007

50 Recommended books
Practical Software Measurement: Objective Information for Decision Makers, by John McGarry, David Card, Cheryl Jones, and Beth Layman (2001)
Software Measurement: Establish - Extract - Evaluate - Execute, by Christof Ebert and Reiner Dumke (2010)

51 References
Fenton, N.E., Software Metrics: A Rigorous Approach, Chapman and Hall, 1991.
Ebert, Christof, and Reiner Dumke, Software Measurement: Establish, Extract, Evaluate, Execute, Springer, 2007.
Selby, Richard W., "Synthesis, Analysis, and Modeling of Large-Scale Mission-Critical Embedded Software Systems," Northrop Grumman Space Technology, ICSP '09.
Behrens, Peter, Measuring Agility.
Nikora, Allen P., Error Discovery Rate by Severity Category and Time to Repair Software Failures for Three JPL Flight Projects, Software Product Assurance Section, Jet Propulsion Laboratory, Pasadena, CA, November 5, 1991.

