Measurement in Practice
Siemens Experience: Application
- systems software
- size: 10,000 to 5 million lines of code
- 3,000 software engineers/support specialists
Siemens Experience: Development Process
Process phases and their process steps:
- Planning & high-level design: Requirement study, Solution study, Functional Design, Interface Design, Detail project plan
- Detail Design and Implementation: Component Design, Code Design, Coding
- Quality Control: Component test, Functional test, Product test, System test
- Installation & Maintenance: Pilot Installation & test, Customer Installation
Siemens Experience: Metrics Used
- measure the quality of products
- measure the productivity of the development and maintenance process
- measure the profitability of the development and maintenance process
- measures of current development provide indications of the action required to keep a project on plan
Siemens Experience: Metrics Used
- measures of completed projects accumulate into a wealth of empirical knowledge
- useful for estimating development costs and the expected volume of defects for new projects, and for improving the development process
Siemens Experience: Primary Data for Calculating Metrics
- problem reports and defects, counted during implementation, during quality control, during pilot test, during the first year after customer installation, and in total; defects are defined as problem reports that have been approved as real faults
- development costs and staff-months
- maintenance costs
Siemens Experience: Primary Data for Calculating Metrics
- turnover (sales)
- product size in gross lines of code (BLOC), net lines of code without comments and blank lines (NLOC), and newly developed or changed net lines of code as compared to the prior release (DLOC)
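A minimal sketch of how such size counts might be computed; the count_loc helper and the assumption of C-style comment syntax are my own illustration, not Siemens' actual tooling. DLOC would additionally require a diff against the prior release.

    # Count BLOC (gross lines) and NLOC (lines that are neither blank
    # nor comment-only) for one source file, assuming C-style comments.
    # Lines that open with a comment are treated as comment-only.
    def count_loc(path):
        bloc = nloc = 0
        in_block = False              # inside a /* ... */ block comment
        with open(path) as f:
            for line in f:
                bloc += 1
                s = line.strip()
                if in_block:
                    if "*/" in s:
                        in_block = False
                    continue
                if not s or s.startswith("//"):
                    continue          # blank or comment-only line
                if s.startswith("/*"):
                    in_block = "*/" not in s
                    continue
                nloc += 1
        return bloc, nloc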
Siemens Experience: Quality Metrics
- QKZ1 = the number of defects counted during code review, component test, and functional test, divided by KDLOC (thousands of DLOC)
- QKZ2 = the number of defects counted during quality control, divided by KDLOC
- QKZ3 = the number of defects counted during pilot test, divided by KDLOC
- QKZ4 = the number of defects counted during the first year after customer installation, divided by KDLOC
Siemens Experience: Quality Metrics
- QKZ5 = the total number of defects received from the field per fiscal year after customer installation, divided by KNLOC (thousands of NLOC) of the latest version in the field
- QKZ6 = the total number of field problem reports received after customer installation, divided by the number of defects identified, for each fiscal year
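A worked illustration of these ratios; all figures are invented, not Siemens data.

    # One release with 40,000 new/changed net lines (DLOC) = 40 KDLOC.
    kdloc = 40_000 / 1000

    defects_found = {
        "implementation": 180,     # code review, component & functional test
        "quality_control": 120,
        "pilot_test": 30,
        "first_year_field": 12,
    }

    qkz1 = defects_found["implementation"] / kdloc     # 4.5 defects per KDLOC
    qkz2 = defects_found["quality_control"] / kdloc    # 3.0
    qkz3 = defects_found["pilot_test"] / kdloc         # 0.75
    qkz4 = defects_found["first_year_field"] / kdloc   # 0.3

    # QKZ6: field problem reports per approved defect in one fiscal year.
    field_reports, approved_defects = 90, 60
    qkz6 = field_reports / approved_defects            # 1.5 reports per defect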
Siemens Experience: Productivity Metrics
- PKZ1 = the development costs divided by KDLOC
- PKZ3 = the maintenance costs divided by the number of defects after customer delivery, for each fiscal year and each product line
- PKZ4 = the total gross lines of code (KBLOC) delivered to customers divided by the total staff-months expended for system software development, for each fiscal year
- PKZ5 = KDLOC divided by the development effort in staff-months
- PKZ6 = KDLOC divided by the development time in months
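A worked example with invented figures showing how PKZ1, PKZ5, and PKZ6 relate size, cost, effort, and calendar time:

    dloc = 60_000            # new/changed net lines in this release
    kdloc = dloc / 1000
    dev_cost = 1_800_000     # development cost, in some currency unit
    effort_sm = 120          # development effort in staff-months
    duration_m = 15          # elapsed development time in months

    pkz1 = dev_cost / kdloc     # 30,000 per KDLOC
    pkz5 = kdloc / effort_sm    # 0.5 KDLOC per staff-month
    pkz6 = kdloc / duration_m   # 4.0 KDLOC per calendar month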
Siemens Experience: Profitability Metrics
- WKZ1 = turnover divided by the software development costs, for each fiscal year and each product line
- WKZ3 = turnover divided by the total costs for software development, maintenance, and marketing, for each fiscal year and each product line
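The profitability ratios, again with invented figures:

    turnover = 10_000_000
    dev_cost = 4_000_000
    maint_cost = 1_500_000
    marketing_cost = 500_000

    wkz1 = turnover / dev_cost                                   # 2.5
    wkz3 = turnover / (dev_cost + maint_cost + marketing_cost)   # ~1.67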
Siemens Experience: Quality Improvement Techniques
- Quality: the most important factor for success
  - quality deputy
  - "quality first" as a slogan
  - quality before deadline, before functionality
  - quality from the beginning
- Reviews
  - development document control
  - formal inspection
Siemens Experience: Quality Improvement Techniques
- quality reporting
- qualification
- motivation
- component ownership
Siemens Experience: Benefits
- improved quality culture
- better process and product quality
- improved customer satisfaction
- a model for other Siemens businesses
- business growth and profitability
Data Logic Experience: Projects Overview
- 500 staff
- develops software for large organizations
- periodically analyses the output of project termination reports
- 19 projects analysed, ranging in effort from a 93 staff-day consultancy study to a 36 staff-year coding and testing development
- average effort approx. 5 staff-years
Data Logic Experience: Development Process
Systems Development Life-Cycle:
- Functional requirement specification
- Functional specification
- Design specification (including program specifications)
- Code and unit testing
- Systems testing
- Implementation
Data Logic Experience: Metrics Used
- staff-days of effort originally planned for completion of the project, by phase
- staff-days of effort estimated during the course of the project for implementing agreed changes to functionality (system variation requests, SVRs)
- actual staff-days of effort expended against the original plan
- actual staff-days of effort spent implementing SVRs
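A minimal sketch of the plan-versus-actual comparison these metrics enable; the phases and figures are invented for illustration:

    # Effort in staff-days, planned at project start vs. actually expended.
    planned = {"design": 180, "code & unit test": 320, "systems test": 150}
    actual  = {"design": 210, "code & unit test": 295, "systems test": 190}

    for phase, plan in planned.items():
        variance = actual[phase] - plan
        print(f"{phase}: {variance:+d} staff-days ({100 * variance / plan:+.0f}%)")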
Data Logic Experience: Metrics Used
- sizing metric (KLOC): thousands of lines of uncommented procedural code, measured at the point of handover from unit testing to systems testing
- the number of software difficulty reports (SDRs) identified, resolved, and outstanding, at weekly intervals throughout the systems testing and implementation phases
- the level of criticality of software difficulty reports (critical, serious, minor, cosmetic)
- the number of programs and modules
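A small illustration (all counts invented) of how the weekly SDR counts yield a backlog trend and a crude clearance forecast, the kind of evidence cited on the next slide:

    weekly = [
        # (week, SDRs identified, SDRs resolved)
        (1, 14, 5),
        (2, 22, 11),
        (3, 17, 16),
        (4, 9, 18),
    ]

    outstanding = 0
    for week, identified, resolved in weekly:
        outstanding += identified - resolved
        print(f"week {week}: outstanding SDRs = {outstanding}")

    # The average resolution rate gives a rough forecast of the time
    # needed to clear the remaining backlog at the current pace.
    rate = sum(resolved for _, _, resolved in weekly) / len(weekly)
    print(f"weeks to clear backlog at current rate: {outstanding / rate:.1f}")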
Data Logic Experience: Benefits (project managers' comments)
- "Since most of the process was automated, I found that it took very little extra effort."
- "The metrics on the rate of clearing SDRs helped me convince the client of the need for more resources."
Metrics Program Approach
Business Objectives → Quality Improvement Goals → Metrics → Measure Progress → Identify & Implement Development Process Improvement Actions
Benefits
- improved product quality
- increased development team productivity
- better project planning and estimation
- better project management
- a company quality culture
- increased customer satisfaction
- increased visibility of the software development process
Lessons Learned
Only 60 out of 300 US companies surveyed had successful metrics programs (Rubin, 1990). Programs failed because:
- there was no clear definition of the program's purpose, and staff later saw it as irrelevant
- professionals resisted, seeing the metrics as negative commentary on their performance
- projects were burdened by extensive data-collection requirements and cumbersome procedures
- the program failed to get action from management
- management withdrew support
Lessons Learned
Grady and Caswell at HP list ten steps to success:
1. Define objectives for the program.
2. Assign responsibility for each activity.
3. Do research.
4. Define the initial metrics to collect.
5. Sell the initial collection.
6. Get tools for automatic data collection and analysis.
7. Establish a training class on software measurement.
8. Publicize success stories and exchange ideas.
9. Create a metrics database.
10. Establish a mechanism to change the standard in an orderly way.