Measuring the Benefits of Mature Processes


1 Measuring the Benefits of Mature Processes
20th International Forum on COCOMO and Software Cost Modeling, 24 October 2005
Rick Hefner, Ph.D., Northrop Grumman Corporation

Many organizations have implemented process improvement efforts (e.g., CMM, CMMI, ISO, Six Sigma) in the hope of becoming better, cheaper, and faster. Published results on these efforts often contain quantifiable data regarding the benefits, and these measures are useful in showing that process improvements will return enough money to justify the investment. Other organizations have achieved their maturity level goals but appear to see little or no financial benefit. Are the published ROI results untrue, or is it a question of properly measuring the results?

This presentation discusses issues surrounding the measurement of process improvement results. Data and lessons learned on achieving measurable ROI are presented. A framework for measuring the benefits is outlined, based on Cost of Quality concepts and existing industry models for ROI. Building on these principles, practical tips and techniques for realizing the qualitative and quantitative benefits offered by process improvement are provided. Insights are based on the Northrop Grumman Mission Systems process improvement program, which has successfully implemented CMM, CMMI, ISO, and Six Sigma based improvements for the past 10 years. Seventeen Northrop Grumman organizations have achieved CMMI Level 5.

Discussion
Several authors have discussed the challenges in measuring the benefits of mature processes:

In 1994, the Software Engineering Institute (Herbsleb et al. 1994) released a report called "Benefits of CMM-Based Process Improvement". This report includes information from 13 organizations regarding the benefits of process improvement using the CMM for Software; it shows ROI ranging from 4:1 to 8.8:1.

In 2003, the SEI (Goldenson and Gibson) presented preliminary ROI data from the adoption of CMMI, summarizing results from 12 organizations, categorized by cost, schedule, quality, and customer satisfaction.

Sarah Sheard and Christopher Miller of the Software Productivity Consortium, in "The Shangri-La of ROI", discuss the difficulty of accurately measuring ROI and question its value in convincing process skeptics that process improvement is beneficial.

The "Cost of Quality" accounting technique was introduced in 1951 by Juran to provide ROI justification for process improvements. It was expanded in 1979 by Crosby to "get management's attention and to provide a measurement base for seeing how quality improvement is doing." Although used in the manufacturing and services industries for controlling the costs of quality initiatives and for identifying opportunities to reduce quality costs, it has seen limited adoption in the software and systems engineering community.

Six Sigma, as described in numerous texts, explicitly focuses on the financial benefits of improvement.

These and related works form the basis for the discussion.

2 Background
Many organizations have implemented process improvement efforts (CMM, CMMI, ISO, Six Sigma) to become better, cheaper, and faster
Some organizations have not realized the quantitative or return-on-investment (ROI) benefits reported in the literature
Are the literature claims of ROI true?
Are there tricks for getting better ROI?
What are the timelines for realizing these benefits?
CMM® and CMMI® are registered trademarks of Carnegie Mellon University

3 Agenda
Principles of process improvement
Industry data on ROI
Issues surrounding the measurement of benefits
Strategic actions needed to achieve maximum ROI
Northrop Grumman lessons learned

4 Software Projects Have Historically Suffered from Mistakes
People-Related Mistakes
1. Undermined motivation
2. Weak personnel
3. Uncontrolled problem employees
4. Heroics
5. Adding people to a late project
6. Noisy, crowded offices
7. Friction between developers and customers
8. Unrealistic expectations
9. Lack of effective project sponsorship
10. Lack of stakeholder buy-in
11. Lack of user input
12. Politics placed over substance
13. Wishful thinking

Process-Related Mistakes
14. Overly optimistic schedules
15. Insufficient risk management
16. Contractor failure
17. Insufficient planning
18. Abandonment of planning under pressure
19. Wasted time during the fuzzy front end
20. Shortchanged upstream activities
21. Inadequate design
22. Shortchanged quality assurance
23. Insufficient management controls
24. Premature or too frequent convergence
25. Omitting necessary tasks from estimates
26. Planning to catch up later
27. Code-like-hell programming

Product-Related Mistakes
28. Requirements gold-plating
29. Feature creep
30. Developer gold-plating
31. Push-me, pull-me negotiation
32. Research-oriented development

Technology-Related Mistakes
33. Silver-bullet syndrome
34. Overestimated savings from new tools or methods
35. Switching tools in the middle of a project
36. Lack of automated source-code control

Standish Group, 2003 survey of 13,000 projects: 34% successes, 15% failures, 51% overruns
Reference: Steve McConnell, Rapid Development

5 Many Approaches to Solving the Problem
Which weaknesses are causing my problems?
Which strengths may mitigate my problems?
Which improvement investments offer the best return?
[Figure: candidate improvement areas (People, Business Environment, Management Structure, Tools, Product, Methods, Process, Technology) with the callout "One solution!"]

6 Approaches to Process Improvement
Data-Driven (e.g., Six Sigma, Lean)
  Clarify what your customer wants (Voice of Customer): Critical to Quality (CTQs)
  Determine what your processes can do (Voice of Process): Statistical Process Control
  Identify and prioritize improvement opportunities: causal analysis of data
  Determine where your customers/competitors are going (Voice of Business): Design for Six Sigma

Model-Driven (e.g., CMM, CMMI)
  Determine the industry best practice: benchmarking, models
  Compare your current practices to the model: appraisal, education
  Identify and prioritize improvement opportunities: implementation, institutionalization
  Look for ways to optimize the processes

7 Typical Data-Driven Results Cited in Literature
Company         Year    Revenue ($B)   Invested ($B)   Savings ($B)   % Revenue
Motorola                356.9 (e)      ND              16             4.5
Allied Signal   1998    15.1                           0.5            3.3
GE              1996    79.2           0.2
                1997    90.8           0.4             1              1.1
                1998    100.5                          1.3            1.2
                1999    111.6          0.6             2              1.8
                Total   382.1          1.6             4.4
Honeywell               23.6                                          2.2
                        23.7                                          2.5
                2000    25.0                           0.7            2.6
                Total   72.3                                          2.4
Ford                    43.9                                          2.3
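A quick worked check of the % Revenue column (assuming, as seems intended, that it is simply savings divided by revenue; the GE 1999 row is used because all four of its values are reported):

\[ \frac{\text{Savings}}{\text{Revenue}} = \frac{2.0}{111.6} \approx 1.8\% \]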

8 Typical Model-Driven Results Cited in Literature
Category                                                     Range
Yearly cost of process improvement activities                $49K - $1,202K
Years engaged in SPI                                         1 - 9
Cost of SPI per engineer                                     $490 - $2,004
Productivity gain per year                                   9% - 67%
Early detection gain per year (defects discovered pre-test)  6% - 25%
Yearly reduction in time to market                           15% - 23%
Yearly reduction in post-release defects                     10% - 94%

Highlighted results: business value (ROI) of 10:1; defect reduction; 50% productivity increase; rework drops from 54% to 4%; more accurate estimates
Source: "Benefits of CMM-Based Process Improvement", Herbsleb et al., Software Engineering Institute, 1994

9 Typical CMMI Benefits Cited in Literature
Reduced costs
  33% decrease in the average cost to fix a defect (Boeing)
  20% reduction in unit software costs (Lockheed Martin)
  Reduced cost of poor quality from over 45% to under 30% over a three-year period (Siemens)
Faster Schedules
  50% reduction in release turnaround time (Boeing)
  60% reduction in rework following test (Boeing)
  Increase from 50% to 95% in the number of milestones met (General Motors)
Greater Productivity
  25-30% increase in productivity within 3 years (Lockheed Martin, Harris, Siemens)
Higher Quality
  50% reduction in software defects (Lockheed Martin)
Customer Satisfaction
  55% increase in award fees (Lockheed Martin)
Source: "Demonstrating the Impact and Benefits of CMMI: An Update and Preliminary Results," Software Engineering Institute, CMU/SEI-2003-SR-009, Oct 2003

10 The Knox Cost of Quality Model
Extension of the Cost of Quality model used in manufacturing
Conformance costs
  Appraisal (discovering the condition of the product): testing and associated activities, product quality audits
  Prevention (efforts to ensure product quality): SQA administration, inspections, process improvements, metrics collection and analysis
Non-conformance costs
  Internal failures (quality failures detected prior to product shipment): defect management, rework, retesting
  External failures (quality failures detected after product shipment): technical support, complaint investigation, defect notification
Reference: Knox's theoretical model for the cost of software quality (Stephen T. Knox, Digital Technical Journal, Vol. 5, No. 4, Fall 1993)
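To make the bookkeeping concrete, here is a minimal sketch of how a project might roll activity costs up into the four Knox buckets. It is not from the presentation: the bucket names follow the categories above, but the activity names, the helper cost_of_quality, and all dollar figures are invented placeholders.

```python
from collections import defaultdict

# Knox cost-of-quality buckets: two conformance categories (appraisal,
# prevention) and two non-conformance categories (internal/external failure).
ACTIVITY_TO_BUCKET = {
    "testing": "appraisal",
    "product_quality_audit": "appraisal",
    "inspections": "prevention",
    "sqa_administration": "prevention",
    "metrics_collection": "prevention",
    "rework": "internal_failure",
    "retesting": "internal_failure",
    "technical_support": "external_failure",
    "defect_notification": "external_failure",
}

def cost_of_quality(labor_costs):
    """Roll per-activity labor costs ($) up into Knox cost-of-quality buckets.

    Activities not listed in ACTIVITY_TO_BUCKET are treated as ordinary
    development work and excluded from the cost of quality.
    """
    buckets = defaultdict(float)
    total_project_cost = sum(labor_costs.values())
    for activity, cost in labor_costs.items():
        bucket = ACTIVITY_TO_BUCKET.get(activity)
        if bucket is not None:
            buckets[bucket] += cost
    coq = sum(buckets.values())
    return dict(buckets), coq, coq / total_project_cost

# Illustrative numbers only (not Northrop Grumman data).
costs = {
    "design_and_code": 600_000,
    "testing": 150_000,
    "inspections": 40_000,
    "sqa_administration": 30_000,
    "rework": 120_000,
    "technical_support": 60_000,
}
buckets, coq, coq_fraction = cost_of_quality(costs)
print(buckets)                      # per-bucket totals
print(f"Cost of quality: ${coq:,.0f} ({coq_fraction:.0%} of total)")
```

Tracking the non-conformance buckets separately is what lets an organization claim that reduced rework and support costs are the "savings" side of an ROI calculation.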

11 Knox Model – Theoretical Benefits
COCOMO also predicts ~10% increase in productivity for each increase in CMMI Level
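As an illustration of that rule of thumb, the back-of-the-envelope sketch below compounds a fixed ~10% productivity gain per maturity level into an effort estimate. It is not the actual COCOMO II PMAT scale-factor computation; the function effort_at_level, the 1000 person-month baseline, and the simple compounding assumption are all assumed for the example.

```python
def effort_at_level(baseline_effort_pm, levels_gained, gain_per_level=0.10):
    """Effort (person-months) after compounding a fixed productivity gain per CMMI level.

    A 10% productivity gain means the same work takes 1/1.10 of the effort,
    applied once for each maturity level gained. Illustrative only; the real
    COCOMO II model captures process maturity through the PMAT scale factor.
    """
    return baseline_effort_pm / ((1.0 + gain_per_level) ** levels_gained)

baseline = 1000.0  # assumed baseline effort in person-months at the starting level
for levels in range(0, 5):  # e.g., moving up to four levels
    print(f"+{levels} level(s): {effort_at_level(baseline, levels):7.1f} PM")
```

Under this assumption, gaining four levels reduces effort to roughly two-thirds of the baseline.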

12 Why Measuring ROI is Hard
ROI = Savings / Investment (see the sketch after this slide)
What do you count as "Investment"?
  Training? QA? Data gathering? New practices?
  What would we have done instead?
What do you count as "Savings"?
  Increased predictability – what's the value?
  Increased productivity – who gets the benefit?
  Better competitive position – how is it measured?
How do you measure the change?
  Multiple causes – awareness, knowledge, infrastructure
  Short-term vs. long-term – the Hawthorne effect
  Over what time frame?
See also "The Shangri-La of ROI", Sarah Sheard and Christopher Miller, Software Productivity Consortium
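To see why the definition question matters, here is a minimal sketch showing how the same improvement effort yields very different ROI numbers depending on which costs are counted as "investment" and which benefits as "savings". The roi helper is just the slide's formula; every line item and dollar figure is an invented placeholder, not data from the presentation.

```python
def roi(savings, investment):
    """Simple ROI as used on the slide: savings divided by investment."""
    return savings / investment

# Hypothetical cost and benefit line items (in $K) for one year of improvement work.
investment_items = {
    "process_group_labor": 300,
    "training": 120,
    "appraisals_and_audits": 80,
    "tooling": 50,
}
savings_items = {
    "reduced_rework": 600,
    "fewer_post_release_defects": 250,
    "schedule_predictability": 150,  # hard to monetize; often disputed
}

# Narrow accounting: count only training as the investment and only
# reduced rework as the savings.
narrow = roi(savings_items["reduced_rework"], investment_items["training"])

# Broad accounting: every line item on both sides.
broad = roi(sum(savings_items.values()), sum(investment_items.values()))

print(f"Narrow accounting ROI: {narrow:.1f}:1")  # 5.0:1
print(f"Broad accounting ROI:  {broad:.1f}:1")   # ~1.8:1
```

The spread between the two numbers comes entirely from scoping decisions, which is one reason published ROI figures range as widely as they do.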

13 Northrop Grumman Mission Systems Approach
Mission Success Requires Multiple Approaches
[Figure: mission assurance model spanning Program Effectiveness, Process Effectiveness, and Operations Effectiveness, supported by Risk Management; Systems Engineering; Independent Reviews; Training, Tools, & Templates; Dashboards for Enterprise-Wide Measurement; Communications & Best-Practice Sharing; a Robust Governance Model (Policies, Processes, Procedures); CMMI Level 5 for Software, Systems, and Services; ISO 9001 and AS Certification; and Six Sigma]

14 Process Effectiveness
Audits & Appraisals Staff Competence & Training 5 CMMI & Six Sigma courses Policies & processes course Standard Training Modules for each job function: engineering, project management, QA, CM, etc. 13 Northrop Grumman sites externally appraised at CMMI Level 5 Process Asset Library Communications & Collaboration Assuring mission success by making the people and processes more informed and effective

15 Program Effectiveness
Six Sigma connects process improvement and business value
Six Sigma projects can help focus and measure CMMI-driven process improvements
  Identify the customer's needs, maximize the value/cost
  Tools for management by variation (CMMI Levels 4 and 5)
Results to date
  4000 Green Belts, 200 Black Belts, 12 Master Black Belts
  500 completed Six Sigma projects, 250 in progress
  Significant benefit to our customers – lower costs, better performance
DMAIC roadmap
  Define – charter team, map process & specify CTQs
  Measure – measure process performance
  Analyze – identify & quantify root causes
  Improve – select, design & implement solution
  Control – institutionalize improvement, ongoing control
Assuring mission success by identifying the customer's needs and reducing defects

16 Operational Effectiveness
Tying it all together: People, Product, Process
[Figure: capability map covering a capability data repository (software, hardware, accounting; productivity, defects, maintenance phase relationships; systems engineering functions; lessons learned); people expertise (parametric modeling expertise, DoD software industry expertise, risk and predictive modeling analysis, certified function point specialists, Six Sigma Black Belts, professional society board members, active on government working groups, key participants on milestone reviews); products (risk analysis, cost estimates, cost estimation relationships, program benchmarking, life cycle productivity analysis, software sizing and modeling, predictive modeling, quantitative management); tools (commercial modeling tools, Northrop Grumman developed tools); and process (structured project reporting, training, standardization of data, metrics manual, approval, CMMI measurement; monitor, manage, report, update and calibrate)]
Assuring mission success by providing independent cost, schedule and risk realism

17 Lessons Learned
Model-driven and data-driven process improvements complement each other
Model-driven improvements are difficult to measure precisely
  Long improvement cycles and broad focus make it difficult to isolate cause and effect
  Substantial anecdotal evidence of significant ROI
Data-driven improvements are more easily measured
  Short improvement cycles, narrow focus
  Efforts concentrate on data and measurement systems, and tie improvements to business goals

