
The Economics of Software Quality
Barry Boehm, USC
Motorola Quality Workshop, September 15, 2005

Outline
- The business case for software quality
  – Example: net value of test aids
  – Value depends on stakeholder value propositions
- What qualities do stakeholders really value?
  – Safety, accuracy, response time, …
  – Attributes often conflict
- Software quality in a competitive world
- Conclusions

Software Quality Business Case
- Software quality investments compete for resources
  – With investments in functionality, response time, adaptability, speed of development, …
- Each investment option generates curves of net value and ROI
  – Net Value: NV = PV(benefit flows) - PV(cost flows)
  – Present Value PV: benefit and cost flows at future times, discounted back to the present
  – Return on Investment: ROI = NV / PV(cost flows)
- Value of benefits varies by stakeholder and role
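
As a minimal sketch of these definitions (the cash flows and the 10% discount rate below are hypothetical, not from the talk):

```python
# Minimal sketch of the slide's formulas: NV = PV(benefits) - PV(costs),
# ROI = NV / PV(costs). Flows and discount rate are hypothetical.

def present_value(flows, rate=0.10):
    """Discount a list of yearly flows (year 0 first) to the present."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

benefit_flows = [0, 40, 60, 60]    # value delivered in years 1-3
cost_flows = [50, 20, 10, 10]      # up-front investment plus upkeep

nv = present_value(benefit_flows) - present_value(cost_flows)
roi = nv / present_value(cost_flows)
print(f"NV = {nv:.1f}, ROI = {roi:.2f}")
```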

Software Testing Business Case
Vendor proposition:
– Our test data generator will cut your test costs in half
– We'll provide it to you for 30% of your test costs
– After you run all your tests for 50% of your original cost, plus the 30% tool fee, you've spent 80%: you're 20% ahead
Any concerns with the vendor proposition?

Software Testing Business Case (continued)
Concerns with the vendor proposition:
– The test data generator is value-neutral*
– It treats every test case and every defect as equally important
– Usually, 20% of the test cases cover 80% of the business value
* As are most current software engineering techniques

20% of Features Provide 80% of Value: Focus Testing on These (Bullock, 2000)
[Chart: % of value for correct customer billing, by customer type. An automated test generation tool treats all tests as having equal value.]

Value-Based Testing Provides More Net Value
[Chart: net value NV vs. percent of tests run. The value-based testing curve peaks at (30% of tests, NV = 58); the test data generator curve ends at (100% of tests, NV = 20). The accompanying table of cost, value, and NV per percentage of tests run is not captured in the transcript.]
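
The argument behind the chart can be sketched as follows. The numbers are illustrative assumptions, not the slide's actual data: 100 equal-cost tests whose business value is skewed so the top 20% of tests carry roughly 80% of the value.

```python
# Hypothetical illustration of value-based vs. value-neutral test ordering.
n = 100
raw = [1 / (i + 1) ** 1.2 for i in range(n)]   # skewed: top 20 tests ~80% of value
total_value = 150.0                            # assumed total business value units
scale = total_value / sum(raw)
values = [v * scale for v in raw]              # value of test i, descending
cost_per_test = 1.0

def net_value(k, value_based):
    """NV after k tests: value-based runs highest-value tests first;
    a value-blind order earns only the average value per test."""
    gained = sum(values[:k]) if value_based else total_value * k / n
    return gained - k * cost_per_test

for k in (20, 30, 100):
    print(f"{k:3d} tests: value-based NV = {net_value(k, True):6.1f}, "
          f"value-neutral NV = {net_value(k, False):6.1f}")
```

Running all 100 tests yields the same modest net value either way; the payoff of value-based testing is that most of the value arrives in the first third of the effort.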

There Is No Universal Quality-Value Metric
- Different stakeholders rely on different value attributes
  – Protection: safety, security, privacy
  – Robustness: reliability, availability, survivability
  – Quality of service: performance, accuracy, ease of use
  – Adaptability: evolvability, interoperability
  – Affordability: cost, schedule, reusability
- Value attributes continue to tier down
  – Performance: response time; resource consumption (CPU, memory, communications)
- Value attributes are scenario-dependent
  – e.g., 5 seconds normal response time; 2 seconds in a crisis
- Value attributes often conflict
  – Most often with performance and affordability

Major Information System Quality Stakeholders
[Diagram: stakeholder classes surrounding the information system]
- Information suppliers: e.g., citizens, companies
- Information brokers: e.g., financial services, news media, entertainment
- Information consumers: e.g., decision-making, education
- Mission controllers: e.g., pilots, distribution controllers
- Dependents: e.g., passengers, patients
- Developers, acquirers, administrators

Overview of Stakeholder/Value Dependencies
[Matrix: strength of each stakeholder class's direct dependency on each value attribute. ** = critical; * = significant; blank = insignificant or indirect. Attributes: protection, robustness, quality of service, adaptability, affordability. Stakeholders: developers/acquirers; mission controllers/administrators; information consumers; information brokers; information suppliers/dependents. The individual cell ratings are not recoverable from the transcript.]

Elaboration of Stakeholder/Value Dependencies
[Detail table not captured in the transcript.]

Implications for Quality Engineering
- There is no universal quality metric to optimize
- Need to identify the system's success-critical stakeholders
  – And their quality priorities
- Need to balance satisfaction of stakeholder dependencies
  – Stakeholder win-win negotiation
  – Quality-attribute tradeoff analysis
- Need value-of-quality models, methods, and tools

Outline
- The business case for software quality
- What qualities do stakeholders really value?
- Software quality in a competitive world
  – Is market success a monotone function of quality?
  – Is quality really free?
  – Is "faster, better, cheaper" really achievable?
  – Value-based vs. value-neutral methods
- Conclusions

Competing on Cost and Quality (adapted from Michael Porter, Harvard)
[Chart: ROI vs. price or cost point. ROI peaks for the cost leader (acceptable quality) and the quality leader (acceptable cost), and is low for products that are too cheap (low quality), overpriced, or stuck in the middle.]

Competing on Schedule and Quality: A Risk Analysis Approach
- Risk exposure: RE = Prob(Loss) * Size(Loss)
  – "Loss": financial; reputation; future prospects; …
- For multiple sources of loss:
  RE = Σ over sources [Prob(Loss_source) * Size(Loss_source)]

Example RE Profile: Time to Ship
- Loss due to unacceptable quality
[Chart: RE = P(L) * S(L) vs. time to ship (amount of testing). Shipping early: many defects (high P(L)), critical defects (high S(L)). Shipping late: few defects (low P(L)), minor defects (low S(L)). This RE curve falls as testing proceeds.]

Example RE Profile: Time to Ship
- Loss due to unacceptable quality
- Loss due to market share erosion
[Chart: adds a rising curve for market share erosion. Shipping early: few rivals (low P(L)), weak rivals (low S(L)). Shipping late: many rivals (high P(L)), strong rivals (high S(L)).]

Example RE Profile: Time to Ship
- Sum of risk exposures
[Chart: the sum of the quality-loss and market-erosion curves is U-shaped; its minimum is the sweet spot for how much testing to do before shipping.]
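
A stylized sketch of this risk-balancing calculation; the curve shapes and parameters are assumptions chosen to illustrate the idea, not calibrated data:

```python
import numpy as np

t = np.linspace(0, 1, 101)               # fraction of planned testing done
re_defects = 100 * np.exp(-4 * t)        # falling: fewer, more minor defects
re_market = 30 * (np.exp(2 * t) - 1)     # rising: more, stronger rivals

total = re_defects + re_market
print(f"sweet spot at ~{t[np.argmin(total)]:.0%} of planned testing")

# Raising S(L) for defects (say, x10 for a safety-critical system) moves
# the sweet spot right; raising S(L) for delays (an internet startup)
# moves it left -- the comparison made on the next two slides.
```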

Comparative RE Profile: Safety-Critical System
[Chart: a higher S(L) for defects raises the quality-loss curve, moving the sweet spot to the right of the mainstream sweet spot: the high-Q sweet spot.]

Comparative RE Profile: Internet Startup
[Chart: a higher S(L) for delays raises the market-erosion curve, moving the sweet spot to the left of the mainstream sweet spot: the low-TTM (time-to-market) sweet spot.]

How Much Architecting Is Enough? A COCOMO II Analysis
[Chart: total % added schedule = percent of project schedule devoted to initial architecture and risk resolution + added schedule devoted to rework (the COCOMO II RESL factor), plotted for 10, 100, and 10,000 KSLOC projects. Each curve has a sweet spot, and larger projects' sweet spots lie at higher architecting percentages. Sweet-spot drivers: rapid change moves it leftward; high assurance moves it rightward.]
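
A hypothetical rendering of the same sweet-spot arithmetic for architecting; the real curves come from the COCOMO II RESL calibration, and the rework parameters below are made up to echo the chart's shape:

```python
import numpy as np

arch = np.linspace(0, 60, 601)              # % of schedule spent architecting
for ksloc, rework0 in [(10, 18), (100, 38), (10_000, 90)]:
    rework = rework0 * np.exp(-arch / 12)   # % of schedule lost to rework (assumed)
    total = arch + rework                   # total % added schedule
    print(f"{ksloc:6,} KSLOC: sweet spot near {arch[np.argmin(total)]:.0f}% architecting")
```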

Interim Conclusions
- Unwise to try to compete on both cost/schedule and quality
  – Some exceptions: a major technology or marketplace edge
- There are no one-size-fits-all cost/schedule/quality strategies
- Risk analysis helps determine how much testing (prototyping, formal verification, etc.) is enough
  – Buying information to reduce risk
- Often difficult to determine parameter values
  – Some COCOMO II values discussed next

Software Development Cost/Quality Tradeoff
- COCOMO II calibration to 161 projects
[Chart, built up over three slides: relative cost per source instruction vs. COCOMO II RELY rating, with the defect risk and rough MTBF (mean time between failures) for each rating.]
– Very Low: slight inconvenience; MTBF ≈ 1 hour; 0.82; e.g., startup demo
– Low: low, easily recoverable loss; MTBF ≈ 1 day; 0.92; e.g., commercial cost leader
– Nominal: moderate recoverable loss; MTBF ≈ 1 month; 1.0; e.g., in-house support software
– High: high financial loss; MTBF ≈ 2 years; 1.10; e.g., commercial quality leader
– Very High: loss of human life; MTBF ≈ 100 years; 1.26; e.g., safety-critical software

"Quality is Free"
- Did Philip Crosby's book get it all wrong?
- Investments in dependable systems:
  – Cost extra for simple, short-life systems
  – Pay off for high-value, long-life systems

Software Life-Cycle Cost vs. Quality
[Chart, built up over four slides: relative cost vs. COCOMO II RELY rating, from Very Low (0.8) to Very High.]
- Relative cost to develop rises with the RELY rating.
- Adding a maintenance percentage (relative cost to develop and maintain) steepens the penalty for low RELY.
- Adding ownership and operation: with operational-defect cost = 0, the curve matches the develop-and-maintain curve; with operational-defect cost at Nominal dependability equal to the software life-cycle cost, relative ownership cost reaches VL = 2.55 and L = 1.52, making low dependability the most expensive choice.
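
A back-of-envelope test of the "quality is free" claim, in the spirit of these charts. The development multipliers are the COCOMO II RELY values quoted earlier; the maintenance fractions and operational-defect losses are assumptions:

```python
# Development multipliers: COCOMO II RELY values from the earlier slide.
dev_mult = {"VL": 0.82, "L": 0.92, "N": 1.00, "H": 1.10, "VH": 1.26}
# Assumed operational-defect losses, scaled so Nominal loses 1.0 unit of
# life-cycle cost to defects (the slide's calibration point).
defect_loss = {"VL": 2.0, "L": 1.5, "N": 1.0, "H": 0.5, "VH": 0.2}

for life in ("short-life", "long-life"):
    maint = 0.2 if life == "short-life" else 1.5   # maintenance as multiple of dev
    ops = 0.2 if life == "short-life" else 1.0     # weight on operational losses
    costs = {r: dev_mult[r] * (1 + maint) + ops * defect_loss[r] for r in dev_mult}
    best = min(costs, key=costs.get)
    print(life, {r: round(c, 2) for r, c in costs.items()}, "-> cheapest:", best)
```

Under these assumptions the low-RELY option wins for the short-life system and a high-RELY option wins for the long-life one, which is the slide's point: quality is free only when the system lives long enough to collect the savings.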

Outline
- The business case for software quality
- What qualities do stakeholders really value?
- Software quality in a competitive world
  – Is market success a monotone function of quality?
  – Is quality really free?
  – Is "faster, better, cheaper" really achievable?
  – Value-based vs. value-neutral methods
- Conclusions

Cost, Schedule, Quality: Pick Any Two?
[Diagram: triangle with C, S, and Q at the corners.]

Cost, Schedule, Quality: Pick Any Two?
[Diagram: the C-S-Q triangle, revisited.]
- Consider C, S, and Q as the independent variables, with the feature set as the dependent variable

C, S, Q as Independent Variables
- Determine the Desired Delivered Defect Density (D4)
  – Or a value-based equivalent
- Prioritize desired features
  – Via QFD, IPT, or stakeholder win-win negotiation
- Determine the core capability (see the sketch below)
  – 90% confidence of achieving D4 within cost and schedule
  – Balance parametric models and expert judgment
- Architect for ease of adding next-priority features
  – Hide sources of change within modules (Parnas)
- Develop the core capability to the D4 quality level
  – Usually in less than the available cost and schedule
- Add next-priority features as resources permit
Versions of this approach were used successfully on 32 of 34 USC digital library projects.
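
A minimal sketch of the core-capability selection step; the features, estimates, and the 1.3 risk buffer standing in for "90% confidence" are hypothetical:

```python
# Treat the feature set as the dependent variable: take the highest-priority
# features whose inflated cost estimates still fit the budget, leaving a
# margin for estimation risk. All numbers here are made up.

budget = 100
risk_buffer = 1.3   # inflate estimates so ~90% of outcomes still fit

features = [  # (name, priority, estimated cost), already win-win prioritized
    ("search", 1, 30), ("checkout", 2, 25), ("reports", 3, 20),
    ("admin UI", 4, 20), ("recommendations", 5, 25),
]

core, spent = [], 0.0
for name, _, cost in sorted(features, key=lambda f: f[1]):
    if spent + cost * risk_buffer <= budget:
        core.append(name)
        spent += cost * risk_buffer

print("core capability:", core)   # remaining features added as resources permit
```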

Value-Based vs. Value-Neutral Methods
- Value-based defect reduction
- Constructive Quality Model (COQUALMO)
- Information Dependability Attribute Value Estimation (iDAVE) model

Value-Based Defect Reduction Example: Goal-Question-Metric (GQM) Approach
- Goal: Our supply-chain software packages have too many defects. We need to get their defect rates down.
- Question: ?

Value-Based GQM Approach – I
- Q: How do software defects affect system value goals? Ask why the initiative is needed:
  – Order processing
  – Too much downtime on the operations critical path
  – Too many defects in operational plans
  – Too many new-release operational problems
- G: New system-level goal: decrease software-defect-related losses in operational effectiveness
  – With the high-leverage problem areas above as specific subgoals
- New Q: ?

Value-Based GQM Approach – II
- New Q: Perform system problem-area root-cause analysis: ask why problems are happening, via models
- Example: downtime on the critical path
[Flow diagram: Order → Validate order → Validate items in stock → Schedule packaging and delivery → Prepare delivery packages → Deliver order, with Produce status reports on the critical path.]
- Where are the primary software-defect-related delays? Where are the biggest improvement-leverage areas?
  – Reducing software defects in the Scheduling module
  – Reducing non-software order-validation delays
  – Taking status reporting off the critical path
  – Downstream, getting a new Web-based order entry system
- Ask "why not?" as well as "why?"

Value-Based GQM Results
- Defect tracking weighted by system-value priority
  – Focuses defect removal on the highest-value effort
- Significantly higher effect on bottom-line business value
  – And on customer satisfaction levels
- Engages software engineers in system issues
  – Fits the increasing system-criticality of software
- Strategies often helped by quantitative models
  – COQUALMO, iDAVE

Current COQUALMO System
[Diagram: the software size estimate and the software platform, project, product, and personnel attributes feed COCOMO II and COQUALMO's defect introduction model; the defect removal profile levels (automated analysis, peer reviews, execution testing) feed the defect removal model. Outputs: the software development effort, cost, and schedule estimate; the number of residual defects; and the defect density per unit of size.]
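
A sketch of COQUALMO's "tank and pipe" arithmetic: each artifact type introduces defects, and each removal profile drains a fraction of what remains. The 10/20/30 defects-per-KSLOC introduction rates are the commonly cited nominal baseline; the removal fractions are hypothetical placeholders, not the calibrated model values:

```python
# Nominal defect introduction rates per KSLOC (oft-quoted COQUALMO baseline).
introduced_per_ksloc = {"requirements": 10, "design": 20, "code": 30}

# Assumed removal fractions for one composite rating level (placeholders).
removal_fraction = {
    "automated_analysis": 0.30,
    "peer_reviews": 0.55,
    "execution_testing": 0.60,
}

ksloc = 50
residual = 0.0
for artifact, rate in introduced_per_ksloc.items():
    remaining = rate * ksloc              # defects introduced in this artifact
    for frac in removal_fraction.values():
        remaining *= 1 - frac             # each pipe drains a fraction of what's left
    residual += remaining

print(f"residual defects: {residual:.0f} ({residual / ksloc:.2f} per KSLOC)")
```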

Defect Removal Rating Scales (COCOMO II, p. 263)
Automated analysis:
– Very Low: simple compiler syntax checking
– Low: basic compiler capabilities
– Nominal: compiler extensions; basic requirements and design consistency checking
– High: intermediate-level module checking; simple requirements/design checking
– Very High: more elaborate requirements/design checking; basic distributed processing
– Extra High: formalized specification and verification; advanced distributed processing
Peer reviews:
– Very Low: no peer review
– Low: ad-hoc informal walkthroughs
– Nominal: well-defined preparation and review, minimal follow-up
– High: formal review roles, well-trained people, and basic checklists
– Very High: root-cause analysis and formal follow-up, using historical data
– Extra High: extensive review checklists; statistical control
Execution testing and tools:
– Very Low: no testing
– Low: ad-hoc testing and debugging
– Nominal: basic tests; test criteria based on checklists
– High: well-defined test sequences and a basic test-coverage tool system
– Very High: more advanced test tools and test preparation; distributed monitoring
– Extra High: highly advanced tools; model-based testing

Defect Removal Estimates
[Chart: delivered defects per KSLOC vs. composite defect removal rating, assuming nominal defect introduction rates.]

iDAVE Model
[Model diagram not captured in the transcript.]

ROI Analysis Results Comparison
[Results chart not captured in the transcript.]

Conclusions: Software Quality (SWQ)
- There is no universal SWQ metric to optimize
  – Need to balance stakeholder SWQ value propositions
- Increasing need for value-based approaches to SWQ
  – Methods and models are emerging to address these needs
- "Faster, better, cheaper" is feasible
  – If feature content can be a dependent variable
- "Quality is free" for stable, high-value, long-life systems
  – But not for dynamic, lower-value, short-life systems
- Future trends intensify SWQ needs and challenges
  – Criticality, complexity, decreased control, faster change