University of Southern California Center for Software Engineering (USC-CSSE), © 2005-2010 USC-CSSE
Value-Based Software Engineering, CS 577b, Winsor Brown, 3/26/10


Value-Based Software Engineering
CS 577b, Winsor Brown (courtesy of Ray Madachy, Barry Boehm, Keun Lee & Apurva Jain)
March 26, 2010

Outline
- VBSE Refresher and Background
- Example: Value-Based Reviews
- Example: Value-Based Product and Process Modeling

A Short History of Software Processes (necessarily oversimplified)

Software Engineering Is Not Well-Practiced Today (Standish Group CHAOS Report, 1995)
[Chart: shares of projects on-time and on-budget, seriously overrun, and discontinued]
Seriously overrun projects averaged 189% of original budget, 221% of original schedule, and 61% of original functionality delivered.

Less Chaos Today (Standish Group CHAOS Report, 2007)
- On-time, on-budget: 35%
- Challenged (seriously overrun): 46%, averaging 189% of original budget, 221% of original schedule, and 61% of original functionality
- Failures (discontinued): 19%

Why Software Projects Fail

Software Testing Business Case
Vendor proposition:
– Our test data generator will cut your test costs in half
– We'll provide it to you for 30% of your test costs
– After you run all your tests for 50% of your original cost, you are 20% ahead
Concerns with the vendor proposition:
– The test data generator is value-neutral
– Every test case and defect is treated as equally important
– Usually, 20% of the test cases cover 80% of the business case

20% of Features Provide 80% of Value: Focus Testing on These (Bullock, 2000)
[Chart: % of value for correct customer billing by customer type; an automated test generation tool treats all tests as having equal value]

Value-Based Testing Provides More Net Value
[Chart: net value (NV) vs. percent of tests run. Value-based testing peaks near (30% of tests, NV = 58); the test data generator, running all tests, reaches (100% of tests, NV = 20). An accompanying table compares cost, value, and NV for the two approaches.]
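The slide's comparison can be sketched numerically. In the snippet below, the Pareto exponent, the total value of 100, and the total test cost of 80 are invented for illustration; only the shape of the argument (concentrated value makes stopping early pay off) comes from the slide.

```python
# Hypothetical sketch of the net-value comparison; the exponent, the
# total value of 100, and the total test cost of 80 are assumptions,
# not the Bullock (2000) data behind the slide.

def pareto_value(fraction_run: float, exponent: float = 0.35) -> float:
    """Cumulative business value captured when tests are run in
    descending order of business value (Pareto-style concentration)."""
    return 100.0 * fraction_run ** exponent

def uniform_value(fraction_run: float) -> float:
    """Value captured when every test is treated as equally important."""
    return 100.0 * fraction_run

def net_value(value: float, fraction_run: float,
              total_test_cost: float = 80.0) -> float:
    return value - total_test_cost * fraction_run

# Value-based testing: stop after the highest-value 30% of tests.
vb = net_value(pareto_value(0.30), 0.30)
# Test data generator: run everything, all tests weighted equally.
tdg = net_value(uniform_value(1.0), 1.0)
print(f"value-based NV at 30% of tests: {vb:.1f}")
print(f"run-everything NV: {tdg:.1f}")
```

With these made-up numbers the value-based strategy nets roughly twice the run-everything strategy, mirroring the slide's (30, 58) vs. (100, 20) shape.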

Motivation for Value-Based SE
Current SE methods are basically value-neutral:
– Every requirement, use case, object, test case, and defect is equally important
– Object-oriented development is a logic exercise
– "Earned value" systems don't track business value
– Separation of concerns: the SE's job is to turn requirements into verified code
– Ethical concerns are separated from daily practices
Value-neutral SE methods are increasingly risky:
– Software decisions increasingly drive system value
– Corporate adaptability to change is achieved via software decisions
– System value-domain problems are the chief sources of software project failures

Key Definitions
Value (from Latin "valere", to be worth):
1. A fair return or equivalent in goods, services, or money
2. The monetary worth of something
3. Relative worth, utility, or importance
Software validation (also from Latin "valere"):
– Validation: Are we building the right product?
– Verification: Are we building the product right?

7 Key Elements of VBSE
1. Benefits Realization Analysis
2. Stakeholders' Value Proposition Elicitation and Reconciliation
3. Business Case Analysis
4. Continuous Risk and Opportunity Management
5. Concurrent System and Software Engineering
6. Value-Based Monitoring and Control
7. Change as Opportunity

Maslow Human Need Hierarchy (A. Maslow, Motivation and Personality)
From top to bottom:
- Self-Actualization
- Esteem and Autonomy
- Belongingness and Love
- Safety and Security
- Physiological (shelter, food and drink)

Maslow Need Hierarchy
- Satisfied needs aren't motivators
- Unsatisfied lower-level needs dominate higher-level needs
Management implications:
– Create an environment and subculture that satisfies lower-level needs: stability; shared values and community; a match to special needs
– Tailor project objectives and structure to participants' self-actualization priorities

People Self-Actualize in Different Ways
- Becoming a better manager
- Becoming a better technologist
- Helping other developers
- Helping users
- Making people happy
- Making people unhappy
- Doing new things
- Increasing professional stature

Theory W: WinWin Achievement Theorem
Making winners of your success-critical stakeholders requires:
i. Identifying all of the success-critical stakeholders (SCSs)
ii. Understanding how the SCSs want to win
iii. Having the SCSs negotiate a win-win set of product and process plans
iv. Controlling progress toward SCS win-win realization, including adaptation to change

VBSE Theory: 4+1 Structure

VBSE Component Theories
- Theory W (stakeholder win-win): Enterprise Success Theorem, Win-Win Achievement Theorem
- Dependency theory (product, process, people interdependencies): systems architecture/performance theory; costing and scheduling theory; organization theory
- Utility theory: utility functions, bounded rationality, Maslow need hierarchy, multi-attribute utility theory
- Decision theory: statistical decision theory, game theory, negotiation theory, theory of justice
- Control theory: observability, predictability, controllability, stability theory

Dependency Theory: Example

Utility Theory: Example

Decision Theory: Example

Decision Theory: Example (2), applying models to predict balance
- Risk exposure (RE) due to inadequate plans
- RE due to market share erosion
- Sum of risk exposures

Control Theory: Example
Value realization feedback control

Example Project: Sierra Mountainbikes
(Based on what would have worked on a similar project)
- Quality leader in a specialty area; competitively priced
- Major problems with order processing:
– Delivery delays and mistakes
– Poor synchronization of order entry, confirmation, and fulfillment
– Disorganized responses to problem situations
– Excess costs; low distributor satisfaction

Order Processing Project Goals
- Goals: improve profits, market share, and customer satisfaction via improved order processing
- Questions: What is the current state? What are the root causes of the problems? What are the keys to improvement?
- Metrics: balanced scorecard of benefits realized, plus proxies:
– Customer satisfaction ratings and their key elements (e.g., ITV: in-transit visibility)
– Overhead cost reduction
– Actual vs. expected benefit and cost flows; ROI

Expanded Order Processing System: Benefits Chain

Project Strategy and Partnerships
- Partner with eServices, Inc. for the order processing and fulfillment system:
– Profit sharing using a jointly-developed business case
- Partner with key distributors to provide user feedback:
– Evaluate prototypes, beta-test early versions, provide satisfaction ratings
- Incremental development using MBASE/RUP anchor points:
– Life Cycle Objectives and Life Cycle Architecture (LCO; LCA)
– Core Capability Drivethrough (CCD)
– Initial and Full Operational Capability (IOC; FOC)
- Architect for later supply chain extensions

Order Processing System Schedules and Budgets
[Table: milestone, due date, budget ($K), cumulative budget ($K); year and budget figures not recovered from the transcript]
- Inception Readiness (due 1/1)
- Life Cycle Objectives (due 1/31)
- Life Cycle Architecture (due 3/31)
- Core Capability Drivethrough (due 7/31)
- Initial Oper. Capability: SW (due 9/30)
- Initial Oper. Capability: HW (due 9/30)
- Developed IOC (due 12/31)
- Responsive IOC (due 3/31)
- Full Oper. Cap'y CCD (due 7/31)
- FOC Beta (due 9/30)
- FOC Deployed (due 12/31)
- Annual Oper. & Maintenance: $3,800K
- Annual O&M, Old System: $7,600K

Order Processing System: Expected Benefits and Business Case

A Real Earned Value System
Current "earned value" systems monitor cost and schedule, not business value:
– Budgeted cost of work performed ("earned")
– Budgeted cost of work scheduled ("yearned")
– Actual costs vs. schedule ("burned")
A real earned value system monitors benefits realized:
– Financial benefits realized vs. cost (ROI)
– Benefits realized vs. schedule, including non-financial metrics
– Actual costs vs. schedule
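The distinction above reduces to which ratios a tracking system reports. The sketch below puts conventional earned-value indices next to the benefit-based ROI the slide calls for; all milestone figures are invented.

```python
# Conventional earned value (CPI/SPI) next to a benefit-based "real"
# earned value (ROI); all milestone figures are invented.
milestones = [
    # budgeted cost of work scheduled ("yearned"), budgeted cost of
    # work performed ("earned"), actual cost ("burned"), and benefits
    # realized to date, all in $K
    {"bcws": 100, "bcwp": 90,  "acwp": 110, "benefits": 0},
    {"bcws": 250, "bcwp": 240, "acwp": 260, "benefits": 120},
    {"bcws": 400, "bcwp": 400, "acwp": 430, "benefits": 600},
]

for i, m in enumerate(milestones, start=1):
    cpi = m["bcwp"] / m["acwp"]   # cost performance: earned vs. burned
    spi = m["bcwp"] / m["bcws"]   # schedule performance: earned vs. yearned
    roi = (m["benefits"] - m["acwp"]) / m["acwp"]  # benefits vs. cost
    print(f"milestone {i}: CPI={cpi:.2f} SPI={spi:.2f} ROI={roi:+.2f}")
```

Note how the second milestone shows healthy CPI/SPI (near 1.0) while ROI is still deeply negative; that gap is exactly the blind spot the slide points at.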

Value-Based Expected/Actual Outcome Tracking Capability

Conclusions So Far
- Value considerations are software success-critical
- "Success" is a function of key stakeholder values; it is risky to exclude key stakeholders
- Values vary by stakeholder role
- Non-monetary values are important: fairness, customer satisfaction, trust

Initial VBSE Theory: 4+1 Process (with a great deal of concurrency and backtracking)

Outline
- VBSE Refresher and Background
- Example: Value-Based Reviews
- Example: Value-Based Product and Process Modeling

Problem Finding via Peer Reviews [Lee 2005]
Hypothesis: current value-neutral software peer reviews misallocate effort:
– Every requirement, use case, object, and defect is treated as equally important
– Too much effort is spent on trivial issues
– Current checklist-, function-, perspective-, and usage-based reviews are largely value-neutral

Motivation: Value-Neutral Method vs. Value-Based Method
[Chart: cumulative business value (%) by customer billing type, comparing automated test generation (ATG), where all tests have equal value, against Pareto testing based on actual business value; test value follows a Pareto distribution]

Motivation: Value-Neutral Review vs. Value-Based Review
Return on investment (ROI) = (benefits - costs) / costs
Assumptions:
– $1M of the development costs has been invested in the customer billing system by the beginning of reviewing
– Both review techniques will cost $200K
– The business case for the system will produce $4M in business value in return for the $2M investment cost
– The business case will provide a similar 80:20 value distribution for the value-based review
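The transcript does not preserve the slide's arithmetic, so the snippet below is one plausible reading of these assumptions: the $500K value-at-risk figure is invented, and "value-based review covers the top 20% of artifacts at 20% of the review cost while addressing 80% of the value" is an assumed interpretation of the 80:20 point.

```python
def roi(benefits: float, costs: float) -> float:
    """ROI = (benefits - costs) / costs, as defined on the slide."""
    return (benefits - costs) / costs

# Whole-project business case from the slide: $4M value on a $2M investment.
project_roi = roi(4_000_000, 2_000_000)   # = 1.0

# Hypothetical reading of the 80:20 assumption: a full value-neutral
# review costs $200K and protects some value at risk; a value-based
# review spends 20% of that budget on the top 20% of artifacts and
# still addresses 80% of the value at risk. The $500K figure is invented.
value_at_risk = 500_000
vn_roi = roi(value_at_risk, 200_000)
vb_roi = roi(0.8 * value_at_risk, 0.2 * 200_000)
print(f"project ROI: {project_roi:.1f}")
print(f"value-neutral review ROI: {vn_roi:.1f}")
print(f"value-based review ROI: {vb_roi:.1f}")
```

Under this reading the value-based review's ROI dominates because its costs shrink five-fold while its benefits shrink only by a fifth.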

Motivation: Value-Neutral Review vs. Value-Based Review (continued)
[Chart comparing the ROI of the two review approaches]

Objective
- Add priority and criticality values into the reading process (the VBR process)
- Focus on higher-valued issues first
- Find more higher-valued issues and increase the impact of the issues found
- Increase the return value and cost-effectiveness of reading

Objectives
Develop and experiment with value-based peer review processes and checklists:
– Initial value-based checklists available as a USC-CSE technical report
– Experimentally applied across 28 remote IV&V reviewers

Review/Reading Techniques (all generally value-neutral)
- Checklist-Based Reading (CBR): reading with a checklist; the most common review technique in the field. Strengths: easy to apply; the checklist helps focus what to do. Shortfalls: weak mapping to the artifacts reviewed.
- Defect-Based Reading (DBR): reading guided by defect classes; proposed for requirements documents. Strengths: clearer focus for reviewers. Shortfalls: often weak mapping to the artifacts reviewed.
- Perspective-Based Reading (PBR): reading guided by different reviewer perspectives (points of view). Strengths: clearer focus, less overlap. Shortfalls: less redundancy; little backup for less-effective reviewers.
- Functionality-Based Reading (FBR): reading guided by functionality types; function-oriented. Strengths: good for functional specifications. Shortfalls: mismatches object-oriented specifications.
- Usage-Based Reading (UBR): reading guided by use cases, sometimes prioritized by use case. Strengths: very good for usage problems. Shortfalls: weaker coverage of other problems.

Value-Based Review Concepts
Priority: the priority of the system capability described in the artifact. In MBASE, priority is determined from negotiations and meetings with clients, and from the priorities indicated in the MBASE Guidelines. Priority values are High, Medium, Low (or 3, 2, 1). Higher-priority capabilities are reviewed first, and the value is used to calculate effectiveness metrics.
Criticality: generally, criticality values are assigned by SE experts, but IV&Vers can determine them when better qualified. Criticality values are High, Medium, Low (or 3, 2, 1). Higher-criticality issues are reviewed first at a given priority level, and the value is used to calculate effectiveness metrics.
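These two ratings translate directly into a review ordering. The sample issues below are hypothetical; the priority-times-criticality product follows the impact weighting used later in the experiment's metrics.

```python
# Ordering review issues by the slide's two ratings; the sample issues
# are hypothetical. High/Medium/Low map to 3/2/1 as on the slide.
RATING = {"High": 3, "Medium": 2, "Low": 1}

issues = [
    {"desc": "typo in help message",          "priority": "Low",  "criticality": "Low"},
    {"desc": "missing backup/recovery",       "priority": "High", "criticality": "High"},
    {"desc": "ambiguous acceptance criteria", "priority": "High", "criticality": "Medium"},
]

def impact(issue: dict) -> int:
    """Impact = (artifact priority) x (issue criticality)."""
    return RATING[issue["priority"]] * RATING[issue["criticality"]]

# Higher-priority capabilities first; within a priority level,
# higher-criticality issues first.
queue = sorted(
    issues,
    key=lambda i: (RATING[i["priority"]], RATING[i["criticality"]]),
    reverse=True,
)
for issue in queue:
    print(f'{impact(issue)}: {issue["desc"]}')
```

Sorting on the (priority, criticality) tuple rather than on the product keeps the slide's rule intact: priority decides the review order first, and criticality only breaks ties within a priority level.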

Value-Based Review Process (II)
[Diagram: negotiation meetings among developers, customers, users, and other stakeholders produce priorities of system capabilities; a domain expert, working from a general value-based checklist, produces criticalities of issues; both feed an artifact-oriented checklist used when reviewing artifacts. Artifacts are reviewed in priority order (High, Medium, Low) and, within each priority level, in criticality order (High, Medium, Low, optional).]
*It may be more cost-effective to review highly-coupled, mixed-priority artifacts together.

Value-Based Checklist (I): issues by category and criticality
Completeness
– High: critical missing elements (backup/recovery, external interfaces, success-critical stakeholders, critical exception handling, missing priorities); critical missing processes and tools; planning and preparation for major downstream tasks (development, integration, test, transition); critical missing project assumptions (client responsiveness, COTS adequacy, needed resources)
– Medium: medium-criticality missing elements, processes, and tools (maintenance and diagnostic support, user help); medium-criticality exceptions and off-nominal conditions; smaller tasks (reviews, client demos); missing desired growth capabilities; workload characterization
– Low: easily-deferrable, low-impact missing elements (straightforward error messages, help messages, GUI details doable via GUI builder, project task sequence details)
Consistency/Feasibility
– High: critical elements in the OCD, SSRD, SSAD, and LCP not traceable to each other; critical inter-artifact inconsistencies (priorities, assumptions, input/output, preconditions/postconditions); missing evidence of critical consistency/feasibility assurance in the FRD
– Medium: medium-criticality shortfalls in traceability, inter-artifact inconsistencies, and evidence of consistency/feasibility in the FRD
– Low: easily-deferrable, low-impact inconsistencies or inexplicit traceability (GUI details, report details, error messages, help messages, grammatical errors)
Ambiguity
– High: vaguely defined critical dependability capabilities (fault tolerance, graceful degradation, interoperability, safety, security, survivability); critical misleading ambiguities (stakeholder intent, acceptance criteria, critical user decision support, terminology)
– Medium: vaguely defined medium-criticality capabilities and test criteria; medium-criticality misleading ambiguities
– Low: non-misleading, easily-deferrable, low-impact ambiguities (GUI details, report details, error messages, help messages, grammatical errors)
Conformance
– High: lack of conformance with critical operational standards and external interfaces
– Medium: lack of conformance with medium-criticality operational standards and external interfaces; misleading lack of conformance with document formatting standards and method/tool conventions
– Low: non-misleading lack of conformance with document formatting standards, method and tool conventions, or optional/low-impact operational standards
Risk
– High: missing FRD evidence of critical capability feasibility (high-priority features, levels of service, budgets and schedules); critical risks in the top-10 risk checklist (personnel, budgets and schedules, requirements, COTS, architecture, technology)
– Medium: missing FRD evidence of mitigation strategies for low-probability/high-impact or high-probability/low-impact risks (unlikely disasters, off-line service delays, missing but easily-available information)
– Low: missing FRD evidence of mitigation strategies for low-probability, low-impact risks

Value-Based Checklist (II): example of general value-based checklists
Consistency/Feasibility, high-criticality issues:
– Critical elements in the OCD, SSRD, SSAD, and LCP not traceable to each other
– Critical inter-artifact inconsistencies: priorities, assumptions, input/output, preconditions/postconditions
– Missing evidence of critical consistency/feasibility assurance in the FRD
Consistency/Feasibility, low-criticality issues:
– Easily-deferrable, low-impact inconsistencies or inexplicit traceability: GUI details, report details, error messages, help messages, grammatical errors

Value-Based Checklist (III): example questions with criticality ratings
– Are the system capabilities consistent with the system services provided as described in OCD 2.3? (3)
– Are there critical missing capabilities needed to perform the system services? (3)
– Are capabilities prioritized as High, Medium, or Low? (3)
– Are capability priorities consistent with current system shortcoming priorities (OCD 3.3.5)? (3)
– Are capabilities traced back to corresponding project goals and constraints (OCD 4.2)? (3)
– Are simple lower-priority capabilities (e.g., login) described in less detail? (2)
– Are there no levels-of-service goals (OCD 4.4) included as system capabilities? (2)

Weight of Review Issues

Experiment Overview (II) [577A IV&Vers, 2004]
- IV&Vers were involved in real-client (e-services) projects in the course CSCI 577A
- 28 IV&Vers were randomly selected and divided into two groups: A (15) and B (13)
- Group A used Value-Based Review; Group B used review with a traditional checklist
- The two groups were trained separately in the Value-Based Review technique and in review with a traditional checklist
- Each reviewed three documents directly related to development:
– OCD (Operational Concept Description)
– SSRD (System and Software Requirements Description)
– SSAD (System and Software Architecture Description)
- Null hypotheses tested: no difference between Groups A and B in:
– Average number of concerns and problems
– Average impact of concerns and problems
– Average number of concerns and problems per effort hour
– Average impact of concerns and problems per effort hour

Independent Verification and Validation (IV&V)
[Diagram: developers produce the OCD, SSRD, and SSAD; IV&Vers, trained with a checklist and split into Group A (VBR) and Group B (CBR), review the artifacts, identify concerns, and filter them into problems that are provided to the developers for fixing]

Result (V): t-Test p-Values and Other Measures
[Table: p-value and the percentage by which Group A was higher, for the average number and average impact of concerns and problems, both in total and per effort hour; numeric values not recovered]
- Statistically, Group A performed the review more value-effectively in finding concerns and problems
- Group B found significantly higher numbers of trivial concerns and problems (typo and grammar faults)

Result (VI-A): Effort Comparison

Number of Concerns and Problems Found by IV&Vers

Result (I-A): Average Number of Concerns and Problems
[Table: p-value and the percentage by which Group A was higher, for the average number of concerns and of problems; numeric values not recovered]

Result (II-A): Average Impact of Concerns and Problems
[Table: p-value and the percentage by which Group A was higher, for the average impact of concerns and of problems; numeric values not recovered]
Impact = sum over issues of (artifact priority) x (issue criticality)

Conclusions of the Experiment
At least in this small-team, remote IV&V context:
- Value-based reviews had significantly higher payoff than value-neutral reviews, with statistical significance for concerns and problems per hour, value impact, and value impact per hour
- VBR required less effort than CBR
- VBR checklists were helpful for understanding and reviewing the artifacts

Outline
- VBSE Refresher and Background
- Example: Value-Based Reviews
- Example: Value-Based Product and Process Modeling

Model Background [Madachy 2005]
Purpose: support software business decision-making by experimenting with product strategies and development practices to assess real earned value.
Description: a system dynamics model relating the interactions among product specifications and investments, software processes (including quality practices), market share, license retention, pricing, and revenue generation for a commercial software enterprise.

Model Features
- A Value-Based Software Engineering (VBSE) model covering the following VBSE elements:
– Stakeholders' value proposition elicitation and reconciliation
– Business case analysis
– Value-based monitoring and control
- Integrated modeling of business value, software products, and processes to help make difficult tradeoffs between perspectives:
– Value-based production functions relate the different attributes
- Addresses the planning and control aspect of VBSE to manage the value delivered to stakeholders:
– Experiment with different strategies and track financial measures over time
– Allows easy investigation of different strategy combinations
- Can be used dynamically before or during a project:
– User inputs and model factors can vary over the project duration, as opposed to a static model
– Suitable for actual project usage or "flight simulation" training, where simulations are interrupted to make midstream decisions

Model Sectors and Major Interfaces
– Software process and product sector: computes the staffing and quality over time
– Market and sales sector: accounts for market dynamics, including the effect of quality reputation
– Finance sector: computes financial measures from investments and revenues

Software Process and Product
[Sector diagram showing:]
– Product defect flows
– Effort and schedule calculation with a dynamic COCOMO variant
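As an illustrative sketch (not the model's actual dynamic equations), the static COCOMO II calculation that the dynamic variant builds on can be written as follows. The coefficients A, B, C, D are the published COCOMO II.2000 calibration values; treating the reliability driver as the only varying effort multiplier is an assumption made for this example.

```python
# COCOMO II-style effort and schedule estimate for a given product size.
# A, B, C, D are the COCOMO II.2000 calibration constants; sum_sf is the
# sum of the five scale factors (16.0 is an assumed nominal-ish value).

def cocomo_effort_schedule(ksloc, rely_multiplier=1.0, sum_sf=16.0):
    """Return (effort in person-months, schedule in months)."""
    A, B = 2.94, 0.91              # effort coefficient and base scale exponent
    C, D = 3.67, 0.28              # schedule coefficient and base exponent
    E = B + 0.01 * sum_sf          # diseconomy-of-scale exponent
    effort = A * ksloc ** E * rely_multiplier
    F = D + 0.2 * (E - B)          # schedule exponent tracks the scale exponent
    schedule = C * effort ** F
    return effort, schedule

# 80 KSLOC product (the size used in Example 2), RELY set one notch high
effort, schedule = cocomo_effort_schedule(80, rely_multiplier=1.10)
```

Raising the reliability multiplier increases effort and, through it, the schedule, which is exactly the market-delay tradeoff the later slides explore.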

Finances, Market and Sales
[Sector diagram showing:]
– Investment and revenue flows
– Software license sales
– Market share dynamics, including quality reputation
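A hypothetical sketch of the finance sector's bookkeeping: accumulate the investment (development cost) and revenue (license sales) streams over time and compute ROI. The quarterly cash-flow numbers below are invented purely for illustration.

```python
# ROI from accumulated investment and revenue streams.

def roi(investments, revenues):
    """ROI = (total revenue - total investment) / total investment."""
    total_inv = sum(investments)
    total_rev = sum(revenues)
    return (total_rev - total_inv) / total_inv

# hypothetical quarterly cash flows in $M: up-front development spending,
# then license revenue ramping up after release
investments = [1.5, 1.5, 1.0, 0.5, 0.2, 0.2, 0.2, 0.2]
revenues    = [0.0, 0.0, 0.5, 1.2, 1.8, 2.2, 2.4, 2.5]
print(round(roi(investments, revenues), 2))
```

In the full model these streams are simulation outputs rather than fixed lists, so ROI can be tracked continuously over the run.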

Quality Assumptions
– The COCOMO cost driver Required Software Reliability (RELY) is a proxy for all quality practices
– Resulting quality modulates the actual sales relative to the highest potential
– Perception of quality in the market matters
– Quality reputation is quickly lost and takes much longer to regain (bad news travels fast)
– Modeled as asymmetric information smoothing via a negative feedback loop
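A minimal sketch of the asymmetric information-smoothing idea on this slide: reputation tracks perceived quality through first-order smoothing, but with a short delay when quality drops (bad news travels fast) and a long delay when it improves. The delay constants are assumptions for illustration, not model calibrations.

```python
# One Euler step of first-order smoothing with asymmetric delays (in years).

def update_reputation(reputation, perceived_quality, dt,
                      loss_delay=0.25, gain_delay=2.0):
    """Reputation chases perceived quality; bad news uses the short delay."""
    delay = loss_delay if perceived_quality < reputation else gain_delay
    return reputation + dt * (perceived_quality - reputation) / delay

rep = 0.9
for _ in range(12):                  # one year of monthly steps at quality 0.5
    rep = update_reputation(rep, 0.5, dt=1/12)
rep_after_drop = rep                 # reputation collapses almost all the way

for _ in range(12):                  # one year of monthly steps back at 0.9
    rep = update_reputation(rep, 0.9, dt=1/12)
rep_after_recovery = rep             # only part of the loss is regained
```

After a year of poor quality the reputation has essentially bottomed out, while a year of restored quality recovers only a fraction of the loss, reproducing the asymmetry described above.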

Market Share Production Function and Feature Sets
[Chart: production function relating feature sets to market share; cases from Example 1]

Sales Production Function and Reliability
[Chart: production function relating reliability to sales; cases from Example 1]
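A hedged sketch of a value-based production function like those plotted on these two slides: an S-curve relating an attribute level (e.g., delivered reliability or feature-set size) to the fraction of potential value attained (e.g., sales or market share). The logistic shape and its parameters are assumptions chosen to show the typical investment, high-payoff, and diminishing-returns segments, not the calibrated curves from the model.

```python
import math

def production_function(x, midpoint=0.5, steepness=10.0):
    """Fraction of potential value attained at attribute level x in [0, 1]."""
    return 1.0 / (1.0 + math.exp(-steepness * (x - midpoint)))

# low levels: investment segment; middle: high payoff; high: diminishing returns
for level in (0.2, 0.5, 0.8):
    print(round(production_function(level), 3))
```

Because the payoff flattens at both ends, intermediate attribute levels give the best value per unit of investment, which is what makes the sweet-spot analysis in Example 2 meaningful.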

Example 1: Dynamically Changing Scope and Reliability
– Shows how the model can assess the effects of combined strategies by varying the scope and required reliability independently or simultaneously
– Simulates midstream descoping, a frequent strategy for meeting time constraints by shedding features
– Three cases are demonstrated:
– Unperturbed reference case
– Midstream descoping of the reference case after ½ year
– Simultaneous midstream descoping and lowered required reliability at ½ year

Control Panel and Simulation Results
[Simulation screenshots: unperturbed reference case; Case 1 (descope); Case 2 (descope + lower reliability)]

Case Summaries
Columns: Case | Delivered Size (Function Points) | Delivered Reliability Setting | Cost ($M) | Delivery Time (Years) | Final Market Share (%) | ROI
– Reference Case: Unperturbed (ROI 1.3)
– Case 1: Descope at Time = ½ year (ROI 2.2)
– Case 2: Descope and Lower Reliability at Time = ½ year (ROI 1.0)

Example 2: Determining the Reliability Sweet Spot
Analysis process:
– Vary reliability across runs
– Use a risk exposure framework to find the process optimum
– Assess risk consequences of opposing trends: market delays and bad-quality losses
– Sum market losses and development costs
– Calculate the resulting net revenue
Simulation parameters:
– A new 80-KSLOC product release can potentially increase market share by 15%–30% (varied in model runs)
– 75% schedule acceleration
– Initial total market size = $64M annual revenue; vendor has 15% of the market; overall market doubles in 5 years

Cost Components
[Chart: cost components versus reliability setting, 3-year time horizon]

Sweet Spot Depends on Time Horizon
[Chart]

Conclusions
– To achieve real earned value, business value attainment must be a key consideration when designing software products and processes
– Software enterprise decision-making can improve with information from simulation models that integrate business and technical perspectives
– Optimal policies operate within a multi-attribute decision space including various stakeholder value functions, opposing market factors, and business constraints
– Risk exposure is a convenient framework for software decision analysis
– Commercial process sweet spots with respect to reliability balance market-delay losses against quality losses
– The model demonstrates a stakeholder value chain whereby the value of software to end users ultimately translates into value for the software development organization

Future Work
– Enhance the product defect model with a dynamic version of COQUALMO to enable more constructive insight into quality practices
– Add maintenance and operational support activities to the workflows
– Elaborate market and sales for other considerations, including pricing scheme impacts, varying market assumptions, and periodic upgrades of varying quality
– Account for feedback loops that generate product specifications (closed-loop control): external feedback from users to incorporate new features, and internal feedback on product initiatives from the organizational planning and control entity to the software process
– Gather more empirical data on the attribute relationships in the model to help identify areas of improvement
– Collect and analyze more field data on business value and quality measures from actual software product rollouts to assess the overall dynamics

References - I
– C. Baldwin and K. Clark, Design Rules: The Power of Modularity, MIT Press.
– S. Biffl, A. Aurum, B. Boehm, H. Erdogmus, and P. Gruenbacher (eds.), Value-Based Software Engineering, Springer, 2005 (to appear).
– D. Blackwell and M. Girshick, Theory of Games and Statistical Decisions, Wiley.
– B. Boehm, C. Abts, A. W. Brown, S. Chulani, B. Clark, E. Horowitz, R. Madachy, D. Reifer, and B. Steece, Software Cost Estimation with COCOMO II, Prentice Hall.
– B. Boehm and L. Huang, "Value-Based Software Engineering: A Case Study," Computer, March 2003.
– B. Boehm and R. Ross, "Theory-W Software Project Management: Principles and Examples," IEEE Trans. Software Engineering, July 1989.
– W. Brogan, Modern Control Theory, Prentice Hall, 1974 (3rd ed., 1991).
– P. Checkland, Systems Thinking, Systems Practice, Wiley.
– C. W. Churchman, R. Ackoff, and E. Arnoff, An Introduction to Operations Research, Wiley.
– R. M. Cyert and J. G. March, A Behavioral Theory of the Firm, Prentice Hall.
– K. Lee, "Development and Evaluation of Value-Based Review Methods," USC-CSE, 2005.
– C. G. Hempel and P. Oppenheim, "Problems of the Concept of General Law," in A. Danto and S. Morgenbesser (eds.), Philosophy of Science, Meridian Books.

References - II
– R. Kaplan and D. Norton, The Balanced Scorecard: Translating Strategy into Action, Harvard Business School Press.
– R. L. Keeney and H. Raiffa, Decisions with Multiple Objectives: Preferences and Value Tradeoffs, Cambridge University Press.
– R. Madachy, "Integrated Modeling of Business Value and Software Processes," ProSim Workshop, 2005.
– R. Madachy, Software Process Dynamics, IEEE Computer Society, 2006 (to be published).
– A. Maslow, Motivation and Personality, Harper.
– J. Rawls, A Theory of Justice, Belknap/Harvard University Press, 1971.
– J. Thorp and DMR, The Information Paradox, McGraw-Hill.
– R. J. Torraco, "Theory-building research methods," in R. A. Swanson and E. F. Holton III (eds.), Human Resource Development Handbook: Linking Research and Practice, pp. 114–137, Berrett-Koehler.
– S. Toulmin, Cosmopolis: The Hidden Agenda of Modernity, University of Chicago Press, 1992 reprint edition.
– J. von Neumann and O. Morgenstern, Theory of Games and Economic Behavior, Princeton University Press.
– A. W. Wymore, A Mathematical Theory of Systems Engineering: The Elements, Wiley, New York.