1
University of Southern California Center for Software Engineering CSE USC 3/26/101 © 2005-2010 USC-CSSE Value-Based Software Engineering CS 577b Winsor Brown (Courtesy Ray Madachy, Barry Boehm, Keun Lee & Apurva Jain) March 26, 2010
2
University of Southern California Center for Software Engineering CSE USC 3/26/102 © 2005-2010 USC-CSSE Outline VBSE Refresher and Background Example: Value-Based Reviews Example: Value-Based Product and Process Modeling
3
University of Southern California Center for Software Engineering CSE USC 3/26/103 © 2005-2010 USC-CSSE A Short History of Software Processes - necessarily oversimplified
4
University of Southern California Center for Software Engineering CSE USC 3/26/104 © 2005-2010 USC-CSSE Software Engineering Is Not Well-Practiced Today – Standish Group CHAOS Report 1995 [Pie chart: on-time/on-budget vs. discontinued vs. seriously overrun projects. Averages: 189% of original budget, 221% of original schedule, 61% of original functionality delivered]
5
University of Southern California Center for Software Engineering CSE USC 3/26/105 © 2005-2010 USC-CSSE Less Chaos Today – Standish Group CHAOS Report 2007 [Pie chart: on-time/on-budget (successful) 35%, discontinued (failures) 19%, seriously overrun (challenged) 46%. Averages: 189% of original budget, 221% of original schedule, 61% of original functionality delivered]
6
University of Southern California Center for Software Engineering CSE USC 3/26/106 © 2005-2010 USC-CSSE Why Software Projects Fail
7
University of Southern California Center for Software Engineering CSE USC 3/26/107 © 2005-2010 USC-CSSE Software Testing Business Case Vendor proposition –Our test data generator will cut your test costs in half –We’ll provide it to you for 30% of your test costs –After you run all your tests for 50% of your original cost, you are 20% ahead Any concerns with the vendor proposition? –The test data generator is value-neutral –Every test case and defect is treated as equally important –Usually, 20% of test cases cover 80% of the business case
8
University of Southern California Center for Software Engineering CSE USC 3/26/108 © 2005-2010 USC-CSSE 20% of Features Provide 80% of Value: Focus Testing on These (Bullock, 2000) [Chart: % of value for correct customer billing vs. customer type; the automated test generation tool treats all tests as having equal value]
9
University of Southern California Center for Software Engineering CSE USC 3/26/109 © 2005-2010 USC-CSSE Value-Based Testing Provides More Net Value [Chart: net value (NV) vs. percent of tests run; the test data generator curve tops out at (100, 20) while the value-based testing curve peaks at (30, 58)]
% Tests | Test Data Generator (Cost, Value, NV) | Value-Based Testing (Cost, Value, NV)
0   | 30, 0, -30   | 0, 0, 0
10  | 35, 10, -25  | 10, 50, +40
20  | 40, 20, -20  | 20, 75, +55
30  | 45, 30, -15  | 30, 88, +58
40  | 50, 40, -10  | 40, 94, +54
...
100 | 80, 100, +20 | 100, 100, 0
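The two curves above can be recomputed directly from the table. A minimal Python sketch follows; the 100%-tests value for value-based testing is inferred from its zero net value, and the intermediate rows elided on the slide are omitted here too.

```python
# Net value (NV = value - cost) recomputed from the slide's table.
# TDG = automated test data generator (30% up-front tool cost, half-price runs);
# VBT = value-based testing (highest-value tests run first, full-price runs).

rows = [
    # % tests run, TDG cost, TDG value, VBT cost, VBT value
    (  0, 30,   0,   0,   0),
    ( 10, 35,  10,  10,  50),
    ( 20, 40,  20,  20,  75),
    ( 30, 45,  30,  30,  88),
    ( 40, 50,  40,  40,  94),
    (100, 80, 100, 100, 100),  # VBT value at 100% inferred from NV = 0
]

for pct, tdg_cost, tdg_val, vbt_cost, vbt_val in rows:
    print(f"{pct:3d}% of tests: TDG NV = {tdg_val - tdg_cost:+4d}, "
          f"VBT NV = {vbt_val - vbt_cost:+4d}")
# VBT peaks near 30% of tests run (NV = +58); TDG tops out at +20 after all tests.
```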
10
University of Southern California Center for Software Engineering CSE USC 3/26/1010 © 2005-2010 USC-CSSE Motivation for Value-Based SE Current SE methods are basically value-neutral –Every requirement, use case, object, test case, and defect is equally important –Object oriented development is a logic exercise –“Earned Value” Systems don’t track business value –Separation of concerns: SE’s job is to turn requirements into verified code –Ethical concerns separated from daily practices Value-neutral SE methods are increasingly risky –Software decisions increasingly drive system value –Corporate adaptability to change achieved via software decisions –System value-domain problems are the chief sources of software project failures
11
University of Southern California Center for Software Engineering CSE USC 3/26/1011 © 2005-2010 USC-CSSE Key Definitions Value (from Latin “valere” – to be worth) 1.A fair or equivalent in goods, services, or money 2.The monetary worth of something 3.Relative worth, utility or importance Software validation (also from Latin “valere”) –Validation: Are we building the right product? –Verification: Are we building the product right?
12
University of Southern California Center for Software Engineering CSE USC 3/26/1012 © 2005-2010 USC-CSSE 7 Key Elements of VBSE 1.Benefits Realization Analysis 2.Stakeholders’ Value Proposition Elicitation and Reconciliation 3.Business Case Analysis 4.Continuous Risk and Opportunity Management 5.Concurrent System and Software Engineering 6.Value-Based Monitoring and Control 7.Change as Opportunity
13
University of Southern California Center for Software Engineering CSE USC 3/26/1013 © 2005-2010 USC-CSSE Maslow Human Need Hierarchy A. Maslow, Motivation and Personality, 1954. Self-Actualization Esteem and Autonomy Belongingness and love Safety and Security Physiological (Shelter, Food and Drink)
14
University of Southern California Center for Software Engineering CSE USC 3/26/1014 © 2005-2010 USC-CSSE Maslow Need Hierarchy Satisfied needs aren’t motivators Unsatisfied lower-level needs dominate higher-level needs Management implications – Create environment and subculture which satisfies lower-level needs Stability Shared values, community Match to special needs – Tailor project objectives, structure to participants’ self-actualization priorities
15
University of Southern California Center for Software Engineering CSE USC 3/26/1015 © 2005-2010 USC-CSSE People Self-Actualize in Different Ways Becoming a Better Manager Becoming a Better Technologist Helping Other Developers Helping Users Making People Happy Making People Unhappy Doing New Things Increasing Professional Stature
16
University of Southern California Center for Software Engineering CSE USC 3/26/1016 © 2005-2010 USC-CSSE Theory W: WinWin Achievement Theorem Making winners of your success-critical stakeholders requires: i.Identifying all of the success-critical stakeholders (SCSs). ii.Understanding how the SCSs want to win. iii.Having the SCSs negotiate a win-win set of product and process plans. iv.Controlling progress toward SCS win-win realization, including adaptation to change.
17
University of Southern California Center for Software Engineering CSE USC 3/26/1017 © 2005-2010 USC-CSSE VBSE Theory 4+1 Structure
18
University of Southern California Center for Software Engineering CSE USC 3/26/1018 © 2005-2010 USC-CSSE VBSE Component Theories Theory W (Stakeholder win-win) –Enterprise Success Theorem, Win-Win Achievement Theorem Dependency Theory (Product, process, people interdependencies) –Systems architecture/performance theory, costing and scheduling theory; organization theory Utility Theory –Utility functions, bounded rationality, Maslow need hierarchy, multi-attribute utility theory Decision Theory –Statistical decision theory, game theory, negotiation theory, theory of Justice Control Theory –Observability, predictability, controllability, stability theory
19
University of Southern California Center for Software Engineering CSE USC 3/26/1019 © 2005-2010 USC-CSSE Dependency Theory - Example
20
University of Southern California Center for Software Engineering CSE USC 3/26/1020 © 2005-2010 USC-CSSE Utility Theory - Example
21
University of Southern California Center for Software Engineering CSE USC 3/26/1021 © 2005-2010 USC-CSSE Decision Theory - Example
22
University of Southern California Center for Software Engineering CSE USC 3/26/1022 © 2005-2010 USC-CSSE Decision Theory – Example (2) [Apply Models to Predict Balance] - Risk Exposure (RE) due to inadequate plans - RE due to market share erosion - Sum of risk exposures
23
University of Southern California Center for Software Engineering CSE USC 3/26/1023 © 2005-2010 USC-CSSE Control Theory - Example Value Realization Feedback Control
24
University of Southern California Center for Software Engineering CSE USC 3/26/1024 © 2005-2010 USC-CSSE Example Project: Sierra Mountainbikes –Based on what would have worked on a similar project Quality leader in specialty area Competitively priced Major problems with order processing –Delivery delays and mistakes –Poor synchronization of order entry, confirmation, fulfillment –Disorganized responses to problem situations –Excess costs; low distributor satisfaction
25
University of Southern California Center for Software Engineering CSE USC 3/26/1025 © 2005-2010 USC-CSSE Order Processing Project Goals Goals: Improve profits, market share, customer satisfaction via improved order processing Questions: Current state? Root causes of problems? Keys to improvement? Metrics: Balanced Scorecard of benefits realized, proxies –Customer satisfaction ratings; key elements (ITV: in-transit visibility) –Overhead cost reduction –Actual vs. expected benefit and cost flows, ROI
26
University of Southern California Center for Software Engineering CSE USC 3/26/1026 © 2005-2010 USC-CSSE Expanded Order Processing System Benefits Chain
27
University of Southern California Center for Software Engineering CSE USC 3/26/1027 © 2005-2010 USC-CSSE Project Strategy and Partnerships Partner with eServices, Inc. for order processing and fulfillment system –Profit sharing using jointly-developed business case Partner with key distributors to provide user feedback –Evaluate prototypes, beta-test early versions, provide satisfaction ratings Incremental development using MBASE/RUP anchor points –Life Cycle Objectives; Architecture (LCO; LCA) –Core Capability Drivethrough (CCD) –Initial; Full Operational Capability (IOC; FOC) Architect for later supply chain extensions
28
University of Southern California Center for Software Engineering CSE USC 3/26/1028 © 2005-2010 USC-CSSE Order Processing System Schedules and Budgets
Milestone | Due Date | Budget ($K) | Cumulative Budget ($K)
Inception Readiness | 1/1/2004 | 0 | 0
Life Cycle Objectives | 1/31/2004 | 120 | 120
Life Cycle Architecture | 3/31/2004 | 280 | 400
Core Capability Drivethrough | 7/31/2004 | 650 | 1050
Initial Oper. Capability: SW | 9/30/2004 | 350 | 1400
Initial Oper. Capability: HW | 9/30/2004 | 2100 | 3500
Developed IOC | 12/31/2004 | 500 | 4000
Responsive IOC | 3/31/2005 | 500 | 4500
Full Oper. Cap’y CCD | 7/31/2005 | 700 | 5200
FOC Beta | 9/30/2005 | 400 | 5600
FOC Deployed | 12/31/2005 | 400 | 6000
Annual Oper. & Maintenance | | 3800 |
Annual O&M, Old System | | 7600 |
29
University of Southern California Center for Software Engineering CSE USC 3/26/1029 © 2005-2010 USC-CSSE Order Processing System: Expected Benefits and Business Case
30
University of Southern California Center for Software Engineering CSE USC 3/26/1030 © 2005-2010 USC-CSSE A Real Earned Value System Current “earned value” systems monitor cost and schedule, not business value –Budgeted cost of work performed (“earned”) –Budgeted cost of work scheduled (“yearned”) –Actual costs vs. schedule (“burned”) A real earned value system monitors benefits realized –Financial benefits realized vs. cost (ROI) –Benefits realized vs. schedule - Including non-financial metrics –Actual costs vs. schedule
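A hedged sketch of the distinction, with hypothetical numbers: conventional earned-value indices say nothing about business value, so a "real" earned value system adds benefits-realized tracking (e.g., ROI to date) alongside them.

```python
from dataclasses import dataclass

@dataclass
class PeriodStatus:
    planned_cost: float       # budgeted cost of work scheduled ("yearned"), $K
    earned_cost: float        # budgeted cost of work performed ("earned"), $K
    actual_cost: float        # actual cost to date ("burned"), $K
    benefits_realized: float  # measured business value delivered to date, $K

def cost_schedule_indices(p: PeriodStatus):
    """Classic, value-neutral earned value: cost and schedule performance only."""
    return p.earned_cost / p.actual_cost, p.earned_cost / p.planned_cost

def realized_roi(p: PeriodStatus):
    """Value-based view: return on investment of the benefits actually realized."""
    return (p.benefits_realized - p.actual_cost) / p.actual_cost

# Hypothetical status after one increment (not Sierra Mountainbikes data).
status = PeriodStatus(planned_cost=1050, earned_cost=1000,
                      actual_cost=1100, benefits_realized=1400)
cpi, spi = cost_schedule_indices(status)
print(f"CPI = {cpi:.2f}, SPI = {spi:.2f}, realized ROI = {realized_roi(status):.2f}")
```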
31
University of Southern California Center for Software Engineering CSE USC 3/26/1031 © 2005-2010 USC-CSSE Value-Based Expected/Actual Outcome Tracking Capability
32
University of Southern California Center for Software Engineering CSE USC 3/26/1032 © 2005-2010 USC-CSSE Conclusions So Far Value considerations are software success-critical “Success” is a function of key stakeholder values –Risky to exclude key stakeholders Values vary by stakeholder role Non-monetary values are important –Fairness, customer satisfaction, trust
33
University of Southern California Center for Software Engineering CSE USC 3/26/1033 © 2005-2010 USC-CSSE Initial VBSE Theory: 4+1 Process – With a great deal of concurrency and backtracking
34
University of Southern California Center for Software Engineering CSE USC 3/26/1034 © 2005-2010 USC-CSSE Outline VBSE Refresher and Background Example: Value-Based Reviews Example: Value-Based Product and Process Modeling
35
University of Southern California Center for Software Engineering CSE USC 3/26/1035 © 2005-2010 USC-CSSE Problem finding via peer reviews [Lee 2005] Hypothesis: Current value-neutral software peer reviews misallocate effort - Every requirement, use case, object, defect is equally important - Too much effort is spent on trivial issues - Current checklist, function, perspective, usage-based reviews are largely value-neutral
36
University of Southern California Center for Software Engineering CSE USC 3/26/1036 © 2005-2010 USC-CSSE Motivation: Value-neutral method vs. Value-based method [Chart: automated test generation (ATG) vs. Pareto testing; cumulative business value (%) vs. customer billing type. Pareto testing follows the 80-20 distribution of actual test value; ATG treats all tests as having equal value]
37
University of Southern California Center for Software Engineering CSE USC 3/26/1037 © 2005-2010 USC-CSSE Motivation: Value-neutral review vs. Value-based review Return on Investment (ROI) = (benefits – costs) / costs Assumptions - $1M of the development costs has been invested in the customer billing system by the start of the review. - Both review techniques will cost $200K. - The business case for the system will produce $4M in business value in return for the $2M investment cost. - The business case provides a similar 80:20 value distribution for the value-based review.
38
University of Southern California Center for Software Engineering CSE USC 3/26/1038 © 2005-2010 USC-CSSE Motivation: Value-neutral review vs. Value-based review
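One hedged way to turn the assumptions on slide 37 into ROI numbers for the two review styles; the coverage fractions (how much of the business value each review secures) are illustrative assumptions, not values from the slides.

```python
# Knowns from the slide: $1.0M already invested, each review costs $0.2M, and the
# full business case returns $4M on a $2M investment.  The fraction of value each
# review style "secures" below is an illustrative assumption.

TOTAL_VALUE = 4.0          # $M, business value if the system succeeds
INVESTED = 1.0             # $M, sunk development cost at review time
REVIEW_COST = 0.2          # $M, cost of either review

def roi(value_secured):
    cost = INVESTED + REVIEW_COST
    return (value_secured - cost) / cost

# Suppose either review can cover 20% of the capabilities in the time available.
# A value-neutral review treats all capabilities alike (20% coverage -> ~20% of
# value); a value-based review covers the 20% of capabilities that carry ~80%
# of the value (the Pareto assumption above).
value_neutral_secured = 0.20 * TOTAL_VALUE   # $0.8M
value_based_secured   = 0.80 * TOTAL_VALUE   # $3.2M

print(f"value-neutral review ROI: {roi(value_neutral_secured):+.2f}")  # ~ -0.33
print(f"value-based review ROI:   {roi(value_based_secured):+.2f}")    # ~ +1.67
```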
39
University of Southern California Center for Software Engineering CSE USC 3/26/1039 © 2005-2010 USC-CSSE Objective Add value considerations to the reading process: focus on higher-valued issues first, find more higher-valued issues, and increase the return on reading effort. Adding priority and criticality to the VBR process increases the impact of the issues found and the cost-effectiveness of the review.
40
University of Southern California Center for Software Engineering CSE USC 3/26/1040 © 2005-2010 USC-CSSE Objective Objectives: Develop and experiment with value-based peer review processes and checklists - Initial value-based checklists available as USC-CSE Tech Report - Experimentally applied across 28 remote IV&V reviewers
41
University of Southern California Center for Software Engineering CSE USC 3/26/1041 © 2005-2010 USC-CSSE Review/Reading Techniques (also generally value-neutral)
Checklist Based Reading (CBR). Definition: reading technique with a checklist. Characteristics: common review technique used in the field. Strengths: easy to apply; the checklist helps focus what to do. Shortfalls: weak mapping to the artifacts reviewed.
Defect Based Reading (DBR). Definition: reading guided by defect classes. Characteristics: proposed for requirements documents. Strengths: clearer focus for reviewers. Shortfalls: often weak mapping to the artifacts reviewed.
Perspective Based Reading (PBR). Definition: reading guided by different reviewer perspectives. Characteristics: different reviewers’ points of view. Strengths: clearer focus, less overlap. Shortfalls: less redundancy; little backup for less-effective reviewers.
Functionality Based Reading (FBR). Definition: reading guided by functionality types. Characteristics: function-oriented. Strengths: good for functional specifications. Shortfalls: mismatches to object-oriented specifications.
Usage Based Reading (UBR). Definition: reading guided by use cases. Characteristics: sometimes prioritized based on use cases. Strengths: very good for usage problems. Shortfalls: weaker coverage of other problems.
42
University of Southern California Center for Software Engineering CSE USC 3/26/1042 © 2005-2010 USC-CSSE Value-Based Review Concepts Priority The priority of the system capability in the artifact. In MBASE, the priority is determined from negotiations, meetings with clients and priorities indicated in the MBASE Guidelines. The values of priority are High, Medium, Low or 3, 2, 1. Higher priority capabilities will be reviewed first. The value will be used to calculate effectiveness metrics. Criticality Generally, the values of criticalities are given by SE experts, but IV&Vers can determine the values when better qualified. The values of criticality are High, Medium, Low or 3, 2, 1. Higher criticality issues will be reviewed first at a given priority level. The value will be used to calculate effectiveness metrics.
43
University of Southern California Center for Software Engineering CSE USC 3/26/1043 © 2005-2010 USC-CSSE Value-Based Review Process (II) Negotiation meetings among developers, customers, users, and other stakeholders yield the priorities of system capabilities; a domain expert and the general value-based checklist yield the criticalities of issues in an artifact-oriented checklist. Artifacts are then reviewed in order of artifact priority and issue criticality: high-priority/high-criticality combinations first, and the lowest-priority/lowest-criticality combinations last or optionally. [Diagram: 3×3 priority × criticality matrix, numbered 1-6 to indicate the usual order of review*] * May be more cost-effective to review highly-coupled mixed-priority artifacts.
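A minimal sketch of the review ordering just described and of the impact metric used later in the results (Impact = sum over issues of artifact priority × issue criticality); the artifacts and issues below are hypothetical examples, and High/Medium/Low map to 3/2/1 as on the previous slide.

```python
LEVEL = {"High": 3, "Medium": 2, "Low": 1}

artifacts = [  # (artifact name, priority) - hypothetical examples
    ("Order entry use case", "High"),
    ("Report formatting",    "Low"),
    ("Payment interface",    "High"),
]

issues = [  # (artifact name, issue description, criticality) - hypothetical
    ("Order entry use case", "missing exception handling",    "High"),
    ("Order entry use case", "grammatical error",             "Low"),
    ("Payment interface",    "ambiguous acceptance criteria", "Medium"),
    ("Report formatting",    "GUI detail unspecified",        "Low"),
]

# Review order: by artifact priority, then issue criticality (both descending).
priority_of = dict(artifacts)
review_order = sorted(
    issues,
    key=lambda i: (LEVEL[priority_of[i[0]]], LEVEL[i[2]]),
    reverse=True,
)

impact = sum(LEVEL[priority_of[a]] * LEVEL[c] for a, _, c in review_order)
for a, desc, c in review_order:
    print(f"{priority_of[a]:6}/{c:6}  {a}: {desc}")
print("Total impact of issues found:", impact)  # 9 + 6 + 3 + 1 = 19 here
```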
44
University of Southern California Center for Software Engineering CSE USC 3/26/1044 © 2005-2010 USC-CSSE Value-Based Checklist (I) (high-, medium-, and low-criticality issues per category)
Completeness
- High: critical missing elements: backup/recovery, external interfaces, success-critical stakeholders; critical exception handling, missing priorities; critical missing processes and tools; planning and preparation for major downstream tasks (development, integration, test, transition); critical missing project assumptions (client responsiveness, COTS adequacy, needed resources)
- Medium: medium-criticality missing elements, processes and tools: maintenance and diagnostic support; user help; medium-criticality exceptions and off-nominal conditions; smaller tasks (review, client demos), missing desired growth capabilities, workload characterization
- Low: easily-deferrable, low-impact missing elements: straightforward error messages, help messages, GUI details doable via GUI builder, project task sequence details
Consistency/Feasibility
- High: critical elements in OCD, SSRD, SSAD, LCP not traceable to each other; critical inter-artifact inconsistencies: priorities, assumptions, input/output, preconditions/post-conditions; missing evidence of critical consistency/feasibility assurance in FRD
- Medium: medium-criticality shortfalls in traceability, inter-artifact inconsistencies, evidence of consistency/feasibility in FRD
- Low: easily-deferrable, low-impact inconsistencies or inexplicit traceability: GUI details, report details, error messages, help messages, grammatical errors
Ambiguity
- High: vaguely defined critical dependability capabilities: fault tolerance, graceful degradation, interoperability, safety, security, survivability; critical misleading ambiguities: stakeholder intent, acceptance criteria, critical user decision support, terminology
- Medium: vaguely defined medium-criticality capabilities, test criteria; medium-criticality misleading ambiguities
- Low: non-misleading, easily deferrable, low-impact ambiguities: GUI details, report details, error messages, help messages, grammatical errors
Conformance
- High: lack of conformance with critical operational standards, external interfaces
- Medium: lack of conformance with medium-criticality operational standards, external interfaces; misleading lack of conformance with document formatting standards, method and tool conventions
- Low: non-misleading lack of conformance with document formatting standards, method and tool conventions, optional or low-impact operational standards
Risk
- High: missing FRD evidence of critical capability feasibility: high-priority features, levels of service, budgets and schedules; critical risks in the top-10 risk checklist: personnel, budgets and schedules, requirements, COTS, architecture, technology
- Medium: missing FRD evidence of mitigation strategies for low-probability/high-impact or high-probability/low-impact risks: unlikely disasters, off-line service delays, missing but easily-available information
- Low: missing FRD evidence of mitigation strategies for low-probability, low-impact risks
45
University of Southern California Center for Software Engineering CSE USC 3/26/1045 © 2005-2010 USC-CSSE Value-Based Checklist (II) – Example of General Value-Based Checklists
Consistency/Feasibility, High-Criticality Issues: critical elements in OCD, SSRD, SSAD, LCP not traceable to each other; critical inter-artifact inconsistencies: priorities, assumptions, input/output, preconditions/post-conditions; missing evidence of critical consistency/feasibility assurance in FRD
Consistency/Feasibility, Low-Criticality Issues: easily-deferrable, low-impact inconsistencies or inexplicit traceability: GUI details, report details, error messages, help messages, grammatical errors
46
University of Southern California Center for Software Engineering CSE USC 3/26/1046 © 2005-2010 USC-CSSE Value-Based Checklist (III) – Questions with criticality ratings
- Are the system capabilities consistent with the system services provided as described in OCD 2.3? (3)
- Are there critical missing capabilities needed to perform the system services? (3)
- Are capabilities prioritized as High, Medium, or Low? (3)
- Are capability priorities consistent with current system shortcoming priorities (OCD 3.3.5)? (3)
- Are capabilities traced back to corresponding project goals and constraints (OCD 4.2)? (3)
- Are simple lower-priority capabilities (e.g., login) described in less detail? (2)
- Are there no levels of service goals (OCD 4.4) included as system capabilities? (2)
47
University of Southern California Center for Software Engineering CSE USC 3/26/1047 © 2005-2010 USC-CSSE Weight of Review Issues
48
University of Southern California Center for Software Engineering CSE USC 3/26/1048 © 2005-2010 USC-CSSE Experiment Overall (II) [577A IV&Vers 2004] IV&Vers were involved in real-client (e-services) projects in the course CSCI 577A. 28 IV&Vers were randomly selected and divided into two groups: A (15) and B (13). Group A: value-based review; Group B: review with a traditional checklist. The groups were trained separately on the value-based review technique and the traditional-checklist review. Each reviewed three documents directly related to development: OCD (Operational Concept Description), SSRD (System and Software Requirements Description), SSAD (System and Software Architecture Description). Hypotheses tested: no difference between Groups A and B in - average number of concerns and problems - average impact of concerns and problems - average number of concerns and problems per effort hour - average impact of concerns and problems per effort hour
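The hypotheses above are evaluated with two-sample t-tests on per-reviewer metrics. A hedged sketch of that style of comparison follows; the per-reviewer values are placeholders, not the experiment's data, and Welch's t-test via SciPy is assumed to be an acceptable stand-in for the test actually used.

```python
# Two-sample comparison of Group A (VBR, 15 reviewers) vs. Group B (CBR, 13
# reviewers) on a metric such as impact of problems per effort hour.
# The sample values below are placeholders, NOT the experiment's data.

from scipy import stats

group_a_impact_per_hour = [4.2, 5.1, 3.8, 6.0, 4.9, 5.5, 3.9, 4.7,
                           5.8, 4.4, 5.0, 6.2, 4.1, 5.3, 4.8]   # 15 VBR reviewers
group_b_impact_per_hour = [2.9, 3.1, 2.4, 3.6, 2.8, 3.3, 2.7,
                           3.0, 2.6, 3.4, 2.5, 3.2, 2.8]        # 13 CBR reviewers

t_stat, p_value = stats.ttest_ind(group_a_impact_per_hour,
                                  group_b_impact_per_hour,
                                  equal_var=False)  # Welch's t-test
mean_a = sum(group_a_impact_per_hour) / len(group_a_impact_per_hour)
mean_b = sum(group_b_impact_per_hour) / len(group_b_impact_per_hour)
pct_higher = (mean_a / mean_b - 1) * 100

print(f"t = {t_stat:.2f}, p = {p_value:.4f}, Group A higher by {pct_higher:.0f}%")
```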
49
University of Southern California Center for Software Engineering CSE USC 3/26/1049 © 2005-2010 USC-CSSE Independent Verification and Validation (IV&V) [Diagram: developers produce the OCD, SSRD, and SSAD; Group A (VBR) and Group B (CBR) IV&Vers receive checklist training, review the artifacts, and identify concerns, which are filtered into problems provided to the developers to fix]
50
University of Southern California Center for Software Engineering CSE USC 3/26/1050 © 2005-2010 USC-CSSE Result (V) – T-test p-values and % by which Group A was higher
By number (p-value, % Group A higher):
- Average number of concerns: 0.202, 34%
- Average number of problems: 0.056, 51%
- Average concerns per hour: 0.026, 55%
- Average problems per hour: 0.023, 61%
By impact (p-value, % Group A higher):
- Average impact of concerns: 0.049, 65%
- Average impact of problems: 0.012, 89%
- Average cost effectiveness of concerns: 0.004, 105%
- Average cost effectiveness of problems: 0.007, 108%
Statistically, Group A performed the review more value-effectively in finding concerns and problems. Group B found significantly more trivial concerns and problems (typos and grammar faults).
51
University of Southern California Center for Software Engineering CSE USC 3/26/1051 © 2005-2010 USC-CSSE Result (VI-A) – Effort Comparison
52
University of Southern California Center for Software Engineering CSE USC 3/26/1052 © 2005-2010 USC-CSSE Number of concerns and problems found by IV&Vers
53
University of Southern California Center for Software Engineering CSE USC 3/26/1053 © 2005-2010 USC-CSSE Result (I-A) – Average Number of Concerns and Problems (by number) Average number of concerns: p = 0.202, Group A 34% higher. Average number of problems: p = 0.056, Group A 51% higher.
54
University of Southern California Center for Software Engineering CSE USC 3/26/1054 © 2005-2010 USC-CSSE Result (II-A) – Average Impact of Concerns and Problems (by impact) Average impact of concerns: p = 0.026, Group A 65% higher. Average impact of problems: p = 0.023, Group A 89% higher. Impact = Σ over issues of (artifact priority) × (issue criticality)
55
University of Southern California Center for Software Engineering CSE USC 3/26/1055 © 2005-2010 USC-CSSE Conclusions of the experiment: At least in this small-team, remote IV&V context, value-based reviews had significantly higher payoff than value-neutral reviews, with statistical significance for concerns and problems per hour, value impact, and value impact per hour. VBR required minimal effort compared with CBR, and the VBR checklists helped reviewers understand and review the artifacts.
56
University of Southern California Center for Software Engineering CSE USC 3/26/1056 © 2005-2010 USC-CSSE Outline VBSE Refresher and Background Example: Value-Based Reviews Example: Value-Based Product and Process Modeling
57
University of Southern California Center for Software Engineering CSE USC 3/26/1057 © 2005-2010 USC-CSSE Model Background [Madachy 2005] Purpose: Support software business decision-making by experimenting with product strategies and development practices to assess real earned value Description: System dynamics model relates the interactions between product specifications and investments, software processes including quality practices, market share, license retention, pricing and revenue generation for a commercial software enterprise
58
University of Southern California Center for Software Engineering CSE USC 3/26/1058 © 2005-2010 USC-CSSE Model Features A Value-Based Software Engineering (VBSE) model covering the following VBSE elements: –Stakeholders’ value proposition elicitation and reconciliation –Business case analysis –Value-based monitoring and control Integrated modeling of business value, software products and processes to help make difficult tradeoffs between perspectives –Value-based production functions used to relate different attributes Addresses the planning and control aspect of VBSE to manage the value delivered to stakeholders –Experiment with different strategies and track financial measures over time –Allows easy investigation of different strategy combinations Can be used dynamically before or during a project –User inputs and model factors can vary over the project duration as opposed to a static model –Suitable for actual project usage or “flight simulation” training where simulations are interrupted to make midstream decisions
59
University of Southern California Center for Software Engineering CSE USC 3/26/1059 © 2005-2010 USC-CSSE Model Sectors and Major Interfaces Software process and product sector computes the staffing and quality over time Market and sales sector accounts for market dynamics including effect of quality reputation Finance sector computes financial measures from investments and revenues
60
University of Southern California Center for Software Engineering CSE USC 3/26/1060 © 2005-2010 USC-CSSE Software Process and Product product defect flows effort and schedule calculation with dynamic COCOMO variant
61
University of Southern California Center for Software Engineering CSE USC 3/26/1061 © 2005-2010 USC-CSSE Finances, Market and Sales investment and revenue flows software license sales market share dynamics including quality reputation
62
University of Southern California Center for Software Engineering CSE USC 3/26/1062 © 2005-2010 USC-CSSE Quality Assumptions COCOMO cost driver Required Software Reliability is a proxy for all quality practices Resulting quality will modulate the actual sales relative to the highest potential Perception of quality in the market matters –Quality reputation quickly lost and takes much longer to regain (bad news travels fast) –Modeled as asymmetrical information smoothing via negative feedback loop
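A minimal sketch of the asymmetric reputation smoothing described above; the time constants and the quality trajectory are assumed for illustration, not calibrated values from the model.

```python
# Perceived quality reputation follows actual delivered quality quickly when
# quality drops ("bad news travels fast") and slowly when it improves.

def update_reputation(reputation, actual_quality, dt,
                      loss_time_const=0.25, gain_time_const=2.0):  # years, assumed
    """One Euler step of first-order (negative-feedback) information smoothing."""
    tau = loss_time_const if actual_quality < reputation else gain_time_const
    return reputation + dt * (actual_quality - reputation) / tau

if __name__ == "__main__":
    rep, dt = 1.0, 0.1                      # start with a good reputation
    for step in range(40):                  # 4 simulated years
        t = step * dt
        actual = 0.6 if t < 1.0 else 1.0    # a one-year quality dip, then recovery
        rep = update_reputation(rep, actual, dt)
        if step % 5 == 0:
            print(f"t = {t:3.1f} yr  reputation = {rep:.2f}")
```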
63
University of Southern California Center for Software Engineering CSE USC 3/26/1063 © 2005-2010 USC-CSSE Market Share Production Function and Feature Sets Cases from Example 1
64
University of Southern California Center for Software Engineering CSE USC 3/26/1064 © 2005-2010 USC-CSSE Sales Production Function & Reliability Cases from Example 1
65
University of Southern California Center for Software Engineering CSE USC 3/26/1065 © 2005-2010 USC-CSSE Example 1: Dynamically Changing Scope and Reliability Shows how model can assess the effects of combined strategies by varying the scope and required reliability independently or simultaneously Simulates midstream descoping, a frequent strategy to meet time constraints by shedding features Three cases are demonstrated: –Unperturbed reference case –Midstream descoping of the reference case after ½ year –Simultaneous midstream descoping and lowered required reliability at ½ year
66
University of Southern California Center for Software Engineering CSE USC 3/26/1066 © 2005-2010 USC-CSSE Control Panel and Simulation Results [Simulation output for the unperturbed reference case, Case 1 (descope), and Case 2 (descope + lower reliability)]
67
University of Southern California Center for Software Engineering CSE USC 3/26/1067 © 2005-2010 USC-CSSE Case Summaries
Case | Delivered Size (Function Points) | Delivered Reliability Setting | Cost ($M) | Delivery Time (Years) | Final Market Share | ROI
Reference Case: Unperturbed | 700 | 1.0 | 4.78 | 2.1 | 28% | 1.3
Case 1: Descope at Time = ½ year | 550 | 1.0 | 3.70 | 1.7 | 28% | 2.2
Case 2: Descope and Lower Reliability at Time = ½ year | 550 | 0.92 | 3.30 | 1.5 | 12% | 1.0
68
University of Southern California Center for Software Engineering CSE USC 3/26/1068 © 2005-2010 USC-CSSE Example 2: Determining the Reliability Sweet Spot Analysis process –Vary reliability across runs –Use risk exposure framework to find process optimum –Assess risk consequences of opposing trends: market delays and bad quality losses –Sum market losses and development costs –Calculate resulting net revenue Simulation parameters –A new 80 KSLOC product release can potentially increase market share by 15%-30% (varied in model runs) –75% schedule acceleration –Initial total market size = $64M annual revenue vendor has 15% of market overall market doubles in 5 years
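A hedged sketch of the sweet-spot search itself: sweep the reliability setting, sum the two opposing risk exposures (market-delay losses from a longer schedule vs. quality losses from shipping buggier software), and pick the minimum. The loss-curve shapes and constants are illustrative stand-ins, not the model's calibrated production functions or COCOMO relationships.

```python
def market_delay_loss(rely):
    """Higher reliability -> longer schedule -> more market share lost to delay."""
    return 2.0 * (rely - 0.75) ** 2 * 100   # illustrative $M-scale curve

def quality_loss(rely):
    """Lower reliability -> defects, bad quality reputation, lost sales."""
    return 12.0 / (rely ** 6)               # illustrative $M-scale curve

def total_risk_exposure(rely):
    return market_delay_loss(rely) + quality_loss(rely)

candidates = [0.80 + 0.02 * i for i in range(16)]        # reliability settings
sweet_spot = min(candidates, key=total_risk_exposure)
for r in candidates:
    print(f"RELY {r:.2f}: total risk exposure {total_risk_exposure(r):6.1f}")
print(f"Sweet spot (for this assumed time horizon): {sweet_spot:.2f}")
```

Changing the constants in the two loss curves (e.g., a longer time horizon that amplifies quality losses) shifts the minimum, which is the point of the "sweet spot depends on time horizon" result that follows.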
69
University of Southern California Center for Software Engineering CSE USC 3/26/1069 © 2005-2010 USC-CSSE Cost Components 3-year time horizon
70
University of Southern California Center for Software Engineering CSE USC 3/26/1070 © 2005-2010 USC-CSSE Sweet Spot Depends on Time Horizon
71
University of Southern California Center for Software Engineering CSE USC 3/26/1071 © 2005-2010 USC-CSSE Conclusions To achieve real earned value, business value attainment must be a key consideration when designing software products and processes. Software enterprise decision-making can improve with information from simulation models that integrate business and technical perspectives. Optimal policies operate within a multi-attribute decision space including various stakeholder value functions, opposing market factors, and business constraints. Risk exposure is a convenient framework for software decision analysis. Commercial process sweet spots with respect to reliability are a balance between market delay losses and quality losses. The model demonstrates a stakeholder value chain whereby the value of software to end users ultimately translates into value for the software development organization.
72
University of Southern California Center for Software Engineering CSE USC 3/26/1072 © 2005-2010 USC-CSSE Future Work Enhance the product defect model with a dynamic version of COQUALMO to enable more constructive insight into quality practices. Add maintenance and operational support activities to the workflows. Elaborate the market and sales sector for other considerations, including –pricing scheme impacts, –varying market assumptions, and –periodic upgrades of varying quality. Account for feedback loops that generate product specifications (closed-loop control) –External feedback from users to incorporate new features –Internal feedback on product initiatives from the organizational planning and control entity to the software process. More empirical data on the attribute relationships in the model will help identify areas of improvement. Assessing the overall dynamics requires more collection and analysis of field data on business value and quality measures from actual software product rollouts.
73
University of Southern California Center for Software Engineering CSE USC 3/26/1073 © 2005-2010 USC-CSSE References - I C. Baldwin & K. Clark, Design Rules: The Power of Modularity, MIT Press, 1999. S. Biffl, A. Aurum, B. Boehm, H. Erdogmus, and P. Gruenbacher (eds.), Value-Based Software Engineering, Springer, 2005 (to appear). D. Blackwell and M. Girshick, Theory of Games and Statistical Decisions, Wiley, 1954. B. Boehm, C. Abts, A.W. Brown, S. Chulani, B. Clark, E. Horowitz, R. Madachy, D. Reifer, and B. Steece, Software Cost Estimation with COCOMO II, Prentice Hall, 2000. B. Boehm and L. Huang, “Value-Based Software Engineering: A Case Study,” Computer, March 2003, pp. 33-41. B. Boehm and R. Ross, “Theory-W Software Project Management: Principles and Examples,” IEEE Trans. SW Engineering, July 1989, pp. 902-916. W. Brogan, Modern Control Theory, Prentice Hall, 1974 (3rd ed., 1991). P. Checkland, Systems Thinking, Systems Practice, Wiley, 1981. C. W. Churchman, R. Ackoff, and E. Arnoff, An Introduction to Operations Research, Wiley, 1957. R. M. Cyert and J. G. March, A Behavioral Theory of the Firm, Prentice Hall, 1963. K. Lee, “Development and Evaluation of Value-Based Review Methods,” USC-CSE, 2005. C. G. Hempel and P. Oppenheim, “Problems of the Concept of General Law,” in A. Danto and S. Morgenbesser (eds.), Philosophy of Science, Meridian Books, 1960.
74
University of Southern California Center for Software Engineering CSE USC 3/26/1074 © 2005-2010 USC-CSSE References - II R. Kaplan & D. Norton, The Balanced Scorecard: Translating Strategy into Action, Harvard Business School Press, 1996. R. L. Keeney and H. Raiffa, Decisions with Multiple Objectives: Preferences and Value Tradeoffs, Cambridge University Press, 1976. R. Madachy, “Integrated Modeling of Business Value and Software Processes,” ProSim Workshop, 2005. R. Madachy, Software Process Dynamics, IEEE Computer Society, 2006 (to be published). A. Maslow, Motivation and Personality, Harper, 1954. J. Rawls, A Theory of Justice, Belknap/Harvard U. Press, 1971, 1999. J. Thorp and DMR, The Information Paradox, McGraw Hill, 1998. R. J. Torraco, “Theory-Building Research Methods,” in R. A. Swanson & E. F. Holton III (eds.), Human Resource Development Handbook: Linking Research and Practice, pp. 114-137, Berrett-Koehler, 1997. S. Toulmin, Cosmopolis: The Hidden Agenda of Modernity, U. of Chicago Press, 1992 reprint edition. J. von Neumann and O. Morgenstern, Theory of Games and Economic Behavior, Princeton University Press, 1944. A. W. Wymore, A Mathematical Theory of Systems Engineering: The Elements, Wiley, New York, 1967.