Slide 1: USC CSSE Workshop Overview: Top 3 Software-Intensive Systems Risk Items
Barry Boehm, USC-CSSE
February 14, 2007
http://csse.usc.edu/BoehmsTop10/
boehm@usc.edu
Slide 2: Outline: Top-3 SIS Risks Workshop
- Working group guidelines
- Risk survey results and survey update(?)
- The top three risks
  – Architecture complexity; system quality tradeoffs
  – Requirements volatility; rapid change
  – Acquisition and contracting process mismatches
- Architecture complexity and system quality tradeoffs
  – Architecture complexity phenomenology
  – Nature of system quality
  – Quality tradeoff perspectives
Slide 3: Working Group Guidelines
- Product: briefing, preferably with notes
- Topics should include:
  – Most critical success factors in each area
  – Current best practices for addressing them
  – Areas for further research, rated 0-10 on value and difficulty of research
Slide 4: Research Topics: Agile Methods
1. Relationship between plan-driven methods and agility
   a. For individuals
   b. For organizations
2. Differences between agile and plan-driven outcomes
3. Effect of gurus
4. Mismatches between development approach and acquisition practices
5. How do you measure quality in an agile environment?
6. Data collection; agile experience base
7. Team of teams
8. Agile development and evolutionary prototyping
9. Shared code and/or module ownership
10. Architecture: when, how much, how to express
11. Lack of user consensus
12. Dynamic home grounds
Slide 5: SIS Risk Survey 2006: Statistics
- Number of surveys: 25
- Average experience: ~28 years (range 6-51 years)
- Area distribution:
  – Software: 20
  – Systems: 17
  – Hardware: 0
- Business domain distribution:
  – Aerospace: 18
  – Software infrastructure: 5
  – Business: 4
  – Telecom: 3
  – Others: Secure Apps (1); Safety-Critical Apps (1); C4ISR (1)
Slide 6: Risk Survey 2006: Nominees
- Acquisition and contracting process mismatches
- Architecture complexity; quality tradeoffs
- Budget and schedule constraints
- COTS and other independently evolving systems
- Customer-developer-user team cohesion
- Migration complexity
- Personnel shortfalls
- Process maturity
- Requirements mismatch
- Requirements volatility; rapid change
- Technology maturity
- User interface mismatch
Slide 7: Risk Survey 2006 Results
[Results chart.]
Slide 8: SIS Risk Grouping

#   Risk Item                                        ΣRanks
1   Architecture complexity, quality tradeoffs      142
2   Requirements volatility                         131.66
3   Acquisition and contracting process mismatches  130.5
4   Customer-developer-user                         115.5
5   Budget and schedule                             109.5
6   Requirements mismatch                           100.33
7   Personnel shortfalls                            99
8   COTS                                            77
9   Migration complexity                            75.5
10  User interface mismatch                         64.67
11  Technology maturity                             58.5
12  Process maturity                                46.83
Slide 9: Survey 2007: Early Statistics
- Number of surveys: 41
- Average experience: ~27 years (range 6-51 years)
- Area distribution:
  – Software: 33
  – Systems: 34
  – Hardware: 0
- Business domain distribution:
  – Aerospace: 32
  – Software infrastructure: 7
  – Business: 6
  – Telecom: 5
  – Others: Secure Apps (1); Safety-Critical Apps (1); C4ISR (1); Network and Protocols (1); Defense (1); Program and Risk Management (1)
Slide 10: Survey Results: 2006-2007
[Comparison chart.]
Slide 11: SIS Risk Grouping 2006-2007

#   Risk Item                                        Previous Rank   ΣRanks
1   Architecture complexity, quality tradeoffs      ↔ 1             284.7
2   Requirements volatility                         ↔ 2             265.06
3   Acquisition and contracting process mismatches  ↔ 3             241.8
4   Budget and schedule                             ↑ 5             238.5
5   Customer-developer-user                         ↓ 4             206.2
6   Requirements mismatch                           ↔ 6             205.93
7   Personnel shortfalls                            ↔ 7             188.65
8   COTS                                            ↔ 8             188.4
9   Technology maturity                             ↑ 11            158.03
10  Migration complexity                            ↓ 9             133.8
11  User interface mismatch                         ↓ 10            113.2
12  Process maturity                                ↔ 12            109.31
Slide 12: Outline: Top-3 SIS Risks Workshop
- Working group guidelines
- Risk survey results and survey update(?)
- The top three risks
  – Architecture complexity; system quality tradeoffs
  – Requirements volatility; rapid change
  – Acquisition and contracting process mismatches
- Architecture complexity and system quality tradeoffs
  – Architecture complexity phenomenology
  – Nature of system quality
  – Quality tradeoff perspectives
Slide 13: SIS Architecture Complexity: Future Combat Systems
[FCS architecture diagram.]
Slide 14: Requirements Volatility: Ripple Effects of Changes - Breadth, Depth, and Length
[Figure: a layered stack reaching from Sensor Components and Sensors up through Sensor Data Integration, Sensor Data Management, Info Fusion, Situation Assessment, and Command and Control to the C4ISR infrastructure, Platforms 1..N, and DOTMLPF, with increments 1.0-5.0 fielded across 2008-2016. A change ripples in breadth (across elements), depth (up and down the layers), and length (across increments over time).]
Legend:
- DOTMLPF: Doctrine, Organization, Training, Materiel, Leadership, Personnel, Facilities
- C4ISR: Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance
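To make breadth and depth concrete, here is a minimal impact-analysis sketch. The layer names follow the figure, but the dependency edges and the propagation rule are assumptions for illustration, not the deck's model:

```python
from collections import deque

# Toy dependency graph, assumed for illustration: an edge X -> Y means
# "Y builds on X", so a change to X may ripple upward to Y.
DEPENDENTS = {
    "Sensor Components": ["Sensors"],
    "Sensors": ["Sensor Data Integration"],
    "Sensor Data Integration": ["Sensor Data Management"],
    "Sensor Data Management": ["Info Fusion"],
    "Info Fusion": ["Situation Assessment"],
    "Situation Assessment": ["Command and Control"],
    "Command and Control": ["Platform 1", "Platform N"],
    "Platform 1": ["DOTMLPF"],
    "Platform N": ["DOTMLPF"],
    "DOTMLPF": [],
}

def ripple(changed_element):
    """Breadth-first walk over DEPENDENTS.

    Returns {element: depth}, where depth is how many layers the change
    propagates through; the count of elements at each depth is its
    breadth. Length (how long the ripple persists across increments
    1.0-5.0) would need schedule data this sketch does not model.
    """
    affected = {changed_element: 0}
    queue = deque([changed_element])
    while queue:
        element = queue.popleft()
        for dependent in DEPENDENTS.get(element, []):
            if dependent not in affected:
                affected[dependent] = affected[element] + 1
                queue.append(dependent)
    return affected

if __name__ == "__main__":
    for element, depth in sorted(ripple("Sensors").items(), key=lambda kv: kv[1]):
        print(f"depth {depth}: {element}")
```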
Slide 15: Average Change Processing Time: 2 Systems of Systems
[Chart: average workdays to process changes.]
Slide 16: Acquisition/Contracting Mismatches: Fitness Landscapes
- Role of fitness landscapes in Complex Adaptive Systems (CAS)
  – S. Kauffman, At Home in the Universe, Oxford University Press, 1995
- CSoS acquisition challenges
  – B. Boehm, "Some Future Trends and Implications for Systems and Software Engineering Processes," Systems Engineering 9(1), 2006, pp. 1-19
- A candidate three-agent acquisition fitness landscape
  – D. Reifer and B. Boehm, "Providing Incentives for Spiral Development: An Award Fee Plan," Defense Acquisition Review 13(1), 2006, pp. 63-79
Slide 17: Role of Fitness Landscapes in CAS
- Incentive structures for local behavior
- Induce global behavior via adaptation to change

Fitness Landscape   Global Result   Acronym (Metaphor)                                   Acquisition Example
Uniform             Gridlock        OWHITS (Ostriches with Heads in the Sand)           MIL-STD-1521B Waterfall, Fixed Price, Build-to-Spec
Random              Chaos           TRAW (Turkeys Running Around Wild)                  Recursive Acquisition Reform, Total Systems Performance Responsibility
Survival-Related    Edge of Chaos   NOSUFAS (No One-Size-Uniformly-Fits-All Solutions)  Candidate for discussion: 3-Agent Model
Slide 18: Complex Systems Acquisition Challenges

Objective              Candidate Solution              Example               Challenges
Avoid Obsolescence     Plan-Driven Rapid Development   4-Hour House          Inflexible
Adapt to Rapid Change  Agile Methods                   Extreme Programming   Unscalable; Buggy Releases
Assure Resilience      Independent V&V                 Formal Methods        Expensive
Slide 19: Candidate Approach: 3-Agent Model

Agent Objective: Build Current Increment
- Agent Approach: Rapid, stable, Schedule-As-Independent-Variable (SAIV); build to specs and plans
- Fitness Landscape / Incentive Criteria: Meet milestones / exercise SAIV; deliver on time; collaboration with other agents

Agent Objective: Assure Resilience
- Agent Approach: Integrated, independent verification and validation
- Fitness Landscape / Incentive Criteria: Priority-weighted identification of risks and concerns; collaboration with other agents

Agent Objective: Prepare for Build of Next Increment
- Agent Approach: Observe, orient, decide on proof-carrying rebaselined specs and plans
- Fitness Landscape / Incentive Criteria: Risk/opportunity management; rebaseline proof thoroughness; collaboration with other agents
Slide 20: Risk-Driven Scalable Spiral Model: Increment View
[Process diagram.]
Slide 21: Outline: Top-3 SIS Risks Workshop
- Working group guidelines
- Risk survey results and survey update(?)
- The top three risks
  – Architecture complexity; system quality tradeoffs
  – Requirements volatility; rapid change
  – Acquisition and contracting process mismatches
- Architecture complexity and system quality tradeoffs
  – Architecture complexity phenomenology
  – Nature of system quality
  – Quality tradeoff perspectives
Slide 22: Larger Systems Need More Architecting: COCOMO II Analysis
[Chart: percent of project schedule devoted to initial architecture and risk resolution (x-axis) vs. added schedule devoted to rework (the COCOMO II RESL factor) and total percent added schedule (y-axis), with curves for 10-KSLOC, 100-KSLOC, and 10,000-KSLOC projects; each curve has a sweet spot minimizing total added schedule, and the sweet spot moves rightward as size grows.]
- Sweet spot drivers: rapid change pushes the sweet spot leftward; high assurance pushes it rightward
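The sweet-spot behavior can be reproduced qualitatively with a toy model. This is a minimal sketch: the exponential rework shape and every coefficient below are assumptions chosen only to illustrate the mechanics, not the calibrated COCOMO II RESL values:

```python
import math

# Assumed, illustrative parameters: rework_at_zero is the % of schedule
# lost to rework with no up-front architecting; decay is how quickly
# added architecting reduces that rework. Larger systems get more
# rework exposure and slower decay, mimicking the chart's drift.
PROJECTS = {
    "10 KSLOC": {"rework_at_zero": 18.0, "decay": 0.20},
    "100 KSLOC": {"rework_at_zero": 38.0, "decay": 0.10},
    "10000 KSLOC": {"rework_at_zero": 91.0, "decay": 0.04},
}

def total_added_schedule(arch_pct, rework_at_zero, decay):
    """Architecting time plus the rework it fails to prevent (% of schedule)."""
    return arch_pct + rework_at_zero * math.exp(-decay * arch_pct)

for name, p in PROJECTS.items():
    sweet_spot = min(range(0, 61), key=lambda a: total_added_schedule(a, **p))
    print(f"{name}: sweet spot ≈ {sweet_spot}% architecting, "
          f"total added schedule ≈ {total_added_schedule(sweet_spot, **p):.1f}%")
```

With these assumed parameters the minimum lands near 6%, 13%, and 32% of schedule respectively, reproducing the qualitative rightward drift of the sweet spot with size.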
Slide 23: Architecture-Breakers Are the Biggest Source of Rework
[Chart: % of cost to fix Software Problem Reports (SPRs) vs. % of SPRs, for TRW Project A (373 SPRs) and TRW Project B (1005 SPRs); a small fraction of the SPRs accounts for most of the rework cost.]
- Major rework sources: off-nominal architecture-breakers
  – Project A: network failover
  – Project B: extra-long messages
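The chart's Pareto shape is straightforward to compute from raw SPR fix costs. The cost figures below are invented solely to show the computation; real project data such as TRW's would be used in practice:

```python
# Hypothetical per-SPR fix costs (staff-hours), assumed for illustration.
fix_costs = [400, 350, 120, 40, 30, 20, 10, 8, 6, 5, 4, 3, 2, 1, 1]

# Sort SPRs from most to least expensive and accumulate cost share:
# the first few SPRs (the architecture-breakers) dominate total cost.
total = sum(fix_costs)
cum = 0.0
for i, cost in enumerate(sorted(fix_costs, reverse=True), start=1):
    cum += cost
    print(f"{100 * i / len(fix_costs):5.1f}% of SPRs -> "
          f"{100 * cum / total:5.1f}% of fix cost")
```

On this made-up data, the two most expensive SPRs (13% of reports) already account for 75% of the fix cost, the pattern the TRW charts show.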
Slide 24: Best Architecture Is a Discontinuous Function of Quality Level
[Chart: development cost vs. required response time (1-5 sec). Arch. A, custom with many cache processors (~$100M), was needed to meet the original spec's tight response time; Arch. B, a modified client-server (~$50M), sufficed for the more relaxed requirement accepted after prototyping. Cost jumps discontinuously at the point where Arch. B can no longer meet the requirement.]
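A minimal sketch of the discontinuity, with the architecture names and rough costs taken from the slide's labels; the exact feasibility limit of Arch. B is an assumption:

```python
ARCH_B_LIMIT_SEC = 2.5  # assumed: slowest response Arch. B (modified
                        # client-server) can still guarantee

def best_architecture_cost(required_response_sec):
    """Cost jumps discontinuously when the response-time requirement
    crosses the cheaper architecture's feasibility limit."""
    if required_response_sec >= ARCH_B_LIMIT_SEC:
        return "Arch. B: modified client-server", 50e6
    return "Arch. A: custom, many cache processors", 100e6

# Sweep the requirement from tight (original spec) to relaxed
# (after prototyping) and watch the cost step down.
for req in (1.0, 2.0, 3.0, 4.0):
    arch, cost = best_architecture_cost(req)
    print(f"required {req} sec -> {arch} (${cost / 1e6:.0f}M)")
```

The design point of the slide: a small relaxation in the quality requirement, validated by prototyping, can halve the cost by making a fundamentally different architecture feasible.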
Slide 25: The Nature of Quality: Participant Survey
- Which figure best symbolizes quality improvement? (Four candidate figures follow on slides 26-29.)
Slide 26: Holistic Approach [figure]
Slide 27: Lean Approach [figure]
Slide 28: Analytic Approach [figure]
Slide 29: Preoccupation with Booze and Sex [figure]
Slide 30: There Is No Universal Quality-Value Metric
- Different stakeholders rely on different value attributes
  – Protection: safety, security, privacy
  – Robustness: reliability, availability, survivability
  – Quality of Service: performance, accuracy, ease of use
  – Adaptability: evolvability, interoperability
  – Affordability: cost, schedule, reusability
- Value attributes continue to tier down
  – Performance: response time, resource consumption (CPU, memory, comm.)
- Value attributes are scenario-dependent
  – e.g., 5 seconds normal response time; 2 seconds in crisis
- Value attributes often conflict
  – Most often with performance and affordability
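A minimal sketch of how tiered, scenario-dependent value attributes might be represented in practice. The 5-second/2-second response-time thresholds come from the slide; the structure, the helper, and the remaining numbers are assumptions for illustration:

```python
# Attributes tier down (Quality of Service -> performance -> response
# time), and leaf thresholds depend on the operating scenario.
QUALITY_MODEL = {
    "Quality of Service": {
        "performance": {
            # 5 sec normal / 2 sec crisis are from the slide; the rest
            # of the thresholds are assumed.
            "response_time_sec": {"normal": 5.0, "crisis": 2.0},
            "cpu_utilization_max": {"normal": 0.70, "crisis": 0.90},
        },
    },
    "Robustness": {
        "availability": {"fraction_min": {"normal": 0.999, "crisis": 0.9999}},
    },
}

def threshold(path, scenario):
    """Look up a leaf threshold by its attribute path, e.g.
    threshold(("Quality of Service", "performance", "response_time_sec"),
    "crisis") -> 2.0."""
    node = QUALITY_MODEL
    for key in path:
        node = node[key]
    return node[scenario]

print(threshold(("Quality of Service", "performance", "response_time_sec"), "crisis"))
```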
Slide 31: Overview of Stakeholder/Value Dependencies
[Matrix of stakeholder classes against value attributes, each cell rated for strength of direct dependency.]
- Attributes (columns): Protection, Robustness, Quality of Service, Adaptability, Affordability
- Stakeholders (rows): Info. Suppliers/Dependents; Info. Brokers; Info. Consumers; Mission Controllers/Administrators; Developers/Acquirers
- Strength of direct dependency on value attribute: ** critical; * significant; blank: insignificant or indirect
Slide 32: Implications for Quality Engineering
- There is no universal quality metric to optimize
- Need to identify the system's success-critical stakeholders
  – And their quality priorities
- Need to balance satisfaction of stakeholder dependencies
  – Stakeholder win-win negotiation
  – Quality attribute tradeoff analysis
- Need value-of-quality models, methods, and tools
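One simple way to operationalize "balance satisfaction of stakeholder dependencies" is a weighted-satisfaction check per stakeholder: a candidate is win-win only if every stakeholder clears a threshold, rather than maximizing one aggregate score. The weights, scores, and threshold below are invented for illustration, not a calibrated method from the deck:

```python
# Each stakeholder weights the value attributes differently (a numeric
# stand-in for the slide-31 dependency matrix). All numbers assumed.
WEIGHTS = {
    "Info. Consumers": {"Quality of Service": 0.6, "Robustness": 0.3, "Affordability": 0.1},
    "Administrators": {"Protection": 0.5, "Robustness": 0.4, "Affordability": 0.1},
    "Acquirers": {"Affordability": 0.7, "Adaptability": 0.3},
}

SCORES = {  # 0-1 satisfaction of each attribute under a candidate architecture
    "Arch. A": {"Quality of Service": 0.9, "Robustness": 0.8, "Protection": 0.8,
                "Adaptability": 0.5, "Affordability": 0.3},
    "Arch. B": {"Quality of Service": 0.6, "Robustness": 0.7, "Protection": 0.7,
                "Adaptability": 0.8, "Affordability": 0.9},
}

WIN_THRESHOLD = 0.6  # every stakeholder must clear this for a win-win

for arch, scores in SCORES.items():
    sat = {who: sum(w * scores[attr] for attr, w in ws.items())
           for who, ws in WEIGHTS.items()}
    verdict = "win-win" if min(sat.values()) >= WIN_THRESHOLD else "some stakeholder loses"
    print(arch, {k: round(v, 2) for k, v in sat.items()}, "->", verdict)
```

On these made-up numbers, Arch. A scores highest for consumers yet fails the acquirers, so only Arch. B is win-win; an aggregate average would have hidden that.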
Slide 33: Tradeoffs Among Cost, Schedule, and Reliability: COCOMO II
[Chart: cost/schedule tradeoff curves with points labeled (RELY rating, MTBF in hours).]
- Goal: 10K-hour MTBF within $5.5M and 20 months
- For a 100-KSLOC set of features, cost, schedule, and reliability are "pick any two"
- Can "pick all three" with a 77-KSLOC set of features
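The mechanics behind "pick any two" can be sketched with the published COCOMO II.2000 nominal equations. This is a minimal sketch that treats every driver except RELY as nominal and assumes a labor rate; it illustrates the shape of the tradeoff rather than reproducing the slide's exact $5.5M/20-month calibration (schedule compression via SCED is not modeled):

```python
# COCOMO II.2000 nominal calibration (Boehm et al., Software Cost
# Estimation with COCOMO II): PM = A * KSLOC^E * prod(EM),
# TDEV = C * PM^F, with E and F at nominal scale factors.
A, E = 2.94, 1.0997
C, F = 3.67, 0.3179
RELY_EM = {"Low": 0.92, "Nominal": 1.00, "High": 1.10, "Very High": 1.26}
DOLLARS_PER_PM = 10_000  # assumed fully-burdened labor rate

def cost_and_schedule(ksloc, rely):
    """Return (cost in $, schedule in calendar months) for a given
    size and required-reliability (RELY) rating."""
    pm = A * ksloc ** E * RELY_EM[rely]  # effort, person-months
    return pm * DOLLARS_PER_PM, C * pm ** F

for ksloc in (100, 77):
    for rely in ("Low", "Nominal", "Very High"):
        cost, months = cost_and_schedule(ksloc, rely)
        print(f"{ksloc} KSLOC, RELY={rely:9s}: ${cost / 1e6:4.1f}M, {months:4.1f} months")
```

Raising RELY from Low to Very High adds roughly 37% effort and cost for the same feature set; trimming the scope from 100 to 77 KSLOC recovers it, which is the slide's "pick all three with fewer features" point.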
Slide 34: Agenda: Wednesday, Feb 14
8:15 – 10:00 am: Architecture Complexity and Quality Tradeoffs; Elliot Axelband (RAND), Chair
  – Overview, Issues and Approaches; Barry Boehm (USC)
  – From Dependable Architectures to Dependable Systems; Nenad Medvidovic (USC)
  – Architecture Tradeoff Analysis: Towards a Disciplined Approach to Balancing Quality Requirements; Azad Madni (Intelligent Systems Technology)
10:00 – 10:30 am: Break
10:30 am – 12:30 pm: Requirements Volatility; George Friedman (USC), Chair
  – Process Synchronization and Stabilization; Rick Selby (Northrop Grumman)
  – Disciplined Agility; Rich Turner (SSCI)
  – Using Anchor Point Milestones; Tom Schroeder (BAE Systems)
12:30 – 1:30 pm: Lunch
1:30 – 3:30 pm: Acquisition and Contracting Mismatches; Rick Selby (NGC), Chair
  – Acquisition Assessment Analyses; Kristen Baldwin (OSD/A&T/S&SE)
  – Commercial Acquisition Practices; Stan Rifkin (Master Systems Inc.)
  – Space Program Acquisition: Systems Engineering & Programmatic Improvements; Marilee Wheaton (Aerospace Corporation)
3:30 – 4:00 pm: Break
4:00 – 5:00 pm: General Discussion: Working Group Formation; Barry Boehm, Chair