
1. A Case for Anchor Point Milestones and Feasibility Rationales
University of Southern California Center for Software Engineering (USC-CSE)
Barry Boehm, USC, April 2005

2. Use of APs and FRs
Anchor Points (APs) and Feasibility Rationales (FRs) are used to provide, and independently review, EVIDENCE of the feasibility of a project's specifications and plans before proceeding into development.
Originally developed to enhance the original spiral model:
- Avoid late discovery of infeasible requirements
- Control concurrent engineering
- Provide in-process stakeholder commitment milestones
Can be approximated within traditional waterfall or V-models:
- By adding FRs to the content of SRRs, SFRs, and PDRs

3. Problems Encountered without FR: 15-Month Architecture Rework Delay
[Chart: development cost vs. response time (sec). The original spec called for 1-second response; after prototyping, 4 seconds proved acceptable. The original architecture, a modified client-server design, cost about $50M; meeting the original 1-second spec required a custom architecture with many cache processors at about $100M.]

4. Problems Avoided with FR
Attempt to validate 1-second KPP:
- Architecture analysis: needs expensive custom solution
- Prototype: 4 seconds OK 90% of the time
Negotiate KPP ranges:
- 2 seconds desirable
- 4 seconds acceptable, with some 2-second special cases
Benchmark client-server to validate feasibility.
Present solution and feasibility rationale at anchor point milestone review:
- Result: acceptable solution with minimal delay
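The negotiated KPP range above can be checked mechanically against benchmark data. The sketch below is illustrative only (the function name, thresholds, and sample response times are assumptions, not from the slides): it takes measured response times from a client-server prototype run and reports whether the 90th-percentile response time falls in the desirable (2 s) or acceptable (4 s) band.

```python
def kpp_check(samples_sec, desirable=2.0, acceptable=4.0, quantile=0.90):
    """Check benchmarked response times against negotiated KPP ranges.

    Returns the response time at the given quantile (nearest-rank method)
    and a verdict: 'desirable', 'acceptable', or 'fail'.
    """
    ordered = sorted(samples_sec)
    idx = min(len(ordered) - 1, int(quantile * len(ordered)))
    observed = ordered[idx]
    if observed <= desirable:
        return observed, "desirable"
    if observed <= acceptable:
        return observed, "acceptable"
    return observed, "fail"

# Hypothetical response-time samples (seconds) from a prototype benchmark.
samples = [1.2, 1.8, 2.5, 3.1, 3.6, 3.9, 2.2, 1.5, 3.3, 2.8]
observed, verdict = kpp_check(samples)
print(observed, verdict)  # 3.9 acceptable
```

A check like this turns the KPP negotiation into a repeatable pass/fail gate that can be rerun at each anchor point review.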

5. Key Point: Need to Show Evidence
Not just traceability matrices and PowerPoint charts. Evidence can include results of:
- Prototypes: networks, robots, user interfaces, COTS interoperability
- Benchmarks: performance, scalability, accuracy
- Exercises: mission performance, interoperability, security
- Models: cost, schedule, performance, reliability; tradeoffs
- Simulations: mission scalability, performance, reliability
- Early working versions: infrastructure, data fusion, legacy compatibility
- Combinations of the above
Validated by independent experts for:
- Realism of assumptions
- Representativeness of scenarios
- Thoroughness of analysis
- Coverage of key off-nominal conditions

6. Major Rework Sources: Off-Nominal Architecture-Breakers
[Chart: % of cost to fix Software Problem Reports (SPRs) vs. % of SPRs, for TRW Project A (373 SPRs) and TRW Project B (1005 SPRs). A small fraction of SPRs accounted for the bulk of rework cost. The off-nominal architecture-breakers were: A, network failover; B, extra-long messages.]

7. Common Examples of Inadequate Evidence
1. Our engineers are tremendously creative. They will find a solution for this.
2. We have three algorithms that met the KPPs on small-scale nominal cases. At least one will scale up and handle the off-nominal cases.
3. We'll build it and then tune it to satisfy the KPPs.
4. The COTS vendor assures us that they will have a security-certified version by the time we need to deliver.
5. We have demonstrated solutions for each piece from our NASA, Navy, and Air Force programs. It's a simple matter of integration to put them together.

8. Examples of Making the Evidence Adequate
1. Have the creative engineers prototype and evaluate a solution on some key nominal and off-nominal scenarios.
2. Prototype and evaluate the three algorithms on some key nominal and off-nominal scenarios.
3. Develop prototypes and/or simulations and exercise them to show that the architecture will not break while scaling up or handling off-nominal cases.
4. Conduct a scaled-down security evaluation of the current COTS product. Determine this and other vendors' track records for getting certified in the available time. Investigate alternative solutions.
5. Have a tiger team prototype and evaluate the results of the "simple matter of integration."

9. Case Study: CCPDS-R Project Overview
- Domain: ground-based C3 development
- Size/language: 1.15M SLOC Ada
- Average number of people: 75
- Schedule: 75 months; 48-month IOC
- Process/standards: DOD-STD-2167A; iterative development
- Environment: Rational host; DEC host; DEC VMS targets
- Contractor: TRW
- Customer: USAF
- Current status: delivered on-budget, on-schedule

10. CCPDS-R Reinterpretation of SSR, PDR
[Diagram: development life cycle across months 0-25, spanning Inception, Elaboration (architecture iterations), and Construction (release iterations), with the contractual milestones SSR, IPDR, PDR, and CDR overlaid by the LCO and LCA anchor points. Key features: a competitive design phase with architectural prototypes, planning, and requirements analysis before contract award; high-risk prototypes, including a working network OS with validated failover; an architecture baseline under change control; and early delivery of an "alpha" capability to the user.]

11. CCPDS-R Results: No Late 80-20 Rework
- Architecture first
  - Integration during the design phase
  - Demonstration-based evaluation
- Risk management
- Configuration baseline change metrics:
[Chart: average hours per change vs. project development schedule (months 15-40), broken out by design changes, implementation changes, and maintenance changes and ECPs; no late spike in rework cost appears.]

12. AT&T Experience with AP Reviews

13. Backup Charts

14. Spiral Anchor Points Enable Concurrent Engineering

15. Need Concurrently Engineered Milestone Reviews: Life Cycle Objectives (LCO); Life Cycle Architecture (LCA) Package
- Operational Concept
  - Elaboration of system objectives and scope by increment
  - Elaboration of operational concept by increment
- System Prototype(s)
  - Exercise range of usage scenarios
  - Resolve major outstanding risks
- System Requirements
  - Elaboration of functions, interfaces, quality attributes, and prototypes by increment
  - Identification of TBDs (to-be-determined items)
  - Stakeholders' concurrence on their priority concerns
- System and Software Architecture
  - Choice of architecture and elaboration by increment: physical and logical components, connectors, configurations, constraints; COTS and reuse choices; domain architecture and architectural style choices
  - Architecture evolution parameters

16. Need Concurrently Engineered Milestone Reviews (continued)
- Life-Cycle Plan
  - Elaboration of WWWWWHH* for Initial Operational Capability (IOC)
  - Partial elaboration, identification of key TBDs for later increments
- Feasibility Rationale
  - Assurance of consistency among the elements above
  - All major risks resolved or covered by risk management plan
*WWWWWHH: Why, What, When, Who, Where, How, How Much

17. LCO (MS A) and LCA (MS B) Pass/Fail Criteria
A system built to the given architecture will:
- Support the operational concept
- Satisfy the requirements
- Be faithful to the prototype(s)
- Be buildable within the budgets and schedules in the plan
- Show a viable business case
- Establish key stakeholders' commitment to proceed
LCO: true for at least one architecture.
LCA: true for the specific life-cycle architecture; all major risks resolved or covered by a risk management plan.
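The pass/fail logic of these criteria can be sketched as a simple review checklist. The code below is a minimal illustration (the function and criterion names are mine, not the slides'): LCO passes if at least one candidate architecture satisfies every criterion, while LCA additionally requires the chosen architecture's major risks to be resolved or covered.

```python
CRITERIA = (
    "supports operational concept",
    "satisfies requirements",
    "faithful to prototypes",
    "buildable within budget and schedule",
    "viable business case",
    "stakeholder commitment",
)

def meets_all(evidence):
    """evidence: dict mapping criterion -> bool for one candidate architecture."""
    return all(evidence.get(c, False) for c in CRITERIA)

def lco_passes(candidates):
    """LCO: true for at least one candidate architecture."""
    return any(meets_all(e) for e in candidates)

def lca_passes(chosen, risks_covered):
    """LCA: true for the specific architecture, with all major risks
    resolved or covered by a risk management plan."""
    return meets_all(chosen) and risks_covered
```

The point the checklist makes explicit is that every criterion is a hard gate: one unsupported claim, with no evidence behind it, fails the milestone.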

18. References
- B. Boehm and W. Hansen, "The Spiral Model as a Tool for Evolutionary Acquisition," CrossTalk, May 2001.
- B. Boehm and D. Port, "Balancing Discipline and Flexibility with the Spiral Model and MBASE," CrossTalk, December 2001, pp. 23-28.
- B. Boehm, D. Port, L. Huang, and W. Brown, "Using the Spiral Model and MBASE to Generate New Acquisition Process Models: SAIV, CAIV, and SCQAIV," CrossTalk, January 2002, pp. 20-25.
- D. Reifer and B. Boehm, "A Model Contract/Subcontract Award Fee Plan for Large, Change-Intensive Software Acquisitions," USC-CSE Technical Report, April 2003.
- B. Boehm, A.W. Brown, V. Basili, and R. Turner, "Spiral Acquisition of Software-Intensive Systems of Systems," CrossTalk, May 2004, pp. 4-9.
- J. Marenzano et al., "Architecture Reviews: Practice and Experience," IEEE Software, March/April 2005.
- W. Royce, Software Project Management, Addison-Wesley, 1998.
- MBASE web site: sunset.usc.edu/research/MBASE
- CrossTalk articles: www.stsc.hill.af.mil/crosstalk

