University of Southern California
Center for Systems and Software Engineering

Feasibility Evidence Description (FED)
Barry Boehm, USC CS 577a Lecture, Fall 2014
Summary

Schedule-based and event-based reviews are risk-prone.
Evidence-based processes enable early risk resolution:
– They require more up-front systems engineering effort
– They have a high ROI for high-risk projects
– They synchronize and stabilize concurrent engineering
– The evidence becomes a first-class deliverable, which requires planning and earned value management
They can be added to traditional review processes.
Types of Milestone Reviews

Schedule-based reviews (contract-driven)
– "We'll hold the PDR on April 1 whether we have a design or not"
– High probability of proceeding into a Death March
Event-based reviews (artifact-driven)
– "The design will be done by June 1, so we'll have the review then"
– A large "Death by PowerPoint and UML" event; hard to avoid proceeding with many unresolved risks and interfaces
Evidence-based commitment reviews (risk-driven)
– Evidence is provided in a Feasibility Evidence Description (FED), a first-class deliverable
– Shortfalls in evidence are uncertainties and risks, and should be covered by risk mitigation plans
– Stakeholders decide whether to commit based on the risks of going forward
Nature of FEDs and Anchor Point Milestones

Evidence provided by the developer and validated by independent experts that, if the system is built to the specified architecture, it will:
– Satisfy the specified operational concept and requirements (capability, interfaces, level of service, and evolution)
– Be buildable within the budgets and schedules in the plan
– Generate a viable return on investment
– Generate satisfactory outcomes for all of the success-critical stakeholders
Shortfalls in evidence are uncertainties and risks
– Should be resolved or covered by risk management plans
Assessed in increasing detail at major anchor point milestones
– Serves as the basis for stakeholders' commitment to proceed
– Serves to synchronize and stabilize concurrently engineered elements
Can be used to strengthen current schedule- or event-based reviews
Nature of Feasibility Evidence

Not just traceability matrices and PowerPoint charts. Evidence can include results of:
– Prototypes: of networks, robots, user interfaces, COTS interoperability
– Benchmarks: for performance, scalability, accuracy
– Exercises: for mission performance, interoperability, security
– Models: for cost, schedule, performance, reliability; tradeoffs
– Simulations: for mission scalability, performance, reliability
– Early working versions: of infrastructure, data fusion, legacy compatibility
– Previous experience
– Combinations of the above
Validated by independent experts for:
– Realism of assumptions
– Representativeness of scenarios
– Thoroughness of analysis
– Coverage of key off-nominal conditions
Common Examples of Inadequate Evidence

1. Our engineers are tremendously creative. They will find a solution for this.
2. We have three algorithms that met the KPPs on small-scale nominal cases. At least one will scale up and handle the off-nominal cases.
3. We'll build it and then tune it to satisfy the KPPs.
4. The COTS vendor assures us that they will have a security-certified version by the time we need to deliver.
5. We have demonstrated solutions for each piece from our NASA, Navy, and Air Force programs. It's a simple matter of integration to put them together.
Examples of Making the Evidence Adequate

1. Have the creative engineers prototype and evaluate a solution on some key nominal and off-nominal scenarios.
2. Prototype and evaluate the three algorithms on some key nominal and off-nominal scenarios.
3. Develop prototypes and/or simulations and exercise them to show that the architecture will not break while scaling up or handling off-nominal cases.
4. Conduct a scaled-down security evaluation of the current COTS product. Determine this and other vendors' track records for getting certified in the available time. Investigate alternative solutions.
5. Have a tiger team prototype and evaluate the results of the "simple matter of integration."
Feasibility Analysis in 577

Provide a Feasibility Evidence Description ascertaining:
– Business feasibility: perform a cost vs. benefit analysis to determine the return on investment (ROI)
– Technology feasibility:
   – Architectural feasibility: level-of-service feasibility, capability feasibility, and evolutionary feasibility
   – NDI/NCS interoperability
– Process feasibility: why follow a particular process, and how does it help with execution?
– Schedule feasibility: is the project sufficiently scoped to be doable within 1-2 semesters? (COCOMO, WinWin, prototyping)
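The deck names COCOMO as one technique for checking schedule feasibility. Below is a minimal sketch of a COCOMO II-style nominal estimate; the constants are the published COCOMO II.2000 calibration values, while the 6 KSLOC project size is a hypothetical figure, not from the slides, and all cost drivers are assumed nominal:

```python
# Minimal COCOMO II (post-architecture) sketch for schedule feasibility.
# Constants are from the published COCOMO II.2000 calibration; the 6 KSLOC
# size below is a hypothetical project, and all effort multipliers are
# assumed nominal (1.0).
A, B = 2.94, 0.91      # effort equation constants
C, D = 3.67, 0.28      # schedule equation constants
SF_NOMINAL = 18.97     # sum of the five scale factors at nominal ratings

def effort_pm(ksloc: float, sf_sum: float = SF_NOMINAL) -> float:
    """Effort in person-months: PM = A * Size^E, with E = B + 0.01 * sum(SF)."""
    e = B + 0.01 * sf_sum
    return A * ksloc ** e

def schedule_months(pm: float, sf_sum: float = SF_NOMINAL) -> float:
    """Calendar time to develop (TDEV) in months: TDEV = C * PM^F."""
    e = B + 0.01 * sf_sum
    f = D + 0.2 * (e - B)
    return C * pm ** f

pm = effort_pm(6.0)           # hypothetical 6 KSLOC project
months = schedule_months(pm)
team = pm / months            # average staffing implied by the estimate
print(f"{pm:.1f} PM over {months:.1f} months (~{team:.1f} people)")
```

Because the scale-factor exponent E exceeds 1 at nominal ratings, effort grows faster than linearly with size, which is exactly why a schedule-feasibility check matters before committing to a 1-2 semester scope.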
Estimating Client Costs

CS577 team effort is not a cost for the client.
Estimating Client Benefits

Also summarize non-quantifiable benefits, such as better services and education (relate these to the Benefits Chain).
Computing ROI

Year | Cost (hours)# | Benefit (hours)+ | Cumulative Cost | Cumulative Benefit | ROI*
-----|---------------|------------------|-----------------|--------------------|-----
2012 | 425           | 0                | 425             | 0                  |
2013 | 156           | 762              | 581             | 762                | 0.31
2014 | 172           | 762              | 753             | 1,524              | 1.02
2015 | 190           | 762              | 943             | 2,286              | 1.42
2016 | 210           | 762              | 1,153           | 3,048              | 1.64

#: assuming a 10% per-year increase in cost, rounded up
+: benefits rounded up to the nearest integer
*: ROI = (Cumulative Benefit - Cumulative Cost) / Cumulative Cost
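The table's arithmetic can be sketched directly from the slide's formula, ROI = (cumulative benefit - cumulative cost) / cumulative cost, using the per-year cost and benefit figures above:

```python
# Sketch of the slide's ROI computation: running totals of cost and benefit,
# with ROI = (cum. benefit - cum. cost) / cum. cost at each year.
years    = [2012, 2013, 2014, 2015, 2016]
costs    = [425, 156, 172, 190, 210]    # hours, per the table
benefits = [0, 762, 762, 762, 762]      # hours saved per year after transition

cum_cost = cum_benefit = 0
for year, cost, benefit in zip(years, costs, benefits):
    cum_cost += cost
    cum_benefit += benefit
    roi = (cum_benefit - cum_cost) / cum_cost
    print(f"{year}: ROI = {roi:.2f}")
```

Running this reproduces the table's ROI column (0.31 in 2013 rising to 1.64 in 2016); the break-even point is wherever ROI crosses zero.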
Plotting ROI

Benefit realization only occurs after transition:
– mid-2013 for 2-semester projects
– early 2013 for 1-semester projects
Technology Feasibility

1. Architecture feasibility
– Level-of-service (LOS) feasibility techniques: analysis, detailed references to prototypes, models, simulations
– Capability feasibility: explicitly state/show how the design satisfies the capability requirements
– Evolutionary feasibility: explicitly state/show how the design satisfies the evolutionary requirements (if any)
Technology Feasibility

2. NDI/NCS interoperability
– Various NDI/NCSes may be used to satisfy the operational concept
– Need to check whether they can interoperate seamlessly: plug and play instead of "plug and pray"
– Usually a manual effort: going through documentation and architectures, and prototyping to see whether glue code is required
Process Feasibility

ICSM for 577 typically has four 'sub-process models':
– Architected Agile (develop from scratch)
– NDI process (shrink-wrapped software; minor customization possible; may have missing functionality)
– NDI-intensive (~30% of features provided by NDI; remaining effort spent appraising features)
– Net-Centric Services (almost all functionality provided by online services, with some customization)
Need to provide a rationale stating which process was chosen and why: how will the process help deliver the operational concept within budget and schedule?
Risk Assessment

Feasibility analysis only helps put estimates on the costs and benefits to ascertain the expected ROI; various environmental factors can still jeopardize project execution and delivery.
– Risk: something that has a possibility of occurring in the future and may negatively impact the outcome of the project
– Problem: a risk that has occurred, or something that will happen with 100% probability
It is necessary to identify, analyze, and prioritize risks, and to prepare mitigation plans in case a risk occurs.
Risk Management/Documentation

Example risk register entry:

Risk: Need to synchronize with another team for delivering a capability; high communication overhead.
– Probability of Loss*: 8
– Magnitude of Loss*: 9
– Risk Exposure: 72
– Mitigations: set up a fixed schedule of frequent meetings and try to raise the problems most likely to occur; hold fixed meetings for synchronizing and finalizing architectural interfaces.

*Scale: 1-9 (1 = lowest, 9 = highest)
Risk Exposure (RE) = Probability of Loss × Magnitude of Loss
(Risks are prioritized by their RE score.)
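The prioritization scheme above can be sketched in a few lines: compute RE = probability × magnitude (each on the 1-9 scale) and rank risks by exposure. Only the synchronization risk comes from the slide; the other two entries are hypothetical examples added for illustration:

```python
# Sketch of the slide's risk prioritization: RE = P(loss) x M(loss),
# both rated 1-9, risks ranked by exposure. Entries other than the
# slide's synchronization risk are hypothetical.
risks = [
    ("Sync with another team; high communication overhead", 8, 9),  # from slide
    ("COTS component may lack a needed feature", 4, 7),             # hypothetical
    ("Key developer unavailable near deadline", 2, 8),              # hypothetical
]

def exposure(prob: int, magnitude: int) -> int:
    """Risk Exposure on the slide's 1-9 x 1-9 scale."""
    assert 1 <= prob <= 9 and 1 <= magnitude <= 9
    return prob * magnitude

ranked = sorted(risks, key=lambda r: exposure(r[1], r[2]), reverse=True)
for name, p, m in ranked:
    print(f"RE={exposure(p, m):2d}  {name}")
```

The highest-exposure risks (here RE = 72 for the synchronization risk) are the ones whose mitigation plans belong in the FED.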
Various Stages of Feasibility Analysis

Feasibility analysis is NOT a one-time activity:
– The granularity of the analysis changes while progressing through the project
– It is conducted continually as more details are uncovered during execution
– A previously "feasible" decision may well become "infeasible" later, or vice versa
– Feasibility evidence is required at every anchor-point milestone in ICSM