Software in Acquisition Workshop
Software Expert Panel Outbrief, Day 2
DoD Software in Acquisition Workshop, 15-16 April 2008
Institute for Defense Analyses, Alexandria, VA
Voting for Payoff vs. Ease for “Big Ideas”
Payoff vs. Ease for “Big Ideas”
[Chart: candidate “big ideas” plotted by payoff (low to high) against ease (difficult to easy); max score is 261 = 29*9. Ideas plotted:]
● Define software implications of competitive prototyping policy
● Lack of software requirements and technology maturity are causing most systems to fail
● Recommended guidance for defining architecture requirements and quantitatively identifying, predicting, evaluating, verifying, and validating quality characteristics and attributes
● Body of knowledge for estimation
● Systems and software quality survey report (e.g., resources regarding: selection of characteristics, measurements, evaluation methods, etc.)
Knit Together Selected “Big Ideas” into a Unified Proposed Initiative:
“Leveraging Competitive Prototyping through Acquisition Initiatives in Integrated Systems and Software Requirements, Risk, Estimation, and Quality”
Competitive Prototyping Memo
Leveraging Competitive Prototyping through Acquisition Initiatives in Integrated Systems and Software Requirements, Risk, Estimation, and Quality (1/3)
● Task 1: Conduct surveys and interviews of leading software professionals (government, industry, academia) to gather ideas, assess impacts, and sense expectations for Competitive Prototyping
– 14 days: Coordination of feedback from selected participation in the 4/28/08 DoD Workshop on Competitive Prototyping
– 6 months: Summary report complete
● Task 2: Provide amplification of the Competitive Prototyping memo for integrated SE/SW, including where in the lifecycle there are opportunities for Competitive Prototyping and how they can be contractually achieved
– 6 months: Initial sensing of guidance/implementation implications, including investigating the movement of Milestone B to after System PDR
– 12 months: Specific guidance/implementation recommendations
Leveraging Competitive Prototyping through Acquisition Initiatives in Integrated Systems and Software Requirements, Risk, Estimation, and Quality (2/3)
● Task 3: Identify first adopters of Competitive Prototyping and facilitate and gather insights on effective usage, including collecting and analyzing data
– 6 months: Recommendations for programs, including lessons learned from previous similar programs that used competitive prototyping
– 12 months: Support kickoff of programs
– 18 months: Actively engaged with programs in facilitation and gathering insights
● Task 4: Develop guidance for RFP authors on early selection and application of integrated SE/SW quality systems for Competitive Prototyping
– 6 months: 1st draft, ready for limited circulation
– 18 months: 2nd draft, ready for wide circulation
Leveraging Competitive Prototyping through Acquisition Initiatives in Integrated Systems and Software Requirements, Risk, Estimation, and Quality (3/3)
● Task 5: Develop a Software Engineering Handbook for Competitive Prototyping, including material explicitly targeted to different audiences (acquirer, supplier, etc.). Note: Tasks 5 and 6 are tightly coupled.
– 6 months: Outline
– 12 months: 1st draft, ready for limited circulation
– 18 months: Usage and evaluations on programs
– 30 months: 2nd draft, ready for wide circulation
● Task 6: Develop training assets (materials, competencies, skill sets, etc.) that capture best-of-class ideas/practices for Competitive Prototyping. Note: Tasks 5 and 6 are tightly coupled.
– 6 months: Outline
– 12 months: Scenarios and initial example materials, including drafts coordinated with feedback from programs
– 18 months: Usage and evaluations on programs
– 24 months: 1st draft, ready for limited circulation
– 36 months: 2nd draft, ready for wide circulation
Original “Big Ideas” Charts
Define Software Implications of Competitive Prototyping Policy
● Summary
– Define the software implications of the competitive prototyping policy and use this opportunity to address the “overall estimation problem”
● Benefits
– Concurrently engineer the systems and software requirements
– Address the overall estimation problem
– Improve how best to manage expectations (software achievability)
– Align incentives for stakeholders: program office, services, industry VPs, proposal business development, program execution
Lack of Software Requirements and Technology Maturity Are Causing Most Systems to Fail
● Summary
– These are the dominant factors (according to GAO) in software system acquisition that need to be addressed
– Recommend pilot programs that identify software technology development consistent with the Young memo
● Benefits
– Summarizes statements from many sources about software problems and what we can learn from them
– Posits a change to the life cycle: better and more mature requirements at initiation; substantially more attention to software maturity; and PMs knowing more about their jobs (seeking support in software and systems engineering and in cost estimation and tracking)
● Team Elaborations
– Much of what we are trying to develop is cutting edge, so we have to consciously deal with technology maturation (integration of multiple parallel new technologies)
– Need discussion of the development of human capital
– Need to understand scope change versus evolution, clarification, and careful decomposition
Acquisition Guidance
● Task 2: Develop recommended guidance, based on an example Quality Model, for the architecture of software-intensive systems; includes guidance for defining architecture requirements and quantitatively identifying, predicting, evaluating, verifying, and validating Quality Characteristics and Attributes
● Deliverables
– Deliverable 1: Systems and software quality survey report (e.g., resources regarding: selection of characteristics, measurements, evaluation methods, etc.)
– Deliverable 2: Recommended guidance for defining architecture requirements and quantitatively identifying, predicting, evaluating, verifying, and validating Quality Characteristics and Attributes
● Timeline
– Survey report: 4 months after the 4/08 workshop
– Draft recommended guidance: 12 months after the 4/08 workshop
– Updated recommended guidance: 18 months after the 4/08 workshop
Body of Knowledge for Estimation
● Summary
– Curriculum for training people on estimation; body of knowledge for estimation (pulls from many disciplines: SW, economics, finance, management)
● Benefits
– Grow human capital
– Disseminate best practices and the body of knowledge
– Facilitate professional certifications
– Build on the MS in SW Engineering curriculum
– Define types of competencies and skill sets
– Synergies with Young’s recently kicked-off Software Acquisition Training and Education Working Group (SATEWG)
Whitepapers Available (see SEI website, www.sei.cmu.edu)
● Type 1:
– Making Requirements Management Work in an Organization
● Type 2:
– Requirements Engineering
● Type 3:
– Department of Defense SW System Acquisition--What’s Broke and What Can SW Requirement Management Contribute to a Systemic Fix?
– Delivering Common Software Requirements for DoD Mission Capabilities
– A Consideration for Enabling a Systems-of-Systems (SoS) Governance
Charter for Day 2 Working Sessions
● React to the “unified proposal” on “Enabling Competitive Prototyping through Software Acquisition Initiatives in Requirements, Risk/Estimation, and Quality”
● Define/refine/improve near-term tasks, deliverables, and milestones
● Define task leaders
● Define lists of interested participants
● Estimate resources required for near-term tasks
Proposed Software in Acquisition Working Group Meeting (week of July 14, 2008, Washington, DC area)
● JULY: Outbrief from the 4/28/08 and related meetings on Competitive Prototyping (Blake Ireland)
● JULY: Updated task planning/status for Tasks 1-6 and performing organizations (Bruce Amato plus reps from performing orgs)
● JULY/OCT: Invited speakers to share experiences from previous competitive prototyping (Rick Selby)
● JULY: Update on the DAG and related guides (John Forbes)
● JULY/OCT: Systems Engineering Forum coordination (Kristen Baldwin)
● JULY/OCT: Initial results from Task 1 surveys/interviews regarding gathering ideas, assessing impacts, and sensing expectations for Competitive Prototyping (Carl Clavadetscher)
● OCT: Panel discussion on how competitive prototyping has been used in the past, how it is currently being planned as embodied in the Competitive Prototyping memo, and emerging/unanticipated issues (Ken Nidiffer)
● OCT: Update on programs adopting Competitive Prototyping and how they are doing so, including status of their plans and decisions (Bruce Amato)
● JULY/OCT: Action plan going forward, including planning for the Fall 2008 meeting (All)
● Invitees: April 2008 attendees plus people working Tasks 1-6
TPS Decision Problem
● Available decision rules are inadequate
● Need better information
● Information has economic value

Payoff matrix ($K):

                     State of Nature
Alternative          Favorable   Unfavorable
BB (Bold)               250          -50
BC (Conservative)        50           50
Expected Value of Perfect Information (EVPI)
● Build a prototype for $10K
– If the prototype succeeds, choose BB; payoff: $250K - $10K = $240K
– If the prototype fails, choose BC; payoff: $50K - $10K = $40K
● If the two states are equally likely, EV = 0.5($240K) + 0.5($40K) = $140K
● Without any information, the best alternative (BB) has EV = 0.5($250K) + 0.5(-$50K) = $100K
● We could invest up to $50K this way and do better than before; thus EVPI = $50K
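
To make the arithmetic concrete, here is a brief illustrative Python sketch (not part of the original slides) that computes the EVPI from the TPS payoff matrix above. The $10K prototype on this slide approximates perfect information, so its $140K expected value is just the $150K perfect-information value minus the $10K cost.

    # Illustrative sketch: expected value of perfect information (EVPI)
    # for the TPS decision problem, using the payoffs from the slides.
    p = {"S": 0.5, "F": 0.5}                 # equally likely states of nature
    payoff = {"BB": {"S": 250, "F": -50},    # Bold: $250K or -$50K
              "BC": {"S": 50,  "F": 50}}     # Conservative: $50K either way

    # Without information: commit to the one alternative with the best EV.
    ev_no_info = max(sum(p[s] * row[s] for s in p) for row in payoff.values())

    # With perfect information: pick the best alternative in each state.
    ev_perfect = sum(p[s] * max(row[s] for row in payoff.values()) for s in p)

    evpi = ev_perfect - ev_no_info
    print(ev_no_info, ev_perfect, evpi)      # 100.0 150.0 50.0 (all in $K)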
However, a Prototype Will Give Imperfect Information
● Notation: IB means the investigation (prototype) says “choose Bold”; SS and SF are the states of nature in which Bold will succeed or fail
● Perfect information would mean:
– P(IB|SF) = 0.0: the prototype never says “choose Bold” when Bold will in fact fail
– P(IB|SS) = 1.0: the prototype always says “choose Bold” when Bold will in fact succeed
Suppose we assess the prototype’s imperfections as:
P(IB|SF) = 0.20, P(IB|SS) = 0.90
And suppose the states of nature are equally likely:
P(SF) = 0.50, P(SS) = 0.50
We would like to compute the expected value of using the prototype:
EV(IB, IC) = P(IB) (payoff if we use Bold) + P(IC) (payoff if we use Conservative)
           = P(IB) [ P(SS|IB) ($250K) + P(SF|IB) (-$50K) ] + P(IC) ($50K)
But these aren’t the probabilities we know.
How to Get the Probabilities We Need
● P(IB) = P(IB|SS) P(SS) + P(IB|SF) P(SF)
● P(IC) = 1 - P(IB)
● P(SS|IB) = P(IB|SS) P(SS) / P(IB)   (Bayes’ formula)
● P(SF|IB) = 1 - P(SS|IB)
● Interpretation: P(SS|IB) = Prob(we will choose Bold in a state of nature where it will succeed) / Prob(we will choose Bold)
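
As an illustrative check (again not from the original slides), the two preceding slides combine into a few lines of Python, using the $250K/-$50K/$50K payoffs of the TPS problem:

    # Illustrative sketch: EV of an imperfect prototype via Bayes' formula.
    p_ss, p_sf = 0.50, 0.50     # equally likely states of nature
    p_ib_ss = 0.90              # P(IB|SS): says "Bold" when Bold will succeed
    p_ib_sf = 0.20              # P(IB|SF): says "Bold" when Bold will fail

    p_ib = p_ib_ss * p_ss + p_ib_sf * p_sf   # P(IB) = 0.55
    p_ic = 1.0 - p_ib                        # P(IC) = 0.45
    p_ss_ib = p_ib_ss * p_ss / p_ib          # Bayes: P(SS|IB) = 0.818...
    p_sf_ib = 1.0 - p_ss_ib                  # P(SF|IB) = 0.181...

    # Follow the prototype's recommendation; payoffs in $K.
    ev = p_ib * (p_ss_ib * 250 + p_sf_ib * (-50)) + p_ic * 50
    print(round(ev, 1))                      # 130.0, i.e. $130K before cost

(The summary table on the next slide evidently starts from a different underlying payoff matrix, so its EV figures differ.)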
Net Expected Value of Prototype

PROTO COST, $K   P(PS|SF)   P(PS|SS)   EV, $K   NET EV, $K
      0             -          -         60          0
      5            0.30       0.80       69.3        4.3
     10            0.20       0.90       78.2        8.2
     20            0.10       0.95       86.8        6.8
     30            0.00       1.00       90          0

[Chart: NET EV, $K (0-8) plotted against PROTO COST, $K (0-30), peaking at $8.2K for the $10K prototype.]
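Reading the table, the net value appears to be computed as NET EV = EV - EV(no prototype) - prototype cost, with EV(no prototype) = $60K taken from the first row; for example, the $10K prototype yields 78.2 - 60 - 10 = $8.2K.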
Conditions for Successful Prototyping (or Other Info-Buying)
1. There exist alternatives whose payoffs vary greatly depending on some states of nature.
2. The critical states of nature have an appreciable probability of occurring.
3. The prototype has a high probability of accurately identifying the critical states of nature.
4. The required cost and schedule of the prototype do not overly curtail its net value.
5. There exist significant side benefits derived from building the prototype.
Pitfalls Associated with the Success Conditions
1. “Always build a prototype or simulation”
– May not satisfy conditions 3, 4
2. “Always build the software twice”
– May not satisfy conditions 1, 2
3. “Build the software purely top-down”
– May not satisfy conditions 1, 2
4. “Prove every piece of code correct”
– May not satisfy conditions 1, 2, 4
5. “Nominal-case testing is sufficient”
– May need off-nominal testing to satisfy conditions 1, 2, 3
Statistical Decision Theory: Other S/W Engineering Applications
● How much should we invest in:
– User information gathering
– Make-or-buy information
– Simulation
– Testing
– Program verification
● How much should our customers invest in:
– MIS, query systems, traffic models, CAD, automatic test equipment, …
The Software Engineering Field Exists Because Processed Information Has Value