Evidence-Based Software Processes
Supannika Koolmanojwong
University of Southern California, Center for Systems and Software Engineering
CS 510

Outline
– Technical Shortfalls
– Management Shortfalls
– Consequences of Evidence Shortfalls
– How much evidence is enough?
– FED development framework
– Experiences with Evidence-Based Reviews

The Incremental Commitment Life Cycle Process: Overview
[Figure: two-stage life cycle – Stage I: Definition; Stage II: Development and Operations]
– Anchor Point Milestones
– Synchronize, stabilize concurrency via FEDs
– Risk patterns determine life cycle process

Problems with a Lack of Evidence
– Committing to a set of plans and specifications with little evidence
– Assuming the budget and schedule will hold
– No assurance
– "Plug and pray"
– Guesstimation

Why Do We Need Evidence?
– To show the positive payoffs
  – Case study analysis
  – Cost-benefit analysis
– As a basis for assuring system stakeholders that it is safe to proceed into development
– Capture this evidence in a Feasibility Evidence Description (FED)
  – Validated by independent experts

Technical Shortfalls
– Current software design and development methods focus on
  – inputs and outputs
  – preconditions and postconditions
– They lack adequate capabilities to support analyses of
  – how well the elements perform
  – how expensive they are
  – how compatible they are
  – off-nominal performance
– They usually show only sunny-day scenarios

Technical Shortfalls (continued)
– Reviews tend to focus on exhaustive presentations
  – PowerPoint charts and UML diagrams
– They provide little evidence on
  – rainy-day scenarios
  – whether the system will perform adequately on throughput, response time, safety, security, usability, …
  – whether it will be buildable within the available budgets and schedules
  – whether it will generate positive returns on investment

Model-Driven Development
– Strongly focused on expressing product capabilities and relationships
– Much less focused on reasoning about their combined incompatibilities and incompleteness across models
  – process models, property models, success models

The Model-Clash Spider Web: MasterNet
[Figure: spider web of clashes among stakeholder value propositions (win conditions)]

MasterNet Case Study
– Incompatibilities among product models and other stakeholder value models are at least as frequent and important as product-product (PD-PD) model clashes
– MasterNet users specified 3.5 million source lines of code
  – at a cost of $30/SLOC, roughly $105 million
  – against a $22 million budget
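A quick worked check of the overcommitment, using only the figures on this slide:

\[
3{,}500{,}000\ \text{SLOC} \times \$30/\text{SLOC} = \$105\text{M} \approx 4.8 \times \text{the } \$22\text{M budget}
\]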

Frequency of Project Model Clash Types
– Product-product model incompatibilities account for
  – 30% of the model clashes
  – 24% of the risks

Management Shortfalls
– Schedule-based reviews
  – "The contract specifies that the Preliminary Design Review (PDR) will be held on April 1, 2011, whether we have a design or not."
– Neither the customer nor the developer wants to fail the PDR
  – despite numerous undefined interfaces and unresolved risks

Management Shortfalls (continued)
– Event-based reviews
  – "Once we have a preliminary design, we will hold the PDR."
– Exhaustive presentations of sunny-day PowerPoint charts and UML diagrams
– Focus on catching the 30% of project model clashes that are of the product-product form
– Numerous other model clashes remain unidentified and will cause extensive project rework, overruns, and incomplete deliveries

Management Shortfalls (continued)
– Most outsourcing contracts focus on product-oriented deliverables and reviews
– "I'd like to have some of my systems engineers address those software quality-factor risks, but my contract deliverables and award fees are based on having all of the system's functions defined by the next review."

Management Shortfalls (continued)
– Project earned-value management systems used to track progress overfocus on product definition
– Data item descriptions (DIDs) for deliverables
  – Most contract DIDs cover function, interface, and infrastructure
  – Demonstration of their feasibility is relegated to optional appendices

Consequences of Evidence Shortfalls
– Biannual Standish Reports identify major root causes of project failure, including
  – shortfalls in evidence of feasibility with respect to stakeholder objectives
– 2009 Standish Report (roughly 9,000 projects):
  – 32% delivered their full capability within their budget and schedule
  – 24% were cancelled
  – 44% were significantly over budget, over schedule, and/or incompletely delivered

Consequences of Evidence Shortfalls (continued)
– 2005 Standish Report: 71% of the sources of failure were due to lack of
  – user involvement
  – executive support
  – clear requirements
  – proper planning
  – realistic expectations

How Much Architecting Is Enough?
[Figure: added project cost for architecting, rework, and total cost versus level of architecture and risk resolution investment, by project size]

How Much Architecting Is Enough?
– The amount of rework was an exponential function of project size
– A small project can
  – easily adapt its architecture to rapid change via refactoring
  – with a rework penalty of only 18% between minimal and extremely thorough architecture and risk resolution

How Much Architecting Is Enough? (continued)
– A very large project faces
  – a rework penalty of 91%
  – integration rework due to undiscovered large-component interface incompatibilities and critical performance shortfalls

How Much Architecting Is Enough? (continued)
– Black lines in the figure represent the average-case cost of rework, architecting, and total cost
– Dotted red lines show the effect of rapid change on the cost of architecting and total cost
  – rapid change adds 50% to the cost of architecture and risk resolution

How Much Architecting Is Enough? (continued)
– High investments in architecture, feasibility analysis, and other documentation do not have a positive return on investment for very high-volatility projects, due to the high costs of documentation rework for rapid-change adaptation

How Much Architecting Is Enough? (continued)
– Blue lines represent a conservative analysis of the external cost effects of system failure due to unidentified architecting shortfalls
– The costs of architecting shortfalls are not only added development project rework, but also losses to the organization's operational effectiveness and productivity

How Much Architecting Is Enough? (continued)
– The greater the project's size, criticality, and stability, the greater the need for validated architecture feasibility evidence
– For very small, low-criticality projects with high volatility, the proof efforts would make little difference, would need to be continuously redone, and would produce a negative return on investment
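The sweet-spot tradeoff behind these slides can be sketched numerically. The following is a minimal illustration, not the calibrated COCOMO II analysis: it assumes rework decays exponentially with architecting investment, and the constants are made-up values chosen only to echo the 18% and 91% penalties quoted above.

import math

# Illustrative "how much architecting is enough?" sweet-spot sketch.
# Assumption: total added effort = architecting investment + remaining
# rework, with rework decaying exponentially as architecting increases.
# The rework-at-zero values echo the 18% / 91% penalties on the slides;
# the decay rates are illustration parameters only.

def added_effort(arch_pct: float, rework_at_zero: float, decay: float) -> float:
    """Percent of project effort added for architecting plus rework."""
    rework_pct = rework_at_zero * math.exp(-decay * arch_pct)
    return arch_pct + rework_pct

for label, rework0, decay in [("small project", 18.0, 0.20),
                              ("very large project", 91.0, 0.08)]:
    # Scan 0..60% architecting investment for the minimum total added effort.
    best = min(range(61), key=lambda a: added_effort(a, rework0, decay))
    print(f"{label}: sweet spot near {best}% architecting investment")

Under these assumed curves the sweet spot lands near 6% for the small project and near 25% for the very large one, matching the qualitative point: the larger and more stable the project, the more a thorough architecting investment pays off.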

Evidence Criteria
– Evidence provided by the developer and validated by independent experts that, if the system is built to the specified architecture, it will:
  – satisfy the specified operational concept and requirements, including capability, interfaces, level of service, and evolution
  – be buildable within the budgets and schedules in the plan
  – generate a viable return on investment
  – generate satisfactory outcomes for all of the success-critical stakeholders
– Identify shortfalls in evidence as risks, and cover them with risk mitigation plans

Commitment Reviews
– A set of Anchor Point milestone reviews to ensure that many concurrent activities are
  – synchronized, stabilized, and risk-assessed
– Focused on developer-produced and expert-validated evidence
  – The experts need to determine the realism of assumptions, the representativeness of scenarios, the thoroughness of analysis, and the coverage of key off-nominal conditions
– Help the system's success-critical stakeholders determine whether to proceed into the next level of commitment

Commitment Reviews (continued)
– A shortfall in feasibility evidence indicates a level of program execution uncertainty and a source of program risk
– It is often not possible to fully resolve all risks at a given point in the development cycle
– Known, unresolved risks need to be identified and covered by risk management plans, including
  – the necessary staffing
  – funding to address them

Is Risk Bad?
– A program with risks is
  – not necessarily bad
  – particularly if it has strong risk management plans
– A program with no risks
  – may be high on achievability
  – but low on ability to produce a timely payoff or competitive advantage

Evidence
– Prototypes: of networks, robots, user interfaces, COTS interoperability
– Benchmarks: for performance, scalability, accuracy
– Exercises: for mission performance, interoperability, security
– Models: for cost, schedule, performance, reliability; tradeoffs
– Simulations: for mission scalability, performance, reliability
– Early working versions: of infrastructure, data fusion, legacy compatibility
– Previous experience

Validity of the Evidence (1)
– Data well defined
  – What is counted
  – How it is derived (e.g., how measured, calculated, or inferred)
– Representative mission scenarios
  – Operational environment
  – Adversaries
  – Component reliability, scalability, etc.
  – Nominal and off-nominal scenarios
  – Treatment of uncertainty
  – Composability of results; interference effects
  – Scale

Validity of the Evidence (2)
– Parameter values realistic
  – Based upon measured results
  – Inferred from representative projects/activities
  – Derived from relevant scenarios
– Outputs traceable to mission effectiveness
  – Directly/indirectly
  – Based on appropriate models, scenarios
– Models verified and validated
  – Via representative scenarios
  – Limiting cases, off-nominals realistic

FED Development Process Framework
– If the evidence does not accompany the specifications and plans, the specifications and plans are incomplete
– Event-based reviews need to be replaced by evidence-based reviews
– Documentation: no one-size-fits-all
  – Determine the appropriate level of analysis and evaluation
– As with reused software, evidence can be appropriately reused

Steps in Developing a FED (1)
– Step A: Develop phase work products/artifacts
  – For a Development Commitment Review, this would include the system's operational concept, prototypes, requirements, architecture, life cycle plans, and associated assumptions
– Step B: Determine most critical feasibility assurance issues
  – Issues for which lack of feasibility evidence is program-critical
– Step C: Evaluate feasibility assessment options
  – Cost-effectiveness; necessary tool, data, scenario availability
– Step D: Select options, develop feasibility assessment plans
– Step E: Prepare FED assessment plans and earned value milestones

Steps in Developing a FED (2)
– Step F: Begin monitoring progress with respect to plans
  – Also monitor changes to the project, technology, and objectives, and adapt plans
– Step G: Prepare evidence-generation enablers
  – Assessment criteria
  – Parametric models, parameter values, bases of estimate
  – COTS assessment criteria and plans
  – Benchmarking candidates, test cases
  – Prototypes/simulations, evaluation plans, subjects, and scenarios
  – Instrumentation, data analysis capabilities
– Step H: Perform pilot assessments; evaluate and iterate plans and enablers
  – Short bottom-line summaries and pointers to evidence files are generally sufficient

Steps in Developing a FED (3)
– Step I: Assess readiness for Commitment Review
  – Shortfalls identified as risks and covered by risk mitigation plans
  – Proceed to Commitment Review if ready (see the sketch after this list)
– Step J: Hold Commitment Review when ready; adjust plans based on review outcomes
  – See the Commitment Review process overview above
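To make the flow concrete, here is a minimal sketch of steps A–J as a programmatic checklist. The step labels and descriptions come from the tables above; the data structure, the readiness rule, and the example shortfalls are illustrative assumptions, not part of the FED definition.

# Steps A-J from the tables above, as an ordered checklist.
FED_STEPS = [
    ("A", "Develop phase work products/artifacts"),
    ("B", "Determine most critical feasibility assurance issues"),
    ("C", "Evaluate feasibility assessment options"),
    ("D", "Select options, develop feasibility assessment plans"),
    ("E", "Prepare FED assessment plans and earned value milestones"),
    ("F", "Begin monitoring progress with respect to plans"),
    ("G", "Prepare evidence-generation enablers"),
    ("H", "Perform pilot assessments; evaluate and iterate plans and enablers"),
    ("I", "Assess readiness for Commitment Review"),
    ("J", "Hold Commitment Review when ready; adjust plans"),
]

def ready_for_review(shortfalls: dict) -> bool:
    """Step I rule: every evidence shortfall must be identified as a risk
    and covered by a risk mitigation plan (value True) before proceeding."""
    return all(shortfalls.values())

# Hypothetical example: one shortfall still lacks a mitigation plan,
# so the project is not yet ready for the Commitment Review.
shortfalls = {"COTS scalability evidence": True,
              "response-time benchmark": False}
print(ready_for_review(shortfalls))  # -> False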

Conclusion
– Overall, evidence-based specifications and plans will not guarantee a successful project, but in general they will eliminate many of the software delivery overruns and shortfalls experienced on current software projects.