
ODC COQUALMO
Ray Madachy, Barry Boehm
USC Center for Systems and Software Engineering
22nd International Forum on COCOMO and Systems/Software Cost Modeling
November 2, 2007

Introduction
Software risk advisory tools have been developed to support critical NASA flight projects on the Constellation program*
Tradeoff analyses are supported by the Orthogonal Defect Classification COnstructive QUALity MOdel (ODC COQUALMO), which predicts software defects introduced and removed, classifying them with ODC defect types
–Static and dynamic versions of the model
V&V techniques are quantified from a value-based perspective when the defect model is integrated with risk minimization techniques
* This work was partially sponsored by NASA Ames Software Risk Advisory Tools Cooperative Agreement No. NNA06CB29A

COQUALMO Background
COQUALMO is a COCOMO extension that predicts the number of residual defects in a software product [Chulani, Boehm 1999]
–Uses COCOMO II cost estimation inputs with defect removal parameters to predict the numbers of generated, detected and remaining defects for requirements, design and code
Enables 'what-if' analyses that demonstrate the impact of
–Defect removal techniques for automated analysis, peer reviews, and execution testing on defect types
–Personnel, project, product and platform characteristics on software quality
Provides insights into
–Assessment of payoffs for quality investments
–Understanding of interactions amongst quality strategies
–Probable ship time
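To make the defect introduction and removal structure above concrete, the sketch below computes residual defects per artifact type from assumed introduction rates and removal fractions. Every number and name in it is an illustrative placeholder, not a calibrated COQUALMO value.

```python
# Minimal sketch of the COQUALMO introduction/removal structure.
# All numeric values are illustrative placeholders, not calibrated constants.

ARTIFACTS = ("requirements", "design", "code")

# Hypothetical defects introduced per KSLOC; in the real model these follow
# from software size and the COCOMO II cost drivers.
introduced_per_ksloc = {"requirements": 10.0, "design": 20.0, "code": 30.0}

# Hypothetical defect removal fractions per artifact for the three
# defect removal profiles, derived from a project's rating levels.
removal_fractions = {
    "automated_analysis":          {"requirements": 0.10, "design": 0.15, "code": 0.30},
    "peer_reviews":                {"requirements": 0.40, "design": 0.40, "code": 0.35},
    "execution_testing_and_tools": {"requirements": 0.20, "design": 0.30, "code": 0.50},
}

def residual_defects(size_ksloc: float) -> dict:
    """Residual defects per artifact: introduced defects times the fraction
    that escapes all three removal profiles."""
    result = {}
    for artifact in ARTIFACTS:
        introduced = introduced_per_ksloc[artifact] * size_ksloc
        escape = 1.0
        for profile in removal_fractions.values():
            escape *= 1.0 - profile[artifact]
        result[artifact] = introduced * escape
    return result

print(residual_defects(size_ksloc=100))
```

This is the shape of the 'what-if' analyses noted above: raising one profile's removal fractions and re-running shows the change in remaining defects.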

ODC COQUALMO Overview
[Block diagram: COCOMO II with the ODC COQUALMO Defect Introduction and Defect Removal Models; ODC additions shown in blue]
Model inputs:
–Software size
–Software product, process, platform and personnel attributes
–Defect removal capability levels: automated analysis, peer reviews, execution testing and tools
Model outputs:
–Software development effort and schedule
–Number of residual defects and defect density per unit of size, broken out as Requirements (4 types), Design (6 types), and Code (6 types)
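As an indication of how the ODC additions refine the base model's outputs, the sketch below splits an artifact-level defect count across ODC defect types using an assumed distribution; the fractions are hypothetical placeholders (the distribution actually used is discussed under Current Calibrations).

```python
# Sketch of the ODC refinement: distribute an artifact's defect count across
# its ODC defect types. The distribution fractions are hypothetical placeholders.

odc_distribution = {
    "requirements": {"correctness": 0.30, "completeness": 0.35,
                     "consistency": 0.20, "ambiguity_testability": 0.15},
    # Per the slide, code defects use the same six categories as design.
    "design": {"interface": 0.20, "timing": 0.10, "class_object_function": 0.20,
               "method_logic_algorithm": 0.20, "data_values_initialization": 0.20,
               "checking": 0.10},
}

def defects_by_odc_type(artifact: str, artifact_defects: float) -> dict:
    """Split an artifact-level defect count across its ODC defect types."""
    return {odc_type: artifact_defects * fraction
            for odc_type, fraction in odc_distribution[artifact].items()}

print(defects_by_odc_type("requirements", 120.0))
```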

COQUALMO Defect Removal Rating Scales
Automated Analysis:
–Very Low: Simple compiler syntax checking
–Low: Basic compiler capabilities
–Nominal: Compiler extension; basic requirements and design consistency
–High: Intermediate-level module; simple requirements/design
–Very High: More elaborate requirements/design; basic distributed processing
–Extra High: Formalized specification and verification; advanced distributed processing
Peer Reviews:
–Very Low: No peer review
–Low: Ad-hoc informal walk-throughs
–Nominal: Well-defined preparation, review, minimal follow-up
–High: Formal review roles, well-trained people and basic checklists
–Very High: Root cause analysis, formal follow-up using historical data
–Extra High: Extensive review checklists, statistical control
Execution Testing and Tools:
–Very Low: No testing
–Low: Ad-hoc test and debug
–Nominal: Basic test; test criteria based on checklists
–High: Well-defined test sequence and basic test coverage tool system
–Very High: More advanced test tools and preparation; distributed monitoring
–Extra High: Highly advanced tools, model-based test
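For illustration only, the snippet below shows one way a project's three defect removal ratings could be captured and checked against the levels defined above; the encoding is an assumption, not part of the published model.

```python
# Illustrative encoding of the COQUALMO defect removal rating scales.
RATING_LEVELS = ("Very Low", "Low", "Nominal", "High", "Very High", "Extra High")
PROFILES = ("Automated Analysis", "Peer Reviews", "Execution Testing and Tools")

def validate_profile(ratings: dict) -> None:
    """Check that each defect removal profile is assigned one of the defined levels."""
    for profile in PROFILES:
        if ratings.get(profile) not in RATING_LEVELS:
            raise ValueError(f"{profile}: expected one of {RATING_LEVELS}")

# Example project: basic compiler checks, formal reviews, advanced test tooling.
validate_profile({
    "Automated Analysis": "Low",
    "Peer Reviews": "High",
    "Execution Testing and Tools": "Very High",
})
```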

ODC Extension
ODC COQUALMO decomposes defects using ODC categories [Chillarege et al. 1992]
–Enables tradeoffs of different detection efficiencies for the removal practices per type of defect
The ODC taxonomy provides well-defined criteria for the defect types and has been successfully applied on NASA projects
–ODC defects are mapped to operational flight risks

ODC Defect Categories
Requirements:
–Correctness
–Completeness
–Consistency
–Ambiguity/Testability
Design/Code:
–Interface
–Timing
–Class/Object/Function
–Method/Logic/Algorithm
–Data Values/Initialization
–Checking
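A minimal data-structure rendering of the category list above (names taken directly from the slide):

```python
# ODC defect categories used by ODC COQUALMO, as listed on the slide.
REQUIREMENTS_DEFECT_TYPES = (
    "Correctness", "Completeness", "Consistency", "Ambiguity/Testability",
)
DESIGN_CODE_DEFECT_TYPES = (
    "Interface", "Timing", "Class/Object/Function", "Method/Logic/Algorithm",
    "Data Values/Initialization", "Checking",
)
```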

Supports Value-Based Perspective
Motivations:
–V&V techniques have different detection efficiencies for different types of defects
–Effectiveness may vary over the lifecycle
–Different defect types have different flight risk impacts
–Scarce V&V resources to optimize
–V&V methods may have overlapping capabilities for detecting defects
ODC COQUALMO with other tools can help quantify the best combination of V&V techniques, their optimal order and timing
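The sketch below illustrates this value-based reasoning: rank V&V techniques by risk-weighted defects removed per unit cost and pick a set within a budget. Efficiencies, impacts and costs are all invented for the example; a real analysis would use ODC COQUALMO predictions and calibrated detection efficiencies.

```python
# Hedged sketch of value-based V&V selection. All numbers are hypothetical.

predicted_defects = {"timing": 40.0, "interface": 60.0, "checking": 80.0}
flight_risk_impact = {"timing": 5.0, "interface": 3.0, "checking": 1.0}  # relative severity

# Fraction of each defect type a technique detects when used alone.
detection_efficiency = {
    "peer_reviews":       {"timing": 0.2, "interface": 0.5, "checking": 0.4},
    "automated_analysis": {"timing": 0.1, "interface": 0.3, "checking": 0.7},
    "execution_testing":  {"timing": 0.6, "interface": 0.4, "checking": 0.5},
}
technique_cost = {"peer_reviews": 10.0, "automated_analysis": 5.0, "execution_testing": 20.0}

def risk_reduction(technique: str) -> float:
    """Risk-weighted defects the technique is expected to remove on its own."""
    eff = detection_efficiency[technique]
    return sum(predicted_defects[t] * eff[t] * flight_risk_impact[t]
               for t in predicted_defects)

def choose_techniques(budget: float) -> list:
    """Greedy selection by risk reduction per unit cost within the budget."""
    ranked = sorted(detection_efficiency,
                    key=lambda t: risk_reduction(t) / technique_cost[t],
                    reverse=True)
    chosen, spent = [], 0.0
    for tech in ranked:
        if spent + technique_cost[tech] <= budget:
            chosen.append(tech)
            spent += technique_cost[tech]
    return chosen

print(choose_techniques(budget=25.0))
```

The greedy rule ignores the overlap between techniques noted above; an actual optimization would account for diminishing returns when techniques catch the same defects.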

Complementary Risk Techniques
V&V processes are designed and optimized with ODC COQUALMO by integrating it with different risk minimization techniques
For NASA we have demonstrated linkage with machine learning, strategic optimization, and JPL’s Defect Detection and Prevention (DDP) risk management method
–Can determine the impact of V&V techniques on specific risks and overall flight risk

Current Calibrations
The initial ODC defect distribution pattern is per JPL rover studies [Lutz and Mikulski 2003]
A two-round Delphi survey was completed to quantify ODC defect detection efficiencies, gauging the effect of different defect removal techniques against the ODC categories (example defect detection efficiencies shown on the slide)
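The snippet below indicates how detection efficiencies of the kind elicited in the Delphi survey could be applied: techniques act in sequence, each removing a fraction of the defects still remaining in each ODC category. The efficiency values are invented for illustration and are not the survey results.

```python
# Sketch: attribute removed defects to each technique per ODC type, applying
# the techniques in sequence. Efficiency values are illustrative only.
from pprint import pprint

introduced = {"completeness": 50.0, "timing": 30.0, "checking": 70.0}

# Fraction of the *remaining* defects of each ODC type caught by each technique.
efficiencies = {
    "automated_analysis": {"completeness": 0.05, "timing": 0.10, "checking": 0.60},
    "peer_reviews":       {"completeness": 0.50, "timing": 0.20, "checking": 0.30},
    "execution_testing":  {"completeness": 0.20, "timing": 0.60, "checking": 0.40},
}

def removal_breakdown() -> dict:
    """Defects removed by each technique per ODC type, plus the residual."""
    remaining = dict(introduced)
    removed = {tech: {} for tech in efficiencies}
    for tech, eff in efficiencies.items():
        for odc_type in remaining:
            caught = remaining[odc_type] * eff[odc_type]
            removed[tech][odc_type] = caught
            remaining[odc_type] -= caught
    removed["residual"] = remaining
    return removed

pprint(removal_breakdown())
```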

ODC COQUALMO Inputs
[Tool input screen shown on the slide]

ODC COQUALMO Defect Outputs
[Tool defect output screen shown on the slide; effort and schedule outputs not shown]

Dynamic ODC COQUALMO
Portion for Requirements Completeness defects only (model diagram shown on the slide)

Dynamic ODC COQUALMO Sample Outputs
Example of applying increased V&V for Execution Testing and Tools at 18 months (simulation outputs shown on the slide)
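In the same spirit as the dynamic model output above, here is a minimal difference-equation sketch for a single defect type, with a step increase in testing detection at month 18; all rates and fractions are invented for illustration.

```python
# Minimal discrete-time sketch in the spirit of the dynamic ODC COQUALMO portion:
# a stock of undetected Requirements Completeness defects fed by an introduction
# rate and drained by a detection rate that steps up at month 18.
# All rates and fractions are invented for illustration.

def simulate(months: int = 36, dt: float = 1.0) -> list:
    undetected = 0.0
    history = []
    for month in range(months):
        introduction_rate = 8.0 if month < 24 else 0.0     # defects/month while requirements work is active
        detection_fraction = 0.10 if month < 18 else 0.25  # increased Execution Testing and Tools at 18 months
        detection_rate = detection_fraction * undetected
        undetected += (introduction_rate - detection_rate) * dt
        history.append((month + 1, round(undetected, 1)))
    return history

for month, level in simulate():
    print(f"month {month:2d}: undetected completeness defects ~ {level}")
```

With these placeholder values the undetected-defect level climbs until month 18 and then declines once the detection fraction steps up.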

Conclusions and Ongoing Work
Software estimation models can play an important role in facilitating the right balance of activities to meet project goals
–ODC COQUALMO can be used in different ways to reason about and optimize V&V processes
We are recalibrating ODC COQUALMO with a combination of empirical data and Delphi survey results
–Includes specialized calibrations with some CSSE industrial affiliates
Continuing to tailor the model for NASA flight projects
–Analyzing empirical case studies and data from manned and unmanned NASA flight projects, as well as from other USC-CSSE industrial affiliates who serve as contractors on space projects
Integrating the model with complementary techniques to support risk management practices and to compare their performance

References
[Chulani, Boehm 1999] Chulani S., Boehm B., "Modeling Software Defect Introduction and Removal: COQUALMO (COnstructive QUALity MOdel)", USC-CSE Technical Report, 1999.
[Chillarege et al. 1992] Chillarege R., Bhandari I., Chaar J., Halliday M., Moebus D., Ray B., Wong M., "Orthogonal Defect Classification -- A Concept for In-Process Measurements", IEEE Transactions on Software Engineering, vol. 18, no. 11, November 1992.
[Lutz, Mikulski 2003] Lutz R., Mikulski I., "Final Report: Adapting ODC for Empirical Analysis of Pre-Launch Anomalies", version 1.2, JPL Caltech report, December 2003.