University of Southern California Center for Software Engineering
Reliable Software Research and Technology Transition
Barry Boehm, USC
NASA IT Workshop, July 31, 2001

Outline
- Reliable Software Strategic Context
  - Opportunity Tree
- Reliable Software Technology Prospects
- Reliable Software Technology Transition
  - Software Technology Transition Challenges
  - Technology Transition Rate Drivers
  - Technology Transition Acceleration
  - Staged Testbed Experimentation

Software Dependability Opportunity Tree
- Decrease Defect Risk Exposure
  - Decrease Defect Prob (Loss)
    - Defect Prevention
    - Defect Detection and Removal
  - Decrease Defect Impact, Size (Loss)
    - Value/Risk-Based Defect Reduction
    - Graceful Degradation
- Continuous Improvement
  - CI Methods and Metrics
  - Process, Product, People Technology

Software Defect Prevention Opportunity Tree
Defect Prevention:
- People practices: IPT, JAD, WinWin,…; PSP, Cleanroom, Dual development,…; Staffing for dependability
- Standards: Rqts., Design, Code,…
- Languages: Interfaces, traceability,…
- Checklists
- Prototyping: Manual execution, scenarios,…
- Modeling & Simulation
- Reuse
- Root cause analysis

People Practices: Some Empirical Data
- Cleanroom (Software Engineering Lab):
  - 25-75% reduction in failure rates
  - 5% vs. 60% of fix efforts over 1 hour
- Personal Software Process / Team Software Process:
  - 50-75% defect reduction in a CMM Level 5 organization
  - Even higher reductions for less mature organizations
- Staffing:
  - Many experiments find factor-of-10 differences in people's defect rates

Software Defect Detection Opportunity Tree
Defect Detection and Removal (Rqts., Design, Code):
- Automated Analysis
  - Completeness checking
  - Consistency checking (views, interfaces, behavior, pre/post conditions)
  - Traceability checking
  - Compliance checking (models, assertions, standards)
- Reviewing
  - Peer reviews, inspections
  - Architecture Review Boards
  - Pair programming
- Testing
  - Requirements & design
  - Structural
  - Operational profile
  - Usage (alpha, beta)
  - Regression
  - Value/Risk-based
  - Test automation
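The automated-analysis branch above (assertion and compliance checking) can be sketched in a few lines of Python. This is an illustrative toy, not a tool from the slides; `contract`, `max_abs`, and the specific conditions are hypothetical names chosen for the example.

```python
from functools import wraps

def contract(pre=None, post=None):
    # Hypothetical decorator sketching assertion-based compliance checking:
    # verify a precondition on the arguments and a postcondition on the result.
    def deco(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            if pre is not None:
                assert pre(*args, **kwargs), f"precondition violated in {fn.__name__}"
            result = fn(*args, **kwargs)
            if post is not None:
                assert post(result), f"postcondition violated in {fn.__name__}"
            return result
        return wrapper
    return deco

@contract(pre=lambda xs: len(xs) > 0, post=lambda r: r >= 0)
def max_abs(xs):
    # Largest absolute value in a non-empty list.
    return max(abs(x) for x in xs)
```

Calling `max_abs([])` fails at the precondition rather than deep inside the computation, which is the kind of early, localized defect detection the tree's automated-analysis branch is after.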

Orthogonal Defect Classification (Chillarege, 1996)

UMD-USC CeBASE Experience Comparisons: "Under specified conditions, …"
Technique Selection Guidance (UMD, USC):
- Peer reviews are more effective than functional testing for faults of omission and incorrect specification.
  - Peer reviews catch 60% of the defects.
- Functional testing is more effective than reviews for faults concerning numerical approximations and control flow.
Technique Definition Guidance (UMD):
- For a reviewer with an average experience level, a procedural approach to defect detection is more effective than a less procedural one.
- Readers of a software artifact are more effective in uncovering defects when each uses a different and specific focus.
  - Perspective-based reviews catch 35% more defects.

Defect Impact Reduction Opportunity Tree
Decrease Defect Impact, Size (Loss):
- Value/Risk-Based Defect Reduction
  - Business case analysis
  - Pareto (80-20) analysis
  - V/R-based reviews
  - V/R-based testing
  - Cost/schedule/quality as independent variable
- Graceful Degradation
  - Fault tolerance
  - Self-stabilizing SW
  - Reduced-capability modes
  - Manual backup
  - Rapid recovery
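The graceful-degradation branch (fault tolerance, reduced-capability modes) can be illustrated with a minimal fallback wrapper. Everything here (`with_fallback`, the routing functions) is a hypothetical sketch, not code from the slides.

```python
def with_fallback(primary, fallback):
    # Run the full-capability path; on failure, degrade to a
    # reduced-capability result instead of failing outright.
    def run(*args):
        try:
            return primary(*args)
        except Exception:
            return fallback(*args)
    return run

def precise_route(dest):
    # Simulated fault in the full-capability service.
    raise TimeoutError("routing service unavailable")

def coarse_route(dest):
    # Reduced-capability mode: still useful, just less precise.
    return f"straight-line heading to {dest}"

navigate = with_fallback(precise_route, coarse_route)
```

The caller still gets an answer when the primary service fails, trading precision for availability, which is the point of a reduced-capability mode.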

C, S, Q as Independent Variable
- Determine Desired Delivered Defect Density (D4)
  - Or a value-based equivalent
- Prioritize desired features
  - Via QFD, IPT, stakeholder win-win
- Determine Core Capability
  - 90% confidence of D4 within cost and schedule
  - Balance parametric models and expert judgment
- Architect for ease of adding next-priority features
  - Hide sources of change within modules (Parnas)
- Develop core capability to D4 quality level
  - Usually in less than available cost and schedule
- Add next-priority features as resources permit
Used successfully on 24 of 26 USC digital library projects.
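The "develop the core capability, then add next-priority features as resources permit" step can be sketched as a budgeted selection over a priority-ordered feature list. The feature names and effort numbers below are illustrative assumptions, not data from the slides.

```python
def core_capability(features, budget):
    # features: priority-ordered list of (name, estimated effort).
    # Greedily include features while the effort budget allows,
    # mirroring "add next-priority features as resources permit".
    chosen, spent = [], 0
    for name, effort in features:
        if spent + effort <= budget:
            chosen.append(name)
            spent += effort
    return chosen

# Hypothetical prioritized backlog (effort in person-weeks).
backlog = [("login", 3), ("search", 5), ("export", 4), ("themes", 6)]
```

With a 12 person-week budget this selects the top three features; in the slide's terms, the remainder waits for resources while the core is built to the D4 quality level.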

HDC Technology Prospects
- High-dependability change
  - Architecture-based analysis and composition
  - Lightweight formal methods
- Model-based analysis and assertion checking
  - Domain models, stochastic models
- High-dependability test data generation
- Dependability attribute tradeoff analysis
  - Dependable systems from less-dependable components
  - Exploiting cheap, powerful hardware
- Self-stabilizing software
- Complementary empirical methods
  - Experiments, testbeds, models, experience factory
  - Accelerated technology transition, education, training

Outline
- Reliable Software Strategic Context
  - Opportunity Tree
- Reliable Software Technology Prospects
- Reliable Software Technology Transition
  - Software Technology Transition Challenges
  - Technology Transition Rate Drivers
  - Technology Transition Acceleration
  - Staged Testbed Experimentation

Software Engineering Technology Transition Challenges
- Adoption requires behavioral change.
- Payoffs take a long time to demonstrate
  - And are hard to trace back to particular technology insertions.
- The marketplace often makes "fixing it later" more attractive than "doing it right the first time."
  - "The IT industry expends the bulk of its resources, both financial and human, on rapidly bringing products to market." (PITAC Report, p. 8)
- Strong coupling among technologies, processes, acquisition practices, cultures
- Rapidly evolving commercial technology
- Slowly evolving Government acquisition practices
- Risk-averse program managers
- Leaky intellectual property

Technology Transition Rate Drivers
1. Competition-criticality
2. Impact on current operations, power bases
3. Number of concurrent innovations involved
4. Adopter change/risk-aversion
5. Payback realization speed
6. Regulatory obstacles or incentives
7. Degree of executive support
8. Factor endowment (human, knowledge, capital, infrastructure)

Technology Transition Acceleration: Staged Testbed Experimentation
- Assess technology readiness, scalability, and generality by experimental use in staged testbeds:
  - Stage I: small representative subsystems
  - Stage II: larger subsystems with representative mission components
  - Stage III: part-to-full systems in a full mission simulator
  - Stage IV: test-mode application to full mission systems
- Use the Goal-Question-Metric approach:
  - Goals: mission objectives and priorities
  - Questions: content and hypotheses to test
  - Metrics: data to collect and analyze
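The three-level Goal-Question-Metric structure on this slide maps naturally onto a small nested data structure. The concrete goal, question, and metric strings below are illustrative assumptions, not taken from the slides.

```python
# One GQM plan: a goal, the questions that operationalize it,
# and the metrics that answer each question.
gqm_plan = {
    "goal": "Assess technology readiness and scalability in staged testbeds",
    "questions": [
        {
            "question": "Does the technique reduce delivered defect density?",
            "metrics": ["defects per KSLOC", "fix effort per defect"],
        },
        {
            "question": "Does it scale from subsystem to full mission system?",
            "metrics": ["analysis time vs. system size"],
        },
    ],
}

def metrics_to_collect(plan):
    # Flatten the plan into the list of metrics the testbed must log.
    return [m for q in plan["questions"] for m in q["metrics"]]
```

Driving data collection from the plan, rather than collecting everything available, keeps each metric traceable back to a question and a goal.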