Software Maintenance and Evolution
CSSE 575: Session 8, Part 3: Predicting Bugs
Steve Chenoweth
Office Phone: (812) 877-8974, Cell: (937) 657-3885
Email: chenowet@rose-hulman.edu
Above: a good video on research into predicting bugs, by James Whitehead of UC Santa Cruz, at https://www.youtube.com/watch?v=IYmbMtOL9Sc.

Basic question
I only have so much time for code reviews, testing, and other QA-related costs: what do I spend that time on?
This is related to studies of software reliability. The traditional answer is to spend the time on the code that "has to work," like:
- Code used to recover from a crash
- Start-up and shut-down code
- The small percentage of code that runs all the time
Some of this material is from Zimmermann et al., "Predicting Bugs from History," in Mens & Demeyer (Eds.), Software Evolution, Springer, 2010.

Developers can't remember everything
So they need help from actual facts about successes and failures, like the bug database. It helps to visualize this.
But the past doesn't always predict the future:
- Software evolves, with refactorings to fix bug-prone areas
- New features may be in different areas

What makes a module defect-prone?
Software needs to be fixed because it doesn't satisfy some requirement. There's an error somewhere along the way, from writing down the requirement to product usage. An error in source code is considered a "defect": it may cause a detectable fault, and needs to be fixed.
We look for certain "invariants" in a module's history:
- Complexity
- Problem domain
- Evolution
- Process
See the next slides.

Complexity
We can assume that the likelihood of mistakes in some artifact increases with:
- The number of details
- The extent to which these details interact with each other
There are many ways to measure this… Do complexity metrics correlate with defect density? (One rough example of such a metric is sketched below.)
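As one concrete example of measuring complexity (a simplification, not the measurement setup used in the studies above), here is a minimal Python sketch that approximates cyclomatic complexity by counting decision points in a function's abstract syntax tree:

```python
import ast

# Node types that add a decision point. This is a rough proxy for
# cyclomatic complexity, not any standard tool's exact definition.
DECISION_NODES = (ast.If, ast.For, ast.While, ast.IfExp,
                  ast.ExceptHandler, ast.BoolOp)

def approx_cyclomatic_complexity(func_source: str) -> int:
    """Return 1 + the number of decision points in the given source."""
    tree = ast.parse(func_source)
    decisions = sum(isinstance(node, DECISION_NODES)
                    for node in ast.walk(tree))
    return 1 + decisions

if __name__ == "__main__":
    sample = """
def classify(x):
    if x < 0:
        return "negative"
    for i in range(x):
        if i % 2 == 0 and i > 2:
            return "has even > 2"
    return "small"
"""
    # Prints 5: one If, one For, one If, one BoolOp, plus the base of 1.
    print(approx_cyclomatic_complexity(sample))
```

The same counting idea extends to other "detail and interaction" measures, such as nesting depth or fan-out; the point is only to show how a metric value per module could be produced.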

Complexity, cntd

Complexity, cntd
The authors studied five Microsoft projects and found significant correlations between complexity metrics and post-release defects. But across the projects, different complexity metrics were the ones that correlated significantly; none of the metrics qualified as a universal predictor. They also tried building combination predictors.
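A minimal sketch of the kind of per-project check described above, assuming you already have a complexity metric value and a post-release defect count for each module. All numbers here are invented, and scipy is used for the rank correlation:

```python
from scipy.stats import spearmanr

# Hypothetical per-project data: one complexity metric value and the
# post-release defect count for each module. Values are invented.
projects = {
    "project_a": {"complexity": [3, 12, 7, 25, 9], "defects": [0, 4, 1, 9, 2]},
    "project_b": {"complexity": [5, 6, 30, 8, 11], "defects": [2, 0, 1, 3, 0]},
}

for name, data in projects.items():
    rho, p_value = spearmanr(data["complexity"], data["defects"])
    print(f"{name}: Spearman rho={rho:.2f}, p={p_value:.3f}")

# If the sign or strength of rho varies a lot across projects, the metric
# is not behaving as a universal predictor, mirroring the finding above.
```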

Problem domain
Question: how does the problem domain affect defect likelihood?
The authors looked at Eclipse. The domains were:
- User interfaces
- Compiler internals
These are complex in different ways!

Problem domain, cntd
The two domains differed sharply in their chance of having a defect that needed to be fixed in the first six months after release: 86% versus 15%.

Problem domain, cntd
The authors also considered families of dependencies as a "problem domain," studying them in Microsoft Windows Server 2003. The families were:
- Imports
- Exports
- RPC
- Registry access
They found these to be suitable predictors of defect levels in modules. (A rough analogue for one family is sketched below.)
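The Windows Server dependency data isn't public, but as a rough analogue of one dependency family, here is a sketch that counts import statements per Python module under a hypothetical src/ directory, producing the kind of per-module feature a defect predictor could use:

```python
import ast
from pathlib import Path

def count_imports(path: Path) -> int:
    """Count import statements in one module: a single 'dependency family' feature."""
    tree = ast.parse(path.read_text(encoding="utf-8"))
    return sum(isinstance(node, (ast.Import, ast.ImportFrom))
               for node in ast.walk(tree))

# Build a per-module feature table for every module under src/ (hypothetical path).
features = {str(p): count_imports(p) for p in Path("src").rglob("*.py")}
for module, imports in sorted(features.items(), key=lambda kv: -kv[1]):
    print(f"{imports:4d} imports  {module}")
```

In the actual study the features were families of binary-level dependencies (imports, exports, RPC, registry access); the sketch only shows the shape of the data a model would consume.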

Problem domain, cntd They also found a “domino effect” in these dependencies: A defect in one component can increase the likelihood of defects in dependent components.
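One way to act on such a domino effect is to flag everything downstream of a defective component for review. A minimal sketch, assuming an invented dependency graph and the networkx library; the component names are hypothetical:

```python
import networkx as nx

# Invented dependency graph: an edge A -> B means B depends on A,
# so a defect in A can ripple forward to B.
ripple = nx.DiGraph([
    ("parser", "compiler"), ("compiler", "ide"),
    ("compiler", "debugger"), ("registry", "installer"),
])

defective = "parser"
at_risk = nx.descendants(ripple, defective)  # all transitive dependents
print(f"Defect in {defective}; review candidates: {sorted(at_risk)}")
```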

Problem domain, cntd

Evolution
Question: does code churn correlate with defects? Churn is the amount of code change in a unit over time.
The authors analyzed the churn leading up to MS Windows Server 2003 SP1, covering about 40 million lines of code in 2,000 binaries! The churn information was extracted from the version control system, as sketched below.
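The study used Microsoft's internal version control data, which isn't available, but the same idea can be tried on any git repository. A minimal sketch, assuming it is run inside a checkout: it sums added plus deleted lines per file from git log --numstat as a crude churn measure.

```python
import subprocess
from collections import Counter

def churn_per_file(rev_range: str = "HEAD") -> Counter:
    """Sum added + deleted lines per file over the given revision range."""
    out = subprocess.run(
        ["git", "log", "--numstat", "--format=", rev_range],
        capture_output=True, text=True, check=True,
    ).stdout
    churn = Counter()
    for line in out.splitlines():
        parts = line.split("\t")
        # numstat lines look like "added<TAB>deleted<TAB>path"; binary files show "-".
        if len(parts) == 3 and parts[0].isdigit() and parts[1].isdigit():
            churn[parts[2]] += int(parts[0]) + int(parts[1])
    return churn

if __name__ == "__main__":
    for path, lines in churn_per_file().most_common(10):
        print(f"{lines:6d}  {path}")
```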

Evolution, cntd
They found that:
- The more a component changed, the more likely it was to have defects
- Code churn measures can be used to predict defect-prone components

Process
Every defect that escapes into the field is a failure of quality assurance. A good software process can compensate for many of the risks in everything else. Thus, the quality of the development process should also be considered when it comes to predicting defects.
This remains to be studied! E.g., do higher-level CMM organizations really produce better software of equal complexity?

The "management factor"
"Any predictor will eventually turn out to be wrong." This is because, if a predictor predicts a higher number of defects for a component, then that component will be checked more carefully, which will reduce its defect density! Thus, defect predictors are self-defeating prophecies…
Above: Nostradamus takes a dim view of our prediction methods. From http://j.l.navarro.tripod.com/jlnavarro/id11.html.

Other bug prediction approaches
Most people assume that new bugs will occur where old bugs occurred, given similarities in new tasks:
- Consider "temporal and spatial locality"
- Consider "developer locality"
- Use machine learning techniques (a small sketch follows this list)
- See the Jim Whitehead talk
One might ask, "What would it be like if we used formal methods and 'clean room' development to avoid most bugs in the first place?"
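As a hedged illustration of the machine-learning route (not the specific models from the Whitehead talk), here is a small scikit-learn sketch that fits a logistic-regression classifier over invented per-module features such as churn, complexity, and past defects:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Invented feature matrix: [churn, complexity, past_defects] per module,
# and a 0/1 label for whether the module had a post-release defect.
X = np.array([[120, 14, 3], [10, 3, 0], [300, 22, 5], [45, 6, 1],
              [500, 30, 7], [20, 4, 0], [80, 9, 2], [15, 5, 0]])
y = np.array([1, 0, 1, 0, 1, 0, 1, 0])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print("predicted defect-prone:", model.predict(X_test))
print("mean accuracy:", model.score(X_test, y_test))
```

Real studies use many more modules, cross-project validation, and careful feature selection; the sketch only shows the mechanics.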

The paper you read
"Bug-fix time prediction models: Can we do better?" by Pamela Bhattacharya and Iulian Neamtiu, MSR '11.
It takes a different approach: predicting the work involved in fixing a bug, rather than just where bugs will occur. They found that the existing models had bugs of their own! More independent variables are needed, and fix time showed no correlation with the bug-opener's reputation.

Paper, cntd
Bug-fix time is important to predict because:
- It helps predict software quality
- It helps coordinate the effort spent analyzing bugs
The prior models used machine learning algorithms, with experiments on both open-source and commercial systems. They report attributes like bug severity, number of developers involved, number of comments and attachments, and developer reputation. The authors' new findings leave open questions! (A toy version of such a model is sketched below.)
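Purely as a toy illustration of the kind of model being critiqued (not the paper's actual models or data), here is a sketch that fits an ordinary linear regression from invented bug-report attributes to fix time in days:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Invented bug-report attributes: [severity, num_developers, num_comments]
X = np.array([[3, 1, 2], [5, 4, 12], [2, 1, 1],
              [4, 2, 7], [5, 3, 20], [1, 1, 0]])
fix_time_days = np.array([4.0, 30.0, 2.0, 12.0, 45.0, 1.0])

model = LinearRegression().fit(X, fix_time_days)
new_bug = np.array([[4, 2, 5]])  # hypothetical incoming bug report
print("predicted fix time (days):", model.predict(new_bug)[0])

# The paper's point: models built from attributes like these explain
# little of the variance, so richer independent variables are needed.
```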

Assignment and milestone
See the assignment info under Projects.
Left: where did the word "milestone" come from, anyway? Is this graphic answer way too obvious to be right? From http://www.wigdahl.net/quern/2009/09/14/milestone/.