Slide 1 Improved Software Defect Prediction. Norman Fenton, Agena Ltd and Queen Mary University of London. SPIN, 23 February 2006

Slide 2 Using fault data to predict reliability [Diagram: pre-release testing → delivery → post-release operation]

Slide 3 Pre-release vs post-release faults? [Scatter plot: pre-release faults (x-axis) vs post-release faults (y-axis), relationship marked as unknown]

Slide 4 Pre-release vs post-release faults: actual [Scatter plot of actual project data: pre-release faults vs post-release faults]

Slide 5 Software metrics…?

Slide 6 Regression models…?
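
To make the question concrete, here is a minimal sketch (with made-up numbers) of the kind of naive size-based regression the slides are questioning: fit defects against KLOC and extrapolate, ignoring causal factors such as testing effort or process quality.

```python
# A minimal sketch of naive size-based defect regression.
# The data points are invented for illustration only.
import numpy as np

kloc = np.array([12.0, 25.0, 40.0, 55.0, 70.0])   # module size, KLOC (made up)
defects = np.array([30, 95, 140, 210, 260])       # defects found (made up)

# Least-squares straight line: defects ~ slope * KLOC + intercept
slope, intercept = np.polyfit(kloc, defects, 1)

def predict(size_kloc: float) -> float:
    return slope * size_kloc + intercept

print(f"predicted defects for a 90 KLOC project: {predict(90.0):.0f}")
```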

Slide 7 Solution: causal models (risk maps)
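
The following is a minimal illustrative sketch of such a causal model as a Bayesian net, written in Python with the pgmpy library. The node names, states, and probability tables are assumptions invented for this sketch; they are not the Philips or AgenaRisk model. The query at the end demonstrates the key causal insight: finding few defects pre-release is only good news if testing quality was good.

```python
# A toy causal defect model as a Bayesian net (illustrative only).
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([
    ("Complexity", "Inserted"),   # problem complexity drives defects inserted
    ("Inserted", "Found"),        # defects found depends on defects present...
    ("Quality", "Found"),         # ...and on testing quality
    ("Inserted", "Residual"),     # residual (post-release) defects depend on
    ("Found", "Residual"),        # what was inserted and what was found
])

# All nodes binary: state 0 = low/few, state 1 = high/many (made-up CPTs).
model.add_cpds(
    TabularCPD("Complexity", 2, [[0.5], [0.5]]),
    TabularCPD("Quality", 2, [[0.5], [0.5]]),
    TabularCPD("Inserted", 2, [[0.8, 0.2], [0.2, 0.8]],
               evidence=["Complexity"], evidence_card=[2]),
    TabularCPD("Found", 2,
               [[0.95, 0.90, 0.70, 0.10],   # P(Found=few | Inserted, Quality)
                [0.05, 0.10, 0.30, 0.90]],
               evidence=["Inserted", "Quality"], evidence_card=[2, 2]),
    TabularCPD("Residual", 2,
               [[0.90, 0.99, 0.10, 0.80],   # P(Residual=few | Inserted, Found)
                [0.10, 0.01, 0.90, 0.20]],
               evidence=["Inserted", "Found"], evidence_card=[2, 2]),
)
assert model.check_model()

infer = VariableElimination(model)
# Few defects found pre-release means very different things depending on
# whether testing quality was poor or good -- the causal-model insight.
for quality in (0, 1):
    post = infer.query(["Residual"], evidence={"Found": 0, "Quality": quality})
    print(f"Quality={'poor' if quality == 0 else 'good'}:", post.values)
```

Running this, the posterior probability of many residual defects is far higher when few defects were found under poor testing than under good testing, which a size-only regression cannot express.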

Slide 8 What's special about this approach?
- Structured
- Visual
- Robust

Slide 9 The specific problem for Philips [Graph: defects found over time, predicted curve vs actual curve]

Slide 10 Background to work with Philips: Fixed Risk Map (AID, 2000), MODIST, AgenaRisk, reliability models

Slide 11 Projects in the trial (actual number of projects shown in brackets) [chart not reproduced in transcript]

Slide 12 Factors used at Philips
- Existing code base…
- Complexity and management of new requirements…
- Development, testing, rework, process…
- Overall project management…

Slide 13 Actual versus predicted defects [Scatter plot; correlation coefficient 95%]
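
For reference, a validation statistic like this is a Pearson correlation between predicted and actual defect counts; the numbers below are invented, not the Philips trial data.

```python
# Pearson correlation between predicted and actual defect counts
# (invented example numbers, not the Philips trial data).
import numpy as np

predicted = np.array([120, 45, 210, 80, 160])   # model's defect predictions
actual    = np.array([130, 40, 195, 90, 170])   # defects actually observed

r = np.corrcoef(predicted, actual)[0, 1]
print(f"correlation coefficient: {r:.2f}")
```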

Slide 14 Validation Summary (Philips' words)
- "Bayesian Network approach is innovative and one of the most promising techniques for predicting defects in software projects"
- "Our evaluation showed excellent results for project sizes between 10 and 90 KLOC"
- "Initially, projects outside this range were not within the scope of the default model so predictions were less accurate, but accuracy was significantly improved after standard model tailoring"
- "AgenaRisk is a valuable tool as is"

Slide 15 Specific benefits
- Accurate predictions of defects at different phases
- Know when to stop testing (see the sketch below)
- Identify where to allocate testing
- Minimise the cost of rework
- Highlight areas for improvement
- Use out of the box now if you have no data
- Approach fully customisable
- Models arbitrary project life-cycles
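
As a hypothetical illustration of the "know when to stop testing" benefit: query the causal model for the probability of many residual defects and stop once it falls below an agreed risk tolerance. The function name and threshold below are assumptions for illustration, not part of the Philips deployment.

```python
# Hypothetical stopping rule (illustration only): stop testing once the
# model's predicted probability of many residual defects is acceptable.
def safe_to_release(p_residual_many: float, risk_tolerance: float = 0.10) -> bool:
    """True when the predicted residual-defect risk is within tolerance."""
    return p_residual_many <= risk_tolerance

# e.g. feed in P(Residual=many | evidence) from a causal-model query
print(safe_to_release(0.04))   # True  -> stop testing, release
print(safe_to_release(0.35))   # False -> keep testing
```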

Slide 16 Summary
- Risk maps: the way forward
- Validation results excellent

Slide 17 …And you can use the technology NOW