
Prediction Basic concepts

Scope

Prediction of:
- Resources
- Calendar time
- Quality (or lack of quality)
- Change impact
- Process performance

Often confounded with the decision process.

Historical data

[Figure: scatter plot of historical data, with Y (the dependent, observed, response variable) plotted against X (the independent, predictor variable). The known observations Y_i illustrate the explained variance; in the unknown region, the prediction interval of a new observation Y_0 at x_0 is marked.]

Methods for building prediction models

- Statistical
  - Parametric
    - Make assumptions about the distribution of the variables
    - Good tools for automation
    - Linear regression, variance analysis, ...
  - Non-parametric, robust
    - No assumptions about the distribution
    - Less powerful, low degree of automation
    - Rank-sum methods, Pareto diagrams, ...
- Causal models
  - Link elements with semantic links or numerical equations
  - Simulation models, connectionist models, genetic models, ...
- Judgemental
  - Organise human expertise
  - Delphi method, pair-wise comparison, rule-based methods
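As a sketch of the non-parametric, rank-sum family: the Mann-Whitney U statistic compares two samples without any distributional assumption. The example below asks (with invented data) whether changed modules tend to have more faults than unchanged ones:

```python
# Minimal rank-sum sketch (Mann-Whitney U), assuming hypothetical per-module
# fault counts for two groups of modules. No distribution is assumed.

def rank_sum_u(a, b):
    """U statistic: number of (a_i, b_j) pairs with a_i > b_j; ties count 0.5."""
    u = 0.0
    for ai in a:
        for bj in b:
            if ai > bj:
                u += 1.0
            elif ai == bj:
                u += 0.5
    return u

faults_changed = [5, 7, 3, 9, 6]
faults_unchanged = [1, 2, 4, 2, 3]

u = rank_sum_u(faults_changed, faults_unchanged)
u_max = len(faults_changed) * len(faults_unchanged)
# U close to u_max suggests systematically higher fault counts in the
# changed group; a real analysis would compare U against critical values.
print(u, u_max)
```

This illustrates the trade-off the slide names: the method is robust (no normality assumption) but less powerful than a parametric test, and significance lookup is a separate table-based step.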

Common SE predictions

- Detecting fault-prone modules
- Project effort estimation
- Change impact analysis
- Ripple effect analysis
- Process improvement models
- Model checking
- Consistency checking

Introduction  There are many faults in software  Faults are costly to find and repair  The later we find faults the more costly they are  We want to find faults early  We want to have automated ways of finding faults  Our approach  Automatic measurements on models  Use metrics to predict fault-prone modules

Related work

- Niclas Ohlsson, PhD work 1993
  - AXE, fault prediction, introduced Pareto diagrams
  - Predictor: number of new and changed signals
- Lionel Briand, Khaled El Emam, et al.
  - Numerous contributions exploring relations between fault-proneness and object-oriented metrics
- Piotr Tomaszewski, PhD Karlskrona 2006
  - Studies fault density
  - Comparison of statistical methods and expert judgement
- Jeanette Heidenberg, Andreas Nåls
  - Discover weak design and propose changes

Approach

- Find metrics (independent variables):
  - Number of model elements (size)
  - Number of changed methods (change)
  - Transitions per state (complexity)
  - Changed operations * transitions per state (combinations)
  - ...
- Use the metrics to predict the dependent variable:
  - Number of TRs (trouble reports)

Capsules

State charts

Data model

[Figure: data model relating package, capsule, class, attribute, operation, port, protocol, signal, state machine, state, and transition.]

Our project - modelmet

- RNC application, three releases
- About 7000 model elements
- TR statistics database (2000 TRs)
- Find metrics:
  - Existing metrics (collected at the standard daily build)
  - Run scripts on the models
- Statistical analysis:
  - Linear regression, principal component analysis, discriminant analysis, robust methods
  - Neural networks, Bayesian belief networks
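Of the analyses listed, discriminant analysis is the one that directly yields a fault-prone/not-fault-prone classification. A deliberately simplified 1-D sketch, assuming hypothetical metric values and labels taken from TR statistics; a new module is classified by the midpoint of the two group means (real linear discriminant analysis would also use the pooled within-group variance):

```python
import statistics

# Hypothetical metric values for modules known (from TR data) to be
# fault-prone vs. not fault-prone.
fault_prone = [18.0, 25.0, 22.0, 30.0]
not_fault_prone = [5.0, 8.0, 4.0, 10.0, 6.0]

# Midpoint-of-means decision boundary: a simplified 1-D discriminant.
threshold = (statistics.fmean(fault_prone)
             + statistics.fmean(not_fault_prone)) / 2

def classify(metric_value):
    return "fault-prone" if metric_value > threshold else "ok"

print(threshold, classify(20.0), classify(7.0))
```

The point of the sketch is the workflow: labels come from the TR database, metrics come from scripts run on the models, and the fitted boundary is then applied to new or changed modules.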

Metric categories: size, change, complexity, and combined.

Metrics based on change, system A

Metrics based on change, system B

Complexity and size metrics, system A

Complexity and size metrics, system B

Other metrics, system A

TRD = C * (states - protocols) / model elements

Other metrics, system B

How to use predictions

- Uneven distribution of faults is common (the 80/20 rule)
- Perform special treatment on selected parts:
  - Select experienced designers
  - Provide good working conditions
  - Parallel teams
  - Inspections
  - Static and dynamic analysis tools
  - ...
- Perform root-cause analysis and make corrections
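The 80/20-style selection above is mechanically simple: rank modules by predicted fault count and route the top slice to the special treatments. A small sketch with hypothetical module names and predictions:

```python
# Hypothetical predicted fault counts per module.
modules = {
    "mod_a": 12, "mod_b": 2, "mod_c": 1, "mod_d": 30, "mod_e": 4,
    "mod_f": 3, "mod_g": 25, "mod_h": 2, "mod_i": 1, "mod_j": 5,
}

# Rank modules by prediction and take the top 20% for special treatment
# (inspections, experienced designers, analysis tools, ...).
ranked = sorted(modules, key=modules.get, reverse=True)
top = ranked[: max(1, len(ranked) // 5)]
covered = sum(modules[m] for m in top) / sum(modules.values())
print(top, f"{covered:.0%} of predicted faults")
```

In this invented data, treating 2 of 10 modules covers well over half of the predicted faults, which is exactly the leverage an uneven fault distribution offers.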

Results

Contributions:
- Valid statistical material:
  - Large models, large number of TRs
  - Two change projects
- Two highly explanatory predictors were found
- State chart metrics are as good as OO metrics

Problems:
- Some difficulty matching modules between the models and the TRs
- Effort needed to collect change data