Software Metrics Validation Using Version Control


Software Metrics Validation Using Version Control
Presenters: Hema Pujar and Smitha Halyal (0729932, 0729096)
Tutor: drs. Reinier Post (LaQuSo), r.d.j.post@tue.nl

Contents
- Introduction
- Objectives
- Metrics Selection
- Tools Selection
- Validating Tools
- Open Source Systems
- Statistical Evaluation
- Analysis
- Future Work
- References

What is software metric validation?
The idea is to validate the automatically measurable metrics by testing for correlation between the software metrics, placed in the context of the ISO 9126 Software Quality Model, and other quality attributes [4].

Metrics
A software metric measures (quantifies) some property of software. Some common software metrics:
- Lines of Code (LOC)
- Number of Local Methods (NOM)
- McCabe Cyclomatic Complexity (CC)
- Depth of Inheritance Tree (DIT)

Software Maintainability Metric
Maintainability is the ease with which a software system or component can be modified to correct faults, improve performance, or adapt to a changed environment. Example metrics:
- Size of the file: Lines of Code (LOC).
- Interface complexity metrics: Number of Attributes and Methods (NAM), Number of Local Methods (NOM).

Development Effort
Development effort is a measure of the human effort involved in developing a project or system, and estimating it usually requires experts. Program changes are an indication of development effort, so we measure the number of changes made to a file.

Objectives
- To measure a maintainability metric and development effort using existing tools.
- To determine whether a relation exists between the measured maintainability metric and development effort.
- To estimate development effort from the maintainability metric, based on that relation.

Metrics Selection
Choosing relevant metrics:
- The first metric is computed on a version of the source code and relates to maintainability: size of the file (lines of code).
- The second metric is computed on the changes made to the source code and relates to development effort: number of changes made to a file = lines added + lines deleted + lines modified.

Tool Selection
Requirements for choosing tools:
- Suitable for object-oriented systems.
- Freely and easily available.
- Simple to use.
- License-free.

Table 1: Tools and the metrics they measure [4]
Metrics covered: CBO, DIT, LCOM-CK, LCOM-HS, LOC, NOC, NOM, RFC, TCC, WMC
Tools compared: Analyst4j, Eclipse Metrics Plugin 1.3.6, Eclipse Metrics 3.4, SLOC Metrics 3.0

SLOC Metrics 3.0
SLOC Metrics 3.0 is an effective and convenient tool that measures the size of source code using the physical source lines of code (LOC) metric recommended by the Software Engineering Institute at Carnegie Mellon University.
Features:
- Supports C++, C#, Java, HTML, Perl, VB .NET, Visual Basic, PHP and more.
- Can be run as a console application.
- Easy-to-use Windows-based interface.
- Sorts results to identify code-intensive modules and files.
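For illustration, a minimal Python sketch of a physical-LOC counter; this is not the SLOC Metrics tool itself, and it counts only non-blank lines, while the full SEI counting rules also exclude comment lines. The input file name is hypothetical.

    def physical_loc(path):
        """Count non-blank physical source lines in one file."""
        with open(path, encoding="utf-8", errors="replace") as src:
            return sum(1 for line in src if line.strip())

    # Hypothetical input file, for illustration only.
    print(physical_loc("Main.java"))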

Diffstat
Diffstat is used to measure the number of changes made to the source code.
Features:
- Commonly used to provide a summary of the changes in files.
- Reads the output of the diff command and displays the insertions, deletions and modifications in each file.
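A minimal Python sketch approximating this measurement with the standard difflib module, implementing number of changes = lines added + lines deleted + lines modified; the version paths are hypothetical.

    import difflib

    def count_changes(old_lines, new_lines):
        """Added + deleted + modified lines between two file versions."""
        added = deleted = modified = 0
        matcher = difflib.SequenceMatcher(None, old_lines, new_lines)
        for tag, i1, i2, j1, j2 in matcher.get_opcodes():
            if tag == "insert":
                added += j2 - j1
            elif tag == "delete":
                deleted += i2 - i1
            elif tag == "replace":
                # Pair up lines of a replaced block as 'modified';
                # any surplus counts as added or deleted.
                pairs = min(i2 - i1, j2 - j1)
                modified += pairs
                added += (j2 - j1) - pairs
                deleted += (i2 - i1) - pairs
        return added + deleted + modified

    # Hypothetical paths: the same file in two consecutive versions.
    old = open("v1/File1.java").read().splitlines()
    new = open("v2/File1.java").read().splitlines()
    print(count_changes(old, new))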

Validating Tools
The tools chosen for measuring the selected metrics were validated by manually counting the number of lines in, and the modifications made to, a set of small files.

Open Source Systems
Criteria for identifying suitable open source systems:
- It must be an object-oriented system.
- For each selected system, at least 5 unique releases must be available.
- A selected system must have fewer than 50 source files.
- A selected system must still be active, or terminated for good, but not abandoned because it became unmaintainable.

Contd.
- Jaim: its primary goal is to simplify writing AOL bots in Java; it consists of 46 source files.
- ProGuard: a free Java class file shrinker and optimizer; it consists of 465 source files.
- jTcGUI (the selected open source system): a Linux tool for managing TrueCrypt volumes; it has five source files, and two files were added in the last two versions.

Metric Values from Tools

Version 2 (jTcGUI)       File1  File2  File3  File4
Size of the file           301    118    104     78
Changes vs. version 1      391    157    184    121

Version 3 (jTcGUI)       File1  File2  File3  File4
Size of the file           288    120    105     79
Changes vs. version 2       55     10      8      1

Version 4 (jTcGUI)       File1  File2  File3  File4
Size of the file           289    120    105     79
Changes vs. version 3        3      0      0      0

Contd.

Version 5 (jTcGUI)       File1  File2  File3  File4  File5
Size of the file           325    120    194    190    220
Changes vs. version 4       62      0      0      0      0

Version 6 (jTcGUI)       File1  File2  File3  File4  File5
Size of the file           383    120    194    190    220
Changes vs. version 5       96      4    141     33    276

Version 7 (jTcGUI)       File1  File2  File3  File4  File5
Size of the file           383    120    195    194    220
Changes vs. version 6        0      0     70      0      0

Contd.

Version 8 (jTcGUI)       File1  File2  File3  File4  File5  File6  File7
Size of the file           403    120    242    200    221    256     43
Changes vs. version 7       27      0     71      8      4    343    713

Statistical Evaluation

Table 2: Data for the correlation between size and number of changes made per version.

Version     x (size)  y (changes)       x*y       x^2       y^2
Version 2        644          915    589260    414736    837225
Version 3        646         1003    647938    417316   1006009
Version 4        647         1005    650235    418609   1010025
Version 5        786         1197    940842    617796   1432809
Version 6       1161         1747   2028267   1347921   3052009
Version 7       1166         1827   2130282   1359556   3337929
Version 8       1539         2350   3616650   2368521   5522500

Contd.
Pearson correlation coefficient:

    r = (n*Σxy - Σx*Σy) / sqrt( (n*Σx² - (Σx)²) * (n*Σy² - (Σy)²) )

    r = 0.9978

Conclusion: the size of the system and the changes made to it are highly correlated.
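A minimal Python sketch that reproduces this coefficient from the Table 2 data:

    from math import sqrt

    x = [644, 646, 647, 786, 1161, 1166, 1539]     # size per version
    y = [915, 1003, 1005, 1197, 1747, 1827, 2350]  # changes per version

    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(a * b for a, b in zip(x, y))
    sxx = sum(a * a for a in x)
    syy = sum(b * b for b in y)

    r = (n * sxy - sx * sy) / sqrt((n * sxx - sx**2) * (n * syy - sy**2))
    print(round(r, 4))  # 0.9978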

Analysis
ANOVA: Analysis of Variance (ANOVA) is used to compare means when there are more than two of them. A 5% level of significance is used, and hypotheses are accepted or rejected based on the p-values.
Newman-Keuls multiple post hoc procedure: used when pair-wise comparisons have to be made.
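For illustration, the per-version ANOVA on file size (the first column of Table 3 below) can be reproduced with scipy.stats.f_oneway, using the group data from the per-version tables above:

    from scipy import stats

    sizes_by_version = [
        [301, 118, 104, 78],                 # version 2
        [288, 120, 105, 79],                 # version 3
        [289, 120, 105, 79],                 # version 4
        [325, 120, 194, 190, 220],           # version 5
        [383, 120, 194, 190, 220],           # version 6
        [383, 120, 195, 194, 220],           # version 7
        [403, 120, 242, 200, 221, 256, 43],  # version 8
    ]

    f_stat, p_value = stats.f_oneway(*sizes_by_version)
    print(f"F = {f_stat:.4f}, p = {p_value:.4f}")  # per Table 3: F = 0.6185, p = 0.7137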

Contd.
H0: There is no significant difference between the different versions with respect to size and number of changes made.

Table 3: Comparison of versions with respect to file size and number of changes, by one-way ANOVA.

             Size of the file      No. of changes
Version        Mean   Std.Dev.      Mean   Std.Dev.
Version 2    150.25     101.86    213.25     121.28
Version 3    148.00      94.86     18.50      24.64
Version 4    148.25      95.35      0.75       1.50
Version 5    209.80      74.30     12.40      27.73
Version 6    221.40      97.64    117.40      98.91
Version 7    222.40      97.26     14.00      31.30
Version 8    212.14     112.97    166.57     270.08
Total        192.32      94.67     82.79     150.23
ANOVA (F)    0.6185               1.8272
P-value      0.7137               0.1312

Both p-values exceed 0.05, so H0 is not rejected: the versions do not differ significantly in size or in number of changes.

Contd.
H0: There is no significant difference between the different files with respect to size and number of changes made.

Table 4: Comparison of files with respect to file size and number of changes, by one-way ANOVA.

             Size of the file      No. of changes
File           Mean   Std.Dev.      Mean   Std.Dev.
File 1       338.86      49.51     90.57     136.80
File 2       119.71       0.76     29.71      58.09
File 3       162.71      56.86     67.71      72.62
File 4       144.29      61.47     23.29      44.72
File 5       220.25       0.50     70.00     137.35
File 6       256.00       0.00    343.00
File 7        43.00               713.00
Total        192.32      94.67     82.79     150.23
ANOVA (F)   18.9386              9.8992
P-value     0.0000*

*Significant at the 5% level of significance (p < 0.05).

H0 is rejected at the 5% level: the files differ significantly, which motivates the pair-wise comparisons that follow.

Contd.
H0: There is no significant difference between pair-wise files with respect to size.

Table 5: Pair-wise comparison of files with respect to file size, by the Newman-Keuls multiple post hoc procedure. Mean sizes: File1 338.86, File2 119.71, File3 162.71, File4 144.29, File5 220.25, File6 256.00, File7 43.00.

          File1     File2     File3     File4     File5     File6
File2    0.0003*
File3    0.0013*   0.5561
File4    0.0007*   0.5558    0.6582
File5    0.0204*   0.0931    0.1738    0.1745
File6    0.0544    0.0206*   0.0783    0.0527    0.3931
File7    0.0001*   0.0735    0.034*    0.0522    0.0018*   0.0004*

*Significant at the 5% level of significance (p < 0.05).
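The Newman-Keuls procedure itself is not available in the common Python statistics libraries; as a stand-in, Tukey's HSD from statsmodels performs a comparable pair-wise comparison on the same size data (its exact p-values will differ from Table 5):

    import numpy as np
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    sizes = np.array(
        [301, 288, 289, 325, 383, 383, 403,   # File1, versions 2-8
         118, 120, 120, 120, 120, 120, 120,   # File2
         104, 105, 105, 194, 194, 195, 242,   # File3
          78,  79,  79, 190, 190, 194, 200,   # File4
         220, 220, 220, 221,                  # File5, versions 5-8
         256, 43])                            # File6 and File7, version 8
    files = (["File1"] * 7 + ["File2"] * 7 + ["File3"] * 7 +
             ["File4"] * 7 + ["File5"] * 4 + ["File6", "File7"])

    print(pairwise_tukeyhsd(sizes, files, alpha=0.05))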

Contd.
H0: There is no significant difference between pair-wise files with respect to number of changes made.

Table 6: Pair-wise comparison of files with respect to number of changes, by the Newman-Keuls multiple post hoc procedure. Mean changes: File1 90.57, File2 29.71, File3 67.71, File4 23.29, File5 70.00, File6 343.00, File7 713.00.

          File1     File2     File3     File4     File5     File6
File2    0.8843
File3    0.9596    0.6523
File4    0.9262    0.9392    0.8560
File5    0.8071    0.8799    0.9784    0.9430
File6    0.0055*   0.0070*
File7    0.0136*   0.0081*   0.0080*   0.0001*   0.0002*   0.0003*

*Significant at the 5% level of significance (p < 0.05).

Future Work
- To estimate the development effort (number of changes) from the size of the file; a first sketch follows below.
- To check systems with more source files.
- To check systems with more versions.
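A minimal sketch of that estimation, assuming a simple linear model fitted by least squares to the Table 2 data:

    import numpy as np

    x = np.array([644, 646, 647, 786, 1161, 1166, 1539])    # size
    y = np.array([915, 1003, 1005, 1197, 1747, 1827, 2350]) # changes

    slope, intercept = np.polyfit(x, y, 1)
    # Roughly: changes ≈ 1.55 * size - 22 for these data.
    print(f"estimated changes = {slope:.2f} * size + {intercept:.2f}")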

References
[1] Basili, V.R., Briand, L.C., Melo, W.L.: A Validation of Object-Oriented Design Metrics as Quality Indicators. IEEE Transactions on Software Engineering, Vol. 22, No. 10, October 1996.
[2] Lindroos, J.: Code and Design Metrics for Object-Oriented Systems. University of Helsinki (2004).
[3] Jamali, S.M.: Object Oriented Metrics. Sharif University of Technology, Tehran, Iran (2006).
[4] Lincke, R., Löwe, W.: Validation of a Standard- and Metric-Based Software Quality Model. School of Mathematics and Systems Engineering, Växjö University, Växjö, Sweden (2005).
[5] Lincke, R., Löwe, W.: Comparing Software Metrics Tools. School of Mathematics and Systems Engineering, Växjö University, Sweden (2008).

Questions