An Approach to Measure Java Code Quality in Reuse Environment
Aline Timóteo
Advisor: Silvio Meira
UFPE – Federal University of Pernambuco
alt.timoteo@gmail.com
Summary
Motivation
Background
Metrics
An Approach to Measure Java Code Quality
Main Contributions
Status
Motivation
Motivation
Reuse benefits: productivity, cost, quality. Reuse is a competitive advantage!
Reuse environment [Frakes, 1994]:
  Process
  Metrics
  Tools: repository, search engine, domain tools, …
Problem
Component repositories promote reuse success [Griss, 1994].
Artifact quality must be assured by the organization that maintains the repository [Seacord, 1999].
How can the reuse of low-quality artifacts be minimized?
Background
Metrics
“Software metrics is a method to quantify attributes in software processes, products and projects” [Daskalantonakis, 1992]
Metrics timeline:
  Age 1: before 1991, when the main focus was on metrics based on code complexity
  Age 2: after 1992, when the main focus was on metrics based on the concepts of Object-Oriented (OO) systems
Age 1: Complexity
Age 2: Object Oriented
Most Referenced Metrics
LOC
Cyclomatic Complexity [McCabe, 1976]
Chidamber and Kemerer Metrics [Chidamber, 1994]
Lorenz and Kidd Metrics [Lorenz, 1994]
MOOD Metrics [Brito, 1994]
Problems Related to Metrics [Ince, 1988; Briand, 2002]
Metrics validation:
  Theoretical validation: measurement goal, experimental hypothesis, environment or context
  Empirical validation
Metrics automation:
  Different sets of metrics implemented
  Bad documentation
Quality attributes x metrics
An Approach to Measure Java Code Quality
An Approach to Measure Java Code Quality
Quality attributes x metrics
Metrics selection and specification
Quality attributes measurement
Quality in a Reuse Environment [Etzkorn, 2001]
ISO 9126
Quality Attributes x Metrics
Each quality attribute is measured through code attributes, which map onto LOC, Cyclomatic Complexity, and the CK metrics (WMC, DIT, NOC, CBO, RFC, LCOM):
  Analyzability: size, complexity, documentation
  Changeability: complexity, modularity/encapsulation, coupling, cohesion, inheritance
  Stability
  Testability: complexity, coupling
  Reusability: complexity, documentation, modularity/encapsulation, coupling, cohesion
Metrics Selection
Applicable to Java
Empirical validation
Theoretical validation
Acceptance
Metrics Selection: McCabe Metric [McCabe, 1976]
Theoretical validation: grounded in graph theory
Independent of technology
Empirical validation
Acceptance [Refactorit, 2001; Metrics, 2005; JHawk, 2007]
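McCabe's measure is defined on the control-flow graph, V(G) = E - N + 2P, which for structured programs equals one plus the number of decision points. The sketch below is a minimal, string-based illustration of that counting rule; the cited tools work on a parsed AST, and the class and method names here are illustrative only.

import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Rough estimate of McCabe's cyclomatic complexity for a Java method body:
// one plus the number of decision points found in the source text.
public final class CyclomaticEstimator {

    // Tokens that introduce a branch in Java source.
    private static final Pattern DECISION =
            Pattern.compile("\\b(if|for|while|case|catch)\\b|&&|\\|\\||\\?");

    public static int approximateCC(String methodBody) {
        Matcher m = DECISION.matcher(methodBody);
        int decisions = 0;
        while (m.find()) {
            decisions++;
        }
        return decisions + 1; // straight-line code has complexity 1
    }

    public static void main(String[] args) {
        String body = "if (x > 0 && y > 0) { return x; } else { return y; }";
        System.out.println(approximateCC(body)); // prints 3: one 'if', one '&&'
    }
}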
Metrics Selection: CK Metrics [Chidamber, 1994]
Theoretical validation: developed in an OO context
Empirical validation [Briand, 1994; Chidamber, 1998; Tang, 1999]
Acceptance [Refactorit, 2001; Metrics, 2005; JHawk, 2007]
Metrics Specification: Response for a Class
Short name: RFC
Description: the response set of a class is a set of methods that can potentially be executed in response to a message received by an object of that class
Calculated by: RFC = M + R, where M = number of methods in the class and R = number of remote methods directly called by methods of the class
Allowable value: RFC <= 50
Private: no
Analysis presentation: N/A
Range of analysis: when the development cycle is done
Analysis procedure: identify which classes have the highest responsibility
Responsible: Metrics Analyst
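A minimal sketch of the RFC = M + R rule from this specification, assuming the per-class method sets have already been extracted by a source or bytecode parser; RfcCalculator and its inputs are illustrative names, not part of the presented tool.

import java.util.Set;

// Minimal model of the specification above: RFC = M + R, with the
// allowable value RFC <= 50.
public final class RfcCalculator {

    static final int RFC_THRESHOLD = 50;

    // M = methods declared in the class, R = remote methods called directly.
    static int rfc(Set<String> declaredMethods, Set<String> remoteCalls) {
        return declaredMethods.size() + remoteCalls.size();
    }

    static boolean withinAllowableValue(int rfcValue) {
        return rfcValue <= RFC_THRESHOLD;
    }

    public static void main(String[] args) {
        // Hypothetical class with three methods and two direct remote calls.
        int value = rfc(Set.of("start", "run", "commandAction"),
                        Set.of("Thread.<init>", "Thread.start"));
        System.out.println(value + " within limit: " + withinAllowableValue(value)); // 5 within limit: true
    }
}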
Quality Attributes Measurement (QAM)
QAM = the number of metrics that have an allowable value
Heuristically, QAM >= (number of metrics) / 2
Example: 2.5 <= QAM <= 5

Quality attribute: Testability (code attributes and CK metrics)
  complexity: WMC, DIT, RFC, LCOM
  coupling: CBO, RFC
Max Testability = 5
Min Testability = 2.5
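A sketch of the QAM heuristic, assuming each metric has already been computed and compared against its allowable value upstream; the class name and map contents below are illustrative, not taken from the presented tool.

import java.util.Map;

// QAM = number of metrics whose values fall within their allowable values;
// the attribute is considered acceptable when QAM >= (number of metrics) / 2.
public final class QualityAttributeMeasurement {

    static double qam(Map<String, Boolean> withinAllowableValue) {
        return withinAllowableValue.values().stream().filter(ok -> ok).count();
    }

    static boolean passes(Map<String, Boolean> metrics) {
        return qam(metrics) >= metrics.size() / 2.0;
    }

    public static void main(String[] args) {
        // Testability uses five CK metrics, so the pass threshold is 2.5.
        Map<String, Boolean> testability = Map.of(
                "WMC", true, "DIT", true, "CBO", false, "RFC", true, "LCOM", false);
        System.out.println(qam(testability));    // 3.0
        System.out.println(passes(testability)); // true, since 3.0 >= 2.5
    }
}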
Approach Automation
Approach Automation
Analyzability QAM = 3.0
  -> RFC: 2.0
  -> WMC: 1.0
  -> CC: 0.0
Changeability QAM = 3.0
  -> CBO: 3.0
  -> RFC: 2.0
  -> WMC: 1.0

public class Client implements Runnable, CommandListener {
    /**
     * Start the client thread
     */
    public void start() {
        Thread t = new Thread(this);
        t.start();
    }
}
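The experiment described next plugs this measurement into the repository's search path as a quality filter. One possible shape for that filter, sketched here as a hypothetical predicate rather than B.A.R.T.'s actual API:

import java.util.Map;

// Hypothetical repository-side filter: a component is surfaced in search
// results only if every measured quality attribute meets the QAM heuristic.
public final class QualityFilter {

    // Attribute score: the QAM value and how many metrics produced it.
    record AttributeScore(double qam, int metricCount) {}

    static boolean accept(Map<String, AttributeScore> scores) {
        return scores.values().stream()
                .allMatch(s -> s.qam() >= s.metricCount() / 2.0);
    }

    public static void main(String[] args) {
        // QAM totals from the tool output above (three metrics per attribute).
        Map<String, AttributeScore> client = Map.of(
                "Analyzability", new AttributeScore(3.0, 3),
                "Changeability", new AttributeScore(3.0, 3));
        System.out.println(accept(client)); // true: both attributes pass
    }
}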
Experiment
Experiment
Main question: is the quality of the retrieved components better?
Compare B.A.R.T. search results:
  Results before introducing the filter
  Results after introducing the filter
Apply a questionnaire to customers
Main Contributions
Introduce quality analysis in a repository
Reduce the propagation of code problems
Higher reliability
Current Stage
References
[Frakes, 1994] W. B. Frakes and S. Isoda, "Success Factors of Systematic Software Reuse," IEEE Software, vol. 11, pp. 14-19, 1994.
[Griss, 1994] M. L. Griss, "Software Reuse Experience at Hewlett-Packard," presented at the 16th International Conference on Software Engineering (ICSE), Sorrento, Italy, 1994.
[Garcia, 2006] V. C. Garcia, D. Lucrédio, F. A. Durão, E. C. R. Santos, E. S. Almeida, R. P. M. Fortes, and S. R. L. Meira, "From Specification to Experimentation: A Software Component Search Engine Architecture," presented at the 9th International Symposium on Component-Based Software Engineering (CBSE 2006), Mälardalen University, Västerås, Sweden, 2006.
[Etzkorn, 2001] L. H. Etzkorn, W. E. Hughes Jr., and C. G. Davis, "Automated reusability quality analysis of OO legacy software," Information & Software Technology, vol. 43, no. 5, pp. 295-308, 2001.
[Daskalantonakis, 1992] M. K. Daskalantonakis, "A Practical View of Software Measurement and Implementation Experiences Within Motorola," IEEE Transactions on Software Engineering, vol. 18, pp. 998-1010, 1992.
[McCabe, 1976] T. J. McCabe, "A Complexity Measure," IEEE Transactions on Software Engineering, vol. SE-2, pp. 308-320, 1976.
[Chidamber, 1994] S. R. Chidamber and C. F. Kemerer, "A Metrics Suite for Object Oriented Design," IEEE Transactions on Software Engineering, vol. 20, pp. 476-493, 1994.
[Lorenz, 1994] M. Lorenz and J. Kidd, "Object-Oriented Software Metrics: A Practical Guide," Englewood Cliffs, New Jersey, USA, 1994.
[Brito, 1994] A. F. Brito and R. Carapuça, "Object-Oriented Software Engineering: Measuring and Controlling the Development Process," 4th International Conference on Software Quality, USA, 1994.
[Ince, 1988] D. C. Ince and M. J. Sheppard, "System design metrics: a review and perspective," Second IEE/BCS Conference, Liverpool, UK, pp. 23-27, 1988.
[Briand, 2002] L. C. Briand, S. Morasca, and V. R. Basili, "An Operational Process for Goal-Driven Definition of Measures," IEEE Transactions on Software Engineering, vol. 28, pp. 1106-1125, 2002.
[Morasca, 1995] S. Morasca, L. C. Briand, V. R. Basili, E. J. Weyuker, M. V. Zelkowitz, B. Kitchenham, S. Lawrence Pfleeger, and N. Fenton, "Towards a framework for software measurement validation," IEEE Transactions on Software Engineering, vol. 23, pp. 187-189, 1995.
[Seacord, 1999] R. C. Seacord, "Software Engineering Component Repositories," Technical report, Software Engineering Institute (SEI), 1999.
[Refactorit, 2001] RefactorIt tool, online, last update: 01/2008, available: http://www.aqris.com/display/ap/RefactorIt
[Jdepend, 2005] JDepend tool, online, last update: 03/2006, available: http://www.clarkware.com/software/JDepend.html
[Metrics, 2005] Metrics Eclipse Plugin, online, last update: 07/2005, available: http://sourceforge.net/projects/metrics
[Jhawk, 2007] JHawk Eclipse Plugin, online, last update: 03/2007, available: http://www.virtualmachinery.com/jhawkprod.htm
Aline Timóteo
UFPE – Federal University of Pernambuco
alt.timoteo@gmail.com