OORPT Object-Oriented Reengineering Patterns and Techniques 7. Problem Detection Prof. O. Nierstrasz

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.2 Roadmap  Metrics  Object-Oriented Metrics in Practice  Duplicated Code

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.3 Roadmap  Metrics —Software quality —Analyzing trends  Object-Oriented Metrics in Practice  Duplicated Code

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.4 Why Metrics in OO Reengineering (ii)?  Assessing Software Quality —Which components have poor quality? (Hence could be reengineered) —Which components have good quality? (Hence should be reverse engineered)  Metrics as a reengineering tool!  Controlling the Reengineering Process —Trend analysis: which components changed? —Which refactorings have been applied?  Metrics as a reverse engineering tool!

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.5 ISO 9126 Quantitative Quality Model
 Factors (ISO 9126): Functionality, Reliability, Efficiency, Usability, Maintainability, Portability
 Characteristics (examples): error tolerance, accuracy, simplicity, modularity, consistency
 Metrics (examples): defect density = #defects / size; correction impact = #components changed; correction time
 Leaves are simple metrics, measuring basic attributes

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.6 Product & Process Attributes Product Attribute Definition: measure aspects of artifacts delivered to the customer Example: number of system defects perceived, time to learn the system Process Attribute Definition: measure aspects of the process which produces a product Example: time to correct defect, number of components changed per correction

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.7 External & Internal Attributes External Attribute Definition: measures how the product/process behaves in its environment Example: mean time between failure, #components changed Internal Attribute Definition: measured purely in term of the product, separate from its behaviour in context Example: class coupling and cohesion, method size

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.8 External vs. Internal Product Attributes
 External attributes
 —Advantage: close relationship with quality factors
 —Disadvantages: can only be measured after the product is used or the process took place; data collection is difficult and often involves human intervention/interpretation; relating an external effect to an internal cause is difficult
 Internal attributes
 —Advantages: can be measured at any time; data collection is quite easy and can be automated; direct relationship between measured attribute and cause
 —Disadvantage: relationship with quality factors is not empirically validated

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.9 Metrics and Measurements  Weyuker [1988] defined nine properties that a software metric should satisfy. —Read Fenton & Pfleeger for critiques.  For OO, only 6 properties are really interesting [Chidamber 94, Fenton & Pfleeger] —Noncoarseness: given a class P and a metric m, another class Q can always be found such that m(P) ≠ m(Q); not every class has the same value for a metric —Nonuniqueness: there can exist distinct classes P and Q such that m(P) = m(Q); two classes can have the same metric value —Monotonicity: m(P) ≤ m(P+Q) and m(Q) ≤ m(P+Q), where P+Q is the “combination” of the classes P and Q

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.10 Metrics and Measurements (ii) —Design details are important: the specifics of a class must influence the metric value; even if two classes perform the same actions, their design details should have an impact on the metric value —Nonequivalence of interaction: m(P) = m(Q) does not imply that m(P+R) = m(Q+R), where R is an interaction with the class —Interaction increases complexity: m(P) + m(Q) < m(P+Q) is possible; when two classes are combined, the interaction between the two can increase the metric value  Conclusion: not every measurement is a metric.

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.11 Selecting Metrics  Fast —Scalable: you can’t afford O(n²) when n ≈ 1 million LOC  Precise —(e.g. #methods — do you count all methods, only public ones, also inherited ones?) —Reliable: you want to compare apples with apples  Code-based —Scalable: you want to collect metrics several times —Reliable: you want to avoid human interpretation  Simple —Complex metrics are hard to interpret

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.12 Assessing Maintainability  Size of the system, system entities —Class size, method size, inheritance —The intuition: large entities impede maintainability  Cohesion of the entities —Class internals —The intuition: changes should be local  Coupling between entities —Within inheritance: coupling between class-subclass —Outside of inheritance —The intuition: strong coupling impedes locality of changes

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.13 Sample Size and Inheritance Metrics (metamodel: Class, Attribute, Method, with relations Access, Invoke, BelongTo, Inherit)
 Inheritance metrics: hierarchy nesting level (HNL); # immediate children (NOC); # inherited methods, unmodified (NMI); # overridden methods (NMO)
 Class size metrics: # methods (NOM); # instance/class attributes (NIA, NCA); sum of method sizes (WMC)
 Method size metrics: # invocations (NOI); # statements (NOS); # lines of code (LOC)

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.14 Sample Class Size  —[Lore94] Number of Instance Variables (NIV) —[Lore94] Number of Class Variables (static) (NCV) —[Lore94] Number of Methods (public, private, protected) (NOM)  —(LOC) Lines of Code  —(NSC) Number of Semicolons [Li93], i.e. number of statements  —(WMC) [Chid94] Weighted Method Count: WMC = ∑ c_i, where c_i is the complexity of method i (its number of exit points, or its McCabe cyclomatic complexity)
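As an illustration of these size metrics, here is a minimal sketch in Python (not the course's tooling; the example class and the branch-counting approximation of method complexity are invented for illustration): NOM and WMC are computed from a parse tree, with each method's complexity taken as one plus its number of branch points.

```python
# Sketch: class-size metrics over Python source with the stdlib `ast` module.
# Method complexity is approximated by 1 + number of branching nodes,
# a common proxy for McCabe's cyclomatic complexity.
import ast

BRANCHES = (ast.If, ast.For, ast.While, ast.Try, ast.BoolOp)

def method_complexity(fn: ast.FunctionDef) -> int:
    return 1 + sum(isinstance(n, BRANCHES) for n in ast.walk(fn))

def class_size_metrics(src: str, class_name: str) -> dict:
    tree = ast.parse(src)
    cls = next(n for n in ast.walk(tree)
               if isinstance(n, ast.ClassDef) and n.name == class_name)
    methods = [n for n in cls.body if isinstance(n, ast.FunctionDef)]
    return {
        "NOM": len(methods),                                # number of methods
        "WMC": sum(method_complexity(m) for m in methods),  # weighted method count
    }

src = """
class Stack:
    def push(self, x):
        self.items.append(x)
    def pop(self):
        if not self.items:
            raise IndexError
        return self.items.pop()
"""
print(class_size_metrics(src, "Stack"))  # {'NOM': 2, 'WMC': 3}
```

Note that this is exactly the kind of precision question raised on the previous slide: whether inherited or nested functions count as methods is a choice the tool must document.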

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.15 Hierarchy Layout  —(HNL) [Chid94] Hierarchy Nesting Level; (DIT) [Li93] Depth of Inheritance Tree: HNL, DIT = maximum hierarchy level  —(NOC) [Chid94] Number of Children  —(WNOC) Total Number of Children  —(NMO, NMA, NMI, NME) [Lore94] Number of Methods Overridden, Added, Inherited, Extended (via super call)  —(SIX) [Lore94] SIX(C) = NMO × HNL / NOM, a weighted percentage of overridden methods
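A minimal sketch of the hierarchy metrics, using Python's own class introspection (the Shape hierarchy is invented): DIT is the length of the inheritance path to the root, NOC the number of immediate subclasses.

```python
# Sketch: DIT/HNL and NOC computed from live Python classes via the
# interpreter's hierarchy information.
def dit(cls) -> int:
    # depth of inheritance tree, with `object` at level 0
    return len(cls.__mro__) - 1

def noc(cls) -> int:
    # number of immediate children
    return len(cls.__subclasses__())

class Shape: pass
class Polygon(Shape): pass
class Triangle(Polygon): pass
class Square(Polygon): pass

print(dit(Triangle))  # 3  (Triangle -> Polygon -> Shape -> object)
print(noc(Polygon))   # 2  (Triangle, Square)
```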

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.16 Method Size  —(MSG) Number of Message Sends  —(LOC) Lines of Code  —(MCX) Method Complexity: total complexity / total number of methods, where element complexities are weighted, e.g. API call = 5, assignment = 0.5, arithmetic op = 2, message with parameters = 3, ...

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.17 Sample Metrics: Class Cohesion  —(LCOM) Lack of Cohesion in Methods: [Chidamber 94] for definition, [Hitz 95] for critique. Let I_i be the set of instance variables used by method M_i, and let
 P = { (I_i, I_j) | I_i ∩ I_j = ∅ }
 Q = { (I_i, I_j) | I_i ∩ I_j ≠ ∅ }
 If all the sets I_i are empty, P is empty. Then LCOM = |P| − |Q| if |P| > |Q|, and 0 otherwise.
 —Tight Class Cohesion (TCC) and Loose Class Cohesion (LCC): [Bieman 95] for definition; they measure method cohesion across invocations.
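The LCOM definition above can be sketched directly (the attribute names are invented toy data): form all method pairs and count the disjoint pairs against the sharing pairs.

```python
# Sketch of LCOM, starting from the sets I_i of instance variables
# used by each method of a class.
from itertools import combinations

def lcom(var_sets):
    pairs = list(combinations(var_sets, 2))
    p = sum(1 for a, b in pairs if not (a & b))  # disjoint pairs (P)
    q = sum(1 for a, b in pairs if a & b)        # sharing pairs (Q)
    return p - q if p > q else 0

# three methods: two share `balance`, one touches only `log`
uses = [{"balance"}, {"balance", "owner"}, {"log"}]
print(lcom(uses))  # 1  (|P| = 2, |Q| = 1)
```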

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.18 Sample Metrics: Class Coupling (i)  Coupling Between Objects (CBO) —[Chidamber 94a] for definition, [Hitz 95a] for a discussion —Number of other classes to which a class is coupled  Data Abstraction Coupling (DAC) —[Li 93] for definition —Number of ADTs defined in a class  Change Dependency Between Classes (CDBC) —[Hitz 96a] for definition —Impact of changes from a server class (SC) on a client class (CC)

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.19 Sample Metrics: Class Coupling (ii)  Locality of Data (LD) —[Hitz 96] for definition: LD = ∑ |L_i| / ∑ |T_i|, where
 L_i = non-public instance variables + inherited protected variables of superclasses + static variables of the class
 T_i = all variables used in M_i, except non-static local variables
 M_i = the methods, excluding accessors

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.20 The Trouble with Coupling and Cohesion  Coupling and cohesion are intuitive notions (cf. “computability”) —E.g., is a library of mathematical functions “cohesive”? —E.g., is a package of classes that subclass framework classes cohesive? Is it strongly coupled to the framework package?

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.21 Conclusion: Metrics for Quality Assessment  Can internal product metrics reveal which components have good/poor quality?  Yes, but... —Not reliable – false positives: “bad” measurements, yet good quality – false negatives: “good” measurements, yet poor quality —Heavyweight Approach – Requires team to develop (customize?) a quantitative quality model – Requires definition of thresholds (trial and error) —Difficult to interpret – Requires complex combinations of simple metrics  However... —Cheap once you have the quality model and the thresholds —Good focus (± 20% of components are selected for further inspection)  Note: focus on the most complex components first!

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.22 Roadmap  Metrics  Object-Oriented Metrics in Practice —Detection strategies, filters and composition —Sample detection strategies: God Class …  Duplicated Code Michele Lanza and Radu Marinescu, Object-Oriented Metrics in Practice, Springer-Verlag, 2006

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.23 Detection strategy  A detection strategy is a metrics-based predicate to identify candidate software artifacts that conform to (or violate) a particular design rule

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.24 Filters and composition  A data filter is a predicate used to focus attention on a subset of interest of a larger data set —Statistical filters – e.g., top and bottom 25% are considered outliers —Other relative thresholds – e.g., other percentages to identify outliers (e.g., top 10%) —Absolute thresholds – i.e., fixed criteria, independent of the data set  A useful detection strategy can often be expressed as a composition of data filters
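Such a composition can be sketched as follows (the class names, metric values and thresholds are invented for illustration, not taken from the book): a relative filter and an absolute filter combined by set intersection.

```python
# Sketch: composing data filters into a detection strategy.
def top_percent(metric, classes, frac=0.25):
    # relative filter: classes whose metric lies in the top `frac` of the data set
    ranked = sorted(classes, key=lambda c: classes[c][metric], reverse=True)
    keep = max(1, int(len(ranked) * frac))
    return set(ranked[:keep])

def above(metric, classes, threshold):
    # absolute filter: fixed criterion, independent of the data set
    return {c for c in classes if classes[c][metric] > threshold}

classes = {
    "Parser":  {"WMC": 120, "NOM": 45},
    "Token":   {"WMC": 8,   "NOM": 6},
    "Facade":  {"WMC": 300, "NOM": 90},
    "Visitor": {"WMC": 40,  "NOM": 20},
}
# candidate "too large" classes: top 25% by WMC AND more than 30 methods
suspects = top_percent("WMC", classes) & above("NOM", classes, 30)
print(suspects)  # {'Facade'}
```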

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.25 God Class  A God Class centralizes intelligence in the system —Impacts understandability —Increases system fragility

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.26 ModelFacade (ArgoUML)  453 methods  114 attributes  over 3500 LOC  all methods and all attributes are static

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.27 Feature Envy  Methods that are more interested in data of other classes than their own [Fowler et al. 99]

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.28 ClassDiagramLayouter

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.29 Data Class  A Data Class provides data to other classes but little or no functionality of its own

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.30 Data Class (2)

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.31 Property

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.32 Shotgun Surgery  A change in an operation implies many (small) changes to a lot of different operations and classes

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.33 Project

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.34 Roadmap  Metrics  Object-Oriented Metrics in Practice  Duplicated Code —Detection techniques —Visualizing duplicated code

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.35 Code is Copied  A small example from the Mozilla distribution (Milestone 9): extract from /dom/src/base/nsLocation.cpp

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.36 How Much Code is Duplicated?  Usual estimates: 8 to 12% in normal industrial code; 15 to 25% is already a lot!

 Case Study       LOC      Duplication without comments   with comments
 gcc              460'000  8.7%                           5.6%
 Database Server  245'                                    23.3%
 Payroll          40'                                     25.4%
 Message Board    6'                                      17.4%

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.37 What is Duplicated Code?  Duplicated Code = source code segments that are found in different places of a system — in different files — in the same file but in different functions — in the same function  The segments must contain some logic or structure that can be abstracted.  Copied artifacts range from expressions, to functions, to data structures, to entire subsystems.

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.38 Copied Code Problems  General negative effect —Code bloat  Negative effects on software maintenance —Copied defects —Changes take double, triple, quadruple, ... work —Dead code —Added cognitive load for future maintainers  Copying as an additional source of defects —Errors in the systematic renaming produce unintended aliasing  Metaphorically speaking —Software aging, “hardening of the arteries” —“Software entropy” increases: even small design changes become very difficult to effect

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.39 Code Duplication Detection  A nontrivial problem: there is no a priori knowledge about which code has been copied. How can we find all clone pairs among all possible pairs of segments?

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.40 General Schema of the Detection Process

 Author           Level        Transformed Code       Comparison Technique
 Johnson 94       Lexical      Substrings             String matching
 Ducasse 99       Lexical      Normalized strings     String matching
 Baker 95         Syntactical  Parameterized strings  String matching
 Mayrand 96       Syntactical  Metric tuples          Discrete comparison
 Kontogiannis 97  Syntactical  Metric tuples          Euclidean distance
 Baxter 98        Syntactical  AST                    Tree matching

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.41 Recall and Precision
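For a clone detector, recall and precision take their usual meaning: recall is the fraction of actual clone pairs that the tool reports, precision the fraction of reported pairs that really are clones. A small sketch (the clone-pair data is invented):

```python
# Sketch: precision and recall of a clone detector, with clone pairs
# represented as (file_a, line_a, file_b, line_b) tuples.
def precision_recall(reported: set, actual: set):
    tp = len(reported & actual)                         # true positives
    precision = tp / len(reported) if reported else 0.0
    recall = tp / len(actual) if actual else 0.0
    return precision, recall

actual = {("f.c", 10, "g.c", 50), ("f.c", 90, "h.c", 12), ("g.c", 5, "h.c", 77)}
reported = {("f.c", 10, "g.c", 50), ("f.c", 1, "g.c", 2)}
p, r = precision_recall(reported, actual)
print(p, r)  # precision 0.5, recall 1/3
```

Abstraction (next slides) typically trades these off: more normalization finds more real clones but also reports more false ones.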

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.42 Simple Detection Approach (i)  Assumption: code segments are just copied and changed at a few places  Noise elimination transformation: remove white space and comments; remove lines that contain uninteresting code elements (e.g., just ‘else’ or ‘}’)

 Original:
   … //assign same fastid as container
   fastid = NULL;
   const char* fidptr = get_fastid();
   if (fidptr != NULL) {
     int l = strlen(fidptr);
     fastid = new char[ l + 1 ];

 After noise elimination:
   fastid=NULL;
   constchar*fidptr=get_fastid();
   if(fidptr!=NULL)
   intl=strlen(fidptr)
   fastid=newchar[l+1]

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.43 Simple Detection Approach (ii)  Code Comparison Step —Line based comparison (Assumption: Layout did not change during copying) —Compare each line with each other line. —Reduce search space by hashing: – Preprocessing: Compute the hash value for each line – Actual Comparison: Compare all lines in the same hash bucket  Evaluation of the Approach —Advantages: Simple, language independent —Disadvantages: Difficult interpretation
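The comparison step can be sketched as follows (the file contents are invented; a production tool would also confirm string equality within a bucket rather than trusting the hash alone):

```python
# Sketch: line-based clone detection with hash bucketing.
# Preprocessing: normalize each line and hash it; comparison: only lines
# falling into the same bucket are ever considered together.
from collections import defaultdict

def normalize(line: str) -> str:
    return "".join(line.split())  # drop all whitespace

def duplicated_lines(files: dict):
    buckets = defaultdict(list)   # hash -> [(file, lineno, normalized line)]
    for name, text in files.items():
        for i, line in enumerate(text.splitlines(), 1):
            norm = normalize(line)
            if norm and norm not in ("else", "{", "}"):   # noise elimination
                buckets[hash(norm)].append((name, i, norm))
    # report buckets where the same normalized line occurs more than once
    return [[(f, i) for f, i, _ in b] for b in buckets.values() if len(b) > 1]

files = {
    "a.c": "x = 0;\nif (p != NULL) {\n  n = strlen(p);\n}\n",
    "b.c": "y = 1;\nif (p != NULL) {\n  n = strlen(p);\n}\n",
}
print(duplicated_lines(files))
# [[('a.c', 2), ('b.c', 2)], [('a.c', 3), ('b.c', 3)]]
```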

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.44 A Perl script for C++ (i)

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.45 A Perl script for C++ (ii) Handles multiple files Removes comments and white spaces Controls noise (if, {,) Granularity (number of lines) Possible to remove keywords

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.46 Output Sample Lines: create_property(pd,pnImplObjects,stReference,false,*iImplObjects); create_property(pd,pnElttype,stReference,true,*iEltType); create_property(pd,pnMinelt,stInteger,true,*iMinelt); create_property(pd,pnMaxelt,stInteger,true,*iMaxelt); create_property(pd,pnOwnership,stBool,true,*iOwnership); Locations: 6178/6179/6180/6181/ /6199/6200/6201/6202 Lines: create_property(pd,pnSupertype,stReference,true,*iSupertype); create_property(pd,pnImplObjects,stReference,false,*iImplObjects); create_property(pd,pnElttype,stReference,true,*iEltType); create_property(pd,pMinelt,stInteger,true,*iMinelt); create_property(pd,pnMaxelt,stInteger,true,*iMaxelt); Locations: 6177/ /6230 Lines = duplicated lines Locations = file names and line number

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.47 Enhanced Simple Detection Approach  Code Comparison Step —As before, but now – Collect consecutive matching lines into match sequences – Allow holes in the match sequence  Evaluation of the Approach —Advantages – Identifies more real duplication, language independent —Disadvantages – Less simple – Misses copies with (small) changes on every line
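The enhancement above can be sketched like this (the match set and parameters are invented): matching line pairs are chained along the diagonal, allowing a small hole between consecutive matches.

```python
# Sketch: collect matching line pairs (line_in_A, line_in_B) into
# diagonal runs, tolerating up to `max_hole` unmatched lines in between.
def chain_matches(matches, max_hole=1, min_len=3):
    matches = set(matches)
    used, chains = set(), []
    for a, b in sorted(matches):
        if (a, b) in used:
            continue
        chain = [(a, b)]
        ca, cb = a, b
        while True:
            # try to extend, first contiguously, then across a hole
            for step in range(1, max_hole + 2):
                if (ca + step, cb + step) in matches:
                    ca, cb = ca + step, cb + step
                    chain.append((ca, cb))
                    break
            else:
                break
        used.update(chain)
        if len(chain) >= min_len:          # ignore too-short sequences
            chains.append((chain[0], chain[-1]))
    return chains

# lines 10-15 of A match lines 40-45 of B, except line 12/42 (a one-line hole)
pairs = {(10, 40), (11, 41), (13, 43), (14, 44), (15, 45)}
print(chain_matches(pairs))  # [((10, 40), (15, 45))]
```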

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.48 Abstraction —Abstracting selected syntactic elements can increase recall, at the possible cost of precision

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.49 Metrics-based detection strategy  Duplication is significant if: —It is the largest possible duplication chain uniting all exact clones that are close enough to each other. —The duplication is large enough.

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.50 Automated detection in practice  Wettel [MSc thesis, 2004] uses three thresholds: —Minimum clone length: the minimum number of lines present in a clone (e.g., 7) —Maximum line bias: the maximum number of lines in between two exact chunks (e.g., 2) —Minimum chunk size: the minimum number of lines of an exact chunk (e.g., 3) Mihai Balint, Tudor Gîrba and Radu Marinescu, “How Developers Copy,” ICPC 2006

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.51 Visualization of Duplicated Code  Visualization provides insights into the duplication situation —A simple version can be implemented in three days —Scalability issue  Dotplots — Technique from DNA Analysis — Code is put on vertical as well as horizontal axis — A match between two elements is a dot in the matrix
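A textual dotplot is easy to sketch (the code fragment is invented): the same normalized lines label both axes, and a mark appears wherever two lines match, so duplication shows up as marks off the main diagonal.

```python
# Sketch: a tiny textual dotplot, as used in DNA analysis and Duploc-style
# duplication browsers. Each row compares one line against all lines.
def dotplot(lines):
    norm = ["".join(l.split()) for l in lines]   # whitespace-normalized axes
    return ["".join("•" if a == b and a else "." for b in norm) for a in norm]

code = ["x = 1;", "y = f(x);", "z = 0;", "y = f(x);"]
for row in dotplot(code):
    print(row)
# the duplicated line appears as '•' marks off the main diagonal
```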

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.52 Visualization of Copied Code Sequences  Detected problem: file A contains two copies of a piece of code; file B contains another copy of this code  Possible solution: Extract Method  All examples are made with Duploc, from an industrial case study (a 1 million LOC C++ system)

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.53 Visualization of Repetitive Structures  Detected problem: 4 object factory clones; a switch statement over a type variable is used to call individual construction code  Possible solution: Strategy Method

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.54 Visualization of Cloned Classes  (Dotplot comparing class A, class B, and class A again)  Detected problem: class A is an edited copy of class B (editing and insertion)  Possible solution: subclassing …

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.55 Visualization of Clone Families  (Overview and detail views)  Classes implementing lists for different data types

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.56 Conclusion  Duplicated code is a real problem —makes a system progressively harder to change  Detecting duplicated code is a hard problem —some simple techniques can help —tool support is needed  Visualization of code duplication is useful —basic tool support is easy to build (e.g., 3 days with rapid-prototyping)  Curing duplicated code is an active research area

© Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.57 License  Attribution-ShareAlike 2.5  You are free: to copy, distribute, display, and perform the work; to make derivative works; to make commercial use of the work.  Under the following conditions:  Attribution. You must attribute the work in the manner specified by the author or licensor.  Share Alike. If you alter, transform, or build upon this work, you may distribute the resulting work only under a license identical to this one.  For any reuse or distribution, you must make clear to others the license terms of this work. Any of these conditions can be waived if you get permission from the copyright holder. Your fair use and other rights are in no way affected by the above.