
OORPT Object-Oriented Reengineering Patterns and Techniques 7. Problem Detection Prof. O. Nierstrasz.




1 OORPT Object-Oriented Reengineering Patterns and Techniques 7. Problem Detection Prof. O. Nierstrasz

2 © Stéphane Ducasse, Serge Demeyer, Oscar Nierstrasz OORPT — Problem Detection 7.2 Roadmap  Metrics  Object-Oriented Metrics in Practice  Duplicated Code

3 Roadmap
- Metrics
  - Software quality
  - Analyzing trends
- Object-Oriented Metrics in Practice
- Duplicated Code

4 Why Metrics in OO Reengineering (ii)?
- Assessing Software Quality
  - Which components have poor quality? (Hence could be reengineered)
  - Which components have good quality? (Hence should be reverse engineered)
  - Metrics as a reengineering tool!
- Controlling the Reengineering Process
  - Trend analysis: which components changed?
  - Which refactorings have been applied?
  - Metrics as a reverse engineering tool!

5 ISO 9126 Quantitative Quality Model
The model refines software quality in three layers: factor, characteristic, metric.
- Factors (ISO 9126): Functionality, Reliability, Efficiency, Usability, Maintainability, Portability
- Characteristics: Error tolerance, Accuracy, Simplicity, Modularity, Consistency
- Metrics (the leaves are simple metrics, measuring basic attributes):
  - defect density = #defects / size
  - correction impact = #components changed
  - correction time

6 Product & Process Attributes
- Product attribute: measures aspects of artifacts delivered to the customer.
  Examples: number of system defects perceived, time to learn the system.
- Process attribute: measures aspects of the process which produces a product.
  Examples: time to correct a defect, number of components changed per correction.

7 External & Internal Attributes
- External attribute: measures how the product or process behaves in its environment.
  Examples: mean time between failures, #components changed.
- Internal attribute: measured purely in terms of the product itself, separate from its behaviour in context.
  Examples: class coupling and cohesion, method size.

8 External vs. Internal Product Attributes
External attributes
- Advantage: close relationship with quality factors.
- Disadvantages: can be measured only after the product is used or the process took place; data collection is difficult and often involves human intervention or interpretation; relating an external effect to its internal cause is difficult.
Internal attributes
- Advantages: can be measured at any time; data collection is quite easy and can be automated; direct relationship between the measured attribute and its cause.
- Disadvantage: the relationship with quality factors is not empirically validated.

9 Metrics and Measurements
- Weyuker [1988] defined nine properties that a software metric should satisfy. (Read Fenton & Pfleeger for critiques.)
- For OO, only six properties are really interesting [Chidamber 94, Fenton & Pfleeger]:
  - Noncoarseness: given a class P and a metric m, another class Q can always be found such that m(P) ≠ m(Q). Not every class has the same value for a metric.
  - Nonuniqueness: there can exist distinct classes P and Q such that m(P) = m(Q). Two classes can have the same metric value.
  - Monotonicity: m(P) ≤ m(P+Q) and m(Q) ≤ m(P+Q), where P+Q is the "combination" of the classes P and Q.

10 Metrics and Measurements (ii)
  - Design details are important: the specifics of a class must influence the metric value. Even if two classes perform the same actions, the details of their design should have an impact on the metric value.
  - Nonequivalence of interaction: m(P) = m(Q) does not imply m(P+R) = m(Q+R), where R is an interaction with the class.
  - Interaction increases complexity: m(P) + m(Q) < m(P+Q) is possible. When two classes are combined, the interaction between the two can increase the metric value.
- Conclusion: not every measurement is a metric.

11 Selecting Metrics
- Fast
  - Scalable: you can't afford O(n²) when n ≈ 1 million LOC.
- Precise
  - E.g., #methods: do you count all methods, only public ones, also inherited ones?
  - Reliable: you want to compare apples with apples.
- Code-based
  - Scalable: you want to collect metrics several times.
  - Reliable: you want to avoid human interpretation.
- Simple
  - Complex metrics are hard to interpret.

12 Assessing Maintainability
- Size of the system and its entities
  - Class size, method size, inheritance
  - The intuition: large entities impede maintainability.
- Cohesion of the entities
  - Class internals
  - The intuition: changes should be local.
- Coupling between entities
  - Within inheritance: coupling between class and subclass
  - Outside of inheritance
  - The intuition: strong coupling impedes locality of changes.

13 Sample Size and Inheritance Metrics
The underlying model: classes with attributes and methods, related by BelongTo, Access, Invoke and Inherit relationships.
- Inheritance metrics: hierarchy nesting level (HNL), # immediate children (NOC), # inherited methods, unmodified (NMI), # overridden methods (NMO)
- Class size metrics: # methods (NOM), # instance/class attributes (NIA, NCA), sum of method sizes (WMC)
- Method size metrics: # invocations (NOI), # statements (NOS), # lines of code (LOC)

14 Sample Class Size Metrics
- [Lore94] Number of Instance Variables (NIV)
- [Lore94] Number of Class Variables (static) (NCV)
- [Lore94] Number of Methods (public, private, protected) (NOM)
- Lines of Code (LOC)
- [Li93] Number of Semicolons (NSC), i.e., the number of statements
- [Chid94] Weighted Method Count (WMC)
  - WMC = ∑ cᵢ, where cᵢ is the complexity of method i (e.g., its number of exit points, or its McCabe cyclomatic complexity)
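Metrics like NOM and WMC are easy to compute mechanically. The following sketch, which is illustrative and not one of the tools from the lecture, derives them for Python source using the standard ast module; the per-method complexity weighting (1 plus the number of branching statements, a rough McCabe approximation) is an assumption.

```python
import ast

def class_size_metrics(source):
    """Compute NOM and a crude WMC per class from Python source.

    WMC approximates each method's McCabe complexity as
    1 + number of branching statements (if/for/while/try).
    """
    metrics = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.ClassDef):
            methods = [n for n in node.body if isinstance(n, ast.FunctionDef)]
            wmc = 0
            for m in methods:
                complexity = 1  # one linear path through the method
                for sub in ast.walk(m):
                    if isinstance(sub, (ast.If, ast.For, ast.While, ast.Try)):
                        complexity += 1
                wmc += complexity
            metrics[node.name] = {"NOM": len(methods), "WMC": wmc}
    return metrics
```

A class with one branching method and one trivial method then yields NOM = 2 and WMC = 3 under this weighting.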

15 Hierarchy Layout
- [Chid94] Hierarchy Nesting Level (HNL), [Li93] Depth of Inheritance Tree (DIT)
  - HNL, DIT = maximal hierarchy level of the class
- [Chid94] Number of Children (NOC)
- Total Number of Children (WNOC)
- [Lore94] Number of Methods Overridden, Added, Inherited, Extended (super call) (NMO, NMA, NMI, NME)
- [Lore94] Specialization Index (SIX)
  - SIX(C) = NMO * HNL / NOM
  - A weighted percentage of overridden methods
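On a live object model, these inheritance metrics can be read directly off the class graph. A minimal Python sketch, illustrative only: DIT/HNL as the longest chain of base classes up to, but excluding, `object`, and NOC as the number of immediate subclasses.

```python
def inheritance_metrics(cls):
    """DIT/HNL: length of the longest base-class chain above cls
    (object excluded). NOC: number of immediate children."""
    def depth(c):
        bases = [b for b in c.__bases__ if b is not object]
        return 0 if not bases else 1 + max(depth(b) for b in bases)
    return {"DIT": depth(cls), "NOC": len(cls.__subclasses__())}
```

For a root class with two direct subclasses, DIT is 0 and NOC is 2; a grandchild class has DIT 2 and NOC 0.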

16 Method Size
- Number of Message Sends (MSG)
- Lines of Code (LOC)
- Method Complexity (MCX)
  - total complexity of all methods / total number of methods
  - e.g., API call = 5, assignment = 0.5, arithmetic operation = 2, message with parameters = 3, ...

17 Sample Metrics: Class Cohesion
- Lack of Cohesion in Methods (LCOM)
  - [Chidamber 94] for the definition, [Hitz 95] for a critique
  - Let Iᵢ be the set of instance variables used by method Mᵢ, and let
    P = { (Iᵢ, Iⱼ) | Iᵢ ∩ Iⱼ = ∅ } and Q = { (Iᵢ, Iⱼ) | Iᵢ ∩ Iⱼ ≠ ∅ }
    (if all the sets Iᵢ are empty, P is taken to be empty).
    LCOM = |P| − |Q| if |P| > |Q|, and 0 otherwise.
- Tight Class Cohesion (TCC) and Loose Class Cohesion (LCC)
  - [Bieman 95] for the definition
  - Measure method cohesion across invocations.
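The LCOM formula above can be sketched directly: given, for each method, the set of instance variables it uses, count disjoint versus overlapping pairs. A minimal illustration; the convention that pairs of methods using no instance variables count for neither P nor Q follows the remark on the slide.

```python
from itertools import combinations

def lcom(method_vars):
    """LCOM [Chidamber 94]: P = pairs of methods whose instance-variable
    sets are disjoint, Q = pairs sharing at least one variable.
    Returns |P| - |Q| if |P| > |Q|, else 0."""
    p = q = 0
    for a, b in combinations(method_vars.values(), 2):
        if not (a or b):
            continue  # two methods using no instance variables: count neither
        if set(a) & set(b):
            q += 1
        else:
            p += 1
    return max(p - q, 0)
```

For example, three methods using {x}, {x, y} and {z} give one overlapping pair and two disjoint pairs, so LCOM = 2 − 1 = 1; a class whose methods all touch the same variable gets 0.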

18 Sample Metrics: Class Coupling (i)
- Coupling Between Objects (CBO)
  - [Chidamber 94a] for the definition, [Hitz 95a] for a discussion
  - Number of other classes to which a class is coupled
- Data Abstraction Coupling (DAC)
  - [Li 93] for the definition
  - Number of ADTs defined in a class
- Change Dependency Between Classes (CDBC)
  - [Hitz 96a] for the definition
  - Impact of changes from a server class (SC) on a client class (CC)

19 Sample Metrics: Class Coupling (ii)
- Locality of Data (LD)
  - [Hitz 96] for the definition
  - LD = ∑ |Lᵢ| / ∑ |Tᵢ|, where
    Lᵢ = non-public instance variables + inherited protected variables of the superclass + static variables of the class,
    Tᵢ = all variables used in Mᵢ, except non-static local variables,
    and the Mᵢ are the methods of the class, accessors excluded.

20 The Trouble with Coupling and Cohesion
- Coupling and cohesion are intuitive notions
  - Cf. "computability"
  - E.g., is a library of mathematical functions "cohesive"?
  - E.g., is a package of classes that subclass framework classes cohesive? Is it strongly coupled to the framework package?

21 Conclusion: Metrics for Quality Assessment
- Can internal product metrics reveal which components have good or poor quality? Yes, but...
  - Not reliable: false positives ("bad" measurements, yet good quality) and false negatives ("good" measurements, yet poor quality)
  - Heavyweight approach: requires the team to develop (customize?) a quantitative quality model, and requires thresholds to be defined by trial and error
  - Difficult to interpret: requires complex combinations of simple metrics
- However...
  - Cheap once you have the quality model and the thresholds
  - Good focus (± 20% of the components are selected for further inspection)
- Note: focus on the most complex components first!

22 Roadmap
- Metrics
- Object-Oriented Metrics in Practice
  - Detection strategies, filters and composition
  - Sample detection strategies: God Class, ...
- Duplicated Code
Michele Lanza and Radu Marinescu, Object-Oriented Metrics in Practice, Springer-Verlag, 2006

23 Detection Strategy
- A detection strategy is a metrics-based predicate to identify candidate software artifacts that conform to (or violate) a particular design rule.

24 Filters and Composition
- A data filter is a predicate used to focus attention on a subset of interest of a larger data set.
  - Statistical filters: e.g., the top and bottom 25% are considered outliers
  - Other relative thresholds: e.g., other percentages to identify outliers (top 10%)
  - Absolute thresholds: fixed criteria, independent of the data set
- A useful detection strategy can often be expressed as a composition of data filters.
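The composition of data filters can be sketched as follows; this is an illustrative design, not a specific tool's API. Each filter maps a metric dictionary to the set of entities it retains, and a detection strategy intersects the results.

```python
def relative_top(pct):
    """Relative threshold: keep the entities in the top pct% for a metric."""
    def apply(metric):
        ranked = sorted(metric, key=metric.get, reverse=True)
        return set(ranked[:max(1, round(len(ranked) * pct / 100))])
    return apply

def absolute(threshold):
    """Absolute threshold: keep entities whose value exceeds a fixed limit."""
    def apply(metric):
        return {name for name, v in metric.items() if v > threshold}
    return apply

def detect(*filter_metric_pairs):
    """A detection strategy as the intersection of several data filters,
    each applied to its own metric."""
    return set.intersection(*(flt(metric) for flt, metric in filter_metric_pairs))
```

Intersecting, say, the top 25% of classes by WMC with the classes above an absolute LOC limit retains only the extreme candidates for inspection.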

25 God Class
- A God Class centralizes intelligence in the system
  - Impacts understandability
  - Increases system fragility
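In Object-Oriented Metrics in Practice, Lanza and Marinescu express God Class as a composed filter over three metrics: high functional complexity (WMC), heavy access to foreign data (ATFD) and low internal cohesion (TCC). A hedged sketch; the numeric threshold values used here are illustrative assumptions, not the book's calibrated values.

```python
def is_god_class(wmc, atfd, tcc, *, very_high=47, few=5, one_third=1 / 3):
    """God Class detection strategy in the style of Lanza & Marinescu:
    the class centralizes intelligence (WMC at least VERY HIGH), uses
    data of other classes directly (ATFD above FEW) and is internally
    incohesive (TCC below one third). Thresholds are illustrative."""
    return wmc >= very_high and atfd > few and tcc < one_third
```

A class like ArgoUML's ModelFacade, with hundreds of methods and very low cohesion, would trip all three filters.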

26 ModelFacade (ArgoUML)
- 453 methods
- 114 attributes
- over 3,500 LOC
- all methods and all attributes are static

27 Feature Envy
- Methods that are more interested in the data of other classes than in that of their own class [Fowler et al. 99]

28 ClassDiagramLayouter

29 Data Class
- A Data Class provides data to other classes but little or no functionality of its own.

30 Data Class (2)

31 Property

32 Shotgun Surgery
- A change in one operation implies many (small) changes to a lot of different operations and classes.

33 Project

34 Roadmap
- Metrics
- Object-Oriented Metrics in Practice
- Duplicated Code
  - Detection techniques
  - Visualizing duplicated code

35 Code is Copied
- A small example from the Mozilla distribution (Milestone 9), extracted from /dom/src/base/nsLocation.cpp

36 How Much Code is Duplicated?
Case study       LOC      Duplication (without comments / with comments)
gcc              460,000  8.7% / 5.6%
Database server  245,000  36.4% / 23.3%
Payroll          40,000   59.3% / 25.4%
Message board    6,500    29.4% / 17.4%
Usual estimates: 8 to 12% in normal industrial code; 15 to 25% is already a lot!

37 What is Duplicated Code?
- Duplicated code = source code segments that are found in different places of a system:
  - in different files
  - in the same file but in different functions
  - in the same function
- The segments must contain some logic or structure that can be abstracted.
- Copied artifacts range from expressions, to functions, to data structures, to entire subsystems.

38 Copied Code Problems
- General negative effect: code bloat
- Negative effects on software maintenance:
  - copied defects
  - changes take double, triple, quadruple, ... work
  - dead code
  - added cognitive load for future maintainers
- Copying as an additional source of defects: errors in the systematic renaming produce unintended aliasing
- Metaphorically speaking: software aging, "hardening of the arteries"; "software entropy" increases, and even small design changes become very difficult to effect

39 Code Duplication Detection
- A nontrivial problem: there is no a priori knowledge about which code has been copied.
- How to find all clone pairs among all possible pairs of segments?

40 General Schema of the Detection Process
Author           Level        Transformed code        Comparison technique
Johnson 94       Lexical      Substrings              String matching
Ducasse 99       Lexical      Normalized strings      String matching
Baker 95         Syntactical  Parameterized strings   String matching
Mayrand 96       Syntactical  Metric tuples           Discrete comparison
Kontogiannis 97  Syntactical  Metric tuples           Euclidean distance
Baxter 98        Syntactical  ASTs                    Tree matching

41 Recall and Precision

42 Simple Detection Approach (i)
- Assumption: code segments are just copied and changed at a few places.
- Noise elimination transformation:
  - remove white space and comments
  - remove lines that contain uninteresting code elements (e.g., just 'else' or '}')
Example: the slide shows the following segment twice (a clone pair):
  ... // assign same fastid as container
  fastid = NULL;
  const char* fidptr = get_fastid();
  if (fidptr != NULL) {
    int l = strlen(fidptr);
    fastid = new char[l + 1];
and, next to each copy, its normalized form:
  fastid=NULL;
  constchar*fidptr=get_fastid();
  if(fidptr!=NULL)
  intl=strlen(fidptr)
  fastid=newchar[l+1]

43 Simple Detection Approach (ii)
- Code comparison step:
  - line-based comparison (assumption: the layout did not change during copying)
  - compare each line with each other line
  - reduce the search space by hashing:
    - preprocessing: compute the hash value for each line
    - actual comparison: compare all lines in the same hash bucket
- Evaluation of the approach:
  - advantages: simple, language independent
  - disadvantage: difficult interpretation
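The hashing idea can be sketched in a few lines of Python (an illustration; the lecture's actual tooling is the Perl script shown next). Each normalized line serves as its own hash key, and any bucket holding more than one location is a clone candidate.

```python
from collections import defaultdict

def normalize(line):
    """Noise elimination: drop end-of-line //-comments, all white space,
    and lines containing only uninteresting elements."""
    stripped = "".join(line.split("//")[0].split())
    return None if stripped in ("", "{", "}", "};", "else") else stripped

def clone_candidates(files):
    """Line-based comparison with hashing: bucket every normalized line,
    then report the buckets holding more than one location."""
    buckets = defaultdict(list)
    for fname, text in files.items():
        for lineno, line in enumerate(text.splitlines(), 1):
            norm = normalize(line)
            if norm is not None:
                buckets[norm].append((fname, lineno))
    return {line: locs for line, locs in buckets.items() if len(locs) > 1}
```

Two files containing `int l = strlen(fidptr);` with different spacing and comments end up in the same bucket and are reported as a clone pair, while braces and bare `else` lines are filtered out as noise.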

44 A Perl Script for C++ (i)

45 A Perl Script for C++ (ii)
- Handles multiple files
- Removes comments and white space
- Controls noise (if, {, ...)
- Granularity (number of lines)
- Possible to remove keywords

46 Output Sample
Lines = duplicated lines; Locations = file names and line numbers.

Lines:
  create_property(pd,pnImplObjects,stReference,false,*iImplObjects);
  create_property(pd,pnElttype,stReference,true,*iEltType);
  create_property(pd,pnMinelt,stInteger,true,*iMinelt);
  create_property(pd,pnMaxelt,stInteger,true,*iMaxelt);
  create_property(pd,pnOwnership,stBool,true,*iOwnership);
Locations: 6178/6179/6180/6181/6182 and 6198/6199/6200/6201/6202

Lines:
  create_property(pd,pnSupertype,stReference,true,*iSupertype);
  create_property(pd,pnImplObjects,stReference,false,*iImplObjects);
  create_property(pd,pnElttype,stReference,true,*iEltType);
  create_property(pd,pMinelt,stInteger,true,*iMinelt);
  create_property(pd,pnMaxelt,stInteger,true,*iMaxelt);
Locations: 6177/6178 6229/6230

47 Enhanced Simple Detection Approach
- Code comparison step: as before, but now
  - collect consecutive matching lines into match sequences
  - allow holes in the match sequences
- Evaluation of the approach:
  - advantages: identifies more real duplication; language independent
  - disadvantages: less simple; misses copies with (small) changes on every line

48 Abstraction
- Abstracting selected syntactic elements can increase recall, at the possible cost of precision.

49 Metrics-Based Detection Strategy
- Duplication is significant if:
  - it is the largest possible duplication chain uniting all exact clones that are close enough to each other, and
  - the duplication is large enough.

50 Automated Detection in Practice
- Wettel [MSc thesis, 2004] uses three thresholds:
  - minimum clone length: the minimum number of lines in a clone (e.g., 7)
  - maximum line bias: the maximum number of lines between two exact chunks (e.g., 2)
  - minimum chunk size: the minimum number of lines of an exact chunk (e.g., 3)
Mihai Balint, Tudor Gîrba and Radu Marinescu, "How Developers Copy," ICPC 2006
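The three thresholds can be sketched as a chaining step over the exact matches; this is an illustrative reconstruction, not Wettel's code. Given the sorted line offsets along one comparison diagonal where two fragments match exactly, build exact chunks, unite chunks whose gap stays within the line bias, and keep only chains that are long enough.

```python
def chain_matches(matched, min_clone_length=7, max_line_bias=2, min_chunk_size=3):
    """Chain exact line matches into clone candidates, using the three
    thresholds from the slide. `matched` is a sorted list of line numbers
    where two fragments agree exactly."""
    # 1. Split the matched line numbers into exact chunks (consecutive runs).
    chunks, run = [], [matched[0]]
    for line in matched[1:]:
        if line == run[-1] + 1:
            run.append(line)
        else:
            chunks.append(run)
            run = [line]
    chunks.append(run)
    chunks = [c for c in chunks if len(c) >= min_chunk_size]
    if not chunks:
        return []
    # 2. Unite chunks separated by at most max_line_bias non-matching lines.
    clones, current = [], list(chunks[0])
    for c in chunks[1:]:
        if c[0] - current[-1] - 1 <= max_line_bias:
            current += c
        else:
            clones.append(current)
            current = list(c)
    clones.append(current)
    # 3. Keep only clones containing enough matched lines.
    return [(c[0], c[-1]) for c in clones if len(c) >= min_clone_length]
```

With the example thresholds, matches at lines 1-4 and 6-8 (gap of one line) are united into a single 7-line clone, while an isolated 3-line chunk further away is discarded as too short.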

51 Visualization of Duplicated Code
- Visualization provides insights into the duplication situation
  - a simple version can be implemented in three days
  - scalability is an issue
- Dotplots
  - a technique from DNA analysis
  - the code is put on the vertical as well as the horizontal axis
  - a match between two elements is a dot in the matrix
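A dotplot is easy to sketch: put the (normalized) lines on both axes and mark the matches. This toy version renders the boolean matrix as text; duplicated sequences show up as diagonals off the main diagonal.

```python
def dotplot(lines_a, lines_b):
    """Boolean match matrix: a dot wherever two normalized lines coincide."""
    norm = lambda s: "".join(s.split())
    a, b = [norm(l) for l in lines_a], [norm(l) for l in lines_b]
    return [[x == y and x != "" for y in b] for x in a]

def render(matrix):
    """Text rendering of the dotplot: '#' for a match, '.' otherwise."""
    return "\n".join("".join("#" if d else "." for d in row) for row in matrix)
```

Plotting a fragment against itself always yields the main diagonal; a repeated line such as `a = 1;` at positions 1 and 3 adds symmetric off-diagonal dots.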

52 Visualization of Copied Code Sequences
- Detected problem: file A contains two copies of a piece of code; file B contains another copy of this code.
- Possible solution: Extract Method.
All examples are made using Duploc, from an industrial case study (a 1 million LOC C++ system).

53 Visualization of Repetitive Structures
- Detected problem: 4 object-factory clones; a switch statement over a type variable is used to call individual construction code.
- Possible solution: Strategy Method.

54 Visualization of Cloned Classes
- Detected problem: class A is an edited copy of class B (editing and insertion).
- Possible solution: subclassing, ...

55 Visualization of Clone Families
- Overview and detail: 20 classes implementing lists for different data types.

56 Conclusion
- Duplicated code is a real problem
  - it makes a system progressively harder to change
- Detecting duplicated code is a hard problem
  - some simple techniques can help
  - tool support is needed
- Visualization of code duplication is useful
  - basic tool support is easy to build (e.g., 3 days with rapid prototyping)
- Curing duplicated code is an active research area

57 License
> http://creativecommons.org/licenses/by-sa/2.5/
Attribution-ShareAlike 2.5
You are free:
- to copy, distribute, display, and perform the work
- to make derivative works
- to make commercial use of the work
Under the following conditions:
- Attribution. You must attribute the work in the manner specified by the author or licensor.
- Share Alike. If you alter, transform, or build upon this work, you may distribute the resulting work only under a license identical to this one.
For any reuse or distribution, you must make clear to others the license terms of this work. Any of these conditions can be waived if you get permission from the copyright holder. Your fair use and other rights are in no way affected by the above.




