1. Complexity metrics
- measure certain aspects of the software (lines of code, # of if-statements, depth of nesting, …)
- use these numbers as a criterion to assess a design, or to guide the design
- interpretation: higher value -> higher complexity -> more effort required (= worse design)
- two kinds:
  - intra-modular: inside one module
  - inter-modular: between modules
2. Size-based complexity measures
- counting lines of code (intra-modular)
- differences in scale for different programming languages
- Halstead's metrics: counting operators and operands
3. Halstead's metrics
- n1: number of unique operators
- n2: number of unique operands
- N1: total number of operators
- N2: total number of operands
4. Example program
public static void sort(int x[]) {
  for (int i = 0; i < x.length - 1; i++) {
    for (int j = i + 1; j < x.length; j++) {
      if (x[i] > x[j]) {
        int save = x[i];
        x[i] = x[j];
        x[j] = save;
      }
    }
  }
}
(callouts on the slide mark individual tokens, e.g. one operator with 1 occurrence and one operator with 2 occurrences)
5. Operator counts for sort()
operator   # of occurrences
public     1
sort()     …
int        4
[]         7
{}         4
for        2
{;;}       …
if         1
()         …
=          5
<          2
…          …
n1 = 17    N1 = 39
6. Example program (operands)
public static void sort(int x[]) {
  for (int i = 0; i < x.length - 1; i++) {
    for (int j = i + 1; j < x.length; j++) {
      if (x[i] > x[j]) {
        int save = x[i];
        x[i] = x[j];
        x[j] = save;
      }
    }
  }
}
(a callout on the slide marks one token as an operand with 2 occurrences)
7. Operand counts for sort()
operand   # of occurrences
x          9
length     2
i          7
j          6
save       2
0          1
1          2
n2 = 7     N2 = 29
8. Metrics
- size of vocabulary: n = n1 + n2
- program length: N = N1 + N2
- volume: V = N log2 n
- level of abstraction: L = V*/V, where V* is the volume of the most compact form (the function prototype); for sort(x), V* = 3 log2 3; for main(), L is high
- approximation: L' = (2/n1)(n2/N2)
- programming effort: E = V/L
- estimated programming time: T' = E/18
- estimate of N: N' = n1 log2 n1 + n2 log2 n2
- for this example: N = 68, N' = 89, L = .015, L' = .028
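A minimal Java sketch that plugs the counts from the tables above (n1 = 17, n2 = 7, N1 = 39, N2 = 29) into these formulas; the class and helper names are invented for illustration, and the effort figure uses the approximation L' in place of L since V* is not computed here.

// Hypothetical helper for the sort() example; it reproduces the slide's
// numbers (N = 68, N' = 89, L' = .028).
public class HalsteadExample {
    static double log2(double x) { return Math.log(x) / Math.log(2); }

    public static void main(String[] args) {
        int n1 = 17, n2 = 7;    // unique operators / operands
        int N1 = 39, N2 = 29;   // total occurrences of operators / operands

        int n = n1 + n2;                                    // vocabulary size
        int N = N1 + N2;                                    // program length
        double V = N * log2(n);                             // volume
        double Lapprox = (2.0 / n1) * ((double) n2 / N2);   // L', the approximation of L
        double E = V / Lapprox;                             // programming effort (using L')
        double T = E / 18;                                  // estimated time, in seconds
        double Nest = n1 * log2(n1) + n2 * log2(n2);        // estimated length N'

        System.out.printf("n=%d N=%d V=%.1f L'=%.3f E=%.0f T'=%.0fs N'=%.0f%n",
                n, N, V, Lapprox, E, T, Nest);
    }
}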
9. Remember…
- metrics are only estimates; they give some idea of the complexity
- critique:
  - different definitions of "operand" and "operator" are in use
  - the explanations are not convincing (empirical; may not apply to other projects)
  - the source code must exist
10. Other metrics (structure-based)
- use control structures, data structures, or both
- example of a measure based on data structures: the average number of instructions between successive references to a variable
- the best-known measure is based on the control structure: McCabe's cyclomatic complexity
11. Example program (control-flow graph)
public static void sort(int x[]) {
  for (int i = 0; i < x.length - 1; i++) {
    for (int j = i + 1; j < x.length; j++) {
      if (x[i] > x[j]) {
        int save = x[i];
        x[i] = x[j];
        x[j] = save;
      }
    }
  }
}
(the slide overlays the control-flow graph of sort(), with nodes numbered 1 through 11)
12. Cyclomatic complexity
- e = number of edges (13)
- n = number of nodes (11)
- p = number of connected components (1)
- CV = e - n + p + 1 = 13 - 11 + 1 + 1 = 4
(computed on the control-flow graph of the previous slide)
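A small sketch of this computation, under the assumption that the edge, node, and component counts have already been read off the control-flow graph; the class name is invented.

// Cyclomatic complexity from the graph counts on the slide: CV = e - n + p + 1.
public class CyclomaticExample {
    static int cyclomatic(int edges, int nodes, int components) {
        return edges - nodes + components + 1;
    }

    public static void main(String[] args) {
        // sort(): 13 edges, 11 nodes, 1 connected component -> CV = 4
        System.out.println(cyclomatic(13, 11, 1));
        // cross-check: sort() has 3 decision points (two for-loops, one if), and 3 + 1 = 4
    }
}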
13. Intra-modular complexity measures: summary
- for small programs, the various measures correlate well with programming time
- however, a simple length measure such as LOC does equally well
- complexity measures are not very context sensitive
- complexity measures take into account few aspects
- it might help to look at the complexity density instead
14. System structure: inter-module complexity measures
- dependencies between modules: draw a graph
  - modules = nodes
  - edges may denote several relations between modules, most often: A uses B (e.g. procedure calls)
15. The uses relation: the call graph
- chaos (general directed graph)
- hierarchy (acyclic graph)
- strict hierarchy (layers)
- tree
16. In a picture
(diagrams of a chaotic call graph, a strict hierarchy, and a tree)
17. Measurements
- size: # nodes, # edges
- width
- height
18. Deviation from a tree
(figure: a strict hierarchy next to the tree it deviates from)
19. Tree impurity metric
- a complete graph with n nodes has e = n(n-1)/2 edges
- a tree with n nodes has e = n-1 edges
- tree impurity for a graph with n nodes and e edges: m(G) = 2(e - n + 1) / ((n - 1)(n - 2))
- m(G) is 0 for a tree and 1 for a complete graph
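A sketch of the tree impurity formula, assuming the module graph is given only by its node and edge counts; the class and method names are made up.

// Tree impurity m(G) = 2(e - n + 1) / ((n - 1)(n - 2)): 0 for a tree, 1 for a complete graph.
public class TreeImpurity {
    static double impurity(int n, int e) {
        if (n < 3) return 0.0;                            // formula is defined for n >= 3
        return 2.0 * (e - n + 1) / ((n - 1) * (n - 2));
    }

    public static void main(String[] args) {
        System.out.println(impurity(5, 4));   // a tree on 5 nodes: 0.0
        System.out.println(impurity(5, 10));  // the complete graph K5: 1.0
        System.out.println(impurity(5, 6));   // two edges beyond a spanning tree: ~0.33
    }
}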
20. Any tree impurity metric should satisfy:
- m(G) = 0 if and only if G is a tree
- m(G1) > m(G2) if G1 = G2 + an extra edge
- if G1 and G2 have the same number of "extra" edges w.r.t. their spanning trees, and G1 has more nodes than G2, then m(G1) < m(G2)
- m(G) <= m(Kn) = 1, where G has n nodes and Kn is the (undirected) complete graph on n nodes
21. Information flow metric
- tree impurity metrics only consider the number of edges, not their "thickness"
- Shepperd's metric:
  - there is a local flow from A to B if:
    - A invokes B and passes it a parameter
    - B invokes A and A returns a value
  - there is a global flow from A to B if A updates some global structure and B reads that structure
22. Shepperd's metric
- fan-in(M) = number of (local and global) flows whose sink is M
- fan-out(M) = number of (local and global) flows whose source is M
- complexity(M) = (fan-in(M) * fan-out(M))^2
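A minimal sketch of Shepperd's formula, assuming the local and global flows have already been tallied per module; the module names and flow counts are invented for illustration.

import java.util.Map;

// complexity(M) = (fan-in(M) * fan-out(M))^2, applied to some hypothetical flow counts.
public class InformationFlow {
    static long complexity(int fanIn, int fanOut) {
        long product = (long) fanIn * fanOut;
        return product * product;
    }

    public static void main(String[] args) {
        // module -> {fan-in, fan-out}; the counts are made up
        Map<String, int[]> flows = Map.of(
                "Parser",      new int[]{3, 2},
                "SymbolTable", new int[]{5, 5},
                "Report",      new int[]{1, 0});
        flows.forEach((module, f) ->
                System.out.println(module + ": " + complexity(f[0], f[1])));
    }
}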
23. Point to ponder: What does this program do?
procedure X(A: array [1..n] of int);
var i, k, small: int;
begin
  for i := 1 to n do
    small := A[i];
    for k := i to n-1 do
      if small <= A[k] then
        swap(A[k], A[k+1])
end
24. Object-oriented metrics
- WMC: Weighted Methods per Class
- DIT: Depth of Inheritance Tree
- NOC: Number Of Children
- CBO: Coupling Between Object Classes
- RFC: Response For a Class
- LCOM: Lack of COhesion in Methods
25. Weighted Methods per Class
- a measure for the size of a class
- WMC = Σ c(i), i = 1, …, n (n = number of methods)
- c(i) = complexity of method i
- mostly, c(i) = 1
26. Depth of Class in Inheritance Tree
- DIT = distance of the class to the root of its inheritance tree
- good: a forest of classes of medium height
27. Number Of Children
- NOC counts the immediate descendants in the inheritance tree
- higher values of NOC are considered bad:
  - possibly an improper abstraction of the parent class
  - also suggests that the class is to be used in a variety of settings
28. Coupling Between Object Classes
- two classes are coupled if a method of one class uses a method or state variable of another class
- CBO = count of all classes a given class is coupled with
- high values: something is wrong
- all couplings are counted alike; refinements are possible
29. Response For a Class
- RFC = size of the "response set" of a class
- response set = {the methods M1, M2, … of the class} ∪ {the set R1 of methods called by M1} ∪ {R2} ∪ …
(the slide illustrates this with methods M1, M2, M3 and the set R1 of methods called by M1)
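A worked example on a hypothetical class (the class and its methods are invented): the response set is the class's own methods plus every method they call.

// RFC example: own methods {deposit, withdraw, balance, log} plus the called
// method System.out.println -> response set size RFC = 5.
class Account {
    private double amount;

    void deposit(double d)  { amount += d; log("deposit");  }   // calls log
    void withdraw(double d) { amount -= d; log("withdraw"); }   // calls log
    double balance()        { return amount; }                  // calls nothing

    private void log(String op) { System.out.println(op); }     // calls println
}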
30. Lack of Cohesion in Methods
- cohesion = the glue that keeps the module (class) together, e.g. all methods use the same set of state variables
- if some methods use one subset of the state variables and others use another subset, the class lacks cohesion
- LCOM = number of disjoint sets of methods in a class; two methods in the same set share at least one state variable
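A sketch that computes LCOM as defined above, assuming each method's set of state variables is already known; the method and variable names are invented. Methods end up in the same group when they are linked, directly or transitively, by a shared state variable.

import java.util.*;

// LCOM = number of disjoint groups of methods; two methods belong to the same
// group if they share at least one state variable (directly or transitively).
public class LcomExample {
    static int lcom(Map<String, Set<String>> methodUsesVars) {
        List<Set<String>> groups = new ArrayList<>();     // each group is a set of variables
        for (Set<String> vars : methodUsesVars.values()) {
            Set<String> merged = new HashSet<>(vars);
            // absorb every existing group that shares a variable with this method
            groups.removeIf(group -> {
                if (Collections.disjoint(group, merged)) return false;
                merged.addAll(group);
                return true;
            });
            groups.add(merged);
        }
        return groups.size();
    }

    public static void main(String[] args) {
        Map<String, Set<String>> uses = new LinkedHashMap<>();
        uses.put("deposit",  Set.of("amount"));
        uses.put("withdraw", Set.of("amount"));
        uses.put("setOwner", Set.of("owner"));
        uses.put("getOwner", Set.of("owner"));
        System.out.println(lcom(uses));   // 2: {deposit, withdraw} and {setOwner, getOwner}
    }
}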
31. OO metrics
- WMC, CBO, RFC, and LCOM are the most useful
- they predict fault proneness during design
- they show a strong relationship to maintenance effort
- many OO metrics correlate strongly with size