Software Engineering
Lecture 19: Object-Oriented Testing & Technical Metrics

Today's Topics
- Evaluating OOA and OOD Models
- Unit, Class & Integration Testing
- OO Design Metrics
- Class-Oriented Metrics
- Operation-Oriented Metrics
- Testing Metrics
- Project Metrics

O-O Programs are Different
- High degree of reuse: does this mean more, or less, testing?
- Unit testing vs. class testing: what is the right "unit" in OO testing?
- Review of analysis & design: classes appear early, so defects can be recognized early as well

Testing OOA and OOD Models
- Correctness (of each model element)
  - Syntactic (notation, conventions): review by modeling experts
  - Semantic (conforms to the real problem): review by domain experts
- Consistency (of each class)
  - Revisit the CRC cards & class diagram
  - Trace delegated responsibilities
  - Examine / adjust cohesion of responsibilities

Model Testing [2]
- Evaluating the design
  - Compare the behavioral model to the class model
  - Compare the behavioral & class models to the use cases
  - Inspect the detailed design for each class (algorithms & data structures)

Unit Testing
- What is a "unit"?
  - Traditional: a single operation
  - O-O: encapsulated data & operations
- Smallest testable unit = the class (many operations)
- Inheritance: testing "in isolation" is impossible; operations must be tested every place they are used

Testing under Inheritance
- Class diagram (not reproduced): Shape defines move(); its subclasses Circle, Square, and Ellipse each define resize()
- Q: What if the implementation of resize() in each subclass calls the inherited operation move()?
- A: Shape cannot be completely tested unless we also test Circle, Square, & Ellipse! (A sketch follows below.)
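
A minimal sketch of the slide's scenario, with invented fields and resize logic: move() is defined once in Shape but is exercised from each subclass's resize(), so it must be retested in every subclass context.

```java
// Hypothetical hierarchy: resize() in each subclass calls the inherited move().
abstract class Shape {
    protected int x, y;
    void move(int dx, int dy) { x += dx; y += dy; }   // defined once, inherited by all
}

class Circle extends Shape {
    private int radius;
    void resize(int factor) {
        radius *= factor;
        move(radius, radius);        // one usage context for the inherited move()
    }
}

class Square extends Shape {
    private int side;
    void resize(int factor) {
        side *= factor;
        move(side / 2, side / 2);    // a different usage context for move()
    }
}
```

Tests that pass for Shape.move() in isolation say nothing about these call sites, which is why Circle, Square, and Ellipse must be tested as well.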

Integration Testing
- O-O integration is not hierarchical
  - Coupling is not via subroutine calls
  - "Top-down" and "bottom-up" have little meaning
- Integrating one operation at a time is difficult
  - Indirect interactions among operations

O-O Integration Testing
- Thread-based testing
  - Integrate the set of classes required to respond to one input or event
  - Integrate one thread at a time
- Example: the event-dispatching thread vs. event handlers in Java
  - Implement & test all GUI events first
  - Add event handlers one at a time (a sketch follows below)
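
A hypothetical sketch of thread-based integration in Swing: only the classes needed to respond to one event (a "Save" click) are wired together and driven; Document, SaveHandler, and SaveThreadDriver are assumed names, not part of the slide.

```java
import javax.swing.JButton;
import javax.swing.SwingUtilities;

// Lower-layer classes for a single thread of behavior (names are assumed).
class Document {
    private boolean saved;
    void markSaved()  { saved = true; }
    boolean isSaved() { return saved; }
}

class SaveHandler {
    private final Document doc;
    SaveHandler(Document doc) { this.doc = doc; }
    void onSave()             { doc.markSaved(); }
}

public class SaveThreadDriver {
    public static void main(String[] args) {
        // Run on the event-dispatching thread, as Swing requires.
        SwingUtilities.invokeLater(() -> {
            Document doc = new Document();
            JButton save = new JButton("Save");
            save.addActionListener(e -> new SaveHandler(doc).onSave());

            save.doClick();                                  // drive the one integrated thread
            System.out.println("saved = " + doc.isSaved());  // expected: saved = true
        });
    }
}
```

Once this thread passes, the next handler (Open, Print, ...) is added and exercised in the same way.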

O-O Integration [2]
- Use-based testing
  - Implement & test independent classes first
  - Then implement dependent classes (layer by layer, or cluster-based)
  - Simple driver classes or methods are sometimes required to test the lower layers (a sketch follows below)
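
A sketch of the "simple driver class" idea under use-based testing; Account and the amounts are invented for illustration.

```java
// Independent (lower-layer) class, implemented and tested first.
class Account {
    private long balanceCents;
    void deposit(long cents)  { balanceCents += cents; }
    void withdraw(long cents) { balanceCents -= cents; }
    long balance()            { return balanceCents; }
}

// Throwaway driver: exercises Account before any dependent classes
// (statements, reports, UI) are integrated on top of it.
public class AccountDriver {
    public static void main(String[] args) {
        Account a = new Account();
        a.deposit(2_500);
        a.withdraw(1_000);
        System.out.println(a.balance() == 1_500 ? "PASS" : "FAIL");
    }
}
```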

Validation Testing
- Details of objects are not visible
- Focus on user-observable input and output
- Methods:
  - Use the use cases to derive tests (both manual & automatic)
  - Black-box testing for the automatic tests

Test Case Design
- Focus: "designing sequences of operations to exercise the states of a class instance"
- Challenge: observability. Do we have methods that allow us to inspect the inner state of an object? (A sketch follows below.)
- Challenge: inheritance. Can test cases for a superclass be used to test a subclass?
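
One way to address the observability challenge, sketched with an invented BoundedStack: state-inspection methods let a test confirm which state a sequence of operations actually reached.

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class BoundedStack {
    private final Deque<Integer> items = new ArrayDeque<>();
    private final int capacity;

    public BoundedStack(int capacity) { this.capacity = capacity; }

    public void push(int v) {
        if (isFull()) throw new IllegalStateException("stack is full");
        items.push(v);
    }

    public int pop() { return items.pop(); }

    // State-inspection methods provided for testability: a test can drive a
    // sequence of push/pop calls and then check the resulting state directly.
    public boolean isEmpty() { return items.isEmpty(); }
    public boolean isFull()  { return items.size() == capacity; }
    public int size()        { return items.size(); }
}
```

A test sequence such as push, push, pop can then assert size() == 1 instead of guessing at hidden state.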

Test Case Checklist [Berard '93]
- Identify unique tests & associate each with a particular class
- Describe the purpose of the test
- Develop a list of testing steps:
  - Specified states to be tested
  - Operations (methods) to be tested
  - Exceptions that might occur
  - External conditions & changes thereto
  - Supplemental information (if needed)

Object-Oriented Metrics
- Five characteristics [Berard '95]:
  - Localization: operations are used in many classes
  - Encapsulation: metrics for classes, not modules
  - Information hiding: should be measured & improved
  - Inheritance: adds complexity, should be measured
  - Object abstraction: metrics represent the level of abstraction

Design Metrics [Whitmire '97]
- Size
  - Population (# of classes, operations)
  - Volume (dynamic object count)
  - Length (e.g., depth of inheritance)
  - Functionality (# of user functions)
- Complexity
  - How classes are interrelated

Design Metrics [2]
- Coupling: # of collaborations between classes, number of method calls, etc.
- Sufficiency: does a class reflect the necessary properties of the problem domain?
- Completeness: does a class reflect all the properties of the problem domain? (important for reuse)

Design Metrics [3]
- Cohesion: do the attributes and operations in a class achieve a single, well-defined purpose in the problem domain?
- Primitiveness (simplicity): the degree to which class operations cannot be composed from other operations

Design Metrics [4]
- Similarity: comparison of the structure, function, and behavior of two or more classes
- Volatility: the likelihood that a change will occur in the design or implementation of a class

Class-Oriented Metrics
- Of central importance in evaluating object-oriented design (which is inherently class-based)
- A variety of metrics have been proposed:
  - Chidamber & Kemerer (1994)
  - Lorenz & Kidd (1994)
  - Harrison, Counsell & Nithi (1998)

Weighted Methods per Class (WMC)
- Assume class C has n methods with complexity measures c_1, ..., c_n
- WMC(C) = c_1 + c_2 + ... + c_n, the sum of the complexities of the individual methods (a worked sketch follows below)
- Complexity is a function of the # of methods and their complexity
- Issues:
  - How should methods be counted? (inheritance)
  - Should each c_i be normalized to 1.0? (then WMC reduces to the number of methods)
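
A small worked sketch of the summation, with assumed per-method complexity values (e.g., cyclomatic complexity per method):

```java
import java.util.Map;

public class WmcExample {
    public static void main(String[] args) {
        // Assumed complexities for the methods of one class.
        Map<String, Integer> complexity = Map.of(
                "open", 2,
                "read", 4,
                "write", 5,
                "close", 1);

        int wmc = complexity.values().stream()
                            .mapToInt(Integer::intValue)
                            .sum();

        System.out.println("WMC = " + wmc);   // 2 + 4 + 5 + 1 = 12
    }
}
```

With every c_i normalized to 1.0, the same calculation simply counts methods.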

Depth of Inheritance Tree (DIT)
- The maximum length from a node C to the root of the tree
- PRO: inheritance = reuse
- CON: greater depth implies greater complexity
  - Harder to predict behavior under inheritance
  - Greater design complexity (effort)

DIT Example [from SEPA 5/e]
- Figure (class hierarchy, not reproduced): DIT = 4, the longest path from the root to a leaf node, with inheritance levels labeled 1 through 4 (a code sketch follows below)
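
A code sketch of a hierarchy with the depth described above, plus a reflection-based count; the class names are invented, and excluding java.lang.Object from the count is an assumed convention.

```java
class Asset { }                                  // root
class Security extends Asset { }                 // level 1
class Bond extends Security { }                  // level 2
class CorporateBond extends Bond { }             // level 3
class ConvertibleBond extends CorporateBond { }  // level 4  ->  DIT = 4

public class DitExample {
    // Counts superclass edges up to (but not including) java.lang.Object.
    static int dit(Class<?> c) {
        int depth = 0;
        for (Class<?> s = c.getSuperclass(); s != null && s != Object.class; s = s.getSuperclass()) {
            depth++;
        }
        return depth;
    }

    public static void main(String[] args) {
        System.out.println("DIT = " + dit(ConvertibleBond.class));   // DIT = 4
    }
}
```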

Number of Children (NOC)
- The subclasses immediately subordinate to class C are its children
- As the # of children (NOC) increases:
  - PRO: more reuse
  - CON: the parent becomes less abstract
  - CON: more testing is required

Coupling Between Objects (CBO)
- The number of collaborations for a given class C
- As CBO increases:
  - CON: reusability decreases
  - CON: the class is harder to modify and test
- CBO should be minimized (a sketch follows below)
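
A hypothetical illustration of counting collaborations (all names invented): OrderService collaborates with three other classes, so CBO(OrderService) = 3.

```java
class Inventory      { boolean reserve(String sku)  { return true; } }
class PaymentGateway { boolean charge(long cents)   { return true; } }
class Mailer         { void confirm(String address) { } }

// OrderService collaborates with Inventory, PaymentGateway, and Mailer:
// CBO(OrderService) = 3. Each extra collaborator makes the class harder
// to reuse and to test in isolation.
class OrderService {
    private final Inventory inventory = new Inventory();
    private final PaymentGateway payments = new PaymentGateway();
    private final Mailer mailer = new Mailer();

    boolean place(String sku, long cents, String email) {
        if (!inventory.reserve(sku) || !payments.charge(cents)) return false;
        mailer.confirm(email);
        return true;
    }
}
```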

Response For a Class (RFC)
- Response set: the set of methods that can potentially execute in response to some message (a small example follows below)
- RFC: the # of methods in the response set
- As RFC increases:
  - CON: the effort for testing increases
  - CON: design complexity increases
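
A small invented example of a response set: Report's own methods plus the method they can invoke directly.

```java
class Printer {
    void print(String line) { System.out.println(line); }
}

class Report {
    private final Printer printer = new Printer();

    void header() { printer.print("== Report =="); }   // may invoke Printer.print
    void body()   { printer.print("..."); }            // may invoke Printer.print
}

// Response set of Report = { header, body, Printer.print }  ->  RFC = 3.
// Every method in the set can execute when Report receives a message,
// so all of them fall within the scope of Report's tests.
```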

Lack of Cohesion in Methods (LCOM)
- LCOM: the # of methods that access one or more of the same attributes
- When LCOM is high:
  - More coupling between methods
  - Additional design complexity
- When LCOM is low:
  - Possible lack of cohesion? (e.g., control panel gauges)
  - Reduced design complexity

Class Size
- Number of operations (inherited & local)
- Number of attributes (inherited & local)
- These counts may be added together, but they lack the weighting for complexity that WMC provides

Method Inheritance Factor (MIF)
- The proportion of inherited methods to the total methods available in a class:
  MIF = Mi(Ci) / Ma(Ci)
  where Mi(Ci) = # of methods inherited by class Ci, and Ma(Ci) = # of methods available in Ci (inherited plus locally defined)
- A way to measure inheritance (and the additional design & testing complexity it brings); a counting sketch follows below
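
A hedged sketch of the per-class counts behind the ratio, using reflection; it counts only public methods and ignores those declared in java.lang.Object, which is one simplifying assumption among several possible counting rules.

```java
import java.util.Arrays;

public class MifSketch {
    // Mi(C): public methods available in C but declared in an ancestor type.
    static long inherited(Class<?> c) {
        return Arrays.stream(c.getMethods())
                     .filter(m -> m.getDeclaringClass() != c
                               && m.getDeclaringClass() != Object.class)
                     .count();
    }

    // Ma(C): all public methods available in C (inherited + locally declared).
    static long available(Class<?> c) {
        return Arrays.stream(c.getMethods())
                     .filter(m -> m.getDeclaringClass() != Object.class)
                     .count();
    }

    public static void main(String[] args) {
        Class<?> c = java.util.ArrayList.class;
        System.out.printf("Mi = %d, Ma = %d, Mi/Ma = %.2f%n",
                inherited(c), available(c),
                (double) inherited(c) / available(c));
    }
}
```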

Operation-Oriented Metrics
- Average Operation Size (OS_avg)
  - LOC is not a good measure; better: the number of messages sent
  - Strive to minimize
- Operation Complexity (OC)
  - E.g., function points; minimize
- Average # of Parameters (NP_avg)
  - Larger = more complex collaborations between objects; try to minimize (a sketch follows below)
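
A hypothetical before/after sketch of reducing NP_avg with a parameter object; Address, ShippingService, and the method signatures are invented.

```java
// Bundling related parameters into one object lowers the average
// number of parameters per operation.
record Address(String street, String city, String zip) { }

class ShippingService {
    // Before: 4 parameters per call.
    void ship(String orderId, String street, String city, String zip) {
        ship(orderId, new Address(street, city, zip));
    }

    // After: 2 parameters, and a simpler collaboration for callers.
    void ship(String orderId, Address address) {
        // ... hand the shipment to a carrier ...
    }
}
```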

O-O Testing Metrics
- Percent Public & Protected (PAP)
  - Compares attribute visibility: the proportion of attributes that are public or protected
  - Higher: greater chance of side effects
- Public Access to Data (PAD)
  - The # of classes that can access another class's data (an encapsulation violation)
  - Higher: greater chance of side effects (a sketch follows below)
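
A small invented illustration of what PAD counts: a public attribute lets other classes reach into Sensor's data directly, which widens the scope for side effects that tests must cover.

```java
class Sensor {
    public double rawReading;     // public data: every class that touches it
                                  // counts toward PAD for Sensor
    private double calibrated;    // encapsulated alternative

    public double calibrated() { return calibrated; }
}

class Dashboard {
    void tamper(Sensor s) {
        s.rawReading = -1.0;      // direct access to another class's data:
                                  // a side effect the Sensor tests must now anticipate
    }
}
```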

Testing Metrics [2]
- Number of Root Classes (NOR)
  - The # of distinct class hierarchies
  - Higher: increased testing effort, since test cases must be defined for each hierarchy
- Fan-In (FIN)
  - In an O-O context, FIN > 1 indicates multiple inheritance
  - FIN > 1 should be avoided! (Java rules it out for classes; a sketch follows below)
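
A short sketch of why FIN > 1 does not arise for Java classes: a class may extend only one superclass, although it may implement several interfaces. The class and interface names are invented.

```java
interface Drawable  { void draw(); }
interface Clickable { void click(); }

class Widget { }
class Panel  { }

// Single inheritance of implementation: FIN = 1 for every Java class.
class Button extends Widget implements Drawable, Clickable {
    public void draw()  { }
    public void click() { }
}

// class Hybrid extends Widget, Panel { }   // does not compile: Java rules out FIN > 1
```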

Project Metrics
- Number of Scenario Scripts (NSS)
  - Proportional to the # of classes, methods, ...
  - A strong indicator of program size
- Number of Key Classes (NKC)
  - Unique to the solution (not reused)
  - Higher: substantial development work ahead
- Number of Subsystems (NSUB)
  - Impact: resource allocation, parallel scheduling, integration effort

Questions?