1. Software Metric- A definition 2. Types of Software metrics 3. Frame work of product metrics 4. Product metrics.

1 copyright@nit.sainipoonam@gmail.com

2 1. Software Metric: a definition
2. Types of software metrics
3. Framework of product metrics
4. Product metrics for the SDLC
a. Metrics for the requirements phase
b. Metrics for the analysis model
c. Metrics for the design model
d. Metrics for source code
e. Metrics for testing
f. Metrics for maintenance
5. Types of software measurement
6. Software quality metrics
7. Software metrics cost

3 A software metric is a quantitative measure of the degree to which a system, component, or process possesses a given attribute. Metrics are collected for individual modules during development, and errors should be categorized by origin, type, and cost.

4 Measure: a quantitative indication of the extent, amount, dimension, capacity, or size of some attribute of a product or process, e.g., the number of errors uncovered.
Measurement: the act of obtaining a measure, e.g., through reviews or unit tests.
Metric: relates the individual measures in some way, e.g., the average number of errors found per review or per unit test.
Indicator: a metric or combination of metrics that provides insight into the software process, a software project, or the product itself.

5 Processes, products & services are measured in order to understand, evaluate, control, and predict them.

6 Example metric: % defects corrected. A measurement goal follows the template: to <understand | evaluate | control | predict> the <attribute> of the <entity> in order to <goal(s)>. For instance: to evaluate the % of defects found and corrected during testing in order to ensure all known defects are corrected before shipment.

7 1. Product metrics quantify characteristics of the product being developed, e.g., size, reliability.
2. Process metrics quantify characteristics of the process being used to develop the software, e.g., efficiency of fault detection.
3. Project metrics enable a software project manager to assess the status of an ongoing project, track potential risks, uncover problem areas before they go "critical", and evaluate the project team's ability to control the quality of software work products.

8 Process metrics give insight into a process paradigm, software engineering tasks, work products, or milestones, and lead to long-term process improvement. Private process metrics (e.g., defect rates by individual or module) are known only to the individual or team concerned. Public process metrics enable organizations to make strategic changes to improve the software process. Statistical software process improvement helps an organization discover its strengths and weaknesses.

9 Explicit results of software development activities, e.g., deliverables, documentation, by-products. Project metrics help to:
 Assess the state of the project
 Track potential risks
 Uncover problem areas
 Adjust workflow or tasks
 Evaluate the team's ability to control quality

10 Software project metrics are used by the software team to adapt project workflow and technical activities. Project metrics are used to avoid development schedule delays, to mitigate potential risks, and to assess product quality on an on-going basis. Every project should measure its inputs (resources), outputs (deliverables), and results (effectiveness of deliverables). Application of project metrics on most software projects occurs during estimation.

11 According to Roche, a measurement process can be characterized by five activities:
1. Formulation: derivation of software measures and metrics.
2. Collection: accumulating the data required to derive the formulated metrics.
3. Analysis: computation of metrics and application of mathematical tools.
4. Interpretation: evaluation of metrics to gain insight into the quality of the representation.
5. Feedback: recommendations derived from interpretation are transmitted to the software team.

12 A set of attributes that should be encompassed by effective software metrics:
1. Simple and computable
2. Consistent and objective
3. Consistent in the use of units and dimensions
4. Programming-language independent
5. An effective mechanism for high-quality feedback

13 Function-Based Metrics Metrics for specification quality

14 Function points are computed from direct measures of the information domain of a business software application and an assessment of its complexity. Once computed, function points are used like LOC to normalize measures for software productivity, quality, and other attributes: they use a measure of the functionality delivered by the application as a normalization value. The relationship between LOC and function points depends on the language used to implement the software.

15 Information domain values are defined as follows:
1. Number of external inputs (EIs)
2. Number of external outputs (EOs)
3. Number of external inquiries (EQs)
4. Number of internal logical files (ILFs)
5. Number of external interface files (EIFs)
Compute the function point value as: FP = count-total × [0.65 + 0.01 × Σ(Fi)], where the Fi (i = 1 to 14) are "complexity adjustment values".
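The FP formula above can be sketched directly in code. This is a minimal illustration, not a full IFPUG counting procedure: the weights used are the commonly published "average complexity" weights for each information domain value, and the counts in the example are invented.

```python
# Function point computation, a minimal sketch of the formula above.
# Weights are the standard "average complexity" weights; a real count
# weights each item as simple, average, or complex.
AVG_WEIGHTS = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}

def function_points(counts, f_values):
    """counts: information-domain counts; f_values: the 14 complexity
    adjustment values, each rated 0 (no influence) to 5 (essential)."""
    assert len(f_values) == 14
    count_total = sum(AVG_WEIGHTS[k] * counts.get(k, 0) for k in AVG_WEIGHTS)
    return count_total * (0.65 + 0.01 * sum(f_values))

# Illustrative application, all adjustment factors rated "average" (3):
counts = {"EI": 3, "EO": 2, "EQ": 2, "ILF": 1, "EIF": 4}
fp = function_points(counts, [3] * 14)
print(round(fp, 2))
```

Because Σ(Fi) ranges from 0 to 70, the adjustment factor 0.65 + 0.01 × Σ(Fi) can scale the unadjusted count anywhere from 0.65 to 1.35.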

16 Davis suggests that the qualitative characteristics of software quality can be represented using one or more metrics. For example, if a specification contains n_r requirements, of which n_f are functional and n_nf are nonfunctional, then n_r = n_f + n_nf.
Specificity (lack of ambiguity): Q1 = n_ui / n_r, where n_ui is the number of requirements for which all reviewers had identical interpretations.
Completeness: Q2 = n_u / (n_i × n_s), where n_u is the number of unique function requirements, n_i the number of inputs defined by the specification, and n_s the number of states specified.
Overall completeness: Q3 = n_c / (n_c + n_nv), where n_c is the number of requirements validated as correct and n_nv the number not yet validated.

17 There are five categories of design metrics for software:
1. Architectural design metrics
2. Object-oriented design metrics
3. Class-oriented metrics
4. Component-level design metrics
5. Operation-oriented metrics

18 Architectural design metrics focus on characteristics of the program architecture. These metrics are "black box" in the sense that they do not require any knowledge of the inner workings of a particular software component. Card & Glass define three software design complexity measures:
1. Structural complexity
2. Data complexity
3. System complexity

19 1. Structural complexity of a module i: S(i) = f_out(i)², where fan-out f_out(i) is the number of modules immediately subordinate to i (directly invoked).
2. Data complexity: D(i) = v(i) / [f_out(i) + 1], where v(i) is the number of inputs and outputs passed to and from i.
3. System complexity: C(i) = S(i) + D(i).
As each of these complexity values increases, the overall complexity of the system also increases. (Proposed by Card and Glass, 1990.)
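The three Card & Glass measures translate line-for-line into code. A direct sketch of the formulas above, with an invented example module:

```python
# Card & Glass design complexity measures.
def structural_complexity(fan_out):
    return fan_out ** 2                    # S(i) = f_out(i)^2

def data_complexity(v, fan_out):
    return v / (fan_out + 1)               # D(i) = v(i) / [f_out(i) + 1]

def system_complexity(v, fan_out):
    return structural_complexity(fan_out) + data_complexity(v, fan_out)

# A module that directly invokes 3 others and passes 8 inputs/outputs:
print(system_complexity(v=8, fan_out=3))   # S = 9, D = 2.0, C = 11.0
```

Note how the +1 in the denominator keeps D(i) defined even for leaf modules with zero fan-out.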

20 Morphology metrics (proposed by Fenton, 1991), defined over the design's call structure:
Size = n + a, where n is the number of nodes and a the number of arcs.
Depth: the longest path from the root to a leaf node.
Width: the maximum number of nodes at any one level.
Arc-to-node ratio: r = a / n (an indicator of coupling).

21 Much about OO design is subjective: a good designer "knows" what makes good code. There are nine characteristics of an OO design:
1. Size
2. Complexity
3. Coupling
4. Sufficiency
5. Completeness
6. Cohesion
7. Primitiveness
8. Similarity
9. Volatility

22 Chidamber and Kemerer have proposed six class-based design metrics for OO systems:
1. Weighted methods per class (WMC)
2. Depth of the inheritance tree (DIT)
3. Number of children (NOC)
4. Coupling between object classes (CBO)
5. Response for a class (RFC)
6. Lack of cohesion in methods (LCOM)

23 Component-level design metrics focus on internal characteristics of a software component. They include measures of the "three Cs": (a) cohesion metrics, (b) coupling metrics, (c) complexity metrics. These are "glass box" metrics in the sense that they require knowledge of the inner workings of the module under consideration.

24 Three simple operation-oriented metrics, proposed by Lorenz and Kidd:
1. Average operation size (LOC, volume)
2. Operation complexity (cyclomatic)
3. Average number of parameters per operation

25 Halstead assigned quantitative laws to the development of computer software, using a set of primitive measures that may be derived after code is generated, or estimated once design is complete. The measures are:
n1: number of distinct operators that appear in a program
n2: number of distinct operands that appear in a program
N1: total number of operator occurrences
N2: total number of operand occurrences
Halstead uses these primitive measures to develop expressions for:

26 program length, potential minimum volume for an algorithm, actual volume, program level, and language level.
Halstead shows that length N can be estimated as: N = n1 log2 n1 + n2 log2 n2
and program volume as: V = N log2 (n1 + n2)
Halstead defines a volume ratio L as the ratio of the volume of the most compact form of a program to the volume of the actual program: L = (2/n1) × (n2/N2), where L must always be less than 1.
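The Halstead quantities above can be sketched as follows; the operator and operand counts in the example are illustrative, and volume V is computed from the actual length N1 + N2, as in the formula above:

```python
import math

# Halstead's estimated length, actual volume, and volume ratio.
def halstead(n1, n2, N1, N2):
    est_length = n1 * math.log2(n1) + n2 * math.log2(n2)  # estimated N
    volume = (N1 + N2) * math.log2(n1 + n2)               # V = N log2(n1+n2)
    ratio = (2 / n1) * (n2 / N2)                          # L, always < 1
    return est_length, volume, ratio

# e.g., 10 distinct operators, 16 distinct operands, 50 + 60 occurrences:
N_est, V, L = halstead(n1=10, n2=16, N1=50, N2=60)
print(round(V, 1), round(L, 3))
```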

27 These metrics focus on the process of testing, not the technical characteristics of the tests themselves. Function-based metrics can be used as a predictor for overall testing effort. Architectural design metrics provide information on the ease or difficulty associated with integration testing. Cyclomatic complexity lies at the core of basis path testing. Testing metrics fall into two broad categories:
1. Metrics that attempt to predict the likely number of tests required at various testing levels
2. Metrics that focus on test coverage for a given component

28 Testing effort can also be estimated using metrics derived from Halstead measures:
Program level: PL = 1 / [(n1/2) × (N2/n2)]
Halstead effort: e = V / PL
Percentage of testing effort: %effort(k) = e(k) / Σ e(i), where e(k) is computed for module k and the sum is taken over all modules in the system.
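Allocating testing effort across modules by Halstead effort can be sketched as below. The module names and their (n1, n2, N1, N2) counts are invented for illustration:

```python
import math

# Halstead effort per module, then each module's share of testing effort.
def effort(n1, n2, N1, N2):
    volume = (N1 + N2) * math.log2(n1 + n2)  # V
    pl = 1 / ((n1 / 2) * (N2 / n2))          # program level PL
    return volume / pl                        # e = V / PL

modules = {"parser": (10, 16, 50, 60), "io": (8, 12, 30, 35)}
e = {name: effort(*counts) for name, counts in modules.items()}
total = sum(e.values())
for name, ek in e.items():
    print(name, round(ek / total, 2))         # fraction of testing effort
```

Modules with higher Halstead effort (larger, denser code) receive a proportionally larger share of the testing budget.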

29 The OO design metrics provide an indication of design quality. The metrics consider aspects of encapsulation and inheritance. A sampling follows:
1. Lack of cohesion in methods (LCOM)
2. Percent public and protected (PAP)
3. Public access to data members (PAD)
4. Number of root classes (NOR)
5. Fan-in (FIN)
6. Number of children (NOC) and depth of the inheritance tree (DIT)

30 The software maturity index (SMI) provides an indication of the stability of a software product. It is computed as: SMI = [Mt − (Fa + Fc + Fd)] / Mt, where
Mt = number of modules in the current release
Fc = number of modules in the current release that have been changed
Fa = number of modules in the current release that have been added
Fd = number of modules from the preceding release that were deleted in the current release
As SMI approaches 1.0, the product begins to stabilize.
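The SMI formula above is a one-liner in code; the release figures in the example are invented:

```python
# Software maturity index: a direct sketch of the formula above.
def smi(mt, fa, fc, fd):
    """mt: modules in the current release; fa/fc/fd: modules added,
    changed, and deleted relative to the preceding release."""
    return (mt - (fa + fc + fd)) / mt

# 120 modules; 6 added, 9 changed, 3 deleted since the last release:
print(round(smi(120, 6, 9, 3), 3))
```

A churn-free release (fa = fc = fd = 0) yields SMI = 1.0, the stable limit.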

31 The overriding goal of software engineering is to produce a high-quality system, application, or product. The quality of a system, application, or product is only as good as:
the requirements that describe the problem
the design that models the solution
the code that leads to an executable program
the tests that exercise the software to uncover errors
To accomplish real-time quality assessment, the engineer must use technical measures to evaluate quality in objective, rather than subjective, ways.

32 1. Correctness: the degree to which the software performs its required function. Common measure: defects per KLOC, where a defect is defined as a verified lack of conformance to requirements.
2. Maintainability: the ease with which a program can be corrected if an error is encountered, adapted if its environment changes, or enhanced if the customer desires a change in requirements. A simple time-oriented metric is mean-time-to-change (MTTC): the time it takes to analyze the change request, design an appropriate modification, implement the change, test it, and distribute the change to all users.

33 3. Integrity: measures a system's ability to withstand attacks (both accidental and intentional) on its security. Attacks target programs, data, and documents. To measure integrity, two additional attributes must be defined:
Threat: the probability that an attack of a specific type will occur within a given time.
Security: the probability that an attack of a specific type will be repelled.
integrity = Σ [1 − threat × (1 − security)], where threat and security are summed over each type of attack.
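The integrity formula above sums one term per attack type. A minimal sketch; the threat and security probabilities are illustrative:

```python
# Integrity metric: one (threat, security) probability pair per attack type.
def integrity(attacks):
    """attacks: iterable of (threat, security) pairs, where threat is the
    probability an attack of that type occurs and security the probability
    it is repelled."""
    return sum(1 - threat * (1 - security) for threat, security in attacks)

# Single attack type: 25% chance of occurring, 95% chance of being repelled:
print(round(integrity([(0.25, 0.95)]), 4))
```

Each term approaches 1 as the threat becomes unlikely or the defense becomes reliable.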

34 4. Usability: user friendliness. If a program is not "user friendly," it is often doomed to failure, even if the functions it performs are valuable. User friendliness can be measured in terms of four characteristics:
(i) the physical and/or intellectual skill required to learn the system
(ii) the time required to become moderately efficient in the use of the system
(iii) the net increase in productivity measured when the system is used by someone who is moderately efficient
(iv) a subjective assessment of users' attitudes toward the system

35 A quality metric that provides benefits at both the project and process level is defect removal efficiency (DRE). DRE is a measure of the filtering ability of quality assurance and control activities as they are applied throughout all process framework activities: DRE = E / (E + D), where E = number of errors found before delivery of the software to the end user, and D = number of defects found after delivery. The ideal value for DRE is 1: no defects are found in the software after delivery. Realistically, D will be greater than zero, but the value of DRE can still approach 1 as E increases; as E increases, it is likely that the final value of D will decrease.

36 DRE can also be used within the project to assess a team's ability to find errors before they are passed to the next framework activity: DRE_i = E_i / (E_i + E_{i+1}), where E_i = number of errors found during software engineering activity i, and E_{i+1} = number of errors found during activity i+1 that are traceable to errors not discovered in activity i.
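Both forms of DRE can be sketched directly; the error counts in the examples are invented:

```python
# Defect removal efficiency, project-level and per-activity.
def dre(errors_before, defects_after):
    """E / (E + D): filtering ability of all QA activities combined."""
    return errors_before / (errors_before + defects_after)

def dre_activity(e_i, e_next):
    """DRE_i = E_i / (E_i + E_{i+1}): e_next counts errors found in
    activity i+1 that are traceable to errors missed in activity i."""
    return e_i / (e_i + e_next)

print(round(dre(45, 5), 2))            # 45 errors pre-delivery, 5 after
print(dre_activity(30, 10))            # e.g., design review vs. coding
```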

37 Pitfalls to avoid when using metrics: measuring individuals; using metrics as a "stick"; ignoring the data; relying on only one metric, whether cost, quality, or schedule.

38 Select metrics based on goals, following the Goal/Question/Metric paradigm [Basili-88]: each goal (Goal 1, Goal 2, ...) gives rise to questions (Question 1 ... Question 4), and each question is answered by one or more metrics (Metric 1 ... Metric 5). Focus on processes, products & services; provide feedback to the people who supply the data; obtain "buy-in".

