Software Measurement Activities. Software Measurement Framework SEG3202 N. El Kadri.


2 Software Measurement Activities
–Cost and effort estimation models and measures
–Productivity models and measures
–Data collection
–Quality models and measures
–Reliability models
–Performance evaluation and models
–Structural and complexity metrics
–Capability maturity assessment
–Management by metrics
–Evaluation of methods and tools

3 Cost and effort estimation
–Managers must plan projects by predicting the necessary cost and effort and assigning resources appropriately.
–Doing this accurately has become one of the 'holy grail' searches of software engineering.
–Numerous measurement-based models for software cost and effort estimation have been proposed and used.
–Examples: Boehm's COCOMO model, Putnam's SLIM model, and Albrecht's function points model.

4 Simple COCOMO Model
Predicted Effort = a(size)^b
–Effort is measured in person-months
–Size is the predicted size
–a, b: constants depending on the type of system:
 "organic": a = 2.4, b = 1.05
 "semi-detached": a = 3.0, b = 1.12
 "embedded": a = 3.6, b = 1.2
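As a sketch, the model on this slide can be coded directly, assuming (as in standard basic COCOMO) that size is given in KLOC and effort comes out in person-months:

```python
# Basic COCOMO effort sketch using the constants from the slide.
# Assumption: size is in KLOC (thousands of lines), effort in person-months.
COCOMO_CONSTANTS = {
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

def cocomo_effort(size_kloc: float, mode: str) -> float:
    """Predicted Effort = a * (size)^b, with (a, b) chosen by project mode."""
    a, b = COCOMO_CONSTANTS[mode]
    return a * size_kloc ** b

# A 32-KLOC organic project:
print(round(cocomo_effort(32, "organic"), 1))  # ≈ 91.3 person-months
```

Note how the superlinear exponent b makes an embedded system of the same size cost more effort than an organic one.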

5 Albrecht's Function Points
Count the number of:
–External inputs
–External outputs
–External inquiries
–External files
–Internal files
giving each a "weighting factor".
The Unadjusted Function Count (UFC) is the sum of all these weighted counts.
To get the Adjusted Function Point count (FP), multiply by a Technical Complexity Factor (TCF):
FP = UFC * TCF
[Figure: spelling-checker example with a user dictionary]
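A minimal sketch of the computation. The element weights below are the "average-complexity" values commonly quoted for Albrecht's method; the slide itself does not give weights, so treat them, and the example counts, as illustrative:

```python
# Average-complexity weights commonly quoted for Albrecht's method
# (assumed here for illustration; real counting uses low/average/high weights).
AVERAGE_WEIGHTS = {
    "external_inputs":    4,
    "external_outputs":   5,
    "external_inquiries": 4,
    "external_files":     7,
    "internal_files":     10,
}

def function_points(counts: dict, tcf: float) -> float:
    """FP = UFC * TCF, where UFC sums the weighted element counts."""
    ufc = sum(AVERAGE_WEIGHTS[kind] * n for kind, n in counts.items())
    return ufc * tcf

# Invented example: 2 inputs, 3 outputs, 1 internal file, neutral TCF of 1.0
print(function_points(
    {"external_inputs": 2, "external_outputs": 3, "internal_files": 1},
    tcf=1.0,
))  # 2*4 + 3*5 + 1*10 = 33.0
```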

6 Productivity models and measures
–Traditional model: simply divides size (LOC) by effort (person-months).
–Alternative: a productivity model that decomposes productivity into measurable attributes; this gives a significantly more comprehensive view of productivity than the traditional ratio.
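The traditional measure is a one-line ratio (figures invented for illustration):

```python
def productivity(size_loc: int, effort_person_months: float) -> float:
    """Traditional productivity measure: LOC delivered per person-month."""
    return size_loc / effort_person_months

# A 12,000-LOC system delivered with 24 person-months of effort:
print(productivity(12000, 24))  # 500.0 LOC per person-month
```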

7 Data Collection
–Effective use of measurement depends on careful data collection.
–Ensure that measures are defined unambiguously, that collection is consistent and complete, and that data integrity is not at risk.
–Requires carefully planned data collection, as well as thorough analysis and reporting of the results.
Example: failure data collection
1) Time of each failure
2) Time interval between failures
3) Cumulative failures up to a given time
4) Failures experienced in a time interval
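The four kinds of failure data listed above can all be derived from a single list of absolute failure times; the timestamps below are made-up illustration data:

```python
# (1) Time of each failure, in hours of test (invented illustration data)
failure_times = [2.0, 5.5, 6.0, 10.5, 12.0]

# (2) Inter-failure intervals (time since the previous failure, or since t=0)
intervals = [t1 - t0 for t0, t1 in zip([0.0] + failure_times, failure_times)]

def cumulative_failures(t: float) -> int:
    """(3) Cumulative failures observed up to and including time t."""
    return sum(1 for ft in failure_times if ft <= t)

def failures_in(a: float, b: float) -> int:
    """(4) Failures experienced in the half-open time interval [a, b)."""
    return sum(1 for ft in failure_times if a <= ft < b)

print(intervals)                 # [2.0, 3.5, 0.5, 4.5, 1.5]
print(cumulative_failures(6.0))  # 3
print(failures_in(5.0, 11.0))    # 3
```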

8 Quality Models
Models of quality for various views of software quality are constructed in a tree-like fashion.
The tree describes the pertinent relationships between factors and their dependent criteria, so we can measure the factors in terms of the dependent criteria measures.
–The upper branches hold important high-level quality factors of software products, such as reliability and usability, that we would like to quantify.
–Each quality factor is composed of lower-level criteria, such as modularity and data commonality.
–The criteria are easier to understand and measure than the factors; thus, actual measures (metrics) are proposed for the criteria.
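The factor-to-criteria rollup can be sketched as a weighted aggregation over a small tree. Everything below (factor names, criteria, weights, scores) is invented for illustration and is not taken from any particular quality model:

```python
# Toy quality-model tree: factors at the top, measurable criteria below.
# Criteria names and weights are hypothetical illustration values.
QUALITY_MODEL = {
    "reliability": {"fault_density": 0.6, "recoverability": 0.4},
    "usability":   {"learnability": 0.5, "operability": 0.5},
}

def factor_score(factor: str, criteria_scores: dict) -> float:
    """Roll normalized criterion measures (0..1) up into a factor score."""
    weights = QUALITY_MODEL[factor]
    return sum(w * criteria_scores[c] for c, w in weights.items())

# Criteria are measured directly; the factor value is derived from them:
print(factor_score("reliability",
                   {"fault_density": 0.5, "recoverability": 1.0}))  # 0.7
```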

9 ISO 9126 Quality Model
Factors → Criteria → Metrics (see ISO 9126-2, ISO 9126-3)

10 Reliability Models
–Most quality models include reliability as one of their component factors.
–Software reliability modeling is applicable during the implementation phase of software QA.
–Based on observing and recording information about software failures during test or operation.

11 Reliability Models
Plot the change of failure intensity against time.
The most famous reliability models are the basic exponential model and the logarithmic Poisson model:
–the basic exponential model assumes finite failures in infinite time;
–the logarithmic Poisson model assumes infinite failures.
Automated tools such as CASRE are available.
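The two models can be sketched as mean-value functions (expected cumulative failures against execution time). The functional forms below follow Musa's basic execution-time model and the Musa-Okumoto logarithmic Poisson model; the parameter values in the example are invented:

```python
import math

def basic_exponential(t: float, lam0: float, nu0: float) -> float:
    """Expected cumulative failures by time t under the basic exponential
    model: approaches the finite total nu0 as t grows (finite failures
    in infinite time). lam0 is the initial failure intensity."""
    return nu0 * (1.0 - math.exp(-lam0 * t / nu0))

def logarithmic_poisson(t: float, lam0: float, theta: float) -> float:
    """Expected cumulative failures by time t under the logarithmic
    Poisson model: grows without bound (infinite failures). theta is
    the failure-intensity decay parameter."""
    return math.log(lam0 * theta * t + 1.0) / theta

# Illustrative parameters: initial intensity 1 failure/hour, 100 total faults
print(round(basic_exponential(10.0, 1.0, 100.0), 2))   # saturating curve
print(round(logarithmic_poisson(10.0, 1.0, 0.5), 2))   # unbounded curve
```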

12 Performance Evaluation and Models
A performance model includes externally observable system performance characteristics, such as response times and completion rates.
Performance modeling is part of the implementation and maintenance phases of software QA.
Performance specialists also investigate:
–the efficiency of algorithms as embodied in computational and algorithmic complexity [Harel 1992]
–the inherent complexity of problems, measured in terms of the efficiency of an optimal solution.

13 Structural and complexity metrics
We measure structural attributes of representations of the software that are available before implementation:
–Control-flow structure
–Data-flow structure
–Data structure
–Information flow attributes
Complexity metrics (1979~):
–Halstead's "Software Science" metrics
–McCabe's "Cyclomatic Complexity" metric (McCabe 1989): the number of independent paths through a program
Influenced by:
–growing acceptance of structured programming
–notions of cognitive complexity

14 McCabe's Cyclomatic Complexity
If G is the control flowgraph of program P, and G has e edges and n nodes, then
v(P) = e - n + 2
v(P) is the number of linearly independent paths in G.
In the example flowgraph, e = 16 and n = 13, so v(P) = 5.
More simply, if d is the number of decision nodes in G, then
v(P) = d + 1
McCabe proposed: v(P) < 10 for each module P.
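Both formulas on this slide are one-liners:

```python
def cyclomatic_complexity(edges: int, nodes: int) -> int:
    """v(P) = e - n + 2 for a single connected control flowgraph."""
    return edges - nodes + 2

def cyclomatic_from_decisions(decisions: int) -> int:
    """Equivalent shortcut: v(P) = d + 1, where d counts the binary
    decision nodes (if/while/for predicates) in the flowgraph."""
    return decisions + 1

# The slide's example flowgraph: 16 edges, 13 nodes
print(cyclomatic_complexity(16, 13))  # 5
```

The agreement of the two formulas (5 = 4 decisions + 1) is why decision counting is the usual way tools compute the metric without building the graph.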

15 Management by metrics
–Estimate project elements such as cost, schedules, and staffing profiles
–Track project results against planning estimates
–Validate the organizational models as the basis for improving future estimates

16 Measurement for Guiding Management: Example
Assume that an organization's goal is to decrease the error rate in delivered software while maintaining (or possibly improving) the level of productivity; further assume that the organization has decided to change the process by introducing the Cleanroom method.
The SEL assessed the impact of introducing the Cleanroom method. The results of the experiment appear to provide preliminary evidence of the expected improvement in reliability following introduction of the Cleanroom method, and may also indicate an improvement in productivity.

17 Evaluation of methods and tools
–Efficiency of methods (1991~)
–Efficiency and reliability of tools
–Certification testing of acquired tools and components

18 Capability Maturity Assessment
–US Software Engineering Institute (SEI) model (1989): grading using a five-level scale
–ISO 9001: Quality systems: models for quality assurance in design/development, production, installation and servicing (1991)
–ISO : Guidelines for application of ISO 9001 to the development, supply and maintenance of software (1991)

19 Capability Maturity Model (CMM)
1. Initial: The software process is characterized as ad hoc, and occasionally even chaotic. Few processes are defined, and success depends on individual effort.
2. Repeatable (disciplined process): Basic project management processes are established to track cost, schedule, and functionality. The necessary process discipline is in place to repeat earlier successes on projects with similar applications.
3. Defined (standard, consistent process): The software process for both management and engineering activities is documented, standardized, and integrated into a standard software process for the organization. All projects use an approved, tailored version of the organization's standard software process for developing and maintaining software.
4. Managed (predictable process): Detailed measures of the software process and product quality are collected. Both the software process and products are quantitatively understood and controlled.
5. Optimizing (continuously improving process): Continuous process improvement is enabled by quantitative feedback from the process and from piloting innovative ideas and technologies.

20 Software Measurement Program Measurement is the mechanism to provide feedback on software quality. A measurement program without a clear purpose will result in frustration, waste, annoyance, and confusion. To be successful, a measurement program must be viewed as one tool in the quest for the improved engineering of software.

21 SE Standards
–ISO/IEC 9126 Software product evaluation: Quality characteristics and guidelines for their use
–ISO/IEC 15939:2002 Software Measurement Process
–ISO 9000 standards are used to regulate internal quality and to assure the quality of suppliers; measurement is part of ISO 9000
–IEEE 1061: Software Quality Metrics Methodology
–IEEE 1045: Software Productivity Metrics

22 Clarification: Metrics vs. Measures vs. Measurements
–Metrics are commonly accepted scales that define measurable attributes of entities, their units, and their scopes.
–A measure is a relation between an attribute and a measurement scale.
–In the literature, "measurements", "measures", and "metrics" are often used as synonyms.

23 Rigorous Measurement Framework
Measurement = data collection + context
Data collection must be planned: know
–why you are collecting the data
–how you plan to use the data
–the purpose or destination of the data (e.g., improving the quality of your software from some perspective)
There is a trade-off between the costs and benefits of collection.

24 How to Build a Valid Measurement Context?
Points of view on software development:
–Strategic: long-term performance of the organization
–Tactical: short-term performance of an individual process
–Technical: details of products and processes that influence the development processes and products
Classes of software development objects:
–Products
–Processes
–Resources
Measurement Context = selected points of view + selected object(s)

25 Views of Measurement: Strategic View
–The organization's goals are stated in measurable terms.
–Measures of products, projects, and resources are summarized as means or medians, with some indication of variability:
 Unit cost (labor hours / size)
 Defect rates (delivered defects / size)
 Cycle time (project days / size)
–The strategic view tracks trends in these summary statistics.
–Strategic data is used to determine whether, and how well, those goals are being met.
–Primary user of strategic measurement data: the strategic manager.
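The summary statistics the strategic view tracks can be computed directly from per-project measures; the defect rates below are made-up illustration data:

```python
from statistics import mean, median, stdev

# Delivered defects per KLOC for five projects (invented illustration data)
defect_rates = [0.8, 1.1, 0.9, 2.0, 1.0]

# Summarize as mean/median with an indication of variability, as the
# strategic view requires; trends in these values are tracked over time.
summary = {
    "mean":   mean(defect_rates),
    "median": median(defect_rates),
    "stdev":  stdev(defect_rates),   # the "indication of variability"
}
print(summary["median"])  # 1.0
```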

26 Views of Measurement: Tactical View
–Concerned with the performance of an individual project.
–Measurement data is used to:
 compare actual results to target (estimated or planned) results; any variances are noted and investigated (e.g., the defect discovery rate during inspection or testing activities)
 predict values of certain indirect project measures (e.g., using project size to predict cost and schedule)
–Primary user of tactical measurement data: the project manager.
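Comparing actual results against plan reduces to a simple variance computation; the figures below are invented for illustration:

```python
def variance_pct(actual: float, planned: float) -> float:
    """Percent variance of an actual project result against its plan;
    positive means over plan, negative means under."""
    return 100.0 * (actual - planned) / planned

# Hypothetical project: 120 person-months spent against a plan of 100,
# a +20% variance that the project manager would note and investigate.
print(variance_pct(120.0, 100.0))  # 20.0
```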

27 Views of Measurement: Tactical View Project manager uses tactical measurement data for:

28 Views of Measurement: Technical View
–Physically, all measurement takes place at the technical level.
–All measures used at the strategic and tactical levels are built from fundamental technical measures; strategic and tactical users of measurement data depend on technical users to supply the data.
–Technical measures focus on a set of internal attributes of a single product or process and are highly dependent on the technology in the product.
–Primary user of technical measurement data: the software engineer.

29 Views of Measurement: Technical View

30 Objects of Measurement
The first obligation of a measurement effort is to identify the objects to be measured:
–Processes, Products, Resources
We measure attributes of those objects:
–Internal: measured purely in terms of the process, project, product, or resource itself
–External: can be measured only with respect to how the process, project, product, or resource relates to its environment

31 Objects of Measurement: Process
Processes are measured by comparing instance measurements to each other over time.
–Direct internal process measures:

32 Objects of Measurement: Process
–Indirect internal process measures
–External process measures:
 Productivity: units of product produced per unit of input
 Stability of the process
 Variation: the extent to which instances of the process differ from each other

33 Objects of Measurement: Resources
Resources are those objects that serve as input to the processes:
–People, tools, materials, methods, time, money, training
–Internal attribute measures: cost, capability, constraints on use
–External attribute measures: performance, productivity

34 Software Measurement Framework