SENG 530: Software Verification and Validation

SENG 530: Software Verification and Validation V&V Processes and Techniques Prof. Bojan Cukic Lane Department of Computer Science and Electrical Engineering West Virginia University

Overview: Software Inspections; Software Metrics (02/14/2002); Software Metrics (today); Software Reliability Engineering (02/28/2002).

Agenda Software Engineering Measurements. Measurement Theory. A Goal-Based Framework for Software Measurement. Verification and Validation Metrics.

Measure? Why? Developer's angle: completeness of requirements, quality of design, testing readiness. Manager's angle: delivery readiness, budget and scheduling issues. Customer's angle: compliance with requirements, quality. Maintainer's angle: planning for upgrades and improvements.

Measurement: a process by which numbers (or symbols) are assigned to attributes of entities in the real world in such a way as to describe them according to clearly defined rules. The measurement process is difficult to define: measuring color or intelligence is hard; measurement accuracy and margins of error, units and scales all matter; and drawing conclusions from measurements is difficult.

Measurement (2): “What is not measurable make measurable” [Galileo, 1564-1642]. Measurement brings increased visibility, understanding, and control. Measurement: direct quantification of an attribute. Calculation: indirect; a combination of measurements used to understand some attribute (e.g., the overall score in a decathlon).

Measurement in Software Engineering: applicable to managing, costing, planning, modeling, analyzing, specifying, designing, implementing, verifying, validating, and maintaining. Engineering implies understanding and control. Computer science provides the theoretical foundations for building software; software engineering focuses on a controlled and scientifically sound implementation process.

Measurement in Software Engineering (2): still considered something of a luxury?! Weakly defined targets: “the product will be user-friendly, reliable, maintainable”. Gilb's Principle of Fuzzy Targets: “Projects without clear goals will not achieve their goals clearly.” Measurement supports estimation of costs (cost of design, cost of testing, cost of coding…), predicting product quality, and considering technology impacts.

Software Measurement Objectives: “You cannot control what you cannot measure.” [DeMarco, 1982] Manager's angle. Cost: measure the time and effort of the various processes (elicitation, design, coding, test). Staff productivity: measure staff time and the size of artifacts; use these to predict the impact of a change. Product quality: record faults, failures, and changes as they occur; cross-compare different projects. User satisfaction: response time, functionality. Potential for improvement.

Software Measurement Objectives (2): Engineer's angle. Are requirements testable? Instead of requiring “reliable operation”, state the expected mean time to failure. Have all faults been found? Use models of expected detection rates. Are product or process goals being met? Fewer than 20 failures per beta-test site in a month; no module contains more than x lines (standards). What does the future hold? Predict product size from specification size.

The scope of software metrics. Cost and effort estimation: COCOMO, the function points model, etc.; effort is a function of size (LOC, function points), developer capability, level of reuse, and so on. Productivity models and measures: the simplistic approach, size/effort, can be quite misleading, even dangerous (see the sketch below).
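A minimal Python sketch of the simplistic productivity measure; the team names and figures are invented for illustration and show how raw size/effort can mislead (the smaller deliverable may hide reuse of a well-tested library).

# Illustrative only: the simplistic productivity measure is size divided by
# effort. The figures below are invented; Team B delivered less code because
# it reused an existing library, so raw LOC per person-month undervalues it.

def productivity(size_loc, effort_person_months):
    """Simplistic productivity: delivered size per unit of effort."""
    return size_loc / effort_person_months

team_a = productivity(size_loc=12000, effort_person_months=10)
team_b = productivity(size_loc=4000, effort_person_months=5)

print(f"Team A: {team_a:.0f} LOC/person-month")   # 1200
print(f"Team B: {team_b:.0f} LOC/person-month")   # 800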

A productivity model

The Scope of Metrics (2). Data collection: easier said than done; it must be planned and executed carefully. Use simple graphs and charts to present collected data (see next slide). Good experiments, surveys, and case studies are essential.

Presenting collected data

The Scope of Metrics (3). Quality models and measurements: quality and productivity models are usually combined, as in Advanced COCOMO (COCOMO II) and McCall's model. They are usually constructed in a tree-like fashion: at the high level are indirect factors, at the low level directly measurable factors.

The Scope of Metrics (4). Reliability models. Performance evaluation and modeling. Structural and complexity metrics: readily available structural properties of code (or design) serve as surrogates for quality assessment, control, and prediction. Management by metrics: many companies define standard tracking and project monitoring/reporting systems. Capability maturity assessment.

Agenda Software Engineering Measurements. Measurement Theory. A Goal-Based Framework for Software Measurement. Verification and Validation Metrics.

The basics of measurement. Some pervasive measurement techniques are taken for granted: a rising column of mercury for temperature measurement was not so obvious 150 years ago, but we have since developed a measurement framework for temperature. How well do we understand the software attributes we want to measure? What is program “complexity”, for example?

The basics of measurement (2). Are we really measuring the attribute we want to measure? Is the number of “bugs” found in system testing a measure of quality? What statements can be made about an attribute? Can we “double design quality”? What operations can be applied to measurements? What is the “average productivity” of the group? What is the “average quality” of software modules?

Empirical relations

Empirical relations (2). “Taller than” is an empirical relation: a binary relation (x is taller than y); compare the unary relation (x is tall). Empirical relations need to be mapped from the real world into mathematics. In this mapping, the real world is the domain and the mathematical world is the range. The range can be the set of integers, real numbers, or even non-numeric symbols.

Representation condition: the mapping should preserve real-world relations. A is taller than B iff M(A) > M(B); the binary empirical relation “taller than” is replaced by the numerical relation >. So “x is much taller than y” may mean M(x) > M(y) + 15.
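A small Python sketch of the representation condition, using an assumed height example (the names and heights are invented): the mapping M from people to centimeters must preserve the empirical relation “taller than”, and one possible mapping of “much taller than” is M(x) > M(y) + 15.

# Illustrative mapping M: person -> height in centimeters (invented values).
heights_cm = {"Ana": 182, "Bo": 169, "Cy": 175}

def taller_than(a, b):
    # Numerical counterpart of the empirical relation "taller than".
    return heights_cm[a] > heights_cm[b]

def much_taller_than(a, b, margin=15):
    # One possible interpretation of "much taller than": M(a) > M(b) + 15.
    return heights_cm[a] > heights_cm[b] + margin

print(taller_than("Ana", "Bo"))       # True: 182 > 169
print(much_taller_than("Cy", "Bo"))   # False: Cy is only 6 cm taller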

Stages of formal measurement

Agenda Software Engineering Measurements. Measurement Theory. A Goal-Based Framework for Software Measurement. Verification and Validation Metrics.

The framework: classifying the entities to be examined; determining relevant measurement goals; identifying the level of maturity reached by the organization.

Classifying software measures. Processes are collections of software-related activities, associated with time and schedule. Products are artifacts, deliverables, or documents that result from a process activity. Resources are entities required by a process activity.

Classifying software measures (2). For each entity, we distinguish internal attributes, those that can be measured purely in terms of the product, process, or resource itself (size, complexity measures, dependencies), and external attributes, those that can only be measured in terms of how the product, process, or resource relates to its environment (experienced failures, timing and performance).

Process measures include: the duration of the process or one of its activities; the effort associated with the process or its activities; the number of incidents of a specific type arising during the process or one of its activities. For example, average cost of error = total cost / number of errors found.
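A worked example of the average-cost-of-error measure, with invented figures (a minimal sketch, not data from the course):

# Assumed numbers: an inspection activity cost $40,000 and found 80 errors.
def average_cost_of_error(total_cost, errors_found):
    return total_cost / errors_found

print(average_cost_of_error(total_cost=40_000, errors_found=80))  # 500.0 dollars per error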

Products. External attributes: reliability, maintainability, understandability (of documentation), usability, integrity, efficiency, reusability, portability, interoperability… Internal attributes: size, effort, cost, functionality, modularity, syntactic correctness.

Product measurements. Direct measure example: entity: module design document (D1); attribute: size; measure: number of bubbles (in the flow diagram). Indirect measure example: entity: module design documents (D1, D2, …); attribute: average module size; measure: average number of bubbles (in the flow diagrams).
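A short sketch of the direct/indirect distinction; the document names and bubble counts are invented. The direct measure is the bubble count of each design document's flow diagram; the indirect measure (average module size) is calculated from those direct measures.

# Direct measures: number of bubbles per module design document (invented).
bubbles_per_document = {"D1": 14, "D2": 9, "D3": 22}

# Indirect measure: average module size, calculated from the direct measures.
average_module_size = sum(bubbles_per_document.values()) / len(bubbles_per_document)
print(f"Average module size: {average_module_size:.1f} bubbles")  # 15.0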

Resources: personnel (individuals or teams), materials (including office supplies), tools, methods. Resource measurement may show which resource to blame for poor quality. Cost is measured across all types of resources. Productivity = amount of output / effort input, which combines a resource measure (input) with a product measure (output).

GQM Paradigm (Goal-Question-Metric). Steps: list the major goals of the development effort; derive from each goal the questions that must be answered to determine whether the goals are being met; decide what must be measured to answer the questions adequately.
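A minimal sketch of what a GQM plan might look like as a data structure; the goal, questions, and metrics below are illustrative assumptions, not the generic or AT&T examples on the following slides.

# Each goal maps to the questions that check it; each question maps to the
# metrics that answer it. All entries here are invented for illustration.
gqm_plan = {
    "Improve the reliability of released software": {
        "How many faults are found before release?": [
            "faults found per inspection",
            "faults found per test phase",
        ],
        "How many failures do users experience?": [
            "failures reported per beta-test site per month",
            "mean time to failure in operation",
        ],
    },
}

for goal, questions in gqm_plan.items():
    print(f"Goal: {goal}")
    for question, metrics in questions.items():
        print(f"  Question: {question}")
        for metric in metrics:
            print(f"    Metric: {metric}")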

Generic GQM Example

AT&T GQM Example

Goal definition templates. Purpose: to (characterize, evaluate, predict…) the (process, model, metric…) in order to (understand, assess, manage, learn, improve…) it. Perspective: examine the (cost, correctness, defects, changes…) from the viewpoint of the (developer, manager, customer…). Environment: the environment consists of process factors, people factors, methods, tools, etc.

Process improvement. Measurement is useful for understanding, establishing the baseline, assessing, and predicting, but the larger context is improvement. The SEI proposed five maturity levels, ranging from the least to the most predictable and controllable.

Process maturity levels

Maturity and measurement overview (level: characteristic; metrics). Initial: ad hoc; baseline metrics. Repeatable: process depends on individuals; project management metrics. Defined: process defined and institutionalized; product metrics. Managed: measured process; process metrics plus feedback. Optimizing: improvement fed back to the process; process metrics plus feedback for changing the process.

Repeatable process

A defined process

A managed process

Applying the framework. Cost and effort estimation: E = a * S^b, where E is effort (person-months), S is size (thousands of delivered source statements), and a, b are environment-specific constants. Data collection: orthogonal defect classification.
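A small sketch of the effort model E = a * S^b; the constants a = 3.0 and b = 1.12 are illustrative placeholders, not calibrated values from the lecture, since a and b are environment specific.

def estimated_effort(size_kdsi, a=3.0, b=1.12):
    # Effort in person-months for size in thousands of delivered source statements.
    # a and b are environment-specific; these defaults are assumed for illustration.
    return a * size_kdsi ** b

for size in (10, 50, 100):
    print(f"{size} KDSI -> {estimated_effort(size):.0f} person-months")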

Applying the framework (2). Reliability models: the Jelinski-Moranda (JM) model predicts MTTF_i = a / (N - i + 1), where N is the total number of faults and 1/a is the “fault size”. Capability maturity assessment: maturity is also viewed as an attribute of the contractor's process.
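A small sketch of the JM prediction MTTF_i = a / (N - i + 1) with invented parameters (N = 50 total faults, a = 200 hours, so 1/a is the per-fault “fault size”): each fault removed lengthens the expected time to the next failure.

def jm_mttf(i, total_faults=50, a=200.0):
    # Expected time to the i-th failure after i - 1 faults have been removed.
    return a / (total_faults - i + 1)

for i in (1, 25, 49):
    print(f"MTTF_{i} = {jm_mttf(i):.1f} hours")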

Applying the framework (3) Evaluation of methods and tools

Agenda Software Engineering Measurements. Measurement Theory. A Goal-Based Framework for Software Measurement. Verification and Validation Metrics.

V&V application of metrics. Applicable throughout the lifecycle; should be condensed for small projects; used to assess products, processes, and resources. V&V metric characteristics: simplicity, objectivity, ease of collection, robustness (insensitivity to changes), and validity.

V&V specific metrics (requirements and design)