Chapter ? Quality Assessment
CSE784 Software Studio Class Notes
Jim Fawcett
Copyright © 1999-2003
Software Product Model

  Architecture – Operational Concept Document (OCD)
    - defines product components and allocates processing to them
    - defines external product behavior
  Software Requirements Specification (SRS)
    - describes what constitutes correct operation
    - is the basis for testing and evaluation
  Software Design Document (SDD)
    - defines an architecture for the system
    - describes software design and implementation
    - specifies a software build process
  Test Plan
    - defines procedures for unit, integration, validation, qualification, and regression testing
    - qualification test procedures are emphasized
  Prototype Code
    - verifies design for critical processing, analyzes implementation problems as they arise
  Product Code
    - code for each component of the product, implemented as software modules
    - a test stub is attached to each module, used to establish basic software cycling and nearly correct operation (see the sketch following this list)
  Test Code
    - test drivers for unit, integration, and qualification tests
  Test Report

The Architecture
  - defines a partition of the product into component parts
  - describes in detail the product's public interface
  - lists critical processing
  - may define major data structures
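The attached test stub can be sketched in C++ roughly as follows. This is a minimal illustration assuming a conditional-compilation scheme; the Scanner module, its interface, and the TEST_SCANNER flag are hypothetical names, not taken from these notes.

```cpp
// scanner.h - hypothetical product-code module: public interface only
#ifndef SCANNER_H
#define SCANNER_H
#include <string>
#include <vector>

class Scanner
{
public:
  explicit Scanner(const std::string& text);
  std::vector<std::string> tokens() const;  // whitespace-delimited tokens
private:
  std::string text_;                        // private implementation detail
};
#endif

// scanner.cpp - private implementation with attached test stub
#include <iostream>
#include <sstream>

Scanner::Scanner(const std::string& text) : text_(text) {}

std::vector<std::string> Scanner::tokens() const
{
  std::istringstream in(text_);
  std::vector<std::string> toks;
  for (std::string tok; in >> tok; )
    toks.push_back(tok);
  return toks;
}

#ifdef TEST_SCANNER
// test stub: compiled only when the module is built stand-alone,
// used to establish basic cycling and nearly correct operation
int main()
{
  Scanner s("a small test string");
  for (const auto& t : s.tokens())
    std::cout << t << "\n";
}
#endif
```

Building with the TEST_SCANNER flag defined turns the module into its own small test executable; in a product build the flag is left undefined and the module is exercised only by the test drivers in the Test Code.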
Requirements Analysis

  Software requirements analysis and preliminary design are processes of breaking down, or decomposition, in the application domain:
  - Application requirements are decomposed into processes and data flows.
  - A process is a logical model of some part of the program's activities, necessary to satisfy part of its requirements model.
  - Data flows represent the information necessary to sustain the activities allocated to a process.
  - Each process is allocated part of the program's requirements model and may derive additional requirements necessary to complete or disambiguate its processing model.
  - A design structure is developed by associating major processes with modules. Each such process and its data flows represent the public interface of its module (see the sketch following this list).
  - Each stage of the decomposition must flow down, or allocate, requirements to its component parts; otherwise there is no basis for deciding the correctness of the design.
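A small C++ sketch of how one process and its data flows might become a module's public interface. The "analyze dependencies" process, its data flows, and every name below are assumptions used only to show the mapping, not content from the notes.

```cpp
#include <iostream>
#include <map>
#include <string>
#include <vector>

// incoming data flow: a set of source file names
using FileSet = std::vector<std::string>;
// outgoing data flow: for each file, the files it depends on
using DependencyGraph = std::map<std::string, std::vector<std::string>>;

// the DFD process becomes a module whose public interface accepts the
// incoming flow and produces the outgoing flow
class DependencyAnalyzer
{
public:
  DependencyGraph analyze(const FileSet& files)
  {
    DependencyGraph graph;
    for (const auto& f : files)
      graph[f] = {};   // placeholder: the real analysis is the requirement allocated to this module
    return graph;
  }
};

int main()
{
  DependencyAnalyzer da;
  auto graph = da.analyze({ "parser.cpp", "scanner.cpp" });
  std::cout << "analyzed " << graph.size() << " files\n";
}
```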
SRS Quality Measures

  Legally Complete
    - All requirements are specified.
    - Complete description of behavior and appearance.
    - Traceability between A and B levels.
  Unambiguous
    - Requirements need no interpretation and should be read very literally.
    - This depends to some extent on the expertise of the participants – customer and developers.
  Absolutely Consistent
    - Context diagram and all levels of Data Flow Diagrams balance.
    - Flow names match in spelling and case.
    - Data Dictionary lists all data flows as shown on the above diagrams with exact spelling and case.
    - Fairly uniform level of detail in each leaf-node process description, e.g., DFDs are balanced.
    - Paragraph numbers match Data Flow Diagram numbering.
  Testable and No Design Detail
    - Requirements have no adjectives or adverbs, e.g., no best, maximum, highest, lowest, etc.
    - No design details, e.g., no requirements on data transfer, implementation strategy, data structures, or control.
Software Design Issues

  Abstraction
    - logical model of a component's responsibilities
  Modularity
    - partitioning into cohesive components
  Encapsulation
    - separating public interface from private implementation
    - building firewalls
  Hierarchy
    - delegating responsibilities
  Cohesion
    - each component should be focused on only one or two activities
  Coupling
    - narrow coupling to maximize independence
    - each component should need to know as little as possible about other components
  Locality of reference
    - data referenced by a component should be defined or declared by the component
  Size and complexity
    - each component is small and simple enough to test thoroughly
  Object Oriented Design
    - an object takes care of itself, allocating any resources it needs and cleaning up after itself (see the sketch following this list)
    - it is a self-contained processing entity which serves others through calls to its public interface
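A small C++ sketch of the "object takes care of itself" idea, using resource acquisition in the constructor and cleanup in the destructor. The Logger class and file name are illustrative only, not part of the course material.

```cpp
#include <fstream>
#include <stdexcept>
#include <string>

class Logger
{
public:
  // constructor acquires the resource the object needs
  explicit Logger(const std::string& fileName) : out_(fileName)
  {
    if (!out_)
      throw std::runtime_error("Logger: cannot open " + fileName);
  }
  // clients see only the public interface
  void write(const std::string& msg) { out_ << msg << '\n'; }
  // destructor cleans up after the object - clients never manage the file
  ~Logger() { out_.close(); }
private:
  std::ofstream out_;   // private implementation detail, hidden behind the interface
};

int main()
{
  Logger log("session.log");        // resource acquired here
  log.write("design review passed");
  return 0;                         // resource released automatically here
}
```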
Design Quality Measures

  - Software is decomposed into executive and server modules
    - graphical user interfaces delegate all their computation
    - server modules are cohesive, focused on a single activity or an intimately related set of activities
  - High-level server modules have well-defined application domain abstractions and support salvage reuse. A module supports salvage reuse if it:
    - has a strong abstraction
    - has no recursive dependencies on sibling modules
    - depends only on modules it calls, not on modules calling it, e.g., makes no assumptions about its callers
    - makes very few assumptions about its environment
  - Low-level server modules have well-defined solution domain abstractions and are reusable in bottom-up development.
  - All modules are self-documenting
    - manual and maintenance pages (see the sketch following this list)
    - comments to describe complex or tricky code
  - All modules support testing
    - provide test stubs
    - catch exceptions, provide meaningful error messages
  - Modules are relatively small and simple
    - can be represented with a single structure chart
    - functions are small and simple: Source Lines of Code (SLOC) < 50, Cyclomatic Complexity (CC) < 10
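One way a self-documenting module header with manual and maintenance pages might look. The prologue layout, the Tokenizer module, and the build command are assumptions used for illustration; only the manual-page/maintenance-page idea comes from the notes.

```cpp
/////////////////////////////////////////////////////////////////
// tokenizer.h - splits an input string into whitespace tokens  //
// ver 1.0                                                      //
/////////////////////////////////////////////////////////////////
/*
  Manual Page
  -----------
  Module Operations:
    Tokenizer returns one whitespace-delimited token per call to next().

  Public Interface:
    Tokenizer tok("some input text");
    while (tok.hasNext())
      std::cout << tok.next() << "\n";

  Maintenance Page
  ----------------
  Build Process:
    g++ -std=c++17 -DTEST_TOKENIZER -o tokenizer tokenizer.cpp
  Maintenance History:
    ver 1.0 : initial release
*/
#ifndef TOKENIZER_H
#define TOKENIZER_H
#include <string>

class Tokenizer
{
public:
  explicit Tokenizer(const std::string& text);
  bool hasNext() const;
  std::string next();
private:
  std::string text_;
  std::size_t pos_ = 0;
};
#endif
```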
Design Document Quality Measures

  Complete description of the "as-built" product, including descriptions of:
  Structure
    - modular structure and modules
    - classes and class relationships
    - function calling relationships
    - responsibilities
  Operation
    - object creation, lifetime, and messaging relationships
    - events
    - states and state transitions
Software Implementation Issues

  Modular structure
    - manual page
    - maintenance page
    - public interface
    - private implementation
    - test stub
    - test drivers
  The importance of correct operation
    - design attributes, e.g., size and complexity
    - the role of testing
    - pre- and post-conditions, formalizing assumptions (see the sketch following this list)
    - checking invariants
  Honoring the logical model
    - Good Neighbor policy
    - Good Housekeeping policy
  Object oriented software
    - building sets of objects
    - empowering objects
    - client-server model
  Error handling
  Managing interfaces
  Developer's responsibilities for maintenance
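A minimal sketch of checking pre- and post-conditions and a class invariant with assertions. The BoundedStack class, its capacity bound, and the TEST_BOUNDEDSTACK flag are hypothetical; the notes name the practice, not this code.

```cpp
#include <cassert>
#include <vector>

class BoundedStack
{
public:
  explicit BoundedStack(std::size_t capacity) : cap_(capacity) { assert(invariant()); }

  void push(int value)
  {
    assert(items_.size() < cap_);     // precondition: stack is not full
    items_.push_back(value);
    assert(!items_.empty());          // postcondition: stack is not empty
    assert(invariant());              // invariant holds on every exit
  }

  int pop()
  {
    assert(!items_.empty());          // precondition: stack is not empty
    int top = items_.back();
    items_.pop_back();
    assert(invariant());
    return top;
  }

private:
  bool invariant() const { return items_.size() <= cap_; }  // formalized assumption
  std::vector<int> items_;
  std::size_t cap_;
};

#ifdef TEST_BOUNDEDSTACK
int main()
{
  BoundedStack s(3);
  s.push(1); s.push(2);
  return s.pop() == 2 ? 0 : 1;
}
#endif
```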
Implementation Quality Measures

  To Be Defined
Software Testing

  Construction testing (build a little, test a little)
    - test stubs
    - testing invariants
    - incremental development
  Unit testing (did I do it correctly?)
    - striving to make each component operate correctly, e.g., according to its specifications (see the driver sketch following this list)
  Integration testing (will they come together?)
    - bringing separately developed components into the same build
  Performance testing (are there adequate margins?)
    - examine critical time-lines
    - random access and disk memory use
  Validation testing (does it crash?)
    - stress tests, pattern tests, random inputs
  Qualification testing (did we meet all requirements?)
    - a legal verification that requirements have been met
  Regression testing (does it still work?)
    - high-level test of public operations
    - used when porting to new platforms, after maintenance modifications, and when re-installing
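A sketch of what a unit test driver might look like, exercising the hypothetical Scanner module shown earlier. The check helper and the test cases are assumptions made for illustration.

```cpp
#include "scanner.h"
#include <iostream>

static int failures = 0;

void check(bool passed, const char* testName)
{
  std::cout << (passed ? "passed: " : "FAILED: ") << testName << "\n";
  if (!passed) ++failures;
}

int main()
{
  Scanner empty("");
  check(empty.tokens().empty(), "empty input yields no tokens");

  Scanner three("a bb ccc");
  check(three.tokens().size() == 3, "three tokens found");
  check(three.tokens()[2] == "ccc", "last token is ccc");

  std::cout << (failures == 0 ? "all tests passed\n" : "some tests FAILED\n");
  return failures;   // non-zero exit signals failure to the build process
}
```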
Test Quality Measures

  To Be Defined
End of Presentation