TQS - Teste e Qualidade de Software (Software Testing and Quality)
Test Case Design – Model Based Testing
João Pascoal Faria
jpf@fe.up.pt | www.fe.up.pt/~jpf
Model based/driven engineering
- Models are intermediate work products of the engineering activities, between requirements and code
[Figure: V-model in which model design appears as an additional step between the (controller) software requirements and the implementation, with verification & validation (V&V) at each level: system V&V, controller V&V, model V&V, unit V&V]
- "We use modeling and simulation for early validation of the system solution"
Source: "Failure mode avoidance", Tim Davis, 27th International Conference on Software Engineering, 2005
Models as RE output and testing input
[Figure: the requirements engineering process takes opportunity analysis, market analysis, customer requests, business rules, and project/process/documentation standards as inputs; it produces (a) a requirements model, which undergoes (b) model validation, and a requirements document consumed by software design, development, testing, marketing, etc.]
Source: Larry Bernstein, Model Based Testing ICSE Workshop, May 05
Black-box model-based testing
- The model represents the required behavior of a software item or system
- The design of the system is not modeled
- Only external interfaces are considered: inputs, outputs and usage conditions
- Black-box tests are generated from the models (a minimal sketch follows)
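To make the idea concrete, here is a minimal sketch (all names are hypothetical, not from the slides) in which a behavioral model acts as an executable oracle: the test drives only the external interface and compares the observable outputs of the model and of the system under test.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// External interface: the only thing black-box tests may depend on.
interface StackSpec {
    void push(int x);
    int pop();
}

// The behavioral model: a trusted, simple statement of required behavior.
class StackModel implements StackSpec {
    private final Deque<Integer> items = new ArrayDeque<>();
    public void push(int x) { items.push(x); }
    public int pop() { return items.pop(); }
}

class BlackBoxComparison {
    // A generated test feeds the same input sequence to the model and to the
    // system under test (any other StackSpec) and compares observable outputs;
    // the SUT's internal design stays opaque.
    static boolean conforms(StackSpec model, StackSpec sut, int[] inputs) {
        for (int x : inputs) { model.push(x); sut.push(x); }
        for (int i = 0; i < inputs.length; i++)
            if (model.pop() != sut.pop()) return false;  // observable mismatch
        return true;
    }
}
```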
Different types of models
- Visual models (the focus here)
  - UML diagrams
- Formal models
  - Formal specifications
  - E.g. visual models enriched with contracts/assertions (invariants, pre-conditions and post-conditions)
- Executable models
  - E.g. visual models enriched with high-level action specifications
- Formal and executable models are addressed later (formal V&V methods)
Model-based testing with visual models
- Behavioral UML models
  - Use case models
  - Interaction models: sequence diagrams, collaboration diagrams
  - State machine models
  - Activity models – same techniques as white-box testing
- These models can be used to generate test cases in a more or less automated way
Model-based testing with formal models
- Formal specification = formal model
- Non-executable formal specifications
  - Constraint language
  - Contracts: operation pre/post-conditions
  - Can be expressed in OCL – Object Constraint Language
  - Post-conditions can be used to check outcomes – test oracle
- Executable formal specifications
  - Action language
  - Executable test oracle for conformance testing
  - Example: "Colocação de professores" (teacher placement)
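As an illustration of contracts acting as a test oracle, the sketch below hand-translates OCL-style pre/post-conditions and an invariant into runtime assertions in Java; the Account class and its contract are assumptions, not taken from the slides. Run with `java -ea` so the assertions are actually checked.

```java
// Hypothetical class with an OCL-style contract made executable.
class Account {
    private int balance;                 // inv: balance >= 0

    // pre:  amount > 0
    // post: balance = balance@pre + amount
    void deposit(int amount) {
        assert amount > 0 : "precondition violated";
        int pre = balance;               // capture the @pre value
        balance += amount;               // operation body
        // The postcondition is the test oracle: it checks the outcome.
        assert balance == pre + amount : "postcondition violated";
        assert balance >= 0 : "invariant violated";
    }

    int getBalance() { return balance; }
}
```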
State-transition testing
- Construct a state-transition model (state machine model) of the item to be tested, from the perspective of a user/client
  - E.g. with a state diagram in UML
- Define test cases to exercise all states and all transitions between states (see the sketch after this list)
  - Usually not all possible paths (sequences of states and transitions), because of combinatorial explosion
  - Each test case describes a sequence of inputs and outputs (including pre/post internal states), and may cover several states and transitions
  - Also test to fail – with unexpected inputs for a particular state
- Particularly useful for testing user interfaces
  - state – a particular form/screen/page or a particular mode (inspect, insert, modify, delete; draw line mode, brush mode, etc.)
  - transition – navigation between forms/screens/pages or transition between modes
- Also useful to test object-oriented software (object life cycles)
- Particularly useful when a state-transition model already exists as part of the specification
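Below is a minimal sketch of the technique for a hypothetical three-state media player: one test case covers all states and all transitions of the model, and a second is a "test to fail" that sends an unexpected input for the current state. The Player class and the JUnit 5 tests are illustrative assumptions, not from the slides.

```java
import static org.junit.jupiter.api.Assertions.*;
import org.junit.jupiter.api.Test;

// States of the model under test.
enum PlayerState { STOPPED, PLAYING, PAUSED }

class Player {
    PlayerState state = PlayerState.STOPPED;
    void play() {
        if (state == PlayerState.PLAYING)
            throw new IllegalStateException("play not allowed in " + state);
        state = PlayerState.PLAYING;               // STOPPED/PAUSED -> PLAYING
    }
    void pause() {
        if (state != PlayerState.PLAYING)
            throw new IllegalStateException("pause not allowed in " + state);
        state = PlayerState.PAUSED;                // PLAYING -> PAUSED
    }
    void stop() { state = PlayerState.STOPPED; }   // any state -> STOPPED
}

class PlayerTransitionTest {
    @Test
    void coversAllStatesAndTransitions() {
        Player p = new Player();
        p.play();  assertEquals(PlayerState.PLAYING, p.state); // STOPPED -play-> PLAYING
        p.pause(); assertEquals(PlayerState.PAUSED,  p.state); // PLAYING -pause-> PAUSED
        p.play();  assertEquals(PlayerState.PLAYING, p.state); // PAUSED -play-> PLAYING
        p.stop();  assertEquals(PlayerState.STOPPED, p.state); // PLAYING -stop-> STOPPED
        p.play(); p.pause(); p.stop();                          // PAUSED -stop-> STOPPED
        assertEquals(PlayerState.STOPPED, p.state);
    }

    @Test
    void testToFail_unexpectedInputForState() {
        // "Test to fail": pause is not an expected input in state STOPPED.
        assertThrows(IllegalStateException.class, () -> new Player().pause());
    }
}
```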
Composable models for integration testing
[Figure: two composed state machines, Tuner and Hop, exchanging drop-request, drop-acknowledge and restore calls and their returns, with AUTO/MANUAL variants and states marked green, orange and red]
Source: Tim Trew, "Making it stick: Industrial Model-Based Testing", Model Based Testing ICSE Workshop, May 05
Use case and scenario testing
- Particularly adequate for system and integration testing
- Use cases capture functional requirements
- Each use case is described by one or more normal flows of events and zero or more exceptional flows of events (also called scenarios)
- Each scenario can be depicted by a sequence diagram
- Define at least one test case for each scenario
- Build and maintain a (tests to use cases) traceability matrix (sketched below)
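One lightweight way to keep such a matrix is in code next to the test suite; in practice it is often a spreadsheet or a test-management tool instead. In the sketch below all use case, scenario and test identifiers are invented; it also shows a simple coverage check that every scenario traces to at least one test case.

```java
import java.util.List;
import java.util.Map;

class Traceability {
    // Use case -> scenario -> test cases (at least one test per scenario).
    static final Map<String, Map<String, List<String>>> MATRIX = Map.of(
        "UC1 Make reservation", Map.of(
            "Main scenario",            List.of("T1"),
            "Alt1: date not available", List.of("T2", "T3")),
        "UC2 Cancel reservation", Map.of(
            "Main scenario",            List.of("T4"))
    );

    // Coverage check: no scenario may be left without a test case.
    static boolean everyScenarioCovered() {
        return MATRIX.values().stream()
                .flatMap(m -> m.values().stream())
                .noneMatch(List::isEmpty);
    }
}
```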
From use cases to test cases: map use case to sequence diagram
- A use case contains one main scenario and zero or more alternative scenarios
- Each scenario is described by a sequence diagram, shown using the sequence diagram reference notation of UML 2.0

Use case: Make conventional reservation
Summary: Reservation clerk reserves a room for a hotel guest
Actor: Reservation clerk
Precondition: Reservation clerk has logged into the system
Description:
  1. Guest gives personal details to the clerk, such as name, room type, ...
  2. Reservation clerk enters the info ...
  3. If the reservation is available ...
Alternatives:
  Alt1 (step 3): If the reservation is not available, the clerk asks the customer for another date
Postcondition: The reservation is saved and marked as guaranteed
From use cases to test cases: sequence diagram example
- A sequence diagram is a graphical description of a scenario
- The sequence diagram shown here is a special case that shows the sequence of (controllable) actor inputs and the expected (observable) system responses
From use cases to test cases: map sequence diagram to annotated sequence
- A sequence diagram is made test-ready by:
  - identifying system inputs and outputs
  - adding constraints to system inputs and outputs

{1} Inputs: s1: SingleReservation, c1: Conventional
    Input constraints: (s1.startDate ≤ s1.endDate) and ((c1.style = oneBedStyle) xor (c1.style = twoBedStyle))

(An executable form of these constraints is sketched below.)
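The annotated constraints can be turned into an executable predicate, useful both to validate hand-picked test inputs and to filter randomly generated ones. The sketch below mirrors the slide's names (SingleReservation, Conventional, the style values), but the Java types themselves are assumptions.

```java
import java.time.LocalDate;

// Assumed input types mirroring the names on the slide.
record SingleReservation(LocalDate startDate, LocalDate endDate) {}
record Conventional(String style) {}   // "oneBedStyle" or "twoBedStyle"

class InputConstraints {
    // (s1.startDate <= s1.endDate) and
    // ((c1.style = oneBedStyle) xor (c1.style = twoBedStyle))
    static boolean holds(SingleReservation s1, Conventional c1) {
        boolean datesOk = !s1.startDate().isAfter(s1.endDate());
        boolean oneBed  = "oneBedStyle".equals(c1.style());
        boolean twoBed  = "twoBedStyle".equals(c1.style());
        return datesOk && (oneBed ^ twoBed);   // ^ is exclusive or
    }
}
```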
From use cases to test cases: map annotated sequence to test specification
- Map input constraints to the test input specifications
- Map output constraints to the test output specifications
- Map the use case precondition to the test's special procedure requirements
- An annotated sequence maps to a test specification (IEEE Std 829-1998)

Test specification:
  Function to exercise: Make Reservation use case, main scenario
  Test id: T1
  Test items (objective): exercise the main scenario of the Make Reservation use case for the Conventional room feature
  Inputs: {1} s1: SingleReservation, c1: Conventional
    Constraints: (s1.startDate ≤ s1.endDate) and ((c1.style = oneBedStyle) xor (c1.style = twoBedStyle))
  Outputs: {2} r1: Reservation
    Expected values: r1.status = reserved
  Special procedure requirements: precondition: ReservationClerk.status = logged in
  Intercase dependencies: the precondition is set by the Login.Main test
From use cases to test cases: derive test case from test specification
- A test specification is used to generate one or more test cases
- A value is randomly selected from a predefined list of values to satisfy the test specification constraints
- The test case derived from the specification above is shown here, followed by an executable sketch

Test case T1 (derived):
  Function to exercise: Make Reservation use case, main scenario
  Test id: T1
  Test items (objective): exercise the main scenario of the Make Reservation use case for the Conventional room feature
  Inputs: {1} s1.startDate = 05/12/2006, s1.endDate = 05/14/2006, c1.style = oneBedStyle
  Outputs: {2} r1.status = reserved
  Special procedure requirements: precondition: ReservationClerk.status = logged in
  Intercase dependencies: the precondition is set by the Login.Main test

Source: Erika Mir Olimpiew, Hassan Gomaa, "Model-based Testing for Applications Derived from Software Product Lines", Model Based Testing ICSE Workshop, May 05
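Finally, the derived test case can be written as an executable JUnit 5 test. The ReservationSystem facade and the Reservation record below are placeholder assumptions standing in for the real system under test; the input values and the expected output come from the derived test case above, and the style value is picked at random from a predefined list, as the slide describes.

```java
import static org.junit.jupiter.api.Assertions.*;
import org.junit.jupiter.api.Test;
import java.time.LocalDate;
import java.util.List;
import java.util.Random;

// Placeholder stand-ins for the real system under test.
record Reservation(String status) {}
class ReservationSystem {
    void login(String user, String password) { /* stub */ }
    Reservation makeReservation(LocalDate start, LocalDate end, String style) {
        return new Reservation("reserved");   // stub; the real SUT goes here
    }
}

class MakeReservationMainScenarioTest {
    // Predefined list of values satisfying the style constraint; one value is
    // picked at random to instantiate the specification.
    static final List<String> STYLES = List.of("oneBedStyle", "twoBedStyle");

    @Test
    void t1_mainScenario_conventionalRoom() {
        ReservationSystem sys = new ReservationSystem();
        sys.login("clerk", "secret");             // precondition: clerk logged in

        String style = STYLES.get(new Random().nextInt(STYLES.size()));
        Reservation r1 = sys.makeReservation(
                LocalDate.of(2006, 5, 12),        // s1.startDate = 05/12/2006
                LocalDate.of(2006, 5, 14),        // s1.endDate   = 05/14/2006
                style);                           // c1.style

        assertEquals("reserved", r1.status());    // expected: r1.status = reserved
    }
}
```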
References and further reading