Projected State Machine Coverage for Software Testing
Galit Friedman, Alan Hartman, Kenneth Nagin, Tomer Shiran
IBM Haifa Research Laboratory
ISSTA 2002
Outline
– Specification-based testing
– EFSM models and test generation
– Projected state machine coverage criteria
– Test generation algorithms
– Experimental & industrial experience
Specification-based Testing
– Modeling: build a model based on the specifications
– Test generation: derive test cases from the model
– Test cases are generated according to some coverage criterion
– Test cases contain stimuli and expected responses
– The output is a test suite
EFSM Models
– Labeled directed graph
– Nodes (states) labeled with both control and data
– Arcs (transitions) labeled by stimuli to the application
– Includes expected responses to stimuli
[Figure: EFSM of a two-slot buffer; states such as Start 0 -, deposit 1 OK, deposit 2 OK, deposit 2 fail, withdraw 0 fail, withdraw 0 OK, withdraw 1 OK; arcs labeled deposit and withdraw]
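For concreteness, here is a minimal Python sketch of the two-slot buffer EFSM from the figure above; the names State, step, and transitions are illustrative only, not the modeling notation of any particular tool.

```python
from typing import Iterator, NamedTuple

class State(NamedTuple):
    action: str   # last stimulus applied: "start", "deposit", or "withdraw"
    buffer: int   # data part of the state: items currently in the buffer
    result: str   # expected response to that stimulus: "-", "OK", or "fail"

CAPACITY = 2
START = State("start", 0, "-")

def step(s: State, stimulus: str) -> State:
    """Apply a stimulus and return the successor state, whose label
    carries the expected response."""
    if stimulus == "deposit":
        return (State("deposit", s.buffer + 1, "OK") if s.buffer < CAPACITY
                else State("deposit", s.buffer, "fail"))
    if stimulus == "withdraw":
        return (State("withdraw", s.buffer - 1, "OK") if s.buffer > 0
                else State("withdraw", s.buffer, "fail"))
    raise ValueError(f"unknown stimulus: {stimulus}")

def transitions(s: State) -> Iterator[tuple[str, State]]:
    """All outgoing arcs of a state as (stimulus, successor) pairs."""
    for stimulus in ("deposit", "withdraw"):
        yield stimulus, step(s, stimulus)
```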
Problems with EFSMs
– State space explosion
– Test case explosion
Test Generation
– Extracting a set of paths from the EFSM
– How do you choose which paths? Coverage criteria!
[Figure: the two-slot buffer EFSM from the previous slide]
Projected State Machine
[Figure: the two-slot buffer EFSM on the left; on the right its projection, which drops the buffer value and keeps only action and result, leaving the states Start -, deposit OK, deposit fail, withdraw fail, withdraw OK]
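The projection simply discards part of each state label. The standalone sketch below (assuming the same two-slot buffer, with concrete states represented as (action, buffer, result) tuples) enumerates the reachable concrete states and projects them onto (action, result); the 7 concrete states collapse to the 5 projected states shown in the figure.

```python
CAPACITY = 2  # two-slot buffer from the running example

def step(buf, stimulus):
    """Successor buffer value and expected result for a stimulus."""
    if stimulus == "deposit":
        return (buf + 1, "OK") if buf < CAPACITY else (buf, "fail")
    return (buf - 1, "OK") if buf > 0 else (buf, "fail")

# Concrete states are (last action, buffer value, result) triples.
start = ("start", 0, "-")
concrete, frontier = {start}, [start]
while frontier:
    _, buf, _ = frontier.pop()
    for stim in ("deposit", "withdraw"):
        nbuf, nres = step(buf, stim)
        nxt = (stim, nbuf, nres)
        if nxt not in concrete:
            concrete.add(nxt)
            frontier.append(nxt)

# The projection drops the data part (buffer value) of each state.
projected = {(action, result) for action, _, result in concrete}
print(len(concrete), "concrete states ->", len(projected), "projected states")
# 7 concrete states -> 5 projected states
```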
Coverage Criteria I
– CC_State_Projection on …
– Generates a set of test cases, one through each equivalence class of states in the projected state machine
– E.g. CC_State_Projection on action; result;
[Figure: the resulting projected state machine with states Start -, deposit OK, deposit fail, withdraw fail, withdraw OK]
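One way to read this criterion: group the reachable concrete states into equivalence classes under the projection, then require one test case reaching each class. A sketch, assuming the state tuples of the earlier examples (the helper name is hypothetical):

```python
from collections import defaultdict

def state_projection_tasks(concrete_states):
    """Equivalence classes of concrete states under the projection onto
    (action, result); CC_State_Projection asks for one test case that
    reaches each class."""
    classes = defaultdict(set)
    for action, buf, result in concrete_states:
        classes[(action, result)].add((action, buf, result))
    return classes

tasks = state_projection_tasks({
    ("start", 0, "-"), ("deposit", 1, "OK"), ("deposit", 2, "OK"),
    ("deposit", 2, "fail"), ("withdraw", 0, "fail"),
    ("withdraw", 0, "OK"), ("withdraw", 1, "OK"),
})
for projected_state, members in sorted(tasks.items()):
    print(projected_state, "<-", sorted(members))   # 5 coverage tasks
```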
Coverage Criteria II
– CC_Transition_Projection from … to …
– E.g. CC_Transition_Projection from action; to action; result;
– Equivalent to Carver and Tai's CSPE-1 coverage criterion (Constraints on Succeeding and Preceding Events), IEEE TSE 1998
– Controllable stimuli: Start, Deposit, Withdraw
– Observable results: Fail, OK
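Transition projection works the same way, except that the coverage tasks are pairs of projected endpoints, i.e. classes of succeeding events, which is the connection to CSPE-1. A standalone sketch over the buffer example:

```python
CAPACITY = 2

def step(buf, stimulus):
    if stimulus == "deposit":
        return (buf + 1, "OK") if buf < CAPACITY else (buf, "fail")
    return (buf - 1, "OK") if buf > 0 else (buf, "fail")

# Walk every concrete transition and project both endpoints onto
# (action, result); each distinct projected pair is one coverage task
# for CC_Transition_Projection.
start = ("start", 0, "-")
seen, frontier, tasks = {start}, [start], set()
while frontier:
    src = frontier.pop()
    for stim in ("deposit", "withdraw"):
        nbuf, nres = step(src[1], stim)
        dst = (stim, nbuf, nres)
        tasks.add(((src[0], src[2]), (dst[0], dst[2])))
        if dst not in seen:
            seen.add(dst)
            frontier.append(dst)

print(len(tasks), "projected-transition coverage tasks")
```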
Other coverage criteria for test generation
– Hartmann et al., ISSTA 2000 – transition coverage of data partitions
– Offutt & Abdurazik, UML 1999 – explicit test purposes, transition coverage, predicate coverage of transitions
– Jéron & Morel, CAV 1999 – test purposes
– Ammann et al., FME 1998 – mutation coverage
– Henniger & Ural, SDL 2000 – define-use coverage on the message flow graph
Test Constraints
– Forbidden classes of states
– Forbidden classes of paths
– E.g. TC_Forbidden_State buffer = 2;
[Figure: the buffer EFSM with the buffer = 2 states marked Forbidden]
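A forbidden-state constraint can be honored by pruning during traversal: a state matching the constraint is never recorded or expanded, so no generated test passes through it. A sketch of TC_Forbidden_State buffer = 2; on the buffer model (the predicate name is illustrative):

```python
CAPACITY = 2

def step(buf, stimulus):
    if stimulus == "deposit":
        return (buf + 1, "OK") if buf < CAPACITY else (buf, "fail")
    return (buf - 1, "OK") if buf > 0 else (buf, "fail")

def forbidden(state):
    """TC_Forbidden_State buffer = 2: reject any state whose data part is 2."""
    _, buf, _ = state
    return buf == 2

start = ("start", 0, "-")
seen, frontier = {start}, [start]
while frontier:
    _, buf, _ = frontier.pop()
    for stim in ("deposit", "withdraw"):
        nbuf, nres = step(buf, stim)
        nxt = (stim, nbuf, nres)
        if forbidden(nxt) or nxt in seen:
            continue   # forbidden states are never recorded or expanded
        seen.add(nxt)
        frontier.append(nxt)

print(sorted(seen))  # no state with buffer value 2 appears
```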
Test Generation Algorithm
– Traverse the whole reachable state space of the EFSM
  – Use BFS, DFS, or CFS
  – Record data on reachable coverage tasks, including random selection of a representative per task
  – Eliminate forbidden configurations
– Extract a path to each selected task representative (see the sketch below)
– When the state space is too large, generate tests on-the-fly
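A minimal sketch of the two phases on the buffer example: BFS traversal with one representative concrete state per coverage task (here, per projected state) and path extraction via parent pointers. For simplicity the representative is the first state discovered rather than a randomly chosen one, and forbidden-state pruning is omitted.

```python
from collections import deque

CAPACITY = 2

def step(buf, stimulus):
    if stimulus == "deposit":
        return (buf + 1, "OK") if buf < CAPACITY else (buf, "fail")
    return (buf - 1, "OK") if buf > 0 else (buf, "fail")

START = ("start", 0, "-")

# Phase 1: BFS over the reachable state space, keeping a parent pointer
# per state and a representative concrete state per coverage task.
parent = {START: None}
representative = {("start", "-"): START}
queue = deque([START])
while queue:
    state = queue.popleft()
    for stim in ("deposit", "withdraw"):
        nbuf, nres = step(state[1], stim)
        nxt = (stim, nbuf, nres)
        if nxt in parent:
            continue
        parent[nxt] = state
        representative.setdefault((stim, nres), nxt)
        queue.append(nxt)

# Phase 2: extract a path (test case) from the start state to each
# representative; stimuli along the path are the test inputs, and the
# result labels are the expected responses.
def path_to(state):
    path = []
    while state is not None:
        path.append(state)
        state = parent[state]
    return list(reversed(path))

for task, rep in representative.items():
    print(task, "->", path_to(rep))
```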
Experiments
– Buffer, Readers and Writers, Gas Station, Sliding Window Protocol, Elevator Control
– Use different projections to obtain a hierarchy of test suites of varying strength
– More projection variables created larger test suites with increased power of defect detection
– Use test constraints to partition the state space, enabling measurable coverage of well-defined subsets of behavior
Distributed File System
– Statistics: FSM states, 290 test cases, 729 coverage tasks
– Original test: 12 person-months (PM), 18 defects (10 of severity 2)
– Our test: 10 person-months (PM), 15 old defects (10 of severity 2), 2 new defects
– Bottom line – we made a convert
Call Center
– Two FSM models
– 37 defects
– Responsiveness to changes in the spec.
– Reuse of function test for system test
Conclusions
– Flexible coverage criteria for a hierarchy of test suites
– FSM constraints help with state explosion
– Systematic and automated methodology
– Eases reuse and maintenance of test suites
– Successful in detecting faults and communicating error scenarios