
1 Automated Evaluation of Runtime Object States Against Model-Level States for State-Based Test Execution
Frank (Weifeng) Xu, Gannon University
Dianxiang Xu, North Dakota State University

2 Overview
- Introduction
- Objectives
- State evaluation infrastructure
- Case study
- Experiments/Demo
- Conclusions

3 Introduction
- State-based testing process
- Evaluation of runtime object states against the model-level states defined in a state model is critical to state-based test automation.
- Manually keeping track of the state of running objects is difficult.
- Mapping from runtime object states to abstract states is time-consuming.

4 Objectives
- This paper presents a state evaluation framework to support the automated state-based test execution process by:
  - keeping track of the state of running objects
  - mapping the states to abstract states (an illustrative mapping sketch follows below)
  - firing a warning message if the abstract state does not match the model-level state
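As a minimal sketch of what such a mapping could look like: the Account class, its balance property, and the EMPTY/OPEN/OVERDRAWN abstract states below are hypothetical illustrations, not details taken from the paper.

```java
// Hypothetical sketch of mapping concrete runtime properties to abstract states.
class Account {                                   // stand-in object under test
    private double balance;
    Account(double balance) { this.balance = balance; }
    double getBalance() { return balance; }
    void setBalance(double balance) { this.balance = balance; }
}

public class AccountStateMapper {

    // Abstract states assumed to be defined in the state model.
    public enum AbstractState { EMPTY, OPEN, OVERDRAWN }

    // Map the concrete runtime property values of an Account to an abstract state.
    public static AbstractState map(Account account) {
        double balance = account.getBalance();
        if (balance == 0) return AbstractState.EMPTY;
        if (balance > 0)  return AbstractState.OPEN;
        return AbstractState.OVERDRAWN;
    }

    // Compare the mapped state with the state expected by the test case and
    // warn the test driver on a mismatch.
    public static void check(Account account, AbstractState expected) {
        AbstractState actual = map(account);
        if (actual != expected) {
            System.err.println("State mismatch: expected " + expected
                    + " but observed " + actual);
        }
    }
}
```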

5 Challenges
- How does the evaluation framework monitor and collect the states of running objects?
- How does the monitoring device interact with, yet not depend on, a particular implementation under test (IUT)?
- How can we automatically map the runtime object states to abstract states in a state model?
- How can the test driver be informed if the actual state and the expected state do not match?

6 Approach
- We take advantage of the pointcut mechanism in aspect-oriented programming and implement the framework in AspectJ (a minimal pointcut sketch follows below).
- We demonstrate the framework with a case study.
- We conduct a series of empirical studies to evaluate the framework by measuring and comparing the total time (in minutes) spent mapping states and checking the oracle under manual and automated execution.
- The experimental results show that the evaluation framework is much more effective and efficient than manual evaluation.
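As an illustration of the pointcut idea, here is a minimal code-style AspectJ sketch; it is not the paper's actual aspect. It reuses the hypothetical Account and AccountStateMapper from the sketch above and simply logs the abstract state after every setter call, so the monitor observes the IUT without the IUT depending on the monitor.

```aspectj
// Minimal AspectJ sketch (illustrative only): observe state changes on the
// hypothetical Account class via a pointcut on its setters.
public aspect StateMonitor {

    // Any setter executed on Account; target() binds the monitored object so
    // its properties can be read after the change.
    pointcut stateChange(Account account):
        execution(void Account.set*(..)) && target(account);

    // After a setter runs, map the runtime properties to an abstract state so
    // the framework (or a test driver) can compare it with the expected state.
    after(Account account): stateChange(account) {
        AccountStateMapper.AbstractState actual = AccountStateMapper.map(account);
        System.out.println("Observed abstract state: " + actual);
    }
}
```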

7 State evaluation infrastructure

8

9 Case study

10 Expected state: we need runtime objects

11

12 Figure 7. Pseudocode of a JavaBean that fires events if the expected and actual states differ
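The pseudocode of Figure 7 is not reproduced in this transcript; the following is a hedged approximation of the idea using java.beans.PropertyChangeSupport. The class name StateOracleBean and the string-based state representation are assumptions for illustration.

```java
import java.beans.PropertyChangeListener;
import java.beans.PropertyChangeSupport;

// Illustrative JavaBean-style checker: fires a property-change event whenever
// the actual abstract state differs from the expected one, so a registered
// test driver is notified of the mismatch (observer pattern).
public class StateOracleBean {

    private final PropertyChangeSupport support = new PropertyChangeSupport(this);
    private String mismatch;                      // last mismatch description

    // Test drivers register here to receive mismatch warnings.
    public void addMismatchListener(PropertyChangeListener listener) {
        support.addPropertyChangeListener("mismatch", listener);
    }

    // Compare the states and fire an event only when they differ.
    public void evaluate(String expectedState, String actualState) {
        if (!expectedState.equals(actualState)) {
            String old = this.mismatch;
            this.mismatch = "expected " + expectedState + ", actual " + actualState;
            support.firePropertyChange("mismatch", old, this.mismatch);
        }
    }
}
```

A test driver could then register a PropertyChangeListener on this bean and treat each fired event as a test-oracle failure.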

13 Experiments
- Two groups of students
- Group 1 students manually monitor and map states
- Group 2 students use the framework
- 5 applications

14

15

16 Conclusions
- We have proposed a novel approach to automatically evaluate runtime object states against model-level states for state-based test execution.
- The framework is able to automatically keep track of the properties of running objects, convert the properties to corresponding states in state models, and check whether or not the states match the expected states.
- The framework is essentially an extension of the observer design pattern, implemented with enterprise JavaBeans.
- We take advantage of the pointcut mechanism of AspectJ to provide the state-monitoring capability.

17 Discussion
- We assume that all the state models used for test execution are correct.
- We consider state abstractions but not action abstractions (inputs/methods to call); this simplifies the approach because action abstractions would have to be concretized.

