Copyright © Siemens AG. All rights reserved.

Essential Criteria on MBT to Ensure Quality of Software in Industry

PVR Murthy, Andreas Ulrich
Siemens AG, Corporate Technology
Agenda

- Model-based testing (MBT)
- Case study
- Effective criteria for evaluation of MBT tools
- Analysis and discussions
- Summary
Introduction to Model-Based Testing

The MBT process comprises four activities: modeling, generating, executing, and analyzing results.

- Modeling: create the model of the system behavior.
- Generating: a coverage algorithm generates test cases; executable test scripts are generated and/or created from them.
- Executing: run the scripts against the application under test, or execute the tests manually.
- Analyzing results: oracles verify the results, and each test passes or fails. Decide whether to generate more tests, modify the model, or stop testing, and estimate reliability and other quality measures.
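The cycle above can be read as a simple control loop. The following is a minimal sketch in C#; all types and methods (TestCase, GenerateTests, Execute, OracleAccepts) are invented placeholders for illustration and do not belong to any of the tools discussed later.

```csharp
using System;
using System.Collections.Generic;

// Minimal sketch of the MBT cycle: generate tests from the model,
// execute them against the application under test, let oracles decide
// pass/fail, then decide whether to continue.
class MbtCycle
{
    record TestCase(string Name);

    // Coverage-driven test generation (stubbed placeholder).
    static List<TestCase> GenerateTests(string model) =>
        new() { new TestCase("t1"), new TestCase("t2") };

    // Run a test against the application under test (stubbed placeholder).
    static bool Execute(TestCase t) => true;

    // Oracle: compare observed behavior with expected behavior (stubbed placeholder).
    static bool OracleAccepts(TestCase t, bool observed) => observed;

    static void Main()
    {
        var model = "system model";
        bool done = false;
        while (!done)
        {
            foreach (var t in GenerateTests(model))
            {
                bool observed = Execute(t);
                Console.WriteLine($"{t.Name}: {(OracleAccepts(t, observed) ? "pass" : "fail")}");
            }
            // Decide: generate more tests, modify the model, or stop.
            done = true;
        }
    }
}
```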
Model-Based Testing

- Model-based testing (MBT) is an evolving test generation technique.
- It uses design artifacts (models) of a system under test to generate test cases automatically.
- It is suitable for adequate and thorough testing of critical and complex software.
MBT Tools under Study

A number of MBT tools are available:
- Commercial tools, e.g. T-VEC, Qtronic, Spec Explorer.
- Academic tools, e.g. NModel, TGV/CADP.

We selected two tools for our study:
- Conformiq's Qtronic V2.0
- Microsoft's Spec Explorer 2010 V3.0

They represent two different modeling paradigms:
- Qtronic describes the system model with both textual and graphical notations (based on hierarchical state charts).
- Spec Explorer uses guarded state update rules in C# (based on abstract state machines, called model programs) and a coordination specification; a sketch of such a rule follows below.
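To illustrate the Spec Explorer paradigm, here is a minimal model-program sketch in C#, assuming the Microsoft.Modeling namespace shipped with Spec Explorer 2010; the state variable, rule names, and bound are invented for illustration and are not taken from the case study.

```csharp
using Microsoft.Modeling;

// Minimal model program: static fields hold the model state,
// methods marked [Rule] are guarded state update rules.
static class CounterModel
{
    static int count = 0;

    [Rule]
    static void Increment()
    {
        Condition.IsTrue(count < 3);  // guard: rule is enabled only below the bound
        count = count + 1;            // state update
    }

    [Rule]
    static void Reset()
    {
        Condition.IsTrue(count > 0);  // guard: only enabled when something was counted
        count = 0;
    }
}
```

During exploration the tool fires the enabled rules in every reachable state; a Cord coordination script (not shown here) selects the configuration and machines to explore.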
Conformiq's Qtronic

- Models are expressed as a collection of textual files and/or graphical models.
- The textual notation is defined using QML.
- Graphical models are created using either the Conformiq Modeler or a third-party UML editor.
- Generation of test cases is based on state space analysis.
- Can translate test cases into executable test scripts.
Microsoft's Spec Explorer

- Uses the Visual Studio programming environment to model system behavior.
- Input is a set of .NET model assemblies and a coordination script.
- Generates test cases based on exploration of the model:
  -- Exploration systematically discovers all possible states and transitions.
  -- Exploration can be visualized through a state space graph.
- Test cases are a collection of C# files executable in MSTest.
Case Study

- We considered an application from the Siemens medical domain: a digital radiographic system.
- Considered SUT: an automated exposure control software that makes large images with an automatic sequence of many small images.
- Modeled as a UML state chart with 11 system states, 24 transitions, and 72 transition paths.
Evaluation Criteria

- For a Siemens industrial project in the healthcare domain, high-quality, reliable, and robust software is indispensable; it therefore requires rigorous and effective testing.
- Effective and rigorous testing of critical and highly reliable software requires a good MBT tool and methodology.
- We identified eight important parameters for evaluating the considered MBT tools:
  -- Based on the Siemens requirements in the domain of highly critical and reliable software.
  -- The identified parameters are subjective.
Criterion 1: Model Representation

Aspects considered:
- Type of model (tester or design model).
- Kind of notation (standard or proprietary).
- Ease of creation and editing (expressiveness, reusability, etc.).
- Modeling of advanced features (hierarchy, concurrency).

Findings:
- Conformiq's Qtronic: Design model. Models are expressed as a collection of textual files and graphical models; the tool models hierarchical UML state charts with unambiguous semantics. Nondeterministic transitions cannot be handled.
- Microsoft's Spec Explorer 2010: Tester's model; model behavior can be viewed as an exploration graph. Models are expressed using a .NET language and Cord scripts for configuration control. No hierarchical modeling similar to UML state charts is supported. Nondeterministic transitions can be handled.
Criterion 2: Model Validation

Aspects considered:
- Detecting requirement defects.
- Consistency checking.
- Checking design errors in the models.

Findings:
- Conformiq's Qtronic: Supports model validation by consistency checks.
- Microsoft's Spec Explorer 2010: The tool checks configuration scripts and model code for consistency.
Criterion 3: Test Generation Strategy

Aspects considered:
- Tool support for choosing test coverage criteria.
- Efficiency of the test case generation algorithm.
- Test generation control to avoid test case explosion.

Findings:
- Conformiq's Qtronic: Test generation is coverage-driven. Provides a facility to maintain different test suites and supports generation of negative tests. Provides a clear visual overview of the link between generated test cases and covered requirements. No direct support to control test case explosion.
- Microsoft's Spec Explorer 2010: No control for specifying coverage criteria. Supports requirements coverage, but does not give a very clear visual overview. Test generation is also supported for nondeterministic models, but is restricted to the generation of non-exhaustive test suites. Handles infinite state spaces by user scenarios and slices.
Criterion 4: Test Data Generation

Aspects considered:
- Automation in test data generation.
- Accuracy, thoroughness, and cost of testing.

Findings:
- Conformiq's Qtronic: Test data is based on the values of input variables and constants specified in the model. There is no option to specify a domain of alternative data values for model parameters.
- Microsoft's Spec Explorer 2010: Supports a good domain control method for generating test data and provides an extensive mechanism for input parameter generation; a sketch of a parameterized rule follows below.
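As an illustration of domain control over rule parameters in Spec Explorer, here is a minimal sketch in C#, again assuming the Microsoft.Modeling namespace; the SetDose rule, parameter name, and bounds are hypothetical and not taken from the exposure control case study. In practice the parameter domain is typically also bounded in the Cord configuration, which is not shown here.

```csharp
using Microsoft.Modeling;

static class ExposureModel
{
    static int dose = 0;

    // The rule parameter becomes test data; the condition restricts the
    // values exploration may choose (hypothetical bounds).
    [Rule]
    static void SetDose(int milliamperes)
    {
        Condition.IsTrue(milliamperes >= 1 && milliamperes <= 5);
        dose = milliamperes;
    }
}
```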
Criterion 5: Concurrency Support

Aspects considered:
- Ability to test concurrent system behavior.

Findings:
- Conformiq's Qtronic: Concurrent state charts can be modeled only using the QML textual notation. Generated tests are strictly sequential regardless of the concurrency that might exist in the model.
- Microsoft's Spec Explorer 2010: Supports modeling of concurrency, but the generated test cases are strictly sequential.
Criterion 6: Testing Levels

Aspects considered:
- Unit, integration, or system testing.

Findings:
- Conformiq's Qtronic: Suitable for component, sub-system, and system level testing. Limited support for integration testing.
- Microsoft's Spec Explorer 2010: Can model at a high level of abstraction as user interactions (system testing), or at a lower level of abstraction where method calls are the interactions between different components (integration testing).
Criterion 7: Regression Testing

Aspects considered:
- Capability to generate test cases for regression testing.
- Test case prioritization and test case optimization.

Findings:
- Conformiq's Qtronic: Does not support test case optimization or test case prioritization. Test cases can be manually selected from the set of generated tests upon model modification.
- Microsoft's Spec Explorer 2010: Does not support test case optimization or test case prioritization.
Criterion 8: Usability

Aspects considered:
- Initial learning curve.
- Availability of documentation and on-line help.
- User friendliness.

Findings:
- Conformiq's Qtronic: Modeling advanced features requires additional training. Tool support is good. The user manual is very informative, and support from Conformiq has been very good.
- Microsoft's Spec Explorer 2010: Not very intuitive for the tester; requires deep knowledge of modeling. Tool support is good. Documentation is currently not adequate.
Relevance of Criteria for the Considered Case Study

(Chart: relevance rating, from highly relevant to not relevant, for each of the eight criteria: 1 model representation, 2 model validation, 3 test generation, 4 test data, 5 concurrency, 6 testing levels, 7 regression testing, 8 usability.)

Selection of relevant criteria for applying MBT in large industrial projects:
- Leads to an informed selection of the right MBT approach and tool.
- Highlights weak points of the selected tooling.
Outlook – Suggested Improvements for Future Generation Tools

- Integration of different testing levels in a single framework.
  -- Unit, integration, and system testing.
- Need more rigorous automatic test case selection techniques.
  -- Sophisticated constraint specification over models.
- More sophisticated automatic test data generation.
  -- Applying the right combination of standard techniques for effective fault-revealing data.
  -- Handling primitive as well as structured data types.
- Improved tool interoperability for different test artifacts.
  -- Migration from one environment to another.
Summary

In addition to the criteria mentioned already as a basis for understanding MBT approaches for application in industry, criteria for effective MBT may also include:
- Fault modeling support for applications in different domains.
- Model debugging support.
- Guiding modelers towards incorporating safety and liveness properties of applications in models.
- Facilitating expression of test design, test verdict, and data concepts from the UML Testing Profile.
- Support for modeling non-functional requirements.
Thank You – Any Questions?