Essential Criteria on MBT to Ensure Quality of Software in Industry
PVR Murthy, Andreas Ulrich
Siemens AG, Corporate Technology
Copyright © Siemens AG 2012. All rights reserved.
Agenda
- Model-based testing (MBT)
- Case study
- Effective criteria for evaluation of MBT tools
- Analysis and discussions
- Summary
Introduction to Model-Based Testing
The MBT workflow: Modeling → Generating → Executing → Analyzing results
- Modeling: build the model of the system; a coverage algorithm drives test case generation
- Generating: generate test cases and generate or create the executable test scripts
- Executing: run the scripts, or execute the tests manually, against the application under test
- Analyzing results: oracles verify the results and yield a pass or fail verdict; decide whether to generate more tests, modify the model, or stop testing; estimate reliability and other quality measures
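As a concrete illustration of this loop (not part of the original slide), the following minimal C# sketch walks a toy toggle model through the generate, execute, and analyze steps. The `Sut` adapter class and the toy model are hypothetical stand-ins for a real system under test.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Toy model: a lamp that toggles between Off and On.
enum State { Off, On }

static class MbtLoop
{
    // Modeling: the expected behavior as a transition function (this also serves as the oracle).
    static State Model(State s, string action) =>
        action == "Toggle" ? (s == State.Off ? State.On : State.Off) : s;

    // Generating: enumerate action sequences up to a length bound (a crude coverage goal).
    static IEnumerable<string[]> GenerateTests(int maxLength) =>
        Enumerable.Range(1, maxLength)
                  .Select(n => Enumerable.Repeat("Toggle", n).ToArray());

    // Executing + analyzing: run each test on the SUT and verify every step against the model.
    static void Main()
    {
        foreach (var test in GenerateTests(3))
        {
            var sut = new Sut();                       // hypothetical adapter to the system under test
            State expected = State.Off;
            bool pass = true;
            foreach (var action in test)
            {
                sut.Apply(action);
                expected = Model(expected, action);    // oracle predicts the next state
                pass &= sut.Observe() == expected.ToString();
            }
            Console.WriteLine($"Test [{string.Join(",", test)}]: {(pass ? "pass" : "fail")}");
        }
    }
}

// Hypothetical SUT adapter; in practice this wraps the real application under test.
class Sut
{
    State s = State.Off;
    public void Apply(string action) { if (action == "Toggle") s = s == State.Off ? State.On : State.Off; }
    public string Observe() => s.ToString();
}
```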
Model-Based Testing
- Model-based testing (MBT) is an evolving test generation technique.
- It uses design artifacts (models) of the system under test.
- Test cases are generated automatically from these models.
- MBT is suitable for adequate and thorough testing of critical and complex software.
MBT Tools under Study
A number of MBT tools are available:
- commercial tools, e.g. T-VEC, Qtronic, Spec Explorer
- academic tools, e.g. NModel, TGV/CADP
We selected two tools for our study:
- Conformiq's Qtronic V2.0
- Microsoft's Spec Explorer 2010 V3.0
They represent two different modeling paradigms:
- Qtronic describes the system model using both textual and graphical notations (based on hierarchical state charts).
- Spec Explorer uses guarded state update rules written in C# (based on abstract state machines, called model programs) together with a coordination specification.
Conformiq's Qtronic
- Models are expressed as a collection of textual files and/or graphical models.
- The textual notation is defined using QML.
- Graphical models are created with either the Conformiq Modeler or a third-party UML editor.
- Test cases are generated based on state-space analysis.
- Generated test cases can be translated into executable test scripts.
Microsoft's Spec Explorer
- Uses the Visual Studio programming environment to model system behavior.
- Input is a set of .NET model assemblies and a coordination script.
- Generates test cases by exploring the model: exploration systematically discovers all possible states and transitions and can be visualized as a state-space graph.
- Test cases are a collection of C# files executable in MSTest.
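To illustrate the modeling paradigm (not taken from the original slide), the sketch below imitates the guarded-state-update-rule style of a model program in plain C#. It deliberately avoids the actual Spec Explorer API (Microsoft.Modeling attributes and Cord scripts), so the names and structure are illustrative assumptions only.

```csharp
using System;

// Plain C# imitation of the guarded-state-update-rule style of a model program:
// static fields hold the model state; each rule pairs an enabling guard with a state update.
// NOT literal Spec Explorer code, which uses the Microsoft.Modeling library and Cord scripts.
static class ConnectionModel
{
    enum Mode { Closed, Opened }
    static Mode mode = Mode.Closed;                    // model state

    // Rule Open(): enabled only while the connection is closed.
    static bool OpenEnabled() => mode == Mode.Closed;
    static void Open()
    {
        if (!OpenEnabled()) throw new InvalidOperationException();
        mode = Mode.Opened;
    }

    // Rule Send(data): enabled only while the connection is open.
    static bool SendEnabled() => mode == Mode.Opened;
    static void Send(int data)
    {
        if (!SendEnabled()) throw new InvalidOperationException();
        // the data would be forwarded here; this toy model only tracks the mode
    }

    // Rule Close(): enabled only while the connection is open.
    static bool CloseEnabled() => mode == Mode.Opened;
    static void Close()
    {
        if (!CloseEnabled()) throw new InvalidOperationException();
        mode = Mode.Closed;
    }
}
```

Conceptually, exploration repeatedly fires every rule whose guard holds in the current state, which produces the state-space graph mentioned above.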
Case Study
- We considered an application from the Siemens medical domain: a digital radiographic system.
- SUT: automated exposure control software that composes a large image from an automatic sequence of many small images.
- The SUT is modeled as a UML state chart with 11 system states, 24 transitions, and 72 transition paths.
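For illustration only: a heavily simplified, hypothetical C# encoding of such a state chart as a transition table. The state and event names here are invented; the real model has 11 states, 24 transitions, and 72 transition paths.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical, heavily simplified stand-in for the exposure-control state chart.
static class ExposureControlModel
{
    public enum State { Idle, Positioning, Exposing, Stitching, Completed, Aborted }

    // (state, event) -> next state; only a tiny, invented subset of the real transition relation.
    public static readonly Dictionary<(State, string), State> Transitions = new()
    {
        { (State.Idle, "start"),        State.Positioning },
        { (State.Positioning, "ready"), State.Exposing },
        { (State.Exposing, "next"),     State.Positioning }, // loop: one small image per pass
        { (State.Exposing, "done"),     State.Stitching },
        { (State.Stitching, "ok"),      State.Completed },
        { (State.Exposing, "abort"),    State.Aborted },
    };

    public static State Step(State s, string ev) =>
        Transitions.TryGetValue((s, ev), out var next)
            ? next
            : throw new InvalidOperationException($"event '{ev}' not allowed in state {s}");
}
```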
Evaluation Criteria
- For a Siemens industrial project in the healthcare domain, high-quality, reliable, and robust software is indispensable; it therefore requires rigorous and effective testing.
- Effective and rigorous testing of critical and highly reliable software requires a good MBT tool and methodology.
- We identified eight important criteria for evaluating the considered MBT tools, based on the Siemens requirements in the domain of highly critical and reliable software.
- The identified criteria are subjective.
Criterion 1: Model Representation
Aspects considered:
- Type of model (tester or design model)
- Kind of notation (standard or proprietary)
- Ease of creation and editing (expressiveness, reusability, etc.)
- Advanced modeling features (hierarchy, concurrency)
Findings:
- Conformiq's Qtronic: design model; models are expressed as a collection of textual files and graphical models; models hierarchical UML state charts with unambiguous semantics; nondeterministic transitions cannot be handled.
- Microsoft's Spec Explorer 2010: tester's model; model behavior can be viewed as an exploration graph; models are expressed using a .NET language plus Cord scripts for configuration control; no hierarchical modeling similar to UML state charts is supported; nondeterministic transitions can be handled.
Criterion 2: Model Validation
Aspects considered:
- Detecting requirement defects
- Consistency checking
- Checking for design errors in the models
Findings:
- Conformiq's Qtronic: supports model validation by consistency checks.
- Microsoft's Spec Explorer 2010: the tool checks configuration scripts and model code for consistency.
Criterion 3: Test Generation Strategy
Aspects considered:
- Tool support for choosing test coverage criteria
- Efficiency of the test case generation algorithm
- Test generation control to avoid test case explosion
Findings:
- Conformiq's Qtronic: test generation is coverage driven; offers a facility to maintain different test suites; supports generation of negative tests; provides a clear visual overview of the link between generated test cases and covered requirements; no direct support to control test case explosion.
- Microsoft's Spec Explorer 2010: no control for specifying coverage criteria; supports requirements coverage, but does not give a very clear visual overview; test generation is also supported for non-deterministic models, but restricted to the generation of non-exhaustive test suites; handles infinite state spaces by user scenarios and slices.
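As a sketch of how test case explosion can be kept in check (an illustration, not behavior taken from either tool), the generic C# routine below explores a model's state space breadth-first but cuts exploration off at a configurable depth bound, one simple way to keep the explored graph finite.

```csharp
using System;
using System.Collections.Generic;

// Generic illustration of bounded exploration: discover states reachable from the
// initial state, but stop at a depth bound to keep the explored graph finite.
static class BoundedExploration
{
    public static HashSet<TState> Explore<TState>(
        TState initial,
        Func<TState, IEnumerable<TState>> successors,
        int depthBound)
    {
        var visited = new HashSet<TState> { initial };
        var frontier = new Queue<(TState state, int depth)>();
        frontier.Enqueue((initial, 0));

        while (frontier.Count > 0)
        {
            var (state, depth) = frontier.Dequeue();
            if (depth >= depthBound) continue;          // the bound slices off deeper behavior
            foreach (var next in successors(state))
                if (visited.Add(next))
                    frontier.Enqueue((next, depth + 1));
        }
        return visited;
    }
}
```

A successor function could, for example, be derived from a transition table such as the hypothetical `ExposureControlModel.Transitions` above.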
Criterion 4: Test Data Generation
Aspects considered:
- Automation in test data generation
- Accuracy, thoroughness, and cost of testing
Findings:
- Conformiq's Qtronic: test data is based on the values of input variables and constants specified in the model; there is no option to specify a domain of alternative data values for model parameters.
- Microsoft's Spec Explorer 2010: supports a good domain control method for generating test data and provides an extensive mechanism for input parameter generation.
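A minimal sketch of domain-based test data generation (illustrative only; the parameter names, value domains, and constraint are invented): combinations of declared input values are enumerated and filtered by a constraint.

```csharp
using System;
using System.Linq;

// Generic illustration of domain-based test data generation: enumerate combinations
// of values from declared input domains and keep those satisfying a constraint.
static class TestDataGen
{
    static void Main()
    {
        int[] voltages = { 40, 80, 125 };          // hypothetical kV settings
        int[] exposureTimes = { 10, 50, 100 };     // hypothetical exposure times in ms

        var testData =
            from kv in voltages
            from ms in exposureTimes
            where kv * ms <= 8000                  // hypothetical dose constraint
            select (kv, ms);

        foreach (var (kv, ms) in testData)
            Console.WriteLine($"kV={kv}, time={ms} ms");
    }
}
```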
Criterion 5: Concurrency Support
Aspects considered:
- Ability to test concurrent system behavior
Findings:
- Conformiq's Qtronic: concurrent state charts can be modeled only in the QML textual notation; generated tests are strictly sequential regardless of any concurrency in the model.
- Microsoft's Spec Explorer 2010: supports modeling of concurrency, but the generated test cases are likewise strictly sequential.
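To make the "strictly sequential tests" point concrete (an illustration, not tool behavior described on the slides), the sketch below linearizes two concurrent action sequences by enumerating their interleavings; each resulting list could serve as one sequential test case.

```csharp
using System.Collections.Generic;
using System.Linq;

// Generic illustration of why generated tests stay sequential: concurrent regions
// are linearized by enumerating interleavings of their action sequences.
static class Interleavings
{
    public static IEnumerable<List<string>> Merge(List<string> a, List<string> b)
    {
        if (a.Count == 0) { yield return new List<string>(b); yield break; }
        if (b.Count == 0) { yield return new List<string>(a); yield break; }

        // Either the next action comes from region a ...
        foreach (var rest in Merge(a.Skip(1).ToList(), b))
            yield return rest.Prepend(a[0]).ToList();
        // ... or it comes from region b.
        foreach (var rest in Merge(a, b.Skip(1).ToList()))
            yield return rest.Prepend(b[0]).ToList();
    }
}
```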
Criterion 6: Testing Levels
Aspects considered:
- Unit, integration, or system testing
Findings:
- Conformiq's Qtronic: suitable for component, sub-system, and system level testing; limited support for integration testing.
- Microsoft's Spec Explorer 2010: models can be written at a high level of abstraction as user interactions (system testing), or at a lower level of abstraction where method calls are the interactions between different components (integration testing).
Criterion 7: Regression Testing
Aspects considered:
- Capability to generate test cases for regression testing
- Test case prioritization and test case optimization
Findings:
- Neither tool supports test case optimization or test case prioritization.
- After a model modification, regression test cases can be manually selected from the set of generated tests.
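A possible shape of the manual selection step described above (a sketch under the assumption that each generated test is recorded as a sequence of transition identifiers): keep only the tests that exercise a modified transition.

```csharp
using System.Collections.Generic;
using System.Linq;

// Generic illustration of regression selection: after a model change, keep only the
// previously generated tests that exercise at least one modified transition.
static class RegressionSelection
{
    public static List<List<string>> SelectAffected(
        List<List<string>> generatedTests,          // each test = a sequence of transition ids
        ISet<string> modifiedTransitions) =>
        generatedTests
            .Where(test => test.Any(modifiedTransitions.Contains))
            .ToList();
}
```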
Criterion 8: Usability
Aspects considered:
- Initial learning curve
- Availability of documentation and on-line help
- User friendliness
Findings:
- Conformiq's Qtronic: modeling advanced features requires additional training; tool support is good; the user manual is very informative; support from Conformiq has been very good.
- Microsoft's Spec Explorer 2010: not very intuitive for the tester and requires deep knowledge of modeling; tool support is good; documentation is currently not adequate.
Relevance of the Criteria for the Considered Case Study
[Chart: each of the eight criteria (1 model representation, 2 model validation, 3 test generation, 4 test data, 5 concurrency, 6 testing levels, 7 regression testing, 8 usability) is rated on a scale from "not relevant" to "highly relevant".]
Selecting the relevant criteria for applying MBT in large industrial projects:
- leads to an informed selection of the right MBT approach and tool
- highlights weak points of the selected tooling
Outlook: Suggested Improvements for Future-Generation Tools
- Integration of different testing levels in a single framework
  -- unit, integration, and system testing
- More rigorous automatic test case selection techniques
  -- sophisticated constraint specification over models
- More sophisticated automatic test data generation
  -- applying the right combination of standard techniques to obtain effective fault-revealing data
  -- handling primitive as well as structured data types
- Improved tool interoperability for the different test artifacts
  -- migration from one environment to another
Summary
In addition to the criteria already discussed as a basis for understanding MBT approaches for industrial application, criteria for effective MBT may also include:
- fault modeling support for applications in different domains
- model debugging support
- guiding modelers towards incorporating safety and liveness properties of the application in the models
- facilitating the expression of test design, test verdict, and test data concepts from the UML Testing Profile
- support for modeling non-functional requirements
Thank You – Any Questions?
Copyright © Siemens AG 2012. All rights reserved.