How Can a Simple Model Test a Complex System? Model-Based Testing of Large-Scale Software. Victor Kuliamin, ISP RAS, Moscow
Real Software Systems
[Tables comparing system size (MLOC) and development team size for Windows / Windows NT, Red Hat Linux, Debian Linux, Sun StarOffice, and Windows XP across release years; the figures were lost in extraction.]
- They are huge and have a lot of functions
- They have very complex interfaces
- They are developed by hundreds of people
- They are distributed and concurrent
Quality of Real Software Systems
[Table of test team sizes for Windows NT and Windows releases; the figures were lost in extraction.]
Test cases (thousands): MS Word XP: 35; Oracle 10i: 100; Windows XP: >2000 (?)
- They are tested a lot
- But details of their behavior are not well defined
- And they still do have serious bugs
Model-Based Testing – a Solution?
- Potential to test very large systems with high adequacy
- Parallelization of work on the system and its tests
- Googling "model based testing" "case study" gives ~630 links on ~230 sites
- ~60 separate industrial case studies since 1990
- Most MBT case studies are small (about 30 KLOC)
- Most MBT techniques are based on state models and hence are prone to the state explosion problem
Fighting Complexity
There is no simple way to test a complex system adequately, but a manageable way exists: use general engineering principles
- Abstraction
- Separation of concerns
- Modularization
- Reuse
UniTesK Solutions
- Modularize the system under test: contract specifications of components
- Modularize the test system: flexible test system architecture
- Adapters: bind the test system to the SUT
- Oracles: check the SUT's behavior against contracts
- Test coverage goals based on contracts
- Test data generators for single operations
- Testing models (test scenarios): test sequence composition
- A more abstract contract gives a more abstract testing model
- Reusability of contracts, testing models, and test data generators
Software Contracts
[Diagram: components A, B, C, D with contracts A–D, grouped into subsystems I and II with contracts I and II.]
Contracts (preconditions, postconditions, data integrity constraints) help to describe components on different abstraction levels
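The idea of a contract can be sketched in plain Java. The example below is hypothetical and not from the slides: a bank account's withdraw operation with a precondition, a postcondition, and a data integrity constraint (invariant). In UniTesK the contract is written in a specification extension language and checked by a generated oracle; here the checks are inlined for illustration.

```java
// Hypothetical contract sketch: precondition, postcondition, and invariant
// for a withdraw operation on a simple account component.
public class AccountContract {
    private int balance;

    public AccountContract(int initialBalance) {
        if (initialBalance < 0) throw new IllegalArgumentException("invariant violated");
        this.balance = initialBalance;
    }

    // Data integrity constraint: the balance is never negative.
    private boolean invariant() { return balance >= 0; }

    // Precondition: the amount is positive and does not exceed the balance.
    public boolean preWithdraw(int amount) {
        return amount > 0 && amount <= balance;
    }

    // The operation plus its postcondition acts as an oracle for the
    // implementation's behavior.
    public int withdraw(int amount) {
        if (!preWithdraw(amount)) throw new IllegalArgumentException("precondition violated");
        int before = balance;
        balance -= amount;                              // the "implementation" under test
        // Postcondition: the balance decreased by exactly the amount.
        if (balance != before - amount) throw new AssertionError("postcondition violated");
        if (!invariant()) throw new AssertionError("invariant violated");
        return balance;
    }
}
```

Because the pre/postconditions mention only observable state, the same contract can describe the component at several abstraction levels.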
Test Coverage Goals

  post {
    if ( f(a, b) || g(a) ) ...
    else if ( h(a, c) && !g(b) ) ...
    else ...
  }

Coverage goal derived for the else branch:
  (!f(a, b) && !g(a) && !h(a, c)) || (!f(a, b) && !g(a) && g(b))
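The derivation above can be checked mechanically. The sketch below (names f, g, h follow the slide; the encoding is mine) evaluates which branch of the postcondition an input covers and the disjunctive predicate for the else branch; the two agree on all truth-value combinations.

```java
// Sketch: branch identification vs. the derived else-branch coverage goal.
public class CoverageGoals {
    // Which branch of the if / else-if / else structure is taken.
    public static int branch(boolean fab, boolean ga, boolean hac, boolean gb) {
        if (fab || ga) return 1;
        else if (hac && !gb) return 2;
        else return 3;
    }

    // The else-branch coverage goal from the slide: a disjunction of
    // conjunctions over the guard predicates.
    public static boolean elseGoal(boolean fab, boolean ga, boolean hac, boolean gb) {
        return (!fab && !ga && !hac) || (!fab && !ga && gb);
    }
}
```

Expanding !(f || g(a)) && !(h && !g(b)) into disjunctive normal form yields exactly the two conjunctions above, which is why the goal splits the else branch into two coverage targets.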
Testing Model
[Diagram: the operation's domain, spanned by its parameters and the model states, partitioned into coverage goals.]
Test Data Generation
Computation of single-call arguments from the current state and the operation's parameters
Test data generation is based on simple generators and coverage filtering
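The generator-plus-filter idea can be sketched as follows. The operation and its coverage goals are hypothetical (a sign-based partition of an integer argument, not from the slides): a simple generator enumerates candidate values, and coverage filtering keeps only candidates that hit a goal not yet covered.

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Sketch: simple generator + coverage filtering for one operation's argument.
public class DataGen {
    // Hypothetical coverage goals: 0 = negative, 1 = zero, 2 = positive.
    static int goal(int x) { return x < 0 ? 0 : (x == 0 ? 1 : 2); }

    // Simple generator: a small range of candidate values.
    static List<Integer> candidates() {
        List<Integer> c = new ArrayList<>();
        for (int x = -3; x <= 3; x++) c.add(x);
        return c;
    }

    // Coverage filtering: a candidate is selected only if it covers a new goal.
    public static List<Integer> filteredTestData() {
        Set<Integer> covered = new HashSet<>();
        List<Integer> selected = new ArrayList<>();
        for (int x : candidates()) {
            if (covered.add(goal(x))) selected.add(x);  // add() is true only for new goals
        }
        return selected;
    }
}
```

Filtering keeps the test suite small: of the seven candidates, only one per coverage goal survives (here -3, 0, and 1).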
The Whole Picture
[Diagram: the behavior model, testing model, and coverage model together drive on-the-fly test sequence generation and single-input checking against the system under test.]
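On-the-fly generation can be illustrated with a toy example of my own (a bounded counter, not from the slides): the testing model is a small finite state machine, the test sequence is built step by step while traversing it until every (state, operation) pair is exercised, and the behavior model serves as the oracle after each call.

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Sketch: on-the-fly traversal of a testing model for a bounded counter.
public class OnTheFlyTraversal {
    static final int MAX = 2;   // model states 0..MAX
    int sut = 0;                // state of the (simulated) system under test

    // Testing model's transition function.
    int modelApply(int state, String op) {
        return op.equals("inc") ? Math.min(state + 1, MAX) : Math.max(state - 1, 0);
    }

    // A call to the system under test (simulated here).
    void sutApply(String op) {
        sut = op.equals("inc") ? Math.min(sut + 1, MAX) : Math.max(sut - 1, 0);
    }

    // Generates the test sequence on the fly; the model acts as the oracle.
    public List<String> run() {
        List<String> sequence = new ArrayList<>();
        Set<String> exercised = new HashSet<>();
        int state = 0;
        while (exercised.size() < (MAX + 1) * 2) {
            // Greedy choice: prefer the operation not yet tried in this state
            // (sufficient to reach every pair in this small model).
            String op = exercised.contains(state + ":inc") ? "dec" : "inc";
            exercised.add(state + ":" + op);
            sutApply(op);
            state = modelApply(state, op);
            if (sut != state) throw new AssertionError("oracle: SUT deviates from model");
            sequence.add(op);
        }
        return sequence;
    }
}
```

No test sequence is stored in advance: each next stimulus is chosen from the current model state, which is what lets such traversal scale to models too large to enumerate up front.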
Testing Concurrency
[Timing diagram: stimuli s11, s12, s21, s31 and reactions r11, r12, r21, r22 flowing into and out of the target system along a time axis.]
- A multisequence is used instead of a sequence of stimuli
- Stimuli and reactions form a partially ordered set
Checking Composed Behavior
Plain concurrency axiom: the behavior of the system is equivalent to some sequential ordering of the actions
The Case Study
- 1994–1996: ISP RAS / Nortel Networks project on functional test suite development for a switch operating system kernel
- Size of the SUT: ~250 KLOC, ~530 interface operations
- 44 components were identified
- ~60 KLOC of specifications and ~40 KLOC of test scenarios developed in 1.5 years by 6 people
- Many bugs were found in the SUT, which had been in use for 10 years; several of them caused cold restarts
- ~30% of the specifications are reused to test other components
- 3 versions of the SUT were tested by 2000 (~500 KLOC); changes in the test suite were <5%
Other Case Studies
- IPv6 implementations: Microsoft Research IPv6; Oktet Mobile IPv6 (in Windows CE 4.1)
- Intel compilers
- Web-based banking client management system
- Enterprise application development framework
- Billing system
- Components of TinyOS
UniTesK Tools
- 2001: Java / NetBeans, Eclipse (planned)
- Link, 2003: C++ / NetBeans + MS Visual Studio
- CTesK, 2002: C / Visual Studio 6.0, gcc
- 2003: C# / Visual Studio .NET 7.1
- OTK, 2003: specialized tool for compiler testing
References
1. V. Kuliamin, A. Petrenko, I. Bourdonov, and A. Kossatchev. UniTesK Test Suite Architecture. Proc. of FME, LNCS 2391, Springer-Verlag.
2. V. Kuliamin, A. Petrenko, N. Pakoulin, I. Bourdonov, and A. Kossatchev. Integration of Functional and Timed Testing of Real-time and Concurrent Systems. Proc. of PSI, LNCS 2890, Springer-Verlag.
3. V. Kuliamin, A. Petrenko. Applying Model Based Testing in Different Contexts. Proc. of the seminar on Perspectives of Model Based Testing, Dagstuhl, Germany, September.
4. A. Kossatchev, A. Petrenko, S. Zelenov, S. Zelenova. Using Model-Based Approach for Automated Testing of Optimizing Compilers. Proc. Intl. Workshop on Program Understanding, Gorno-Altaisk.
5. V. Kuliamin, A. Petrenko, A. Kossatchev, and I. Burdonov. The UniTesK Approach to Designing Test Suites. Programming and Computer Software, Vol. 29, No. 6, 2003. (Translated from Russian)
6. S. Zelenov, S. Zelenova, A. Kossatchev, A. Petrenko. Test Generation for Compilers and Other Formal Text Processors. Programming and Computer Software, Vol. 29, No. 2, 2003. (Translated from Russian)
Contacts
Victor V. Kuliamin
ISP RAS, B. Kommunisticheskaya 25, Moscow, Russia
Thank you!