Intel Academic Forum, Budapest, 19.09.2002. ISPRAS Experience in Model Based Testing. Alexander K. Petrenko, Institute for System Programming.


ISPRAS Experience in Model Based Testing. Alexander K. Petrenko, Institute for System Programming of Russian Academy of Sciences (ISPRAS).

ISPRAS Experience in Industrial Model Based Testing

Why Model Based Testing?
Exhaustive testing that covers all implementation paths is impossible. Exhaustive implementation-based ("white box") testing does not guarantee correct functionality. White-box testing also lengthens development, because test development can start only when the implementation is complete. Nevertheless, we want to conduct systematic testing. Formal models provide a basis for systematic testing; from the models we derive:
- test coverage metrics,
- input stimuli,
- result correctness criteria.
They also allow test development to run ahead of the implementation schedule.

Model Checking vs. Model Based Testing
- Answers the question: Model checking: Is the model correct? Model based testing: Does the implementation behavior conform to the model behavior?
- Expected result: Model checking: a correct model. Model based testing: a test suite for testing the implementation, and a proper implementation.
- Complexity of the models: Model checking: simpler than the implementation, because of the restrictions of analytical methods. Model based testing: close to the complexity of the implementation under test.
- Relation between model and implementation: Model checking: very complicated. Model based testing: simple.

Synonyms: model, (formal) specification.
We consider behavioral/functional models. The models provide a simplified, abstract view of the target software or hardware. Processing the models requires their formal description (specification).

Model Based Testing Approach
- Generate exhaustive test suites for a model of the implementation.
- Translate the test suites to the implementation level.
- Apply the tests to the implementation under test.
- (Optionally) Interpret the testing results in terms of the model.

Related Works
- IBM Research Laboratory (Haifa, Israel)
- Microsoft Research (Redmond, US)

Examples of Model Based Testing Applications
IBM Research Laboratory (Haifa, Israel):
- Store Date Unit – digital signal processor
- APIs of file systems, telephony and Internet protocols, etc.
Microsoft Research (Redmond, US):
- Universal PnP interface
ISPRAS (Moscow, Russia):
- Kernel of an operating system (Nortel Networks)
- IPv6 protocol (Microsoft)
- Compiler optimization units (Intel)
- Massive parallel compiler testing (RFBR, Russia)

Origin of ISPRAS Methods
- Test suite for the compiler of a real-time programming language for the "Buran" space shuttle.
- 1994–1996: ISP RAS – Nortel Networks contract on functional test suite development for a switch operating system kernel. A few hundred bugs were found in the OS kernel, which had been in use for 10 years.
- KVEST technology: about 600K lines of Nortel code tested by 2000.

ISPRAS Model Based Testing: Two Approaches
- UniTesK: testing of Application Program Interfaces (APIs) based on software contracts.
- Lama: compiler testing based on LAnguage Model Application (Lama).

UniTesK: Testing of Application Program Interfaces (APIs)

What is an API? (Diagram: user interface on top of the Application Program Interface (API).)

Functional Testing
The UniTesK method deals with functional testing: requirements → formal specifications → tests. To automate testing, we provide a formal representation of the requirements.

UniTesK Process
Phases: interface specification; test scenario description; test execution; test result analysis.
Techniques: pre- and post-conditions, invariants; implicit Finite State Machines (FSMs), data iterators; test coverage metrics based on specification structure.

Decomposition of Testing Tasks
The entire test is a test sequence intended to achieve the specified coverage. From the specification we can generate test oracles and define test coverage metrics. Components: test sequence construction, test oracles, system under test.

Test Suite Architecture (legend: automatic derivation, pre-built, manual, generated)
- Specification → test coverage tracker, test oracle, data model
- System under test ← mediator (Java/C/C++/C# mediator)
- Test scenario → scenario driver → test engine

Test Oracle
Specification of method f:
  integer f (a : float)
  post { post_f (a, f_result) }
Test oracle for method f:
  f_result = f(x);
  if post_f(x, f_result) then verdict = true else verdict = false
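The oracle scheme above can be sketched in C. The operation under test and its contract here are illustrative (a toy absolute-value function), not taken from the talk; only the f / post_f / verdict structure mirrors the slide.

```c
/* Implementation under test (illustrative). */
static int f(int a) { return a < 0 ? -a : a; }

/* Implicit specification: a postcondition relating input and result.
   It constrains the result without re-running the algorithm. */
static int post_f(int a, int f_result) {
    return f_result >= 0 && (f_result == a || f_result == -a);
}

/* Test oracle derived from the specification: call the implementation,
   then evaluate the postcondition to produce a verdict. */
static int oracle_f(int a) {
    int f_result = f(a);
    return post_f(a, f_result) ? 1 : 0;  /* 1 = pass, 0 = fail */
}
```

Because the oracle is generated from the postcondition alone, it stays valid even when the implementation is nondeterministic or changes internally, which is the point of implicit specifications in UniTesK.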

Test Coverage Metrics Based on Specification Structure
Specification:
  post {
    if (a || b || c || d && e) { branch "OK"; ... }
    else { branch "Bad parameters"; ... }
  }
Partition (derivation of branches and logical terms):
  BRANCH "OK":
    a                          -- op1
    !a && b                    -- op2
    !a && !b && c              -- op3
    ...
  BRANCH "Bad parameters":
    !a && !b && !c && !d
    !a && !b && !c && d && !e
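The partition above can be expressed as a small classifier. Assuming the slide's predicate (a || b || c || d && e), each disjunct becomes one coverage class; the class ids for the "Bad parameters" branch are my own numbering.

```c
/* Classify an input into the branch-and-logical-term partition of
   the predicate (a || b || c || d && e).
   Classes 1-4 refine the "OK" branch, 5-6 the "Bad parameters" branch. */
static int partition_class(int a, int b, int c, int d, int e) {
    if (a)      return 1;  /* OK:  a                         (op1) */
    if (b)      return 2;  /* OK:  !a && b                   (op2) */
    if (c)      return 3;  /* OK:  !a && !b && c             (op3) */
    if (d && e) return 4;  /* OK:  !a && !b && !c && d && e        */
    if (!d)     return 5;  /* Bad: !a && !b && !c && !d            */
    return 6;              /* Bad: !a && !b && !c && d && !e       */
}
```

A coverage tracker only needs to record which class ids were hit; full coverage of this metric means every disjunct of every branch has been exercised at least once.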

Test Sequence Generation
We use an FSM to generate test sequences that traverse all equivalence classes defined by the partition analysis. (Diagram: states S1–S4 with transitions labeled op1–op3.) But writing a full FSM description is labor-consuming and tedious work.
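Such FSM-driven sequence generation can be sketched as follows; the concrete transition table is illustrative (the talk shows the states but not the full table).

```c
/* Transition function of a small explicit FSM.  States 1..4 stand for
   S1..S4; operation ids 1..3 for op1..op3.  Returns the next state,
   or 0 if the operation is not enabled in that state. */
static int next_state(int s, int op) {
    switch (s) {
    case 1: return op == 2 ? 2 : (op == 3 ? 4 : 0); /* S1 -op2-> S2, S1 -op3-> S4 */
    case 2: return op == 1 ? 3 : 0;                 /* S2 -op1-> S3 */
    case 3: return op == 2 ? 1 : 0;                 /* S3 -op2-> S1 */
    case 4: return op == 3 ? 3 : 0;                 /* S4 -op3-> S3 */
    default: return 0;
    }
}

/* Apply a test sequence from the initial state S1; stop early if an
   operation is not enabled.  Returns the final state, or 0 on failure. */
static int run_sequence(const int *ops, int n) {
    int s = 1;
    for (int i = 0; i < n && s != 0; i++)
        s = next_state(s, ops[i]);
    return s;
}

/* One sequence that fires every transition of the table above:
   S1->S2->S3->S1->S4->S3->S1. */
static int demo_tour(void) {
    int ops[] = { 2, 1, 2, 3, 3, 2 };
    return run_sequence(ops, 6);
}
```

A test engine generating such a tour touches every transition, and hence every partition class attached to a transition, without the tester enumerating paths by hand.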

FSM Construction: Statics
(Diagram: equivalence classes of states SC1–SC4 with transitions op1–op3.) The partition into branches and logical terms is the same as on the previous slide, with the "Bad parameters" disjuncts numbered op_i, op_i+1.
First step of FSM construction:
- state and transition partition based on pre- and post-condition structure (FSM factorization)
- test input iterators

FSM Construction: Dynamics
Second step of FSM construction: the FSM (states SC1–SC4, transitions op1–op3) is completed incrementally from the results of test execution.

Model Based Testing: Problems of Deployment
1994–1996: ISP RAS – Nortel Networks contract on functional test suite development for a switch operating system kernel. A few hundred bugs were found in the OS kernel, which had been in use for 10 years. KVEST technology: about 600K lines of Nortel code tested by 2000.
Yet KVEST was deployed only in Nortel's regression testing process. Why? Only a few formal techniques are used in real-life practice. Why?

Problems of Model Based Testing Deployment (problem → UniTesK solution)
- Formal models for analytical verification are too simple for test generation; the ratio of model (specification) size to implementation size is about 1:5–10. → Mediators provide a bridge between abstract models and the implementation.
- Executable models cannot provide test oracles in the general case, because of dependence on the implementation and nondeterminism. → Implicit specifications (pre- and post-conditions) provide the test oracles.
- Test sequence generation needs very large models (for example, FSMs). → Implicit FSMs; the usual number of states is about …
- How can test quality be estimated without implementation test coverage? → The structure of pre- and post-conditions is informative and simple enough to serve as the basis of test coverage metrics.
- There is a gap between formal techniques and software/hardware development practice. → Usual programming languages are extended for specification purposes.

UniTesK Tools and Applications
- CTesK, a C testing tool (alpha version): Microsoft IPv6 implementation.
- Java testing tool (beta version): partially tested by itself; API of the parallel debugger of the mpC IDE (mpC is a parallel extension of C); POSIX/Win32 file I/O subsystem.
- VDM++TesK: free.
Further steps: C#TesK and C++TesK, (conceivably) VHDLTesK.

Lama: Compiler Testing Based on LAnguage Models Application

Pilot project under contract with Intel: Model Based Testing of Compiler Optimization Units
Goals: automate optimization unit test generation; improve test coverage of the units; automate the test oracle problem.

The Lama Approach
Lama stands for compiler testing based on LAnguage Models Application. Lama process steps:
- Given: a programming language (PL).
- Invent a model (simplified) language (ML) of the PL.
- Generate a set of test "programs" in the ML.
- Map the ML test "programs" into PL test programs.
- Run the compiler (or a compiler unit) on the PL test programs and analyze the correctness of the compiler results.

Process of Optimization Unit Testing
Inputs: optimization background, programming language (PL) specification, model language building blocks. Outputs: faults and test coverage reports.
- Step 1: model language design
- Step 2: iterator development (producing test "programs" in the ML)
- Step 3: mapper development (producing test programs in the PL)
- Step 4: test execution and test result analysis
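Step 3 (mapper development) can be sketched as a tiny template instantiation. The function name and template are assumptions of mine, echoing the style of the generated C shown on the CSE example slide: the mapper deliberately repeats the subexpression ('c' - 'a') so that the common-subexpression-elimination unit has something to optimize.

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical mapper for one model-language IF node: render it as a
   C fragment whose condition repeats the subexpression ('c' - 'a').
   cond_op is the iterated building block (a comparison operator). */
static const char *map_if(const char *cond_op) {
    static char buf[160];
    snprintf(buf, sizeof buf,
             "if ((('c' - 'a') + (('c' - 'a') %s ('c' - 'a')))) { }",
             cond_op);
    return buf;
}

/* Convenience check used below: the rendered fragment must contain
   both the chosen operator and the repeated subexpression. */
static int maps_with(const char *cond_op) {
    const char *s = map_if(cond_op);
    return strstr(s, cond_op) != NULL && strstr(s, "'c' - 'a'") != NULL;
}
```

An iterator (Step 2) would simply loop over the operator set { "<", ">", "<=", ">=", "==", "!=" }, producing one PL test program per model "program".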

An Example: Common Subexpression Elimination Optimization (Step 1)
Model language building blocks: label, instruction, transition to label, basic block, IF instruction (if condition then block else block), common subexpression.

Result of translation into C (Step 3):

  if ((('c' - 'a') + (('c' - 'a') > ('c' - 'a')))) {
      (('c' - 'a') + (('c' - 'a') < ('c' - 'a')));
  } else {
      (('c' - 'a') + (('c' - 'a') >= ('c' - 'a')));
  }

Conclusion

Conclusion on UniTesK and Lama
- Both UniTesK and Lama follow the model based testing approach.
- Basic idea: test complex software by means of exhaustive coverage of relatively simple models.
- Area of applicability: any software or hardware components with well-defined interfaces or functional properties.

References
1. A. K. Petrenko, I. B. Bourdonov, A. S. Kossatchev, V. V. Kuliamin. UniTesK Test Suite Architecture. In: Proceedings of FME 2002, Copenhagen, Denmark. LNCS 2391, 2002.
2. A. Petrenko. Specification Based Testing: Towards Practice. In: Proceedings of the VI Ershov Conference. LNCS 2244.
3. A. K. Petrenko, I. B. Bourdonov, A. S. Kossatchev, V. V. Kuliamin. Experiences in Using Testing Tools and Technology in Real-Life Applications. In: Proceedings of SETT'01, Pune, India.
4. I. B. Bourdonov, A. S. Kossatchev, V. V. Kuliamin. Using Finite State Machines in Program Testing. Programming and Computer Software, Vol. 26, No. 2, 2000 (English version).
5. I. Bourdonov, A. Kossatchev, A. Petrenko, D. Galter. KVEST: Automated Generation of Test Suites from Formal Specifications. In: Proceedings of the World Congress on Formal Methods, Toulouse, France. LNCS 1708, 1999.
6. I. B. Bourdonov, A. S. Kossatchev, V. V. Kuliamin, A. V. Maximov. Testing Programs Modeled by Nondeterministic Finite State Machine (white paper).