Slide 1: Applications of Automated Model-Based Testing with TorX
Ed Brinksma, Course 2004
© Ed Brinksma / Jan Tretmans
Slide 2: TorX Case Studies
- Conference Protocol (academic)
- EasyLink TV-VCR protocol (Philips)
- Cell Broadcast Centre component (CMG)
- "Rekeningrijden" Payment Box protocol (Interpay)
- V5.1 Access Network protocol (Lucent)
- Easy Mail Melder (CMG)
- FTP Client (academic)
- "Oosterschelde" storm surge barrier control (CMG)
Slide 3: The Conference Protocol Experiment
- academic benchmarking experiment, initiated for test tool evaluation and comparison
- based on really testing different implementations
- simple, yet realistic protocol (a chatbox service)
- specifications in LOTOS, Promela, SDL, EFSM
- 28 different implementations in C
  - one of them (assumed to be) correct
  - the others manually derived mutants
- http://fmt.cs.utwente.nl/ConfCase
Slide 4: The Conference Protocol
[Diagram: two or more CPEs (Conference Protocol Entities) on top of a UDP layer, together offering the Conference Service with the primitives join, leave, send and receive]
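The service interface of a CPE can be pictured concretely. Below is a minimal Promela sketch (Promela being one of the specification languages actually used in the experiment); all channel and message names are invented for illustration and do not come from the real specification.

    /* Illustrative Promela sketch of a conference protocol entity
       (CPE): it accepts the service primitives at its user SAP and
       exchanges PDUs over a UDP-like medium.  All names are invented
       for this sketch. */
    mtype = { join, leave, data_req, data_ind };

    chan usap    = [1] of { mtype, byte };  /* user service access point */
    chan udp_out = [4] of { mtype, byte };  /* PDUs towards the partners */
    chan udp_in  = [4] of { mtype, byte };  /* PDUs from the partners    */

    active proctype CPE() {
      byte data;
      do
      :: usap ? join, _          -> udp_out ! join, 0        /* announce */
      :: usap ? leave, _         -> udp_out ! leave, 0       /* withdraw */
      :: usap ? data_req, data   -> udp_out ! data_req, data /* send     */
      :: udp_in ? data_ind, data -> usap ! data_ind, data    /* receive  */
      od
    }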
Slide 5: Conference Protocol Test Architecture
[Diagram: the CPE is the IUT; the TorX tester is attached at an upper-tester PCO (UT-PCO = C-SAP, the conference service access point) and at lower-tester PCOs on the UDP layer (U-SAP); the points of control and observation are labelled A, B and C]
Slide 6: The Conference Protocol Experiments
- TorX (LOTOS, Promela): on-the-fly ioco testing
  Axel Belinfante et al., "Formal Test Automation: A Simple Experiment", IWTCS 12, Budapest, 1999.
- Tau Autolink (SDL): semi-automatic batch testing
- TGV (LOTOS): automatic batch testing with test purposes
  Lydie du Bousquet et al., "Formal Test Automation: The Conference Protocol with TGV/TorX", TestCom 2000, Ottawa.
- PHACT/Conformance KIT (EFSM): automatic batch testing
  Lex Heerink et al., "Formal Test Automation: The Conference Protocol with PHACT", TestCom 2000, Ottawa.
Slide 7: Conference Protocol Results

  tool / spec               fail   pass   "core dump"   implementations that passed
  PHACT (EFSM)               21      6        1         000, 444, 666, 289, 293, 398
  TorX  (LOTOS)              25      3        0         000, 444, 666
  TGV   (LOTOS, random)      25      3        0         000, 444, 666
  TGV   (LOTOS, purposes)    24      4        0         000, 444, 666, 332
  TorX  (Promela)            25      3        0         000, 444, 666

  (implementation 000 is the assumed-to-be-correct one)
Slide 8: Conference Protocol Analysis
- mutants 444 and 666 react to PDUs from non-existent partners:
  - no explicit reaction is specified for such PDUs, so they are ioco-correct, and TorX does not test such behaviour (see the sketch below)
- so, for LOTOS/Promela with TGV/TorX: all ioco-erroneous implementations were detected
- EFSM:
  - two "additional state" errors not detected
  - one "implicit transition" error not detected
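The ioco subtlety can be made concrete. The fragment below is a hypothetical Promela illustration, not taken from the real model: the specification only describes PDUs from partners that exist, so a PDU from any other sender lies outside the model, and ioco then allows the implementation to react in any way it likes.

    /* Hypothetical illustration of why mutants 444 and 666 are
       ioco-correct: the specification only mentions PDUs from
       partners that exist (here: senders 0 and 1).  There is no
       alternative for any other sender, so such PDUs are outside the
       model and ioco places no requirement on the reaction to them.
       All names are invented. */
    mtype = { pdu, answer };
    chan from_net = [1] of { mtype, byte };   /* byte = sender id */
    chan to_net   = [1] of { mtype, byte };

    active proctype SpecFragment() {
      do
      :: from_net ? pdu, 0 -> to_net ! answer, 0
      :: from_net ? pdu, 1 -> to_net ! answer, 1
      /* no branch for senders 2, 3, ...: their PDUs are simply not
         part of the model, so any reaction of the IUT is allowed */
      od
    }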
Slide 9: Conference Protocol Analysis
- TorX statistics
  - all errors found after 2 to 498 test events
  - maximum length of tests: > 500,000 test events
- EFSM statistics
  - 82 test cases with the "partitioned tour method" (= UIO)
  - length per test case: < 16 test events
- TGV with manual test purposes
  - ~20 test cases of various lengths
- TGV with random test purposes
  - ~200 test cases of 200 test events
Slide 10: EasyLink Case Study
[Diagram: TV and VCR; their communication is the object of testing]
- EasyLink: protocol between TV and VCR
  - simple, but realistic
- features:
  - preset download
  - WYSIWYR (what you see is what you record)
  - EPG download
  - ...
Slide 11: EasyLink Test Architecture
[Diagram: TV and VCR connected through the MBB; a PC and a workstation run the tester, with manual interaction via the remote control (RC)]
- MBB (= Magic Black Box)
  - allows the PC to monitor the communication between TV and VCR
  - allows the PC to send messages that mimic the TV or the VCR
- TorX distributed over PC and workstation
Slide 12: Testing the Preset Download Feature
- What?
  - check whether the TV correctly implements preset download, based on a Promela specification
- How?
  - let the PC play the role of the VCR and initiate preset download
  - receive the settings from the TV
  - then repeatedly (see the sketch below):

        WHILE (TRUE) {
            let PC initiate preset download
            let PC nondeterministically stop preset download
            check presets for consistency
        }

  - feature interaction: shuffle presets on the TV using the RC, all under control of the PC
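The stress loop above can be rendered as a model. This is a minimal Promela sketch of the tester side with invented names; the actual EasyLink specification is not shown here.

    /* Sketch of the preset-download stress test: the PC, playing the
       VCR, repeatedly starts a download and may nondeterministically
       break it off; after each round the collected presets would be
       checked for consistency.  All names are invented. */
    mtype = { dl_start, dl_stop, dl_preset, dl_done };
    chan to_tv   = [1] of { mtype };
    chan from_tv = [1] of { mtype, byte };

    active proctype PCasVCR() {
      byte p;
      do
      :: to_tv ! dl_start;
         do
         :: from_tv ? dl_preset, p        /* next preset from the TV */
         :: from_tv ? dl_done, _ -> break /* TV finished the download */
         :: to_tv ! dl_stop -> break      /* nondeterministic abort   */
         od
         /* here: check the received presets for consistency */
      od
    }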
Slide 13: Results: EasyLink Experiences
- the test environment influences what can be tested
  - testing power is limited by the functionality of the MBB
- initially, the state of the TV is unknown
  - the tester must be prepared for all possible states
- some "hacks" were needed in the specification and the tool architecture to decrease the state space
- automatic specification-based testing is feasible
- the tool architecture is also suitable to cope with user interaction
- some (non-fatal) non-conformances detected
Slide 14: CMG - CBC Component Test
- test of one component of a Cell Broadcast Centre
- LOTOS (process algebra) specification of 28 pp.
- using the existing test execution environment
- based on automatic generation of an "adapter" from IDL
- comparison (simple):

                            existing test   TorX
    code coverage                82%         83%
    detected mutants (of 10)      5           7

- conclusions:
  - TorX is at least as good as conventional testing (with the potential to do better)
  - LOTOS is not nice (= terrible) for specifying such systems
Slide 15: Interpay "Rekeningrijden" Highway Tolling System (Dutch road-pricing project)
Slide 16: "Rekeningrijden" Characteristics
- simple protocol
- parallelism: many cars at the same time
- encryption
- real-time issues
- the system had already passed a traditional testing phase
Slide 17: "Rekeningrijden": Phases for Automated Testing
- IUT study
  - informal and formal specification
- study of the available tools
  - semantics and openness
- test environment
  - test architecture, test implementation, SUT specification
  - testing of the test environment itself
- test execution
  - test campaigns, execution, analysis
Slide 18: "Rekeningrijden" Highway Tolling System
[Diagram: onboard units in passing cars communicate wirelessly with the road-side equipment; the Payment Box (PB) is connected to the road-side equipment over UDP/IP]
Slide 19: "Rekeningrijden": Test Architecture I
[Diagram: TorX, driven by the PB specification, connected directly to the Payment Box at a single PCO]
Slide 20: "Rekeningrijden": Test Architecture II
[Diagram: TorX with a specification of PB + UDP/IP; the SUT is the Payment Box embedded in a UDP/IP test context; TorX connects at the PCO, and the PB is reached through an IAP inside the test context]
Slide 21: "Rekeningrijden": Test Architecture III
[Diagram: TorX with a specification of PB + ObuSim + TCP/IP + UDP/IP; an onboard-unit simulator (ObuSim) is part of the test context; TorX talks to ObuSim over TCP/IP at the PCO, and ObuSim talks to the Payment Box (the SUT) over UDP/IP at the IAP]
Slide 22: "Rekeningrijden": Test Campaigns
- introduction and use of test campaigns for:
  - management of test tool configurations
  - management of IUT configurations
  - steering of test derivation
  - scheduling of test runs
  - archiving of results
Slide 23: "Rekeningrijden": Issues
- parallelism: very easy to model (see the sketch below)
- encryption: not all events can be synthesized, which leads to reduced testing power
- real time:
  - how to cope with real-time constraints?
  - efficient computation for on-the-fly testing?
  - lack of theory: quiescence vs. time-out
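Why parallelism was "very easy": in Promela every passing car can simply be an independent process, and the interleavings come for free. A minimal sketch with invented names:

    /* Sketch: each passing car (onboard unit) is an independent
       process, and Promela interleaves them automatically, which is
       what made the parallelism of the tolling system easy to model.
       All names are invented. */
    mtype = { approach, pay, depart };
    chan pb = [8] of { mtype, byte };      /* events at the payment box */

    proctype Car(byte id) {
      pb ! approach, id;
      pb ! pay, id;
      pb ! depart, id
    }

    active proctype PaymentBox() {
      mtype ev;  byte id;
      do
      :: pb ? ev, id                       /* handle one car event */
      od
    }

    init {
      byte i = 0;
      do
      :: i < 5 -> run Car(i); i++          /* five cars in parallel */
      :: else  -> break
      od
    }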
Slide 24: "Rekeningrijden" Problem: Quiescence in ioco vs. Time-out
[Diagram, left: TorX sends inputs to the PB and observes its outputs; when a local timer t_q expires without an observation, the timeout is interpreted as quiescence. Right: the timeout is instead mapped to an explicit Tick event that the specification also describes: Spec := Spec + Tick]
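The "Spec := Spec + Tick" trick in the right-hand diagram can be sketched as follows. This is a schematic Promela fragment with invented names, not the actual model: the test adapter maps the expiry of its timer to an explicit, observable tick action, and the specification offers tick exactly in those states where silence (quiescence) is allowed.

    /* Schematic ''Spec := Spec + Tick'': the model offers an explicit
       tick action wherever quiescence is allowed, so the tester can
       map a timer expiry to tick instead of to the ioco quiescence
       observation.  All names are invented. */
    mtype = { req, resp, tick };
    chan stim = [1] of { mtype };    /* stimuli from the tester */
    chan obs  = [4] of { mtype };    /* observations for TorX   */

    active proctype SpecWithTick() {
      do
      :: obs ! tick          /* idle state: a timeout is fine here   */
      :: stim ? req ->       /* but once a request is pending ...    */
           obs ! resp        /* ... only resp is allowed: a timeout
                                before resp would be a fail verdict  */
      od
    }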
Slide 25: "Rekeningrijden" Problem: Action Refinement
[Diagram, left: at the concrete interface TorX exchanges the separate messages Input 0 and Input 1 with the PB, with Timeout/Unexpected among the possible observations. Right: the abstract specification uses a single action Input 01, which a buffer process refines into Input 0 followed by Input 1, raising Error when the buffer is full: Spec := Refine + Buffer]
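A sketch of the "Spec := Refine + Buffer" construction, again with invented names: a buffer process translates the single abstract action into the two concrete messages, and signals an overflow (the "Error" of the diagram) when the buffer is still occupied.

    /* Schematic ''Spec := Refine + Buffer'': a buffer process between
       the abstract specification and the concrete interface splits
       the abstract action input01 into the concrete messages input0
       and input1.  All names are invented. */
    mtype = { input01, input0, input1, overflow };
    chan abs_in   = [1] of { mtype };
    chan concrete = [2] of { mtype };
    chan report   = [1] of { mtype };

    active proctype Refine() {
      do
      :: abs_in ? input01 ->
           if
           :: empty(concrete)  -> concrete ! input0; concrete ! input1
           :: nempty(concrete) -> report ! overflow  /* buffer full */
           fi
      od
    }

    active proctype Driver() {   /* toy environment so the sketch runs */
      abs_in ! input01
    }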
Slide 26: "Rekeningrijden": Issues
- modelling language: LOTOS vs. Promela
- specification for testing vs. specification for validation
- development of the specification is an iterative process
- development of the test environment is laborious
- parameters are fixed in the model
  - preprocessing with M4/CPP
- Promela problem: guarded inputs (see the sketch below)
- test campaigns for bookkeeping and control of the experiments
- probabilities incorporated
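The "guarded inputs" problem, illustrated with an invented fragment: in Promela the enabledness of a receive cannot depend on an arbitrary predicate over the values being received; only constants or eval(expr) can be matched, so more general constraints must be checked after the message has already been accepted.

    /* The guarded-input problem: we would like a receive whose
       enabledness depends on a predicate over the incoming values,
       but Promela only supports matching constants or eval(expr).
       A general constraint must be checked after reception, which
       changes when the transition is enabled.  Names are invented. */
    mtype = { msg };
    chan c = [1] of { mtype, byte };

    active proctype Guarded() {
      byte v;
      do
      :: c ? msg, eval(42)   /* fine: enabled only for the value 42 */
      :: c ? msg, v ->       /* but a guard like ''v < 10'' cannot
                                be attached to the receive itself    */
           if
           :: v < 10 -> skip /* it can only be tested afterwards    */
           :: else   -> skip /* unwanted values are consumed as well */
           fi
      od
    }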
Slide 27: "Rekeningrijden": Results
- test results:
  - 1 error found during validation (a design error)
  - 1 error found during testing (a coding error)
- automated testing:
  - beneficial: high volume and high reliability
  - many long tests executed (> 50,000 test events)
  - very flexible: easy adaptation, many configurations
- a step ahead in the formal testing of realistic systems