Slide 1

Slide 2 Outline
Protocol Verification / Conformance Testing
TTCN
Automated validation
Formal design and specification
Traffic Theory Application
Debugging

Slide 3 Testing of the Telecommunications Systems
Functional Tests
Formal System Tests against system-level requirements
Protocol Verification
Performance/Load Tests
Interoperability Tests – IOT (Inter-working Tests)
Automatic Test Suites

Slide 4 Protocol Testing
Conformance Testing
–Performed utilising test equipment
–Based on Conformance Test Suites/Cases specified by a standards body
–Quite detailed in terms of protocol implementation verification
–Not good for testing performance/load issues
Inter-working Trials
–Testing of the real equipment inter-working
–Good for performance/load testing
–Hard to verify the protocol completely

Slide 5 Conformance Testing
Based on ISO/IEC 9646 (or X.290), "Framework and Methodology of Conformance Testing of Implementations of OSI and CCITT Protocols"
Abstract Test Suites (ATS) consisting of Abstract Test Cases
Test cases are defined using a "Black Box" model, only by controlling and observing the external interfaces
Provides the basis for generic test tools and methods for verification of telecommunication standards and protocols

Slide 6 ITU Conformance Testing Standards
X.290 OSI conformance testing methodology and framework for protocol recommendations
X.291 Abstract test suite specification
X.292 The Tree and Tabular Combined Notation (TTCN)
X.293 Test realization
X.294 Requirements on test laboratories and clients for the conformance assessment process
X.295 Protocol profile test specification
X.296 Implementation conformance statements

Slide 7 Conformance Testing
Verification that protocols conform to the standard requirements
PICS - Protocol Implementation Conformance Statement
–Information provided by the protocol implementer on the extent of the implementation: which optional items are implemented, whether there are any restrictions …
PIXIT - Protocol Implementation eXtra Information for Testing
–Provides information regarding the physical configuration of the unit under test: e.g. telephone numbers, socket numbers...
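As an illustration (not part of the original slides), the PICS and PIXIT answers can be pictured as plain configuration data that a test campaign consults before selecting and parameterising test cases; the field and test-case names below are hypothetical.

```python
# Hypothetical sketch: PICS/PIXIT answers held as plain configuration
# dictionaries that a test campaign could consult before running.
pics = {                                   # what the implementer claims to support
    "supports_segmentation": True,
    "optional_timer_T3": False,
}

pixit = {                                  # site-specific values needed to run the tests
    "tester_address": "PLACEHOLDER",       # e.g. a phone number, left unspecified here
    "iut_udp_port": 5000,
}

# A test case exercising an optional feature is only selected if the PICS
# states that the feature is implemented.
all_cases = {"TC_BASIC_01": None, "TC_SEG_01": "supports_segmentation"}
selected = [name for name, feature in all_cases.items()
            if feature is None or pics[feature]]
print(selected, "using port", pixit["iut_udp_port"])
```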

Slide 8 Tree and Tabular Combined Notation TTCN
Defined as part of ISO/IEC 9646 (X.290)
Used as a language for the formal definition of test cases
Each test case is an event tree in which external behavior such as "If we send the message 'connect request', either 'connect confirm' or 'disconnect indication' will be received" is described
Messages can be defined using either the TTCN-type notation or ASN.1
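A rough sketch (ordinary Python, not TTCN) of the quoted behaviour line: send a stimulus and accept either of two alternative responses. The FakeIUT stub and the message strings are invented for illustration.

```python
class FakeIUT:
    """Hypothetical stand-in for an implementation under test."""
    def __init__(self, reply):
        self.reply = reply
    def send(self, message):
        pass                                  # a real tester would encode and transmit
    def receive(self, timeout):
        return self.reply                     # a real tester would wait on the PCO

def connect_attempt(iut):
    iut.send("connect request")
    msg = iut.receive(timeout=5.0)
    if msg in ("connect confirm", "disconnect indication"):
        return "pass"        # both alternatives are allowed by the specification
    return "fail"            # any other event is a nonconformance

print(connect_attempt(FakeIUT("connect confirm")))       # pass
print(connect_attempt(FakeIUT("unexpected message")))    # fail
```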

Slide 9 TTCN Tests Reactive Systems
[Diagram: stimulus and response exchanged with the IUT through PCOs]
PCO = Point of Control and Observation
IUT = Implementation Under Test

Slide 10 A Typical Test Architecture
[Diagram: a Lower Tester and an Upper Tester exercise the IUT across an Underlying Service Provider]

Slide 11 Parallel Test Architecture
[Diagram: an MTC and a PTC test the IUT through the Underlying Service Provider(s), coordinated over CPs]
MTC = Main Test Component
PTC = Parallel Test Component
CP = Coordination Point

Slide 12 TTCN Formats TTCN is provided in two forms:
a graphical form (TTCN.GR) suitable for human readability
a machine-processable form (TTCN.MP) suitable for transmission of TTCN descriptions between machines and possibly suitable for other automated processing

Slide 13 Main Components of TTCN
Test Suite Structure
Data declarations (mainly PDUs)
–TTCN data types
–ASN.1 data types
Constraints
–i.e., actual instances of PDUs
Dynamic behaviour trees (Test Cases)
–Send, Receive, Timers, Expressions, Qualifiers
–Test Steps

Slide 14 Test Suite Structure A TTCN test suite consists of four major parts:
–suite overview part
–declarations part
–constraints part
–dynamic part

Slide 15 Suite Overview Part The suite overview part is a documentary feature comprising indexes and page references; this is part of TTCN's heritage as a paper-oriented language. It contains a table of contents and a description of the test suite, and its purpose is mainly to document the test suite to increase clarity and readability. This part gives the reader a quick overview of the entire test suite.

Slide 16 Test Suite Overview Proforma

Slide 17 Suite Declarations Part The declarations part is used for declaring
–types,
–variables,
–timers,
–points of control and observation (PCOs),
–test components.
The types can be declared using either TTCN or ASN.1 type notation. A graphical table is used for the declarations.

Slide 18 TTCN Data Types There is no equivalent to pointer types and, as a consequence, types cannot be directly or indirectly recursive. Two structured types are specific to TTCN:
–protocol data unit (PDU) – a data packet sent between peer entities in the protocol stack
–abstract service primitive (ASP) – a data type in which to embed a PDU for sending and receiving
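A hedged sketch of the PDU/ASP distinction using ordinary dataclasses; the ConnectRequestPDU and N_DATA_request names are illustrative, not taken from any particular abstract test suite.

```python
from dataclasses import dataclass

@dataclass
class ConnectRequestPDU:          # peer-to-peer protocol message
    source_ref: int
    dest_ref: int
    options: int = 0

@dataclass
class N_DATA_request:             # abstract service primitive carrying a PDU
    called_address: str
    user_data: ConnectRequestPDU  # the embedded PDU

asp = N_DATA_request(called_address="iut-1", user_data=ConnectRequestPDU(1, 0))
print(asp)
```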

Slide 19 Example of ASN.1 Type Definition

Slide 20 Example of TTCN Type Definition

Slide 21 Variable Declarations

Slide 22 Constraints Part In this part constraints are used for describing the values sent or received. The instances used for sending must be complete, but for receiving there is the possibility to define incomplete values using wildcards, ranges and lists.
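The sketch below illustrates the idea in plain Python (not TTCN constraint syntax): a fully specified send value versus a receive constraint with a wildcard and a range. ANY is a local marker invented for the example.

```python
ANY = object()   # matches whatever value the IUT puts in that field

send_constraint = {"type": "connect confirm", "dest_ref": 1, "options": 0}

receive_constraint = {
    "type": "connect confirm",
    "dest_ref": ANY,                 # wildcard: any value accepted
    "options": range(0, 4),          # range constraint
}

def matches(message, constraint):
    """True if every constrained field of the received message is acceptable."""
    for field, expected in constraint.items():
        value = message.get(field)
        if expected is ANY:
            continue
        if isinstance(expected, range):
            if value not in expected:
                return False
        elif value != expected:
            return False
    return True

print(matches({"type": "connect confirm", "dest_ref": 7, "options": 2},
              receive_constraint))   # True
```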

Slide 23

Slide 24 Dynamic Part In this part the actual tests are defined. It contains the test suite with
–test groups – a grouping of test cases; it might, for example, be convenient to group all test cases concerning connection establishment and transport into a separate test group
–test steps – a grouping of test events, similar to a subroutine or procedure in other programming languages
–test events – the smallest, indivisible unit of a test suite; typically it corresponds to sending or receiving a message, and to the operations for manipulating timers

Slide 25 TTCN Behavior Tree Suppose that the following sequence of events can occur during a test whose purpose is to establish a connection, exchange some data, and close the connection:
–a) CONNECTrequest, CONNECTconfirm, DATArequest, DATAindication, DISCONNECTrequest.
Possible alternatives to "valid behaviour" are
–b) CONNECTrequest, CONNECTconfirm, DATArequest, DISCONNECTindication.
–c) CONNECTrequest, DISCONNECTindication.

Slide 26 TTCN Behavior Tree

Slide 27 TTCN Behavior Table The behavior tables build up the tree by defining the events in lines at different indentation levels. Rows at the same indentation level are alternative events, and rows at the next indentation level are the continuation of the previous line. A line can consist of one or more of the following:
–send statement – ! – states that a message is being sent
–receive statement – ? – states that a message is being received
–assignment – ( ) – there is an assignment of values
–timer operations – start, stop or cancel a timer
–time-out statement
–Boolean qualifier – [ ] – qualifying the execution
–attachment – + – acts as a procedure call

Slide 28 TTCN Behavior Table All the leaves in the event tree are assigned a verdict that can be:
–pass – the test case completed without the detection of any errors
–fail – an error was detected
–inconclusive – there was insufficient evidence for a conclusive verdict to be assigned, although the behaviour was valid
A verdict can be final or preliminary; a preliminary verdict does not terminate the active test case execution. To describe what is happening in the test case, the dynamic behaviour can be explained in plain language.
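To tie slides 25, 27 and 28 together, here is an illustrative (non-TTCN) rendering of that behaviour tree as nested alternatives with a verdict at each leaf; the verdict choices for branches b) and c) are assumptions made for the example.

```python
# Slide-25 tree written as nested alternatives, walked against one event trace.
behaviour_tree = [
    ("!CONNECTrequest", [
        ("?CONNECTconfirm", [
            ("!DATArequest", [
                ("?DATAindication", [
                    ("!DISCONNECTrequest", "pass"),          # valid behaviour (a)
                ]),
                ("?DISCONNECTindication", "inconclusive"),   # alternative (b)
            ]),
        ]),
        ("?DISCONNECTindication", "inconclusive"),           # alternative (c)
    ]),
]

def run(tree, trace):
    for event, rest in tree:
        if trace and trace[0] == event:
            if isinstance(rest, str):
                return rest                  # leaf: the assigned verdict
            return run(rest, trace[1:])      # descend one indentation level
    return "fail"                            # no alternative matched the observed event

trace_a = ["!CONNECTrequest", "?CONNECTconfirm", "!DATArequest",
           "?DATAindication", "!DISCONNECTrequest"]
print(run(behaviour_tree, trace_a))          # pass
```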

Slide 29 Dynamic Behaviour Table

Slide 30 TTCN MP Form

Slide 31 Testing and validation To test an implementation of a protocol against an SDL specification, the following approach can be taken:
Get the interface into the appropriate state using a preamble.
Send the appropriate message.
Check all outputs.
Check the new state the protocol is in.
Perform a postamble to return the protocol to the null state.
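As a sketch only, the five steps map naturally onto a generic test-case skeleton; the iut driver object and its methods (send, collect_outputs, state) are hypothetical.

```python
def run_test_case(iut, preamble, stimulus, expected_outputs, expected_state, postamble):
    preamble(iut)                                  # 1. drive the IUT into the right state
    iut.send(stimulus)                             # 2. send the message under test
    observed = iut.collect_outputs()               # 3. check all outputs
    verdict = "pass" if observed == expected_outputs else "fail"
    if iut.state() != expected_state:              # 4. check the resulting state
        verdict = "fail"
    postamble(iut)                                 # 5. return the protocol to the null state
    return verdict
```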

Slide 32 TTCN: What do you need to know?
To understand the terminology
To understand test principles and the test environment
You don't need to know how to write TTCN scripts, but you have to understand them so you can read test reports and analyse problems

Slide 33 Various Test Equipment Numerous pieces of test equipment support TTCN:
Protocol Analysers
Protocol Simulators
Protocol Emulators

Slide 34 Protocol Analysers (Monitors)
–Monitors the 'live' communication between equipment and interprets messages in human-readable form.
–The same H/W may support monitoring of different protocol types.
–It may provide some statistical analysis capability.
–It may have advanced search/filtering capabilities.
–You may define events on which logging will start or stop.
[Diagram: a Protocol Analyser monitoring the link between two pieces of equipment]

Slide 35 Protocol Simulators
–Allows users to write scripts (similar to TTCN) which send messages and react to received messages.
–It can be used for testing various protocols.
–It may take quite a big effort to write the required number of scripts.
–Suppliers often provide a set of test scripts targeting a particular protocol.
[Diagram: a Protocol Simulator driving the Equipment Under Test]

Slide 36 Protocol Emulators
–Emulates the operation of equipment (not a full implementation).
–No script writing required.
–It provides some control of behaviour and configuration.
–Not as flexible as a simulator.
–Usually one box can contain a Protocol Analyser, Simulator and Emulator.
[Diagram: a Protocol Emulator connected to the Equipment Under Test]

Slide 37 Interoperability Testing (IOT)
Testing what users may expect from the product, which may include more than is demanded by the standards: performance, robustness and reliability
Proper functioning in the system where the product is finally installed
Interoperability with applications which use the product
IOT is usually performed in addition to formal Protocol Conformance Testing

Slide 38 Pros & Cons of IOT
Potential IOT problems
–Not covering all possible protocol scenarios
–Difficult to reproduce detected problems
–Often little possibility to actively control the test environment so that problems can be investigated
The pros of IOT
–A standard may be ambiguous, so different manufacturers implement it differently
–Performance/load aspects are easier to test
–IOT may be cheaper and faster
–It gives the user confidence in the system's operational capabilities

Slide 39 The Case for Automated Validation If a standard that has not been validated and contains errors is ratified by a standards body, the following may occur:
–Some implementations of the protocol may exist with serious errors that will affect service operation and service assurance
–Different implementations of the protocol that are bug-free will have proprietary solutions to overcome protocol errors and thus may not inter-work
–Some vendors may choose to implement only a subset of the standardised protocols, relying on proprietary protocols

Slide 40 The Case for Automated Validation Even with simple protocols containing a small number of functional entities, messages and states, the number of possibilities will be more than most people have time to verify by hand. Communications protocol implementations are so complex that it is likely to be impossible to test all protocol combinations on a system that has implemented a protocol. In reality, standards are designed by committee; although some members perform automatic validation, most don't, and protocols are modified to cope with bugs as they appear.

Slide 41 State Explosion Problem Even with simple protocols containing a small number of functional entities, messages and states, the number of possibilities will be more than most people have time to verify by hand. Some problems have more permutations than a computer can go through, due to limitations on processing speed and memory.

Slide 42 Travelling salesman problem A salesman spends his time visiting n cities (or nodes) cyclically. In one tour he visits each city just once, and finishes up where he started. In what order should he visit them to minimise the distance travelled? If every city is connected to every other city, then the number of possible combinations is (n-1)!/2. For only 100 cities the number of possible combinations is about 4.67 x 10^155. There is a mathematical proof showing that, for the number of towns in the US, the time required to list all combinations would be greater than the estimated age of the universe, and doing so would require more memory than there are atoms in the universe, according to current physics theory.
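The figure can be checked directly from the (n-1)!/2 formula:

```python
# Check of the slide's figure: number of distinct tours for n = 100 cities.
from math import factorial

n = 100
tours = factorial(n - 1) // 2
print(f"{tours:.2e}")   # prints 4.67e+155
```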

Slide 43 Automated analysis tools Through the use of automatic tools that traverse all the possible states, a number of errors can easily be detected:
–livelock and deadlock
–output with no receiver
–output with multiple receivers
–exceeding the maximum number of instances for a process
–decision value not expected in the set of answers
–unreachable states
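A toy sketch of what such a tool does, assuming a hand-written transition relation for a hypothetical protocol: an exhaustive reachability search that reports deadlocked and unreachable states.

```python
# Toy sketch: breadth-first reachability over an explicit transition relation.
# The protocol states and messages here are invented for illustration.
from collections import deque

transitions = {
    "idle":      {"connect request": "waiting"},
    "waiting":   {"connect confirm": "connected",
                  "disconnect indication": "idle"},
    "connected": {"disconnect request": "idle"},
    "stuck":     {},            # a state with no outgoing transitions (deadlock if reached)
}

def explore(start):
    seen, queue, deadlocks = {start}, deque([start]), []
    while queue:
        state = queue.popleft()
        if not transitions[state]:
            deadlocks.append(state)          # nothing can ever happen from here
        for nxt in transitions[state].values():
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen, deadlocks

reachable, deadlocks = explore("idle")
print("deadlocks:", deadlocks)                       # [] - 'stuck' is never reached
print("unreachable:", set(transitions) - reachable)  # {'stuck'}
```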

Slide 44 Automated analysis tools Errors related to data access can also be found, for example:
–variable usage not compliant with its type
–array overflow
Automated tools will not be able to detect undesirable behaviour due to poor specification.

Slide 45 Traditional Design Traditional design involves the following design cycle:
High Level Design
–Requirements are usually written in English (or equivalent)
–The protocol is then defined using formal techniques such as SDL, MSC and ASN.1
Low Level Design
Coding and testing
–Most of the effort occurs here; coding and debugging thus become the focus of protocol design

Slide 46 Formal Design The later an error is detected, the more expensive it is to fix. Formal design techniques (FDT) are used to verify the correctness of the protocol in the high-level design phase. This involves testing the protocol description as defined in a specification language such as SDL. This requires
–an unambiguous notation
–effective validation tools

Slide 47 Formal Protocol Specification Languages and Protocol Validation Formal protocol definition languages such as SDL can be used as input to protocol validation tools. Three common description languages used for this approach are SDL, Lotos and Estelle.

Slide 48 FDTs are also intended to satisfy objectives such as:
a basis for analyzing specifications for correctness, efficiency, etc.;
a basis for determining completeness of specifications;
a basis for verification of specifications against the requirements of the Recommendation;
a basis for determining conformance of implementations to Recommendations;
a basis for determining consistency of specifications between Recommendations;
a basis for implementation support.

Slide 49 Other Formal Specification Languages These other languages are discussed in the literature, but are beyond the scope of this subject and are not used by ITU-T:
CCS (the Calculus of Communicating Systems)
Lotos (Language Of Temporal Ordering Specification): a mathematically defined language
Z Specification Language: a language based upon set theory and first-order predicate calculus
VDM (Vienna Development Method)

Slide 50 ITU Programming Languages ITU standardises the following formal description techniques:
–Z.100 CCITT Specification and description language (SDL)
–Z.105 SDL combined with ASN.1 (SDL/ASN.1)
–Z.120 Message sequence chart (MSC)
ITU also defines the following telecommunications languages:
–Z.200 CCITT high level language (CHILL) – common text with ISO/IEC
–Z.30x Introduction to the CCITT man-machine language

Slide 51 Performance analysis When a new protocol or service is introduced into a network, there needs to be an understanding of what resources are required to implement the service. Performance analysis is the task of finding out what a service's resource requirements are. This is used to cost the service at the cheapest price, with the highest probability that the targeted quality of service is delivered. There is a branch of statistics devoted to understanding the use of communications networks, known as traffic theory.
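As one concrete example of traffic theory in use (not taken from the slide itself), the Erlang B formula estimates the probability that an offered call is blocked when a given number of circuits carries a given load; the sketch below uses the standard recursive form of that formula.

```python
def erlang_b(traffic, servers):
    """Blocking probability for `traffic` erlangs offered to `servers` circuits."""
    b = 1.0
    for k in range(1, servers + 1):       # numerically stable recursion over circuit count
        b = (traffic * b) / (k + traffic * b)
    return b

print(erlang_b(traffic=10.0, servers=15))   # ~0.037, i.e. about 3.7% of calls blocked
```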

Slide 52 Debugging Embedded Real Time Applications
Usual PC debuggers and tools are not applicable
The hardware environment may not be stable
Difficult to differentiate between hardware and software problems
Problems may be difficult to reproduce and capture
The CPUs used are quite frequently new and not fully debugged

Slide 53 Firmware Debugging Tools
Romulators
Logic Analysers
FW Simulation on Unix/DOS
FW Application Monitor
In-Circuit Emulator (ICE)
CPU Performance Analysers
On-Chip Debugger (BDM)

Slide 54 ROMULATORS
Replacement of the ROM with RAM which is controlled by a PC (for example)
Instead of burning the ROM every time the code has to be modified, the new executable is downloaded to RAM
Very cheap
No support for debugging
–no control of the CPU
–no visibility of CPU activity beyond what is available through the application interface

Slide 55 Logic Analysers
A device capturing all signal activity on the processor pins
With the addition of specific SW, the signal trace may be presented in the form of C/C++ code
The Logic Analyser is a very common HW test tool
Does not provide the ability to control the target CPU
No monitoring of internal CPU registers is available

Slide 56 FW Simulators A customised (cut-down) version of the FW running on UNIX or DOS. This allows the use of UNIX/DOS tools:
–Full-blown debuggers
–Run-time error detection tools
»detection of memory leaks
»detection of uninitialised variables
»stack overflows/underflows
»reading or writing past the boundary of an array
»using freed memory
–Performance Analysers (Quantify, ...)

Slide 57 Built-In Debug Monitors
A debug facility integrated with the application
It may print out relevant debugging info
It may also be interactive, where the user can partially control the application and request some info from the FW
Assumes that the HW is working well and that the FW is stable enough; it is not good for debugging applications which are crashing or resetting
Use of the monitor may affect the speed of the application, which is sometimes not acceptable
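A very rough sketch of the built-in-monitor idea, written in Python rather than target firmware: a small command handler that dumps selected internal state on request. The counters and commands are invented for illustration.

```python
counters = {"rx_frames": 0, "tx_frames": 0, "crc_errors": 0}

def monitor_command(cmd):
    """Answer a monitor command; a real monitor would read this from a serial console."""
    if cmd == "dump":
        return "\n".join(f"{name} = {value}" for name, value in counters.items())
    if cmd.startswith("get "):
        return str(counters.get(cmd[4:], "unknown counter"))
    return "commands: dump, get <counter>"

print(monitor_command("dump"))
print(monitor_command("get crc_errors"))
```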

Slide 58 In-Circuit Emulators (ICE)
Provides full control of the FW (CPU)
–Break points on different conditions (HW or SW)
–Modification of registers and memory areas
–Single-step execution
–Source-level tracing
Allows running the FW at full speed
The application may run from "emulation" RAM instead of from ROM (this allows fast modification)
RTOS support is an issue
May be expensive (20-50 K)

Slide 59 CPU Performance Analysers Provides the ability to determine how much time the CPU spends executing a particular area of code. This allows performance analysis and tuning to be performed. Analysers which are RTOS-aware can display CPU utilisation per task.

Slide 60 Performance Analyser Example

Slide 61 CPU Background Debug Mode (BDM)
Some microprocessors provide a debugger implemented in CPU micro-code (BDM or JTAG)
Full debug support is available: registers can be viewed and modified, memory can be read or written...
On-chip HW break-point support
Control is via a dedicated serial interface on the chip, connected to a PC running the debugger application
This is a relatively cheap alternative to an ICE (2-5 K)
No tracing of application execution is available
For tracing, a combination of a Logic Analyser and BDM is required

Slide 62 Debugging Tools Summary
Cheap but very limited tools (no control of the CPU)
–Romulators
–Logic Analyser
A built-in monitor allows some control of the CPU but requires a stable HW & FW platform
FW simulation provides a good, flexible environment but may not represent the real-time aspects of the real target
Best solutions for low-level debugging
–In-Circuit Emulator (ICE) – it can be expensive
–BDM with the addition of a Logic Analyser for tracing