1
TP#16 oneM2M approach to testing
Source: ETSI Meeting Date:
2
Presentation Outline
Achieving interoperability
Conformance versus interoperability testing
Test specification development
Test purposes
Conformance tests
Interoperability tests
3
Interoperability is …
The ultimate aim of standardization in Information and Communication Technology
The red thread running through the entire standards development process; it is not an isolated issue
Not something to be somehow fixed at the end
Best-practice approach of an SDO: produce, with the required degree of parallelism, base standards and test specifications
Base standards should be designed with interoperability in mind
Profiles can help to reduce potential non-interoperability
Two complementary forms of testing:
Conformance testing
Interoperability testing (i.e., a more formal form of IOT)
Testing provides vital feedback into the standardization work
4
About Conformance Testing
Conformance testing is black-box testing: stimulation and response at a Point of Control and Observation (PCO)
[Diagram: Test System connected to the Device at the PCO, testing against the Requirements]
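The stimulation/response idea can be illustrated with a small sketch that is not taken from the oneM2M material: a Python script plays the role of the test system at the PCO, sends one stimulus and derives a verdict from the observed response. The address, message contents and expected value are hypothetical placeholders.

import socket

# Hypothetical PCO sketch: the test system stimulates the device over an open
# interface and observes the response; it never looks inside the implementation.
PCO_ADDRESS = ("192.0.2.10", 5000)    # placeholder address of the device's PCO
STIMULUS = b"REQ feature-F1"          # placeholder stimulus taken from the base standard
EXPECTED = b"RSP feature-F1 OK"       # placeholder response required by the standard

def run_single_conformance_test() -> str:
    """Send one stimulus at the PCO and derive a verdict from the response."""
    with socket.create_connection(PCO_ADDRESS, timeout=2.0) as pco:
        pco.sendall(STIMULUS)          # stimulation
        response = pco.recv(1024)      # observation
    return "pass" if response == EXPECTED else "fail"

if __name__ == "__main__":
    print("verdict:", run_single_conformance_test())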
5
Conformance is usually tested layer-by-layer
[Diagram: Test System (Upper Tester, L3 Test, Lower Tester, with Upper and Lower Adapters) connected to the SUT; the L3 IUT sits above L2 and PHY and is reached via API and MMI interfaces. IUT = Implementation Under Test, SUT = System Under Test]
Conformance testing concentrates on specific components in a system, often related to a single standard (or a set of related standards). It is unit testing rather than system testing. Conformance testing is applied over open interfaces and checks conformance to the requirements in a base specification or profile (standard). Conformance tests are executed under controlled conditions using a dedicated test system.
In summary, conformance testing is thorough and accurate but limited in scope. It gives a high level of confidence that key components of a device or system are working as they were specified and designed to do. But a conformant component will not necessarily always interoperate with other components in a larger system.
The test system usually runs a standardized language like TTCN-3. TTCN-3 (Testing and Test Control Notation Version 3) is an internationally standardized programming language that has been specifically designed for specifying and controlling testing scenarios. It is completely independent of the test platform and is not linked to one specific test tool: the same test suite can run on any test system, which is very important for certification. TTCN-3 test suites have been developed for many standards, such as SIP, IMS and Intelligent Transport Systems at ETSI, and some OMA standards. It runs on conformance test systems for the certification of 3GPP UEs and in the WiMAX Forum.
6
Characteristics of Conformance Testing
Tests the IUT against requirements specified in a base specification or profile (standard)
Gives a high level of confidence that key components of a device or system are working as they were specified and designed to do
Usually limited to one requirement per test
High IUT control and observability
Can explicitly test error behaviour
Can provoke and test non-normal (but legitimate) scenarios
Can be extended to include robustness tests
Test execution can be automated and repeated
Requires test system development (and executable test cases)
Can be expensive (e.g., radio-based test systems)
Tests under ideal conditions
Tests are thorough and accurate but limited in scope
Operates at the level of detailed protocol messages, service primitives, or procedure calls
7
Limitations of Conformance Testing
Does not prove end-to-end functionality (interoperability) between communicating systems
Conformance-tested implementations may still not interoperate; this is often a specification problem rather than a testing problem!
Hence the need for minimum requirements coverage or profiles
Tests individual system components, but a system is often greater than the sum of its parts!
Does not test the user's 'perception' of the system
Standardised conformance tests do not include proprietary 'aspects'
Difficult to test via proprietary interfaces, e.g., user APIs
Can however be done by a manufacturer with its own conformance tests for proprietary requirements
8
About Interoperability Testing
End-to-end interoperability of devices (Device 1, Device 2, …, Device n)
What is being tested?
Assignment of verdicts?
Need to distinguish between:
More formal interoperability testing (= test specifications), versus
Informal interoperability events used to validate/develop technologies, e.g., interop events
9
Methodology for Interoperability Testing
WI-0027 Testing Framework: Methodology for Interoperability Testing
[Diagram: interoperability testing of terminals, with an EUT/QE and an EUT connected via a Means of Communication (MoC); QE = Qualified Equipment, EUT = Equipment Under Test]
IOT concentrates on a complete device or a collection of devices. It is system testing rather than unit testing. It is most commonly applied to end-to-end testing over networks. It shows, from the user's viewpoint, that functionality is accomplished (but not how).
Validation reveals problems and errors in standards and products. Validated standards give a higher chance of interoperable products: for standardisers it gives assurance that they provide the right functionality; for manufacturers and operators it gives confidence to implement and go to market. It provides an opportunity to correct errors in a controlled manner (late fixes in the product cycle are more expensive than early ones) and decreases time to market. Standards can be validated by several means, but one of the most practical and cost-effective is interoperability events.
10
Characteristics of Interoperability Testing
It is system testing: tests a complete device or a collection of devices
Shows that (two or more) devices interoperate, but within a limited scenario
Tests at a 'high' level (the perception of users)
Tests the 'whole', not the parts or layers, e.g., protocol stacks integrated with applications
Tests functionality: shows that the function is accomplished (but not how)
Does not usually involve test systems; uses existing interfaces (standard or proprietary)
Interoperability testing looks at end-to-end functionality
Less thorough but wide in scope
Gives a high level of confidence that devices (or components in a system) will interoperate with other devices (components)
11
Limitations of Interoperability Testing
Does not prove that a device is conformant: devices may still interoperate even though they are non-conformant to a base specification or standard
Cannot explicitly test error behaviour or unusual scenarios; invalid conditions may need to be forced (lack of controllability)
Has limited coverage (does not fully exercise the device)
Usually performed manually; difficult to automate, and tests take more time to prepare than to execute
Does not prove 100% interoperability with other implementations with which no testing has been done: 'A' may interoperate with 'B' and 'B' may interoperate with 'C', but it does not necessarily follow that 'A' will interoperate with 'C'
12
Conformance and Interoperability Testing are Complementary
Standardization experience: as you move up a system stack, the emphasis should change from conformance to interoperability testing
Lower-layer protocols, infrastructure: emphasis on conformance testing
Middleware, enablers: combination of conformance and interoperability testing
Services, applications, systems: emphasis on interoperability testing
Conformance testing can be viewed as a pre-requisite to interoperability testing
[Diagram: conformance testing and interop testing together contribute to interoperability]
13
Stages in Test Specifications Development
Base standard or profile specification
Requirements catalogue
Implementation check list (ICS/IFS)
Identification of the Test Suite Structure (TSS)
Specification of Test Purposes (TP)
Specification of Test Descriptions (TD)
Specification of Test Cases (TC)
Validation of Test Cases
14
Conformance Test Specification Development
[Flow diagram: from the Standard, a Requirements Catalogue and/or ICS/IXIT is derived; Test Purposes lead to Test Descriptions and then to the TTCN-3 Test Suite; after compilation and test case parameterisation and selection, the result is the Executable Tests]
15
Implementation Conformance Statement (ICS)
Tabular overview of the features and options that are implemented by a given product
Checklist of the capabilities supported by the Implementation Under Test (IUT)
Allows a quick and easy first good guess on potential interoperability between two or more implementations

Q#   ICS Question                           Reference   Status        Support
Q1   Support of Feature F1 [x]              Clause a    OPTIONAL      Yes/No
Q2   Support of Feature F2 [x]              Clause b    MANDATORY
:
Qn   Support of Message M1 [y]              Clause c    CONDITIONAL
Qk   Support of message Parameter P1 [y]    Clause d
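As an illustration only (not an official oneM2M procedure), the "first good guess" mentioned above can be automated by cross-checking two filled-in ICS checklists; the item identifiers, statuses and answers below are hypothetical.

# Hypothetical sketch: cross-check two filled-in ICS checklists.
# Each checklist maps an ICS item (feature, message, parameter) to Yes/No support;
# items supported by one product but not the other flag potential
# interoperability problems before any test is executed.

ICS_STATUS = {            # status of each item, as stated in the (hypothetical) standard
    "F1": "OPTIONAL",
    "F2": "MANDATORY",
    "M1": "CONDITIONAL",
}

ics_product_a = {"F1": True, "F2": True, "M1": True}     # ICS answers of product A
ics_product_b = {"F1": False, "F2": True, "M1": False}   # ICS answers of product B

def potential_gaps(ics_a, ics_b):
    """Return mismatching items, most severe status first."""
    severity = {"MANDATORY": 0, "CONDITIONAL": 1, "OPTIONAL": 2}
    gaps = [item for item in ICS_STATUS if ics_a.get(item) != ics_b.get(item)]
    return sorted(gaps, key=lambda item: severity[ICS_STATUS[item]])

print(potential_gaps(ics_product_a, ics_product_b))      # prints ['M1', 'F1']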
16
Test Purposes
Precise descriptions of the purpose of a test in relation to a particular (base standard) requirement
Meaningful to a larger audience than technical experts
Define the functionality being tested, i.e., WHAT?
Should be formulated using a similar level of information content and wording as the corresponding requirement description
Should only mention directly relevant aspects of interactions, but not describe them in detail
Reference a test configuration (i.e., architecture)
Identify one or more test purposes per requirement
Specified in natural language or in a standardized test purpose notation (e.g., TPLan)
But definitely not coded!
17
Example of TP (ITS-G5)

TP Id:            TP/GEONW/PON/FPB/BV-01
Test objective:   Check source packet buffering into the UC forwarding buffer for unreachable unicast destinations (absence of a suitable next-hop candidate)
Reference:        EN [1], clauses 7.5.3, , ,
Config Id:        CF03
PICS Selection:   PICS_GN_GUC_SRC
Initial conditions:
  with {
    the IUT being in the "initial state" and
    the IUT not having received any Beacon information from ItsNodeB and
    the IUT having a Location Table Entry for ItsNodeA (see note) and
    the IUT having been requested to send a GUC packet addressed to ItsNodeA
      containing TrafficClass.SCF set to 1
  }
Expected behaviour:
  ensure that {
    when { the IUT receives a Beacon packet from ItsNodeB }
    then { the IUT selects ItsNodeB as the next hop and
           the IUT sends the buffered GUC packet }
  }
Note: The Location Table Entry is created by sending any GeoNetworking packet, originated by ItsNodeA, from ItsNodeC to the IUT.
18
Conformance Test Suite
A collection of detailed test cases or scripts that implement test purposes
Can be compiled and executed, e.g., when using TTCN-3
Specifies HOW to test
Also implements handling of, e.g., unexpected or background behaviour, or returning the SUT to the initial state after a test
Composition of test cases:
Each individual test has a preamble, a test body (i.e., the implementation of the Test Purpose), and a postamble
Test components may be used to clearly separate interactions with different logical IUT interfaces during a test
Assigns test verdicts
Should be parameterizable at run-time, e.g., with the SUT address
A test suite is not a test system: tests focus on IUT behaviour, while the test system also handles message encoding and transport
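For readers who do not use TTCN-3, the composition described above (preamble, test body implementing the Test Purpose, postamble, verdict assignment, run-time parameterisation with the SUT address) can be sketched in ordinary Python; every name below is invented for illustration and the test body is a placeholder.

import os

# Hypothetical sketch of the composition of one conformance test case.
SUT_ADDRESS = os.environ.get("SUT_ADDRESS", "192.0.2.20")   # run-time parameter

def preamble(sut):
    """Drive the SUT into the state required by the Test Purpose."""
    print(f"preparing {sut}")

def test_body(sut):
    """Implement the Test Purpose itself and return a verdict."""
    observed_ok = True          # placeholder for real stimulation and observation
    return "pass" if observed_ok else "fail"

def postamble(sut):
    """Return the SUT to its initial state, even after unexpected behaviour."""
    print(f"resetting {sut}")

def run_test_case():
    preamble(SUT_ADDRESS)
    try:
        return test_body(SUT_ADDRESS)
    finally:
        postamble(SUT_ADDRESS)   # always executed, like a postamble in TTCN-3

print("verdict:", run_test_case())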
19
What is TTCN-3? Testing and Test Control Notation Version 3
Internationally standardized language developed specifically for executable test specification
Independent of a specific IUT or IUT interfaces
Independent of a test execution environment
Allows unambiguous implementation of tests
Look and feel of a regular programming language
Good tool support (many commercial tools available)
Successfully deployed at 3GPP, OMA, ETSI, the WiMAX Forum, etc., and in industry in a variety of application domains, e.g., telecom, automotive, software
A good text book is available
20
Example TTCN-3 Test Case
testcase TC_COR_0047_01() runs on Ipv6Node system EtherNetAdapter {
  f_cf02Up();            // Configure test system for HS->RT
                         // No preamble required in this case
  f_TP_HopsSetToOne();   // Perform test
                         // No postamble required in this case
  f_cf02Down();          // Return test system to initial state
}

function f_TP_HopsSetToOne() runs on Ipv6Node {
  var Ipv6Packet v_ipPkt;
  var FncRetCode v_ret := f_echoTimeExceeded(1, v_ipPkt);
  if (v_ret == e_success and v_ipPkt.icmpCode == 0) {
    setverdict(pass);
  } else {
    setverdict(fail);
  }
}

function f_echoTimeExceeded(in UInt8 p_hops, out Ipv6Packet p_ipPkt)
    runs on Ipv6Node return FncRetCode {
  ipPort.send(m_echoReqWithHops(p_hops));   // Echo Request with Hop Limit p_hops
  alt {
    [] ipPort.receive(mw_anyTimeExceeded) -> value p_ipPkt { return e_success; }
    [] ipPort.receive                                       { return e_error; }
  }
}
21
Validation of (Conformance) Tests
Typical test suite validation requires that:
Tests compile with several TTCN-3 tools
Tests are executed on one or more platforms against various implementations of the standard (usually by oneM2M members)
Requires very close co-operation with test tool suppliers, test platform providers, equipment manufacturers, etc.
Strict and well-defined processes between the test lab (which executes tests[, certifies] and provides feedback) and oneM2M WG TST (which updates the tests)
Timely availability of stable base specifications, test platforms (tools) and implementations is essential
22
Interoperability Test Descriptions
Specifies the detailed steps to be followed to achieve the stated test purpose, i.e., HOW?
State steps clearly and unambiguously, without unreasonable restrictions on the actual method
Example: 'Answer incoming call', NOT 'Pick up telephone handset'
Written in a structured and tabulated natural language so that tests can be performed manually
23
Example Interoperability TD
Interoperability Test Description
Identifier:          TD_M2M_NH_07
Objective:           AE creates a container resource in the Registrar CSE via a container Create Request
Configuration:       M2M_CFG_01
References:          [1] [2]
Pre-test conditions: Resource <AE> has been created in the Registrar CSE

Test Sequence:
Step  Type       Description
1     Stimulus   AE is requested to send a container Create Request (POST) to create container <container>
2     Check      (Mca) Sent POST request contains: Request URI <CSEBase>/<AE>?rty=container; Host: Registrar CSE; Payload: container resource <container> to be created
3     Check      Check, if possible, that the <container> resource is created in the Registrar CSE
4                Registrar sends a response containing: Code = 200; Content-Location: <CSEBase>/<AE>/<container>; Payload: created <container>
5     Verify     AE indicates successful operation
6                Repeat steps 1-5, creating a container resource with a short lifetime
7     Check      The created resource is removed after the expiry of its lifetime
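To make the Mca exchange of steps 1 to 5 concrete, here is a sketch that follows the request and response shape written in this TD; the registrar host, AE and container names and the JSON payload are hypothetical, and a real oneM2M HTTP binding may use different media types and status codes.

import requests

# Hypothetical sketch of TD_M2M_NH_07 steps 1-5 as an automated check.
REGISTRAR = "http://registrar-cse.example.com:8080"   # placeholder Registrar CSE
CSE_BASE, AE = "CSEBase", "AE1"

def create_container(name):
    # Steps 1-2: AE sends a container Create Request (POST) over Mca,
    # using the Request URI form given in the TD: <CSEBase>/<AE>?rty=container
    response = requests.post(
        f"{REGISTRAR}/{CSE_BASE}/{AE}?rty=container",
        json={"container": {"resourceName": name}},   # placeholder payload
        timeout=5,
    )
    # Step 4: Registrar responds with Code = 200 and a Content-Location header
    assert response.status_code == 200, f"unexpected code {response.status_code}"
    location = response.headers.get("Content-Location", "")
    assert location.endswith(f"/{AE}/{name}"), f"unexpected location {location}"
    # Step 5: AE indicates successful operation
    print(f"container {name} created at {location}")

create_container("container1")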