
1 Recommended Acceptance Testing Procedure for Network Enabled Training Simulators
Peter Ross and Peter Clark
Defence Science & Technology Organisation, AUSTRALIA

2 Acceptance Testing – Distributed Simulation
Objective:
– To ensure the supplier has met the requirements of the contract
Problem:
– It is difficult to gauge fulfilment of the requirements, as often there is no immediate requirement to interface the simulator to another
– Often it is cost-prohibitive to conduct a trial with another simulator, so test equipment is used as an alternative
– Whilst there is a wide range of test equipment available to facilitate testing, there is no standard procedure for applying these tools
Solution:
– Establish a procedure

3 Platform Training Simulator – Definition
A local term for:
– Human-in-the-loop training simulator
– Platform-level representation of the battlespace
– Simulated in real time
– ... and often big and expensive (of the order of tens of millions)

4 Training Simulators – ADF Examples
– AEW&C operational mission simulator
– ANZAC team trainer
– AP-3C operational mission simulator
– AP-3C advanced flight simulator
– FFG-UP onboard training system and team trainer
– Hornet aircrew training system
– Seasprite full mission flight simulator
– ARH trainer
– ASLAV crew procedural trainer
– ABRAMS trainer

5 Platform Training Simulator – Components
– Trainer – cockpit, ops room, bridge
– Control Station – simulator configuration and scenario control
– Instructor/Asset Station – management of additional role players within the scenario
– Debrief – provides performance feedback to trainees
– Simulation Computer – calculations and display rendering
– Distributed Simulation Interface – enables simulators to participate in a shared virtual battlespace
(Generalisation only)

6 Distributed Simulation 101
What is distributed simulation:
– The provision of a shared virtual battlespace
The problem:
– Internally, each simulator models the virtual battlespace differently
The solution:
1. Use the same internal model across all simulators; or
2. Adopt a simulation interoperability standard
– ALSP: Aggregate Level Simulation Protocol
– DIS: Distributed Interactive Simulation
– HLA: High Level Architecture
– TENA: Test and Training Enabling Architecture?

7 Simulation Interoperability Standards
Network Model – what information is exchanged between simulators
– DIS: ground truth, WGS 84
– HLA: flexible, but often based on DIS/RPR-FOM
Network Protocol – how the information is represented digitally
– DIS: Protocol Data Units (PDUs)
– HLA: flexible
Network Transport – how the information is transported between simulators
– DIS: flexible, but UDP/IP is almost always used (see the sketch below)
– HLA: flexible
Flexibility is not always an advantage
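
As a minimal sketch of the protocol/transport layering described above (not from the paper; DIS does not mandate a port, and 3000 is only a common convention), the 12-octet PDU header defined by IEEE 1278.1 can be packed in network byte order and sent as a UDP broadcast datagram:

```python
import socket
import struct

# IEEE 1278.1 PDU header: protocol version, exercise ID, PDU type,
# protocol family, timestamp, PDU length, padding (12 octets, big-endian).
PDU_HEADER = struct.Struct(">BBBBIHH")

def make_header(exercise_id, pdu_type, pdu_length, timestamp=0):
    """Pack a PDU header only; a compliant PDU would carry a body as well."""
    protocol_version = 6   # IEEE 1278.1A-1998
    protocol_family = 1    # entity information/interaction
    return PDU_HEADER.pack(protocol_version, exercise_id, pdu_type,
                           protocol_family, timestamp, pdu_length, 0)

# Broadcast the (header-only, deliberately minimal) datagram on a typical DIS port.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
sock.sendto(make_header(exercise_id=1, pdu_type=1, pdu_length=PDU_HEADER.size),
            ("255.255.255.255", 3000))
```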

8 Distributed Simulation Interface (1)
A combination of software and hardware that performs two tasks:
Translation – translate information between the internal and network models
– e.g. coordinate conversion (sketched below)
Exchange – marshal information and send it to other simulators (and vice versa)
– e.g. storing information within PDUs and outputting Ethernet frames
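
As an illustration of the translation task (a sketch, not the paper's implementation, assuming the simulator's internal model holds geodetic latitude/longitude/altitude): positions must be converted to the WGS 84 geocentric (Earth-centred, Earth-fixed) coordinates that DIS exchanges on the wire.

```python
import math

# WGS 84 ellipsoid constants
WGS84_A = 6378137.0                    # semi-major axis (m)
WGS84_F = 1.0 / 298.257223563          # flattening
WGS84_E2 = WGS84_F * (2.0 - WGS84_F)   # first eccentricity squared

def geodetic_to_geocentric(lat_rad, lon_rad, alt_m):
    """Convert geodetic latitude/longitude (radians) and altitude (metres)
    to the Earth-centred, Earth-fixed coordinates used by DIS."""
    n = WGS84_A / math.sqrt(1.0 - WGS84_E2 * math.sin(lat_rad) ** 2)
    x = (n + alt_m) * math.cos(lat_rad) * math.cos(lon_rad)
    y = (n + alt_m) * math.cos(lat_rad) * math.sin(lon_rad)
    z = (n * (1.0 - WGS84_E2) + alt_m) * math.sin(lat_rad)
    return x, y, z

# Example: an ownship near Adelaide at 1000 m altitude.
print(geodetic_to_geocentric(math.radians(-35.0), math.radians(138.5), 1000.0))
```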

9 Distributed Simulation Interface (2) – Layers
[Diagram: send and receive data flow through the interface layers; ISO/OSI = International Standards Organisation – Open Systems Interconnection]

10 Interoperability – Three Levels
Compliant – the distributed simulation interface is implemented in accordance with the relevant standards
– Achieved at the acceptance testing stage
Interoperable – two or more simulators can participate in a distributed training exercise
– Achieved at the requirements specification stage
Compatible – two or more simulators can participate in a distributed training exercise and achieve training objectives
– Achieved at the training needs analysis stage

11 Acceptance Testing – Technical Reasons
Distributed simulation standards are often ambiguous; engineers will form their own interpretations of the standard
– Two compliant simulators may not interoperate due to these interpretations
Network protocols are intolerant of implementation errors
– One incorrectly set bit is sufficient to prevent interoperability (illustrated below)
Resolving defects after the Defence Department takes ownership of the simulator is often expensive
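
To make the "one incorrectly set bit" point concrete, a hypothetical illustration (not from the paper): encoding a single header field with the wrong byte order yields a value a compliant receiver will reject, even though every other field is correct.

```python
import struct

pdu_length = 144  # octets in a typical Entity State PDU (no articulation parameters)

correct = struct.pack(">H", pdu_length)   # network byte order (big-endian)
wrong   = struct.pack("<H", pdu_length)   # accidental little-endian encoding

# A compliant receiver decodes both fields as big-endian:
print(struct.unpack(">H", correct)[0])    # 144   -> PDU accepted
print(struct.unpack(">H", wrong)[0])      # 36864 -> PDU discarded as malformed
```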

12 Recommended Procedure – Overview
Three stages:
1) Planning
2) Test Activity
3) Documentation

13 Recommended Procedure – Planning
The When, Where, What, Why, Who, How ...
– Functionality being tested; not all of the simulator's capabilities are represented by the network model
– Manning: arms and legs to operate the various components of the simulator
– Data concerns
– Enumerations (e.g. platform types)
– Geographic locations
– Classification
– Network media compatibility: 10BASE2 ... 100BASE-FX
– Test equipment availability and compatibility
– Schedule

14 Recommended Procedure – Test Activity (1)
Black-box testing
Test cases are applied to the two exposed interfaces:
– HMI – Human-Machine Interface
– NIC – Network Interface Card

15 Recommended Procedure – Test Activity (2)
Deploy the team and equipment to the training or contractor facility
Test cases are categorised into three types:
– Configuration Testing – verify the simulator can be (and is) configured appropriately for a distributed training exercise
– Send Testing – verify information sent by the simulator is correct (see the sketch below)
– Receive Testing – verify information received by the simulator is correct
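
A minimal sketch of a send test, assuming DIS over UDP (the port number and expected exercise ID are illustrative, not values from the paper): capture one datagram emitted by the simulator under test, decode the PDU header, and compare it against what the test case calls for.

```python
import socket
import struct

PDU_HEADER = struct.Struct(">BBBBIHH")   # IEEE 1278.1 PDU header, 12 octets

def capture_and_check(port=3000, expected_exercise_id=1, timeout_s=10.0):
    """Wait for one PDU from the simulator under test and verify its header."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    sock.settimeout(timeout_s)

    datagram, source = sock.recvfrom(65535)
    version, exercise, pdu_type, family, timestamp, length, _pad = \
        PDU_HEADER.unpack_from(datagram)

    assert exercise == expected_exercise_id, "wrong exercise ID"
    assert length == len(datagram), "PDU length field disagrees with datagram size"
    print(f"PDU type {pdu_type} from {source[0]} passed header checks")

capture_and_check()
```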

16 Recommended Procedure – Test Activity (3)
Offsite analysis
– Time spent with the simulator is likely to be "precious"
– It is desirable to perform lengthy analysis of the data elsewhere
To facilitate this:
– Relevant HMI actions and network data are recorded in a test log
– Log entries are time-stamped to enable correlation of events
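
A sketch of the kind of log entry this implies (the CSV format and file name are assumptions, not the paper's format): each HMI action and each captured network event is written with a UTC timestamp so the two can be correlated during offsite analysis.

```python
import csv
from datetime import datetime, timezone

LOG_PATH = "test_log.csv"   # hypothetical log file name

def log_event(source, description):
    """Append a time-stamped entry; 'source' is either 'HMI' or 'NIC'."""
    with open(LOG_PATH, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.now(timezone.utc).isoformat(), source, description])

# Example correlation pair recorded during a send test:
log_event("HMI", "Instructor set ownship IFF mode 3 code to 4721")
log_event("NIC", "IFF PDU captured: mode 3 code field = 4721")
```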

17 The Procedure – Documentation (1)
Report findings to the Project Authority
– Indicate whether the distributed simulation component of the simulator should be "accepted"
– If not, make recommendations for change
Testing should be repeated where there are significant problems

18 The Procedure – Documentation (2)
Problems are highlighted by severity. Our adopted scheme:
FAULT
– Has the potential to prevent interoperability with another simulator
– Resolution advised
ISSUE
– Does not comply with the standard, or lacks some functionality, but is unlikely to prevent interoperability with another simulator
– Resolution desirable
ACTION
– Test results insufficient to draw a firm conclusion
– Further investigation advised

19 Test Case Development (1)
Test cases demonstrate the fulfilment of a specific distributed simulation requirement
Test cases must be documented, and must reference the requirement and any interpretations or assumptions made by the test engineer (see the sketch below)
Requirements exist at different "layers" of the distributed simulation. Some examples:
– Training – simulation of IFF modes 1, 2, 3 and Charlie
– Network Model – issuance and receipt of the IFF object
– Network Protocol – population of the IFF PDU
– Network Transport – network host address, port numbers
– Network Hardware – provision of a 100BASE-TX NIC
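
One way to capture this traceability, sketched under the assumption that test cases are stored as structured records (the field names and the statement-of-work clause are hypothetical): each test case points back at the requirement it demonstrates, names the layer it exercises, and records the engineer's interpretations.

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    case_id: str
    requirement: str          # reference to the contract / specification clause
    layer: str                # Training, Network Model, Protocol, Transport, Hardware
    description: str
    interpretations: list = field(default_factory=list)  # assumptions made by the test engineer
    expected_result: str = ""

iff_send = TestCase(
    case_id="SEND-IFF-01",
    requirement="SOW 3.4.2 - simulation of IFF modes 1, 2, 3 and Charlie",  # hypothetical clause
    layer="Network Protocol",
    description="Set each IFF mode at the HMI and capture the resulting IFF PDU",
    interpretations=["Mode 4 (Charlie) is reported as 'mode 4 on' without crypto data"],
    expected_result="IFF PDU mode fields match the values entered at the HMI",
)
```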

20 Testing Tools
Tools and the layer they exercise (the original table also marks each tool's Generate / Monitor / Record / Replay capabilities, which were not captured in this transcript):
– Tcpdump / Ethereal – Transport
– PR Log – Protocol
– LZ Netdump – Protocol
– MaK Logger – Protocol
– MaK Netdump – Protocol
– PDU Generator – Protocol
– DISCommWin (Radio) – Model
– Airline Scheduler – Model
– World View – Model
– DIS Test Suite – Model
– LZ Entity Generator – Model
– MaK F18 – Model
– MaK PVD – Model
– MaK Stealth – Model
– MEG – Model

21 Test Case Development (2)
Emphasis is placed on testing the network model and protocol requirements; the other requirements are often easier to verify
For each object and interaction supported by the simulator, test cases attempt to exercise all relevant software execution paths:
– Exercise all relevant aspects of the HMI
– Exercise all relevant fields within all supported objects and interactions (see the sketch below)
– Exercise the relationship between the HMI and NIC
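
A hypothetical example of what exercising one field can look like, using the Entity State force ID (the values swept and the capture step are illustrative; the HMI action itself is performed by an operator or script outside this sketch): drive the field through each meaningful value at the HMI and confirm the same value appears in the captured PDU.

```python
# Offsets into an Entity State PDU (IEEE 1278.1): 12-octet header,
# 6-octet entity ID, then the force ID (1 octet) at offset 18.
FORCE_ID_OFFSET = 18
FORCE_ID_VALUES = {0: "Other", 1: "Friendly", 2: "Opposing", 3: "Neutral"}

def check_force_id(captured_pdu: bytes, selected_at_hmi: int) -> None:
    """Verify the force setting entered at the HMI appears in the emitted PDU."""
    on_wire = captured_pdu[FORCE_ID_OFFSET]
    assert on_wire == selected_at_hmi, (
        f"HMI selected {FORCE_ID_VALUES.get(selected_at_hmi)} "
        f"but the PDU carries force ID {on_wire}")

# Sweep: for each force setting, the operator (or a script) selects it at the
# HMI, a capture tool records the next Entity State PDU, then:
#   check_force_id(captured_pdu, selected_at_hmi=value)
```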

22 Test Case Development (3) – Example

23 Sample Faults Noted
– Simulator A sends entity ID #0:0:0; Simulator B crashes on receipt of entity ID #0:0:0
– Azimuth is reported in degrees instead of radians; power is reported in milliwatts instead of dB referenced to one milliwatt (dBm)
– The entity enumeration field is hard-coded and indicates the ownship is subsurface life form 3:4:225:4:0:0:0 (a whale)
"Based on a true story..."
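
Faults like these can be caught by sanity checks on decoded values. A heuristic sketch (the thresholds are illustrative, not from the paper): an azimuth whose magnitude exceeds 2π almost certainly means degrees were sent, and a transmitter "power" in the thousands suggests raw milliwatts rather than dBm.

```python
import math

def check_azimuth(azimuth):
    """DIS beam azimuths are radians; magnitudes beyond 2*pi suggest degrees."""
    if abs(azimuth) > 2.0 * math.pi:
        return f"FAULT: azimuth {azimuth} exceeds 2*pi - degrees suspected"
    return "OK"

def check_power_dbm(power):
    """Transmitter power is dB referenced to 1 mW (dBm); e.g. 1 W = 30 dBm.
    A raw milliwatt figure (1 W = 1000) is orders of magnitude too large."""
    if power > 100.0:   # > 100 dBm is 10 MW - implausible for a training platform
        return f"ISSUE: power {power} looks like milliwatts, expected dBm"
    return "OK"

print(check_azimuth(270.0))   # degrees slipped through
print(check_power_dbm(1000))  # 1 W reported as 1000 mW instead of 30 dBm
```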

24 Summary
– DSTO has developed a library of test cases for IEEE 1278.1/A – Distributed Interactive Simulation Application Protocols
– The library evaluates many of the common object and interaction types
– Simulators are fundamentally different; tailoring the test cases is almost always necessary
– The procedure and library have been applied to several Australian Defence training simulators

25 Recommended Acceptance Testing Procedure for Network Enabled Training Simulators
Peter Ross and Peter Clark
Defence Science & Technology Organisation, AUSTRALIA

