Addressing Inconsistent Results in Open Over the Air Testing


1 Addressing Inconsistent Results in Open Over the Air Testing
IEEE document number to be obtained. November 2005
Date: August/2006
Authors: Pertti Visuri, Airgain, Inc., 5355 Ave Encinas, Carlsbad, CA 92008, (760)
Notice: This document has been prepared to assist IEEE 802.11. It is offered as a basis for discussion and is not binding on the contributing individual(s) or organization(s). The material in this document is subject to change in form and content after further study. The contributor(s) reserve(s) the right to add, amend or withdraw material contained herein.
Release: The contributor grants a free, irrevocable license to the IEEE to incorporate material contained in this contribution, and any modifications thereof, in the creation of an IEEE Standards publication; to copyright in the IEEE's name any IEEE Standards publication even though it may include portions of this contribution; and at the IEEE's sole discretion to permit others to reproduce in whole or in part the resulting IEEE Standards publication. The contributor also acknowledges and accepts that this contribution may be made public by IEEE 802.11.
Patent Policy and Procedures: The contributor is familiar with the IEEE 802 Patent Policy and Procedures, including the statement "IEEE standards may include the known use of patent(s), including patent applications, provided the IEEE receives assurance from the patent holder or applicant with respect to patents essential for compliance with both mandatory and optional portions of the standard." Early disclosure to the Working Group of patent information that might be relevant to the standard is essential to reduce the possibility of delays in the development process and increase the likelihood that the draft publication will be approved for publication. Please notify the Chair as early as possible, in written or electronic form, if patented technology (or technology under patent application) might be incorporated into a draft standard being developed within the IEEE 802.11 Working Group.
If you have questions, contact the IEEE Patent Committee Administrator.
Pertti Visuri, Airgain, Inc.

2 Inconsistent Results
When two Devices Under Test (DUTs) are compared in an open over the air (OOTA) environment using a single device location at each range, the results are usually inconsistent and not easily repeatable.
To demonstrate this effect, a set of comparison tests between two wireless client station devices was performed in a typical residential setting.
The DUTs were both CardBus cards that were identical except for their antenna systems. The same AP (wireless counterpart, or WLCP) was used in all tests.
The throughput of seven different links (AP and client station location pairs) was measured for both cards.
The seven tests were repeated 15 times using slightly different locations and varying orientations of both the AP and the client stations.

3 Comparison Results in Different Test Runs
Here are several typical results of measuring the same seven links with exactly the same AP and client stations. The only differences between the graphs are the precise location and orientation of the AP and the client station.
[Figure: throughput (Mbit/s) vs. link number for DUT1 and DUT2, in orientations 1, 2, 5, 9, 11, and 13]

4 Lack of Consistency and Repeatability
It would be expected that if one of the DUTs performs better at one distance (link), it would be better at other distances, too. However, the results do not show such consistency.
In addition, the relative performance of the two units appears quite different in successive tests of the same links. The test does not appear to be repeatable.
[Figure: throughput (Mbit/s) vs. link number for DUT1 and DUT2, in orientations 1, 2, 5, 9, 11, and 13]

5 Averaging the Results of Slightly Different Locations
Fortunately, it is possible to eliminate the inconsistencies and achieve repeatable results by performing several measurements at slightly different locations and orientations (of both the DUT and the WLCP) for each of the links (distances) and averaging the results for each link.
Including all 15 measurements for each of the links provides consistent results: one of the DUTs performs better at all distances.
[Figure: average throughput of all 15 orientations vs. link number for DUT1 and DUT2]
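The averaging step described above can be sketched in a few lines. The readings below are hypothetical placeholders, not values from the slides; only the structure (several orientation samples per link, averaged per link) follows the method in the text.

```python
# Sketch of the per-link averaging described above: each link (distance) gets
# several throughput readings, one per location/orientation, and the reported
# result is their mean. All numbers here are hypothetical placeholders.
from statistics import mean

# throughput[link] -> list of Mbit/s readings, one per orientation
dut1 = {1: [20.1, 14.3, 18.7], 2: [12.5, 16.0, 9.8]}
dut2 = {1: [17.2, 19.5, 11.0], 2: [10.1, 8.9, 13.4]}

def average_per_link(measurements):
    """Average the orientation samples for each link (distance)."""
    return {link: mean(samples) for link, samples in measurements.items()}

avg1 = average_per_link(dut1)
avg2 = average_per_link(dut2)
```

With more orientations per link (the slides use 15), the per-link averages stabilize and the two DUTs can be compared consistently across distances.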

6 Repeatability
Averaging the same data in two groups of 7 orientations for each link still maintains relatively good consistency and provides an indication of the repeatability of the test.
Obviously, averaging the results of 15 locations/orientations is better than 7, but even these graphs show much better consistency and repeatability than single locations.
[Figure: average throughput (Mbit/s) vs. link number for two groups of 7 orientations each: orientations 2, 3, 5, 7, 9, 10, 12 and orientations 1, 4, 6, 8, 11, 13, 14]
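The split-half check above can be sketched as follows. The orientation group indices come from the slide; the per-orientation readings are hypothetical placeholders.

```python
# Split-half repeatability check mirroring the grouping above: average two
# disjoint subsets of orientations for one link and compare the results.
# Group indices follow the slide; the readings are hypothetical.
group_a = [2, 3, 5, 7, 9, 10, 12]
group_b = [1, 4, 6, 8, 11, 13, 14]

def group_average(per_orientation, group):
    """per_orientation: dict orientation -> throughput (Mbit/s) for one link."""
    return sum(per_orientation[o] for o in group) / len(group)

# One link's hypothetical readings for orientations 1..14:
link_readings = {o: 15.0 + (o % 3) for o in range(1, 15)}
avg_a = group_average(link_readings, group_a)
avg_b = group_average(link_readings, group_b)
```

If the two group averages track each other across all links, the averaged test is repeatable in the sense the slide describes.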

7 Practical Test Arrangement
The 15 tests at each location were performed by placing both the DUT and the WLCP on stop-motion turntables. Both tables were turned 24° between measurements and stopped for the duration of the throughput test. The 15 stops covered the full circle on both turntables. The tables turned in opposite directions.
[Diagram: client station (DUT) and access point (WLCP), each on a stop-motion turntable with controller; a test control system with a traffic generation device and traffic analyzer automatically controls the turntables]
The connection to one of the turntables was wireless, but it was not operated during the throughput test.
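The measurement sequence above can be sketched as a control loop. The `rotate` and `measure_throughput` callables are hypothetical stand-ins for the real turntable controller and the traffic generator/analyzer, which the slide does not specify in detail.

```python
# Hypothetical control-loop sketch of the arrangement described above:
# 15 stops of 24 degrees cover a full circle; the two tables turn in
# opposite directions, and throughput is measured only while stopped.
STEP_DEG = 24
STOPS = 15
assert STEP_DEG * STOPS == 360  # the stops cover the full circle

def run_sweep(rotate, measure_throughput):
    """rotate(table, degrees) and measure_throughput() are stand-ins for
    the real turntable controller and the traffic generator/analyzer."""
    results = []
    for _ in range(STOPS):
        rotate("dut_table", +STEP_DEG)        # client station table
        rotate("wlcp_table", -STEP_DEG)       # AP table, opposite direction
        results.append(measure_throughput())  # tables are stationary here
    return results
```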

8 Measured Throughputs in All Tests
Here are the results of all measurements. The only difference between the curves on each DUT's graph is the precise location and orientation of the DUT and WLCP.
[Figure: throughput (Mbit/s) vs. link number for DUT1 and DUT2, one curve per step of the turntables]

9 The Physics Causing the Large Variation
The cause of the wide variation and apparent inconsistency in open over the air (OOTA) testing is multipath fading.
Wireless signals reflect to varying degrees from all surfaces that they reach. Reflected signals arrive at the receiving antenna in different phases depending on the distance they travel.
The RF field vectors add to or subtract from one another depending on their phase and polarization.
Moving either of the antennas, or any of the reflecting surfaces, changes the received signal strength.
[Diagram: transmitting antenna, receiving antenna, and multiple reflected paths]
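The phase-dependent addition described above can be illustrated with a minimal phasor sum: each path contributes a complex amplitude a·exp(−j·2π·d/λ), and a small change in any path length changes how the vectors combine. The wavelength, path lengths, and amplitudes below are assumed for illustration.

```python
# Minimal phasor illustration of the mechanism described above: each reflected
# path contributes a complex amplitude a * exp(-j * 2*pi * d / wavelength),
# and small changes in any path length d change how the vectors add.
import cmath
import math

WAVELENGTH_M = 0.125  # roughly the 2.4 GHz band (assumed)

def net_amplitude(path_lengths_m, amplitudes):
    """Magnitude of the coherent sum of all path contributions."""
    total = sum(a * cmath.exp(-2j * math.pi * d / WAVELENGTH_M)
                for d, a in zip(path_lengths_m, amplitudes))
    return abs(total)

# A half-wavelength (6.25 cm) change in one path flips constructive
# addition into a deep fade:
strong = net_amplitude([10.0, 10.125], [1.0, 0.8])   # paths in phase
weak = net_amplitude([10.0, 10.0625], [1.0, 0.8])    # paths out of phase
```

The centimetre-scale sensitivity shown here is why moving either antenna or any reflector changes the received signal strength.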

10 Effect of Antenna Gain Patterns on Multipath Fading
If the gain pattern of either the receiving or transmitting antenna differs from the gain patterns in a reference system, the resulting multipath fading will be different.
Reflected signals from all directions are included in the net signal strength, and their contributions are weighted by the antenna gain in each direction.
This results in a different signal strength (in each location and for each orientation of the antennas) if the gain patterns of the antennas in the compared systems are not identical.
[Diagram: transmitting antenna, receiving antenna, and the vector sum of reflected signal contributions]

11 Signal Variation with Antenna Location
To measure the effect of multipath fading, an access point with two different antenna systems was moved over a grid of 100 locations, and the signal strength was measured at each location.
The client station connected to the access point was about 40 m (120 feet) away in a non-line-of-sight location, and it was not moved at all during the test.
The signal strength was measured using the RSSI reporting feature of a radio card and averaged over hundreds of samples taken during a few minutes, to even out the effect of small changes in the environment during the test.
The local signal strength variation is about 12 to 15 dB, and the locations of high and low signals depend strongly on the antenna system.
[Figure: signal strength maps over the 100-location grid for antenna system 1 and antenna system 2]
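The slide does not say how the hundreds of RSSI samples were combined. One common approach, sketched here as an assumption rather than the authors' method, is to average in the linear power domain and convert back to dB, so that deep fades are weighted by their actual power.

```python
# Hedged sketch of RSSI averaging: dB values are converted to linear power
# (mW), averaged, then converted back to dBm. Whether the original test
# averaged this way is not stated on the slide; this is one common approach.
import math

def average_rssi_dbm(samples_dbm):
    """Mean received power over the samples, expressed in dBm."""
    mean_mw = sum(10 ** (s / 10) for s in samples_dbm) / len(samples_dbm)
    return 10 * math.log10(mean_mw)

# Two samples 12 dB apart (a fade comparable to the 12-15 dB local
# variation reported above):
avg = average_rssi_dbm([-60.0, -72.0])
```

Note that the linear-domain average sits much closer to the stronger sample than a naive average of the dB values (−66 dBm) would.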

12 Multipath Variation for Different Antenna Systems
Multipath fading affects all systems, including MIMO systems.
In this test, the throughputs of four different systems (including a single-radio, two-antenna diversity unit) were compared using the physical test arrangement on slide 11.
Throughput was measured using a standard Chariot test in 100 locations for each of the tested systems.
Each unit experienced very high throughput variations as a result of small movements, and the local patterns were different.

13 Conclusions
It is not possible to evaluate the relative performance of systems with different antenna systems by performing measurements in only a few locations at various distances.
It is not helpful to repeat the same test in exactly the same locations of the AP and the client station and average the results together; the variations are caused by the exact location and orientation of the devices.
Performing several tests at each distance at slightly different locations and orientations, and averaging the results at each distance, is a very effective way to obtain consistent results and achieve repeatability in open over the air (OOTA) testing.
The proposed text of the Draft Recommended Practice for the Evaluation of Wireless Performance under preparation in IEEE 802.11 Task Group T incorporates this methodology for over the air testing.

