Application Layer Testing: An Example

Doc.: IEEE 802.11-05/1582r0
Date: 2005-01-17
Authors: Tom Alexander, VeriWave; Dr. Michael D. Foegelle, ETS-Lindgren

Notice: This document has been prepared to assist IEEE 802.11. It is offered as a basis for discussion and is not binding on the contributing individual(s) or organization(s). The material in this document is subject to change in form and content after further study. The contributor(s) reserve(s) the right to add, amend or withdraw material contained herein.

Release: The contributor grants a free, irrevocable license to the IEEE to incorporate material contained in this contribution, and any modifications thereof, in the creation of an IEEE Standards publication; to copyright in the IEEE's name any IEEE Standards publication even though it may include portions of this contribution; and at the IEEE's sole discretion to permit others to reproduce in whole or in part the resulting IEEE Standards publication. The contributor also acknowledges and accepts that this contribution may be made public by IEEE 802.11.

Patent Policy and Procedures: The contributor is familiar with the IEEE 802 Patent Policy and Procedures <http://ieee802.org/guides/bylaws/sb-bylaws.pdf>, including the statement "IEEE standards may include the known use of patent(s), including patent applications, provided the IEEE receives assurance from the patent holder or applicant with respect to patents essential for compliance with both mandatory and optional portions of the standard." Early disclosure to the Working Group of patent information that might be relevant to the standard is essential to reduce the possibility for delays in the development process and increase the likelihood that the draft publication will be approved for publication. Please notify the Chair <stuart.kerry@philips.com> as early as possible, in written or electronic form, if patented technology (or technology under patent application) might be incorporated into a draft standard being developed within the IEEE 802.11 Working Group. If you have questions, contact the IEEE Patent Committee Administrator at <patcom@ieee.org>.

Abstract and Outline

Abstract: Considerable discussion has taken place recently on the need to correlate application layer tests with controlled lower-layer tests. This presentation gives an example:
- An end-to-end application layer test revealed anomalous performance loss
- The performance loss was modeled, and traced to an unexpected MAC layer effect
- The MAC layer effect could then be the subject of a controlled test on the WLAN APs or clients alone

Outline:
- A brief introduction to the application layer test, and some results in a controlled open-air test environment
- The MAC layer effect, and the relevant controlled lower-layer test

The Controlled Open-Air Test

Application Layer Test: VoWLAN Performance

Objective: to characterize the performance of a DUT (APs + WLAN switch) when used in enterprise-class VoIP applications.

Performance parameters measured:
- Call quality (R-value, jitter, loss)
- Call quality degradation with data loading
- Failover time and call drop counts
- Impact of failover on R-values

Test conditions and DUT configuration parameters:
- Different numbers of handsets
- QoS enabled and disabled on the DUT
- Artificial delays representing WAN links between sites

Test Setup

18 WLAN devices:
- 14 VoIP handsets
- 2 enterprise APs
- 2 traffic generator/analyzer units

Wired network:
- VoIP call server/gateway
- WLAN switch/router
- Other LAN devices

OTA test:
- 1 or 2 BSSIDs set up with physical and channel separation
- Handsets had integrated antennas; some DUTs had integrated antennas

Controlled Open-Air Testing

Several measures were taken to achieve repeatability.

Control of the propagation environment:
- Enclosed room with concrete walls and minimal metallic content within LOS zones
- Careful equipment and furniture positioning (secured in place for the duration of the tests)
- Minimized movement of scatterers (metallic objects) and absorbers (people)

Control of interference:
- Eliminated RF interference (cordless phones, Bluetooth, etc.)
- Eliminated other WLAN devices (scanned for and shut off)

Precision traffic generation and analysis:
- Traffic generator offered load could be controlled to 5% accuracy
- Handset clocks aligned to within 4 ppm
- Analysis timestamps aligned to within 50 ns

Detailed monitoring of environment and devices during tests:
- Traffic analyzer reported duration and strength of noise bursts
- Offered load monitored from trial to trial
- FER levels monitored from trial to trial

Repeatability Observed During Tests

Actual repeatability observed, within trial and across trials:
- R-factor variation across flows (handsets) within a trial: < ±5%, typically under ±2% (by comparison, variation from DUT to DUT could be > 50%)
- R-factor variation across trials (same DUT and handsets): < ±1%, typically under ±0.2%
- Failover roaming time variation across handsets within a trial: < ±15% (by comparison, variation from DUT to DUT could be as much as 500%)
- Failover roaming time variation across trials (same DUT and handset): < ±5%
- Data throughput (forwarding rate) variation across trials: < ±2%, typically < ±0.5%
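The variation figures above can be reproduced as half the peak-to-peak spread expressed as a percentage of the mean. A minimal sketch, using hypothetical R-factor values (the actual measured values are not in this presentation):

```python
def pct_variation(values):
    """Half the min-to-max spread as a +/- percentage of the mean."""
    mean = sum(values) / len(values)
    return (max(values) - min(values)) / 2 / mean * 100

# Hypothetical R-factors for 14 handsets within one trial
r_factors = [78.1, 77.5, 78.4, 77.9, 78.0, 77.7, 78.2,
             77.8, 78.3, 77.6, 78.0, 77.9, 78.1, 77.7]

print(f"within-trial variation: +/-{pct_variation(r_factors):.2f}%")
```

With this spread the sketch reports a variation well under the ±2% typical bound quoted above.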

Driving WLAN Metrics From Application Layer Measurements

Anomalous Performance Issues

Performance results were counter-intuitive:
- Very limited number of calls supported (most DUTs did not support more than 6 calls, i.e. under 1 Mb/s of voice traffic!)
- Injecting just 1 Mb/s of data alongside < 1 Mb/s of voice traffic caused dramatic R-value reductions, voice dropouts, and calls dropping off line
- Very poor call roaming times (up to 15 seconds!) during the failover test

The excessive roaming time was particularly puzzling:
- Time to restore VoIP streams far exceeded the time for a "data" client to disassociate from AP #1 and authenticate/associate with AP #2
- One example: total time for 128 "golden" Layer 4 data clients to roam was 307 milliseconds, while the average time for 14 VoIP handsets to roam was 4.324 seconds, with a worst-case handset roaming time over 10 seconds

Analysis of Underlying Behavior

Handsets were making continuous channel/AP availability measurements:
- Packet delay and loss variations were monitored on a packet-by-packet basis
- Excessive loss or jitter triggered active probing (probe request/response)
- If active probing indicated response times over 30-60 ms, channel scanning started
- Once channel scanning started, load went up and throughput went down
- Result: dramatic variations in R-value

DUTs were introducing probe request/response handshake delays:
- Primary AP goes down; handsets move to the backup AP in less than 100 milliseconds
- Handset sends probe requests to the backup AP
- AP fails to return a probe response, or responds too late (> 30 ms)
- Handset assumes the AP is not present or is overloaded; it does not associate and keeps scanning
- Sometimes probing took > 15 s before acceptable probe responses were received

This had a significant impact on both voice quality and failover time:
- Long dead times in voice streams; sometimes calls dropped entirely due to timeouts
- Long dead times and delays in roaming

There was an enormous disconnect between L2 and L7 events:
- Throughput tests using "data" traffic indicated ample bandwidth for VoIP traffic
- Failover roaming tests using "data" traffic indicated millisecond failover times
- Actual VoIP handsets behave very differently from data client adapters
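The handset decision loop described above can be sketched as a small state function. All thresholds, names, and the loss criterion here are illustrative assumptions, not taken from any handset firmware:

```python
JITTER_TRIGGER_MS = 30        # assumed: excessive jitter starts active probing
LOSS_TRIGGER_PCT = 1.0        # assumed: excessive loss starts active probing
PROBE_RESPONSE_LIMIT_MS = 30  # slower probe responses trigger channel scanning

def handset_step(jitter_ms, loss_pct, probe_response_ms):
    """Decide the handset's next action from per-packet measurements.

    probe_response_ms is None when no probe response has arrived yet.
    Returns one of: 'stay', 'probe', 'scan'.
    """
    if jitter_ms <= JITTER_TRIGGER_MS and loss_pct < LOSS_TRIGGER_PCT:
        return "stay"                      # link looks healthy
    if probe_response_ms is None:
        return "probe"                     # trigger probe request/response
    if probe_response_ms > PROBE_RESPONSE_LIMIT_MS:
        return "scan"                      # AP too slow: keep scanning channels
    return "stay"                          # probe answered in time: stay on AP
```

For example, `handset_step(45, 0.0, 40)` returns `"scan"`: high jitter triggered a probe, and the 40 ms response exceeded the limit, which is exactly the path that produced the multi-second roaming times above.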

Layer 2 Tests Indicated

(Figure: protocol stack — Voice / RTP / TCP / IP / MAC / PHY)

"Expected" QoS testing:
- End-to-end delay
- Delay variation
- Packet loss
- Burst loss profile over time
- Impact of background data traffic on the above

Additional tests:
- Probe request/response time, especially in the presence of multiple concurrent probe requests/responses
- Impact of background data on probe requests/responses (some DUTs worked well provided they were not required to transport data concurrently with voice, even though the bandwidth required was well below the DUT's capacity)
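The additional probe request/response test reduces to timing request/response pairs in a capture. A hypothetical sketch (timestamps in milliseconds; the 30 ms limit and the capture data are assumptions for illustration):

```python
def probe_response_stats(pairs, limit_ms=30.0):
    """Summarize probe request/response timing from a capture.

    pairs: list of (request_ts, response_ts); response_ts is None
    when the AP never answered the probe.
    Returns (worst_delay_ms, fraction of probes late or unanswered).
    """
    delays = [rsp - req for req, rsp in pairs if rsp is not None]
    missed = sum(1 for _, rsp in pairs if rsp is None)
    late = sum(1 for d in delays if d > limit_ms)
    worst = max(delays) if delays else None
    return worst, (late + missed) / len(pairs)

# Hypothetical capture: three probes answered, one lost
capture = [(0.0, 4.2), (100.0, 141.0), (200.0, None), (300.0, 312.5)]
worst, bad = probe_response_stats(capture)
```

Here the worst observed delay is 41 ms and half the probes are late or unanswered; under the handset behavior described earlier, such a DUT would keep the handset scanning.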

Conclusion

- The 802.11 protocol is complex and has many moving parts
- Some of these moving parts have a significant and non-intuitive impact on application layer performance
- While users may not know (or care) about 802.11 effects, they certainly care about application layer performance
- We should devote some energy to picking metrics that are driven by actual application layer measurements and modeling