1
Performance Measurement and Analysis of H.323 Traffic
Prasad Calyam, Mukundan Sridharan, Weiping Mandrawa, Paul Schopis
19th April 2004, PAM 2004, Juan-Les-Pins, France
2
Background
Today H.323 Videoconferencing is a popular collaboration technology in both industry and academia.
Demand for high-quality video and audio in H.323 applications is always on the rise, even given the rapid advancements in the recent past.
The "Network" is the critical variable that needs to be better understood in assuring a satisfactory end-user experience while using H.323 applications.
Our fundamental question: "What can any given network health diagnostic actually tell us about what the end-user experience might be when using an H.323 application such as Videoconferencing on that network?"
3
Topics of Discussion
Goals of our study
Terminology
Test Setup and Methodology
Results of our study
Conclusions
4
Goals of our Study
To obtain "Good", "Acceptable" and "Poor" performance bounds for network metrics such as delay, jitter and loss for H.323 applications, based on simultaneous subjective and objective quality assessment of H.323 audio and video streams
To identify the most dominant factor among delay, jitter and loss that affects end-user perception of audiovisual quality
5
H.323 Overview
H.323 is an umbrella standard that defines how real-time multimedia communications such as videoconferencing can be supported on packet-switched networks (Internet)
Devices: Terminals, Gateways, Gatekeepers and MCUs
Codecs: H.261, H.263, G.711, G.723.1
Signaling: H.225, H.245
Transport Mechanisms: TCP, UDP, RTP and RTCP
Data collaboration: T.120
Many others…
6
H.323 Protocol Stack
7
Understanding the factors that affect H.323 Traffic performance assessment…
Human Factors: individual perception of audiovisual quality, lack of training to use the system effectively, …
Device Factors: VoIP endpoints, Gateways, MCUs, Routers, Firewalls, NATs, Modems, Operating System, Processor, memory, …
Network Factors: Delay, Jitter, Loss, throughput, BER, …
8
E2E Measurements of Audio and Video Applications
Two approaches to evaluating the performance of audiovisual quality:
Subjective Measurements: involve human participants to rate audiovisual quality ("Can you hear me now?"). We used the Mean Opinion Score (MOS) ranking technique (ITU-T P.800). Not just "Good"!
Objective Measurements: automated techniques to rate audiovisual quality. We used the "E-Model" (ITU-T G.107)
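The E-Model summarizes network and codec impairments into a transmission rating factor R, which ITU-T G.107 maps to an estimated MOS. A minimal sketch of that standard R-to-MOS mapping (the R values themselves would come from a trace analyzer such as VQMon; this sketch only illustrates the conversion step):

```python
def r_to_mos(r: float) -> float:
    """Map an E-Model R-factor (ITU-T G.107) to an estimated MOS.

    Uses the standard piecewise mapping: MOS = 1 for R <= 0,
    MOS = 4.5 for R >= 100, and the cubic formula in between.
    """
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1 + 0.035 * r + r * (r - 60) * (100 - r) * 7e-6

# Example: an R-factor of 80 maps to roughly MOS 4.0
print(round(r_to_mos(80), 2))
```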
9
MOS Rankings
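The ranking table itself appears as an image on this slide; as a reference point, the standard ITU-T P.800 five-point scale it is based on can be written as:

```python
# Standard ITU-T P.800 listening-quality MOS descriptors (the slide's own
# table is an image and is not reproduced here).
MOS_SCALE = {5: "Excellent", 4: "Good", 3: "Fair", 2: "Poor", 1: "Bad"}
```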
10
What constitutes a “Task”?
Any activity that takes place in a routine Videoconference: casual conversation, intense discussion, a class lecture, movements involving minimal physical activity, …
Recommendations insist on realistic scenarios, not just passive viewing
Guidelines can be found for task selection, participant training for scoring the audiovisual quality, task ordering, overall environment setup for quality assessments, …
The H.323 Beacon loopback feature provided additional tasks for the various assessment scenarios
11
Understanding Delay…
[Diagram: end-to-end delay components from sender side, through the network, to receiver side: compression, transmission, electronic, propagation, processing, queuing, resynchronization, decompression and presentation delays]
Delay is the amount of time that a packet takes to travel from the sender's application to the receiver's destination application
Caused by codecs, router queuing delays, …
The one-way delay requirement is stringent for H.323 Videoconferencing in order to maintain good interaction between both ends
Our results: Good (0ms-150ms), Acceptable (150ms-300ms), Poor (>300ms)
12
Understanding Jitter…
Jitter is the variation in delay of the packets arriving at the receiving end
Caused by congestion, insufficient bandwidth, varying packet sizes in the network, out-of-order packets, …
Excessive jitter may cause packet loss in the receiver jitter buffers, thus affecting the playback of the audio and video streams
Our results: Good (0-20ms), Acceptable (20ms-50ms), Poor (>50ms)
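H.323 media streams travel over RTP, and RTP's own interarrival jitter estimator (RFC 3550, section 6.4.1) is a common way to quantify this delay variation. A minimal sketch of that estimator, assuming hypothetical per-packet send and arrival timestamps in milliseconds:

```python
def rtp_interarrival_jitter(send_ts, recv_ts):
    """Running interarrival jitter estimate per RFC 3550 (section 6.4.1).

    send_ts, recv_ts: per-packet send and receive timestamps (ms),
    in arrival order. Returns the final smoothed jitter estimate (ms).
    """
    jitter = 0.0
    prev_transit = None
    for s, r in zip(send_ts, recv_ts):
        transit = r - s
        if prev_transit is not None:
            d = abs(transit - prev_transit)
            jitter += (d - jitter) / 16.0  # 1/16 smoothing gain from RFC 3550
        prev_transit = transit
    return jitter

# Example: transit time alternates by 10 ms; the estimate moves toward
# 10 ms as more packets accumulate.
print(round(rtp_interarrival_jitter([0, 20, 40, 60], [50, 80, 90, 120]), 1))
```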
13
Understanding Loss…
Packet loss is packets discarded deliberately (RED, TTL=0) or non-deliberately by intermediate links, nodes and end-systems along a given transmission path
Caused by line properties (Layer 1), full buffers (Layer 3) or late arrivals (at the application)
Our results: Good (0%-0.5%), Acceptable (0.5%-1.5%), Poor (>1.5%)
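To make the bounds from the delay, jitter and loss slides above concrete, here is a minimal sketch that grades a measured (delay, jitter, loss) triple against them; taking the worst of the three grades as the overall rating is an assumption for illustration, not the paper's scoring model:

```python
def grade(value, good_max, acceptable_max):
    """Grade a single metric against its Good/Acceptable upper bounds."""
    if value <= good_max:
        return "Good"
    if value <= acceptable_max:
        return "Acceptable"
    return "Poor"

def grade_path(delay_ms, jitter_ms, loss_pct):
    """Grade each metric using the bounds reported in this study, and
    take the worst grade as an illustrative overall rating (assumption)."""
    grades = {
        "delay": grade(delay_ms, 150, 300),   # Good <=150 ms, Acceptable <=300 ms
        "jitter": grade(jitter_ms, 20, 50),   # Good <=20 ms, Acceptable <=50 ms
        "loss": grade(loss_pct, 0.5, 1.5),    # Good <=0.5 %, Acceptable <=1.5 %
    }
    order = ["Good", "Acceptable", "Poor"]
    overall = max(grades.values(), key=order.index)
    return grades, overall

# Example: 120 ms delay, 35 ms jitter, 0.2 % loss -> jitter is Acceptable,
# so the illustrative overall rating is Acceptable.
print(grade_path(120, 35, 0.2))
```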
14
Overall Test Setup
15
World Sites involved in Testing…
More than 500 one-on-one subjective quality assessments from Videoconferencing end-users, and the corresponding traffic traces, were obtained from the testing!
16
Various Tools Used in the Testing…
OARnet H.323 Beacon: a tool to troubleshoot H.323 performance problems (network path characteristics determination, audio loopback, customized tests, …)
Ethereal: a network traffic sniffer, used to capture traffic traces for Objective Ratings analysis
Telchemy VQMon: a network traffic trace analyzer for VoIP, used for obtaining Objective Ratings for various network settings
NISTnet: a network emulator, used to create realistic network scenarios by introducing Delay, Jitter and Loss
Spirent SmartBits: a commercial network measurement suite, used to qualify NISTnet
NLANR Iperf: a TCP/UDP bandwidth measurement tool, used for network path characteristics determination
jaalaM appareNet: a commercial network measurement suite
Polycom FX Videoconferencing Station: source of the Audio/Video clips
17
H.323 Beacon Screenshots…
18
H.323 Beacon Screenshots… (Contd.)
19
H.323 Beacon Screenshots… (Contd.) http://www.itecohio.org/beacon
20
Internet Tests Methodology…
Design of Experiments → 2-Phase Approach
Phase I: LAN with WAN Emulation Tests
Full factorial (n^3 = 27; n = 3) tasks with various network health scenario configurations on NISTnet
Phase II: Internet Tests
9 equivalent tasks out of the possible 27 tasks + 3 H.323 Beacon tasks using the loopback feature
End-to-End delay, jitter and loss values configured on NISTnet for each Internet test site = (LAN NISTnet Settings) - (Inherent Path Characteristics for the corresponding Internet test site)
Scenario grades (Delay, Jitter, Loss): (G,G,G), (A,A,A), (P,P,P), (G,G,A), (G,A,G), (A,G,G), (G,G,P), (G,P,G), (P,G,G); Beacon loopback tasks: (G,G,G), (A,A,A), (P,P,P)
G → Good; A → Acceptable; P → Poor; B(G/A/P) → Beacon Task
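To make the Phase I design concrete, a short sketch that enumerates the 27 full-factorial combinations of Good/Acceptable/Poor settings across delay, jitter and loss (the concrete NISTnet values behind each grade are not reproduced here):

```python
from itertools import product

grades = ["Good", "Acceptable", "Poor"]

# Phase I: every combination of grade for (delay, jitter, loss) -> 3**3 = 27 scenarios
phase1 = list(product(grades, repeat=3))
print(len(phase1))            # 27
print(phase1[0], phase1[-1])  # ('Good', 'Good', 'Good') ('Poor', 'Poor', 'Poor')
```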
21
Results: Subjective and Objective MOS Vs Delay
Pearson correlation coefficient for Subjective and Objective Scores for Delay = 0.827
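The Pearson coefficient reported here can be reproduced with the standard formula; a minimal sketch, with made-up placeholder score lists standing in for the study's actual subjective and objective MOS data:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Placeholder example: subjective vs. objective MOS for a handful of scenarios
subjective_mos = [4.2, 3.8, 3.1, 2.4, 1.9]
objective_mos = [4.0, 3.9, 3.3, 2.2, 2.1]
print(round(pearson(subjective_mos, objective_mos), 3))
```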
22
Results: Subjective and Objective MOS Vs Jitter
Pearson correlation coefficient for Subjective and Objective Scores for Jitter = 0.737
23
Results: Subjective and Objective MOS Vs Loss
Pearson correlation coefficient for Subjective and Objective Scores for Loss = 0.712
24
Overall Results of Quality Grade Assessments for LAN/Internet Tests
[Table: quality grade assessments per scenario S1-S9, listing the Delay, Jitter and Loss grade (G/A/P) configured for each scenario and the resulting perceived quality, e.g. A/P, G*/A, A*/P]
Legend: G → Good; A → Acceptable; P → Poor; S1-S9 → Scenarios 1-9; a * next to a grade means end-users will more often perceive that grade of quality in that particular scenario
25
Results: Effects of Normalized Delay, Jitter and Loss variations on Subjective MOS
Normalized Scale: 1 Unit → 150ms Delay; 20ms Jitter; 0.5% Loss
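A small sketch of the normalization used for this comparison, assuming the per-unit scale stated above (1 unit = 150 ms delay, 20 ms jitter, 0.5 % loss):

```python
# Convert raw measurements into the normalized units used for comparison:
# 1 unit of delay = 150 ms, 1 unit of jitter = 20 ms, 1 unit of loss = 0.5 %
def normalize(delay_ms, jitter_ms, loss_pct):
    return {
        "delay_units": delay_ms / 150.0,
        "jitter_units": jitter_ms / 20.0,
        "loss_units": loss_pct / 0.5,
    }

print(normalize(300, 40, 1.0))  # each metric is 2 normalized units here
```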
26
Results: Effects of Normalized Delay, Jitter and Loss variations on Objective MOS
End-user perception of audiovisual quality is more sensitive to changes in jitter than to changes in delay and loss
27
Conclusions
We have determined performance bounds for latency, jitter and loss by conducting exhaustive testing in a LAN environment with WAN emulation. By meticulously using many incremental values of each of the 3 metrics, we determined which ranges of these values, in isolation and in combination, cause user perception of audiovisual quality to fall into the "good", "acceptable" and "poor" grades.
We have demonstrated that the theory of bounds developed in the LAN with WAN emulation actually holds on the Internet. By using virtually every type of last-mile connection and by testing with sites all over the world, we showed that our LAN results carry over consistently to the Internet.
28
Conclusions (Contd.)
We have observed the H.323 Beacon loopback assessments (again, both subjective and objective) to be in accord with the audiovisual task test results. This demonstrates the utility of the H.323 Beacon for determining user-perceived audiovisual quality in H.323 production systems without remote end-user intervention.
Our normalized results indicate that end-user perception of audiovisual quality is more sensitive to changes in jitter than to changes in delay and loss.
29
Questions