Measurements of Congestion Responsiveness of Windows Streaming Media (WSM) Presented by: Ashish Gupta
Roadmap Introduction Graphs and Observations Conclusion Comments
Introduction Aim: Characterize the bitrate response of WSM's "intelligent streaming" and content encoding. Variable parameters: network characteristics such as bottleneck bandwidth, and content encoding characteristics.
Windows Streaming Media (WSM) Intelligent streaming: the content bitrate is adjusted to the currently available bandwidth. The server and client determine the current bandwidth. Multiple bitrate-encoded files are used for streaming. Image quality is intelligently degraded first (requires both server and client support); audio is reconstructed to preserve quality.
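The degradation order described above (sacrifice video quality first, preserve audio) can be sketched as follows. This is an illustrative model, not WSM's actual algorithm, and the bitrates in the usage example are hypothetical:

```python
def select_stream(available_kbps, video_bitrates_kbps, audio_kbps):
    """Pick the highest video bitrate that, together with the audio
    track, fits the available bandwidth; degrade video all the way to
    zero before touching audio (audio quality is preserved)."""
    for v in sorted(video_bitrates_kbps, reverse=True):
        if v + audio_kbps <= available_kbps:
            return {"video_kbps": v, "audio_kbps": audio_kbps}
    # No video encoding fits: drop video entirely, keep audio intact.
    return {"video_kbps": 0, "audio_kbps": audio_kbps}
```

For example, with 300 Kbps available, video encodings of 548/282/148 Kbps, and a 58 Kbps audio track, the sketch selects the 148 Kbps video and keeps the full audio.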
WSM (Contd.) The WSM server decides the currently available bandwidth. This requires multiple-bitrate encoded streams from the media creator. Some media parameters, such as thinning and maximum bitrate, can be set during transfer initiation.
Roadmap Introduction Experimental Setup Graphs and Observations Conclusion Comments
Experimental Setup (as shown in the paper)
Setup (Contd.) The client runs media-tracker software to capture client-side performance statistics. A Linux machine on each side of the router captures the offered and achieved network loads. Internet behavior is emulated by the router, which is set to measured Internet latencies (normal Internet traffic).
Roadmap Introduction Experimental Setup Graphs and Observations Conclusion Comments
Graphs All graphs in the next few slides use single-bitrate encoding. The network bandwidth is 725 Kbps and the latency is set to the measured Internet latency of 45 ms. WSM is given a 60-second clip encoded at 340 Kbps, roughly equal to the fair share of capacity.
Graphs (Contd.)
Observations WSM playback can be partitioned into a buffering phase and a playout phase. Packet loss of 20-40 percent occurred during the buffering phase. The previous experiment is repeated with a 540 Kbps clip, well above the fair-share bandwidth.
Graphs
Observations The media-thinning functionality provided by "intelligent streaming" was NOT used. The competing TCP flow was denied its fair share of the available capacity. Heavy packet losses were encountered during the buffering period.
Graphs
Observations No media thinning occurred during the buffering phase. The bitrate is increased even during losses in the buffering phase (the "fire-hose" approach). The transmission rate decreased below the fair-share bandwidth during the playout phase.
Graph Now the buffering and post-buffering phases are studied independently. The graphs plot "transmission bitrate of WSM/TCP" vs. "content encoded bitrate" and "loss rate" vs. "content encoded bitrate". Networks of different capacities are considered, without induced losses, for the buffering phase only.
Graphs (Buffering: Capacity 250 Kbps)
Graphs (Buffering: Capacity 725 Kbps)
Graphs (Buffering: Capacity 1500 Kbps)
Observations The buffering rate is proportional to the content encoding rate until the encoding rate exceeds the bottleneck capacity. Beyond this, the loss rate can be as high as 80%. WSM has a high loss rate due to its higher sending rate.
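The proportional-then-lossy behavior above can be captured by a first-order model in which the bottleneck drops everything sent above capacity. This is a simplification (real drop behavior depends on queueing), but it reproduces the magnitude of the observed losses:

```python
def expected_loss_rate(send_kbps, capacity_kbps):
    """Fraction of packets dropped when sending above the bottleneck
    capacity, under a first-order model where the router forwards at
    most `capacity_kbps` and drops the rest."""
    if send_kbps <= capacity_kbps:
        return 0.0
    return 1.0 - capacity_kbps / send_kbps
```

For instance, pushing a 1128 Kbps clip through a 250 Kbps bottleneck predicts roughly 78% loss, consistent with loss rates of up to 80% reported above.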
Observations (Contd.) The behavior of WSM changes between 340 Kbps and 548 Kbps. At a 548 Kbps encoding rate there is a significant drop in the loss rate.
Graphs (Playout: Capacity 250 Kbps)
Graphs (Playout: Capacity 725 Kbps)
Graphs (Playout: Capacity 1500 Kbps)
Observations In post-buffering, the bitrate is proportional to the encoding bitrate up to the bottleneck capacity. Still higher encoding bitrates initially incur losses of up to 40 percent. The bitrate drops below that of the TCP flow due to media thinning. Accurate information on thinning behavior over the post-buffering period is lost because only average values are reported.
Graphs: Type of traffic (340 Kbps) Packet sequence number vs. time for the 340 Kbps clip
Observations WSM traffic is bursty in nature. Retransmission occurs for any dropped packet. Reason for the spikes: the shortening of transmission time between packet bursts.
Graphs (Multiple bitrate) Aim: Explore the number of bitrates vs. responsiveness. Ten sets of clips were used: the 1st clip (1128 Kbps), the 2nd clip (1128, 764 Kbps), ..., the 10th clip (1128, ..., 764, 548, ..., 282, 148, ... Kbps). The bitrate is studied vs. the lowest bitrate contained in the clip.
Graph (Buffering: Multiple bit rate)
For buffering: WSM chooses the encoded bitrate just below the bottleneck bandwidth; if none is below it, WSM takes the lowest bitrate available.
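The selection rule observed above can be stated as a short sketch (the function name and interface are illustrative, not WSM's API):

```python
def buffering_bitrate(bottleneck_kbps, encoded_bitrates_kbps):
    """Observed WSM rule during buffering: stream the highest encoded
    bitrate below the bottleneck bandwidth; if every encoding exceeds
    the bottleneck, fall back to the lowest one."""
    below = [b for b in encoded_bitrates_kbps if b < bottleneck_kbps]
    return max(below) if below else min(encoded_bitrates_kbps)
```

On a 725 Kbps bottleneck with encodings of 1128, 764, 548, and 340 Kbps, this rule picks 548 Kbps, matching the encoding the paper identifies as "the largest below 725 Kbps".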
WSM-specific observations Bottleneck bandwidth: available bandwidth is measured in WSM using three large packets; two packet-pair estimates are used to get the bottleneck bandwidth. The bottleneck bandwidth is estimated only once, before each session, and then RTCP messages are used to control retransmission. The frequency of RTCP messages increases with increasing loss rate.
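A minimal sketch of the packet-pair technique described above: each consecutive pair of back-to-back packets yields one estimate (packet size divided by the inter-arrival gap), so three packets yield two estimates. Averaging the two is an assumption here; the text does not state how WSM combines them:

```python
def packet_pair_estimate(arrival_times_s, packet_bytes):
    """Estimate bottleneck bandwidth (bits/s) from back-to-back packets.
    Each consecutive pair contributes size / inter-arrival gap; the
    estimates are averaged (an assumption, not WSM's documented rule)."""
    estimates = []
    for t0, t1 in zip(arrival_times_s, arrival_times_s[1:]):
        estimates.append(packet_bytes * 8 / (t1 - t0))  # bits per second
    return sum(estimates) / len(estimates)
```

For example, three 1500-byte packets arriving 12 ms apart imply a bottleneck of about 1 Mbps.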
Graphs (Induced losses) Loss is induced by the router to check the behavior of WSM under loss caused by the network. For further discussion: bottleneck bandwidth 725 Kbps, encoded bitrate 548 Kbps (the largest below 725 Kbps).
Graph (Buffering: network induced loss)
Observations (Buffering: network-induced loss) With a high induced loss rate, TCP decreases its flow. WSM increases its flow to compensate for the high loss rate. During the initial losses, WSM buffering uses the "fire-hose" approach.
Graph (post-buffering: network induced loss)
Observations (Post-buffering: network-induced loss) A loss rate of 3-5% causes WSM to thin the stream. Above a 5% loss rate there is no further change in the thinned bitrate.
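The thresholds observed above can be summarized as a sketch. The thinned bitrate passed in is hypothetical; the observation only says that thinning starts around 3% loss and that the thinned rate stops changing above 5%:

```python
def playout_response(loss_rate, full_kbps, thinned_kbps):
    """Observed single-bitrate playout behavior: stream at the full
    encoded rate below ~3% loss; thin the stream at 3-5% loss; above
    5% the thinned rate does not drop any further."""
    return full_kbps if loss_rate < 0.03 else thinned_kbps
```

So at 1% loss the stream runs at the full rate, while 4% and 10% loss both yield the same thinned rate.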
Graphs (Induced losses, MBR clip) The next experiment uses a multiple-bitrate (MBR) encoded clip (548, 340, 282, 148, 106, 58, ... Kbps).
Graphs (Induced losses with MBR)
Instead of thinning, WSM chooses to stream at a lower encoded bitrate.
Graphs (sudden change in induced loss) These show that WSM is unaffected by a sudden change in network loss.
WSM model A WSM model should have two phases: buffering and post-buffering. Bursty buffering is TCP-unfriendly. In some post-buffering periods WSM is more than TCP-friendly.
Conclusion During buffering, TCP friendliness can only be achieved if the encoded bitrate is less than the estimated capacity; otherwise, buffering is done at the encoding rate. During playout, thinning or streaming a low-bitrate encoded stream is done. This can still be TCP-unfriendly if the encoding rate is more than half the capacity but less than the full capacity.
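The playout-phase condition from the conclusion can be written as a sketch, assuming a single competing TCP flow whose fair share is half the link capacity:

```python
def playout_tcp_friendly(encoding_kbps, capacity_kbps):
    """Conclusion's playout condition: with one competing TCP flow,
    the fair share is capacity/2. An encoding rate between capacity/2
    and the full capacity squeezes TCP below its fair share, so the
    stream is TCP-unfriendly in that band."""
    return not (capacity_kbps / 2 < encoding_kbps < capacity_kbps)
```

On the 725 Kbps link used in the experiments, a 548 Kbps encoding falls in the unfriendly band (above the 362.5 Kbps fair share but below capacity), while 340 Kbps stays friendly.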
Conclusion This study shows that content providers can make judicious decisions about encoding rates and the number of encoding levels. It provides researchers with a more accurate model of streaming media traffic.
Comments The study is exhaustive. However, it does not propose an analytical model for estimating the number of encoding bitrate levels and their values.