Peer-to-peer Multimedia Streaming and Caching Service by Won J. Jeon and Klara Nahrstedt University of Illinois at Urbana-Champaign, Urbana, USA
Agenda
- Introduction
- Proposed peer-to-peer architecture
- Caching and streaming
- Simulation results
- Comparison with our server-less architecture
- Conclusion
Introduction
Important metrics for multimedia streaming:
- Low initial delay
- Small delay jitter during playback
- Minimum network bandwidth utilization
Current Solution (1)
- Caching and pre-fetching by a media gateway (proxy)
- Caches segments of the stream, or only its prefix
- Located geographically close to the clients
- Achieves small initial delay and low delay jitter during playback
Current Solution (2)
- Broadcasting services (e.g., Skyscraper)
- Achieve minimum server network bandwidth utilization
- Assume synchronous playback times across clients
- Assume buffering at the clients
Motivation
Motivation for the proposed peer-to-peer architecture:
- Exploits the proximity of clients
- Minimizes the bandwidth utilization between the server and the group of clients
Architecture:
- Assumes the group of peer-to-peer clients is connected via a LAN
- Each client not only receives streams from the server, but also acts as a proxy server
Proposed Architecture
- Topology: one server (S), one cache manager, and client nodes C1 through Cx, each caching a segment of the stream such as S(0, t1), S(t1, t2), S(t3, t4), or S(0, t4) (figure in original slide)
- S(i, j) denotes the segment of the stream between bytes i and j
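The byte-range notation S(i, j) can be modeled as a small data type. A minimal sketch (the class and method names are illustrative, not from the paper):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Segment:
    """S(i, j): the portion of a stream between bytes i and j."""
    start: int  # byte offset i
    end: int    # byte offset j (treated as exclusive here, by assumption)

    def overlaps(self, other: "Segment") -> bool:
        # Two cached segments overlap if neither ends before the other begins.
        return self.start < other.end and other.start < self.end
```

Overlap tests like this are what a cache manager would need to decide whether a peer's cached range covers a requested byte range.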
Caching
Cache management:
- Each client caches the retrieved stream and publishes its cache information to the cache manager
- Each client monitors its own resource availability (e.g., network bandwidth) and reports updates to the cache manager
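The publish-and-report protocol above can be sketched as a centralized registry. This is a minimal illustration under assumed data shapes; the class and method names are not from the paper:

```python
from collections import defaultdict

class CacheManager:
    """Sketch of the centralized cache manager's bookkeeping."""

    def __init__(self):
        # stream_id -> list of (client, start_byte, end_byte) entries
        self.segments = defaultdict(list)
        # client -> most recently reported available bandwidth (bps)
        self.resources = {}

    def publish(self, client, stream_id, start, end):
        # A client announces a newly cached segment S(start, end).
        self.segments[stream_id].append((client, start, end))

    def update_resources(self, client, bandwidth_bps):
        # Clients periodically report their monitored resource availability.
        self.resources[client] = bandwidth_bps

    def lookup(self, stream_id, byte_offset):
        # Return the clients whose cached segment covers the requested offset.
        return [c for (c, s, e) in self.segments[stream_id]
                if s <= byte_offset < e]
```

A peer's cache lookup (next slide) would be answered from exactly this kind of state.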
Streaming
Cache lookup:
- Send a query message to the cache manager for information about the streams cached at peer clients
Streaming and pre-fetching:
- Using the response from the cache manager, the client sends streaming and pre-fetching requests to the peer clients or to the server
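The lookup-then-request step can be sketched as a source-selection function. The peer-selection rule (prefer the highest reported bandwidth) is an assumption for illustration, not stated in the slides:

```python
def choose_source(cached_at, bandwidth, server, stream_id, offset):
    """Pick where to send the streaming request, given the cache
    manager's response. All names here are illustrative.

    cached_at: {stream_id: [(peer, start_byte, end_byte), ...]}
    bandwidth: {peer: reported available bandwidth in bps}
    """
    candidates = [p for (p, s, e) in cached_at.get(stream_id, [])
                  if s <= offset < e]
    if not candidates:
        return server  # no peer caches these bytes: stream from the server
    # Assumed tie-break: the peer with the most reported bandwidth.
    return max(candidates, key=lambda p: bandwidth.get(p, 0))
```

The fallback to the server reflects the architecture's design: the dedicated server is retained and serves any bytes no peer has cached.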
Streaming
- Timing diagram of the streaming and pre-fetching requested by C_i (figure in original slide)
Streaming Switching
Minimizing the switching delay jitter:
- The pre-fetching time t1* is determined by the network bandwidth B_ik and the network delay D_ik between C_i and C_k
- Case 1: B_ik is larger than the service rate of C_i
- Case 2: B_ik is smaller than the service rate of C_i
Streaming Switching
Pre-fetching time for Case 1 and Case 2 (formulas given in the original slide), expressed in terms of:
- the estimated available bandwidth between C_i and C_k in the time period t1* to t2
- the estimated size of the stream at time t
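The slide's two formulas appear only as figures in the original and are not reproduced above. As a hypothetical reconstruction of the case split, one plausible reading is that the pre-fetch lead time must hide the network delay, plus enough buffering to cover the bandwidth shortfall when the peer link is slower than the playback rate. All names and formulas below are illustrative assumptions, not the paper's:

```python
def prefetch_lead_time(bandwidth, rate, delay, playback_duration):
    """How long before the switch time pre-fetching should begin.
    Hypothetical reconstruction; the paper's exact formulas differ."""
    if bandwidth >= rate:
        # Case 1: the peer link keeps up with the playback rate, so
        # pre-fetching only needs to hide the network delay D_ik.
        return delay
    # Case 2: the link is slower than the playback rate; the shortfall
    # (rate - bandwidth) over the playback window must be buffered
    # before switching, at the link's bandwidth.
    deficit_bytes = (rate - bandwidth) * playback_duration
    return delay + deficit_bytes / bandwidth
```

For a 1.5 Mbps video served over a 1 Mbps peer link, Case 2 applies and the lead time grows with the remaining playback duration.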
Simulation
- Simulator: ns-2
- Video: Jurassic Park I (1.5 Mbps)
- Topology: one server, two routers, and four clients
- Background traffic for all links between nodes and routers: Pareto distribution
Simulation
- Simulated topology (figure in original slide)
Simulation Results
- Results without pre-fetching and with pre-fetching (plots in original slide)
Comparison
Proposed P2P architecture:
- The dedicated server is not eliminated
- Assumes different segments are cached by different peers
- Assumes a centralized cache manager
- Single points of failure: the server and the cache manager
Server-less architecture:
- The dedicated server is eliminated
- Video blocks are distributed to all nodes
- Requires an additional encoding step for fault tolerance
Conclusion
Proposed a peer-to-peer streaming and caching architecture:
- The cache manager maintains all cache and network-connection information
Achievements:
- Reduces the initial delay
- Minimizes the delay jitter during playback
End of Presentation Thank you!