Peer-to-Peer Multimedia Streaming and Caching Service
Won J. Jeon and Klara Nahrstedt
University of Illinois at Urbana-Champaign, Urbana, USA
Agenda
Introduction
Proposed peer-to-peer architecture
Caching and streaming
Simulation results
Comparison with our server-less architecture
Conclusion
Introduction
Important metrics for multimedia streaming:
Low initial delay
Small delay jitter during playback
Minimum network bandwidth utilization
Current Solutions (1)
Caching and pre-fetching by a media gateway (proxy)
Caches segments of the stream, or only its prefix
Located geographically close to the clients
Achieves a small initial delay and low delay jitter during playback
Current Solutions (2)
Broadcasting services (e.g., Skyscraper)
Achieve minimum server network bandwidth utilization
Assume synchronized playback times
Assume buffering at the clients
Motivation
Motivation for the proposed peer-to-peer architecture:
Exploits the proximity of clients
Minimizes the bandwidth utilization between the server and the group of clients
Architecture:
Assumes the group of peer-to-peer clients is connected via a LAN
Each client not only receives streams from the server but also acts as a proxy server
Proposed Architecture
Topology: one server (S), one cache manager, and four client nodes (C1, C2, C3, Cx)
s(i,j) represents the segment of the stream between bytes i and j
[Figure: the server holds s(0,t4); clients cache segments such as s(0,t1), s(t1,t2), and s(t3,t4)]
Caching
Cache management:
Each client caches the retrieved stream and publishes its cache information to the cache manager
Each client monitors its own resource availability (e.g., network bandwidth) and reports updates to the cache manager
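The bookkeeping described above can be sketched as follows. This is a minimal illustration, not the paper's implementation; the class and method names (CacheManager, publish, update_resources, lookup) are assumptions introduced here.

```python
# Hypothetical sketch of the cache manager's bookkeeping: peers publish
# cached segments s(i, j) as byte ranges and report resource availability.

class CacheManager:
    def __init__(self):
        self.cache_index = {}   # client_id -> list of (start_byte, end_byte) segments
        self.resources = {}     # client_id -> {"bandwidth_bps": ...}

    def publish(self, client_id, segment):
        """A client reports a newly cached segment (start_byte, end_byte)."""
        self.cache_index.setdefault(client_id, []).append(segment)

    def update_resources(self, client_id, bandwidth_bps):
        """A client reports its monitored resource availability."""
        self.resources[client_id] = {"bandwidth_bps": bandwidth_bps}

    def lookup(self, start_byte, end_byte):
        """Return the clients whose cache covers [start_byte, end_byte]."""
        return [cid for cid, segs in self.cache_index.items()
                if any(i <= start_byte and end_byte <= j for i, j in segs)]
```

A lookup then answers the question a streaming client asks: which peers, if any, already hold the bytes it needs next.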
Streaming
Cache lookup:
The client sends a query message to the cache manager for information on streams cached at peer clients
Streaming and pre-fetching:
Based on the cache manager's response, the client sends streaming and pre-fetching requests to the peer clients or to the server
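The peer-first, server-fallback request logic can be sketched as below. This is an illustrative assumption about the dispatch policy (the function name and the dict-based cache index are not from the original); it simply maps each wanted byte range to a caching peer when one exists, and to the server otherwise.

```python
# Sketch of client-side request planning: for each requested byte range,
# prefer a peer that caches it; fall back to the server otherwise.

def plan_requests(cache_index, segments, server="S"):
    """cache_index maps peer id -> list of (start, end) cached byte ranges.

    Returns a list of ((start, end), source) pairs, where source is a
    caching peer id or the server.
    """
    plan = []
    for start, end in segments:
        peers = [p for p, segs in cache_index.items()
                 if any(i <= start and end <= j for i, j in segs)]
        plan.append(((start, end), peers[0] if peers else server))
    return plan
```

For example, with only the prefix cached at a peer, the prefix is fetched from that peer and the remainder streams from the server.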
Streaming
[Figure: timing diagram of the streaming and pre-fetching requested by C_i]
Streaming: Switching
Goal: minimize the switching delay jitter
The pre-fetching time t1* is determined by:
The network bandwidth B_ik and network delay D_ik between C_i and C_k
Case 1: B_ik is larger than the service rate of C_i
Case 2: B_ik is smaller than the service rate of C_i
Streaming: Switching
Pre-fetching time:
Case 1: [equation not preserved in transcript]
Case 2: [equation not preserved in transcript]
Estimated available bandwidth between C_i and C_k over the period [t1*, t2]
Estimated size of the stream at time t
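Since the case equations themselves did not survive in this transcript, the following is only a plausible reconstruction of the idea, not the paper's exact formulas: with a constant playback rate, in Case 1 the pre-fetch merely hides the network delay D_ik, while in Case 2 it must also pre-buffer the bandwidth deficit accumulated while the slower peer serves the stream. All parameter names and the deficit model are assumptions.

```python
def prefetch_start_time(t2, d_ik, b_hat, rate, duration):
    """Sketch: estimate the pre-fetching start time t1* for a switch to peer C_k at t2.

    t2       : switch time (s)
    d_ik     : network delay between C_i and C_k (s)
    b_hat    : estimated available bandwidth from C_k to C_i (bits/s)
    rate     : stream service (playback) rate (bits/s)
    duration : playback time to be served by C_k after the switch (s)
    """
    if b_hat >= rate:
        # Case 1: the peer can keep up with playback; only the delay is hidden.
        return t2 - d_ik
    # Case 2: pre-buffer the deficit (rate - b_hat) * duration, fetched at b_hat.
    deficit_bits = (rate - b_hat) * duration
    return t2 - d_ik - deficit_bits / b_hat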
Simulation
Simulator: ns-2
Video: Jurassic Park I (1.5 Mbps)
Topology: one server, two routers, and four clients
Background traffic on all links between nodes and routers: Pareto distribution
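The simulation itself used ns-2's Pareto traffic source; as a side note, heavy-tailed burst and idle durations like these can be sampled in a few lines. The shape value 1.5 and the mean-matching scale below are illustrative assumptions, not the paper's parameters.

```python
import random

def pareto_on_off_periods(n, mean, shape=1.5, seed=0):
    """Sample n Pareto-distributed burst (or idle) durations with the given mean.

    For shape a > 1, a Pareto variable with scale x_m has mean a * x_m / (a - 1),
    so choosing x_m = mean * (a - 1) / a reproduces the requested mean.
    """
    rng = random.Random(seed)
    x_m = mean * (shape - 1) / shape
    # random.paretovariate(a) samples with scale 1, so rescale by x_m.
    return [x_m * rng.paretovariate(shape) for _ in range(n)]
```

With shape values below 2 the variance is infinite, which is what makes such background traffic bursty.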
Simulation
[Figure: simulated network topology]
Simulation Results
[Figures: playback results without pre-fetching vs. with pre-fetching]
Comparison
Proposed P2P architecture:
The dedicated server is not eliminated
Assumes different segments are cached by different peers
Assumes a centralized cache manager
Single points of failure: the server and the cache manager
Server-less architecture:
The dedicated server is eliminated
Video blocks are distributed across all nodes
Requires an additional encoding step for fault tolerance
Conclusion
Proposed a peer-to-peer streaming and caching architecture
The cache manager maintains all cache and network-connection information
Achievements:
Reduces the initial delay
Minimizes the delay jitter during playback
End of Presentation Thank you!