Video Caching in Radio Access network: Impact on Delay and Capacity


1 Video Caching in Radio Access network: Impact on Delay and Capacity
IEEE Wireless Communications and Networking Conference: Mobile and Wireless Networks, 2012 Hasti Ahlehagh and Sujit Dey Department of Electrical and Computer Engineering, University of California, San Diego, La Jolla, CA, USA Speaker: Yi-Ting Chen

2 Outline
Introduction
Video Preference of Users in a Cell Site
Approach:
  Cell Site Aware Caching Algorithms
  Scheduling Approach for Delay and Capacity
Simulation Results
Conclusions

3 Introduction (1/2) When Internet video is accessed by a mobile device, the video has to be fetched from the servers of a content delivery network (CDN).

4 Introduction (1/2) CDNs help reduce Internet bandwidth consumption, but the video data must still travel through the Core Network (CN) and Radio Access Network (RAN).

5 Introduction (2/2) In this paper we introduce caching of videos at (e)NodeBs at the edge of the RAN. These RAN micro-caches are capable of storing thousands of videos.

6 Introduction (2/2) How can we achieve a high cache hit ratio for the RAN micro-caches?

7 Main Contributions
Proposing novel caching policies based on users' preferences.
Proposing a video scheduling approach that allocates the RAN backhaul resources to video requests, enhancing the overall capacity of the network while satisfying video Quality of Experience (QoE).

8 Popularity of Online Videos and Video Categories
1) Video popularity follows a Zipf distribution [12]: 10% of online videos account for nearly 80% of the views, while the remaining 90% of videos account for only 20% in total.
2) National video popularity does not reflect local video popularity [13].
3) Users may have strong preferences towards specific video categories [14].
[12] M. Cha et al., "Analyzing the Video Popularity Characteristics of Large-Scale User Generated Content Systems," IEEE/ACM Transactions on Networking, Vol. 17, No. 5, Oct.
[13] M. Zink et al., "Watch Global Cache Local: YouTube Network Traces at a Campus Network - Measurements and Implications," Proceedings of MMCN 2008, San Jose, CA, USA, Jan. 2008.
[14] ReelSEO. Available:

9 Video Preference of Users
To understand local video popularity in a cell site, we define the Active User Set (AUS). The probability that a requested video belongs to video category vc_j:
p(vc_j) = Σ_k p(vc_j | u_k) · p(u_k)
where p(vc_j | u_k) is the probability that user u_k requests videos of the specific video category vc_j, and p(u_k) is the probability that user u_k generates a video request.
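As an illustration, the category probability above can be computed over the AUS as follows (a minimal Python sketch with made-up user data; the function and variable names are ours, not from the paper):

```python
def category_probability(p_vc_given_u, p_u):
    """p(vc_j) = sum over active users u_k of p(vc_j | u_k) * p(u_k).

    p_vc_given_u: {user: {category: probability user requests that category}}
    p_u:          {user: probability that this user generates a request}
    """
    p_vc = {}
    for user, prefs in p_vc_given_u.items():
        for category, prob in prefs.items():
            p_vc[category] = p_vc.get(category, 0.0) + prob * p_u[user]
    return p_vc

# Hypothetical AUS of two users with different category preferences.
p_vc_given_u = {"u1": {"news": 0.7, "sports": 0.3},
                "u2": {"news": 0.2, "sports": 0.8}}
p_u = {"u1": 0.5, "u2": 0.5}
print(category_probability(p_vc_given_u, p_u))  # news ≈ 0.45, sports ≈ 0.55
```

The per-cell result changes whenever the AUS changes, which is what makes this distribution local rather than national.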

10 The Popularity of Video
The popularity of video v_i within video category vc_j:
p(v_i | vc_j) = p(vc_j, v_i) / p(vc_j)
where p(v_i) is the overall popularity of video v_i across all videos and video categories, and:
if v_i belongs to category vc_j, p(vc_j, v_i) = p(v_i); else p(vc_j, v_i) = 0.

11 The Probability that a Video Is Requested
The probability that video v_i is requested:
P_R(v_i) = Σ_j p(v_i | vc_j) · p(vc_j)
MLR (Most Likely Requested): a subset of videos with P_R values greater than a threshold.
LLR (Least Likely Requested): a subset of videos from the cache with the least P_R values.
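A sketch of how P_R and the MLR/LLR selections could be derived (hypothetical Python; the data and threshold are illustrative, not from the paper):

```python
def request_probability(p_v_given_vc, p_vc):
    """P_R(v_i) = sum_j p(v_i | vc_j) * p(vc_j).

    p_v_given_vc: {video: {category: p(v_i | vc_j)}}
    p_vc:         {category: p(vc_j)}
    """
    return {v: sum(p * p_vc[c] for c, p in cats.items())
            for v, cats in p_v_given_vc.items()}

def mlr_set(p_r, threshold):
    """Most Likely Requested: videos whose P_R exceeds a threshold."""
    return {v for v, p in p_r.items() if p > threshold}

def llr_video(p_r, cached):
    """Least Likely Requested candidate: the cached video with lowest P_R."""
    return min(cached, key=lambda v: p_r[v])

p_vc = {"news": 0.4, "sports": 0.6}
p_v_given_vc = {"v1": {"news": 0.6}, "v2": {"news": 0.1}, "v3": {"sports": 0.9}}
p_r = request_probability(p_v_given_vc, p_vc)  # v1 ≈ 0.24, v2 ≈ 0.04, v3 ≈ 0.54
```

MLR drives proactive prefetching (P-UPP), while LLR drives reactive eviction (R-UPP), as the following slides describe.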

12 Cell Site Aware Caching Algorithms
We outline four caching algorithms:
MPV and LRU: conventionally used by Internet CDNs.
P-UPP and R-UPP: proposed by us, based on the preferences of the active users in the cell.

13 MPV (Most Popular Videos)
A proactive caching policy. It does not update the cache based on user requests and implements no cache replacement policy. The only changes that require a cache update are changes in the video popularity distribution.

14 LRU (Least Recently Used)
A reactive caching policy. On a cache miss, LRU fetches the video from the Internet CDN and caches it. If the cache is full, LRU replaces the video in the cache that has been least recently used. Since there is no prefetching bandwidth, the backhaul bandwidth and delay depend on the cache hit ratio.
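A standard LRU cache can be sketched as follows (a generic illustration, not the paper's implementation; `fetch` stands in for retrieving the video from the Internet CDN):

```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()  # video_id -> payload, oldest entry first

    def request(self, video_id, fetch):
        """Return (video, hit). Backhaul is used only on a miss."""
        if video_id in self.cache:             # cache hit
            self.cache.move_to_end(video_id)   # mark as most recently used
            return self.cache[video_id], True
        video = fetch(video_id)                # cache miss: fetch from CDN
        if len(self.cache) >= self.capacity:
            self.cache.popitem(last=False)     # evict least recently used
        self.cache[video_id] = video
        return video, False
```

Each hit refreshes the video's recency, and a miss over a full cache evicts the oldest entry, which is why the backhaul load depends directly on the hit ratio.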

15 R-UPP A reactive caching policy.
On a cache miss, R-UPP fetches the video from the Internet CDN and caches it. If the cache is full, R-UPP replaces a video from the LLR set, which depends on the UPP (User Preference Profile) of the active users; among the LLR candidates, the LRU policy selects the one to be replaced.
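The combined eviction rule, least recently used among the LLR candidates, could look like this (our own sketch, assuming the cache is kept in recency order with the oldest entry first; `llr_size` is an illustrative parameter, not from the paper):

```python
from collections import OrderedDict

def r_upp_evict(cache, p_r, llr_size=2):
    """cache: OrderedDict of cached videos in recency order (oldest first).
    p_r: {video: P_R} computed from the active users' preference profiles.
    Among the llr_size cached videos with the lowest P_R, evict the one
    that was least recently used, and return its id."""
    llr = set(sorted(cache, key=lambda v: p_r[v])[:llr_size])
    for video in cache:          # oldest (least recently used) first
        if video in llr:
            del cache[video]
            return video

cache = OrderedDict.fromkeys(["v1", "v2", "v3", "v4"])  # v1 is oldest
p_r = {"v1": 0.5, "v2": 0.1, "v3": 0.2, "v4": 0.05}
# LLR = {v2, v4}; v2 was used less recently than v4, so v2 is evicted.
```

Restricting eviction to the LLR set keeps videos the current active users are likely to request, while the LRU tie-break resolves ties among equally unpopular candidates.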

16 R-UPP

17 P-UPP A proactive caching policy.
It preloads the cache with videos belonging to the Most Likely Requested (MLR) set. If the AUS changes frequently, this may lead to high computational complexity and high backhaul bandwidth. One solution: the cache is only updated if the expected cache-hit-ratio improvement due to the replacement exceeds a preset threshold.
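A minimal sketch of the MLR preload and the threshold-guarded update (function names and the threshold value are our own illustrations, not the paper's):

```python
def p_upp_preload(p_r, cache_capacity):
    """Preload the cache with the MLR videos: highest P_R first."""
    return sorted(p_r, key=p_r.get, reverse=True)[:cache_capacity]

def should_update_cache(current_hit_ratio, expected_hit_ratio, threshold=0.05):
    """Replace cache contents only when the expected cache-hit-ratio
    improvement exceeds a preset threshold, limiting the prefetching
    load caused by frequent AUS changes."""
    return expected_hit_ratio - current_hit_ratio > threshold

p_r = {"a": 0.5, "b": 0.3, "c": 0.1}
print(p_upp_preload(p_r, 2))            # the two most likely requested videos
print(should_update_cache(0.60, 0.62))  # small gain: skip the update
```

The threshold trades a slightly stale cache for far fewer proactive fetches over the backhaul.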

18 P-UPP

19 Scheduling Approach for Delay and Capacity
For the proactive policies (MPV and P-UPP), depending on the number of concurrent video requests, cache misses, proactively fetched videos, and the frequency of prefetching, the backhaul bandwidth may not be sufficient for all the videos that need to be brought through the backhaul. One approach is to satisfy all pending fetches, but this may cause some fetches to be significantly delayed.

20 Scheduling Approach for Delay and Capacity
Our proposed scheduling approach aims to:
Maximize the number of videos that can be served.
Ensure each served video meets certain QoE requirements, including initial delay.

21 Video QoE We consider video QoE as consisting of two aspects:
The initial delay, determined using the Leaky Bucket Parameters (LBPs).
The number of stalls during the video session.

22 Leaky Bucket Parameters
In most video coding standards, a compliant bit stream must be decodable by an HRD (Hypothetical Reference Decoder) [16]. The HRD generates LBPs consisting of N 3-tuples (R, B, F), corresponding to N sets of transmission-rate and buffer-size parameters for a given bit stream. R (bits/second) is the average transmission rate, B is the buffer size that the client has, and F is the amount of data that must initially fill the buffer to avoid any stall. [16] J. Ribas-Corbera et al., "A Generalized Hypothetical Reference Decoder for H.264/AVC," IEEE Transactions on Circuits and Systems for Video Technology, vol. 13, no. 7, July 2003.

23 Leaky Bucket Parameters
The higher the requested data rate R, the smaller the initial delay F/R. However, if all video clients greedily select the highest data rates, there may be more congestion in the RAN backhaul, leading to fewer requests that can be served.

24 Backhaul Scheduling Approach
Goal: to support as many concurrent videos as possible while keeping the initial delay of each below an acceptable threshold.
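One way to realize this goal is a greedy admission sketch: give each request the lowest LBP rate that still meets the delay bound, and admit requests until the backhaul is full (our own simplification for illustration, not the paper's exact algorithm):

```python
def schedule_requests(requests, backhaul_capacity, max_delay):
    """requests: list of (video_id, lbps) where lbps is a list of
    (R, B, F) leaky-bucket 3-tuples; the initial delay at rate R is F / R.
    Greedily admit each request at the lowest feasible rate."""
    served, used = [], 0.0
    for video_id, lbps in requests:
        rates = [R for R, B, F in lbps if F / R <= max_delay]
        if not rates:
            continue                  # no rate meets the delay bound
        rate = min(rates)             # lowest rate -> least backhaul use
        if used + rate <= backhaul_capacity:
            served.append((video_id, rate))
            used += rate
    return served

# Hypothetical requests: (id, [(R bits/s, B, F bits), ...])
requests = [("v1", [(1.0, 10, 20), (2.0, 10, 20)]),
            ("v2", [(3.0, 10, 30)])]
print(schedule_requests(requests, backhaul_capacity=4.0, max_delay=25))
```

Picking the lowest feasible rate for each video leaves more backhaul bandwidth for other requests, which is the trade-off the next slide's 25-second delay threshold captures.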

25 (Figure: initial delay F/R versus data rate.) Acceptable initial delay threshold: 25 sec.

26 Simulation Parameters

27 Simulation Result (1/6) The performance of the different cache policies.

28 Simulation Result (2/6) The mean RAN backhaul bandwidth required by the different policies.

29 Simulation Result (3/6) The blocking probability (the probability that a requested video could not be scheduled).

30 Simulation Result (4/6) The probability that the delay of a successfully scheduled video is below a certain value, when the cache size is 200 Gbits.

31 Simulation Result (5/6) Capacity vs. cache size.

32 Simulation Result (6/6) The capacity when changing the target delay.
Cache size = 100 Gbits

33 Conclusion We proposed new caching policies based on the video preferences of users in the cell, as well as a new scheduling technique that allocates RAN backhaul bandwidth in coordination with the requesting video clients. Our simulation results show that our approach can significantly increase the number of concurrent video requests served while meeting initial delay requirements.

34 Thank you for listening!

