ARC: A self-tuning, low overhead Replacement Cache

Presentation transcript:

ARC: A self-tuning, low overhead Replacement Cache Nimrod Megiddo and Dharmendra S. Modha Presented by Gim Tcaesvk

Cache Cache memory is fast but expensive, so its capacity is limited. Once the cache size is fixed, the replacement policy is the only algorithm of interest.

Cache management problem Maximize the hit ratio while minimizing the overhead, both in computation and in space.

Belady’s MIN Replaces the page whose next reference lies furthest in the future. Optimal for every request sequence, so it provides an upper bound on the achievable hit ratio. But who knows the future?
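MIN needs the entire future reference string, so it can only be simulated offline. A minimal Python sketch (function and variable names are mine, for illustration): precompute each reference's next use, then always evict the resident page whose next use lies furthest in the future.

def min_hit_count(trace, cache_size):
    """Simulate Belady's MIN offline on a complete reference trace."""
    # next_use[i] = index of the next reference to trace[i], or infinity if none.
    next_use = [float('inf')] * len(trace)
    last_seen = {}
    for i in range(len(trace) - 1, -1, -1):
        if trace[i] in last_seen:
            next_use[i] = last_seen[trace[i]]
        last_seen[trace[i]] = i

    resident = {}        # page -> index of its next reference
    hits = 0
    for i, page in enumerate(trace):
        if page in resident:
            hits += 1
        elif len(resident) >= cache_size:
            # Evict the resident page whose next use lies furthest in the future.
            victim = max(resident, key=resident.get)
            del resident[victim]
        resident[page] = next_use[i]
    return hits

For example, min_hit_count([1, 2, 3, 1, 2, 4, 1, 2], cache_size=2) returns the number of hits an optimal demand-paging policy could achieve on that trace.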

LRU (Least Recently Used) Replaces the least recently used page. Optimal policy under the SDD (Stack Depth Distribution) model. Captures recency but not frequency.
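A minimal LRU sketch (class and method names are mine), using an ordered dictionary as the recency list: the least recently used page sits at the front and is evicted on a miss when the cache is full.

from collections import OrderedDict

class LRUCache:
    """Least Recently Used: evict the page that was referenced longest ago."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()            # oldest entry first, newest last

    def request(self, page):
        """Reference a page; return True on a hit."""
        hit = page in self.pages
        if hit:
            self.pages.move_to_end(page)      # refresh recency
        else:
            if len(self.pages) >= self.capacity:
                self.pages.popitem(last=False)   # evict the LRU page
            self.pages[page] = None
        return hit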

LFU (Least Frequently Used) Replaces the least frequently used page. Optimal policy under the IRM (Independent Reference Model). Captures frequency but not recency.
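A minimal in-cache LFU sketch (names are mine; counts are kept only while a page is resident, a simplification of "perfect" LFU). Eviction removes the resident page with the smallest reference count, breaking ties in favor of the least recently referenced page. A priority queue would give the logarithmic cost mentioned on the next slide; the linear scan below keeps the sketch short.

import itertools

class LFUCache:
    """Least Frequently Used: evict the resident page with the fewest references."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.clock = itertools.count()        # tick counter used to break ties
        self.pages = {}                       # page -> (count, last_tick)

    def request(self, page):
        """Reference a page; return True on a hit."""
        hit = page in self.pages
        count = self.pages[page][0] + 1 if hit else 1
        if not hit and len(self.pages) >= self.capacity:
            # Smallest count wins; among equal counts, the oldest tick goes.
            victim = min(self.pages, key=self.pages.get)
            del self.pages[victim]
        self.pages[page] = (count, next(self.clock))
        return hit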

Drawbacks of LFU Logarithmic implementation complexity in the cache size. A page with a large historical reference count is almost never paged out, even when it is no longer useful.

LRU-2 Replaces the page whose second-most-recent reference is least recent. Optimal, under the IRM, among online policies that know at most the two most recent references to each page.

Limitations of LRU-2 Logarithmic complexity, due to the priority queue. A tunable parameter, the CIP (Correlated Information Period), that must be chosen per workload.

2Q (modified LRU-2) Uses simple LRU/FIFO lists instead of the priority queue. Two parameters, Kin and Kout: Kin plays the role of the Correlated Information Period and Kout the Retained Information Period.
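A sketch of the full 2Q algorithm as it is usually described (Johnson and Shasha); the list names A1in, A1out, and Am and the reclaim rule are reproduced from memory, so treat this as an approximation rather than a faithful reimplementation.

from collections import OrderedDict

class TwoQ:
    """2Q: FIFO probation queue (A1in), ghost queue (A1out), LRU main queue (Am)."""
    def __init__(self, capacity, k_in, k_out):
        self.capacity = capacity              # resident pages: |A1in| + |Am| <= capacity
        self.k_in, self.k_out = k_in, k_out
        self.a1in = OrderedDict()             # resident, FIFO, seen once recently
        self.a1out = OrderedDict()            # non-resident "ghost" identifiers
        self.am = OrderedDict()               # resident, LRU, re-referenced pages

    def _reclaim(self):
        if len(self.a1in) + len(self.am) < self.capacity:
            return                                        # a free frame exists
        if len(self.a1in) > self.k_in:
            victim, _ = self.a1in.popitem(last=False)     # FIFO tail of A1in
            self.a1out[victim] = None                     # remember it as a ghost
            if len(self.a1out) > self.k_out:
                self.a1out.popitem(last=False)
        else:
            self.am.popitem(last=False)                   # LRU page of Am, not remembered

    def request(self, page):
        """Reference a page; return True on a hit."""
        if page in self.am:                   # hot page: refresh its LRU position
            self.am.move_to_end(page)
            return True
        if page in self.a1in:                 # recently admitted: leave it in place
            return True
        if page in self.a1out:                # ghost hit: promote to Am
            self._reclaim()
            del self.a1out[page]
            self.am[page] = None
            return False
        self._reclaim()                       # cold miss: admit into A1in
        self.a1in[page] = None
        return False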

LIRS (Low Inter-reference Recency Set) Maintains two LRU stacks of different sizes: Llirs holds pages seen at least twice recently, and Lhirs holds pages seen only once recently. Works well under the IRM, but not under the SDD.

FBR (Frequency-Based Replacement) Divides the LRU list into three sections: new, middle, and old. Reference counts are not incremented for hits in the new section. Replaces the page in the old section with the smallest reference count.

LRFU (Least Recently/Frequently Used) Scores each page by an exponentially smoothed combination of recency and frequency. The decay parameter λ is hard to tune.
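The weighting usually quoted for LRFU (paraphrased from memory; the original paper's normalization may differ): each block b is scored by a Combined Recency and Frequency value

\mathrm{CRF}_t(b) \;=\; \sum_{i=1}^{k} F\bigl(t - t_{b,i}\bigr),
\qquad F(x) = \left(\tfrac{1}{2}\right)^{\lambda x}, \quad 0 \le \lambda \le 1,

where t_{b,1}, …, t_{b,k} are the times b was referenced, and the block with the smallest CRF is replaced. With λ = 0 every reference counts equally (LFU-like behavior), while a large λ makes only the latest reference matter (LRU-like behavior), which is why λ is hard to tune.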

MQ (Multi-Queue) Uses m LRU queues Q0, Q1, …, Qm-1, where Qi contains the pages that have been seen at least 2^i but fewer than 2^(i+1) times recently.

Introducing a Class of Replacement Policies

DBL(2c): DouBLe Maintains two variable-sized LRU lists, L1 and L2, that together hold at most 2c pages: L1 contains pages seen only once recently, L2 pages seen at least twice recently.

DBL(2c) flow (flowchart) Whatever the split between L1 and L2, the c most recently used pages are always contained in DBL(2c).
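A sketch of DBL(2c) as described in the ARC paper (class and variable names are mine): on a hit the page moves to the MRU end of L2; on a miss the LRU page of L1 is dropped if L1 already holds c pages, otherwise the LRU page of L2 is dropped once the directory reaches 2c pages, and the new page enters at the MRU end of L1.

from collections import OrderedDict

class DBL:
    """DBL(2c): two LRU lists over a directory of at most 2c pages."""
    def __init__(self, c):
        self.c = c
        self.l1 = OrderedDict()               # pages seen exactly once recently
        self.l2 = OrderedDict()               # pages seen at least twice recently

    def request(self, page):
        """Reference a page; return True if it was already in the directory."""
        if page in self.l1 or page in self.l2:           # hit: promote to MRU of L2
            self.l1.pop(page, None)
            self.l2.pop(page, None)
            self.l2[page] = None
            return True
        # Miss: make room so that |L1| <= c and |L1| + |L2| <= 2c.
        if len(self.l1) == self.c:
            self.l1.popitem(last=False)                  # LRU page of L1
        elif len(self.l1) + len(self.l2) == 2 * self.c:
            self.l2.popitem(last=False)                  # LRU page of L2
        self.l1[page] = None                             # insert at MRU of L1
        return False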

Π(c): A new class of policies Policies that cache at most c of the 2c pages tracked by DBL(2c).

FRCp (Fixed Replacement Cache) FRCp(c) is a member of this class with a fixed tuning parameter p, 0 ≤ p ≤ c, which sets the target sizes of the two resident lists: p pages for T1 and c − p pages for T2.

FRCp(c) flow (flowchart of the replacement decisions)

ARC (Adaptive Replacement Cache) US Patent 20040098541 Assignee Name: IBM Corp.

ARC (Adaptive RC) Adaptation parameter p ∈ [0, c]. For a given (fixed) value of p, ARC behaves exactly like FRCp. But it learns: ARC continuously adapts and tunes p in response to the workload.

Learning We should shift capacity toward whichever list is earning ghost hits: a hit in B1 suggests T1 is too small, so increase p; a hit in B2 suggests T2 is too small, so decrease p.
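One common statement of the adaptation rule (paraphrased; see the paper for the authoritative version):

\text{hit in } B_1:\quad p \leftarrow \min(p + \delta_1,\; c),
\qquad \delta_1 = \begin{cases} 1 & \text{if } |B_1| \ge |B_2|,\\ |B_2|/|B_1| & \text{otherwise;} \end{cases}

\text{hit in } B_2:\quad p \leftarrow \max(p - \delta_2,\; 0),
\qquad \delta_2 = \begin{cases} 1 & \text{if } |B_2| \ge |B_1|,\\ |B_1|/|B_2| & \text{otherwise.} \end{cases}

A ghost hit in the smaller of the two ghost lists takes a proportionally larger step, so the rarer kind of evidence moves p faster.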

ARC flow (flowchart of the replacement and adaptation decisions)
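A compact Python sketch of the flow above (class and method names are mine; it follows the published pseudocode as closely as I can recall, so treat it as an illustration rather than a reference implementation). Fixing p instead of adapting it turns the same structure into FRCp(c).

from collections import OrderedDict

class ARC:
    """Sketch of ARC for a cache of c pages (tracks keys only, no data)."""

    def __init__(self, c):
        self.c = c
        self.p = 0.0                 # adaptive target size for T1
        self.t1 = OrderedDict()      # resident pages seen once recently
        self.b1 = OrderedDict()      # ghosts of pages evicted from T1
        self.t2 = OrderedDict()      # resident pages seen at least twice recently
        self.b2 = OrderedDict()      # ghosts of pages evicted from T2

    def _replace(self, x):
        # Evict the LRU page of T1 or T2 into the corresponding ghost list.
        if self.t1 and (len(self.t1) > self.p or (x in self.b2 and len(self.t1) == self.p)):
            victim, _ = self.t1.popitem(last=False)
            self.b1[victim] = None
        else:
            victim, _ = self.t2.popitem(last=False)
            self.b2[victim] = None

    def request(self, x):
        """Reference page x; return True on a cache hit."""
        if x in self.t1 or x in self.t2:          # Case I: hit
            self.t1.pop(x, None)
            self.t2.pop(x, None)
            self.t2[x] = None                     # move to MRU of T2
            return True

        if x in self.b1:                          # Case II: ghost hit in B1
            delta = 1.0 if len(self.b1) >= len(self.b2) else len(self.b2) / len(self.b1)
            self.p = min(self.p + delta, self.c)  # favor recency: grow T1's target
            self._replace(x)
            del self.b1[x]
            self.t2[x] = None
            return False

        if x in self.b2:                          # Case III: ghost hit in B2
            delta = 1.0 if len(self.b2) >= len(self.b1) else len(self.b1) / len(self.b2)
            self.p = max(self.p - delta, 0.0)     # favor frequency: grow T2's target
            self._replace(x)
            del self.b2[x]
            self.t2[x] = None
            return False

        # Case IV: complete miss.
        l1 = len(self.t1) + len(self.b1)
        l2 = len(self.t2) + len(self.b2)
        if l1 == self.c:
            if len(self.t1) < self.c:
                self.b1.popitem(last=False)       # drop the LRU ghost of B1
                self._replace(x)
            else:                                 # B1 is empty: evict straight from T1
                self.t1.popitem(last=False)
        elif l1 + l2 >= self.c:
            if l1 + l2 == 2 * self.c:
                self.b2.popitem(last=False)       # drop the LRU ghost of B2
            self._replace(x)
        self.t1[x] = None                         # insert at MRU of T1
        return False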

Scan-Resistant A long sequence of one-time-only reads passes through L1 without flushing possibly important pages out of L2. Fewer hits are then encountered in B1 than in B2, so the learning rule grows T2 at the expense of T1, protecting the frequently used pages.
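A quick way to see the effect with the sketches above (this assumes the LRUCache and ARC classes defined earlier in this transcript; the workload shape is illustrative): interleave a small hot working set with a long one-time scan and compare hit ratios.

def hit_ratio(cache, trace):
    """Feed every reference in the trace to the cache and report the hit ratio."""
    hits = sum(cache.request(page) for page in trace)
    return hits / len(trace)

# A working set of 50 hot pages, interrupted by a scan of 10,000 pages
# that are never referenced again.
hot = [f"hot{i}" for i in range(50)]
scan = [f"cold{i}" for i in range(10_000)]
trace = hot * 20 + scan + hot * 20

print("LRU hit ratio:", hit_ratio(LRUCache(100), trace))
print("ARC hit ratio:", hit_ratio(ARC(100), trace))

With LRU the scan flushes the hot pages, which must all miss again afterwards; with ARC the hot pages sit in T2 and survive the scan, since the scan pages only cycle through T1.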

ARC example (worked trace) The slide steps through a short reference string, showing pages moving among T1, B1, T2, and B2, with the target p adapting from 2 to 3 as ghost hits occur.

Experiments The evaluation uses the various real-world traces described in the paper.

Experiment: Hit ratio of OLTP ARC outperforms the online algorithms and performs as well as the algorithms whose parameters are tuned offline.

Experiment: Hit ratio (cont.) MQ outperforms LRU, while ARC outperforms them all.

Experiment: Hit ratio (cont.) Comparison across various workloads shows that ARC consistently outperforms the other policies.

Experiment: Cache size of P1 ARC performs better than LRU at every cache size.

Experiment: parameter p “Dances to the tune being played by workload.”

Summary ARC Is online and self-tuning. Is robust across a wide range of workloads. Has low overhead, comparable to LRU. Is scan-resistant. Outperforms LRU.