1
Cache Memory By JIA HUANG
2
"Computer Science has only three ideas: cache, hash, trash.“ - Greg Ganger, CMU
3
The idea of caching Use fast storage to serve recently and frequently accessed data in very little time. Cache: fast but expensive; disks: cheap but slow.
4
Types of cache CPU cache Disk cache Proxy web cache Other caches
5
Usage of caching Caching is used widely in: storage systems, databases, web servers, middleware, processors, operating systems, RAID controllers, and many other applications.
6
Cache algorithms Famous algorithms: LRU (Least Recently Used), LFU (Least Frequently Used). Not-so-famous algorithms: LRU-K, 2Q, FIFO, and others.
7
LRU (Least Recently Used) LRU is typically implemented with a linked list ordered by recency of use. It discards the least recently used items first.
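The linked-list idea above can be sketched in a few lines of Python; `OrderedDict` plays the role of the recency-ordered list. The class and method names here are illustrative, not from the slides.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache sketch: evicts the least recently used key."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()  # ordered oldest -> newest

    def get(self, key):
        if key not in self.items:
            return None
        self.items.move_to_end(key)  # mark as most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)  # evict least recently used
```

A `get` on a key moves it to the "most recent" end, so an untouched key drifts toward the eviction end of the list.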
8
LFU (Least Frequently Used) Counts how often each item is accessed; those used least often are discarded first.
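A minimal sketch of the counting idea, again with illustrative names: each access bumps a counter, and on eviction the key with the smallest count goes first (ties broken by least recent use, one common convention).

```python
class LFUCache:
    """Minimal LFU cache sketch: evicts the least frequently used key."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.values = {}
        self.counts = {}      # key -> access count
        self.last_used = {}   # key -> logical timestamp, for tie-breaking
        self.clock = 0

    def _touch(self, key):
        self.counts[key] += 1
        self.clock += 1
        self.last_used[key] = self.clock

    def get(self, key):
        if key not in self.values:
            return None
        self._touch(key)
        return self.values[key]

    def put(self, key, value):
        if key not in self.values and len(self.values) >= self.capacity:
            # evict the least frequently used key (oldest on ties)
            victim = min(self.counts, key=lambda k: (self.counts[k], self.last_used[k]))
            del self.values[victim]
            del self.counts[victim]
            del self.last_used[victim]
        self.values[key] = value
        self.counts.setdefault(key, 0)
        self._touch(key)
```

Note the contrast with LRU: a key accessed many times long ago keeps a high count and survives eviction even if it is never touched again.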
9
LRU vs. LFU The fundamental locality principle says that if a process visits a location in memory, it will probably revisit that location and its neighborhood soon. The advanced locality principle says that the probability of revisiting increases with the number of previous visits.
10
Disadvantages LRU: performs poorly when a process scans a huge database, since the one-time scan evicts the working set. LFU: in the same scanning scenario, it can make performance even worse.
11
Can there be a better algorithm? Yes.
12
New algorithm ARC (Adaptive Replacement Cache) combines the virtues of LRU and LFU while avoiding the vices of both. The basic idea behind ARC is to adaptively, dynamically and relentlessly balance between "recency" and "frequency" to achieve a high hit ratio. Invented by IBM in 2003 (Almaden Research Center, San Jose).
13
How does it work?
14
ARC L1: pages seen exactly once recently ("recency"). L2: pages seen at least twice recently ("frequency"). If L1 contains exactly c pages, replace the LRU page in L1; otherwise, replace the LRU page in L2. Lemma: the c most recent pages are always in the union of L1 and L2.
15
ARC Divide L1 into T1 (top) and B1 (bottom); divide L2 into T2 (top) and B2 (bottom). T1 and T2 contain c pages that are in the cache and in the directory; B1 and B2 contain c pages that are in the directory only, not in the cache. If T1 contains more than p pages, replace the LRU page in T1; otherwise, replace the LRU page in T2.
16
Adapt the target size of T1 to the observed workload A self-tuning algorithm: hit in T1 or T2: do nothing; hit in B1 ("recency" side): increase the target size of T1; hit in B2 ("frequency" side): decrease the target size of T1.
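The two slides above can be combined into a runnable sketch. The following is a simplified Python rendering of the ARC policy as published by IBM: four LRU lists (T1, T2, B1, B2), a replace rule driven by the adaptive target p, and the B1/B2 ghost-hit rules that move p. It tracks only keys, uses an integer approximation of the adaptation step, and all class and method names are illustrative.

```python
from collections import OrderedDict

class ARC:
    """Simplified sketch of the ARC replacement policy (keys only)."""

    def __init__(self, c):
        self.c = c               # cache size
        self.p = 0               # adaptive target size for T1
        self.t1 = OrderedDict()  # in cache, seen once recently ("recency")
        self.t2 = OrderedDict()  # in cache, seen at least twice ("frequency")
        self.b1 = OrderedDict()  # ghost entries recently evicted from T1
        self.b2 = OrderedDict()  # ghost entries recently evicted from T2

    def _replace(self, x):
        # Evict from T1 if it exceeds the target p, otherwise from T2;
        # the evicted key becomes a ghost entry in B1 or B2.
        if self.t1 and (len(self.t1) > self.p or (x in self.b2 and len(self.t1) == self.p)):
            lru, _ = self.t1.popitem(last=False)
            self.b1[lru] = None
        else:
            lru, _ = self.t2.popitem(last=False)
            self.b2[lru] = None

    def request(self, x):
        """Access page x; returns True on a cache hit."""
        if x in self.t1:   # hit: promote to the frequency list
            del self.t1[x]
            self.t2[x] = None
            return True
        if x in self.t2:   # hit: refresh recency within T2
            self.t2.move_to_end(x)
            return True
        if x in self.b1:   # ghost hit: recency is winning, grow p
            self.p = min(self.c, self.p + max(len(self.b2) // len(self.b1), 1))
            self._replace(x)
            del self.b1[x]
            self.t2[x] = None
            return False
        if x in self.b2:   # ghost hit: frequency is winning, shrink p
            self.p = max(0, self.p - max(len(self.b1) // len(self.b2), 1))
            self._replace(x)
            del self.b2[x]
            self.t2[x] = None
            return False
        # complete miss: make room, then insert at the MRU end of T1
        total = len(self.t1) + len(self.t2) + len(self.b1) + len(self.b2)
        if len(self.t1) + len(self.b1) == self.c:
            if len(self.t1) < self.c:
                self.b1.popitem(last=False)
                self._replace(x)
            else:
                self.t1.popitem(last=False)
        elif total >= self.c:
            if total == 2 * self.c:
                self.b2.popitem(last=False)
            self._replace(x)
        self.t1[x] = None
        return False

    def cached(self):
        return set(self.t1) | set(self.t2)
```

A page evicted from the cache lingers as a ghost entry; a later request for it is still a miss, but it tells ARC which half of the cache (recency or frequency) deserved more space.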
17
ARC ARC has low space complexity. A realistic implementation had a total space overhead of less than 0.75%. ARC has low time complexity; virtually identical to LRU. ARC is self-tuning and adapts to different workloads and cache sizes. In particular, it gives very little cache space to sequential workloads, thus avoiding a key limitation of LRU. ARC outperforms LRU for a wide range of workloads.
18
Example For a huge, real-life workload generated by a large commercial search engine with a 4GB cache, ARC's hit ratio was dramatically better than that of LRU (40.44 percent vs. 27.62 percent). -IBM (Almaden Research Center)
19
ARC vs. LRU
21
ARC Currently, ARC is a research prototype and will be available to customers via many of IBM's existing and future products.
22
References
http://en.wikipedia.org/wiki/Caching#Other_caches
http://www.cs.biu.ac.il/~wiseman/2os/2os/os2.pdf
http://www.almaden.ibm.com/StorageSystems/autonomic_storage/ARC/index.shtml
http://www.almaden.ibm.com/cs/people/dmodha/arc-fast.pdf