Cache Memory
Ross Galijan
Library analogy
- Imagine a library whose shelves are lined with books.
- Problem: while one person can walk around the library and read a book or two at a time, referencing multiple books for research is difficult because the books are scattered all over the place.
Library analogy
- Solution: add tables at central library locations so people can take multiple books off the shelves and place them on the tables for easier access.
- Tables have limited space compared to shelves, but books can be laid out for easier reading.
- If a required book is not on the table, go find it on the shelves and bring it to the table.
- If the table is full, remove a (presumably not needed) book to make room for another.
Library analogy
- If a book is not in your hand, check the table.
- If a book is not on the table, check the shelves.
- If a book is not on the shelves, check whether it can be ordered from another library (and will take forever to arrive).
- If a book is not in the system, you're in big trouble, because your research cannot conclude.
Relevance of library analogy
- Book in hand   -> L1/L2 cache
- Book on table  -> RAM
- Book on shelf  -> HDD
- Book in system -> network/optical disc
Why not simply use one giant cache?
- Cache memory is far smaller and far more expensive per byte.
- 1 TB HDD: $50–$100
- 1 TB RAM: at least $10,000
- 1 TB L2 cache: (this space intentionally left blank)
- Caching balances both access time and storage cost.
Principle of Locality
- Data most recently accessed, or data near recently accessed data, is likely to be accessed again in the near future.
- The purpose of a cache is to store such data in an area more easily accessed than, say, the hard drive.
- In a library, one will take related books off a shelf and bring them to a table, but will not return them to the shelves until either research is complete or the table is out of space.
Two replacement methods
- Least recently added (FIFO): removes items in the order in which they entered the cache. Requires entry order to be stored. Easily implemented, but also easily ineffective.
- Least recently used (LRU): removes the item whose last use was longest ago. Requires last access time (or an access ordering) to be stored. Effective, but more difficult to implement.
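As an illustration, an LRU policy can be sketched in Python with an ordered dictionary; the class name and capacity below are illustrative, not from the slides:

```python
from collections import OrderedDict

class LRUCache:
    """Fixed-capacity cache that evicts the least recently used entry."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()  # ordered oldest-used -> newest-used

    def get(self, key):
        if key not in self.entries:
            return None  # cache miss
        self.entries.move_to_end(key)  # mark as most recently used
        return self.entries[key]

    def put(self, key, value):
        if key in self.entries:
            self.entries.move_to_end(key)
        self.entries[key] = value
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)  # evict least recently used
```

A FIFO ("least recently added") policy would simply omit the `move_to_end` call in `get`, evicting by insertion order instead of by last use, which is why it is easier to implement but easily ineffective.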
Cache mapping
- Direct mapping: each location in memory is mapped to a single location in cache. Easily determines whether data is in the cache, but potentially wastes a lot of space. Memory address X maps, in a cache of size Y, to cache location X mod Y.
- Associative mapping: any location in memory can be mapped to any location in cache. Fully utilizes cache space, but needs to search the entire cache to determine a miss, which wastes time.
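The X mod Y rule for direct mapping can be sketched as follows (storing the full address as a tag for simplicity; real hardware stores only the upper bits):

```python
class DirectMappedCache:
    """Direct-mapped cache: address X can only live in line X % num_lines."""

    def __init__(self, num_lines):
        self.num_lines = num_lines
        self.lines = [None] * num_lines  # each line holds (address, data) or None

    def lookup(self, address):
        line = self.lines[address % self.num_lines]
        if line is not None and line[0] == address:
            return line[1]  # hit: only one line ever needs checking
        return None  # miss

    def fill(self, address, data):
        # A conflicting address in the same line is simply overwritten,
        # even if the rest of the cache is empty: the wasted-space drawback.
        self.lines[address % self.num_lines] = (address, data)
```

The single modulo index is what makes hit detection fast; the conflict eviction in `fill` is what can waste space when many hot addresses share a line.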
Set associative mapping
- Combines direct mapping and associative mapping.
- The cache is divided into N blocks (sets) of equal size; data from memory address X is assigned a specific block, X mod N.
- Each block then follows associative mapping.
- Note that associative mapping is like a set associative map with one block, and direct mapping is like a set associative map with each block having size 1.
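A sketch combining both ideas: the modulo index picks a block, and lookup within that block is associative, here with LRU eviction per block (names and sizes are illustrative):

```python
from collections import OrderedDict

class SetAssociativeCache:
    """N blocks; address X goes to block X % N and is searched associatively."""

    def __init__(self, num_blocks, block_size):
        self.num_blocks = num_blocks
        self.block_size = block_size
        # each block is a small OrderedDict used as an LRU structure
        self.blocks = [OrderedDict() for _ in range(num_blocks)]

    def lookup(self, address):
        block = self.blocks[address % self.num_blocks]
        if address in block:            # search only this block, not the whole cache
            block.move_to_end(address)  # refresh LRU order within the block
            return block[address]
        return None                     # miss

    def fill(self, address, data):
        block = self.blocks[address % self.num_blocks]
        block[address] = data
        block.move_to_end(address)
        if len(block) > self.block_size:
            block.popitem(last=False)   # evict this block's LRU entry only
```

Consistent with the slide's note, `num_blocks=1` degenerates to fully associative mapping, and `block_size=1` degenerates to direct mapping.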
A few formulas
- Hit ratio: cache hits / (cache hits + cache misses), i.e. the fraction of all accesses that hit the cache.
- Average access time (single cache): hit ratio × cache access time + (1 − hit ratio) × memory access time
- Average access time (multiple caches): cache 1 hit ratio × cache 1 access time + … + cache X hit ratio × cache X access time + … + (1 − cache 1 hit ratio − … − cache X hit ratio − …) × memory access time
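The multi-level average access time formula can be written directly as a function; each hit ratio here is the fraction of all accesses satisfied at that level, so the ratios sum to at most 1 and the remainder falls through to memory (the times below are made-up example values):

```python
def average_access_time(levels, memory_time):
    """Expected access time given (hit_ratio, access_time) pairs per cache level.

    The remaining fraction of accesses (1 - sum of hit ratios) is
    served by main memory at memory_time.
    """
    total = 0.0
    miss_fraction = 1.0
    for hit_ratio, access_time in levels:
        total += hit_ratio * access_time
        miss_fraction -= hit_ratio
    return total + miss_fraction * memory_time
```

For example, a single cache with a 0.9 hit ratio, 1 ns access time, and 100 ns memory gives 0.9 × 1 + 0.1 × 100 = 10.9 ns.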