1 Cache Memory Yi-Ning Huang

2 Principle of Locality A phenomenon in which a recently used memory location is more likely to be used again soon.
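As a concrete illustration (a minimal Python sketch, not taken from the slides), summing an array reuses the same variable on every iteration and walks through adjacent memory locations:

```python
# Minimal illustration of locality while summing an array.
def sum_array(values):
    total = 0                # 'total' is reused every iteration -> temporal locality
    for i in range(len(values)):
        total += values[i]   # values[i], values[i+1], ... are adjacent -> spatial locality
    return total

print(sum_array(list(range(1000))))   # 499500
```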

3 What is cache memory? Based on the principle of locality, scientists designed cache memory to make memory access more efficient. The cache holds the information most frequently needed by the processor. Cache memory is a small but fast memory module inserted between the processor and the main memory.

4 Example of grocery shopping If you buy only the single item you need immediately at the grocery store, you may have to go back to the store again soon. That is a waste of time and a waste of gas (assuming you drive to the grocery store).

5 Example of grocery shopping What most people do instead: buy the items you need immediately plus additional items you will most likely need in the near future.

6 Example of grocery shopping The grocery store is similar to main memory, and your home is similar to the cache. To speed up access time, the cache stores the information the computer needs immediately and will most likely need in the near future.

7 What is cache memory? Based on the principle of locality, scientists designed cache memory to make memory access more efficient. The cache holds the information most frequently needed by the processor. Cache memory is a small but fast memory module inserted between the processor and the main memory.

8 Three general functions in a cache: the address translation function, the address mapping function, and the replacement algorithm.

9 Mapping Cache The mapping function determines where blocks are to be located in the cache. There are three types of mapping: fully associative mapping, direct mapping, and set associative mapping.

10 Mapping Cache: Direct mapping Each main-memory frame maps to exactly one cache block. If the cache has N blocks, then block address X of main memory maps to cache block Y = X mod N (cache block = block address mod number of blocks).
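A minimal Python sketch of this formula; the 32-block, 8-bytes-per-block cache is an assumed configuration matching the 256-byte cache in the diagram on the next slide:

```python
NUM_BLOCKS = 32          # cache blocks (assumed, matching the 256-byte cache example)
BYTES_PER_BLOCK = 8      # bytes per block/frame (assumed)

def direct_mapped_block(address):
    """Return the cache block a byte address maps to: (block address) mod (number of blocks)."""
    block_address = address // BYTES_PER_BLOCK   # which main-memory frame the byte is in
    return block_address % NUM_BLOCKS            # Y = X mod N

# Frames 0, 32, 64, ... all compete for the same cache block 0.
for frame in (0, 32, 64):
    print(frame, "->", direct_mapped_block(frame * BYTES_PER_BLOCK))
```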

11 Mapping Cache: Direct mapping [Diagram: a 64 KB main memory (8192 frames of 8 bytes) mapped onto a 256-byte cache (32 blocks of 8 bytes). The 16-bit address splits into an 8-bit tag, a 5-bit block field, and a 3-bit byte field. Note: the tag area is not shown.]

12 Mapping Cache: Direct mapping [Same direct-mapping diagram as slide 11.]

13 Mapping Cache: Direct mapping Use the least significant b bits of the block address to select the cache block in which a frame can reside; the tags then need to be only (p - n - b) bits long. A p-bit main-memory address (A) is divided into a Tag field (p - n - b bits), a Block field (b bits), and a Word field (n bits).
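A worked computation of these field widths as a Python sketch, assuming the 64 KB memory / 256-byte cache / 8-byte block configuration from the earlier diagram:

```python
import math

MEMORY_BYTES = 64 * 1024   # main memory size (64 KB example)
CACHE_BYTES  = 256         # cache size (256-byte example)
BLOCK_BYTES  = 8           # bytes per block/frame

p = int(math.log2(MEMORY_BYTES))                # total address bits        -> 16
n = int(math.log2(BLOCK_BYTES))                 # word (byte-in-block) bits ->  3
b = int(math.log2(CACHE_BYTES // BLOCK_BYTES))  # block-field bits          ->  5
tag = p - n - b                                 # tag bits                  ->  8
print(p, n, b, tag)                             # 16 3 5 8
```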

14 Mapping Cache: Fully associative mapping A main-memory frame can occupy any of the cache blocks. Disadvantage: all tags must be searched in order to determine a hit or miss, and if the number of tags is large, this search can be time consuming.

15 For example, consider a system with 64 KB of primary memory, a 1 KB cache, and 8 bytes per block (and per frame). [Diagram: a main memory of 8192 frames and a cache data area of 128 blocks; the 16-bit main-memory address (A) splits into a 13-bit tag and a 3-bit word field, and a comparator searches the tag area to signal hit or miss.]
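A sketch of that lookup in Python (illustrative names only): because the tag is the whole frame number, every stored tag has to be compared to decide hit or miss, which is the cost noted on the previous slide:

```python
BLOCK_BYTES = 8
NUM_BLOCKS = 128                      # 1 KB cache / 8-byte blocks
tags = [None] * NUM_BLOCKS            # tag stored with each cache block (None = empty)

def lookup(address):
    """Fully associative lookup: compare the address tag against every stored tag."""
    tag = address // BLOCK_BYTES      # 13-bit tag for a 16-bit address
    for block, stored in enumerate(tags):
        if stored == tag:
            return ("hit", block)
    return ("miss", None)

tags[5] = 100                         # pretend frame 100 is cached in block 5
print(lookup(100 * BLOCK_BYTES))      # ('hit', 5)
print(lookup(101 * BLOCK_BYTES))      # ('miss', None)
```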

16 Disadvantages of direct mapping and fully associative mapping Disadvantage of fully associative mapping: all tags must be searched in order to determine a hit or miss, and if the number of tags is large this search can be time consuming. Disadvantage of direct mapping: it avoids the tag search of fully associative mapping, but it can keep replacing a block even though other cache blocks are free.

17 Mapping Cache: Set associative mapping Set associative mapping is a compromise that combines the advantages of direct mapping and fully associative mapping while limiting their disadvantages. A block can be placed in a restricted set of places in the cache: the cache is divided into K sets of blocks, and each main-memory address maps to exactly one set.

18 Mapping Cache: Set associative mapping [Diagram: a 64 KB main memory (8192 frames) mapped onto a 4-way set-associative cache of 32 sets, with 4 slots per set. The 16-bit address splits into an 8-bit tag number, a 5-bit set number, and a 3-bit word field.]

19 Mapping Cache: Set associative mapping The set is usually chosen by bit selection: set number = (block address) mod (number of sets in the cache).

20 Mapping Cache: Set associative mapping [Same 4-way set-associative diagram as slide 18.]

21 Mapping Cache: Set associative mapping [Same 4-way set-associative diagram as slide 18.]

22 Mapping Cache: Set associative mapping A direct-mapped cache is simply one-way set associative, and a fully associative cache with m blocks could be called m-way set associative.
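A small Python sketch (illustrative, not the book's code) that treats associativity as a parameter: setting it to 1 gives direct mapping, and setting it to the total number of blocks gives fully associative mapping. It uses the bit-selection rule from slide 19; the 128-block, 8-byte-block cache is an assumed configuration:

```python
BLOCK_BYTES = 8
TOTAL_BLOCKS = 128

def placement(address, ways):
    """Return (set index, tag) for a TOTAL_BLOCKS-block cache organized as 'ways'-way set associative."""
    num_sets = TOTAL_BLOCKS // ways
    block_address = address // BLOCK_BYTES
    set_index = block_address % num_sets      # bit selection: (block address) mod (number of sets)
    tag = block_address // num_sets
    return set_index, tag

addr = 0x1234
print(placement(addr, 1))             # 1-way  = direct mapped: one candidate block
print(placement(addr, 4))             # 4-way  set associative: 4 candidate slots in the chosen set
print(placement(addr, TOTAL_BLOCKS))  # m-way with one set = fully associative
```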

23 Replacement algorithm When there is a cache miss, space must be made for the incoming block. The replacement algorithm determines which existing block is replaced by the new block. With direct mapping there is only one cache block a frame can occupy, so no replacement algorithm is needed in that case.

24 Replacement algorithm There are three common replacement algorithms: LRU (least recently used), FIFO (first in, first out), and random.

25 Replacement algorithm: LRU Replaces the least recently used block in the cache. To determine which block is the LRU block, a counter can be associated with each cache block. Advantage: the algorithm follows the locality principle, so blocks that are still in use tend not to be replaced. Disadvantage: the implementation is more complex.
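A minimal Python sketch of the counter idea for a single cache set (the class name and trace are made up for illustration): each block records when it was last used, and the block with the smallest counter is the victim.

```python
import itertools

class LRUSet:
    """One cache set using per-block counters: the block with the oldest counter is replaced."""
    def __init__(self, ways):
        self.blocks = [None] * ways            # stored tags (None = empty slot)
        self.last_used = [0] * ways            # per-block counter (larger = more recently used)
        self.clock = itertools.count(1)

    def access(self, tag):
        if tag in self.blocks:                 # hit: refresh this block's counter
            i = self.blocks.index(tag)
            self.last_used[i] = next(self.clock)
            return "hit"
        # miss: victim is the block with the smallest (least recent) counter
        victim = self.last_used.index(min(self.last_used))
        self.blocks[victim] = tag
        self.last_used[victim] = next(self.clock)
        return "miss"

s = LRUSet(ways=2)
print([s.access(t) for t in (1, 2, 1, 3, 2)])  # ['miss', 'miss', 'hit', 'miss', 'miss']
```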

26 Replacement algorithm: FIFO The block that entered the cache first is replaced first; in other words, the block that has been in the cache the longest is replaced. Advantage: easy to implement. Disadvantage: under some access patterns, useful blocks are replaced too soon.
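For contrast, a minimal FIFO sketch for one cache set (again an illustrative toy, not the book's implementation). In the sample trace it evicts a block that was reused just one access earlier, which is the disadvantage noted above:

```python
class FIFOSet:
    """One cache set with FIFO replacement: evict the block that has been resident the longest."""
    def __init__(self, ways):
        self.blocks = [None] * ways
        self.next_victim = 0                   # rotating pointer to the oldest-filled slot

    def access(self, tag):
        if tag in self.blocks:
            return "hit"                       # FIFO ignores hits: only residency time decides eviction
        self.blocks[self.next_victim] = tag    # evict the oldest block, even if it was just used
        self.next_victim = (self.next_victim + 1) % len(self.blocks)
        return "miss"

s = FIFOSet(ways=2)
# Tag 1 is evicted when 3 arrives, even though 1 was just reused, so the final access to 1 misses.
print([s.access(t) for t in (1, 2, 1, 3, 1)])  # ['miss', 'miss', 'hit', 'miss', 'miss']
```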

27 Reference Computer Organization, Design, and Architecture by Sajjan G. Shiva http://www.cs.sjsu.edu/~lee/cs147/cs147.htm http://www.cs.iastate.edu/~prabhu/Tutorial/CACHE/bl_place_applet.html http://www.articlesbase.com/hardware-articles/cache-memory-675304.html

