
Cache Memory Yi-Ning Huang

Principle of Locality The phenomenon that a recently used memory location is likely to be used again soon.
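A minimal illustration of the principle (not from the slides; the array and loop are assumed for the example). An ordinary summing loop exhibits both forms of locality that caches exploit:

```python
# Hypothetical example: an ordinary loop showing why caches work.
data = list(range(1000))

total = 0
for i in range(len(data)):   # 'total' and 'i' are reused every iteration: temporal locality
    total += data[i]         # data[0], data[1], ... sit at adjacent addresses: spatial locality

print(total)  # 499500
```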

What is cache memory? Based on the principle of locality, designers introduced cache memory to make the memory system more efficient. The cache holds the information most frequently needed by the processor. Cache memory is a small but fast memory module inserted between the processor and the main memory.

Example of grocery shopping If you buy only the single item you need immediately, you may have to return to the grocery store again soon. That wastes time and gas (assuming you drive to the store).

Example of grocery shopping What most people do instead: buy the items you need immediately plus additional items you will most likely need in the near future.

Example of grocery shopping The grocery store is like main memory; your home is like the cache. To speed up access time, the cache stores the information the computer needs immediately and will most likely need in the future.

Three general functions in a cache The address translation function, the address mapping function, and the replacement algorithm.

Mapping Cache Determines where blocks are to be located in the cache. Three types of mapping: fully associative mapping, direct mapping, and set-associative mapping.

Mapping Cache: Direct mapping Each primary-memory address maps to a unique cache block. If the cache has N blocks, then block address X of main memory maps to cache block Y = X mod N (cache block = block address mod number of blocks).
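The mod rule above can be sketched in a couple of lines (the 32-block cache size is taken from the slide's figure; the addresses tried are assumed):

```python
def direct_map_block(block_address, num_blocks):
    """Cache block = block address mod number of blocks (direct mapping)."""
    return block_address % num_blocks

# A cache with 32 blocks, as in the slide's 64 KB example:
print(direct_map_block(0, 32))    # 0
print(direct_map_block(33, 32))   # 1 -> frames 1, 33, 65, ... all compete for block 1
print(direct_map_block(64, 32))   # 0
```

Note the consequence used later in the slides: frames that share a block index evict each other even if the rest of the cache is empty.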

Mapping Cache: Direct mapping [Figure: direct mapping of a 64 KB main memory onto a 32-block cache; frames 0–31, 32–63, 64–…, map onto blocks 0–31 in turn. The address is divided into Tag (8 bits), Block (5 bits), and Byte (3 bits) fields. Note: the tag area is not shown.]

Mapping Cache: Direct mapping Use the least significant b bits of the block address to indicate the cache block in which a frame can reside. The tags then need only be (p - n - b) bits long. A p-bit main-memory address (A) is divided into a Tag field (p - n - b bits), a Block field (b bits), and a Word field (n bits).
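A sketch of this field split, assuming the widths from the 64 KB figure (p = 16, b = 5, n = 3, so an 8-bit tag); the sample address is made up for illustration:

```python
def split_address(addr, p, b, n):
    """Split a p-bit main-memory address into (tag, block, word) fields.
    word = low n bits, block = next b bits, tag = remaining p - n - b bits."""
    word  = addr & ((1 << n) - 1)
    block = (addr >> n) & ((1 << b) - 1)
    tag   = (addr >> (n + b)) & ((1 << (p - n - b)) - 1)
    return tag, block, word

# 16-bit address, 5-bit block field, 3-bit word field:
tag, block, word = split_address(0b1011011101100101, p=16, b=5, n=3)
print(tag, block, word)  # 183 12 5
```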

Mapping Cache: Fully associative mapping A main-memory frame can occupy any of the cache blocks. Disadvantage: all tags must be searched in order to determine a hit or miss, and if the number of tags is large this search can be time consuming.

For example, consider a system with 64 KB of primary memory and a 1 KB fully associative cache, with 8 bytes per block (frame). Main memory then has 8192 frames (0 … 8191) and the cache data area has 128 blocks (0 … 127). The 16-bit main-memory address (A) is divided into a 13-bit Tag and a 3-bit Word field, and a comparator checks the tag area to produce the Hit/Miss signal.
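The slide's numbers can be checked with a few lines of arithmetic (all parameters are the ones stated above):

```python
import math

MAIN_MEMORY = 64 * 1024   # 64 KB of primary memory
CACHE       = 1 * 1024    # 1 KB cache
BLOCK       = 8           # 8 bytes per block/frame

address_bits = int(math.log2(MAIN_MEMORY))   # 16-bit main-memory address
word_bits    = int(math.log2(BLOCK))         # 3-bit word field
tag_bits     = address_bits - word_bits      # 13-bit tag, as in the slide
frames       = MAIN_MEMORY // BLOCK          # 8192 frames in main memory
cache_blocks = CACHE // BLOCK                # 128 blocks in the cache data area

print(address_bits, word_bits, tag_bits, frames, cache_blocks)  # 16 3 13 8192 128
```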

Disadvantages of direct mapping and fully associative mapping Fully associative mapping: all tags must be searched in order to determine a hit or miss, and if the number of tags is large this search can be time consuming. Direct mapping: it avoids that search, but it will keep replacing blocks at one location even though other blocks are free.

Mapping Cache: Set associative mapping Set-associative mapping combines the advantages of direct mapping and fully associative mapping while limiting their disadvantages. A block can be placed in a restricted set of places in the cache: the cache blocks are divided into K sets.

Mapping Cache: Set associative mapping [Figure: a 64 KB main memory mapped onto a set-associative cache with sets 0 … 31, each set holding multiple slots; the address is divided into Tag, Set, and Word fields.]

Mapping Cache: Set associative mapping The set is usually chosen by bit selection: set = (block address) mod (number of sets in cache).
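The bit-selection rule written out (32 sets is taken from the figure's Set 0 … 31; the block addresses tried are assumed):

```python
def set_index(block_address, num_sets):
    """Set = (block address) mod (number of sets in cache)."""
    return block_address % num_sets

# A set-associative cache with 32 sets:
print(set_index(33, 32))   # 1 -> frame 33 may go in any slot of set 1
print(set_index(64, 32))   # 0
```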

Mapping Cache: Set associative mapping Direct mapping is simply one-way set associative, and a fully associative cache with m blocks could be called m-way set associative.
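A tiny check of the two boundary cases (the 32-block cache size and the sample addresses are assumed): with one block per set the set index reduces to the direct-mapped block index, and with a single set every frame competes for the same set, as in a fully associative cache.

```python
num_blocks = 32  # assumed cache size in blocks

for block_address in (0, 1, 33, 100):
    direct      = block_address % num_blocks   # direct mapping
    one_way_set = block_address % num_blocks   # 1-way set associative: 32 sets of 1 block
    single_set  = block_address % 1            # one set of 32 blocks (fully associative)
    assert direct == one_way_set               # identical placement
    assert single_set == 0                     # every frame competes for the same set
```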

Replacement algorithm When there is a cache miss, space is required for the new block. The replacement algorithm determines which resident block is replaced by the new block. With direct mapping there is only one cache block the frame can occupy, so no replacement algorithm is needed in that case.

Replacement algorithm There are three replacement algorithms: LRU (least recently used), FIFO (first in, first out), and random.

Replacement algorithm: LRU Replace the least recently used block in the cache. To determine which block is the LRU block, a counter can be associated with each cache block. Advantage: the algorithm follows the locality principle, so it limits how often a soon-needed block is replaced. Disadvantage: the implementation is more complex.
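A minimal sketch of LRU replacement (assumed structure, not from the slides; an ordered dictionary of tags stands in for the per-block counters described above):

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU sketch: least-recently-used tag is evicted on a miss."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.blocks = OrderedDict()   # least recently used first, most recent last

    def access(self, tag):
        if tag in self.blocks:                 # hit: mark as most recently used
            self.blocks.move_to_end(tag)
            return "hit"
        if len(self.blocks) >= self.capacity:  # miss on a full cache: evict LRU block
            self.blocks.popitem(last=False)
        self.blocks[tag] = True
        return "miss"

cache = LRUCache(2)
print([cache.access(t) for t in ["A", "B", "A", "C", "B"]])
# ['miss', 'miss', 'hit', 'miss', 'miss'] -- C evicts B, the least recently used block
```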

Replacement algorithm: FIFO The first block brought into the cache is replaced first; in other words, the block that has been in the cache longest is replaced. Advantage: easy to implement. Disadvantage: under some access patterns, blocks are replaced too frequently.
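A matching FIFO sketch (assumed structure, not from the slides), which differs from LRU in that a hit does not change the eviction order:

```python
from collections import deque

class FIFOCache:
    """Minimal FIFO sketch: the block resident longest is evicted on a miss."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.order = deque()   # insertion order: oldest block on the left
        self.blocks = set()

    def access(self, tag):
        if tag in self.blocks:
            return "hit"       # a hit does NOT update the order (unlike LRU)
        if len(self.blocks) >= self.capacity:
            oldest = self.order.popleft()   # evict the block in the cache longest
            self.blocks.remove(oldest)
        self.order.append(tag)
        self.blocks.add(tag)
        return "miss"

cache = FIFOCache(2)
print([cache.access(t) for t in ["A", "B", "A", "C", "A"]])
# ['miss', 'miss', 'hit', 'miss', 'miss'] -- C evicts A (oldest), even though A was just hit
```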

Reference Computer Organization, Design, and Architecture by Sajjan G. Shiva HE/bl_place_applet.html articles/cache-memory html