Compressed Memory Hierarchy
Dongrui She, Jianhua Hui
Based on the paper "A Compressed Memory Hierarchy Using an Indirect Index Cache" by Erik G. Hallnor and Steven K. Reinhardt, Advanced Computer Architecture Laboratory, EECS Department, University of Michigan.
Outline
Introduction
Memory eXpansion Technology (MXT)
Cache compression
IIC & IIC-C
Evaluation
Summary
Introduction
Two pressures motivate compression: memory capacity and memory bandwidth. The amount of cache cannot be increased without bound, and memory bandwidth is a scarce resource.
Applications of data compression
First, add a compressed main memory system (Memory eXpansion Technology, MXT). Second, store compressed data in the cache itself, so that data can also travel in compressed form between main memory and the cache.
A key challenge
Managing variable-sized data blocks: a 128-byte block that compresses to 70 bytes leaves 58 bytes unused if storage is still allocated in fixed 128-byte units.
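The fragmentation problem can be sketched numerically. This is an illustrative calculation, not the paper's mechanism; the 70-byte compressed size follows from the slide's 58 unused bytes, and the 32-byte sub-block size is a hypothetical alternative allocation unit.

```python
BLOCK_SIZE = 128  # uncompressed cache block size, in bytes

def wasted_bytes(compressed_size: int, alloc_unit: int) -> int:
    """Bytes left unused when a compressed block is stored in
    fixed-size allocation units."""
    units = -(-compressed_size // alloc_unit)  # ceiling division
    return units * alloc_unit - compressed_size

# A 70-byte compressed block stored in a full 128-byte frame
# wastes 58 bytes; with 32-byte sub-blocks, only 26 are wasted.
print(wasted_bytes(70, BLOCK_SIZE))  # 58
print(wasted_bytes(70, 32))          # 26
```

Smaller allocation units reduce internal fragmentation, which is exactly the motivation for the sub-block designs discussed later.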
Memory eXpansion Technology
A server-class system with hardware-compressed main memory. It uses the LZSS compression algorithm and achieves roughly two-to-one (2:1) compression for most applications. Hardware compression of memory incurs a negligible performance penalty.
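MXT's hardware LZSS compressor is not available as a library, so as a rough stand-in the sketch below uses Python's zlib (also from the LZ77 family) to show how repetitive data can approach the reported 2:1 ratio. The 1 KB line size matches the four 256 B sectors on the next slide; the sample payload and the exact ratio are illustrative only.

```python
import zlib

LINE_SIZE = 1024  # MXT compresses memory in 1 KB lines

# Illustrative, repetitive payload; real compressibility varies by data.
line = (b"struct node { int key; struct node *next; };\n" * 32)[:LINE_SIZE]

compressed = zlib.compress(line)
ratio = LINE_SIZE / len(compressed)
print(f"{LINE_SIZE} B -> {len(compressed)} B ({ratio:.1f}:1)")
```

Highly regular data (pointers, zeroed pages, struct arrays) compresses far better than 2:1, while already-compressed data may not shrink at all; the 2:1 figure is an application-level average.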
Hardware organization
A sector translation table maps each real-address line to physical storage: each entry holds four physical addresses, each pointing to a 256 B sector.
Cache compression
Most prior designs target power savings and keep conventional cache structures: storage freed by compression helps only by not consuming power. To actually use the space freed by compression, a new cache structure is needed.
Conventional Cache Structure
Each tag is statically associated with one data block, so when data is compressed, the freed space cannot be used to hold anything else.
Solution: Indirect Index Cache (IIC)
A tag entry is not tied to a particular data block; instead, each tag entry contains a pointer to its data block.
IIC structure
Because tags are located by hashing rather than by a set index, the cache can be fully associative.
Extending the IIC to compressed data (IIC-C)
Each tag entry contains multiple pointers to smaller data sub-blocks, so a compressed block occupies only as many sub-blocks as it needs.
Software-managed replacement (generational replacement)
Blocks are grouped into prioritized pools based on reference frequency. The victim is chosen from the lowest-priority non-empty pool.
Additional Cost
A compression/decompression engine, more space for the tag entries, and extra resources for the replacement algorithm; total area is roughly 13% larger.
Evaluation Method
Benchmarks: SPEC CPU2000.
Main memory: 150-cycle latency, bus width 32, with MXT.
L1: 1-cycle latency, split 16 KB, 4-way, 64 B block size.
L2: 12-cycle latency, unified 256 KB, 8-way, 128 B block size.
L3: 26-cycle latency, unified 1 MB, 8-way, 128 B block size, with IIC-C.
Evaluation results
Over 50% gain with only 10% area overhead.
Summary
Advantages: increased effective capacity and bandwidth; power savings from fewer memory accesses.
Drawbacks: increased hardware complexity; power consumption of the additional hardware.
Future work
Study overall power consumption. Apply the approach in embedded systems.
END
Thank you! Question time.