Increasing Effective Cache Capacity Through the Use of Critical Words
Mentor: Amirhossein Mirhosseininiri
Mentees: Ben Schreiber, Pooja Welling

Motivation
Caches are an integral part of modern computing. They act as a buffer between the CPU and main memory, storing recently used data for quick access. Caching lowers the number of main memory accesses, and preventing these costly accesses saves time and energy. This project seeks to increase the effective capacity of the cache. The goal of this research is to develop a scheme in which only the first needed part of a memory block is stored, with the rest of the block being fetched while that first part is processed.

Background
- Main memory is made from Dynamic RAM (DRAM), which is slow
- Caching is possible because of the principle of locality
  - Spatial locality: data close to an accessed location is likely to be accessed
  - Temporal locality: data is likely to be accessed multiple times
- Caching data according to this principle helps avoid accesses to main memory
- Caches are high-speed storage made from fast Static RAM (SRAM)
- Data is pulled from memory in blocks of words, and blocks are grouped into sets in the cache

Critical Words Cache
- Sometimes the same word in a block is always accessed first
  - The same index in an array, the same method in a class, etc.
- Only store the 2 words needed first, the critical words
- By storing only 2 words of 8, the cache effectively grows by a factor of 4
- When requested, the critical words are sent to the processor
- While the critical words are processed, the rest of the block is fetched
- Ideally, the rest of the block arrives before it is needed
- A penalty is incurred if the critical words are mispredicted, with the same effect as a normal cache miss
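To make this behavior concrete, the following is a minimal Python sketch of a critical-words-only cache. It assumes 8-word blocks, 2 critical words per entry, and a simple heuristic that records the first offset touched in a block (plus its neighbor) as that block's critical words; the class name, eviction policy, and prediction heuristic are illustrative assumptions, not the design evaluated on this poster.

```python
# Minimal sketch, assuming 8-word blocks and 2 critical words per entry.
BLOCK_WORDS = 8      # words per memory block (assumed)
CRIT_WORDS = 2       # words stored per cache entry (assumed)


class CriticalWordsCache:
    """Keeps only the predicted critical words of each block, so the same
    SRAM budget holds roughly 4x as many blocks (2 words of 8)."""

    def __init__(self, num_entries):
        self.capacity = num_entries
        self.entries = {}          # block address -> {word offset: word}

    def access(self, block_addr, offset, memory):
        """Return (word, critical_hit).

        A hit only counts when the requested offset is one of the stored
        critical words; the processor can then start on that word while the
        rest of the block is fetched. Anything else behaves like a normal miss.
        """
        entry = self.entries.get(block_addr)
        if entry is not None and offset in entry:
            return entry[offset], True

        # Miss or mispredicted critical word: pay the full memory latency,
        # then remember the first-touched offset (and its neighbor) as the
        # new critical words for this block.
        block = memory[block_addr]
        if block_addr not in self.entries and len(self.entries) >= self.capacity:
            self.entries.pop(next(iter(self.entries)))   # naive oldest-first eviction
        self.entries[block_addr] = {
            off: block[off]
            for off in range(offset, min(offset + CRIT_WORDS, BLOCK_WORDS))
        }
        return block[offset], False


# Toy usage: memory holds two 8-word blocks addressed by their base address.
memory = {0x00: list(range(8)), 0x40: list(range(8, 16))}
cache = CriticalWordsCache(num_entries=4)
print(cache.access(0x00, 3, memory))   # (3, False)  first touch: miss
print(cache.access(0x00, 3, memory))   # (3, True)   critical-word hit
print(cache.access(0x00, 6, memory))   # (6, False)  mispredicted offset
```

In a real design the prediction state and eviction policy would live in hardware next to the tag array; the dictionary here only stands in for that state so the hit/miss behavior is easy to follow.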
Dynamic Critical Words Cache
- Critical words are not always predictable
- Split the L2 into two parts: a critical-words cache for predictable data and a traditional, whole-block cache for unpredictable data
- Assume new data is predictable (predictable bit is 0)
- If the prediction is successful, keep the predictable bit at "0"
- If not, set the predictable bit to "1" and move the block to the whole-block side
- Store the predictable bit with the data in all higher levels of the memory hierarchy
- A minimal code sketch of this promotion/demotion logic appears at the bottom of this page

Results
- The hit rate of the dynamic critical words cache is significantly higher than those of the L2 cache and the critical words cache
- Specifically, the dynamic critical words cache showed a 50.1% improvement in hit rate over the L2 cache, while the critical words cache showed only a 28.4% improvement
- This indicates that the dynamic critical words method of moving blocks into the cache is an effective way to improve cache performance
- In addition, the average memory access time of the dynamic critical words cache is lower than those of the standard and critical words caches
- Here, the dynamic critical words cache showed a 22.7% improvement in average memory access time over the L2 cache, while the critical words cache showed only a 14.3% improvement
- Overall, the dynamic critical words cache is the most effective of these approaches at approaching the ideal L2 cache results

Further Plans
The results indicate that this is an effective approach. The simulation will be extended to include dynamic resizing of the critical-words portion relative to the whole-block portion, which will allow greater flexibility for workloads that are overwhelmingly predictable or unpredictable. Other factors must also be measured, most importantly energy consumption: in the critical words scheme, main memory is accessed on every read to fetch the rest of the block, and the resulting increase in power consumption must be evaluated. To collect this data, a model of the cache will be simulated in Verilog.

Acknowledgements
Thank you to our mentor, Amirhossein Mirhosseininiri, and the PURE Committee for providing us with this opportunity.

References
Huang, Cheng-Chieh, and Vijay Nagarajan. "Increasing cache capacity via critical-words-only cache." In Proceedings of the IEEE International Conference on Computer Design (ICCD), 2014.
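The following is a minimal Python sketch of the split L2 described in the Dynamic Critical Words Cache section above: a block starts on the critical-word side with its predictable bit at 0, and a misprediction flips the bit and demotes the block to the whole-block side. The data structures, method names, and two-word entry size are assumptions made for illustration; this is not the simulator used to produce the results.

```python
CRIT_WORDS = 2   # words kept per entry on the critical-word side (assumed)


class DynamicCriticalWordsL2:
    """Split L2: blocks whose predictable bit is 0 live on the critical-word
    side; blocks that mispredict are tagged unpredictable (bit = 1) and are
    moved to the traditional whole-block side.
    (Set indexing and evictions are omitted to keep the sketch short.)"""

    def __init__(self):
        self.critical_side = {}    # block addr -> {word offset: word}
        self.whole_side = {}       # block addr -> full list of words
        self.predictable = {}      # block addr -> 0 (predictable) or 1;
                                   # in hardware this bit would travel with
                                   # the block through the memory hierarchy

    def fill(self, block_addr, block, first_offset):
        """Bring a block into the L2. New blocks are assumed predictable."""
        bit = self.predictable.get(block_addr, 0)
        if bit == 0:
            self.critical_side[block_addr] = {
                off: block[off]
                for off in range(first_offset,
                                 min(first_offset + CRIT_WORDS, len(block)))
            }
        else:
            self.whole_side[block_addr] = list(block)
        self.predictable[block_addr] = bit

    def access(self, block_addr, offset, memory):
        """Return (word, hit). A misprediction on the critical-word side
        flips the predictable bit and moves the block to the whole-block side."""
        if block_addr in self.whole_side:
            return self.whole_side[block_addr][offset], True

        crit = self.critical_side.get(block_addr)
        if crit is not None:
            if offset in crit:
                return crit[offset], True        # successful prediction
            # Misprediction: same penalty as a miss, then demote the block.
            self.predictable[block_addr] = 1
            del self.critical_side[block_addr]
            self.whole_side[block_addr] = list(memory[block_addr])
            return self.whole_side[block_addr][offset], False

        # Not in the L2 at all: fetch and fill according to the predictable bit.
        block = memory[block_addr]
        self.fill(block_addr, block, offset)
        return block[offset], False


# Toy usage.
memory = {0x80: list(range(8))}
l2 = DynamicCriticalWordsL2()
print(l2.access(0x80, 2, memory))   # (2, False) cold miss, filled on the critical-word side
print(l2.access(0x80, 2, memory))   # (2, True)  hit on a stored critical word
print(l2.access(0x80, 7, memory))   # (7, False) misprediction: bit set to 1, block demoted
print(l2.access(0x80, 5, memory))   # (5, True)  now served from the whole-block side
```

Keeping the predictable bit with the block as it moves through the hierarchy, as the poster describes, lets later refills go straight to the correct side instead of re-learning the block's behavior.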