1
Dynamic Hot Data Stream Prefetching for General-Purpose Programs
Chilimbi and Hirzel
John-Paul Fryckman
CSE 231: Paper Presentation
23 May 2002
2
Why Prefetching?
Memory latencies keep increasing
Not enough single-thread ILP to hide memory latencies
Minimizing stall cycles due to misses increases IPC (and thus performance)
Fetch data before it is needed!
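At the code level, "fetch data before it is needed" can be expressed with an explicit prefetch hint. A minimal sketch using the GCC/Clang __builtin_prefetch intrinsic on a plain array walk; the 16-element lookahead distance is an illustrative choice, not something taken from the paper:

#include <cstddef>

// Sum an array while hinting the processor to start loading data that will be
// needed a few iterations from now, so the loads overlap with useful work.
long sum_with_prefetch(const long* data, std::size_t n) {
    const std::size_t lookahead = 16;   // illustrative prefetch distance
    long sum = 0;
    for (std::size_t i = 0; i < n; ++i) {
        if (i + lookahead < n) {
            __builtin_prefetch(&data[i + lookahead], /*rw=*/0, /*locality=*/3);
        }
        sum += data[i];
    }
    return sum;
}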
3
Target Hot Data
Highly repetitious memory-reference sequences
Hot sequences span multiple objects
Hot data accounts for ~90% of program references and ~80% of cache misses
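As a rough sketch of how "hot" can be quantified: streams are ranked by how many references they account for, with a heat metric along the lines of length times repetition count. The types below are assumptions made for illustration, not the paper's implementation:

#include <cstddef>
#include <cstdint>
#include <vector>

// A candidate stream: a repeating sequence of data references and how often
// it repeats in the profiled trace.
struct Stream {
    std::vector<std::uint64_t> refs;   // e.g., object addresses or ids
    std::size_t frequency = 0;
};

// Heat: how many references the stream accounts for in total.
// Longer and more frequent streams are "hotter" and more worth prefetching.
std::size_t heat(const Stream& s) {
    return s.refs.size() * s.frequency;
}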
4
Why Dynamic?
Dynamic prefetching translates into a general-purpose solution
Many unknowns at compile time:
Pointer-chasing code
Irregular strides
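An illustrative (not from the paper) example of the pointer-chasing case: in a linked-list walk the next address is only known after the current node has been loaded, so a static compiler cannot compute prefetch addresses the way it can for a fixed-stride array loop.

struct Node {
    int   value;
    Node* next;
};

// Each iteration's load address depends on the result of the previous load,
// so there is no compile-time stride to prefetch against.
int sum_list(const Node* head) {
    int sum = 0;
    for (const Node* p = head; p != nullptr; p = p->next) {
        sum += p->value;
    }
    return sum;
}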
5
Dynamic Hot Data Stream Prefetching
Profile memory references
Detect hot data streams
Create and insert prefetch triggers for these streams
And repeat!
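A schematic of that profile/detect/inject cycle, with hypothetical placeholder functions standing in for the paper's actual machinery (whether triggers are explicitly removed before re-profiling is an assumption here):

#include <vector>

struct Trace {};       // sampled sequence of data references
struct HotStream {};   // frequently repeating reference sequence

// Placeholder stubs; in the real system these are the profiler, the
// grammar-based analysis, and the code-injection machinery.
Trace profileMemoryReferences() { return {}; }
std::vector<HotStream> detectHotStreams(const Trace&) { return {}; }
void insertPrefetchTriggers(const std::vector<HotStream>&) {}
void removePrefetchTriggers() {}

// The repeating cycle: profile, detect, insert triggers, run, and repeat.
void dynamicPrefetchingLoop() {
    for (;;) {
        Trace trace = profileMemoryReferences();
        auto hot = detectHotStreams(trace);
        insertPrefetchTriggers(hot);
        // ... program runs with prefetching until the next profiling phase ...
        removePrefetchTriggers();
    }
}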
6
Profiling and Detection
Need to minimize profiling overhead, so use sampling
Switch into instrumented code
Collect traces
Find hot data streams
Generate context-free grammars for the hot sequences
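The paper builds a context-free grammar (via Sequitur) over the sampled trace to expose repetition; the sketch below is a much simpler stand-in that just counts repeated fixed-length windows, with made-up window-length and repetition-threshold parameters, to show the shape of the detection step:

#include <cstddef>
#include <cstdint>
#include <map>
#include <vector>

using Ref    = std::uint64_t;          // a sampled data reference (address or object id)
using Window = std::vector<Ref>;

// Count every fixed-length subsequence of the sampled trace and keep the ones
// that repeat often enough to be worth prefetching.
std::vector<Window> findHotSequences(const std::vector<Ref>& sampledTrace,
                                     std::size_t windowLen  = 4,
                                     std::size_t minRepeats = 10) {
    std::map<Window, std::size_t> counts;
    for (std::size_t i = 0; i + windowLen <= sampledTrace.size(); ++i) {
        Window w(sampledTrace.begin() + i, sampledTrace.begin() + i + windowLen);
        ++counts[w];
    }
    std::vector<Window> hot;
    for (const auto& [w, c] : counts) {
        if (c >= minRepeats) hot.push_back(w);
    }
    return hot;
}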
7
DFSM Prefetching Engine
Merge the grammars (CFGs) together into one massive DFSM
The DFSM detects prefixes of hot sequences, then issues prefetches for the rest of the data
Insert prefetching code
Presumably, states are removed when they are no longer hot
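A toy sketch of the idea behind the engine: hot-stream prefixes are threaded into one deterministic state machine, and when the observed reference stream matches a prefix, the stream's remaining addresses are prefetched. This is a simplification under assumptions: a real merged DFSM needs failure transitions (Aho-Corasick style), whereas this sketch just falls back to the start state on a mismatch, and the prefix length per stream is left as an arbitrary parameter.

#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <map>
#include <vector>

using Ref = std::uint64_t;   // a data reference (object address)

// Trie-shaped state machine over hot-stream prefixes. Reaching the end of a
// prefix triggers prefetches for that stream's remaining (suffix) addresses.
struct PrefetchDFSM {
    struct State {
        std::map<Ref, int> next;       // transitions on observed references
        std::vector<Ref>   prefetch;   // suffix to prefetch when this state is reached
    };
    std::vector<State> states;
    int current = 0;

    PrefetchDFSM() : states(1) {}      // state 0 is the start state

    // Thread one hot stream's prefix through the trie and record its suffix.
    void addStream(const std::vector<Ref>& stream, std::size_t prefixLen) {
        std::size_t p = std::min(prefixLen, stream.size());
        int s = 0;
        for (std::size_t i = 0; i < p; ++i) {
            auto it = states[s].next.find(stream[i]);
            if (it == states[s].next.end()) {
                states.push_back(State{});
                it = states[s].next.emplace(stream[i], (int)states.size() - 1).first;
            }
            s = it->second;
        }
        states[s].prefetch.assign(stream.begin() + p, stream.end());
    }

    // Feed each observed reference to the machine; on a completed prefix,
    // issue prefetches for the rest of the stream.
    void observe(Ref r) {
        auto it = states[current].next.find(r);
        current = (it != states[current].next.end()) ? it->second : 0;
        for (Ref addr : states[current].prefetch) {
            __builtin_prefetch(reinterpret_cast<const void*>(addr));
        }
    }
};

The prefix length is the main tuning knob in this sketch: shorter prefixes give earlier but less certain prefetches, longer prefixes give more accurate but later ones.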
8
Good and the Not So Good
Good:
Even with the overhead, 5-19% speedups
Questionable:
How does it impact easy-to-predict code?
Worst-case state count for the DFSM: O(2^n). They did not study this; is that blow-up possible in practice?
Do they always prefetch in time?
What about phase changes and cache pollution?