Hopscotch Hashing. Nir Shavit, Tel Aviv U. & Sun Labs. Joint work with Maurice Herlihy and Moran Tzafrir.

Our Results  A new family of highly effective hash-map algorithms for multicore machines  Great performance also on uniprocessors

Hash Tables  Add(), Remove(), and Contains() with expected O(1) complexity  Extensible/Resizable  Typical hash table usage pattern: high fraction of Contains(), small fraction of Add() and Remove().  Assume universal hash function h(k) for key k.

Concurrent Hash Tables  Linearizable implementation of a Set  In theory, two adversaries: the scheduler and the distributor of keys  Let's look at the state of the art  Ignore resizing for now…

Chained Hashing [Luhn] Add(x)  if !Contains(x), then Malloc() a node and link it into bucket h(x)

Chained Hashing  N buckets, M items, uniform h  O(1) items expected per bucket (~2.1 items expected) [Knuth]  Add(), Remove(), Contains() in expected O(1)

Chained Hashing  Advantages:  retains good performance as table density (M/N) increases  less resizing  Disadvantages:  dynamic memory allocation  an extra full-size word per pointer  bad cache behavior (no locality)
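The chained Add()/Contains() pattern above can be sketched minimally in C; the bucket count, the multiplicative hash constant, and all names here are illustrative, not from the original slides:

```c
#include <stdlib.h>

/* Minimal chained hash set for int keys (illustrative sketch). */
#define NBUCKETS 64

typedef struct node { int key; struct node *next; } node;
typedef struct { node *bucket[NBUCKETS]; } chained_set;

static unsigned h(int k) { return ((unsigned)k * 2654435761u) % NBUCKETS; }

static int chained_contains(chained_set *s, int x) {
    for (node *n = s->bucket[h(x)]; n; n = n->next)   /* pointer chase */
        if (n->key == x) return 1;
    return 0;
}

static void chained_add(chained_set *s, int x) {
    if (chained_contains(s, x)) return;   /* Add(x): if !Contains(x)...   */
    node *n = malloc(sizeof *n);          /* ...then Malloc()...          */
    n->key = x;
    n->next = s->bucket[h(x)];            /* ...and link in bucket h(x)   */
    s->bucket[h(x)] = n;
}
```

The malloc() per Add() and the pointer chase in Contains() are exactly the dynamic-allocation and locality costs listed as disadvantages.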

Subliminal Cut

Linear Probing [Amdahl] Contains(x) – search linearly from h(x) until the last location H noted in the bucket. (Figure: searching for x from h(x) = 7.)

Linear Probing Add(x) – add in the first empty bucket and update its H. (Figure: x with h(x) = 3 placed in the first empty bucket, H = 6.)

Linear Probing  Open addressing means M ≤ N  Expected items per bucket same as chaining  Expected distance till an open slot [Knuth]: ½(1 + (1/(1 − M/N))²)  M/N = 0.5 → search 2.5 buckets  M/N = 0.9 → search 50 buckets
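The slide's numbers follow directly from Knuth's formula; a tiny helper (the name is illustrative) evaluates ½(1 + (1/(1 − M/N))²):

```c
/* Knuth's expected probe distance for linear probing at load factor
 * a = M/N: 1/2 * (1 + (1/(1-a))^2). */
static double expected_probes(double load) {
    double r = 1.0 / (1.0 - load);
    return 0.5 * (1.0 + r * r);
}
```

At M/N = 0.9 the exact value is 50.5, which the slide rounds to 50.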

Linear Probing  Advantages:  Good locality  fewer cache misses  Disadvantages:  As M/N increases, more cache misses  searching 10s of unrelated buckets  "Clustering" of keys into neighboring buckets  As computation proceeds, "Contamination" by deleted items  more cache misses

Cuckoo Hashing [Pagh&Rodler] Add(x) – if h1(x) and h2(x) are full, evict y and move it to h2(y) ≠ h2(x), then place x in its place. (Figure: x displaces y; y hops to h2(y).)

Cuckoo Hashing  Advantages:  Contains(): deterministic, 2 buckets  No clustering or contamination  Disadvantages:  2 tables  The hash functions h1(x), h2(x) are complex  As M/N increases → relocation cycles  Above M/N = 0.5, Add() does not work!
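A minimal cuckoo-insertion sketch with one item per bucket. The two hash functions, table size, and eviction limit are illustrative placeholders; a real implementation needs stronger hashes and a Resize() path when the kick loop gives up:

```c
/* Illustrative two-table cuckoo hashing sketch. */
#define CN 16          /* slots per table                      */
#define EMPTY (-1)     /* marks a free slot                    */
#define MAX_KICKS 32   /* give up -> Resize() in a real table  */

static int t1[CN], t2[CN];

static unsigned h1(int k) { return ((unsigned)k * 2654435761u) % CN; }
static unsigned h2(int k) { return ((unsigned)k * 40503u + 7u) % CN; }

static void cuckoo_init(void) {
    for (int i = 0; i < CN; i++) { t1[i] = EMPTY; t2[i] = EMPTY; }
}

/* Contains() is deterministic: exactly two buckets to check. */
static int cuckoo_contains(int x) {
    return t1[h1(x)] == x || t2[h2(x)] == x;
}

static int cuckoo_add(int x) {
    if (cuckoo_contains(x)) return 1;
    for (int i = 0; i < MAX_KICKS; i++) {
        int y = t1[h1(x)];
        t1[h1(x)] = x;                 /* place x, possibly evicting y */
        if (y == EMPTY) return 1;
        x = t2[h2(y)];                 /* re-place y in table 2        */
        t2[h2(y)] = y;
        if (x == EMPTY) return 1;
        /* x now holds the item evicted from table 2; loop again */
    }
    return 0;                          /* relocation cycle: Resize()   */
}
```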

Can we do better?  A highly efficient Contains()  Using simple hash functions  In a table that continues to work well when it is more than 50% full (M/N > 0.5)

Hopscotch Approach  Single array, simple hash function  Idea: define a neighborhood of the original bucket  Items in the neighborhood are found quickly  Use sequences of displacements to move items into their neighborhood

Simple Hopscotch Hashing Contains(x) – search in at most H buckets (the hop-range) based on the hop-info bitmap. H is a constant. (Figure: h(x) with H = 4.)
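Contains() only consults the H-bit hop-info bitmap of the home bucket: bit d is set iff bucket h(x)+d holds an item that hashed to h(x). A sketch, with table size, H, and the hash all illustrative:

```c
#include <stdint.h>

#define HN 16   /* table size, illustrative        */
#define H 4     /* neighborhood size, illustrative */

typedef struct { int key; uint32_t hop_info; } hbucket;

static unsigned hh(int k) { return ((unsigned)k * 2654435761u) % HN; }

static int hop_contains(hbucket *t, int x) {
    unsigned i = hh(x);
    uint32_t info = t[i].hop_info;
    for (unsigned d = 0; d < H; d++)        /* at most H probes */
        if ((info >> d) & 1u)
            if (t[(i + d) % HN].key == x) return 1;
    return 0;
}
```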

Simple Hopscotch Hashing Add(x) – probe linearly to find an open slot, then move the empty slot via a sequence of displacements into the hop-range of h(x).

Add(x)  Starting at h(x) = i, use linear probing to find an empty slot j.  If j is within H−1 of i, place x and return.  Otherwise:  Find an item y with h(y) = k within H−1 to the left of j. Displacing y to j creates a new empty slot closer to i. Repeat.  If no such item exists, or if bucket i already contains H items, Resize().
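The steps above can be sketched in C. Table size, neighborhood size H, the hash function, and the missing Resize() path are all illustrative simplifications, not the original implementation:

```c
#include <stdint.h>

#define N 32           /* table size, illustrative        */
#define H 4            /* neighborhood size, illustrative */
#define EMPTY (-1)

typedef struct { int key; uint32_t hop_info; } bucket_t;

static unsigned hsh(int k) { return ((unsigned)k * 2654435761u) % N; }

/* forward wrap-around distance from bucket i to bucket j */
static unsigned dist(unsigned i, unsigned j) { return (j + N - i) % N; }

static void hop_init(bucket_t *t) {
    for (int i = 0; i < N; i++) { t[i].key = EMPTY; t[i].hop_info = 0; }
}

static int hop_find(bucket_t *t, int x) {
    unsigned i = hsh(x);
    for (unsigned d = 0; d < H; d++)
        if (((t[i].hop_info >> d) & 1u) && t[(i + d) % N].key == x)
            return 1;
    return 0;
}

static int hop_add(bucket_t *t, int x) {
    unsigned i = hsh(x), j = i;
    while (t[j].key != EMPTY) {            /* step 1: linear probe for j */
        j = (j + 1) % N;
        if (j == i) return 0;              /* table full: Resize()       */
    }
    while (dist(i, j) >= H) {              /* hop the empty slot back    */
        int moved = 0;
        for (unsigned k = (j + N - (H - 1)) % N; k != j; k = (k + 1) % N) {
            unsigned b = hsh(t[k].key);    /* candidate y's home bucket  */
            if (dist(b, j) < H) {          /* j still in y's hop-range   */
                t[j].key = t[k].key;       /* displace y to j            */
                t[b].hop_info &= ~(1u << dist(b, k));
                t[b].hop_info |= 1u << dist(b, j);
                t[k].key = EMPTY;
                j = k;                     /* empty slot moved closer    */
                moved = 1;
                break;
            }
        }
        if (!moved) return 0;              /* no candidate: Resize()     */
    }
    t[j].key = x;                          /* place x inside hop-range   */
    t[i].hop_info |= 1u << dist(i, j);
    return 1;
}
```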

Simple Hopscotch Hashing  Contains():  The max number of items, H, is a constant  Theoretically the expected number of items per bucket is ~2.1, the same as chaining [Knuth], but they have a better chance of sitting in the same cache line!

Simple Hopscotch Hashing  Add(): expected distance till an open slot is the same as in linear probing  What are the chances of a Resize() because more than H items hash to a bucket? Lemma [following Knuth]: the same as the number of items in a chained bucket exceeding H, which is 1/H!. If H = 32, 1/H! = 1/32! < 10⁻³⁵
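The 1/H! figure can be checked numerically; a tiny illustrative helper (double precision is ample here, since only the magnitude matters):

```c
/* 1/h! computed in double precision; for h = 32 this is ~3.8e-36. */
static double inv_factorial(int h) {
    double f = 1.0;
    for (int i = 2; i <= h; i++) f *= (double)i;
    return 1.0 / f;
}
```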

Simple Hopscotch Hashing  Advantages:  Good locality and cache behavior  Good performance as table density (M/N) increases  less resizing  Withstands clustering and contamination  Pay the price in Add(), not in the frequent Contains()

Simple Hopscotch Hashing  Disadvantages:  As in linear probing, Add() may have to search 10s of buckets  Later: we can do better than the simple algorithm…

Concurrent Hash Tables  State of the art is Lea's ConcurrentHashMap from java.util.concurrent: lock-based chaining  Also a lock-free split-ordered incremental chaining algorithm [Shalev&Shavit]  Non-resizable lock-free linear probing [Purcell&Harris]  Resizable lock-based cuckoo [Herlihy, Shavit, Tzafrir]

Concurrent Chained Hashing [Lea]  Lock for Add() and unsuccessful Contains()  Striped locks

Concurrent Simple Hopscotch  Striped locking, same as Lea  Contains() is wait-free
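Lock striping as used here can be sketched with a fixed array of locks, each guarding every NLOCKS-th bucket; the stripe count and names are illustrative:

```c
#include <pthread.h>

/* Striped locking sketch: a fixed lock array independent of table size. */
#define NLOCKS 16

static pthread_mutex_t stripe[NLOCKS];

static void stripes_init(void) {
    for (int i = 0; i < NLOCKS; i++)
        pthread_mutex_init(&stripe[i], NULL);
}

/* All buckets b with equal b % NLOCKS share one lock. */
static pthread_mutex_t *lock_for(unsigned bucket) {
    return &stripe[bucket % NLOCKS];
}
```

Because the lock array is fixed, resizing the bucket array does not require growing or replacing the locks.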

Concurrent Simple Hopscotch Add(x) – lock the bucket, mark an empty slot using CAS, then add x, erasing the mark.

Concurrent Simple Hopscotch Add(x) – lock the bucket, mark an empty slot using CAS, lock the bucket and update the timestamp of the bucket being displaced before erasing the old value.

Concurrent Simple Hopscotch Wait-free Contains(x) – read the timestamp and hop-info, go to the marked buckets; if x is not found, compare the timestamp; if it differs, repeat; after k attempts, search all H buckets. (Figure: x not found.)

Simple Hopscotch Limitations  Minimizing cache misses is key to hash table performance:  On SPARC a line holds ~4 buckets (on Intel ~8 buckets)  If the hopscotch neighborhood is the number of buckets in a cache line, i.e. H = 4 or 8, long sequences of displacements during Add() are expensive

Cache Aware Hopscotch  Hop-info consists of two pointers of distance up to the hop-range. (Figure: cache line containing x and y.)

Cache Aware Hopscotch Add(x) – if there is no place in the cache line, probe in the hop-range and point to the new location; else perform hopscotch displacements.


Cache Aware Hopscotch Remove(x) – erase x, then (1) redirect the pointer, or (2) if the pointer is outside the hop-range, move the last item in x's list into x's empty slot. Then move some item hashed to this cache line into the new empty slot.

Cache Aware Hopscotch Remove(x) – …move some item hashed to this cache line into the new empty slot: scan all items in the cache line, move one into its hashed cache line, and repeat recursively for the new empty slot. Anti-contamination.

Performance  We ran a series of benchmarks on state-of-the-art multicores and uniprocessors:  a Sun 64-way Niagara II, and  an Intel 3 GHz Xeon  Grain of salt: we used standard microbenchmarks, not real application data

Benchmarking  We used the same locking structure for the concurrent algorithms  We show graphs with pre-allocated memory to eliminate the effects of memory management

Cuckoo stops here

Conclusions & Future  New Hopscotch family of hash table algorithms: great cache behavior, a good fit with modern architectures  Superior to all known algorithms in performance and memory utilization?  Especially impressive at high table densities  It will be interesting to formally analyze performance