CSE 332 Data Abstractions B-Trees

Presentation transcript:

CSE 332 Data Abstractions B-Trees Richard Anderson Spring 2016

Announcements
Next two weeks: hashing and sorting.
Upcoming dates: Midterm 2 on Friday, April 29.

Caches = SRAM (faster, more expensive); memory = DRAM (slower, cheaper).
Approximate cycles to access:
  CPU registers: 1
  L1 cache: 2
  L2 cache: 30
  Main memory: 250
  Disk, random access: 30,000,000
  Disk, streamed: 5,000
(Also noted on the slide: 1, 2, 10, 200, 200,000.)
Idea: keep your working set in the cache.
What happens when you start a program? The code is brought in from disk into memory. To execute the first instruction (mov eax, [a]), it is copied into the L2 cache, then the L1 cache, and finally into the instruction register; this instruction accesses variable a. When the second instruction (add ecx, [a]) also accesses a, it can use the copy of [a] that is already in the L1 cache.

M-ary Search Tree
Consider a search tree with branching factor M:
  Complete tree has height:
  # hops for find:
  Runtime of find:
So, we'll try to solve this problem as we did with heaps. Here's the general idea: we create a search tree with a branching factor of M. Each node has M-1 keys, and we search between them. What's the runtime? O(log_M n)? That's a nice thought, and it's the best case. What about the worst case?
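For reference, the blanks above get filled in during lecture; the standard answers (my own working, not taken from the slide text) follow from the geometric series:

  n = 1 + M + M^2 + \dots + M^h = \frac{M^{h+1} - 1}{M - 1} \;\Rightarrow\; h = \Theta(\log_M n)

A find makes about log_M n hops, and within each node a binary search over the M-1 keys costs O(log_2 M), so the worst-case runtime is O(log_2 M \cdot log_M n) = O(log_2 n).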

B+ Trees (the book calls these B-trees)
Each internal node has (up to) M-1 keys.
Order property: the subtree between two adjacent keys x and y contains leaves with values v such that x ≤ v < y. Note the "≤".
Leaf nodes have up to L sorted keys.
Example: an internal node with keys 3, 7, 12, 21 has subtrees covering x < 3, 3 ≤ x < 7, 7 ≤ x < 12, 12 ≤ x < 21, and 21 ≤ x.
To address these problems, we'll use a slightly more structured M-ary tree: B-trees. As before, each internal node has M-1 keys. To manage memory problems, we'll tune the size of a node (or leaf) to the size of a memory unit, usually a page or disk block.
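A rough structural sketch of the two node shapes in Java (my own illustration; the class and field names, the fixed values of M and L, and the use of int keys with String data are assumptions, not the course's code):

    class BPlusNodeSketch {
        static final int M = 4;   // max children per internal node (so M-1 keys); values arbitrary
        static final int L = 5;   // max data items per leaf

        static abstract class Node { int count; }   // # children (internal) or # items (leaf)

        static class Internal extends Node {
            int[] keys = new int[M - 1];     // keys[i] separates children[i] and children[i+1]
            Node[] children = new Node[M];   // the subtree between keys x and y holds values v with x <= v < y
        }

        static class Leaf extends Node {
            int[] keys = new int[L];         // sorted keys
            String[] data = new String[L];   // actual data records live only in leaves
        }
    }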

B+ Tree Structure Properties
Internal nodes:
  store up to M-1 keys
  have between ⌈M/2⌉ and M children
Leaf nodes (where data is stored):
  all at the same depth
  contain between ⌈L/2⌉ and L data items
Root (special case):
  has between 2 and M children (or the root could be a leaf)
The properties of B-trees (and the trees themselves) are a bit more complex than previous structures we've looked at. Here's a big, gnarly list; we'll go one step at a time. The maximum branching factor, as we said, is M (tunable for a given tree). The root has between 2 and M children, or at most L keys if it is a leaf (L is another parameter). These restrictions will be different for the root than for other nodes.

B+ Tree: Example
B+ tree with M = 4 (# pointers in an internal node) and L = 5 (# data items in a leaf). Data objects (e.g. "1, AB..", "2, GH..", "4, XY..") hang off the leaf keys, but I'll ignore them in the slides.
[Figure: example B+ tree with keys 12 and 44 in the root; all leaves at the same depth.]
This is just an example B-tree. Notice that it has 24 entries with a depth of only 2; a BST would be 4 deep. Notice also that the leaves are all at the same level in the tree. I'll use integers as both key and data, but we all know that it could just as well be different data at the bottom, right?
Definition for later: a "neighbor" is the next sibling to the left or right.

Disk Friendliness
What makes B+ trees disk-friendly?
Many keys are stored in each node, and all of them are brought into memory/cache in one disk access.
Internal nodes contain only keys; only leaf nodes contain keys and actual data. Much of the tree structure can therefore be loaded into memory irrespective of data-object size; the data itself resides on disk.

B+ trees vs. AVL trees
Suppose again we have n = 2^30 ≈ 10^9 items:
  Depth of AVL tree: about 1.44 log_2 n ≈ 43
  Depth of B+ tree with M = 256, L = 256: about log_128 n ≈ 4.3
Wow!!!
Great, but how do we actually make a B+ tree and keep it balanced…?
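Worked out (my own arithmetic, matching the figures on the slide):

  \text{AVL: } h \le 1.44 \log_2 n = 1.44 \cdot 30 \approx 43
  \text{B+ tree: each internal node has at least } \lceil M/2 \rceil = 128 \text{ children, so } h \approx \log_{128} 2^{30} = \frac{30}{\log_2 128} = \frac{30}{7} \approx 4.3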

Building a B+ Tree with Insertions
The empty B-tree (M = 3, L = 3).
Alright, how do we insert and delete? Let's start with the empty B-tree: that's one leaf as the root. Now we'll insert 3 and 14. Fine… What about inserting 1? Is there a problem?

Insert(30). [Figure: tree before and after the insertion; M = 3, L = 3.]

Insert(32), Insert(36), Insert(15). [Figure: tree after each insertion; M = 3, L = 3.]

Insert(16). [Figure: tree before and after the insertion; M = 3, L = 3.]

Insert(12), Insert(40), Insert(45), Insert(38). [Figure: tree after these insertions; M = 3, L = 3.]

Insertion Algorithm
1. Insert the key into its leaf in sorted order.
2. If the leaf ends up with L+1 items, overflow!
   Split the leaf into two nodes:
     original with ⌈(L+1)/2⌉ smaller keys
     new one with ⌊(L+1)/2⌋ larger keys
   Add the new child to the parent.
   If the parent ends up with M+1 children, overflow!
3. If an internal node ends up with M+1 children, overflow!
   Split the node into two nodes:
     original with ⌈(M+1)/2⌉ children (the smaller keys)
     new one with ⌊(M+1)/2⌋ children (the larger keys)
   Add the new child to the parent.
   If the parent ends up with M+1 children, overflow!
4. Split an overflowed root in two and hang the new nodes under a new root.
5. Propagate keys up the tree.
OK, here's that process as an algorithm. The new funky symbol is floor; that's just like regular C++ integer division. Notice that this can propagate all the way up the tree. How often will it do that? Notice that the two new leaves or internal nodes are guaranteed to have enough items (or subtrees), because even the floor of (L+1)/2 is as big as the ceiling of L/2. Splitting the root is what makes the tree deeper!
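To make the split rule concrete, here is a rough Java sketch of the leaf-overflow step, continuing the hypothetical node layout from earlier (my own illustration, not the course's implementation; it works on bare key arrays for brevity):

    import java.util.Arrays;

    class LeafSplitSketch {
        static final int L = 3;   // leaf capacity, matching the running example

        // Insert key into the first `count` sorted slots of leaf; returns a new
        // array of count+1 sorted keys (possibly L+1 of them: an overflow).
        static int[] insertSorted(int[] leaf, int count, int key) {
            int[] out = Arrays.copyOf(leaf, count + 1);
            int i = count;
            while (i > 0 && out[i - 1] > key) {   // shift larger keys one slot right
                out[i] = out[i - 1];
                i--;
            }
            out[i] = key;
            return out;
        }

        // Split an overflowed leaf of L+1 keys: ceil((L+1)/2) smaller keys stay,
        // floor((L+1)/2) larger keys move to a new leaf whose first key becomes
        // the separator added to the parent.
        static int[][] splitLeaf(int[] overflowed) {
            int keep = (L + 2) / 2;               // integer form of ceil((L+1)/2)
            return new int[][] {
                Arrays.copyOfRange(overflowed, 0, keep),
                Arrays.copyOfRange(overflowed, keep, L + 1)
            };
        }
    }

For example, with L = 3, inserting 30 into a full leaf {3, 14, 18} gives {3, 14, 18, 30}, which splits into {3, 14} and {18, 30}; the key 18 becomes the new separator in the parent.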

And Now for Deletion…
Delete(32). [Figure: tree before and after the deletion; M = 3, L = 3.]

Delete(15). [Figure: tree before and after the deletion; M = 3, L = 3.]

Delete(16). [Figure: tree before and after the deletion; M = 3, L = 3.]

Delete(16), continued. [Figure: further tree states for this deletion; M = 3, L = 3.]

Delete(14). [Figure: tree before and after the deletion; M = 3, L = 3.]

Delete(18). [Figure: tree before and after the deletion; M = 3, L = 3.]

(Deletion example, continued.) [Figure: resulting tree state; M = 3, L = 3.]

Deletion Algorithm
1. Remove the key from its leaf.
2. If the leaf ends up with fewer than ⌈L/2⌉ items, underflow!
   Adopt data from a neighbor; update the parent.
   If adopting won't work, delete the node and merge with a neighbor.
   If the parent ends up with fewer than ⌈M/2⌉ children, underflow!
Alright, that's deletion. Let's talk about a few of the details. Why will dumping keys into a neighbor always work? If the neighbor was too low on keys to loan any, it must have exactly ⌈L/2⌉ keys, and we have one fewer than that; putting them together, we get at most L, and that's legal.

Deletion Slide Two
3. If an internal node ends up with fewer than ⌈M/2⌉ children, underflow!
   Adopt from a neighbor; update the parent.
   If adoption won't work, merge with a neighbor.
   If the parent ends up with fewer than ⌈M/2⌉ children, underflow!
4. If the root ends up with only one child, make the child the new root of the tree. This reduces the height of the tree!
5. Propagate keys up through the tree.
The same argument applies here for dumping subtrees as on the previous slide for dumping keys. A sketch of the adopt-vs-merge decision follows below.
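A rough Java sketch of that adopt-vs-merge decision for leaves (my own illustration under the same assumptions as the earlier sketches; not the course's code):

    class LeafUnderflowSketch {
        static final int L = 3;
        static final int MIN = (L + 1) / 2;   // ceil(L/2): the minimum legal leaf size

        // Given the item counts of an underflowing leaf and one neighbor,
        // report which repair the deletion algorithm performs.
        static String repair(int leafCount, int neighborCount) {
            if (leafCount >= MIN) return "no underflow";
            if (neighborCount > MIN) {
                // Neighbor can spare an item: adopt one and update the
                // separating key in the parent.
                return "adopt from neighbor";
            }
            // Neighbor holds exactly MIN items and we hold MIN - 1, so the
            // merged leaf has at most 2*MIN - 1 <= L items: merging is legal.
            // The parent loses a child and may underflow in turn.
            return "merge with neighbor";
        }
    }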

Thinking about B+ Trees
B+ tree insertion can cause (expensive) splitting and propagation up the tree.
B+ tree deletion can cause (cheap) adoption or (expensive) merging and propagation up the tree.
Split/merge/propagation is rare if M and L are large. (Why?)
Pick the branching factor M and data items per leaf L such that each node takes one full page/block of memory/disk.
B*-trees fix thrashing.
Propagation is rare because (in a good case) only about 1/L inserts cause a split, and only about 1/M of those go up even one level!
30 million's not so big, right? How about height 5? 2 billion.

Complexity
Find:
Insert:
  find:
  insert in leaf:
  split/propagate up:
Claim: O(M) costs are negligible.
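These blanks are filled in during lecture; a standard way to fill them (my own summary, consistent with the claim above) is:

  \text{Find: } O(\log_2 M) \text{ per node} \times O(\log_M n) \text{ levels} = O(\log n) \text{ comparisons, and only } O(\log_M n) \text{ disk accesses}
  \text{Insert: find } O(\log n), \text{ insert in leaf } O(L), \text{ split/propagate up } O(M \log_M n)

Treating M and L as constants chosen to fill one disk block, both operations cost O(log_M n) disk accesses, which is why the O(M) and O(L) terms are negligible.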

Tree Names You Might Encounter
"B-trees": a more general form of B+ trees that allows data at internal nodes too. The range of a child is (key1, key2) rather than [key1, key2).
B-trees with M = 3, L = x are called 2-3 trees: internal nodes can have 2 or 3 children.
B-trees with M = 4, L = x are called 2-3-4 trees: internal nodes can have 2, 3, or 4 children.
Why would we use these? Not to fit disk blocks, most likely. We might use them just to get the log n bound, however.