Chapter 6 Transform-and-Conquer Copyright © 2007 Pearson Addison-Wesley. All rights reserved.

Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin, "Introduction to the Design & Analysis of Algorithms," 2nd ed., Ch. 5

Transform and Conquer
This group of techniques solves a problem by a transformation
 to a simpler/more convenient instance of the same problem (instance simplification): the first three sections of the chapter (presorting, applied math, AVL trees)
 to a different representation of the same instance (representation change): heaps and Horner's Rule
 to a different problem for which an algorithm is already available (problem reduction): covered at the end of the chapter

Representation Change: Example
(two example figures from the original slides are not reproduced here)

Horner's Rule for Polynomial Evaluation

Given a polynomial of degree n
p(x) = a_n x^n + a_{n-1} x^{n-1} + … + a_1 x + a_0
and a specific value of x, find the value of p at that point.

Two brute-force algorithms:

First (recomputes each power from scratch):
p ← 0
for i ← n downto 0 do
    power ← 1
    for j ← 1 to i do
        power ← power * x
    p ← p + a_i * power
return p

Second (carries the power along):
p ← a_0; power ← 1
for i ← 1 to n do
    power ← power * x
    p ← p + a_i * power
return p

Horner's Rule

Example: p(x) = 2x^4 - x^3 + 3x^2 + x - 5
       = x(2x^3 - x^2 + 3x + 1) - 5
       = x(x(2x^2 - x + 3) + 1) - 5
       = x(x(x(2x - 1) + 3) + 1) - 5

Substitution into the last formula leads to a faster algorithm.

The same sequence of computations is obtained by simply arranging the coefficients in a table and proceeding as follows:

coefficients:   2      -1        3         1         -5
x = 3:          2   3·2-1=5   3·5+3=18  3·18+1=55  3·55-5=160

So p(3) = 160.

Horner's Rule pseudocode

Horner(P[0..n], x)
// Evaluates a polynomial at a given point by Horner's rule
// Input: an array P[0..n] of coefficients and a number x
// Output: the value of the polynomial at x
p ← P[n]
for i ← n - 1 downto 0 do
    p ← x * p + P[i]
return p

Efficiency of Horner's Rule: # multiplications = # additions = n

Synthetic division of p(x) by (x - x_0): the intermediate values produced by Horner's rule are the coefficients of the quotient.
Example: Let p(x) = 2x^4 - x^3 + 3x^2 + x - 5. Find p(x) : (x - 3).
The Horner row 2, 5, 18, 55, 160 gives quotient 2x^3 + 5x^2 + 18x + 55 and remainder 160.
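The pseudocode above can be sketched in runnable form as follows (a minimal illustration written for these notes; the function name and coefficient ordering — highest degree first — are my own choices):

```python
def horner(coeffs, x):
    """Evaluate a polynomial at x by Horner's rule.

    coeffs lists coefficients from the highest degree down, so
    [2, -1, 3, 1, -5] represents 2x^4 - x^3 + 3x^2 + x - 5.
    Uses exactly n multiplications and n additions for degree n.
    """
    p = coeffs[0]
    for a in coeffs[1:]:
        p = x * p + a          # the core Horner step
    return p

print(horner([2, -1, 3, 1, -5], 3))  # 160, matching the table above
```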

Computing a^n (revisited)

Left-to-right binary exponentiation: initialize the product accumulator to 1. Scan n's binary expansion from left to right and do the following: if the current binary digit is 0, square the accumulator (S); if the binary digit is 1, square the accumulator and multiply it by a (SM).

Example: Compute a^13. Here, n = 13 = 1101 in binary.

binary digits:  1          1            0             1
operations:     SM         SM           S             SM
accumulator:    1^2·a = a  a^2·a = a^3  (a^3)^2 = a^6 (a^6)^2·a = a^13

(computed left to right)

Efficiency: (b-1) ≤ M(n) ≤ 2(b-1), where b = ⌊log₂ n⌋ + 1
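The S/SM scheme can be transcribed directly (a runnable sketch, not from the slides; the function name is made up):

```python
def power_left_to_right(a, n):
    """Left-to-right binary exponentiation: always square; multiply on 1-digits."""
    acc = 1
    for bit in bin(n)[2:]:     # n's binary digits, most significant first
        acc = acc * acc        # S: square the accumulator
        if bit == '1':
            acc = acc * a      # M: multiply by a on a 1-digit
    return acc

print(power_left_to_right(2, 13))  # 8192 == 2**13
```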

Computing a^n (cont.)

Right-to-left binary exponentiation: scan n's binary expansion from right to left and compute a^n as the product of the terms a^(2^i) corresponding to 1's in this expansion.

Example: Compute a^13 by right-to-left binary exponentiation. Here, n = 13 = 1101 in binary.

a^(2^i) terms:  a^8   a^4   a^2   a
binary digits:  1     1     0     1
product:        a^8 · a^4 ·       a  = a^13

(computed right to left)

Efficiency: same as that of left-to-right binary exponentiation
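The right-to-left scheme can likewise be sketched as follows (illustration only; names are my own):

```python
def power_right_to_left(a, n):
    """Right-to-left binary exponentiation: multiply in a^(2^i) for each 1-bit."""
    term = a          # holds a^(2^i), starting with i = 0
    product = 1
    while n > 0:
        if n & 1:             # current (rightmost) binary digit is 1
            product *= term
        term *= term          # advance to the next a^(2^i) by squaring
        n >>= 1               # drop the digit just processed
    return product

print(power_right_to_left(2, 13))  # 8192 == 2**13
```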

Problem Reduction
This variation of transform-and-conquer solves a problem by transforming it into a different problem for which an algorithm is already available. To be of practical value, the combined time of the transformation and of solving the other problem should be smaller than the time of solving the problem as given by another method.

Examples of Solving Problems by Reduction
 computing lcm(m, n) via computing gcd(m, n) (p. 239)
 counting the number of paths of length n in a graph by raising the graph's adjacency matrix to the n-th power
 transforming a maximization problem to a minimization problem and vice versa: min f(x) = -max[-f(x)] and max f(x) = -min[-f(x)]
 linear programming
 reduction to graph problems (e.g., solving puzzles via state-space graphs); Figure 6.18, p. 244
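The first reduction in the list is one line of code (a sketch using the standard-library gcd; the lcm function name is my own):

```python
from math import gcd

def lcm(m, n):
    """lcm by reduction to gcd: lcm(m, n) = m * n / gcd(m, n)."""
    return m * n // gcd(m, n)

print(lcm(24, 60))  # 120
```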

Chapter 7 Space and Time Tradeoffs Copyright © 2007 Pearson Addison-Wesley. All rights reserved.

Space-for-time tradeoffs
Two varieties of space-for-time algorithms:
 input enhancement — preprocess the input (or its part) to store some info to be used later in solving the problem: counting sorts, string searching algorithms
 prestructuring — preprocess the input to make accessing its elements easier: hashing, indexing schemes (e.g., B-trees)

Review: String searching by brute force
pattern: a string of m characters to search for
text: a (long) string of n characters to search in

Brute-force algorithm:
Step 1 Align the pattern at the beginning of the text.
Step 2 Moving from left to right, compare each character of the pattern to the corresponding character in the text until either all characters are found to match (successful search) or a mismatch is detected.
Step 3 While a mismatch is detected and the text is not yet exhausted, realign the pattern one position to the right and repeat Step 2.
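The three steps above can be sketched as follows (a runnable illustration; the function name is made up):

```python
def brute_force_match(pattern, text):
    """Slide the pattern one position at a time, comparing left to right.

    Returns the index of the first occurrence, or -1 if there is none.
    Worst case: O(nm) character comparisons.
    """
    n, m = len(text), len(pattern)
    for i in range(n - m + 1):              # each possible alignment
        j = 0
        while j < m and text[i + j] == pattern[j]:
            j += 1
        if j == m:                          # all m characters matched
            return i
    return -1

print(brute_force_match("AB", "AAAB"))  # 2
```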

String searching by preprocessing
Several string searching algorithms are based on the input-enhancement idea of preprocessing the pattern:
 The Knuth-Morris-Pratt (KMP) algorithm preprocesses the pattern left to right to get useful information for later searching
 The Boyer-Moore algorithm preprocesses the pattern right to left and stores information in two tables
 Horspool's algorithm simplifies the Boyer-Moore algorithm by using just one table

Horspool's Algorithm
A simplified version of the Boyer-Moore algorithm:
 preprocesses the pattern to generate a shift table that determines how much to shift the pattern when a mismatch occurs
 always makes a shift based on the text's character c aligned with the last character in the pattern, according to the shift table's entry for c

How far to shift?
Look at the first (rightmost) character in the text that was compared:
 The character is not in the pattern:
  .....C (C not in pattern) — shift the pattern BAOBAB past C, i.e., by its full length m = 6
 The character is in the pattern (but is not its rightmost character):
  .....O (O occurs once in the pattern) — shift to align the pattern's O with it
  .....A (A occurs twice in the pattern) — shift to align the rightmost A among the pattern's first m-1 characters
 The rightmost characters do match:
  .....B — shift to align the rightmost B among the pattern's first m-1 characters

Shift table
 Shift sizes can be precomputed by the formula

  t(c) = distance from c's rightmost occurrence among the pattern's first m-1 characters to the pattern's right end, if c occurs there;
  t(c) = the pattern's length m, otherwise

 by scanning the pattern before the search begins; the results are stored in a table called the shift table.
 The shift table is indexed by the characters of the text and pattern alphabet.

E.g., for BAOBAB (m = 6):

c:    A  B  O  all other characters
t(c): 1  2  3  6

Example of Horspool's algorithm application

BARD LOVED BANANAS
BAOBAB
      BAOBAB
        BAOBAB
(unsuccessful search)

Shift table for BAOBAB: t(A) = 1, t(B) = 2, t(O) = 3, and t(c) = 6 for every other character, including the space (_). The shifts used are t(L) = 6, then t(B) = 2, then t(N) = 6, after which the pattern would run past the end of the text.
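The whole algorithm can be sketched in runnable form (an illustration based on the slides' description; the function names are my own):

```python
def shift_table(pattern):
    """Horspool shift table over the pattern's first m-1 characters."""
    m = len(pattern)
    table = {}
    for i in range(m - 1):                  # later (more rightward) entries win
        table[pattern[i]] = m - 1 - i
    return table                            # absent characters get shift m

def horspool(pattern, text):
    """Return the index of the first match, or -1 (Horspool's algorithm)."""
    m, n = len(pattern), len(text)
    t = shift_table(pattern)
    i = m - 1                               # index of text char under pattern's last char
    while i <= n - 1:
        k = 0                               # number of characters matched, right to left
        while k <= m - 1 and pattern[m - 1 - k] == text[i - k]:
            k += 1
        if k == m:
            return i - m + 1
        i += t.get(text[i], m)              # shift by the last aligned text character
    return -1

print(horspool("BAOBAB", "BARD LOVED BANANAS"))     # -1, as in the example
print(horspool("BAOBAB", "BESS KNEW ABOUT BAOBABS"))  # 16
```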

Boyer-Moore algorithm
Based on the same two ideas:
 comparing pattern characters to text from right to left
 precomputing shift sizes in two tables:
  – the bad-symbol table indicates how much to shift based on the text's character causing a mismatch
  – the good-suffix table indicates how much to shift based on the matched part (suffix) of the pattern

Bad-symbol shift in the Boyer-Moore algorithm
 If the rightmost character of the pattern doesn't match, the BM algorithm acts as Horspool's.
 If the rightmost character of the pattern does match, BM compares the preceding characters right to left until either all the pattern's characters match or a mismatch on the text's character c is encountered after k > 0 matches. The bad-symbol shift is then d1 = max{t1(c) - k, 1}.

Good-suffix shift in the Boyer-Moore algorithm
 The good-suffix shift d2 is applied after 0 < k < m last characters were matched.
 d2(k) = the distance between the matched suffix of size k and its rightmost occurrence in the pattern that is not preceded by the same character as the suffix.
  Example: CABABA, d2(1) = 4 (see also p. 261).
 If there is no such occurrence, match the longest part of the k-character suffix with a corresponding prefix; if there are no such suffix-prefix matches, d2(k) = m.
  Example: WOWWOW, d2(2) = 5, d2(3) = 3, d2(4) = 3, d2(5) = 3.
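The d2 definition above can be transcribed into a brute-force sketch (written for these notes as a direct reading of the definition, not Levitin's efficient construction; the function name is made up):

```python
def good_suffix_table(pattern):
    """Compute the good-suffix shifts d2(k) directly from the definition:
    distance to the rightmost occurrence of the k-suffix not preceded by the
    same character; failing that, the longest proper prefix of the pattern
    matching a suffix of the k-suffix; failing that, m."""
    m = len(pattern)
    d2 = {}
    for k in range(1, m):
        suffix = pattern[m - k:]
        prev = pattern[m - k - 1]          # character preceding the suffix
        shift = None
        # rightmost other occurrence of the suffix not preceded by prev
        for start in range(m - k - 1, -1, -1):
            if pattern[start:start + k] == suffix and \
               (start == 0 or pattern[start - 1] != prev):
                shift = (m - k) - start
                break
        if shift is None:
            shift = m
            # longest proper prefix of the pattern matching a suffix of the k-suffix
            for l in range(k - 1, 0, -1):
                if pattern[:l] == suffix[k - l:]:
                    shift = m - l
                    break
        d2[k] = shift
    return d2

print(good_suffix_table("WOWWOW"))  # {1: 2, 2: 5, 3: 3, 4: 3, 5: 3}
```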

Boyer-Moore Algorithm
After matching successfully 0 < k < m characters, the algorithm shifts the pattern right by
d = max{d1, d2}
where d1 = max{t1(c) - k, 1} is the bad-symbol shift and d2(k) is the good-suffix shift.

Example: Find the pattern AT_THAT in the text WHICH_FINALLY_HALTS.__AT_THAT (underscores denote spaces).

Boyer-Moore Algorithm (cont.)
Step 1 Fill in the bad-symbol shift table.
Step 2 Fill in the good-suffix shift table.
Step 3 Align the pattern against the beginning of the text.
Step 4 Repeat until a matching substring is found or the text ends: compare the corresponding characters right to left. If no characters match, retrieve entry t1(c) from the bad-symbol table for the text's character c causing the mismatch and shift the pattern to the right by t1(c). If 0 < k < m characters are matched, retrieve entry t1(c) from the bad-symbol table and entry d2(k) from the good-suffix table, and shift the pattern to the right by d = max{d1, d2}, where d1 = max{t1(c) - k, 1}.

Example of Boyer-Moore algorithm application

Bad-symbol table for BAOBAB: t1(A) = 1, t1(B) = 2, t1(O) = 3, and t1(c) = 6 for every other character, including the space (_).
Good-suffix table for BAOBAB: d2(1) = 2, d2(2) = 5, d2(3) = 5, d2(4) = 5, d2(5) = 5.

B E S S _ K N E W _ A B O U T _ B A O B A B S
B A O B A B                                      d1 = t1(K) = 6
            B A O B A B                          d1 = t1(_) - 2 = 4, d2(2) = 5 → shift 5
                      B A O B A B                d1 = t1(_) - 1 = 5, d2(1) = 2 → shift 5
                                B A O B A B      (success)

Hashing
 A very efficient method for implementing a dictionary, i.e., a set with the operations: find, insert, delete
 Based on the representation-change and space-for-time tradeoff ideas
 Important applications: symbol tables, databases (extendible hashing)

Hash tables and hash functions
The idea of hashing is to map keys of a given file of size n into a table of size m, called the hash table, by using a predefined function, called the hash function:
h: K → location (cell) in the hash table

Example: student records, key = SSN. Hash function: h(K) = K mod m, where m is some integer (typically, prime). If m = 1000, where is the record with a given SSN stored? (In the cell indexed by the SSN's last three digits.)

Generally, a hash function should:
 be easy to compute
 distribute keys about evenly throughout the hash table

Collisions
If h(K1) = h(K2), there is a collision.
 Good hash functions result in fewer collisions, but some collisions should be expected (birthday paradox).
 Two principal hashing schemes handle collisions differently:
  – Open hashing: each cell is the header of a linked list of all keys hashed to it
  – Closed hashing: one key per cell; in case of collision, finds another cell by
    linear probing (use the next free bucket) or
    double hashing (use a second hash function to compute the increment)

Open hashing (separate chaining)
Keys are stored in linked lists outside a hash table whose elements serve as the lists' headers.

Example: A, FOOL, AND, HIS, MONEY, ARE, SOON, PARTED
h(K) = sum of K's letters' positions in the alphabet mod 13

Key:   A  FOOL  AND  HIS  MONEY  ARE  SOON  PARTED
h(K):  1   9     6    10    7     11    11     12

Lists: cell 1 → A; cell 6 → AND; cell 7 → MONEY; cell 9 → FOOL; cell 10 → HIS; cell 11 → ARE → SOON; cell 12 → PARTED.

A search for KID (h(KID) = (11 + 9 + 4) mod 13 = 11) traverses the list in cell 11 and fails.
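The chaining scheme above can be sketched as follows (a toy illustration assuming uppercase keys; names are my own):

```python
def h(key):
    """Sum of the letters' alphabet positions (A=1, ..., Z=26), mod 13."""
    return sum(ord(c) - ord('A') + 1 for c in key) % 13

keys = ["A", "FOOL", "AND", "HIS", "MONEY", "ARE", "SOON", "PARTED"]
table = [[] for _ in range(13)]      # open hashing: one chain (list) per cell
for k in keys:
    table[h(k)].append(k)            # collisions simply extend the cell's chain

print([h(k) for k in keys])          # [1, 9, 6, 10, 7, 11, 11, 12]
print(table[11])                     # ['ARE', 'SOON'] — the collision chain
```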

Open hashing (cont.)
 If the hash function distributes keys uniformly, the average length of a linked list will be α = n/m. This ratio is called the load factor.
 Average number of probes in successful (S) and unsuccessful (U) searches: S ≈ 1 + α/2, U = α
 The load factor α is typically kept small (ideally, about 1).
 Open hashing still works if n > m.

Closed hashing (open addressing)
Keys are stored inside the hash table itself.

Key:   A  FOOL  AND  HIS  MONEY  ARE  SOON  PARTED
h(K):  1   9     6    10    7     11    11     12

With linear probing, SOON (h = 11, cell already taken by ARE) goes to cell 12, and PARTED (h = 12, now taken by SOON) wraps around to cell 0:

cell: 0 = PARTED, 1 = A, 6 = AND, 7 = MONEY, 9 = FOOL, 10 = HIS, 11 = ARE, 12 = SOON (cells 2-5 and 8 empty)
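Linear probing can be sketched as follows (a toy illustration reusing the slides' hash function; the insert routine and names are my own):

```python
def h(key):
    """Sum of the letters' alphabet positions (A=1, ..., Z=26), mod 13."""
    return sum(ord(c) - ord('A') + 1 for c in key) % 13

def insert(table, key):
    """Closed hashing with linear probing: scan forward (wrapping) for a free cell."""
    m = len(table)
    i = h(key)
    while table[i] is not None:      # cell taken: probe the next one
        i = (i + 1) % m
    table[i] = key

table = [None] * 13
for k in ["A", "FOOL", "AND", "HIS", "MONEY", "ARE", "SOON", "PARTED"]:
    insert(table, k)

print(table.index("SOON"))    # 12 — bumped from cell 11, already taken by ARE
print(table.index("PARTED"))  # 0  — wrapped around from cell 12
```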

Closed hashing (cont.)
 Does not work if n > m
 Avoids pointers
 Deletions are not straightforward
 The number of probes to find/insert/delete a key depends on the load factor α = n/m (hash table density) and the collision resolution strategy. For linear probing:
  S = (1/2)(1 + 1/(1-α)) and U = (1/2)(1 + 1/(1-α)²)
 As the table gets filled (α approaches 1), the number of probes in linear probing increases dramatically: e.g., α = 50% gives S = 1.5 and U = 2.5, while α = 90% gives S = 5.5 and U = 50.5.
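The two formulas above can be evaluated for a few load factors to see the blow-up directly (a small sketch; the function name is my own):

```python
def probes_linear(alpha):
    """Expected probe counts under linear probing, per the slide's formulas."""
    S = 0.5 * (1 + 1 / (1 - alpha))         # successful search
    U = 0.5 * (1 + 1 / (1 - alpha) ** 2)    # unsuccessful search
    return S, U

for alpha in (0.5, 0.75, 0.9):
    S, U = probes_linear(alpha)
    print(f"alpha={alpha}: S={S:.1f}, U={U:.1f}")
```

As the loop shows, moving the load factor from 0.5 to 0.9 multiplies the unsuccessful-search cost by roughly twenty.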