Presentation transcript:

6-0 A. Levitin, "Introduction to the Design & Analysis of Algorithms," 2nd ed., Ch. 6. Copyright © 2007 Pearson Addison-Wesley. All rights reserved. LECTURE 7

6-1 Decrease-by-Constant-Factor Algorithms In this variation of decrease-and-conquer, instance size is reduced by the same factor (typically, 2). The problems: binary search and the method of bisection; exponentiation by squaring; multiplication à la russe (Russian peasant method); the fake-coin puzzle; the Josephus problem.

6-2 Russian Peasant Multiplication The problem: compute the product of two positive integers. Can be solved by a decrease-by-half algorithm based on the following formulas. For even values of n: n * m = (n/2) * 2m. For odd values of n: n * m = ((n-1)/2) * 2m + m if n > 1, and n * m = m if n = 1.

6-3 Example of Russian Peasant Multiplication Compute 20 * 26:
n = 20   m = 26
n = 10   m = 52
n = 5    m = 104   (odd: add 104)
n = 2    m = 208
n = 1    m = 416   (add 416)
Product = 104 + 416 = 520.
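As a concrete illustration (not from the original slides), here is a minimal Python sketch of the decrease-by-half scheme above; the function name is my own choice.

def russian_peasant(n, m):
    """Multiply two positive integers using the Russian peasant method."""
    product = 0
    while n > 1:
        if n % 2 == 1:        # odd n: the formula contributes an extra m
            product += m
        n //= 2               # halve n (integer division)
        m *= 2                # double m
    return product + m        # n == 1 contributes the final m

print(russian_peasant(20, 26))  # 520, matching the worked example above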

6-4 Chapter 6: Transform-and-Conquer

6-5 Transform and Conquer This group of techniques solves a problem by a transformation: to a simpler/more convenient instance of the same problem (instance simplification); to a different representation of the same instance (representation change); to a different problem for which an algorithm is already available (problem reduction).

6-6 Instance simplification - Presorting Solve a problem's instance by transforming it into another, simpler/easier instance of the same problem. Presorting: many problems involving lists are easier when the list is sorted, e.g. searching, computing the median (selection problem), checking if all elements are distinct (element uniqueness). Also: topological sorting helps solve some problems for dags; presorting is used in many geometric algorithms.

6-7 How fast can we sort? Efficiency of algorithms involving sorting depends on the efficiency of sorting. Theorem (see Sec. 11.2): ⌈log2 n!⌉ ≈ n log2 n comparisons are necessary in the worst case to sort a list of size n by any comparison-based algorithm. Note: about n log2 n comparisons are also sufficient to sort an array of size n (by mergesort).

6-8 Searching with presorting Problem: search for a given K in A[0..n-1]. Presorting-based algorithm: Stage 1 - sort the array by an efficient sorting algorithm; Stage 2 - apply binary search. Efficiency: Θ(n log n) + O(log n) = Θ(n log n). Good or bad? Why do we have our dictionaries, telephone directories, etc. sorted?

6-9 Element Uniqueness with presorting Presorting-based algorithm: Stage 1 - sort by an efficient sorting algorithm (e.g. mergesort); Stage 2 - scan the array and check pairs of adjacent elements. Efficiency: Θ(n log n) + O(n) = Θ(n log n). Brute-force algorithm: compare all pairs of elements. Efficiency: O(n^2).
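A minimal Python sketch of the presorting-based uniqueness check described above (the function name and the use of Python's built-in sort are my own choices, not from the slides):

def all_distinct(a):
    """Element uniqueness via presorting: sort, then compare adjacent pairs."""
    b = sorted(a)                      # Stage 1: Θ(n log n) with a good sort
    for i in range(len(b) - 1):        # Stage 2: single linear scan
        if b[i] == b[i + 1]:
            return False
    return True

print(all_distinct([3, 1, 4, 1, 5]))   # False (1 appears twice)
print(all_distinct([3, 1, 4, 2, 5]))   # True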

6-10 Instance simplification - Gaussian Elimination Given: a system of n linear equations in n unknowns with an arbitrary coefficient matrix. Transform to: an equivalent system of n linear equations in n unknowns with an upper-triangular coefficient matrix. Solve the latter by substitutions, starting with the last equation and moving up to the first one.
a11 x1 + a12 x2 + ... + a1n xn = b1            a11 x1 + a12 x2 + ... + a1n xn = b1
a21 x1 + a22 x2 + ... + a2n xn = b2    -->               a22 x2 + ... + a2n xn = b2
  ...                                                                       ...
an1 x1 + an2 x2 + ... + ann xn = bn                                  ann xn = bn

6-11 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch. 6 Gaussian Elimination (cont.) The transformation is accomplished by a sequence of elementary operations on the system’s coefficient matrix (which don’t change the system’s solution): for i ←1 to n-1 do replace each of the subsequent rows (i.e., rows i+1, …, n) by a difference between that row and an appropriate multiple of the i-th row to make the new coefficient in the i-th column of that row 0

6-12 Example of Gaussian Elimination Solve 2x1 - 4x2 + x3 = 6; 3x1 - x2 + x3 = 11; x1 + x2 - x3 = -3.
Gaussian elimination on the augmented matrix:
2  -4   1  |   6
3  -1   1  |  11    row2 - (3/2)*row1
1   1  -1  |  -3    row3 - (1/2)*row1

2  -4   1    |   6
0   5  -1/2  |   2
0   3  -3/2  |  -6   row3 - (3/5)*row2

2  -4   1    |   6
0   5  -1/2  |   2
0   0  -6/5  | -36/5
Backward substitution:
x3 = (-36/5) / (-6/5) = 6
x2 = (2 + (1/2)*6) / 5 = 1
x1 = (6 - 6 + 4*1) / 2 = 2

6-13 Pseudocode and Efficiency of Gaussian Elimination Stage 1: reduction to an upper-triangular matrix
for i ← 1 to n-1 do
    for j ← i+1 to n do
        for k ← i to n+1 do
            A[j, k] ← A[j, k] - A[i, k] * A[j, i] / A[i, i]   //improve!
Stage 2: backward substitution
for j ← n downto 1 do
    t ← 0
    for k ← j+1 to n do
        t ← t + A[j, k] * x[k]
    x[j] ← (A[j, n+1] - t) / A[j, j]
Efficiency: Θ(n^3) + Θ(n^2) = Θ(n^3). Read the pseudocode and verify its efficiency.
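A short Python sketch of the two stages above (a hedged illustration: pivoting is omitted, as on the slide, and the //improve! hint is addressed by computing A[j,i]/A[i,i] once per row):

def gaussian_elimination(A, b):
    """Solve Ax = b by forward elimination and backward substitution.
    A is a list of n rows of n floats; b is a list of n floats.
    No pivoting is performed, so A[i][i] must be nonzero at each step."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]   # augmented matrix A|b
    # Stage 1: reduce to upper-triangular form
    for i in range(n - 1):
        for j in range(i + 1, n):
            factor = M[j][i] / M[i][i]      # computed once per row ("improve!")
            for k in range(i, n + 1):
                M[j][k] -= factor * M[i][k]
    # Stage 2: backward substitution
    x = [0.0] * n
    for j in range(n - 1, -1, -1):
        t = sum(M[j][k] * x[k] for k in range(j + 1, n))
        x[j] = (M[j][n] - t) / M[j][j]
    return x

# The slide's example: 2x1 - 4x2 + x3 = 6, 3x1 - x2 + x3 = 11, x1 + x2 - x3 = -3
print(gaussian_elimination([[2, -4, 1], [3, -1, 1], [1, 1, -1]], [6, 11, -3]))  # [2.0, 1.0, 6.0]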

6-14 Taxonomy of Searching Algorithms List searching: sequential search, binary search. Tree searching: binary search tree; binary balanced trees: AVL (Adelson-Velsky-Landis) trees, red-black trees; multiway balanced trees: 2-3 trees, 2-3-4 trees, B-trees. Hashing: open hashing (separate chaining), closed hashing (open addressing).

6-15 Binary Search Tree Arrange keys in a binary tree with the binary search tree property: for every node with key K, all keys in its left subtree are < K and all keys in its right subtree are > K. Example: 5, 3, 1, 10, 12, 7, 9

6-16 Dictionary Operations on Binary Search Trees Searching - straightforward. Insertion - search for the key, insert at the leaf where the search terminated. Deletion - 3 cases: deleting a key at a leaf; deleting a key at a node with a single child; deleting a key at a node with two children. Efficiency depends on the tree's height: ⌊log2 n⌋ ≤ h ≤ n-1. Thus all three operations have worst-case efficiency Θ(n) and average-case efficiency Θ(log n). Bonus: inorder traversal (left-root-right) produces a sorted list.
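A minimal Python sketch of search and insertion on a binary search tree (class and function names are mine, not from the slides):

class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def bst_insert(root, key):
    """Insert key at the leaf where an unsuccessful search would end."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = bst_insert(root.left, key)
    elif key > root.key:
        root.right = bst_insert(root.right, key)
    return root                        # duplicates are ignored

def bst_search(root, key):
    while root is not None and root.key != key:
        root = root.left if key < root.key else root.right
    return root                        # None if not found

def inorder(root, out):
    """Left-root-right traversal yields the keys in sorted order."""
    if root:
        inorder(root.left, out); out.append(root.key); inorder(root.right, out)

root = None
for k in [5, 3, 1, 10, 12, 7, 9]:      # the slide's example keys
    root = bst_insert(root, k)
keys = []; inorder(root, keys)
print(keys)                            # [1, 3, 5, 7, 9, 10, 12]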

6-17 Balanced Search Trees The attractiveness of the binary search tree is marred by its bad (linear) worst-case efficiency. Two ideas to overcome it: rebalance the binary search tree when a new insertion makes the tree "too unbalanced" (AVL trees, red-black trees); allow more than one key per node of a search tree (2-3 trees, 2-3-4 trees, B-trees).

6-18 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch. 6 Balanced trees: AVL trees Definition An AVL tree is a binary search tree in which, for every node, the difference between the heights of its left and right subtrees, called the balance factor, is at most 1 (with the height of an empty tree defined as -1) Tree (a) is an AVL tree; tree (b) is not an AVL tree

6-19 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch. 6 Rotations If a key insertion violates the balance requirement at some node, the subtree rooted at that node is transformed via one of the four rotations. 1- Single L-rotation (LL) b becomes the new root. a takes ownership of b's left child as its right child, or in this case, null. b takes ownership of a as its left child.

6-20 Rotations 2- Right Rotation (RR) b becomes the new root. c takes ownership of b's right child as its left child (in this case, that value is null). b takes ownership of c as its right child.

6-21 Rotations 3- Left-Right Rotation (LR) or "double left": the right subtree has a negative balance, so perform a right rotation on that right subtree. Think of the right subtree, isolated from the main tree, perform a right rotation on it, reattach it to the root, and then do the left rotation. (Figures: a single left rotation alone gets stuck; a right rotation followed by a left rotation restores balance.)

6-22 Rotations 4- Right-Left Rotation (RL) or "double right": perform a left rotation on the left subtree; the tree can then be balanced using a single right rotation. (Figures: a single right rotation alone gets stuck; a left rotation followed by a right rotation restores balance.)

6-23 General case: Single R-rotation c becomes the new root. r takes ownership of c's right child as its left child. c takes ownership of r as its right child.

6-24 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch. 6 General case: Double LR-rotation
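To make the rotation mechanics concrete, here is a hedged Python sketch of the two single rotations on nodes with left/right pointers (node class and function names are mine; the height bookkeeping of a full AVL implementation is omitted):

class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def rotate_right(r):
    """Single R-rotation: r's left child c becomes the new root of the subtree."""
    c = r.left
    r.left = c.right      # r takes ownership of c's right child as its left child
    c.right = r           # c takes ownership of r as its right child
    return c              # new subtree root

def rotate_left(a):
    """Single L-rotation: a's right child b becomes the new root of the subtree."""
    b = a.right
    a.right = b.left      # a takes ownership of b's left child as its right child
    b.left = a            # b takes ownership of a as its left child
    return b              # new subtree root

def rotate_left_right(r):
    """Double LR-rotation: left-rotate the left subtree, then right-rotate the root."""
    r.left = rotate_left(r.left)
    return rotate_right(r)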

6-25 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch. 6 AVL tree construction - an example Construct an AVL tree for the list 5, 6, 8, 3, 2, 4, 7

6-26 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch. 6 AVL tree construction - an example (cont.)

6-27 Analysis of AVL trees Height: h < 1.4405 log2(n + 2) - 1.3277; average height is about 1.01 log2 n for large n (found empirically). Search and insertion are O(log n). Deletion is more complicated but is also O(log n). Disadvantages: frequent rotations; complexity. A similar idea: red-black trees (the height of subtrees is allowed to differ by up to a factor of 2).

6-28 Multiway Search Trees Definition: A multiway search tree is a search tree that allows more than one key in the same node of the tree. Definition: A node of a search tree is called an n-node if it contains n-1 ordered keys k1 < k2 < ... < k(n-1), which divide the entire key range into n intervals pointed to by the node's n links to its children: keys < k1, keys in [k1, k2), ..., keys ≥ k(n-1). Note: every node in a classical binary search tree is a 2-node.

6-29 2-3 Tree Definition: A 2-3 tree is a search tree that may have 2-nodes and 3-nodes and is height-balanced (all leaves are on the same level). A 2-3 tree is constructed by successive insertions of the given keys, with a new key always inserted into a leaf of the tree. If the leaf is a 3-node, it is split into two, with the middle key promoted to the parent.

6-30 Construct a 2-3 tree (video)

6-31 2-3 tree construction - an example (try it yourself) Construct a 2-3 tree for the list 9, 5, 8, 3, 2, 4, 7

6-32 Analysis of 2-3 trees Height: log3(n + 1) - 1 ≤ h ≤ log2(n + 1) - 1. Search, insertion, and deletion are in Θ(log n). The idea of the 2-3 tree can be generalized by allowing more keys per node: 2-3-4 trees and B-trees. Each node of a B-tree may have a variable number of keys and children.

6-33 Heaps and Heapsort Definition: A heap is a binary tree with keys at its nodes (one key per node) such that: it is essentially complete, i.e., all its levels are full except possibly the last level, where only some rightmost keys may be missing; the key at each node is ≥ the keys at its children.

6-34 The heap property A node has the heap property if the value in the node is as large as or larger than the values in its children. All leaf nodes automatically have the heap property. A binary tree is a heap if all nodes in it have the heap property. (Figures: two example nodes that have the heap property and one that does not.)

6-35 siftUp Given a node that does not have the heap property, you can give it the heap property by exchanging its value with the value of the larger child. This is sometimes called sifting up. Notice that the child may have lost the heap property.

6-36 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch. 6 Illustration of the heap’s definition a heap not a heap Note: Heap’s elements are ordered top down (along any path down from its root), but they are not ordered left to right

6-37 Some Important Properties of a Heap Given n, there exists a unique binary tree with n nodes that is essentially complete, with h = ⌊log2 n⌋. The root contains the largest key. The subtree rooted at any node of a heap is also a heap. A heap can be represented as an array.

6-38 Heap's Array Representation Store the heap's elements in an array (whose elements are indexed, for convenience, 1 to n) in top-down, left-to-right order. Left child of node j is at 2j; right child of node j is at 2j+1; parent of node j is at ⌊j/2⌋; parental nodes are represented in the first ⌊n/2⌋ locations.
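The 1-based index arithmetic above is easy to check in a few lines (a small illustration of my own, not from the slides):

def left(j):   return 2 * j        # 1-based index of the left child
def right(j):  return 2 * j + 1    # 1-based index of the right child
def parent(j): return j // 2       # 1-based index of the parent

# For the node stored at position 3 of a 1-based heap array:
print(left(3), right(3), parent(3))   # 6 7 1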

6-39 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch. 6 Constructing a heap I b A tree consisting of a single node is automatically a heap b We construct a heap by adding nodes one at a time: Add the node just to the right of the rightmost node in the deepest levelAdd the node just to the right of the rightmost node in the deepest level If the deepest level is full, start a new levelIf the deepest level is full, start a new level b Examples: Add a new node here

6-40 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch Constructing a heap II b Each time we add a node, we may destroy the heap property of its parent node b To fix this, we sift up b But each time we sift up, the value of the topmost node in the sift may increase, and this may destroy the heap property of its parent node b We repeat the sifting up process, moving up in the tree, until either We reach nodes whose values don’t need to be swapped (because the parent is still larger than both children), orWe reach nodes whose values don’t need to be swapped (because the parent is still larger than both children), or We reach the rootWe reach the root

6-41 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch Constructing a heap III

6-42 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch Other children are not affected b The node containing 8 is not affected because its parent gets larger, not smaller b The node containing 5 is not affected because its parent gets larger, not smaller b The node containing 8 is still not affected because, although its parent got smaller, its parent is still greater than it was originally

6-43 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch A sample heap b Here’s a sample binary tree after it has been heapified b Notice that heapified does not mean sorted b Heapifying does not change the shape of the binary tree; this binary tree is balanced and left-justified because it started out that way

6-44 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch Removing the root (animated) b Notice that the largest number is now in the root b Suppose we discard the root: b How can we fix the binary tree so it is once again balanced and left-justified? b Solution: remove the rightmost leaf at the deepest level and use it for the new root

6-45 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch The reHeap method I b Our tree is balanced and left-justified, but no longer a heap b However, only the root lacks the heap property  We can siftUp() the root b After doing this, one and only one of its children may have lost the heap property

6-46 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch The reHeap method II  Now the left child of the root (still the number 11 ) lacks the heap property  We can siftUp() this node b After doing this, one and only one of its children may have lost the heap property

6-47 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch The reHeap method III  Now the right child of the left child of the root (still the number 11 ) lacks the heap property:  We can siftUp() this node b After doing this, one and only one of its children may have lost the heap property —but it doesn’t, because it’s a leaf

6-48 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch The reHeap method IV b Our tree is once again a heap, because every node in it has the heap property b Once again, the largest (or a largest) value is in the root b We can repeat this process until the tree becomes empty b This produces a sequence of values in order largest to smallest

6-49 Heap Construction (bottom-up) Step 0: Initialize the structure with the keys in the order given. Step 1: Starting with the last (rightmost) parental node, fix the heap rooted at it if it doesn't satisfy the heap condition: keep exchanging it with its largest child until the heap condition holds. Step 2: Repeat Step 1 for the preceding parental node.

6-50 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch. 6 Example of Heap Construction Construct a heap for the list 2, 9, 7, 6, 5, 8

6-51 Pseudocode of bottom-up heap construction
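The pseudocode image itself is not reproduced in the transcript; the following Python sketch is my reconstruction of the bottom-up method described on slide 6-49 (0-based array, function name mine):

def heap_bottom_up(h):
    """Turn list h (0-based) into a max-heap by fixing parental nodes
    from the last parent down to the root (bottom-up construction)."""
    n = len(h)
    for i in range(n // 2 - 1, -1, -1):    # last parental node is at n//2 - 1
        k, v = i, h[i]
        heap = False
        while not heap and 2 * k + 1 < n:
            j = 2 * k + 1                  # left child
            if j + 1 < n and h[j + 1] > h[j]:
                j += 1                     # pick the larger child
            if v >= h[j]:
                heap = True                # heap condition already holds
            else:
                h[k] = h[j]                # move the larger child up
                k = j
        h[k] = v
    return h

print(heap_bottom_up([2, 9, 7, 6, 5, 8]))  # the slide's example list -> [9, 6, 8, 2, 5, 7]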

6-52 Sorting What do heaps have to do with sorting an array? Here's the neat part: because the binary tree is balanced and left-justified, it can be represented as an array, and all our operations on binary trees can be represented as operations on arrays. To sort:
heapify the array;
while the array isn't empty {
    remove and replace the root;
    reheap the new root node;
}

6-53 Mapping into an array Notice: the left child of index i is at index 2*i+1; the right child of index i is at index 2*i+2. Example: the children of node 3 (19) are node 7 (18) and node 8 (14).

6-54 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch Removing and replacing the root b The “root” is the first element in the array b The “rightmost node at the deepest level” is the last element b Swap them... ...And pretend that the last element in the array no longer exists—that is, the “last index” is 11 (containing the value 9)

6-55 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch Reheap and repeat  Reheap the root node (index 0, containing 11 )... b Remember, though, that the “last” array index is changed b Repeat until the last becomes first, and the array is sorted! And again, remove and replace the root node

6-56 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch. 6 Insertion of a New Element into a Heap b Insert the new element at last position in heap. b Compare it with its parent and, if it violates heap condition, exchange them b Continue comparing the new element with nodes up the tree until the heap condition is satisfied Example: Insert key 10 Efficiency: O(log n)
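A hedged Python sketch of this insertion procedure on a 1-based heap array (helper name is mine):

def heap_insert(h, key):
    """Insert key into a max-heap stored in h[1..n] (h[0] is unused):
    place it at the last position, then swap it with its parent while it is larger."""
    h.append(key)
    i = len(h) - 1
    while i > 1 and h[i // 2] < h[i]:
        h[i // 2], h[i] = h[i], h[i // 2]   # exchange with the parent
        i //= 2

heap = [None, 9, 6, 8, 2, 5, 7]   # the heap built earlier, stored 1-based
heap_insert(heap, 10)             # the slide's example: insert key 10
print(heap[1:])                   # [10, 6, 9, 2, 5, 7, 8]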

6-57 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “ Introduction to the Design & Analysis of Algorithms, ” 2 nd ed., Ch. 6 Stage 1: Construct a heap for a given list of n keys Stage 2: Repeat operation of root removal n-1 times: –Exchange keys in the root and in the last (rightmost) leaf –Decrease heap size by 1 –If necessary, swap new root with larger child until the heap condition holds Heapsort
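Putting the two stages together, a minimal Python heapsort sketch consistent with the description above (0-based array; the sift-down helper and all names are mine):

def heapsort(a):
    """In-place heapsort: build a max-heap, then repeatedly swap the root
    with the last leaf and restore the heap on the shrunk prefix."""
    def sift_down(h, k, n):
        # Repair the heap rooted at index k, considering only h[0..n-1].
        v = h[k]
        while 2 * k + 1 < n:
            j = 2 * k + 1
            if j + 1 < n and h[j + 1] > h[j]:
                j += 1                    # larger child
            if v >= h[j]:
                break
            h[k] = h[j]
            k = j
        h[k] = v

    n = len(a)
    for i in range(n // 2 - 1, -1, -1):   # Stage 1: bottom-up heap construction
        sift_down(a, i, n)
    for end in range(n - 1, 0, -1):       # Stage 2: n-1 root removals
        a[0], a[end] = a[end], a[0]       # move the current maximum into place
        sift_down(a, 0, end)
    return a

print(heapsort([2, 9, 7, 6, 5, 8]))       # [2, 5, 6, 7, 8, 9]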

6-58 Analysis I Here's how the algorithm starts: heapify the array. Heapifying the array: we add each of n nodes. Each node has to be sifted up, possibly as far as the root. Since the binary tree is perfectly balanced, sifting up a single node takes O(log n) time. Since we do this n times, heapifying takes n*O(log n) time, that is, O(n log n) time.

6-59 Analysis II Here's the rest of the algorithm:
while the array isn't empty {
    remove and replace the root;
    reheap the new root node;
}
We do the while loop n times (actually, n-1 times), because we remove one of the n nodes each time. Removing and replacing the root takes O(1) time. Therefore, the total time is n times however long the reheap method takes.

6-60 Analysis III To reheap the root node, we have to follow one path from the root to a leaf node (and we might stop before we reach a leaf). The binary tree is perfectly balanced, so this path is O(log n) long, and we only do O(1) operations at each node. Therefore, reheaping takes O(log n) time. Since we reheap inside a while loop that we do n times, the total time for the while loop is n*O(log n), or O(n log n).

6-61 Analysis IV Here's the algorithm again:
heapify the array;
while the array isn't empty {
    remove and replace the root;
    reheap the new root node;
}
We have seen that heapifying takes O(n log n) time, and the while loop takes O(n log n) time. The total time is therefore O(n log n) + O(n log n), which is the same as O(n log n) time.

6-62 Priority Queue A priority queue is the ADT of a set of elements with numerical priorities, with the following operations: find the element with the highest priority; delete the element with the highest priority; insert an element with an assigned priority. A heap is a very efficient way of implementing priority queues. There are two ways to handle a priority queue in which highest priority = smallest number (for instance, use a min-heap, or store negated keys in a max-heap).

6-63 Horner's Rule For Polynomial Evaluation Given a polynomial of degree n, p(x) = a_n x^n + a_{n-1} x^{n-1} + ... + a_1 x + a_0, and a specific value of x, find the value of p at that point. Two brute-force algorithms:
Algorithm 1:
p ← 0
for i ← n downto 0 do
    power ← 1
    for j ← 1 to i do
        power ← power * x
    p ← p + a_i * power
return p
Algorithm 2:
p ← a_0; power ← 1
for i ← 1 to n do
    power ← power * x
    p ← p + a_i * power
return p

6-64 Horner's Rule Example: p(x) = 2x^4 - x^3 + 3x^2 + x - 5
= x(2x^3 - x^2 + 3x + 1) - 5
= x(x(2x^2 - x + 3) + 1) - 5
= x(x(x(2x - 1) + 3) + 1) - 5
Substitution into the last formula leads to a faster algorithm. The same sequence of computations is obtained by simply arranging the coefficients in a table and proceeding as follows:
coefficients:   2      -1          3           1           -5
x = 3:          2   3*2-1=5   3*5+3=18   3*18+1=55   3*55-5=160
So p(3) = 160.

6-65 Horner's Rule pseudocode Efficiency of Horner's Rule: # multiplications = # additions = n
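The slide's pseudocode is not reproduced in the transcript; here is a short Python sketch consistent with the description above (the coefficient list is ordered from a_n down to a_0, an assumption of mine):

def horner(coeffs, x):
    """Evaluate a polynomial given its coefficients from highest degree to lowest.
    Uses n multiplications and n additions for a polynomial of degree n."""
    p = coeffs[0]
    for c in coeffs[1:]:
        p = p * x + c
    return p

# The slide's example: p(x) = 2x^4 - x^3 + 3x^2 + x - 5 at x = 3
print(horner([2, -1, 3, 1, -5], 3))   # 160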

6-66 Computing a^n (revisited) Left-to-right binary exponentiation: initialize the product accumulator to 1. Scan n's binary expansion from left to right and do the following: if the current binary digit is 0, square the accumulator (S); if the binary digit is 1, square the accumulator and multiply it by a (SM).
Example: compute a^13. Here, n = 13 = 1101 in binary.
binary digits:  1         1           0             1
operations:     SM        SM          S             SM
accumulator:    1^2*a=a   a^2*a=a^3   (a^3)^2=a^6   (a^6)^2*a=a^13   (computed left-to-right)
Efficiency: (b-1) ≤ M(n) ≤ 2(b-1), where b = ⌊log2 n⌋ + 1
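A small Python sketch of the left-to-right scheme (function name mine), useful to check the SM/S trace above:

def power_left_to_right(a, n):
    """Left-to-right binary exponentiation: scan n's bits from the most
    significant to the least; square for every bit, multiply by a on 1-bits."""
    acc = 1
    for bit in bin(n)[2:]:       # bin(13) == '0b1101'
        acc *= acc               # S: square the accumulator
        if bit == '1':
            acc *= a             # M: multiply by a on a 1-bit
    return acc

print(power_left_to_right(2, 13))   # 8192 == 2**13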

6-67 Computing a^n (cont.) Right-to-left binary exponentiation: scan n's binary expansion from right to left and compute a^n as the product of the terms a^(2^i) corresponding to 1's in this expansion.
Example: compute a^13 by right-to-left binary exponentiation. Here, n = 13 = 1101 in binary.
binary digits:   1      1      0      1
a^(2^i) terms:   a^8    a^4    a^2    a
product:         a^8  * a^4         * a  = a^13   (computed right-to-left)
Efficiency: same as that of left-to-right binary exponentiation.
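And the matching right-to-left sketch, which repeatedly squares a while scanning the low-order bits (again my own naming):

def power_right_to_left(a, n):
    """Right-to-left binary exponentiation: accumulate a^(2^i) terms for 1-bits."""
    term, product = a, 1
    while n > 0:
        if n & 1:                # the current lowest bit is 1
            product *= term      # include the a^(2^i) term
        term *= term             # next term: square
        n >>= 1                  # move to the next bit
    return product

print(power_right_to_left(2, 13))   # 8192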

6-68 Problem Reduction This variation of transform-and-conquer solves a problem by transforming it into a different problem for which an algorithm is already available. To be of practical value, the combined time of the transformation and of solving the other problem should be smaller than solving the problem as given by another method.

6-69 Examples of Solving Problems by Reduction computing lcm(m, n) via computing gcd(m, n) (see the sketch below); counting the number of paths of length n in a graph by raising the graph's adjacency matrix to the n-th power; transforming a maximization problem into a minimization problem and vice versa (also, min-heap construction); linear programming; reduction to graph problems (e.g., solving puzzles via state-space graphs).
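A minimal Python sketch of the first reduction, computing lcm via Euclid's gcd (function names are mine):

def gcd(m, n):
    """Euclid's algorithm."""
    while n:
        m, n = n, m % n
    return m

def lcm(m, n):
    """Reduction: lcm(m, n) = m * n / gcd(m, n)."""
    return m * n // gcd(m, n)

print(lcm(24, 60))   # 120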