A. Levitin “Introduction to the Design & Analysis of Algorithms,” 3rd ed., Ch. 6 ©2012 Pearson Education, Inc. Upper Saddle River, NJ. All Rights Reserved.


Transform and Conquer

This group of techniques solves a problem by a transformation to:
- a simpler or more convenient instance of the same problem, e.g. a sorted one (instance simplification)
- a different representation of the same instance, e.g. a different data structure (representation change)
- a different problem for which an algorithm is already available (problem reduction)

Instance Simplification: Presorting

Solve a problem's instance by transforming it into a simpler or easier instance of the same problem.

Many problems involving lists are easier when the list is sorted, e.g.:
- searching
- computing the median (selection problem)
- checking if all elements are distinct (element uniqueness)

Also:
- Topological sorting helps solve some problems for dags.
- Presorting is used in many geometric algorithms, e.g. closest pair.

How Fast Can We Sort?

The efficiency of algorithms that involve sorting depends on the efficiency of the sorting itself.

Theorem (see Sec. 11.2): ⌈log2 n!⌉ ≈ n log2 n comparisons are necessary in the worst case to sort a list of size n by any comparison-based algorithm.

Note: About n log2 n comparisons are also sufficient to sort an array of size n (e.g., by mergesort).

Searching with Presorting

Problem: Search for a given key K in A[0..n-1].

Presorting-based algorithm:
Stage 1: Sort the array by an efficient sorting algorithm.
Stage 2: Apply binary search.

Efficiency: Θ(n log n) + O(log n) = Θ(n log n)

Good or bad? Why do we have our dictionaries, telephone directories, etc. sorted? (A sketch of the two stages follows below.)
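As an illustration (not from the slides), a minimal Python sketch of the two stages using the standard library's sort and the bisect module; the function name presort_search is my own:

    import bisect

    def presort_search(a, key):
        """Stage 1: sort (Theta(n log n)); Stage 2: binary search (O(log n))."""
        a = sorted(a)                      # presort a copy of the list
        i = bisect.bisect_left(a, key)     # leftmost position where key could appear
        return i < len(a) and a[i] == key  # True iff key is present

    # Usage: presort_search([7, 3, 9, 1], 9) -> True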

Element Uniqueness with Presorting

Presorting-based algorithm:
Stage 1: Sort by an efficient sorting algorithm (e.g. mergesort).
Stage 2: Scan the array, checking each pair of adjacent elements.
Efficiency: Θ(n log n) + O(n) = Θ(n log n)

Brute-force algorithm: compare all pairs of elements. Efficiency: O(n^2)

Another two algorithms?
- Hashing — frequently useful
- Closest pair

A sketch of the presorting-based version follows below.
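A minimal Python sketch of the presorting-based uniqueness check (illustrative; the name all_distinct is my own):

    def all_distinct(a):
        """Return True iff no two elements of a are equal (presort, then adjacent scan)."""
        b = sorted(a)                          # Stage 1: Theta(n log n)
        return all(b[i] != b[i + 1]            # Stage 2: one linear pass
                   for i in range(len(b) - 1))

    # Usage: all_distinct([3, 1, 4, 1]) -> False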

Instance Simplification: Gaussian Elimination

Given: a system of n linear equations in n unknowns with an arbitrary coefficient matrix.
Transform to: an equivalent system of n linear equations in n unknowns with an upper-triangular coefficient matrix.
Solve the latter by substitutions, starting with the last equation and moving up to the first one.

a11 x1 + a12 x2 + … + a1n xn = b1
a21 x1 + a22 x2 + … + a2n xn = b2
  …
an1 x1 + an2 x2 + … + ann xn = bn

is transformed into

a11 x1 + a12 x2 + … + a1n xn = b1
          a22 x2 + … + a2n xn = b2
                  …
                      ann xn = bn

(where the coefficients of the triangular system are, in general, new values).

Gaussian Elimination (cont.) [SKIP]

The transformation is accomplished by a sequence of elementary operations on the system's coefficient matrix (which do not change the system's solution):

for i ← 1 to n-1 do
    replace each of the subsequent rows (i.e., rows i+1, …, n) by the difference between that row and an appropriate multiple of the i-th row, so that the new coefficient in the i-th column of that row becomes 0

Example of Gaussian Elimination

Solve
2x1 - 4x2 + x3 = 6
3x1 -  x2 + x3 = 11
 x1 +  x2 - x3 = -3

Gaussian elimination on the augmented matrix:

2  -4    1  |   6
3  -1    1  |  11        row2 - (3/2)*row1
1   1   -1  |  -3        row3 - (1/2)*row1

2  -4    1  |   6
0   5  -1/2 |   2
0   3  -3/2 |  -6        row3 - (3/5)*row2

2  -4    1  |    6
0   5  -1/2 |    2
0   0  -6/5 | -36/5

Backward substitution:
x3 = (-36/5) / (-6/5) = 6
x2 = (2 + (1/2)*6) / 5 = 1
x1 = (6 - 6 + 4*1) / 2 = 2

Pseudocode and Efficiency of Gaussian Elimination [SKIP]

Stage 1: Reduction to an upper-triangular matrix
for i ← 1 to n-1 do
    for j ← i+1 to n do
        for k ← i to n+1 do
            A[j, k] ← A[j, k] - A[i, k] * A[j, i] / A[i, i]   // improve!

Stage 2: Back substitutions
for j ← n downto 1 do
    t ← 0
    for k ← j+1 to n do
        t ← t + A[j, k] * x[k]
    x[j] ← (A[j, n+1] - t) / A[j, j]

Efficiency: Θ(n^3) + Θ(n^2) = Θ(n^3)  (A Python sketch follows below.)
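An illustrative Python translation of the two stages (my own sketch, not from the slides), with the scaling factor hoisted out of the inner loop as the "improve!" comment suggests; there is no partial pivoting, so it assumes nonzero pivots:

    def gauss_solve(A, b):
        """Solve A x = b by forward elimination + back substitution.
        A: n x n list of lists, b: length-n list. Assumes nonzero pivots."""
        n = len(A)
        M = [row[:] + [b[i]] for i, row in enumerate(A)]   # augmented matrix
        for i in range(n - 1):                             # Stage 1: elimination
            for j in range(i + 1, n):
                factor = M[j][i] / M[i][i]                 # compute the multiple once per row
                for k in range(i, n + 1):
                    M[j][k] -= factor * M[i][k]
        x = [0.0] * n                                      # Stage 2: back substitution
        for j in range(n - 1, -1, -1):
            t = sum(M[j][k] * x[k] for k in range(j + 1, n))
            x[j] = (M[j][n] - t) / M[j][j]
        return x

    # Usage: gauss_solve([[2,-4,1],[3,-1,1],[1,1,-1]], [6,11,-3]) -> [2.0, 1.0, 6.0]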

Searching Problem

Problem: Given a (multi)set S of keys and a search key K, find an occurrence of K in S, if any.

Searching must be considered in context (the context determines the best data structure):
- file size (internal vs. external)
- dynamics of the data (static vs. dynamic)

Dictionary operations (dynamic data; the data structure determines performance):
- find (search)
- insert
- delete

Taxonomy of Searching Algorithms

- List searching:
  - sequential search
  - binary search
  - interpolation search [SKIP]
- Tree searching:
  - binary search tree
  - balanced binary trees: AVL trees [SKIP], red-black trees (considered later)
  - balanced multiway trees: 2-3 trees (Chap 6); 2-3-4 trees and B-trees (Chap 7)
- Hashing:
  - open hashing (separate chaining)
  - closed hashing (open addressing)

Binary Search Tree

Arrange the keys in a binary tree with the binary search tree property: for every node containing key K, all keys in its left subtree are less than K and all keys in its right subtree are greater than K.

Example: 5, 3, 1, 10, 12, 7, 9. (Which key could/should be the root? Best/worst case?)

Dictionary Operations on Binary Search Trees

Searching: straightforward.
Insertion: search for the key, then insert it at the leaf where the search terminated.
Deletion: 3 cases:
- deleting a key at a leaf
- deleting a key at a node with a single child
- deleting a key at a node with two children

Efficiency depends on the tree's height: ⌊log2 n⌋ ≤ h ≤ n-1, with average height (for random files) about 3 log2 n.
What is the maximum number of nodes for height k?

Thus all three operations have
- worst-case efficiency: Θ(n)
- average-case efficiency: Θ(log n)

Bonus: an inorder traversal produces a sorted list. (A search/insert sketch follows below.)
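A minimal, illustrative Python sketch of BST search and insertion (the class and function names are my own; deletion is omitted):

    class Node:
        def __init__(self, key):
            self.key, self.left, self.right = key, None, None

    def bst_search(root, key):
        """Return the node containing key, or None."""
        while root is not None and root.key != key:
            root = root.left if key < root.key else root.right
        return root

    def bst_insert(root, key):
        """Insert key and return the (possibly new) root of the subtree."""
        if root is None:
            return Node(key)
        if key < root.key:
            root.left = bst_insert(root.left, key)
        elif key > root.key:
            root.right = bst_insert(root.right, key)
        return root            # duplicate keys are ignored in this sketch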

Balanced Search Trees

The attractiveness of the binary search tree is marred by its bad (linear) worst-case efficiency. Two ideas to overcome this (i.e., to keep the BST close to balanced) are:
- restructure the BST to maintain balance whenever an insertion makes the tree "too unbalanced":
  - AVL trees
  - red-black trees (come back to these later)
- allow more than one key per node of the search tree:
  - 2-3 trees [Chapter 6]
  - 2-3-4 trees
  - B-trees [Chapter 7]
(A red-black tree is a 2-3-4 tree represented as a binary tree.)

Balanced Trees: AVL Trees

Definition: An AVL tree is a binary search tree in which, for every node, the difference between the heights of its left and right subtrees, called the balance factor, is at most 1 (with the height of an empty tree defined as -1).

Tree (a) is an AVL tree; tree (b) is not an AVL tree. (Figure not included in this transcript.)

Rotations

If a key insertion violates the balance requirement at some node, the subtree rooted at that node is transformed via one of four rotations. (The rotation is always performed for the subtree rooted at the "unbalanced" node closest to the new leaf.)

Single R-rotation; double LR-rotation. (Figures not included in this transcript; a small rotation sketch follows below.)
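A tiny illustrative sketch (my own, using nodes with left/right attributes like the Node class in the BST sketch above) of the single rotations; it only rewires pointers and does not update any stored heights or balance factors:

    def rotate_right(y):
        """Single R-rotation: promote y's left child x; y adopts x's former right subtree."""
        x = y.left
        y.left = x.right
        x.right = y
        return x               # x is the new root of this subtree

    def rotate_left(x):
        """Single L-rotation (mirror image of rotate_right)."""
        y = x.right
        x.right = y.left
        y.left = x
        return y

    # A double LR-rotation at node y: y.left = rotate_left(y.left), then rotate_right(y).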

General case: single R-rotation [SKIP] (figure not included in this transcript).

General case: double LR-rotation [SKIP] (figure not included in this transcript).

AVL Tree Construction: An Example [SKIP]

Construct an AVL tree for the list 5, 6, 8, 3, 2, 4, 7. (Construction figures not included in this transcript.)

Analysis of AVL Trees

- Height: h < 1.4405 log2(n + 2) - 1.3277, so h ∈ Θ(log n); the average height is about 1.01 log2 n for large n (found empirically)
- Search and insertion are O(log n)
- Deletion is more complicated but is also O(log n)
- Disadvantages: frequent rotations; complexity
- A similar idea: red-black trees (the heights of subtrees are allowed to differ by up to a factor of 2)

Multiway Search Trees

Definition: A multiway search tree is a search tree that allows more than one key in its nodes.

Definition: A node of a search tree is called an n-node if it contains n-1 ordered keys k1 < k2 < … < k(n-1), which divide the entire key range into n intervals pointed to by the node's n links to its children: keys less than k1, keys in [k1, k2), …, keys greater than or equal to k(n-1).

Note: Every node in a classical binary search tree is a 2-node.

2-3 Tree

Definition: A 2-3 tree is a search tree that
- may have 2-nodes and 3-nodes (how many keys per node?)
- is height-balanced (all leaves are on the same level!)

A 2-3 tree is constructed by successive insertions of the given keys, with a new key always inserted into a leaf of the tree. If the leaf is a 3-node, it is split into two, with the middle key promoted to the parent. Splitting can propagate up the tree, as needed.

2-3 Tree Construction: An Example

Construct a 2-3 tree for the list 9, 5, 8, 3, 2, 4, 7. Which keys end up in the leaves?

Analysis of 2-3 Trees

- Consider: the shortest 2-3 tree with 8 keys (add 1, 4, 10 to the tree [3,8 / 2 - 5 - 9]) and the tallest 2-3 tree with 7 keys (which tree is it?)
- log3(n + 1) - 1 ≤ h ≤ log2(n + 1) - 1. Does this hold for the examples above? What is h?
- Search, insertion, and deletion are in Θ(log n)
- The idea of the 2-3 tree can be generalized by allowing more keys per node:
  - 2-3-4 trees
  - B-trees

Heaps and Heapsort

Definition: A heap is a binary tree with keys at its nodes (one key per node) such that:
- It is essentially complete, i.e., all its levels are full except possibly the last level, where only some rightmost keys may be missing
- The key at each node is ≥ all keys in its children (and descendants)

Illustration of the Heap's Definition

(Figures of a heap and a non-heap not included in this transcript.)

Note: A heap's elements are ordered top down (along any path down from the root), but they are not ordered left to right.

Some Important Properties of a Heap

- Given n, there exists a unique binary tree structure with n nodes that is essentially complete; its height is h = ⌊log2 n⌋
- The root contains the largest key
- Every subtree rooted at any node of a heap is also a heap
- A heap can easily be represented as an array (and usually is). Look at the following example, and then explain why the array representation always works.

Heap's Array Representation

Store the heap's elements in an array (whose elements are indexed, for convenience, 1 to n) in top-down, left-to-right order. Example: (array figure not included in this transcript).

- The left child of node j is at 2j
- The right child of node j is at 2j+1
- The parent of node j is at ⌊j/2⌋
- Parental (i.e., interior) nodes occupy the first ⌊n/2⌋ positions

(An index-arithmetic sketch follows below.)
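A small illustrative Python helper for the 1-based index arithmetic (the names are my own; position 0 of the list is left unused so the formulas match the slide):

    def parent(j):  return j // 2          # parent of node j (for j >= 2)
    def left(j):    return 2 * j           # left child of node j
    def right(j):   return 2 * j + 1       # right child of node j

    heap = [None, 9, 6, 8, 2, 5, 7]        # heap of size 6 stored in positions 1..6
    n = len(heap) - 1
    interior = list(range(1, n // 2 + 1))  # parental nodes: positions 1..n//2 -> [1, 2, 3]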

Heap Construction (Bottom-Up)

Step 0: Initialize the structure (i.e., the array) with the keys in the order given.
Step 1: Starting with the last (rightmost) parental (i.e., interior) node, fix the heap rooted at it if it doesn't satisfy the heap condition: keep exchanging it with its largest child until the heap condition holds. (Step 1 is often called "heapify".)
Step 2: Repeat Step 1 for the preceding parental node.

Bottom up: nodes are added to the heap from bottom to top, pushing elements down as needed.

Example of Heap Construction (Bottom-Up)

Construct a heap for the list 2, 9, 7, 6, 5, 8. Insert the elements into the array, then process the interior nodes from right to left. Why that order? What holds before and after each step? Heapify pushes a key all the way down at each step. Why is this "bottom up"? What is the worst case? How many comparisons per node?

Pseudocode of Bottom-Up Heap Construction

(The HeapBottomUp pseudocode figure is not included in this transcript; a sketch follows below.)
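An illustrative Python version of bottom-up heap construction under the same 1-based, max-heap conventions as the slides (the function names are my own):

    def sift_down(h, i, n):
        """Restore the heap condition for the subtree rooted at position i (heap size n)."""
        v = h[i]
        while 2 * i <= n:                   # while node i has at least a left child
            j = 2 * i
            if j < n and h[j + 1] > h[j]:   # pick the larger child
                j += 1
            if v >= h[j]:
                break
            h[i] = h[j]                     # move the larger child up
            i = j
        h[i] = v

    def heap_bottom_up(h, n):
        """Turn h[1..n] into a max-heap by fixing interior nodes right to left."""
        for i in range(n // 2, 0, -1):
            sift_down(h, i, n)

    # Usage: h = [None, 2, 9, 7, 6, 5, 8]; heap_bottom_up(h, 6)  ->  h[1:] == [9, 6, 8, 2, 5, 7]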

Insert a New Element into a Heap

- This is the basic operation for building a heap in top-down order
- Insert the new element at the last position in the heap (size++)
- Compare the new element with its parent and, if it violates the heap condition, exchange them
- Continue comparing the new element with nodes up the tree until the heap condition is satisfied

Example: insert key 10 into a heap of size 6. What order is this? Efficiency: O(log n). (A sift-up sketch follows below.)
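A matching illustrative sketch of insertion with sifting up (same 1-based conventions as above; heap_insert is my own name):

    def heap_insert(h, key):
        """Append key to the 1-based max-heap h and sift it up. O(log n)."""
        h.append(key)
        i = len(h) - 1                          # position of the new element
        while i > 1 and h[i // 2] < h[i]:       # parent violates the heap condition
            h[i // 2], h[i] = h[i], h[i // 2]   # swap with the parent
            i //= 2

    # Usage: h = [None, 9, 6, 8, 2, 5, 7]; heap_insert(h, 10)  ->  h[1:] == [10, 6, 9, 2, 5, 7, 8]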

Example of Heap Construction (Top-Down)

Construct a heap for the list 2, 9, 7, 6, 5, 8. Start with an empty heap, process the keys from left to right, adding each node at the bottom and sifting it up as needed. Successive heap states (tree levels separated by /):

2;  9 / 2;  9 / 2 7;  9 / 6 7 / 2;  9 / 6 7 / 2 5;  9 / 6 7 / 2 5 8  →  9 / 6 8 / 2 5 7

Is the result a heap?

Heapsort

Stage 1: Construct a heap for the given list of n keys.
Stage 2: Repeat the root-removal operation n-1 times:
- Exchange the keys in the root and in the last (rightmost) leaf
- Decrease the heap size by 1
- If necessary, repeatedly swap the new root with its larger child until the heap condition again holds

(A sketch combining the two stages follows below.)
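An illustrative heapsort built from the two sketches above (heapsort_inplace is my own name; it assumes sift_down and heap_bottom_up from the earlier sketch are in scope):

    def heapsort_inplace(h, n):
        """Sort h[1..n] in place: Stage 1 builds the heap, Stage 2 removes the root n-1 times."""
        heap_bottom_up(h, n)                    # Stage 1: Theta(n)
        for size in range(n, 1, -1):
            h[1], h[size] = h[size], h[1]       # move the current maximum to its final place
            sift_down(h, 1, size - 1)           # restore the heap on the remaining keys
        return h

    # Usage: heapsort_inplace([None, 2, 9, 7, 6, 5, 8], 6)[1:] == [2, 5, 6, 7, 8, 9]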

Example of Sorting by Heapsort

Sort the list 2, 9, 7, 6, 5, 8 by heapsort.

Stage 1 (heap construction):
2 9 7 6 5 8
2 9 8 6 5 7
2 9 8 6 5 7
9 2 8 6 5 7
9 6 8 2 5 7

Stage 2 (root/max removal; sorted keys appear after the bar):
9 6 8 2 5 7
7 6 8 2 5 | 9
8 6 7 2 5 | 9
5 6 7 2 | 8 9
7 6 5 2 | 8 9
2 6 5 | 7 8 9
6 2 5 | 7 8 9
5 2 | 6 7 8 9
2 | 5 6 7 8 9

Result: 2 5 6 7 8 9

Analysis of Heapsort

Stage 1: Build a heap for the given list of n keys (of height h). Worst-case number of comparisons:
C(n) = Σ for i = 0 to h-1 of 2(h - i) · 2^i = 2(n - log2(n + 1)) ∈ Θ(n)
(2^i is the number of nodes at level i.)

Stage 2: Repeat the root-removal operation n-1 times (fixing the heap each time). Worst-case number of comparisons:
C(n) ≤ Σ for i = 1 to n-1 of 2⌊log2 i⌋ ∈ Θ(n log n)

Both worst-case and average-case efficiency: Θ(n log n). In-place: yes. Stability: no (e.g., apply heapsort to 1 1).

Priority Queue

A priority queue is the ADT of a set of elements with numerical priorities, with the following operations:
- find the element with the highest priority
- delete the element with the highest priority
- insert an element with an assigned priority

- A heap is a very efficient way to implement a priority queue
- There are two ways to handle a priority queue in which the highest priority = the smallest number (see the sketch below)
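An illustrative use of Python's standard heapq module, which implements a min-heap; when the highest priority is the smallest number it can be used directly, and one common way to get max-heap behavior is to negate the keys (the variable names are my own):

    import heapq

    pq = []                                  # min-priority queue: smallest number = highest priority
    for priority, task in [(3, "write"), (1, "plan"), (2, "review")]:
        heapq.heappush(pq, (priority, task))
    print(heapq.heappop(pq))                 # -> (1, 'plan'): the highest-priority element

    max_pq = []                              # max-heap behavior via negated keys
    for key in [2, 9, 7, 6, 5, 8]:
        heapq.heappush(max_pq, -key)
    print(-heapq.heappop(max_pq))            # -> 9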

Horner's Rule for Polynomial Evaluation

Given a polynomial of degree n
p(x) = a_n x^n + a_(n-1) x^(n-1) + … + a_1 x + a_0
and a specific value of x, find the value of p at that point.

Two brute-force algorithms:

p ← 0
for i ← n downto 0 do
    power ← 1
    for j ← 1 to i do
        power ← power * x
    p ← p + a_i * power
return p

p ← a_0; power ← 1
for i ← 1 to n do
    power ← power * x
    p ← p + a_i * power
return p

Horner's Rule

Example: p(x) = 2x^4 - x^3 + 3x^2 + x - 5
             = x(2x^3 - x^2 + 3x + 1) - 5
             = x(x(2x^2 - x + 3) + 1) - 5
             = x(x(x(2x - 1) + 3) + 1) - 5

Substitution into the last formula leads to a faster algorithm.

The same sequence of computations is obtained by simply arranging the coefficients in a table and proceeding as follows:

coefficients:   2    -1     3     1    -5
x = 3:          2     5    18    55   160

so p(3) = 160.

Horner's Rule Pseudocode

(The Horner pseudocode figure is not included in this transcript; a sketch follows below.)

Efficiency of Horner's rule: # multiplications = # additions = n

Horner's rule also performs synthetic division of p(x) by (x - x0).
Example: Let p(x) = 2x^4 - x^3 + 3x^2 + x - 5. Find p(x) ÷ (x - 3): the row 2, 5, 18, 55 computed above gives the quotient 2x^3 + 5x^2 + 18x + 55, and 160 is the remainder.
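An illustrative Python version of Horner's rule (the function name horner is my own; coefficients are given from the highest degree down):

    def horner(coeffs, x):
        """Evaluate a polynomial at x by Horner's rule.
        coeffs = [a_n, ..., a_1, a_0]; uses n multiplications and n additions."""
        p = coeffs[0]
        for a in coeffs[1:]:
            p = p * x + a
        return p

    # Usage: horner([2, -1, 3, 1, -5], 3) -> 160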

Computing a^n (Revisited): Left-to-Right Binary Exponentiation

Initialize the product accumulator to 1. Scan n's binary expansion from left to right and do the following: if the current binary digit is 0, square the accumulator (S); if the digit is 1, square the accumulator and multiply it by a (SM).

Example: compute a^13. Here n = 13 = 1101 in binary.

binary digits of 13:   1          1            0              1
operations:            SM         SM           S              SM
accumulator:           1^2·a = a  a^2·a = a^3  (a^3)^2 = a^6  (a^6)^2·a = a^13

(computed left to right)

Efficiency: b ≤ M(n) ≤ 2b, where b = ⌊log2 n⌋ + 1. (A sketch follows below.)
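An illustrative left-to-right binary exponentiation in Python (power_ltr is my own name; bin(n)[2:] yields the binary digits from the most significant bit):

    def power_ltr(a, n):
        """Compute a**n by scanning n's binary expansion left to right."""
        acc = 1
        for bit in bin(n)[2:]:       # most significant bit first
            acc *= acc               # S: square the accumulator
            if bit == '1':
                acc *= a             # M: multiply by a on a 1-bit
        return acc

    # Usage: power_ltr(2, 13) -> 8192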

Computing a^n (cont.): Right-to-Left Binary Exponentiation

Scan n's binary expansion from right to left and compute a^n as the product of the terms a^(2^i) corresponding to the 1's in this expansion.

Example: compute a^13 by right-to-left binary exponentiation. Here n = 13 = 1101 in binary.

binary digits of 13:   1      1      0      1
a^(2^i) terms:         a^8    a^4    a^2    a
product:               a^8  ·  a^4      ·   a  =  a^13

(computed right to left)

Efficiency: same as that of left-to-right binary exponentiation. (A sketch follows below.)
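A matching illustrative right-to-left version (power_rtl is my own name), which squares a running term and multiplies it into the product on each 1-bit:

    def power_rtl(a, n):
        """Compute a**n by scanning n's binary expansion right to left."""
        term, product = a, 1         # term holds a^(2^i) for the current bit i
        while n > 0:
            if n & 1:                # current (rightmost) bit is 1
                product *= term
            term *= term             # advance to a^(2^(i+1))
            n >>= 1
        return product

    # Usage: power_rtl(2, 13) -> 8192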

Problem Reduction

This variation of transform-and-conquer solves a problem by transforming it into a different problem for which an algorithm is already available.

Reduce P to Q = use a solution of Q as part of a solution to P: to solve P, first solve Q and then use the solution to Q in the solution to P, or equivalently, to solve P, write an algorithm that calls an algorithm for Q.

To be of practical value, the combined time of the transformation and of solving the other problem should be smaller than solving the problem as given by another method (e.g., by brute force).

VERY IMPORTANT FOR P = NP (last topic in the course).

Examples of Solving Problems by Reduction

- computing lcm(m, n) via computing gcd(m, n)
- …

Let's examine lcm before looking at more examples.

Problem Reduction: lcm via gcd

Reduce P to Q = use a solution of Q as part of a solution to P: to solve P, first solve Q, then use the solution to Q in the solution to P (equivalently, to solve P, write an algorithm that calls Q).

Here P = lcm and Q = gcd. Reduce lcm to gcd:
lcm(x, y) = (x * y) / gcd(x, y)
lcm(60, 42) = (60 * 42) / gcd(60, 42) = 2520 / 6 = 420
Check: 60 = 2*2*3*5, 42 = 2*3*7, gcd = 2*3 = 6, lcm = 2*2*3*5*7 = 420.

Assume * and / take constant time (i.e., Θ(1)). Compare the performance of the best algorithms for P and Q.

Problem Reduction: Best Algorithm Performance

P = lcm, Q = gcd. Reduce lcm to gcd:
lcm(x, y) = (x * y) / gcd(x, y); lcm(60, 42) = (60 * 42) / gcd(60, 42) = 2520 / 6 = 420

Assume the reduction steps are faster than Q (e.g., * and / are Θ(1)) and that the best algorithm for gcd is used. Are these statements true or false?
- The best algorithm for lcm can be faster than the best algorithm for gcd.
- The best algorithm for gcd can be faster than the best algorithm for lcm.

Notation: P ≤ Q means P reduces to Q; here lcm ≤ gcd. (A sketch of the reduction follows below.)
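An illustrative Python sketch of the reduction, calling Euclid's algorithm for gcd (the function names are my own):

    def gcd(m, n):
        """Euclid's algorithm."""
        while n != 0:
            m, n = n, m % n
        return m

    def lcm(m, n):
        """Reduce lcm to gcd: lcm(m, n) = m * n / gcd(m, n)."""
        return m * n // gcd(m, n)

    # Usage: lcm(60, 42) -> 420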

More Examples of Reduction

- Reduce P = lcm(m, n) to Q = gcd(m, n), as we have just seen.
- Reduce finding the median to what problem, followed by what step? What is the performance of the best algorithm for the median, and of the best algorithm for that problem? (Median ≤ Sorting: sort, then take the middle element.)
- Reduce finding the minimum to what problem, followed by what step? What is the performance of the best algorithm for the minimum, and of the best algorithm for that problem? (Min ≤ Sorting: sort, then take the first element.)

More Examples of Reduction (cont.)

- reduction to graph problems (e.g., solving puzzles via state-space graphs)
- transforming a maximization problem into a minimization problem and vice versa
- SKIP:
  - linear programming
  - counting the number of paths of length n in a graph by raising the graph's adjacency matrix to the n-th power