1
Analysis & Design of Algorithms (CSCE 321)
Prof. Amr Goneid, Department of Computer Science, AUC
Part 6. Transform & Conquer Algorithms
2
Transform & Conquer
3
Transform & Conquer
Transform & Conquer
Pre-Sorting
Heaps and Heap Sort
The Selection Problem
Polynomial Evaluation
Computing Spans
Counting Paths in a Graph
4
1. Transform & Conquer
This group of techniques solves a problem by a transformation:
to a simpler/more convenient instance of the same problem (instance simplification)
to a different representation of the same instance (representation change)
to a different problem for which an algorithm is already available (problem reduction)
5
2. Pre-Sorting
Pre-Sorting can be used for instance simplification.
Example: Checking element uniqueness in an array.
A brute-force algorithm compares every pair of elements.
Analysis: the worst case number of comparisons is T(n) = n(n-1)/2 = O(n^2).
6
Pre-Sorting
Another algorithm would first sort the array and then check for equal adjacent elements (duplicates). An efficient sorting algorithm (like quicksort) has a complexity of O(n log n):

Algorithm UniqueElements2 (A[0..n-1])
  Sort (A)
  for i = 0 to n-2
    if A[i] = A[i+1] return false
  return true

Worst case number of comparisons is T(n) = n log n + (n-1) = O(n log n)
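As a concrete illustration, here is a minimal C++ sketch of the presorted uniqueness check; the function name uniqueElements2 and the use of std::sort in place of a hand-written quicksort are choices of this sketch, not the course code.

#include <algorithm>
#include <iostream>
#include <vector>

// Presorting version: sort first, then scan adjacent pairs.
// Cost: O(n log n) for the sort + O(n) for the scan = O(n log n).
bool uniqueElements2(std::vector<int> a) {
    std::sort(a.begin(), a.end());            // O(n log n)
    for (std::size_t i = 0; i + 1 < a.size(); ++i)
        if (a[i] == a[i + 1]) return false;   // duplicates end up adjacent
    return true;
}

int main() {
    std::cout << uniqueElements2({7, 3, 9, 1, 5}) << "\n";   // 1 (all distinct)
    std::cout << uniqueElements2({7, 3, 9, 3, 5}) << "\n";   // 0 (3 appears twice)
}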
7
Pre-Sorting
Another example is searching. Linear search in an array costs O(n) comparisons. Pre-sorting allows the use of binary search, with a complexity of O(log n) per search. If we sort once and search many times, pre-sorting is justified.
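A small sketch of the sort-once, search-many idea; std::sort and std::binary_search stand in here for the course's own routines.

#include <algorithm>
#include <iostream>
#include <vector>

int main() {
    std::vector<int> a = {42, 7, 19, 3, 28, 11};
    std::sort(a.begin(), a.end());                  // pre-sort once: O(n log n)
    for (int key : {19, 20})                        // each later search is O(log n)
        std::cout << key
                  << (std::binary_search(a.begin(), a.end(), key)
                      ? " found\n" : " not found\n");
}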
8
3. Heaps and Heap Sort
(a) Priority Queues and the Heap
Definition of Priority Queue: A Priority Queue (PQ) is a set with the operations:
• Insert an element
• Remove and return the smallest / largest element
9
Applications
Priority queues enable us to retrieve items not by the insertion time (as in a stack or queue), nor by a key match (as in a dictionary), but by which item has the highest priority of retrieval. Priority queues are used to maintain schedules and calendars. They govern who goes next in simulations of airports, parking lots, and the like. One famous application is an efficient sorting method called Heap Sort.
10
The Heap
The Binary Heap is an array implementation of a PQ.
The heap is visualized as a binary tree with the following properties:
Property (1): Partially Balanced: the heap is as close to a complete binary tree as possible. If there are missing leaves, they will be in the last level, at the far right.
11
The Heap as a Complete Tree
[Figure: a complete binary tree with nodes numbered 1 to 10 in level order; any missing leaves appear only in the last level, at the far right]
12
The Heap as a Complete Tree
Property (2): Heap Condition: The value in each parent is ≤ the values in its children.
[Figure: example min-heap with level-order values 1, 3, 7, 5, 7, 18, 9, 16, 7, 8 at nodes 1 to 10]
13
The Heap as a Complete Tree
Property (3): The Heap array: the heap array contains the level order traversal of the tree.
[Figure: the same example tree, whose level-order traversal is 1, 3, 7, 5, 7, 18, 9, 16, 7, 8]
14
The Heap Array
Parent at location (i), children at locations (2i) and (2i + 1).
Parent of the child at (i) is at location (i / 2) (integer division).
The heap condition is: a[i] ≤ a[2*i] && a[i] ≤ a[2*i+1]
Example heap array: {1, 3, 7, 5, 7, 18, 9, 16, 7, 8}
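This index arithmetic can be exercised directly; the following small sketch (the helper name isMinHeap is my own) verifies the heap condition for the example array above, stored 1-based with index 0 unused.

#include <iostream>
#include <vector>

// Check the min-heap condition a[i] <= a[2i] and a[i] <= a[2i+1]
// for a 1-based heap array of size n (index 0 is unused).
bool isMinHeap(const std::vector<int>& a, int n) {
    for (int i = 1; 2 * i <= n; ++i) {
        if (a[i] > a[2 * i]) return false;
        if (2 * i + 1 <= n && a[i] > a[2 * i + 1]) return false;
    }
    return true;
}

int main() {
    std::vector<int> a = {0, 1, 3, 7, 5, 7, 18, 9, 16, 7, 8};  // index 0 unused
    std::cout << isMinHeap(a, 10) << "\n";                     // prints 1
}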
15
The Heap Array
The binary heap data structure supports both insertion and extract-min in O(log n) time each. The minimum key is always at the root of the heap. New keys can be inserted by placing them at an open leaf and up-heaping them until they sit at their proper place in the partial order.
16
Insert in The Heap
Insert array X[ ] = {2, 4, 6, 5, 3, 1} into a heap.
Resulting heap = {1, 3, 2, 5, 4, 6}
[Figure: heap snapshots after insert 2, insert 4, insert 6, 5, 3, and insert 1]
17
Insertion Algorithm
insert (v)
{
  increment heap size (N)
  insert element (v) at end (first rightmost empty leaf)
  upheap element from location (N) if necessary
}
18
UpHeap Algorithm
// Upheap element at location (k) in a minimum heap
// as long as it is less than or equal to its parent
// Assume a[0] = -∞
upheap (k)
{
  copy a[k] into v;
  while (parent of a[k] >= v)
  {
    move parent to child's location;
    set k to point to the old parent location (i.e. k ← k/2);
  }
  copy v back into a[k];
}
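A hedged C++ sketch of insert plus upheap on a 1-based array; the struct name MinHeap and the k > 1 guard used in place of the a[0] = -∞ sentinel are my own choices. Inserting X[ ] = {2, 4, 6, 5, 3, 1} reproduces the heap {1, 3, 2, 5, 4, 6} from the earlier slide.

#include <iostream>
#include <vector>

// Minimal 1-based min-heap supporting insert + upheap.
struct MinHeap {
    std::vector<int> a{0};                 // a[0] unused (sentinel position)
    void insert(int v) {
        a.push_back(v);                    // place at first empty leaf
        int k = (int)a.size() - 1;
        while (k > 1 && a[k / 2] >= v) {   // upheap: move parents down
            a[k] = a[k / 2];
            k /= 2;
        }
        a[k] = v;                          // copy v back into its place
    }
};

int main() {
    MinHeap h;
    for (int v : {2, 4, 6, 5, 3, 1}) h.insert(v);
    for (std::size_t i = 1; i < h.a.size(); ++i) std::cout << h.a[i] << " ";
    std::cout << "\n";                     // expected: 1 3 2 5 4 6
}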
19
Remove from The Heap
Input heap = {1, 3, 2, 5, 4, 6}, Output = {1, 2, 3, 4, 5, 6}
[Figure: heap snapshots after each removal of the minimum]
20
Removal Algorithm
// Remove and return top of the heap, then adjust heap.
remove ( )
{
  copy top of heap (a[1]) into v;
  replace a[1] by last element (a[N]);
  decrement heap size;
  downheap a[1] to its proper location if necessary;
  return v;
}
21
DownHeap Algorithm
// Downheap element at (k) in the heap
downheap (k)
{
  copy element a[k] to v;
  find location (j) of its left child;
  while (left child a[j] exists)
  {
    if ((there is a right child) && (left child > right child)) point j to the right child;
    if (v <= child) break;
    copy child to parent;
    move j down to the new left child;
  }
  copy v back into the final parent location;
}
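A corresponding C++ sketch of remove plus downheap, written as free functions on a 1-based vector (the names downheap and removeMin are mine). Starting from the heap {1, 3, 2, 5, 4, 6} of the earlier slide, repeated removal prints 1 2 3 4 5 6.

#include <iostream>
#include <vector>

// Downheap the element at position k in a 1-based min-heap of size n.
void downheap(std::vector<int>& a, int n, int k) {
    int v = a[k];
    int j = 2 * k;                              // left child
    while (j <= n) {
        if (j < n && a[j] > a[j + 1]) ++j;      // pick the smaller child
        if (v <= a[j]) break;                   // heap condition restored
        a[k] = a[j];                            // move child up
        k = j;
        j = 2 * k;
    }
    a[k] = v;
}

// Remove and return the minimum (the root), then fix the heap.
int removeMin(std::vector<int>& a, int& n) {
    int v = a[1];
    a[1] = a[n--];                              // move last element to the root
    if (n > 0) downheap(a, n, 1);
    return v;
}

int main() {
    std::vector<int> a = {0, 1, 3, 2, 5, 4, 6}; // index 0 unused
    int n = 6;
    while (n > 0) std::cout << removeMin(a, n) << " ";   // 1 2 3 4 5 6
    std::cout << "\n";
}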
23
Analysis of PQ Operations
Worst case cost of insertion: This happens when the data are inserted in descending order into a minimum heap. Consider inserting such a sequence of (n) integers and take T(n) to be the number of comparisons used in the upheap process. Let us find T(n) for each of the following sequences:
1. (1) (a complete tree of height h = 1)
2. (3,2,1) (a complete tree with h = 2)
3. (7,6,5,4,3,2,1) (a complete tree with h = 3)
24
Analysis of insertion
There is always a comparison of the root with itemMin, so that:
n    h    Levels      T(h)
1    1    1           1*1 = 1*2^0 = 1
3    2    1, 2        T(1) + 2*2 = 5
7    3    1, 2, 3     T(2) + 3*4 = 17
25
Analysis of insertion
Worst case number of comparisons in insertion:
T(h) = T(h-1) + h*2^(h-1), which solves to T(h) = (h-1)*2^h + 1.
For a complete tree, n = 2^h - 1, so T(n) ≈ (n+1)(log2(n+1) - 1) + 1 = O(n log n).
26
Analysis of PQ Operations
upheap:            O(h) = O(log n)
downheap:          O(h) = O(log n)
insert (n) items:  O(n log n) or O(n)*
remove (n) items:  O(n log n)
* see Heapify algorithm
27
Building the heap bottom-up
Robert Floyd (1964) found a better way to build the heap, using a merge procedure called heapify. Consider a node (i) in the heap with children (L) and (R), and consider (L) and (R) to be roots of proper heaps. If the tree rooted at (i) violates the heap property, we let the parent node trickle down so that the subtree rooted at (i) satisfies the heap property.
28
Heapify Algorithm
Heapify (A[1..n], n, i)
  left = 2i
  right = 2i+1
  if ((left ≤ n) and (A[left] > A[i])) then max = left
  else max = i
  if ((right ≤ n) and (A[right] > A[max])) then max = right
  if (max ≠ i) then
    swap (A[i], A[max])
    Heapify (A, n, max)

Notice that in the worst case Heapify will take T(n) = O(log n)
29
Building the Heap
Given an unsorted array A, we can make it a heap using the Heapify algorithm. We know that all elements at positions n/2+1 .. n are already heaps (they are leaves), so we apply Heapify only on elements at positions n/2 down to 1:
Build-MaxHeap (A[1..n], n)
  for i = n/2 downto 1 do Heapify (A, n, i)
Since Heapify costs O(log n), what is the advantage? Assuming a full tree, the number of nodes on a level (L) is 2^(L-1) and the total number of nodes is n = 2^h - 1. The worst-case number of heap adjustments is (h - L) at level (L).
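A C++ sketch of Heapify and Build-MaxHeap on a 1-based array, following the pseudocode above; the input values in main are illustrative only.

#include <iostream>
#include <vector>

// Heapify: let A[i] trickle down so the subtree rooted at i is a max-heap,
// assuming its left and right subtrees already are (1-based array, size n).
void heapify(std::vector<int>& A, int n, int i) {
    int left = 2 * i, right = 2 * i + 1, max = i;
    if (left <= n && A[left] > A[i]) max = left;
    if (right <= n && A[right] > A[max]) max = right;
    if (max != i) {
        std::swap(A[i], A[max]);
        heapify(A, n, max);
    }
}

// Floyd's bottom-up construction: only positions n/2 .. 1 need heapify.
void buildMaxHeap(std::vector<int>& A, int n) {
    for (int i = n / 2; i >= 1; --i) heapify(A, n, i);
}

int main() {
    std::vector<int> A = {0, 4, 1, 3, 2, 16, 9, 10, 14, 8, 7};  // index 0 unused
    buildMaxHeap(A, 10);
    for (int i = 1; i <= 10; ++i) std::cout << A[i] << " ";     // a valid max-heap
    std::cout << "\n";
}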
30
Analysis
The worst case cost of building the heap this way will be:
T(n) = sum over levels L = 1..h of 2^(L-1) * (h - L) = 2^h - h - 1 < n
Hence, using Heapify requires only O(n) operations instead of O(n log n).
31
Priority Queue Class
The CSCE 321 course web site contains full definitions and implementations of:
PQ class: Priority Queue (minimum heap) class with dynamic array representation
32
(b) Heap Sort Algorithm
The heap sort is a sorting algorithm based on Priority Queues. The idea is to insert all array elements into a minimum heap, then remove the top of the heap one by one back into the array.
33
HeapSort Algorithm V1
// To sort an array X[ ] of n elements
heapsort (X[1..n], n)
{
  int i;
  PQ<type> Heap(n);
  for (i = 1 to n) Heap.insert(X[i]);
  for (i = 1 to n) X[i] = Heap.remove();
}
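Since the course's PQ class is not reproduced here, the sketch below expresses the same V1 idea with std::priority_queue using std::greater as a minimum heap.

#include <functional>
#include <iostream>
#include <queue>
#include <vector>

// HeapSort V1: push everything into a min-heap, then pop back in order.
void heapsortV1(std::vector<int>& X) {
    std::priority_queue<int, std::vector<int>, std::greater<int>> heap;
    for (int v : X) heap.push(v);          // n inserts, O(n log n)
    for (int& v : X) {                     // n removals, O(n log n)
        v = heap.top();
        heap.pop();
    }
}

int main() {
    std::vector<int> X = {2, 4, 6, 5, 3, 1};
    heapsortV1(X);
    for (int v : X) std::cout << v << " "; // 1 2 3 4 5 6
    std::cout << "\n";
}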
35
Analysis of HeapSort V1 Worst case cost of insertion:
As we saw, every new insertion will have to take the element all the way up to the root, i.e. O(h) operations. Since a complete tree has a height of O(log n), the worst case cost of inserting (n) elements into a heap is O(n log n)
36
Analysis of HeapSort V1 Worst case cost of removal:
It is now easy to see that the worst case cost of removal of an element is O(log n). Removing all elements from the heap will then cost O(n log n). Worst Case cost of HeapSort Therefore, the total worst case cost for heapsort is O(n log n)
37
HeapSort Algorithm V2
// To sort an array X[ ] of n elements using the Heapify Algorithm
heapsort (X[1..n], n)
{
  heap_size = n;
  Build-MaxHeap (X[1..n], n)
  for i = n downto 2
  {
    swap(X[1], X[i])
    heap_size = heap_size - 1
    Heapify (X, heap_size, 1)
  }
}
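A self-contained C++ sketch of V2 on a 1-based array with index 0 unused; the heapify helper repeats the earlier pseudocode so the example compiles on its own.

#include <iostream>
#include <vector>

// Max-heapify, same logic as the Heapify pseudocode (1-based array of size n).
void heapify(std::vector<int>& X, int n, int i) {
    int left = 2 * i, right = 2 * i + 1, max = i;
    if (left <= n && X[left] > X[i]) max = left;
    if (right <= n && X[right] > X[max]) max = right;
    if (max != i) { std::swap(X[i], X[max]); heapify(X, n, max); }
}

// HeapSort V2: build a max-heap in O(n), then repeatedly move the maximum
// to the end, shrink the heap, and re-heapify the root.
void heapsortV2(std::vector<int>& X, int n) {
    for (int i = n / 2; i >= 1; --i) heapify(X, n, i);     // Build-MaxHeap
    for (int i = n; i >= 2; --i) {
        std::swap(X[1], X[i]);
        heapify(X, i - 1, 1);
    }
}

int main() {
    std::vector<int> X = {0, 2, 4, 6, 5, 3, 1};            // index 0 unused
    heapsortV2(X, 6);
    for (int i = 1; i <= 6; ++i) std::cout << X[i] << " "; // 1 2 3 4 5 6
    std::cout << "\n";
}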
38
Analysis of HeapSort V2 Build-MaxHeap takes O(n) time
Each of (n-1) calls to Heapify takes O(log n) Hence Heapsort takes: T(n) = O(n) + (n-1) O(log n) = O(n log n)
39
Performance of Heap Sort
The complexity of the HeapSort is O(n log n)
In-Place Sort: No (uses a heap array)
Stable Algorithm: No
This technique is satisfactory for medium to large data sets
41
4. The Selection Problem
Let A be an array of n distinct elements. Find the kth smallest element.
Some Applications:
Finding nearest neighbor
Finding the Median
Partition around median for closest pair problem
Traveling salesman problem
42
Selection by T&Q
Algorithms: Several algorithms exist, e.g.:
1. Sort the array (e.g. by HeapSort); the kth smallest element will be located at the kth location. Complexity is O(n log n).
2. Insert all elements into a minimum heap, then remove k elements from the heap. The last removed is the kth smallest. Complexity is O(n) + O(k log n).
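A sketch of the second approach (an O(n) heap build plus k removals); std::make_heap with std::greater builds a minimum heap in linear time, standing in for the Heapify-based build.

#include <algorithm>
#include <functional>
#include <iostream>
#include <vector>

// kth smallest (k = 1 means the minimum): O(n) heap build + O(k log n) pops.
int kthSmallest(std::vector<int> a, int k) {
    std::make_heap(a.begin(), a.end(), std::greater<int>());       // min-heap, O(n)
    int v = a.front();
    for (int i = 0; i < k; ++i) {
        v = a.front();                                              // current minimum
        std::pop_heap(a.begin(), a.end(), std::greater<int>());     // O(log n)
        a.pop_back();
    }
    return v;                                                       // kth removed element
}

int main() {
    std::vector<int> a = {9, 4, 7, 1, 8, 3, 6};
    std::cout << kthSmallest(a, 3) << "\n";   // 3rd smallest = 4
}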
43
5. Polynomial Evaluation
A polynomial of degree n can be evaluated directly as
P(x) = a_n x^n + a_(n-1) x^(n-1) + ... + a_i x^i + ... + a_1 x + a_0
where x^i is computed by a function pow(x, i) using (i-1) multiplications. The direct algorithm is (consider x and a[ ] to be of type double):
double p = a[0];
for (int i = 1; i <= n; i++)
  p = p + a[i] * pow(x, i);
44
Polynomial Evaluation
The number of double arithmetic operations inside the loop body is 2 + (i-1) = i + 1. Hence,
T(n) = sum over i = 1..n of (i + 1) = n(n+3)/2 = O(n^2)
Never use this method because it is quadratic. Instead, we use Horner's method.
45
Horner's Algorithm
William George Horner (1819) introduced a factorization (transformation) in the form:
P(x) = ( ... (((a_n) x + a_(n-1)) x + a_(n-2)) x + ... + a_1) x + a_0
with the algorithm:
double p = a[n];
for (i = n-1; i >= 0; i--)
  p = p * x + a[i];
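A runnable comparison of the direct method and Horner's rule on an illustrative polynomial (the function names evalDirect and horner are my own); horner mirrors the loop above exactly.

#include <cmath>
#include <iostream>
#include <vector>

// Direct evaluation; the slide assumes pow(x, i) costs (i-1) multiplications,
// which makes this approach quadratic overall.
double evalDirect(const std::vector<double>& a, double x) {
    double p = a[0];
    for (std::size_t i = 1; i < a.size(); ++i) p = p + a[i] * std::pow(x, (double)i);
    return p;
}

// Horner's rule: one multiplication and one addition per coefficient, O(n).
double horner(const std::vector<double>& a, double x) {
    int n = (int)a.size() - 1;
    double p = a[n];
    for (int i = n - 1; i >= 0; --i) p = p * x + a[i];
    return p;
}

int main() {
    // P(x) = 2x^3 - 6x^2 + 2x - 1, stored as a[0..3] = {-1, 2, -6, 2}
    std::vector<double> a = {-1, 2, -6, 2};
    std::cout << evalDirect(a, 3.0) << " " << horner(a, 3.0) << "\n";  // 5 5
}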
46
Horner's Algorithm
Analysis of this algorithm gives, for the number of double arithmetic operations,
T(n) = 2n = O(n) (one multiplication and one addition per iteration).
This is a faster, linear algorithm.
47
6. Computing Spans
Given an array X, the span S[i] of X[i] is the maximum number of consecutive elements X[j] immediately preceding and including X[i] and such that X[j] ≤ X[i]. Spans have applications to financial analysis.
[Example table of X and S values for i = 1..7 not reproduced in this transcription]
48
Computing Spans Algorithm (1)
Algorithm 1: Quadratic Algorithm
spans1(X, S, n)
{
  for i ← 0 to n − 1
  {
    k ← 1
    while (k ≤ i && X[i − k] ≤ X[i]) k ← k + 1
    S[i] ← k
  }
}
Analysis: In the worst case, the while loop is executed (i) times, so the number of operations is
T(n) = sum over i = 0..n-1 of i = n(n-1)/2 = O(n^2)
49
Computing Spans Algorithm (2)
Computing Spans with a Stack
Keep in a stack the indices of the elements visible when "looking back".
Scan the array from left to right. Let i be the current index.
Pop indices from the stack until the stack top is an index j such that X[i] < X[j].
Set S[i] ← i − j
Push the current index i onto the stack.
50
Computing Spans Algorithm (2)
Algorithm 2: Linear Algorithm using a Stack
spans2(X, S, n)
{
  A ← new empty stack
  for i ← 0 to n − 1
  {
    while (!A.isEmpty() && X[A.top()] ≤ X[i]) A.pop()
    if A.isEmpty() then S[i] ← i + 1
    else S[i] ← i − A.top()
    A.push(i)
  }
}
Analysis:
Each index of the array is pushed onto the stack exactly once and is popped from the stack at most once.
The statements in the while loop are executed at most n times in total.
Algorithm spans2 runs in O(n) time.
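A C++ sketch of spans2 with std::stack; the input values in main are illustrative only (the slide's own example is not reproduced in this transcription).

#include <iostream>
#include <stack>
#include <vector>

// Linear span computation with a stack of indices (Algorithm 2).
std::vector<int> spans2(const std::vector<int>& X) {
    int n = (int)X.size();
    std::vector<int> S(n);
    std::stack<int> A;                       // indices still "visible" looking back
    for (int i = 0; i < n; ++i) {
        while (!A.empty() && X[A.top()] <= X[i]) A.pop();
        S[i] = A.empty() ? i + 1 : i - A.top();
        A.push(i);
    }
    return S;
}

int main() {
    std::vector<int> X = {6, 3, 4, 5, 2};             // illustrative input
    for (int s : spans2(X)) std::cout << s << " ";    // 1 1 2 3 1
    std::cout << "\n";
}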
51
7. Counting Paths in a Graph
Consider the problem of counting paths between two vertices in a graph. It can be shown that the number of different paths of length k > 0 from the ith vertex to the jth vertex of a graph (undirected or directed) equals the (i, j)th element of A^k, where A is the adjacency matrix of the graph. Therefore, the problem of counting a graph's paths can be reduced to the problem of computing powers of a matrix.
52
Counting Paths in a Graph
Example: Consider a graph on vertices a, b, c, d (shown in the original slide). Its adjacency matrix A and its square A^2 indicate the numbers of paths of length 1 and 2, respectively, between the corresponding vertices of the graph. In particular, there are three paths of length 2 that start and end at vertex a (a − b − a, a − c − a, and a − d − a), but there is only one path of length 2 from a to c (a − d − c).
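A sketch that reproduces these counts by computing A^2; the edge set (a−b, a−c, a−d, c−d) is my reconstruction of a graph consistent with the paths listed above, not necessarily the slide's exact figure.

#include <iostream>
#include <vector>

using Matrix = std::vector<std::vector<int>>;

// Multiply two n x n matrices; (A*A)[i][j] counts paths of length 2 from i to j.
Matrix multiply(const Matrix& A, const Matrix& B) {
    int n = (int)A.size();
    Matrix C(n, std::vector<int>(n, 0));
    for (int i = 0; i < n; ++i)
        for (int k = 0; k < n; ++k)
            for (int j = 0; j < n; ++j)
                C[i][j] += A[i][k] * B[k][j];
    return C;
}

int main() {
    // Vertices a=0, b=1, c=2, d=3; edges a-b, a-c, a-d, c-d (reconstructed).
    Matrix A = {{0, 1, 1, 1},
                {1, 0, 0, 0},
                {1, 0, 0, 1},
                {1, 0, 1, 0}};
    Matrix A2 = multiply(A, A);
    std::cout << "paths of length 2 from a to a: " << A2[0][0] << "\n";  // 3
    std::cout << "paths of length 2 from a to c: " << A2[0][2] << "\n";  // 1
}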