Algorithm Analysis.

Algorithm Analysis

Question Suppose you have two programs that will sort a list of student records and allow you to search for student information. How do you judge which is the better program? Answer: by running the programs, or by algorithm analysis.

Algorithm Analysis Algorithm analysis is a methodology for estimating the resource (time and space) consumption of an algorithm. It allows us to compare the relative costs of two or more algorithms for solving the same problem.

How to Measure “Betterness” The critical resources in a computer are time and memory space. For most algorithms, running time depends on the size n of the data. Notation: T(n), i.e., the time T is a function of the data size n.

Two approaches There are two approaches to obtaining running time: measuring under standard benchmark conditions (running programs; a minimal timing sketch follows below), or estimating the algorithm's performance analytically (analyzing algorithms).
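As a sketch of the measuring approach (not from the original slides), the following program times a simple summation loop with the standard C++ <chrono> clock; the function name sumArray and the chosen input sizes are illustrative assumptions:

#include <chrono>
#include <iostream>
#include <vector>

// Illustrative workload: sum the elements of a vector.
long long sumArray(const std::vector<int>& v) {
    long long total = 0;
    for (int x : v) total += x;
    return total;
}

int main() {
    for (int n : {100000, 200000, 400000}) {             // illustrative input sizes
        std::vector<int> data(n, 1);
        auto start = std::chrono::steady_clock::now();
        volatile long long result = sumArray(data);      // volatile discourages optimizing the call away
        auto stop = std::chrono::steady_clock::now();
        auto us = std::chrono::duration_cast<std::chrono::microseconds>(stop - start).count();
        std::cout << "n = " << n << ": " << us << " microseconds (sum = " << result << ")\n";
    }
    return 0;
}

Doubling n should roughly double the measured time for this linear-time loop, which is exactly the behavior that benchmarking observes and analysis predicts.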

Estimation Assumptions Estimation of running time is based on: Size of the input Number of basic operations The time to complete a basic operation does not depend on the value of its operands.

Example: largest()
// Return the position of the largest value in the array.
int largest(int array[], int n) {
    int posBig = 0;                      // 1
    for (int i = 1; i < n; i++)
        if (array[posBig] < array[i]){   // 2
            posBig = i;                  // 3
        }
    return posBig;
}
As n grows, how does T(n) grow? Cost: T(n) = c1n + c2 steps

Example: largest()
Step 1: T(n) = c1.
Step 2: It takes a fixed amount of time to do one comparison: T(n) = c2n.
Step 3: T(n) = c3n, at most.
Total time: T(n) = c1 + c2n + c3n ≈ c1 + (c2 + c3)n ≈ cn

Example: Assignment int a = array[0]; The running time is T(n) = c This is called constant running time.

Example: total()
sum = 0;
for (i = 1; i <= n; i++)
    for (j = 1; j <= n; j++)
        sum++;
What is the running time for this code?
Analysis: The basic operation is sum++. The cost of one sum++ can be bundled into some constant time c. The inner loop runs n times for each of the n iterations of the outer loop, so the total running time is T(n) = (cn)(n) = cn^2.

Growth Rate of an Algorithm: Big-O Notation
The growth rate of an algorithm is the rate at which the cost of the algorithm grows as the data size n grows.
Constant: T(n) = c → O(1)
Linear growth: T(n) = n → O(n)
Quadratic growth: T(n) = n^2 → O(n^2)
Exponential growth: T(n) = 2^n → O(2^n)
Assumptions: growth rates are estimates; they are meaningful when n is large. A numeric illustration follows below.
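To see why only large n matters, consider n = 1,000: log2(n) ≈ 10, n = 1,000, n^2 = 1,000,000, while 2^n = 2^1000 is a number with more than 300 digits. No hardware can compensate for an exponential growth rate, and the differences between the slower-growing functions also become decisive only as n gets large.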

Growth Rate Graph (figure: T(n) plotted against n, comparing a constant c, log n, and the faster-growing cost functions)

Characteristics of Growth Rates
Constant: T(n) = c. Independent of n.
Linear: T(n) = cn. Constant slope.
Logarithmic: T(n) = c log n. Slope flattens out quickly.
Quadratic: T(n) = cn^2. Increasing slope.
Cubic: T(n) = cn^3.
Exponential: T(n) = c2^n.

Growth Rate: Array Operations
bool search(int list[], int n, int key)
    found = false
    Loop (for i = 0 to i < n)
        If (list[i] == key) Then
            found = true
        End If
    End Loop
    return found
End // search

Growth Rate: Array Operations
Linear Search
Insert at Back
Insert at Front
Remove from Back
Remove from Front

Linear Search
bool search(int list[], int n, int key)
    found = false                     // c1
    Loop (for i = 0 to i < n)
        If (list[i] == key) Then      // nc2
            found = true (at most)
        End If
    End Loop
    return found                      // c3
End // search
T(n) = c1 + nc2 + c3 = c + nc2 ≈ nc2 ∴ O(n)

Insert at Back
void insertAtBack(int list[], int &n, int item)
    If (!isFull) Then
        list[n] = item
        n++
    End If
End // insertAtBack

Insert at Back
void insertAtBack(int list[], int &n, int item)
    If (!isFull) Then
        list[n] = item    // c1
        n++               // c2
    End If
End // insertAtBack
T(n) = c1 + c2 = c ∴ O(1)

Insert At Front
void insertAtFront(int list[], int &n, int item)
    If (!isFull) Then
        // shift the elements to the right
        Loop (for i = n downto 1)
            list[i] = list[i - 1]
        End Loop
        // assign
        list[0] = item
        n++
    End If
End // insertAtFront

Insert At Front: Growth Rate?
void insertAtFront(int list[], int &n, int item)
    If (!isFull) Then
        Loop (for i = n downto 1)
            list[i] = list[i - 1]     // c1n (at most)
        End Loop
        list[0] = item                // c2
        n++                           // c3
    End If
End // insertAtFront
T(n) = c1n + c2 + c3 = nc1 + c ≈ nc1 ∴ O(n)

Remove From Back
int removeFromBack(int list[], int &n)
    result = NIL
    If (!isEmpty) Then
        result = list[n - 1]
        n = n - 1
    End If
    return result
End // removeFromBack

Remove From Front
int removeFromFront(int list[], int &n)
    result = NIL
    If (!isEmpty) Then
        result = list[0]
        // shift the remaining elements to the left
        Loop (for i = 0 to n - 2)
            list[i] = list[i + 1]
        End Loop
        n = n - 1
    End If
    return result
End // removeFromFront

Growth Rate: Linked-List Operations
Linear Search
Insert at Back
Insert at Front
Remove from Back
Remove from Front

Linear Search
bool linearSearch(int key){
    bool found = false;                // c1
    node* temp = head;                 // c2
    while (temp != NULL){              // the loop body runs at most n times: c3n
        if (temp->getData() == key)
            found = true;
        temp = temp->getNext();
    }
    return found;
}
T(n) = c1 + c2 + c3n ≈ cn ∴ Growth Rate: O(n)

insertAtBack()
void insertAtBack(int item){
    node *newP = new node(item, NULL);
    node *temp = head;                 // c1
    if (head == NULL)
        head = newP;
    else {
        while (temp->getNext() != NULL){
            temp = temp->getNext();    // c2n
        }
        temp->setNext(newP);           // c3
    }
    count++;                           // c4
}
T(n) = c1 + c2n + c3 + c4 ≈ c + c2n ∴ G.R. = O(n)

insertAtFront()
void insertAtFront(int item){
    node *newP = new node(item, NULL); // c1
    newP->setNext(head);               // c2
    head = newP;                       // c3
    count++;                           // c4
}
T(n) = c1 + c2 + c3 + c4 = c ∴ G.R. = O(1)

removeFromBack()
void removeFromBack(){
    if (head == NULL) return;          // nothing to remove
    node *temp = head;                 // c1
    node *trail = NULL;                // c2
    while (temp->getNext() != NULL){
        trail = temp;                  // nc3
        temp = temp->getNext();        // nc4
    }
    if (trail == NULL)
        head = NULL;                   // the list held a single node
    else
        trail->setNext(NULL);
    delete temp;
    count--;                           // c5
}
T(n) = c1 + c2 + nc3 + nc4 + c5 = c + n(c3 + c4) ≈ cn ∴ G.R. = O(n)

removeFromFront()
void removeFromFront(){
    if (head == NULL) return;          // nothing to remove
    node *temp = head;                 // c1
    head = head->getNext();            // c2
    delete temp;                       // c3
    count--;                           // c4
}
T(n) = c1 + c2 + c3 + c4 = c ∴ G.R. = O(1)

Best, Worst, Average Cases
For an algorithm with a given growth rate, we consider the best case, the worst case, and the average case.
Example: sequential search for a key K in an array of n integers (a code sketch follows below).
Best case: K is in the first position, so the cost is 1 comparison.
Worst case: K is in the last position (or not present at all), so the cost is n comparisons.
Average case: (n + 1)/2 comparisons, if we assume the element with value K is equally likely to be in any position in the array.
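A minimal sketch of the sequential search being analyzed; the function name sequentialSearch and the plain int-array interface are illustrative assumptions, not code from the slides:

// Sequential search: return the index of key in list[0..n-1], or -1 if it is absent.
int sequentialSearch(const int list[], int n, int key) {
    for (int i = 0; i < n; i++) {
        if (list[i] == key)
            return i;   // best case: key in the first position, 1 comparison
    }
    return -1;          // worst case: key in the last position or absent, n comparisons
}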

Which Analysis to Use The Worst Case. Useful in many real-time applications. Advantage: Predictability You know for certain that the algorithm must perform at least that well. Disadvantage: Might not be a representative measure of the behavior of the algorithm on inputs of size n.

Which Analysis to Use? The average case. Often we prefer to know the average-case running time. The average case reveals the typical behavior of the algorithm on inputs of size n. Average case estimation is not always possible. For the sequential search example, it assumes that the key value K is equally likely to appear in any position of the array. This assumption is not always correct.

Which Analysis to Use? The Best Case
Normally we are not interested in the best case, because it is too optimistic and is not a fair characterization of the algorithm's running time. It is useful in some rare cases where the best case has a high probability of occurring.

The moral of the story If we know enough about the distribution of our input we prefer the average-case analysis. If we do not know the distribution, then we must resort to worst-case analysis. For real-time applications, the worst-case analysis is the preferred method.

Your Turn (Stack w/array)
Given the stack class:
…
private:
    elemType data[MAX];
    int top;
};

Stack (w/array)
What is the growth rate (big O) of the push operation?
void push(elemType item){
    if (!isFull()){
        top++;
        data[top] = item;
    }
}

Stack (w/array)
What is the growth rate (big O) of the pop operation?
elemType pop(){
    elemType result = NIL;
    if (!isEmpty()){
        result = data[top];
        top--;
    }
    return result;
}

Stack (w/array)
What is the growth rate (big O) of the clear operation?
void clear(){
    top = -1;
}

Stack (w/linked list)
Given the stack class with a linked list:
…
private:
    Node *top;
};

Stack (w/linked list)
What is the growth rate (big O) of the push operation?
void push(elemType item){
    Node *temp = new Node(item, NULL);
    temp->setNext(top);   // the new node points at the old top
    top = temp;
}

Stack (w/linked list)
What is the growth rate (big O) of the pop operation?
elemType pop(){
    elemType result = NIL;
    Node *temp = top;
    if (!isEmpty()){
        result = top->getData();
        top = top->getNext();
        delete temp;
    }
    return result;
}

Stack (w/linked list)
What is the growth rate (big O) of the clear operation?
void clear(){
    Node *temp = top;
    while (top != NULL){
        top = top->getNext();
        delete temp;
        temp = top;
    }
}

Growth Rate of Binary Search (w/array)
Each comparison discards half of the elements that remain:

Comparisons    Number left
0              n
1              n(1/2)
2              n(1/2)^2
3              n(1/2)^3
4              n(1/2)^4
…              …
k-1            n(1/2)^(k-1)
k              n(1/2)^k

Solution: the search ends when one element is left, i.e. n(1/2)^k = n/2^k = 1.
Then 2^k = n, so k log(2) = log(n), and k = log(n)/log(2) = c1 log(n).
Therefore T(n) = c2 log(n) ∴ O(log n). (A code sketch follows below.)
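A minimal sketch of the binary search being analyzed, on a sorted int array; the name binarySearch and its signature are illustrative assumptions:

// Binary search in a sorted array: return the index of key, or -1 if it is absent.
int binarySearch(const int list[], int n, int key) {
    int low = 0, high = n - 1;
    while (low <= high) {                  // each iteration halves the remaining range
        int mid = low + (high - low) / 2;
        if (list[mid] == key)
            return mid;
        else if (list[mid] < key)
            low = mid + 1;                 // discard the lower half
        else
            high = mid - 1;                // discard the upper half
    }
    return -1;                             // about log2(n) comparisons in the worst case
}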

Analysis of Bubble Sort
Suppose count = 10, so the maximum number of passes is 9:

Pass    Comparisons
1       9
2       8
3       7
4       6
5       5
6       4
7       3
8       2
9       1

void bubbleSort(int a[], int count)
    pass ← count - 1
    last ← pass - 1
    Loop (for i from 1 to pass)
        Loop (for j from 0 to last)
            If (a[j] > a[j + 1]) Then
                swap a[j] and a[j + 1]
            End If
        End Loop
        last ← last - 1
    End Loop
End // bubbleSort
(A compilable C++ version appears below.)
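A compilable C++ rendering of the pseudocode above, a sketch that keeps the same pass/last bookkeeping; std::swap performs the exchange:

#include <utility>   // std::swap

void bubbleSort(int a[], int count) {
    int pass = count - 1;            // maximum number of passes
    int last = pass - 1;             // highest index j compared with j + 1
    for (int i = 1; i <= pass; i++) {
        for (int j = 0; j <= last; j++) {
            if (a[j] > a[j + 1])
                std::swap(a[j], a[j + 1]);
        }
        last--;                      // the largest remaining value has bubbled into place
    }
}

For count = 10 this performs 9 + 8 + … + 1 = 45 comparisons, matching the table above.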

Analysis of Bubble Sort
Total number of comparisons for count = 10:
    1 + 2 + 3 + 4 + 5 + 6 + 7 + 8 + 9 = ?
    9 + 8 + 7 + 6 + 5 + 4 + 3 + 2 + 1 = ?
Adding the two rows term by term gives 10 + 10 + … + 10 = 9 × 10 = 90, so ? = 90/2 = 45.
In general, for an array of size n:
    (n-1) + (n-2) + (n-3) + … + 3 + 2 + 1
  +   1   +   2   +   3   + … + (n-3) + (n-2) + (n-1)
  =   n   +   n   +   n   + … +   n   +   n   +   n   = n(n-1)
Therefore ? = n(n-1)/2 = (1/2)n^2 - (1/2)n ≈ cn^2.
Growth Rate: O(n^2)