LeongHW, SoC, NUS (UIT2201: Algorithms) © Leong Hon Wai, 2003-2008

Efficiency of Algorithms
Readings: [SG] Ch. 3
Chapter Outline:
- Attributes of Algorithms
- Measuring Efficiency of Algorithms
- Simple Analysis of Algorithms
- Polynomial vs Exponential Time Algorithms

Efficiency of Algorithms
Readings: [SG] Ch. 3
Chapter Outline:
- Attributes of Algorithms
  - What makes a good algorithm
  - Key efficiency considerations
- Measuring Efficiency of Algorithms
- Simple Analysis of Algorithms
- Polynomial vs Exponential Time Algorithms

What are good algorithms?
Desirable attributes in an algorithm:
- Correctness
- Simplicity (ease of understanding)
- Elegance
- Efficiency
- Embraces multiple levels of abstraction
- Well documented, multi-platform

Attributes: Correctness, Simplicity
- Correctness
  - Does the algorithm solve the problem it is designed for?
  - Does the algorithm solve all instances of the problem correctly?
- Simplicity (ease of understanding)
  - Is it easy to understand?
  - Is it clear and concise (not tricky)?
  - Is it easy to alter the algorithm?
  - Important for program maintenance

Attributes: Abstraction, Elegance
- Multiple levels of abstraction
  - Decomposes the problem into sub-problems
  - Easier to understand at different levels of abstraction
  - Usable as modules (libraries) by others
- Elegance
  - How clever or sophisticated is the algorithm?
  - Is pleasing and satisfying to the designer

Attributes: Efficiency, etc.
- Efficiency
  - The amount of time the algorithm requires
  - The amount of space the algorithm requires
  - The most important attribute, especially for large-scale problems
- Well documented, multi-platform
  - Is well documented, with sufficient details
  - Not OS-dependent, company-dependent, or computer-dependent

Attributes: Key Considerations
- However, these attributes are often contradictory:
  - Simple algorithms are often slow
  - Efficient algorithms tend to be complicated
- When designing algorithms, all computer scientists strive to achieve simplicity and elegance plus efficiency.
- If you really want to learn how, take an algorithms course.

Efficiency of Algorithms
Readings: [SG] Ch. 3
Chapter Outline:
- Attributes of Algorithms
- Measuring Efficiency of Algorithms
  - One problem, many algorithmic solutions
  - Time complexity, space complexity
  - Θ notation, order of growth of functions
- Simple Analysis of Algorithms
- Polynomial vs Exponential Time Algorithms

One Problem, Many Algorithmic Solutions
- Given an algorithmic problem, there are many different algorithms to solve it.
- Problem: searching for a name in a list.
- Algorithms:
  - Sequential search (slow; takes linear time)
  - Binary search (fast; takes logarithmic time)
  - Interpolation search, etc. (not covered in UIT2201)

Sequential Search: Idea
- Search for NAME among a list of n names.
- Start at the beginning and compare NAME to each entry in the list until a match is found.

Sequential Search: Pseudo-Code
(Figure 3.1: Sequential Search Algorithm)
The pseudo-code consists of three parts: an initialization block, an iteration block (the key step, where most of the work is done), and a post-processing block.

Recall: Algorithm Sequential Search
Precondition: The variables n, NAME and the arrays N and T have been read into memory.

Seq-Search(N, T, n, NAME);
begin
  i ← 1;
  Found ← No;
  while (Found = No) and (i <= n) do
    if (NAME = N[i])
      then Print T[i]; Found ← Yes;
      else i ← i + 1;
    endif
  endwhile
  if (Found = No) then Print NAME "is not found" endif
end;
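A minimal Python sketch of this sequential search, assuming the names and telephone numbers are given as two parallel lists (the names seq_search, names, and numbers are illustrative, not from [SG]):

```python
def seq_search(names, numbers, target):
    """Sequential search: scan the list until target is found or the list ends.

    names and numbers are parallel lists (N and T in the pseudo-code).
    Returns the matching telephone number, or None if target is absent.
    """
    for i in range(len(names)):          # i plays the role of the loop index
        if names[i] == target:           # dominant operation: name comparison
            return numbers[i]
    return None                          # corresponds to "NAME is not found"


# Example usage
names = ["Ada", "Bob", "Cay"]
numbers = ["555-0101", "555-0102", "555-0103"]
print(seq_search(names, numbers, "Bob"))   # 555-0102
print(seq_search(names, numbers, "Zoe"))   # None
```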

Analysis of Algorithms (introduction)
- Analysis of algorithms: analyze an algorithm to predict its efficiency, namely the resources (time and space) that the algorithm needs during its execution.
- Time complexity T_A(n): the time taken by an algorithm A on problems with input size n.
- Space complexity S_A(n): the space taken by an algorithm A on problems with input size n.

Sequential Search: Analysis
- Comparison of NAME against a name in the list N of names
  - Central unit of work (dominant operation)
  - Used for efficiency analysis
- For lists with n entries:
  - Best case (the best case is usually not important):
    - NAME is the first name in the list
    - 1 comparison
    - Θ(1)  (roughly means "a constant")

Sequential Search: Analysis
- For lists with n entries:
  - Worst case (usually the most important):
    - NAME is the last name in the list, or NAME is not in the list
    - n comparisons
    - Θ(n)  (roughly means "proportional to n")
  - Average case (sometimes used):
    - Roughly n/2 comparisons
    - Θ(n)  (½n is also proportional to n; the constant c in cn is not important, and usually we let c = 1)

Sequential Search: Analysis
- Space efficiency:
  - Uses 2n memory cells for the input names and telephone numbers
  - A few more memory cells for variables (NAME, i, Found)
  - Space is Θ(n)
  - Very space-efficient

Viewing the Rate of Growth of T(n) = cn
(Figure 3.4: Work = cn for various values of c)

Order of Magnitude: Order n [Linear]
- All functions that have a linear "shape" are considered equivalent.
- As n grows large, the order of magnitude dominates the running time, minimizing the effect of coefficients and lower-order terms.
- Order of magnitude n:
  - Written as Θ(n)  [read as "theta-n"]
  - Functions vary as c × n, for some constant c
  - Linear time

[SideTrack: Why analyze only dominant operations]
- In the analysis above, we only analyzed the dominant operation.
- This sub-section explains why, namely why we can take such short-cuts.
- It may help you better appreciate "analysis of algorithms"; but if you have difficulties with this part, you can skip it without affecting anything else.

Analysis of Algorithms
- To estimate the running time of an algorithm without actually running it:
  - Estimate the cost (work done) of each elementary operation
  - Analyze the number of times each operation is executed during the algorithm
  - Then sum up the total cost (work done) by the algorithm
- AIM: to conclude that we only need to analyze the dominant operation.

Analyzing Algorithm Sequential Search
Suppose we assume the following estimated costs per statement:

Statement     Cost        Statement    Cost
assignment    20          Print        4
while, if     5           endwhile     1

Each statement of Seq-Search (shown earlier) is annotated with the number of times it executes: the while test runs n+1 times, the statements in the loop body up to n times each, and the initialization and post-processing statements a constant number of times.

T(n) = (…) + (n+1)·5 + n·(5+20+1) + (4+20) + (5+4+1)
     = 31n + 80
     = Θ(n)  [proportional to n]

Analyzing Algorithm Sequential Search
Now, let's assume a different set of estimated costs:

Statement     Cost        Statement    Cost
assignment    10          Print        20
while, if     15          endwhile     0

T(n) = (10+10) + (n+1)·15 + n·(15+10) + (20+10) + (15+20)
     = 40n + 100
     = Θ(n)  [also proportional to n]

From the two examples above…
- The actual total cost is different for the different sets of estimated costs for basic operations.
- But the order of growth is the same for the different sets of estimated costs: all linear (only with different constants).
- So, to simplify the analysis:
  - Assign a constant cost Θ(1) to each basic operation
  - Analyze only the dominant operation(s), namely the operation that is done "most often"
  - Also ignore "lower-order" terms, such as operations that are done only once

Simplified Analysis
Count only dominant operations, at Θ(1) cost per basic operation (Θ(1) means a constant):

Statement     Cost        Statement    Cost
assignment    Θ(1)        Print        Θ(1)
while, if     Θ(1)        endwhile     Θ(1)

Each of the statements in the loop of Seq-Search executes at most n times, so
T(n) = 4n × Θ(1)   (counting only dominant operations)
     = Θ(4n) = Θ(n)  [proportional to n]

Identifying the Dominant Operation
In Seq-Search, the name comparison (NAME = N[i]) is a dominant operation: it costs Θ(1) and is executed at most n times, so
T(n) = n × Θ(1) = Θ(n)  [proportional to n]
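As a sanity check, here is a small sketch (continuing the illustrative seq_search example above) that counts only the dominant operation, the name comparison, confirming that the worst case on a list of n names is exactly n comparisons:

```python
def seq_search_counting(names, target):
    """Sequential search that also counts the dominant operation (name comparisons)."""
    comparisons = 0
    for name in names:
        comparisons += 1            # one name comparison per iteration
        if name == target:
            return True, comparisons
    return False, comparisons       # worst case: target absent, n comparisons


# Worst case on a list of n = 1000 names: exactly 1000 comparisons.
names = [f"name{i}" for i in range(1000)]
print(seq_search_counting(names, "missing"))   # (False, 1000)
```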

[END SideTrack: Why analyze only dominant operations]
- As the above examples show, it is sufficient to analyze only the dominant operation:
  - It gives the same running time in Θ notation
  - But it is MUCH simpler
- Conclusion: it is sufficient to analyze only the dominant operation.
- END of SideTrack; and remember, if you had difficulties with this sub-section, you can skip it without affecting anything else.

Efficiency of Algorithms
Readings: [SG] Ch. 3
Chapter Outline:
- Attributes of Algorithms
- Measuring Efficiency of Algorithms
- Simple Analysis of Algorithms
  - Pattern Match Algorithm
  - Selection Sort Algorithm
  - Binary Search Algorithm
- Polynomial vs Exponential Time Algorithms

Analysis of Algorithms
- To analyze algorithms:
  - Analyze the dominant operations
  - Use Θ notation to simplify the analysis
  - Determine the order of the running time
- This can be applied to high-level pseudo-code; if a high-level primitive is used:
  - Analyze the running time of the high-level primitive
  - Express it in Θ notation
  - Multiply by the number of times it is called
- See the example in the analysis of Pattern-Matching.

Analysis of the Pat-Match Algorithm
- Our pattern matching algorithm consists of two modules:
  - Pat-Match(T, n, P, m): the "high-level" view
  - Match(T, k, P, m): the "high-level" primitive
- This achieves a good division of labour.
- Overview: to analyze it, we do a bottom-up analysis.
  - First, analyze the time complexity of Match(T, k, P, m).
    - Note: this takes much more than Θ(1) operations
    - Express it in (simplified) Θ notation
  - Then analyze Pat-Match.

First, Analyze the Match High-Level Primitive
Match(T, k, P, m) aligns T[k..k+m-1] with P[1..m].
(Figure: pattern P = ATA aligned against text T = CATATCATA at position k = 4.)

Match(T, k, P, m);
begin
  i ← 1;
  MisMatch ← No;
  while (i <= m) and (MisMatch = No) do
    if (T[k+i-1] not equal to P[i])
      then MisMatch ← Yes
      else i ← i + 1
    endif
  endwhile
  Match ← not(MisMatch);   (* Opposite of MisMatch *)
end;

The dominant operation is the character comparison. In the worst case there are m comparisons, so Match takes Θ(m) time.

Next, Analyze the Pat-Match Algorithm
Pat-Match(T, n, P, m);   (* Finds all occurrences of P in T *)
begin
  k ← 1;
  while (k <= n-m+1) do
    if Match(T, k, P, m) = Yes then
      Print "Match at pos ", k;
    endif
    k ← k + 1;
  endwhile
end;

The dominant operation is the high-level operation Match(T, k, P, m). Match is called (n+1-m) times, and each call costs Θ(m) time.
Total: Θ((n+1-m)·m) = Θ(nm)
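A direct Python transcription of the two modules, as a sketch: function and variable names follow the pseudo-code, but positions are 0-based rather than the 1-based indices used above.

```python
def match(T, k, P):
    """Return True if pattern P matches text T starting at position k (0-based)."""
    m = len(P)
    for i in range(m):                 # worst case: m character comparisons -> Theta(m)
        if T[k + i] != P[i]:
            return False
    return True


def pat_match(T, P):
    """Print all positions where P occurs in T; Theta(n*m) overall."""
    n, m = len(T), len(P)
    for k in range(n - m + 1):         # match is called (n - m + 1) times
        if match(T, k, P):
            print("Match at pos", k)


pat_match("CATATCATA", "ATA")          # matches at positions 1 and 6 (0-based)
```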

Sorting: Problem and Algorithms
- Problem: Sorting
  - Take a list of n numbers and rearrange them in increasing order.
- Algorithms:
  - Selection Sort: Θ(n²)
  - Insertion Sort: Θ(n²)
  - Bubble Sort: Θ(n²)
  - Merge Sort: Θ(n lg n)
  - Quicksort: Θ(n lg n)**   (** average case)
  (Some of these algorithms are not covered in the course.)

Selection Sort
- Idea behind the algorithm: repeatedly
  - find the largest number in the unsorted section, and
  - swap it to the end of that section (the front of the sorted section).
- Re-uses the Find-Max algorithm.
(Figure: array A[1..n] with A[1..j] unsorted and A[j+1..n] sorted; A[m] is the largest among A[1..j], and A[m] is swapped with A[j].)

Selection Sort Algorithm (pseudo-code)
Selection-Sort(A, n);
begin
  j ← n;
  while (j > 1) do
    m ← Find-Max(A, j);
    swap(A[m], A[j]);
    j ← j - 1;
  endwhile
end;

(A[m] is the largest among A[1..j]; after the swap, A[j..n] is sorted.)
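A runnable Python sketch of the same algorithm, with 0-based indices; find_max is written inline here since [SG]'s Find-Max pseudo-code is not reproduced in these slides.

```python
def find_max(A, j):
    """Return the index of the largest element in A[0..j] (inclusive)."""
    m = 0
    for i in range(1, j + 1):          # j comparisons for a prefix of length j+1
        if A[i] > A[m]:
            m = i
    return m


def selection_sort(A):
    """Sort list A in place in increasing order; Theta(n^2) comparisons."""
    j = len(A) - 1
    while j > 0:
        m = find_max(A, j)             # largest element of the unsorted prefix A[0..j]
        A[m], A[j] = A[j], A[m]        # swap it into its final position
        j -= 1


data = [5, 1, 4, 2, 3]
selection_sort(data)
print(data)                            # [1, 2, 3, 4, 5]
```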

Example of Selection Sort
(A sequence of figures shows the successive passes: in each pass, the largest element A[m] of the unsorted section A[1..j] is swapped with A[j], j is decremented, and the sorted section grows from the right until the whole array is sorted. Done.)

Selection Sort Algorithm [SG]
(Figure 3.6: Selection Sort Algorithm, with the unsorted section marked.)

What About the Time Complexity?
- Dominant operation: comparisons.
- In the example (a list of 5 numbers), the successive passes take 4, 3, 2, and 1 comparisons. Done.

Analysis of Selection Sort
- When sorting n numbers (Find-Max on j numbers takes j-1 comparisons):
  - (n-1) comparisons in iteration 1 (when j = n)
  - (n-2) comparisons in iteration 2 (when j = n-1)
  - (n-3) comparisons in iteration 3 (when j = n-2)
  - …
  - 2 comparisons in iteration (n-2) (when j = 3)
  - 1 comparison in iteration (n-1) (when j = 2)
- Total number of comparisons:
  Cost = (n-1) + (n-2) + … + 2 + 1 = n(n-1)/2 = Θ(n²)
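The closed form follows from the standard arithmetic-series identity, written out here for reference:

```latex
\sum_{j=2}^{n} (j-1) \;=\; \sum_{i=1}^{n-1} i \;=\; \frac{n(n-1)}{2} \;=\; \Theta(n^2).
```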

Analysis of Selection Sort: Summary
- Time complexity: Θ(n²)
  - Comparisons: n(n-1)/2
  - Exchanges: n (swapping the largest into place)
  - Overall time complexity: Θ(n²)
- Space complexity: Θ(n)
  - Θ(n) space for the input sequence, plus a few variables.

Selection Sort:  time complexity T(n) = Θ(n²),  space complexity S(n) = Θ(n)

Viewing the Rate of Growth of T(n) = cn²
(Figure 3.10: Work = cn² for various values of c)

Order of Magnitude: Order n² [Quadratic]
- All functions with highest-order term cn²
  - have a similar shape
  - have the same order of growth
- Quadratic algorithm, Θ(n²):
  - an algorithm that does cn² work, for some constant c
  - order of magnitude is n²
  - Θ(n²)  (read "theta n-squared")

Comparison: Order n vs Order n²
- We have seen…
- Algorithms of order n:
  - Sum, Find-Max, Find-Min, Seq-Search
- Algorithms of order n²:
  - Selection sort
  - Printing an n × n table

Rate of Growth Comparison: n² vs n
(Figure 3.11: A comparison of n and n²)

Comparison: Θ(n²) vs Θ(n)
- Anything that is Θ(n²) will eventually have larger values than anything that is Θ(n), no matter what the constants are.
- An algorithm that runs in time Θ(n) will outperform one that runs in Θ(n²).
- Eg: compare T1(n) = 1000n and T2(n) = 10n². (See also the tutorial problem.)
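For this example the crossover point can be computed directly:

```latex
1000n = 10n^2 \;\Longleftrightarrow\; n = 100,
\qquad\text{so } T_2(n) = 10n^2 > T_1(n) = 1000n \text{ for all } n > 100.
```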

A Very Fast Search Algorithm
- If the list is sorted, that is A[1] ≤ A[2] ≤ A[3] ≤ … ≤ A[n],
- then we can do better when searching (actually, a lot better).
- We can use "binary search".
- Example: find a given number in the sorted list (see the figures that follow).

Binary Search
- Find an element in a sorted array.
- IDEA: check the middle element, then recursively search one subarray.
(A sequence of figures steps through an example: each comparison with the middle element discards half of the remaining subarray, until the target element is found. Found!)

Binary Search: Overview of the Algorithm
Find an element x in a sorted array:
1. Check the middle element.
   - Keep two pointers, first and last, at the two ends of the sub-array being searched.
   - Here, mid = (first + last) / 2.
   - Move one of the two pointers to update the sub-array being searched:
     either last ← mid - 1, or first ← mid + 1.
2. Recursively search one subarray.

Binary Search Algorithm (pseudo-code)
BinarySearch(A, n, x);   (* search for x in a sorted array A[1..n] *)
begin
  first ← 1;
  last ← n;
  while (first <= last) do
    mid ← (first + last) div 2;
    if (x = A[mid]) then
      print "Found x in pos", mid;
      Stop
    else if (x < A[mid])
      then last ← mid - 1;
      else first ← mid + 1;
    endif
  endwhile
  print "x not found";
end;
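An equivalent iterative Python version, as a sketch: 0-based indices, returning the index instead of printing.

```python
def binary_search(A, x):
    """Return the index of x in sorted list A, or -1 if x is not present."""
    first, last = 0, len(A) - 1
    while first <= last:
        mid = (first + last) // 2      # middle of the current sub-array
        if A[mid] == x:
            return mid                 # found
        elif x < A[mid]:
            last = mid - 1             # search the left half
        else:
            first = mid + 1            # search the right half
    return -1                          # x not found


A = [2, 3, 5, 7, 11, 13, 17, 19]
print(binary_search(A, 13))            # 5
print(binary_search(A, 4))             # -1
```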

Binary Search: How Fast Is It? (1/3)
- Starting with n numbers, binary search repeatedly halves the size until it reaches 1 element to check.
- Eg: when n = 100,
  - after 1 step, the size is about 50
  - after 2 steps, the size is about 25
  - after 3 steps, the size is about 12
  - after 4 steps, the size is about 6
  - after 5 steps, the size is about 3
  - after 6 steps, the size is about 1
  - one last comparison, DONE!!
- That is about log₂ 100 ≈ 7 steps.  [Déjà vu? repeated halving]

Binary Search: How Fast Is It? (2/3)
- Starting with n numbers, binary search repeatedly halves the size until it reaches 1 element to check.
- Number of steps = lg n = log₂ n.
- So binary search has complexity T(n) = Θ(lg n).
- Recall the facts about T(n) = lg n:
  when 2^k = n, taking log base 2 gives k = log₂ n, i.e. lg n = the number of steps of repeated halving.

Binary Search: How Fast Is It? (3/3)
- Starting with n numbers, binary search repeatedly halves the size until it reaches 1 element to check.
- Number of steps = lg n = log₂ n.
- So binary search has complexity T(n) = Θ(lg n).
- T(n) = Θ(lg n) is very fast!

n (# of elements)      T(n) = lg n
1,000                  ≈ 10
1,000,000              ≈ 20
1,000,000,000          ≈ 30
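A quick sketch to check these numbers by counting floor-halving steps directly (this loop gives ⌊lg n⌋) against math.log2; the function name halving_steps is illustrative:

```python
import math

def halving_steps(n):
    """Count how many times n can be halved (with floor) before reaching a single element."""
    steps = 0
    while n > 1:
        n //= 2            # one step of binary search halves the search range
        steps += 1
    return steps

for n in (100, 1_000, 1_000_000, 1_000_000_000):
    print(n, halving_steps(n), round(math.log2(n), 1))
# 100 -> 6 steps (lg ~ 6.6); 1,000 -> 9 (~10.0); 10**6 -> 19 (~19.9); 10**9 -> 29 (~29.9)
```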

Summary: Searching Algorithms
- Sequential Search:
  - Worst case: n comparisons
  - Best case: 1 comparison
  - Average case: n/2 comparisons
- Binary Search:
  - Worst case: lg n comparisons
  - Best case: 1 comparison
  - Average case: lg n comparisons*

* How to get the average case? Answer: using mathematical analysis. This is OK (do-able) for small n (see the example in the tutorial). (Read Sect. … of [SG3].) For general n, the analysis is complex (beyond the scope of this course).

Comparison: Order n vs Order lg n
(Figure: A comparison of n and lg n)

Complexity of Algorithms…
- Logarithmic-time algorithm:
  - Binary Search: Θ(lg n) time
- Linear-time algorithms:
  - Sum(A, n): Θ(n) time
  - Sequential-Search(A, n, x): Θ(n) time
- Quadratic-time algorithm:
  - Selection Sort: Θ(n²) time
- Exponential-time algorithm:
  - All-Subsets(A, n): Θ(2^n) time (see the sketch below)
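To see why All-Subsets is exponential, here is a minimal sketch (the name all_subsets and this particular construction are illustrative, not necessarily [SG]'s version): a set of n elements has 2^n subsets, so just producing them all already takes Θ(2^n) work.

```python
def all_subsets(A):
    """Return a list of all subsets of the list A (2**len(A) of them)."""
    subsets = [[]]                            # start with the empty subset
    for x in A:
        # every existing subset either excludes x or includes x
        subsets += [s + [x] for s in subsets]
    return subsets


print(len(all_subsets([1, 2, 3])))            # 8 = 2**3
print(len(all_subsets(list(range(20)))))      # 1048576 = 2**20: doubles with each extra element
```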

Complexity of Time Efficiency
(Figure 3.22: Summary of time efficiency)

Efficiency of Algorithms
Readings: [SG] Ch. 3
Chapter Outline:
- Attributes of Algorithms
- Measuring Efficiency of Algorithms
- Simple Analysis of Algorithms
- When Things Get Out of Hand
  - Polynomial Time Algorithms (tractable)
  - Exponential Time Algorithms (intractable)
  - Approximation Algorithms (eg: bin packing)

When Things Get Out of Hand
- Polynomially bound algorithms:
  - Time complexity is of some polynomial order
  - Example: time complexity of the order of n²
- Intractable algorithms:
  - Run time is worse than polynomial time
  - Examples:
    - Hamiltonian circuit
    - Bin packing

Comparison of Time Complexities
(Figure 3.27: A comparison of four orders of magnitude. See the extended table in the tutorial problem.)

(Figure 3.25: Comparisons of lg n, n, n², and 2^n)

When Things Get Out of Hand (2)
- Exponential algorithms:
  - Θ(2^n)
  - More work than any polynomial in n
- Approximation algorithms:
  - Run in polynomial time but do not guarantee optimal solutions
  - Example: bin packing algorithms (see the sketch below)
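As an illustration of an approximation algorithm, here is a sketch of the classic first-fit heuristic for bin packing (chosen as an example; [SG] may present a different bin-packing variant). It runs in polynomial time and fills bins reasonably well, but it can use more bins than an optimal packing.

```python
def first_fit(items, capacity=1.0):
    """First-fit bin packing heuristic: place each item into the first bin it fits in.

    Runs in polynomial time (O(n^2) with this simple scan) but is only an
    approximation: it may use more bins than an optimal packing.
    """
    bins = []                              # each bin is a list of item sizes
    for item in items:
        for b in bins:
            if sum(b) + item <= capacity:  # first bin with enough room
                b.append(item)
                break
        else:
            bins.append([item])            # no existing bin fits: open a new one
    return bins


items = [0.5, 0.7, 0.5, 0.2, 0.4, 0.2, 0.5, 0.1, 0.6]
packing = first_fit(items)
print(len(packing), packing)               # number of bins used and their contents
```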

Summary of Chapter 3
- Desirable attributes in algorithms:
  - Correctness
  - Ease of understanding
  - Elegance
  - Efficiency
- Efficiency of an algorithm is extremely important:
  - Time complexity
  - Space complexity

Summary of Chapter 3 (2)
- To compare the efficiency of two algorithms that do the same task, measure their time complexities.
- Efficiency focuses on order of magnitude:
  - Time complexity in Θ notation
  - In its simplest form (eg: Θ(n), Θ(n²))

Thank you!