
1 Efficiency of Algorithms. Readings: [SG] Ch. 3. Chapter Outline: Attributes of Algorithms; Measuring Efficiency of Algorithms; Simple Analysis of Algorithms; Polynomial vs Exponential Time Algorithms.

2 Efficiency of Algorithms. Readings: [SG] Ch. 3. Chapter Outline: Attributes of Algorithms (what makes a good algorithm; key efficiency considerations); Measuring Efficiency of Algorithms; Simple Analysis of Algorithms; Polynomial vs Exponential Time Algorithms.

3 What are good algorithms? Desirable attributes in an algorithm: correctness; simplicity (ease of understanding); elegance; efficiency; embraces multiple levels of abstraction; well documented, multi-platform.

4 Attributes: Correctness, Simplicity. Correctness: does the algorithm solve the problem it is designed for? Does it solve all instances of the problem correctly? Simplicity (ease of understanding): is the algorithm easy to understand? Is it clear and concise (not tricky)? Is it easy to alter? Simplicity is important for program maintenance.

5 Attributes: Abstraction, Elegance. Multiple levels of abstraction: decomposes the problem into sub-problems; easier to understand at different levels of abstraction; usable as modules (libraries) by others. Elegance: how clever or sophisticated is the algorithm? Is it pleasing and satisfying to the designer?

6 Attributes: Efficiency, etc. Efficiency: the amount of time the algorithm requires; the amount of space the algorithm requires; the most important attribute, especially for large-scale problems. Well documented, multi-platform: is well documented with sufficient detail; is not OS-dependent, company-dependent or computer-dependent.

7 Attributes: Key Considerations. When designing algorithms, all computer scientists strive to achieve simplicity, elegance and efficiency. However, these attributes are often contradictory: simple algorithms are often slow, and efficient algorithms tend to be complicated. If you really want to learn how, take an algorithms course.

8 Efficiency of Algorithms. Readings: [SG] Ch. 3. Chapter Outline: Attributes of Algorithms; Measuring Efficiency of Algorithms (one problem, many algorithmic solutions; time complexity, space complexity; Θ notation, order of growth of functions); Simple Analysis of Algorithms; Polynomial vs Exponential Time Algorithms.

9 One Problem, Many Algorithmic Solutions. Given an algorithmic problem, there are many different algorithms to solve it. Problem: searching for a name in a list. Algorithms: sequential search (slow, takes linear time), binary search (fast, takes logarithmic time), interpolation search (not covered in UIT2201), etc.

10 Sequential Search: Idea. Search for NAME among a list of n names: start at the beginning and compare NAME to each entry in the list until a match is found.

11 Sequential Search: Pseudo-Code (Figure 3.1: Sequential Search Algorithm). The algorithm consists of an initialization block, an iteration block (the key step, where most of the work is done) and a post-processing block.

12 Recall: Algorithm Sequential Search. Precondition: the variables n, NAME and the arrays N and T have been read into memory.

Seq-Search(N, T, n, NAME);
begin
  i ← 1;
  Found ← No;
  while (Found = No) and (i <= n) do
    if (NAME = N[i]) then
      Print T[i];
      Found ← Yes;
    else
      i ← i + 1;
    endif
  endwhile
  if (Found = No) then
    Print NAME, "is not found";
  endif
end;
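
For readers who want something runnable, here is a minimal Python sketch of the same sequential search; the function name and the use of two parallel lists for names and telephone numbers mirror the pseudo-code but are otherwise illustrative assumptions.

def seq_search(names, numbers, name):
    """Sequential search: scan the list until name is found or the list ends."""
    i = 0
    found = False
    while not found and i < len(names):
        if names[i] == name:          # the dominant operation: one name comparison
            print(numbers[i])
            found = True
        else:
            i += 1
    if not found:
        print(name, "is not found")

seq_search(["ann", "bob", "carol"], ["555-0101", "555-0102", "555-0103"], "bob")   # prints 555-0102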

13 Analysis of Algorithms (introduction). We analyze an algorithm to predict its efficiency, namely the resources (time and space) that the algorithm needs during its execution. Time complexity T_A(n): the time taken by algorithm A on problems of input size n. Space complexity S_A(n): the space taken by algorithm A on problems of input size n.
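
Analysis predicts these quantities without running the program, but a quick measurement can make the definitions concrete. A rough Python sketch that times a simple linear scan (the input sizes and the scan itself are illustrative assumptions, not part of [SG]):

import time

def linear_scan(data, target):
    for x in data:                    # one comparison per element
        if x == target:
            return True
    return False

for n in [100_000, 200_000, 400_000]:
    data = list(range(n))
    start = time.perf_counter()
    linear_scan(data, -1)             # worst case: target absent, so all n comparisons
    elapsed = time.perf_counter() - start
    print(n, round(elapsed, 6))       # the time grows roughly in proportion to n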

14 Sequential Search: Analysis. The comparison of NAME against a name in the list N is the central unit of work (the dominant operation) and is used for the efficiency analysis. For lists with n entries, best case (the best case is usually not important): NAME is the first name in the list, 1 comparison, Θ(1) (roughly, a constant).

15 Sequential Search: Analysis. For lists with n entries, worst case (usually the most important): NAME is the last name in the list, or NAME is not in the list; n comparisons; Θ(n) (roughly, "proportional to n"). Average case (sometimes used): roughly n/2 comparisons, also Θ(n); here n/2 is still proportional to n, and the constant c in cn is not important (usually we let c = 1).

16 Sequential Search: Analysis. Space efficiency: uses 2n memory cells for the input names and telephone numbers, plus a few more for variables (NAME, i, Found). Space is Θ(n), which is very space efficient.

17 Viewing the Rate of Growth of T(n) = cn (Figure 3.4: Work = cn for various values of c).

18 Order of Magnitude: Order n [Linear]. All functions that have a linear "shape" are considered equivalent. As n grows large, the order of magnitude dominates the running time, minimizing the effect of coefficients and lower-order terms. Order of magnitude n: written Θ(n) (read "theta-n"); such functions vary as c × n for some constant c; this is linear time.

19 [SideTrack: Why analyze only dominant operations?] In the analysis above we analyzed only the dominant operation. This sub-section explains why, namely why we can take such short-cuts. It may help you better appreciate "analysis of algorithms", but if you have difficulties with this part you can skip it without affecting anything else.

20 Analysis of Algorithms. To estimate the running time of an algorithm without actually running it: estimate the cost (work done) of each elementary operation; analyze the number of times each operation is executed during the algorithm; then sum up the total cost (work done) by the algorithm. AIM: to conclude that we only need to analyze the dominant operation.

21 Analyzing Algorithm Sequential Search. Suppose we assume the following estimated costs for the basic operations: assignment 20, Print 4, while/if test 5, endwhile 1. Annotating each statement of Seq-Search (the pseudo-code of slide 12) with its cost and the number of times it is executed gives
T(n) = (1+20+20) + (n+1)·5 + n·(5+20+1) + (4+20) + (5+4+1) = 31n + 80 = Θ(n) [proportional to n].

22 Analyzing Algorithm Sequential Search. Now assume a different set of estimated costs: assignment 10, Print 20, while/if test 15, endwhile 0. The same accounting gives
T(n) = (10+10) + (n+1)·15 + n·(15+10) + (20+10) + (15+20) = 40n + 100 = Θ(n) [also proportional to n].

23 From the two examples above: the actual total cost differs for the different sets of estimated costs for the basic operations, but the order of growth is the same for both; all are linear, just with different constants. So, to simplify the analysis: assign a constant cost Θ(1) to each basic operation; analyze only the dominant operation(s), namely the operation that is done most often; and ignore "lower order" terms, such as operations that are done only once.

24 Simplified Analysis. Counting only dominant operations, at Θ(1) cost per basic operation (Θ(1) means a constant): each statement of Seq-Search costs Θ(1), and the statements in the loop are executed at most n times, so
T(n) = 4n × Θ(1) (counting only dominant ops) = Θ(4n) = Θ(n) [proportional to n].

25 Identifying the dominant operation. The name comparison (NAME = N[i]) is a dominant operation. It is executed n times in the worst case, each at cost Θ(1), so
T(n) = n × Θ(1) = Θ(n) [proportional to n].
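
To see the short-cut in action, one can instrument the search and count only the dominant operation. A small Python sketch (the counter and the test list are illustrative assumptions):

def seq_search_count(names, name):
    """Return (found, number of name comparisons performed)."""
    comparisons = 0
    for entry in names:
        comparisons += 1              # count only the dominant operation
        if entry == name:
            return True, comparisons
    return False, comparisons

names = ["name%d" % i for i in range(1000)]
print(seq_search_count(names, "not-in-list"))   # worst case: (False, 1000), i.e. n comparisons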

26 As the above examples show, it is sufficient to analyze only the dominant operation: it gives the same running time in Θ notation, but it is MUCH simpler. Conclusion: it is sufficient to analyze only the dominant operation. [END SideTrack: Why analyze only dominant operations.] And remember: if you have difficulties with this sub-section, you can skip it without affecting anything else.

27 Efficiency of Algorithms. Readings: [SG] Ch. 3. Chapter Outline: Attributes of Algorithms; Measuring Efficiency of Algorithms; Simple Analysis of Algorithms (Pattern Match Algorithm; Selection Sort Algorithm; Binary Search Algorithm); Polynomial vs Exponential Time Algorithms.

28 Analysis of Algorithms. To analyze algorithms: analyze the dominant operations; use Θ-notation to simplify the analysis; determine the order of the running time. This can also be applied to high-level pseudo-code: if a high-level primitive is used, analyze the running time of the primitive, express it in Θ notation, and multiply by the number of times it is called. See the example in the analysis of Pattern-Matching.

29 Analysis of the Pat-Match Algorithm. Our pattern-matching algorithm consists of two modules: Pat-Match(T, n, P, m), the "high-level" view, and Match(T, k, P, m), the "high-level" primitive. This achieves a good division of labour. Overview: to analyze it, we do a bottom-up analysis. First, analyze the time complexity of Match(T, k, P, m) (note that this takes much more than Θ(1) operations) and express it in Θ notation (simplified). Then analyze Pat-Match.

30 First, analyze the Match high-level primitive. Match(T, k, P, m) aligns T[k..k+m-1] with P[1..m]; in the illustration, T = CATATCATA, P = ATA and k = 4.

Match(T, k, P, m);
begin
  i ← 1;
  MisMatch ← No;
  while (i <= m) and (MisMatch = No) do
    if (T[k+i-1] not equal to P[i]) then
      MisMatch ← Yes
    else
      i ← i + 1
    endif
  endwhile
  Match ← not(MisMatch);   (* the opposite of MisMatch *)
end;

The dominant operation is the character comparison. In the worst case there are m comparisons, so Match takes Θ(m).

31 Next, analyze the Pat-Match algorithm.

Pat-Match(T, n, P, m);   (* finds all occurrences of P in T *)
begin
  k ← 1;
  while (k <= n-m+1) do
    if Match(T, k, P, m) = Yes then
      Print "Match at pos", k;
    endif
    k ← k + 1;
  endwhile
end;

The dominant operation is the high-level primitive Match(T, k, P, m). Match is called (n+1-m) times, and each call costs Θ(m), so the total is Θ((n+1-m)·m) = Θ(nm).
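
A compact Python rendering of the two modules may help; the function names follow the pseudo-code, and 0-based indexing is the only deliberate change (a sketch, not the textbook's code):

def match(T, k, P):
    """Return True if P occurs in T starting at index k (0-based)."""
    for i in range(len(P)):            # at most m character comparisons: Θ(m)
        if T[k + i] != P[i]:
            return False
    return True

def pat_match(T, P):
    """Print every position at which P occurs in T."""
    n, m = len(T), len(P)
    for k in range(n - m + 1):         # (n + 1 - m) calls to match, so Θ(nm) in total
        if match(T, k, P):
            print("Match at pos", k)

pat_match("CATATCATA", "ATA")          # prints positions 1 and 6 (0-based)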

32 Sorting: Problem and Algorithms. Problem: sorting. Take a list of n numbers and rearrange them in increasing order. Algorithms: Selection Sort, Θ(n²); Insertion Sort, Θ(n²); Bubble Sort, Θ(n²); Merge Sort, Θ(n lg n); Quicksort, Θ(n lg n) in the average case. Apart from selection sort, these algorithms are not covered in the course.

33 Selection Sort. Idea behind the algorithm: repeatedly find the largest number in the unsorted section and swap it to the end of that section (the front of the sorted section); this re-uses the Find-Max algorithm. (Illustration: in A[1..n], A[m] is the largest among A[1..j]; A[m] and A[j] are swapped, and everything after position j is already sorted.)

34 Selection Sort Algorithm (pseudo-code). A[m] is the largest among A[1..j]; after the swap, positions j..n form the sorted section.

Selection-Sort(A, n);
begin
  j ← n;
  while (j > 1) do
    m ← Find-Max(A, j);
    swap(A[m], A[j]);
    j ← j - 1;
  endwhile
end;
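
A runnable Python version of the same idea, with Find-Max written out as a helper (a sketch; the names find_max and selection_sort simply mirror the pseudo-code):

def find_max(A, j):
    """Return the index of the largest element among A[0..j-1]."""
    m = 0
    for i in range(1, j):
        if A[i] > A[m]:
            m = i
    return m

def selection_sort(A):
    """Sort the list A in place in increasing order."""
    j = len(A)
    while j > 1:
        m = find_max(A, j)                  # largest element in the unsorted section
        A[m], A[j - 1] = A[j - 1], A[m]     # swap it to the end of that section
        j -= 1

data = [6, 10, 13, 5, 8]
selection_sort(data)
print(data)                                 # [5, 6, 8, 10, 13]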

35-42 Example of selection sort. Starting list: 6 10 13 5 8. Each pass finds the largest value in the unsorted section (position m) and swaps it with the last unsorted position (position j):
6 10 13 5 8   (swap 13 and 8)
6 10 8 5 13   (swap 10 and 5)
6 5 8 10 13   (8 is already in place)
6 5 8 10 13   (swap 6 and 5)
5 6 8 10 13   Done.

43 Selection Sort Algorithm [SG] (Figure 3.6: Selection Sort Algorithm).

44 What about the time complexity? The dominant operation is the comparison. In the example above (n = 5), the successive passes take 4, 3, 2 and 1 comparisons.

45 Analysis of Selection Sort. Find-Max on j numbers takes (j-1) comparisons. So, when sorting n numbers: (n-1) comparisons in iteration 1 (when j = n); (n-2) comparisons in iteration 2 (when j = n-1); (n-3) comparisons in iteration 3 (when j = n-2); ...; 2 comparisons in iteration (n-2) (when j = 3); 1 comparison in iteration (n-1) (when j = 2). Total number of comparisons: Cost = (n-1) + (n-2) + ... + 2 + 1 = n(n-1)/2 = Θ(n²).
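
The comparison count is easy to confirm in code by instrumenting the selection sort sketch above (the counter and the random test data are illustrative assumptions):

import random

def count_comparisons(n):
    """Count the comparisons an instrumented selection sort makes on n random numbers."""
    A = [random.random() for _ in range(n)]
    comparisons = 0
    j = len(A)
    while j > 1:
        m = 0
        for i in range(1, j):
            comparisons += 1                 # one comparison inside Find-Max
            if A[i] > A[m]:
                m = i
        A[m], A[j - 1] = A[j - 1], A[m]
        j -= 1
    return comparisons

for n in [5, 10, 100]:
    print(n, count_comparisons(n), n * (n - 1) // 2)   # the two counts agree: n(n-1)/2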

46 Analysis of Selection Sort: Summary. Time complexity Θ(n²): n(n-1)/2 comparisons and n exchanges (swapping the largest into place), so the overall time complexity is Θ(n²). Space complexity Θ(n): Θ(n) space for the input sequence, plus a few variables. In short, Selection Sort has time complexity T(n) = Θ(n²) and space complexity S(n) = Θ(n).

47 Viewing the Rate of Growth of T(n) = cn² (Figure 3.10: Work = cn² for various values of c).

48 Order of Magnitude: Order n² [Quadratic]. All functions whose highest-order term is cn² have a similar shape and the same order of growth. A quadratic algorithm, Θ(n²) (read "theta n-squared"), is an algorithm that does cn² work for some constant c; its order of magnitude is n².

49 Comparison: Order n vs Order n². We have seen algorithms of order n (Sum, Find-Max, Find-Min, Seq-Search) and algorithms of order n² (selection sort, printing an n × n table).

50 Rate of Growth Comparison: n² vs n (Figure 3.11: A Comparison of n and n²).

51 Comparison: Θ(n²) vs Θ(n). Anything that is Θ(n²) will eventually have larger values than anything that is Θ(n), no matter what the constants are. So an algorithm that runs in Θ(n) time will outperform one that runs in Θ(n²) time. E.g. compare T1(n) = 1000n and T2(n) = 10n². (See also the tutorial problem.)
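
For this pair, 1000n < 10n² exactly when n > 100, so the linear algorithm wins for every input with more than 100 items. A two-line check in Python (the search range is an arbitrary assumption):

def T1(n): return 1000 * n        # linear algorithm with a large constant
def T2(n): return 10 * n * n      # quadratic algorithm with a small constant

crossover = next(n for n in range(1, 10_000) if T1(n) < T2(n))
print(crossover)                  # 101: from n = 101 onwards the Θ(n) algorithm is faster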

52 A Very Fast Search Algorithm. If the list is sorted, that is A1 <= A2 <= A3 <= ... <= An, then we can do better when searching (actually a lot better) by using "binary search". Example: find 9 in the sorted list 3 5 7 8 9 12 15.

53-58 Binary search. Find an element in a sorted array. IDEA: check the middle element, then recursively search one subarray. Example: find 9 in 3 5 7 8 9 12 15. Compare 9 with the middle element 8; since 9 > 8, search the right half 9 12 15. Compare 9 with its middle element 12; since 9 < 12, search the left half, which is just 9. Compare: Found!

59 Binary search: overview of the algorithm. Find an element in a sorted array: 1. Check the middle element. Keep two pointers, first and last, at the two ends of the sub-array being searched; here mid = (first + last) / 2. 2. Recursively search one subarray: move one of the two pointers to update the sub-array being searched, either last ← mid - 1 or first ← mid + 1.

60 Binary Search Algorithm (pseudo-code).

BinarySearch(A, n, x);   (* search for x in a sorted array A[1..n] *)
begin
  first ← 1;
  last ← n;
  while (first <= last) do
    mid ← (first + last) div 2;
    if (x = A[mid]) then
      print "Found x in pos", mid;
      Stop
    else
      if (x < A[mid]) then
        last ← mid - 1;
      else
        first ← mid + 1;
      endif
    endif
  endwhile
  print "x not found";
end;
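
The same algorithm in Python, iterative and 0-based (a sketch; returning an index instead of printing is a small deliberate change):

def binary_search(A, x):
    """Return the index of x in the sorted list A, or -1 if x is not present."""
    first, last = 0, len(A) - 1
    while first <= last:
        mid = (first + last) // 2
        if A[mid] == x:
            return mid
        elif x < A[mid]:
            last = mid - 1            # discard the right half
        else:
            first = mid + 1           # discard the left half
    return -1

print(binary_search([3, 5, 7, 8, 9, 12, 15], 9))   # 4, the 0-based position of 9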

61 Binary Search: How fast is it? (1/3). Starting with n numbers, binary search repeatedly halves the size until it reaches 1 element to check. E.g. when n = 100: after 1 step the size is <= 50; after 2 steps, <= 25; after 3 steps, <= 12; after 4 steps, <= 6; after 5 steps, <= 3; after 6 steps, <= 1; one last comparison, and DONE! That is about 7 ≈ log2(100) steps. (Déjà vu? Repeated halving.)
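
The repeated halving can be counted directly. A tiny Python sketch (the sample sizes are arbitrary):

import math

def halving_steps(n):
    """Count how many times n can be halved before reaching 1."""
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

for n in [100, 1_000, 1_000_000, 1_000_000_000]:
    print(n, halving_steps(n), round(math.log2(n)))   # the two counts agree to within 1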

62 Binary Search: How fast is it? (2/3). Starting with n numbers, binary search repeatedly halves the size until it reaches 1 element to check, so the number of steps is lg n = log2 n. Binary search therefore has complexity T(n) = Θ(lg n). Recall the facts about T(n) = lg n: when 2^k = n, taking lg base 2 gives k = log2 n = lg n, the number of steps of repeated halving.

63 Binary Search: How fast is it? (3/3). The number of steps is lg n = log2 n, so binary search has complexity T(n) = Θ(lg n), which is very fast. For example: n = 1,000 gives T(n) = lg n ≈ 10; n = 1,000,000 gives ≈ 20; n = 1,000,000,000 gives ≈ 30.

64 Summary: Searching Algorithms. Sequential Search: worst case n comparisons; best case 1 comparison; average case n/2 comparisons. Binary Search: worst case lg n comparisons; best case 1 comparison; average case lg n comparisons*. (*How do we get the average case? By mathematical analysis. This is doable for small n (see the example in the tutorial, and read Sect 3.4.2 of [SG3]); for general n the analysis is complex and beyond the scope of this course.)

65 Comparison: order n vs order lg n (Figure 3.21: A Comparison of n and lg n).

66 Complexity of Algorithms. A logarithmic time algorithm: Binary Search, Θ(lg n) time. Linear time algorithms: Sum(A,n), Θ(n) time; Sequential-Search(A,n,x), Θ(n) time. A quadratic time algorithm: Selection Sort, Θ(n²) time. An exponential time algorithm: All-Subsets(A,n), Θ(2ⁿ) time.
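
To make the exponential case concrete, here is one possible All-Subsets routine in Python; the name follows the slide, but this recursive formulation is an illustrative assumption rather than the textbook's code:

def all_subsets(A):
    """Return a list containing all 2^n subsets of the list A."""
    if not A:
        return [[]]
    rest = all_subsets(A[1:])
    return rest + [[A[0]] + s for s in rest]      # each extra element doubles the count

for n in range(1, 6):
    print(n, len(all_subsets(list(range(n)))))    # 2, 4, 8, 16, 32: Θ(2^n) growth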

67 Complexity of Time Efficiency (Figure 3.22: Summary of Time Efficiency).

68 Efficiency of Algorithms. Readings: [SG] Ch. 3. Chapter Outline: Attributes of Algorithms; Measuring Efficiency of Algorithms; Simple Analysis of Algorithms; When Things Get Out of Hand (polynomial time algorithms, which are tractable; exponential time algorithms, which are intractable; approximation algorithms, e.g. bin packing).

69 When Things Get Out of Hand. Polynomially bound algorithms: the time complexity is some polynomial order, for example order n². Intractable algorithms: the running time is worse than polynomial time; examples include the Hamiltonian circuit problem and bin packing.

70 Comparison of Time Complexities (Figure 3.27: A Comparison of Four Orders of Magnitude). See the extended table in the tutorial problem.

71 Figure 3.25: Comparisons of lg n, n, n², and 2ⁿ.

72 When Things Get Out of Hand (2). An exponential algorithm, Θ(2ⁿ), does more work than any polynomial in n. Approximation algorithms run in polynomial time but do not always give optimal solutions; example: bin packing algorithms.
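
As an illustration of such an approximation, here is a sketch of the classic first-fit heuristic for bin packing; the choice of heuristic and the sample data are assumptions for illustration, not necessarily the variant discussed in [SG]:

def first_fit(items, capacity):
    """Pack items into bins of the given capacity using the first-fit heuristic.

    Runs in polynomial time but may use more bins than an optimal packing."""
    bins = []                          # each bin is the list of item sizes placed in it
    for item in items:
        for b in bins:
            if sum(b) + item <= capacity:
                b.append(item)         # put the item in the first bin with enough room
                break
        else:
            bins.append([item])        # no existing bin has room: open a new bin
    return bins

print(first_fit([5, 7, 5, 2, 4, 2, 5, 1, 6], 10))
# [[5, 5], [7, 2, 1], [4, 2], [5], [6]]: 5 bins, while an optimal packing uses 4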

73 Summary of Chapter 3. Desirable attributes in algorithms: correctness, ease of understanding, elegance, efficiency. The efficiency of an algorithm is extremely important; it covers both time complexity and space complexity.

74 Summary of Chapter 3 (2). To compare the efficiency of two algorithms that do the same task, measure their time complexities. Efficiency focuses on order of magnitude: time complexity in Θ-notation, in its simplest form (e.g. Θ(n), Θ(n²)).

75 Thank you!

