Algorithms
April-May 2013, Dr. Youn-Hee Han
The Project for Establishing the Korea-Vietnam College of Technology in Bac Giang
Algorithm Efficiency

Given a set of algorithms for the same problem, how do we compare their efficiency (or complexity)?
- Function with neither loop nor recursion: execution time is constant.
- Function containing a loop or recursion: execution time is a function of the input size n.
  Ex) finding a particular student in an enrollment list

General format of algorithm efficiency:
- Efficiency of an algorithm: f(n), where n is the size of the input.
- Which case is considered? Mostly the worst case.
Algorithm Efficiency

Linear Loops
- Execution time is proportional to n.

    for (i = 0; i < n; i++) {
        // application code
    }
    f(n) = n

    for (i = 0; i < n; i += 2) {
        // application code
    }
    f(n) = n/2
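The two counts above can be checked by simply counting iterations. A minimal sketch (not from the slides; class and method names are made up for illustration):

```java
public class LinearLoops {
    // Loop stepping by 1: executes exactly n times, so f(n) = n.
    static int countStep1(int n) {
        int count = 0;
        for (int i = 0; i < n; i++) count++;
        return count;
    }

    // Loop stepping by 2: executes about n/2 times (exactly ceil(n/2)).
    static int countStep2(int n) {
        int count = 0;
        for (int i = 0; i < n; i += 2) count++;
        return count;
    }

    public static void main(String[] args) {
        System.out.println(countStep1(1000)); // 1000
        System.out.println(countStep2(1000)); // 500
    }
}
```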
Algorithm Efficiency

Logarithmic Loops
- Execution time is proportional to log2(n).

  Multiply:
    for (i = 1; i <= n; i *= 2) {
        // application code
    }

  Divide:
    for (i = n; i >= 1; i /= 2) {
        // application code
    }

- Termination conditions:
  Multiply: 2^iterations > n
  Divide: n / 2^iterations < 1

  f(n) = log2(n)
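Counting iterations confirms the logarithmic behavior; the exact count is floor(log2(n)) + 1, which the slide approximates as log2(n). A sketch (names are made up):

```java
public class LogLoops {
    // Multiply loop: i = 1, 2, 4, ... while i <= n; runs floor(log2(n)) + 1 times.
    static int countMultiply(int n) {
        int count = 0;
        for (long i = 1; i <= n; i *= 2) count++;
        return count;
    }

    // Divide loop: i = n, n/2, n/4, ... while i >= 1; runs floor(log2(n)) + 1 times.
    static int countDivide(int n) {
        int count = 0;
        for (long i = n; i >= 1; i /= 2) count++;
        return count;
    }

    public static void main(String[] args) {
        System.out.println(countMultiply(1024)); // 11 = log2(1024) + 1
        System.out.println(countDivide(1024));   // 11
    }
}
```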
Algorithm Efficiency

Nested Loops
- Basic formula: total iterations = outer loop iterations * inner loop iterations
- Quadratic:

    for (i = 0; i < n; i++) {      // outer loop: n iterations
        for (j = 0; j < n; j++) {  // inner loop: n iterations
            // application code
        }
    }
    Total iterations: f(n) = n^2
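The basic formula can be verified directly: n outer iterations times n inner iterations gives n^2 (a sketch, names made up):

```java
public class NestedLoops {
    // Two independent loops of n iterations each: total = n * n.
    static long countQuadratic(int n) {
        long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                count++;
        return count;
    }

    public static void main(String[] args) {
        System.out.println(countQuadratic(100)); // 10000
    }
}
```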
Algorithm Efficiency

Nested Loops (continued)
- Dependent quadratic:

    for (i = 0; i < n; i++) {       // outer loop: n iterations
        for (j = 0; j <= i; j++) {  // inner loop: (n+1)/2 iterations on average
            // application code
        }
    }
    f(n) = n(n+1)/2

- Linear logarithmic:

    for (i = 0; i < n; i++) {         // outer loop: n iterations
        for (j = 1; j < n; j *= 2) {  // inner loop: log2(n) iterations
            // application code
        }
    }
    f(n) = n log2(n)
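Both totals can again be checked by counting (a sketch, names made up). Note the inner doubling loop must start at j = 1; starting at 0 would never terminate, since 0 * 2 stays 0:

```java
public class DependentLoops {
    // Inner loop runs i+1 times for i = 0..n-1: total = 1 + 2 + ... + n = n(n+1)/2.
    static long countDependent(int n) {
        long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j <= i; j++)
                count++;
        return count;
    }

    // Inner loop doubles j from 1: about log2(n) iterations, repeated n times.
    static long countLinearLog(int n) {
        long count = 0;
        for (int i = 0; i < n; i++)
            for (long j = 1; j < n; j *= 2)
                count++;
        return count;
    }

    public static void main(String[] args) {
        System.out.println(countDependent(1024)); // 1024 * 1025 / 2 = 524800
        System.out.println(countLinearLog(1024)); // 1024 * log2(1024) = 10240
    }
}
```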
Algorithm Efficiency

Worst Case Considered - 1/2
- Sequential Search

    index seqsearch(int n, keytype[] S, keytype x)
    {
        index location;
        location = 1;
        while (location <= n && S[location] != x)
            location++;
        if (location > n)
            location = 0;
        return location;
    }
    f(n) = n
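The pseudocode above is 1-based and returns 0 when the key is absent. A Java translation using 0-based arrays and -1 for "not found" (a sketch, not the slides' exact code):

```java
public class SeqSearch {
    // Scan left to right; the worst case examines all n elements: f(n) = n.
    static int seqSearch(int[] s, int x) {
        int location = 0;
        while (location < s.length && s[location] != x)
            location++;
        return (location < s.length) ? location : -1; // -1 means "not found"
    }

    public static void main(String[] args) {
        int[] s = {7, 3, 9, 1, 5};
        System.out.println(seqSearch(s, 9)); // 2
        System.out.println(seqSearch(s, 4)); // -1
    }
}
```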
Algorithm Efficiency

Worst Case Considered - 2/2
- Binary Search

    index binsearch(int n, keytype[] S, keytype x)
    {
        index location, low, high, mid;
        low = 1; high = n;
        location = 0;
        while (low <= high && location == 0) {
            mid = (low + high) / 2;
            if (x == S[mid])
                location = mid;
            else if (x < S[mid])
                high = mid - 1;
            else
                low = mid + 1;
        }
        return location;
    }
    f(n) = log2(n) + 1
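A 0-based Java version of the same algorithm (a sketch; it returns -1 for "not found" instead of 0, and assumes the array is sorted in ascending order):

```java
public class BinSearch {
    // Halves the search range each step: worst case f(n) = log2(n) + 1 comparisons.
    static int binSearch(int[] s, int x) {
        int low = 0, high = s.length - 1;
        while (low <= high) {
            int mid = (low + high) >>> 1; // unsigned shift avoids overflow on huge arrays
            if (s[mid] == x) return mid;
            else if (x < s[mid]) high = mid - 1;
            else low = mid + 1;
        }
        return -1; // not found
    }

    public static void main(String[] args) {
        int[] s = {1, 3, 5, 7, 9, 11};
        System.out.println(binSearch(s, 7)); // 3
        System.out.println(binSearch(s, 4)); // -1
    }
}
```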
Asymptotic Complexity

Comparison of Algorithms
- Given two algorithms whose complexities are f1(n) and f2(n) respectively:
  - Is the order of magnitude important? n vs. n^2, n^2 vs. n^3, ...
  - Is the constant factor important? 0.5n vs. 10n, n^2 vs. 2n^2, n vs. log n, ...
- More difficult cases:
  - 5n vs. n^2
  - 100n vs. 0.001n^2
  - 100000000n vs. 0.000000001n^2
Asymptotic Complexity

Constant vs. Order
- Comparing c1*n with c2*n^2 (c1 and c2 are constants):
  Regardless of c1 and c2, there exists a break-even point beyond which c2*n^2 is always larger.
- Consequently:
  - The order is important; the constant can be neglected.
  - That is, we consider the behavior as n goes to infinity.
  - We usually use "asymptotic complexity"!

(plot of c1*n vs. c2*n^2 and their break-even point omitted)
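The break-even point can be computed directly: c2*n^2 overtakes c1*n exactly when n exceeds c1/c2. A sketch (the constants used in main are illustrative):

```java
public class BreakEven {
    // Smallest integer n with c2*n^2 > c1*n, i.e. the first n strictly above c1/c2.
    static long breakEven(double c1, double c2) {
        return (long) Math.floor(c1 / c2) + 1;
    }

    public static void main(String[] args) {
        System.out.println(breakEven(5, 1));     // 6: beyond n = 5, n^2 exceeds 5n
        System.out.println(breakEven(100, 0.5)); // 201: beyond n = 200, 0.5n^2 exceeds 100n
    }
}
```

However large c1 is and however small c2 is, the break-even point is finite; only its location moves.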
Asymptotic Complexity

Asymptotic analysis
- The use of asymptotic analysis to estimate algorithm complexity.
- A method of describing limiting behavior, commonly associated with big-O notation.
- Roughly, O(f(n)) represents "order of" f(n):
  - 1, 3, 100, ... -> O(1): constant complexity (independent of input size)
  - n, 0.5n, 10n, 1000000n, ... -> O(n): linear complexity
  - n^2, 0.5n^2, 10n^2, 1000000n^2, ... -> O(n^2): quadratic complexity
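One way to see why the constant does not change the class: doubling n doubles any c*n but quadruples any c*n^2, whatever c is. A sketch (names and sample constants are made up):

```java
public class DoublingRatio {
    // Growth ratio f(2n)/f(n) for f(n) = c * n^k: equals 2^k, independent of c.
    static double ratio(double c, int k, long n) {
        return (c * Math.pow(2 * n, k)) / (c * Math.pow(n, k));
    }

    public static void main(String[] args) {
        System.out.println(ratio(0.5, 1, 1000));     // 2.0: O(n) with a small constant
        System.out.println(ratio(1000000, 1, 1000)); // 2.0: O(n) with a huge constant
        System.out.println(ratio(10, 2, 1000));      // 4.0: O(n^2)
    }
}
```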
Asymptotic Complexity

Growth of Function Values
- Ordering of complexities:
  O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n) < O(n!)

(plot of function values omitted)
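The ordering can be checked numerically at a moderate n; n = 16 keeps even 2^n and n! in range (a sketch, names made up):

```java
public class Growth {
    static long factorial(int n) {
        long f = 1;
        for (int i = 2; i <= n; i++) f *= i;
        return f;
    }

    // Value of each complexity function at n, in the slide's order.
    static double[] values(int n) {
        double logN = Math.log(n) / Math.log(2);
        return new double[] {
            1,                  // O(1)
            logN,               // O(log n)
            n,                  // O(n)
            n * logN,           // O(n log n)
            (double) n * n,     // O(n^2)
            (double) n * n * n, // O(n^3)
            Math.pow(2, n),     // O(2^n)
            (double) factorial(n) // O(n!)
        };
    }

    public static void main(String[] args) {
        for (double v : values(16)) System.out.println(v);
    }
}
```

At n = 16 the values are roughly 1, 4, 16, 64, 256, 4096, 65536, and 2.09e13: each class dwarfs the one before it.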
Asymptotic Complexity

Growth of Function Values
- If n is sufficiently large, only the order of complexity matters; the constant coefficient is negligible unless n is small.
  Ex) If n is sufficiently large, 100n + 1 <= 0.00001 n^2
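The example's exact crossover can be found with integer arithmetic: multiplying 100n + 1 <= 0.00001 n^2 through by 100000 gives 10000000n + 100000 <= n^2 (a sketch, names made up):

```java
public class Crossover {
    // Smallest n with n^2 >= 10^7 * n + 10^5, i.e. 0.00001*n^2 >= 100n + 1.
    static long crossover() {
        long n = 1;
        while (n * n < 10_000_000L * n + 100_000L) n++;
        return n;
    }

    public static void main(String[] args) {
        System.out.println(crossover()); // 10000001
    }
}
```

So "sufficiently large" here means n above about ten million; from that point on the tiny-constant quadratic is the bigger one forever.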
Asymptotic Complexity

Linear Time Algorithm
    #define MAX 9000000
    for i = 1000 to N-20000
        for j = 2 to MAX do {
            Read A[i, j];
            Write A[i, j];
        }
  Operation count: (N - 20999) * (MAX - 1) * 2 -> O(N)

Quadratic Time Algorithm
    for i = 1 to N
        for j = 2 to (N-1) do {
            Read A[i, j];
            Write A[i, j];
        }
  Operation count: 2 * ((N-1) - 2 + 1) * N = 2(N-2)N -> O(N^2)
Algorithm Efficiency

Constant Time Algorithm
    x = 20;
    printf("%d", x);
  Operation count: 2 = 2N^0 -> O(N^0) = O(1)
Algorithm Efficiency

How to handle an "if statement"
- Consider the worst case!
- The following is O(N^2), because the worst case takes the quadratic branch:

    if (x == 1) {
        for i = 1 to N
            for j = 2 to (N-1) do {
                Read A[i, j];
                Write A[i, j];
            }
    } else {
        x = 20;
        printf("%d", x);
    }
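The "take the worst branch" rule can be illustrated by counting the operations each branch would perform and keeping the maximum (a sketch; the operation costs mirror the pseudocode above, counting each Read and Write as one operation):

```java
public class IfWorstCase {
    // Branch taken when x == 1: the nested loops do 2(N-2)N operations.
    static long thenBranchOps(int n) {
        long ops = 0;
        for (int i = 1; i <= n; i++)
            for (int j = 2; j <= n - 1; j++)
                ops += 2; // one read + one write
        return ops;
    }

    // Branch taken otherwise: one assignment and one print, 2 operations.
    static long elseBranchOps() {
        return 2;
    }

    // Worst-case cost of the whole if statement: the more expensive branch.
    static long worstCaseOps(int n) {
        return Math.max(thenBranchOps(n), elseBranchOps());
    }

    public static void main(String[] args) {
        System.out.println(worstCaseOps(100)); // 2 * (100 - 2) * 100 = 19600
    }
}
```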
Measuring Performance of Algorithms

Practical measurement
- Start a timer before the algorithm starts and stop it when it ends.

    long start = System.nanoTime();
    your_function();
    long finish = System.nanoTime();
    System.out.println(((double)(finish - start)) / 1000000000.0); // in seconds
    System.out.println(((double)(finish - start)) / 1000000.0);    // in milliseconds
    System.out.println(((double)(finish - start)) / 1000.0);       // in microseconds
[Programming Practice 2]

Fibonacci Sequence: Recursive vs. Iterative
- Visit http://link.koreatech.ac.kr/courses/2013_1/AP-KOICA/AP-KOICA20131.html
- Download "FibonacciMain.java" and run it
- Analyze the source code
- Complete the source code by inserting the right code within the two functions:
    public static long iterativeFibonacci(int num)
    public static long recursiveFibonacci(int num)
- Compare the execution times of the two functions
- What are the maximum values of the parameter "num" for the two functions?
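One possible completion of the two functions (a sketch only; the actual FibonacciMain.java may differ in details such as base cases or the surrounding class):

```java
public class Fibonacci {
    // Iterative version: O(num) time, O(1) space.
    public static long iterativeFibonacci(int num) {
        if (num == 0) return 0;
        long prev = 0, curr = 1;
        for (int i = 2; i <= num; i++) {
            long next = prev + curr;
            prev = curr;
            curr = next;
        }
        return curr;
    }

    // Recursive version: O(2^num) time, since each call spawns two more calls.
    public static long recursiveFibonacci(int num) {
        if (num <= 1) return num;
        return recursiveFibonacci(num - 1) + recursiveFibonacci(num - 2);
    }

    public static void main(String[] args) {
        System.out.println(iterativeFibonacci(10)); // 55
        System.out.println(recursiveFibonacci(10)); // 55
    }
}
```

This contrast is the point of the exercise: the iterative version is limited only by long overflow (around num = 92), while the exponential recursive version becomes impractically slow long before that.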