Algorithms: Searching and Sorting, Algorithm Efficiency


1 Algorithms Searching and Sorting Algorithm Efficiency

2 Efficiency
A computer program should be totally correct, but it should also:
execute as quickly as possible (time-efficiency)
use memory wisely (storage-efficiency)
How do we compare programs (or algorithms in general) with respect to execution time?
various computers run at different speeds due to different processors
compilers optimize code before execution
the same algorithm can be written differently depending on the programming language used

3 Example: Linear Search
Input:
List A of n unique data values, indexed from 0 to n-1. (The data values are not in any specific order.)
The desired data value (target) we're searching for.
Output: The index of the location of the target in the list, or -1 if the target is not found.

4 Example: Linear Search
Algorithm:
1. Set index = 0.
2. While index < n and A[index] ≠ target, do the following:
a. Add 1 to index.
3. If index ≤ n-1, output index; otherwise, output -1.
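
The steps above can be sketched directly in Python (the function name `linear_search` is mine, not from the slides):

```python
def linear_search(a, target):
    """Return the index of target in list a, or -1 if it is absent."""
    index = 0
    # Step 2: advance while in bounds and the current value is not the target.
    while index < len(a) and a[index] != target:
        index += 1
    # Step 3: stopping with index <= n-1 means we stopped on a match.
    return index if index <= len(a) - 1 else -1

print(linear_search([40, 10, 30, 20], 30))  # 2
print(linear_search([40, 10, 30, 20], 99))  # -1
```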

5 Example: Linear Search
Algorithm:
1. Set index = 0.
2. While index < n and A[index] ≠ target, do the following:
a. Add 1 to index.
3. If index ≤ n-1, output index; otherwise, output -1.
[Flowchart of the algorithm: set index to 0; loop while index < n and A[index] ≠ target, adding 1 to index each time; on exit, output index if index ≤ n-1, otherwise output -1.]

6 Example: Linear Search
Algorithm:
1. Set index = 0.
2. While index < n and A[index] ≠ target, do the following:
a. Add 1 to index.
3. If index ≤ n-1, output index; otherwise, output -1.
[Flowchart with the loop condition negated: exit the loop when index >= n or A[index] = target.]
DeMorgan's Law: not (A and B) = (not A) or (not B)

7 Some questions
What is the maximum number of data values we need to examine, as a function of n? (i.e., the worst case)
What is the minimum number of data values we need to examine, as a function of n? (i.e., the best case)
How much storage do we need to hold all of the information for use in this algorithm, as a function of n?

8 Linear Search
We measure time efficiency by counting the number of operations performed by the algorithm. But what is an operation?
1. Set index = 0.
2. While index ≤ n-1 and A[index] ≠ target, do the following:
a. Add 1 to index.
3. If index ≤ n-1, output index; otherwise, output -1.

9 Counting Operations
We typically count operations that are a function of the amount of data (n) that we have to process.
Abstraction: We don't count individual operators (+, -, ...). We count more general operations: assignments (← or "Set ... equal to ...") or comparisons (<, >, =, ≠).
For linear search, we might count comparisons. How many comparisons does linear search require in the worst case? n+2

10 Example: Binary Search
Input:
List A of n unique data values. NOTE: The data values must be sorted in increasing order.
The desired data value (target) we're searching for.
Output: The index of the location of the target in the array, or -1 if the target is not found.

11 Example: Binary Search
Algorithm:
1. Set i = 0 and j = n-1.
2. Repeat:
a. mid = (i + j) / 2 (ignore any remainder).
b. If target < A[mid], set j = mid-1. If target > A[mid], set i = mid+1.
until i > j or A[mid] = target.
3. If i ≤ j, output mid. Otherwise, output -1.

12 Example: Binary Search
Algorithm:
1. Set i = 0 and j = n-1.
2. Repeat:
a. mid = (i + j) / 2 (ignore any remainder).
b. If target < A[mid], set j = mid-1. If target > A[mid], set i = mid+1.
until i > j or A[mid] = target.
3. If i ≤ j, output mid. Otherwise, output -1.
[Flowchart of the algorithm: compute mid; compare target with A[mid] and narrow the range to the left half (j ← mid-1) or the right half (i ← mid+1); exit when i > j or A[mid] = target.]
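
The algorithm above can be sketched in Python; I have restructured the slides' repeat-until loop as a pre-test while loop so that an empty list is handled safely (the function name `binary_search` is mine):

```python
def binary_search(a, target):
    """Return the index of target in sorted list a, or -1 if it is absent."""
    i, j = 0, len(a) - 1
    while i <= j:
        mid = (i + j) // 2        # (i + j) / 2, ignoring any remainder
        if a[mid] == target:
            return mid
        elif target < a[mid]:
            j = mid - 1           # target can only be in the left half
        else:
            i = mid + 1           # target can only be in the right half
    return -1                     # i > j: the search range is empty

print(binary_search([10, 20, 30, 40, 50], 40))  # 3
print(binary_search([10, 20, 30, 40, 50], 35))  # -1
```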

13 Counting Operations Again
For binary search, consider the worst-case scenario (target is not in the list).
How many times can we split the list before the list becomes empty?
n=15: 15 → 7 → 3 → 1 → 0 (4 splits)
In general, we can split the list in half approximately ⌊log₂ n⌋ + 1 times before it becomes empty.
Recall the log function: logₐ b = c is equivalent to aᶜ = b
Examples: log₂ 128 = 7; log₂ n = 5 implies n = 32
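
The halving count on this slide can be checked against the ⌊log₂ n⌋ + 1 formula with a short sketch (the function name `split_count` is mine):

```python
import math

def split_count(n):
    """Count how many times a list of size n can be halved before it is empty."""
    count = 0
    while n > 0:
        n //= 2          # e.g. 15 -> 7 -> 3 -> 1 -> 0
        count += 1
    return count

for n in [15, 31, 32, 100]:
    # Both columns agree: the halving count equals floor(log2 n) + 1.
    print(n, split_count(n), math.floor(math.log2(n)) + 1)
```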

14 Order of Complexity
For very large n, we express the number of operations as the order of complexity.
Order of complexity for worst-case behavior is often expressed using Big-O notation:
Number of operations    Order of Complexity
n                       O(n)
n/2 + 6                 O(n)
2n + 9                  O(n)
Usually it doesn't matter what the constants are... we are only concerned about the highest power of n.

15 O(n) ("Linear")
[Graph: number of operations vs. n (amount of data) for the functions n, n/2 + 6, and 2n + 9; all three grow linearly.]

16 O(n)
[Graph: number of operations vs. n (amount of data) for the function n.]

17 Order of Complexity
Number of operations    Order of Complexity
n²                      O(n²)
2n² + 7                 O(n²)
n²/2 + 5n + 2           O(n²)
Usually it doesn't matter what the constants are... we are only concerned about the highest power of n.

18 O(n²) ("Quadratic")
[Graph: number of operations vs. n (amount of data) for the functions n², 2n² + 7, and n²/2 + 5n + 2.]

19 O(n²)
[Graph: number of operations vs. n (amount of data) for the function n².]

20 Order of Complexity
Number of operations    Order of Complexity
log₂ n                  O(log n)
log₁₀ n                 O(log n)
2(log₂ n) + 5           O(log n)
The logarithm base is not written in big O notation, since all that matters is that the function is logarithmic.
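
The claim that the base does not matter can be checked numerically: logarithms in different bases differ only by a constant factor, which big O ignores. A quick sketch:

```python
import math

# log2(n) = log10(n) * log2(10), so log2 n and log10 n differ only by
# the constant factor log2(10) ~= 3.32, independent of n.
for n in [100, 10_000, 1_000_000]:
    ratio = math.log2(n) / math.log10(n)
    print(n, round(ratio, 4))   # the same constant for every n
```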

21 O(log n) ("Logarithmic")
[Graph: number of operations vs. n (amount of data) for the functions log₂ n, log₁₀ n, and 2(log₂ n) + 5.]

22 O(log n)
[Graph: number of operations vs. n (amount of data) for the function log₂ n.]

23 O(n log n)
[Graph: number of operations vs. n (amount of data) for n log₂ n = O(n log n). Not drawn to scale.]

24 O(1) ("Constant")
[Graph: a constant number of operations, independent of n (amount of data): O(1).]

25 O(2ⁿ)
[Graph: number of operations vs. n (amount of data) for 2ⁿ = O(2ⁿ). Definitely not drawn to scale.]

26 Comparing Big O Functions
[Graph comparing growth rates of the number of operations vs. n, slowest-growing to fastest-growing: O(1), O(log n), O(n), O(n log n), O(n²), O(2ⁿ).]

27 Searches
Linear Search: count comparisons as operations. Worst case: O(N)
Binary Search: count comparisons as operations. Worst case: O(log N)
Which algorithm is better? In general, for large values of N, log N < N, so binary search is better for large N. BUT, binary search requires a sorted list.

28 Selection Sort
Find the smallest data value in the list from index 0 to index n-1 and swap it into position 0 of the list.
Find the smallest data value in the list from index 1 to index n-1 and swap it into position 1 of the list.
Find the smallest data value in the list from index 2 to index n-1 and swap it into position 2 of the list.
...etc...
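
The repeated find-and-swap passes described above can be sketched in Python (the function name `selection_sort` is mine):

```python
def selection_sort(a):
    """Sort list a in place by repeatedly swapping the minimum forward."""
    n = len(a)
    for pos in range(n - 1):
        # Find the smallest data value from index pos to index n-1 ...
        smallest = pos
        for i in range(pos + 1, n):
            if a[i] < a[smallest]:
                smallest = i
        # ... and swap it into position pos of the list.
        a[pos], a[smallest] = a[smallest], a[pos]

data = [29, 10, 14, 37, 13]
selection_sort(data)
print(data)  # [10, 13, 14, 29, 37]
```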

29 Selection Sort
What is the worst-case order of complexity of selection sort on n data values? Count the comparisons as operations.
First pass through: n-1 comparisons
Second pass through: n-2 comparisons
...
Last (n-1st) pass through: 1 comparison
Order of complexity: (n-1) + (n-2) + ... + 2 + 1 = n(n-1)/2 = (n²-n)/2 = O(n²)
We call selection sort a "quadratic sort".

30 MergeSort
Split the list into 2 halves.*
If the size of the half-list is not 1, sort each half using merge sort (recursively).
Set up a new list.
Merge step: Move the smallest of each half-list into the new list. Repeat this step until one of the half-lists is empty. Then copy the remaining data in the non-empty half-list into the new list.
*If the list has an odd number of elements, one list will have one extra element.
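
The split-sort-merge steps above can be sketched in Python (the function name `merge_sort` is mine; this version returns a new list rather than sorting in place):

```python
def merge_sort(a):
    """Return a new sorted list containing the elements of a."""
    if len(a) <= 1:
        return a                         # a half-list of size 1 is sorted
    mid = len(a) // 2                    # split the list into 2 halves
    left = merge_sort(a[:mid])           # sort each half using merge sort
    right = merge_sort(a[mid:])
    # Merge step: move the smallest front element of each half into merged.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])              # copy whatever remains in the
    merged.extend(right[j:])             # non-empty half-list
    return merged

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]
```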

31 MergeSort
[Diagram: merge sort on an example list; the list is recursively split into halves.]

32 MergeSort
[Diagram: merge sort on an example list; the sorted halves are merged back together.]

33 Merge Sort
What is the order of complexity of merge sort?
Let C(n) = total number of operations performed for merge sort on n data values. Operations include comparisons in the merge process.
C(1) = 0
Maximum number of comparisons to merge 2 lists of size n/2 into one list of size n: n-1 comparisons.
C(n) = C(n/2) + C(n/2) + n-1 = 2C(n/2) + n-1
This is a recurrence relation (it's recursive!).
Solution: C(n) = n log₂ n - n + 1 = O(n log n).
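
The stated solution to the recurrence can be checked numerically for powers of 2 (where n/2 is exact); the function name `C` simply mirrors the slide's notation:

```python
import math

def C(n):
    """Comparison count from the recurrence C(1) = 0, C(n) = 2*C(n/2) + n - 1."""
    if n == 1:
        return 0
    return 2 * C(n // 2) + n - 1

for n in [2, 8, 64]:
    closed_form = n * math.log2(n) - n + 1
    # The recurrence and the closed form n*log2(n) - n + 1 agree.
    print(n, C(n), closed_form)
```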

34 Order of Complexity
When an algorithm consists of part A followed by part B, the overall order of complexity of the algorithm is max(O(A), O(B)).
Examples:
O(log N) + O(N) = O(N)
O(N log N) + O(N) = O(N log N)
O(N log N) + O(N²) = O(N²)
O(2ᴺ) + O(N²) = O(2ᴺ)

35 Order of Complexity
When part B of an algorithm is nested inside part A (example: nested loops), and O(A) does not include the complexity of part B, the overall order of complexity is O(A * B).
Examples:
O(log N) * O(N) = O(N log N)
O(N log N) * O(N) = O(N² log N)
O(N) * O(1) = O(N)

36 Searches Revisited
What if you wanted to use binary search on data that is not sorted?
Sort first: O(n log n) using Merge Sort
Then use binary search: O(log n)
Total runtime complexity: O(n log n) + O(log n) = O(n log n)
Now, how does this compare to linear search?

37 Comparing Algorithms
Assume an algorithm processes n data values. If each operation takes 1 μs to execute, how many μs will it take to run the algorithm on 100 data values if the algorithm has the following number of computations?
Number of Computations    Execution Time
n                         100 μs
n log₂ n                  665 μs
n²                        10,000 μs
n³                        1,000,000 μs = 1 sec
2ⁿ                        > 10³⁰ μs
n!                        > 10¹⁵⁷ μs