Computer Science 112 Fundamentals of Programming II Finding Faster Algorithms

Bubble Sort Strategy

Compare the first two items and, if they are out of order, exchange them

Repeat this process for the second and third items, etc.

At the end of this pass, the largest item will have bubbled to the end of the list

Repeat this process for the unsorted portion of the list, etc.

Formalize the Strategy

set n to the length of the list
while n > 1
    bubble the elements from position 0 to position n - 1
    decrement n

Refine the Strategy

set n to the length of the list
while n > 1
    for each position i from 1 to n - 1
        if the elements at i and i - 1 are out of order
            swap them
    decrement n

Implement bubbleSort

def bubbleSort(lyst):
    n = len(lyst)
    while n > 1:                       # Do n - 1 bubbles
        for i in range(1, n):          # Start each bubble
            if lyst[i] < lyst[i - 1]:  # Swap if needed
                swap(lyst, i, i - 1)
        n -= 1

Analysis: How many iterations does the outer loop perform? How many iterations does the inner loop perform?
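
The swap helper called here (and again in the quicksort code later) is not shown on the slides. A minimal sketch of what it presumably looks like, exchanging two items in place:

def swap(lyst, i, j):
    """Exchange the items at positions i and j in place."""
    lyst[i], lyst[j] = lyst[j], lyst[i]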

Improving bubbleSort

def bubbleSort(lyst):
    n = len(lyst)
    while n > 1:
        isSorted = True
        for i in range(1, n):
            if lyst[i] < lyst[i - 1]:
                swap(lyst, i, i - 1)
                isSorted = False
        if isSorted:
            break
        n -= 1

Analysis: Best, worst, average cases?
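
One way to see the payoff of the isSorted flag is to count passes. The pass-counting wrapper below is illustrative, not part of the slides:

def countingBubbleSort(lyst):
    """bubbleSort with the early exit, returning the number of passes."""
    n = len(lyst)
    passes = 0
    while n > 1:
        passes += 1
        isSorted = True
        for i in range(1, n):
            if lyst[i] < lyst[i - 1]:
                lyst[i], lyst[i - 1] = lyst[i - 1], lyst[i]
                isSorted = False
        if isSorted:
            break
        n -= 1
    return passes

print(countingBubbleSort([1, 2, 3, 4, 5]))   # 1 pass: best case, O(n) comparisons
print(countingBubbleSort([5, 4, 3, 2, 1]))   # 4 passes: worst case, O(n^2)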

Example: Exponentiation

Recursive definition:
b^n = 1, when n = 0
b^n = b * b^(n-1), otherwise

def ourPow(base, expo):
    if expo == 0:
        return 1
    else:
        return base * ourPow(base, expo - 1)

What is the best case performance? Worst case? Average case?

Faster Exponentiation

Recursive definition:
b^n = 1, when n = 0
b^n = b * b^(n-1), when n is odd
b^n = (b^(n/2))^2, when n is even

def fastPow(base, expo):
    if expo == 0:
        return 1
    elif expo % 2 == 1:
        return base * fastPow(base, expo - 1)
    else:
        result = fastPow(base, expo // 2)
        return result * result

What is the best case performance? Worst case? Average case?
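
To make the difference concrete, here is an illustrative instrumented version (ours, not the slides') that counts the recursive calls fastPow makes:

def countedFastPow(base, expo, counter):
    """fastPow with an explicit call counter (a one-item list)."""
    counter[0] += 1
    if expo == 0:
        return 1
    elif expo % 2 == 1:
        return base * countedFastPow(base, expo - 1, counter)
    else:
        result = countedFastPow(base, expo // 2, counter)
        return result * result

counter = [0]
countedFastPow(2, 1000, counter)
print(counter[0])   # 16 calls, versus 1001 calls for ourPow(2, 1000)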

The Fibonacci Series

fib(n) = 1, when n = 1 or n = 2
fib(n) = fib(n - 1) + fib(n - 2), otherwise

def fib(n):
    if n == 1 or n == 2:
        return 1
    else:
        return fib(n - 1) + fib(n - 2)

Tracing fib(5) with a Call Tree

[Call tree diagram: fib(5) calls fib(4) and fib(3); fib(4) calls fib(3) and fib(2); each fib(3) calls fib(2) and fib(1). The same subproblems are computed repeatedly.]

Work Done: Function Calls

[The same call tree, with one function call per node.]

The total number of calls grows exponentially with n: somewhere between 1.6^n and 2^n

Memoization

def fib(n):
    if n == 1 or n == 2:
        return 1
    else:
        return fib(n - 1) + fib(n - 2)

Intermediate values returned by the function can be memoized, or saved in a cache, for subsequent access

Then they don't have to be recomputed!

Memoization

def fib(n):
    cache = dict()
    def fastFib(n):
        if n == 1 or n == 2:
            return 1
        elif n in cache:
            return cache[n]
        else:
            value = fastFib(n - 1) + fastFib(n - 2)
            cache[n] = value
            return value
    return fastFib(n)

The cache is a dictionary whose keys are the arguments of fib and whose values are the values of fib at those keys
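
As an aside not on the slides, Python's standard library can do this bookkeeping automatically: functools.lru_cache wraps a function with exactly this kind of argument-to-result cache.

from functools import lru_cache

@lru_cache(maxsize=None)   # cache every distinct argument
def fib(n):
    if n == 1 or n == 2:
        return 1
    else:
        return fib(n - 1) + fib(n - 2)

print(fib(100))   # 354224848179261915075, without exponential recomputation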

Improving on n^2 Sorting

Selection sort uses a linear method within a linear method, so it's an O(n^2) method

Find a way of using a linear method with a method that's better than linear

A Hint from Binary Search

Binary search is better than linear because we divide the problem size by 2 on each step

Find a way of dividing the size of the sorting problem by 2 on each step, even though each step will itself be linear

This should produce an O(n log n) algorithm

Quick Sort

Select a pivot element (say, the element at the midpoint)

Shift all of the smaller values to the left of the pivot, and all of the larger values to the right of the pivot (the linear part)

Sort the values to the left and to the right of the pivot (ideally, done log n times)

Trace of Quick Sort

[Diagram: Step 1: select the pivot (at the midpoint). Step 2: shift the data, so that smaller values lie to the left of the pivot and larger values to the right.]

Trace of Quick Sort

[Diagram: Step 3: sort the values to the left of the pivot. Step 4: sort the values to the right of the pivot.]

Design of Quick Sort: First Cut

quickSort(lyst, left, right)
    if left < right
        pivotPosition = partition(lyst, left, right)
        quickSort(lyst, left, pivotPosition - 1)
        quickSort(lyst, pivotPosition + 1, right)

partition(lyst, left, right)
    pivotValue = lyst[(left + right) // 2]
    shift smaller values to left of pivotValue
    shift larger values to right of pivotValue
    return pivotPosition

This version selects the midpoint element as the pivot. The position of the pivot might change during the shifting of data.

Implementation of Partition

def partition(lyst, left, right):
    # Find the pivot and exchange it with the last item
    middle = (left + right) // 2
    pivot = lyst[middle]
    lyst[middle] = lyst[right]
    lyst[right] = pivot
    # Set boundary point to first position
    boundary = left
    # Move items less than pivot to the left
    for index in range(left, right):
        if lyst[index] < pivot:
            swap(lyst, index, boundary)
            boundary += 1
    # Exchange the pivot item and the boundary item
    swap(lyst, right, boundary)
    return boundary

The number of comparisons required to shift values in each sublist is equal to the size of the sublist.

def quickSort(lyst):

    def recurse(left, right):
        if left < right:
            pivotPosition = partition(lyst, left, right)
            recurse(left, pivotPosition - 1)
            recurse(pivotPosition + 1, right)

    def partition(lyst, left, right):
        # Find the pivot and exchange it with the last item
        middle = (left + right) // 2
        pivot = lyst[middle]
        lyst[middle] = lyst[right]
        lyst[right] = pivot
        # Set boundary point to first position
        boundary = left
        # Move items less than pivot to the left
        for index in range(left, right):
            if lyst[index] < pivot:
                swap(lyst, index, boundary)
                boundary += 1
        # Exchange the pivot item and the boundary item
        swap(lyst, right, boundary)
        return boundary

    recurse(0, len(lyst) - 1)
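
A quick sanity check, assuming the swap helper sketched earlier is in scope:

import random

numbers = list(range(20))
random.shuffle(numbers)
quickSort(numbers)
print(numbers == list(range(20)))   # True: the shuffled data comes back sorted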

Complexity Analysis

The number of comparisons in the top-level call is n

The sum of the comparisons in the two recursive calls is also n

The sum of the comparisons in the four recursive calls beneath these is also n, etc.

Thus, the total number of comparisons equals n * the number of times the list must be subdivided

How Many Times Must the Array Be Subdivided?

It depends on the data and on the choice of the pivot element

Ideally, when the pivot is the median on each call, the list is subdivided log2(n) times

Best-case behavior is O(n log n)
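
For a concrete sense of scale (our numbers, not the slides'): with n = 1,000,000 items, the best case subdivides the list only about 20 times, since log2(1,000,000) ≈ 20, for roughly 20 million comparisons in total, whereas an O(n^2) sort would make on the order of 10^12 comparisons.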

Call Tree For a Best Case

We select the midpoint element as the pivot. The median element happens to be at the midpoint on each call. But the list was already sorted!

Worst Case

What if the value at the midpoint is near the largest value on each call? Or near the smallest value on each call?

Then there will be approximately n subdivisions, and quick sort will degenerate to O(n^2)

Call Tree For a Worst Case

We select the first element as the pivot. The smallest element happens to be the first one on each call. n subdivisions!
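
To watch this degeneration happen, here is an illustrative variant (ours, not the slides') that uses a first-element pivot and counts partition steps; on an already sorted list the count is linear in n:

def countPartitions(lyst):
    """Quicksort with a first-element pivot, returning the number of
    partition steps performed."""
    count = [0]

    def recurse(left, right):
        if left < right:
            count[0] += 1
            pivot = lyst[left]   # worst-case choice for sorted data
            boundary = left
            # Lomuto-style partition around the first element
            for index in range(left + 1, right + 1):
                if lyst[index] < pivot:
                    boundary += 1
                    lyst[index], lyst[boundary] = lyst[boundary], lyst[index]
            lyst[left], lyst[boundary] = lyst[boundary], lyst[left]
            recurse(left, boundary - 1)
            recurse(boundary + 1, right)

    recurse(0, len(lyst) - 1)
    return count[0]

print(countPartitions(list(range(30))))   # 29: one partition per element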

Other Methods of Selecting the Pivot Element

Pick a random element

Pick the median of the first three elements

Pick the median of the first, middle, and last elements

Pick the true median element - not!! Finding the median is itself an O(n) algorithm
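
A sketch of the median-of-three heuristic (first, middle, and last elements) listed above; the helper name is ours, not the slides':

def medianOfThreeIndex(lyst, left, right):
    """Return the index of the median of lyst[left], lyst[middle],
    and lyst[right]."""
    middle = (left + right) // 2
    a, b, c = lyst[left], lyst[middle], lyst[right]
    if (b <= a <= c) or (c <= a <= b):
        return left
    elif (a <= b <= c) or (c <= b <= a):
        return middle
    else:
        return right

partition could then swap the element at this index to the midpoint (or directly to lyst[right]) before shifting the data.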

For Friday

Working with the Array Data Structure

Chapter 4