Data Structures and Algorithms Lecture 17, 18 and 19 (Sorting) Instructor: Quratulain Date: 10, 13 and 17 November, 2009 Faculty of Computer Science, IBA.


Introduction to Sorting To sort a collection of data is to place it in order. ◦ “Sort” here is computing jargon. Normal people use “sort” to mean placing things in categories. Efficient sorting is of great interest. ◦ Sorting is a very common operation. ◦ Sorting code that is written with little thought is often much less efficient than code using a good sorting algorithm. ◦ Some algorithms (like Binary Search) require sorted data. The efficiency of sorting affects the desirability of such algorithms.

Introduction to Sorting We will deal primarily with algorithms that solve the General Sorting Problem. ◦ In this problem, we are given:  A list.  Items are all of the same type.  A comparison function.  Given two list items, determine which should come first.  Using this function is the only way we can make such a determination. ◦ We return:  A sorted list with the same items as the original list.

Introduction to Sorting: Analyzing Sorting Algorithms We will analyze sorting algorithms according to five criteria: ◦ Efficiency  What is the (worst-case) order of the algorithm?  Is the algorithm much faster on average than its worst-case performance? ◦ Requirements on Data  Does the algorithm need random-access data? Does it work well with linked lists?  What operations does the algorithm require the data to have?  Of course, we always need “compare”. What else? ◦ Space Usage  Can the algorithm sort in-place?  In-place means the algorithm does not copy the data to a separate buffer.  How much additional storage is required?  Every algorithm requires at least a little. ◦ Stability  Is the algorithm stable?  A sorting algorithm is stable if it does not reverse the order of equivalent items. ◦ Performance on Nearly Sorted Data  Is the algorithm faster when given sorted (or nearly sorted) data?  All items close to where they should be, or a limited number of items out of order.

Introduction to Sorting: Overview of Algorithms There is no known sorting algorithm that has all the properties we would like one to have. We will examine a number of sorting algorithms. Generally, these fall into two categories: O(n²) and O(n log n). ◦ Quadratic [O(n²)] Algorithms  Bubble Sort  Selection Sort  Insertion Sort  Quicksort  A tree-based sort (later in semester) ◦ Log-Linear [O(n log n)] Algorithms  Merge Sort  Heap Sort (mostly later in semester) ◦ Special-Purpose Algorithm  Radix Sort It may seem odd that an algorithm called “Quicksort” is in the slow category. More about this later.

Sorting Algorithms I: Bubble Sort — Introduction One of the simplest sorting algorithms is also one of the worst: Bubble Sort. ◦ We cover it because it is easy to understand and analyze. ◦ But we never use it. Bubble Sort uses two basic operations: ◦ Compare  Given two consecutive items in a list, which comes first? ◦ Swap  Given two consecutive items in a list, swap them. Note that Bubble Sort only performs these operations on consecutive pairs of items.

Sorting Algorithms I: Bubble Sort — Description Bubble Sort proceeds in a number of “passes”. ◦ In each pass, we go through the list, considering each consecutive pair of items. ◦ We compare. If the pair is out of order, we swap. ◦ The larger items rise to the top like bubbles.  I assume we are sorting in ascending order. ◦ After the first pass, the last item in the list is the largest. ◦ Thus, later passes need not go through the entire list. We can improve Bubble Sort’s performance on some kinds of nearly sorted data: ◦ During each pass, keep track of whether we have done any swaps during that pass. ◦ If not, then the data was sorted when the pass began. Quit.

Sorting Algorithms I: Bubble Sort — Algorithm
1. Set i to 0.
2. Set j to 0.
3. If a[j] > a[j + 1], exchange their values.
4. Set j to j + 1. If j < n - 1 - i, go to step 3.
5. Set i to i + 1. If i < n - 1, go to step 2.
6. a is now sorted in ascending order.
Note: n is the number of elements in the array.

Code

int hold, j, pass;
int switched = 1;                     /* true */
for (pass = 0; pass < n - 1 && switched; pass++) {
    switched = 0;                     /* false: no swaps yet this pass */
    for (j = 0; j < n - pass - 1; j++)
        if (x[j] > x[j+1]) {
            switched = 1;
            hold = x[j];
            x[j] = x[j+1];
            x[j+1] = hold;
        }
}

Sorting Algorithms I: Bubble Sort — Analysis Efficiency ◦ Bubble Sort is O(n²) in the worst case. ◦ Its average-case time is also O(n²). Requirements on Data ◦ Bubble Sort works on a linked list. ◦ Operations needed: compare and swap of consecutive items. Space Usage ◦ Bubble Sort can be done in-place. ◦ It does not require significant additional storage. Stability ◦ Bubble Sort is stable. Performance on Nearly Sorted Data ◦ With the early-exit check, Bubble Sort is O(n) if no item is far out of place. ◦ It is O(n²) if even one item is far from its final position.

Sorting Algorithms I: Bubble Sort — Comments Bubble Sort is very slow. ◦ It is never used, except:  In C.S. classes, as a simple example.  By people who do not understand sorting algorithms. Bubble Sort does not require random-access data; our implementation, however, does. ◦ Can we fix this?  Yes, we can rewrite our Bubble Sort to use forward iterators.

Insertion Sort In each pass of an insertion sort, one or more pieces of data are inserted into their correct location in an ordered list (just as a card player picks up cards and places them in his hand in order).

Insertion Sort In the insertion sort, the list is divided into 2 parts: ◦ Sorted ◦ Unsorted In each pass, the first element of the unsorted sublist is transferred to the sorted sublist by inserting it at the appropriate place. If we have a list of n elements, it will take, at most, n-1 passes to sort the data.

Insertion Sort We can visualize this sort as a list divided in two by a “conceptual” wall: the first part of the list is the sorted portion, and everything beyond the wall is the unsorted portion.


Insertion Sort Pseudocode See sorting handouts
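The pseudocode itself is in the sorting handouts; as a stand-in sketch, an array-based insertion sort in C might look like this (the function name and element type are illustrative):

```c
#include <stddef.h>

/* Sort arr[0..n-1] in ascending order by repeatedly inserting the
   first unsorted element into the sorted portion on its left. */
void insertion_sort(int arr[], size_t n)
{
    for (size_t i = 1; i < n; i++) {
        int key = arr[i];          /* first element of the unsorted sublist */
        size_t j = i;
        /* Shift larger sorted elements one slot right to make room. */
        while (j > 0 && arr[j - 1] > key) {
            arr[j] = arr[j - 1];
            j--;
        }
        arr[j] = key;              /* insert at the appropriate place */
    }
}
```

Each pass inserts one element and moves the “conceptual” wall one position to the right, so at most n - 1 passes are needed, matching the slide.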

Selection Sort Imagine some data that you can examine all at once. To sort it, you could select the smallest element and put it in its place, select the next smallest and put it in its place, etc. For a card player, this process is analogous to looking at an entire hand of cards and ordering them by selecting cards one at a time and placing them in their proper order.

Selection Sort The selection sort follows this idea. Given a list of data to be sorted, we simply select the smallest item and place it in a sorted list. We then repeat these steps until the list is sorted.

Selection Sort In the selection sort, the list at any moment is divided into 2 sublists, sorted and unsorted, separated by a “conceptual” wall. We select the smallest element from the unsorted sublist and exchange it with the element at the beginning of the unsorted data. After each selection and exchange, the wall between the 2 sublists moves – increasing the number of sorted elements and decreasing the number of unsorted elements.

Selection Sort You can select either the smallest or the largest element and place it in its position. Here the smallest is used.

Selection Sort Pseudocode See Sorting handout
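Again, the pseudocode is in the handout; a C sketch of the algorithm just described (illustrative names), exchanging the smallest unsorted element with the first unsorted position:

```c
#include <stddef.h>

/* Sort arr[0..n-1] ascending: on each pass, select the smallest
   element of the unsorted sublist and exchange it with the element
   at the beginning of the unsorted data. */
void selection_sort(int arr[], size_t n)
{
    for (size_t i = 0; i + 1 < n; i++) {
        size_t min = i;                    /* index of smallest so far */
        for (size_t j = i + 1; j < n; j++)
            if (arr[j] < arr[min])
                min = j;
        if (min != i) {                    /* the wall moves right by one */
            int tmp = arr[i];
            arr[i] = arr[min];
            arr[min] = tmp;
        }
    }
}
```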

Quick Sort In the bubble sort, consecutive items are compared, and possibly exchanged, on each pass through the list. This means that many exchanges may be needed to move an element to its correct position. Quick sort is more efficient than bubble sort because a typical exchange involves elements that are far apart, so fewer exchanges are required to correctly position an element.

Quick Sort Each iteration of the quick sort selects an element, known as the pivot, and divides the list into 3 groups: ◦ Elements whose keys are less than (or equal to) the pivot’s key. ◦ The pivot element ◦ Elements whose keys are greater than the pivot’s key.

Quick Sort The sorting then continues by quick sorting the left partition followed by quick sorting the right partition. The basic algorithm is as follows:

Quick Sort 1) Partitioning Step: Take an element in the unsorted array and determine its final location in the sorted array. This occurs when all values to the left of the element in the array are less than (or equal to) the element, and all values to the right of the element are greater than (or equal to) the element. We now have 1 element in its proper location and two unsorted subarrays. 2) Recursive Step: Perform step 1 on each unsorted subarray.

Quick Sort Each time step 1 is performed on a subarray, another element is placed in its final location of the sorted array, and two unsorted subarrays are created. When a subarray consists of one element, that subarray is sorted. Therefore, that element is in its final location.

Quick Sort There are several partitioning strategies used in practice (i.e., several “versions” of quick sort), but the one we are about to describe is known to work well. For simplicity we will select the last element as the pivot element. We could also choose a different pivot element and swap it with the last element in the array.

Quick Sort Consider an array we would like to sort. (The example array appeared as a figure in the original slides.)

Quick Sort The index left starts at the first element and right starts at the next-to-last element. We want to move all of the elements smaller than the pivot to the left part of the array and all of the elements larger than the pivot to the right part.

Quick Sort We move left to the right, skipping over elements that are smaller than the pivot.

Quick Sort We then move right to the left, skipping over elements that are greater than the pivot. When left and right have stopped, left is on an element greater than (or equal to) the pivot and right is on an element smaller than (or equal to) the pivot.

Quick Sort If left is to the left of right (or if left = right), those elements are swapped.

Quick Sort Effectively, large elements are pushed to the right and small elements are pushed to the left. We then repeat the process until left and right cross.



Quick Sort At this point, left and right have crossed, so no swap is performed. The final part of the partitioning is to swap the pivot element with the element at left.

Quick Sort Note that all elements to the left of the pivot are less than (or equal to) the pivot and all elements to the right of the pivot are greater than (or equal to) the pivot. Hence, the pivot element has been placed in its final sorted position.

Quick Sort We now repeat the process using the sub-arrays to the left and right of the pivot.

Quick Sort Pseudocode

quicksort(list, leftMostIndex, rightMostIndex) {
    if (leftMostIndex >= rightMostIndex)
        return
    end if
    pivot = list[rightMostIndex]
    left = leftMostIndex
    right = rightMostIndex - 1
    loop (left <= right)    // make sure right and left don't cross
        // Find key on left that belongs on right
        loop (list[left] < pivot)
            left = left + 1
        end loop

Quick Sort Pseudocode (cont)

        // Find key on right that belongs on left
        // (the right >= leftMostIndex test is necessary if the pivot
        // is the smallest element in the array)
        loop (right >= leftMostIndex && list[right] > pivot)
            right = right - 1
        end loop
        // Swap out-of-order elements; swapping even when left = right
        // accounts for the special case list[left] = list[right] = pivot
        if (left <= right)
            swap(list, left, right)
            left = left + 1
            right = right - 1
        end if
    end loop

Quick Sort Pseudocode (cont)

    // Move the pivot element to its correct location
    swap(list, left, rightMostIndex)
    // Continue splitting the list and sorting
    quickSort(list, leftMostIndex, right)
    quickSort(list, left+1, rightMostIndex)
}
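The pseudocode slides above translate almost line-for-line into C. The sketch below (illustrative names) keeps the last-element pivot and the left/right scans as described:

```c
#include <stddef.h>

static void swap_items(int list[], int a, int b)
{
    int tmp = list[a];
    list[a] = list[b];
    list[b] = tmp;
}

/* Quick sort of list[lo..hi], using the last element as the pivot. */
void quick_sort(int list[], int lo, int hi)
{
    if (lo >= hi)
        return;
    int pivot = list[hi];
    int left = lo;
    int right = hi - 1;
    while (left <= right) {
        while (list[left] < pivot)           /* key that belongs on right */
            left++;
        while (right >= lo && list[right] > pivot)  /* key for the left */
            right--;
        if (left <= right) {                 /* swap out-of-order elements */
            swap_items(list, left, right);
            left++;
            right--;
        }
    }
    swap_items(list, left, hi);              /* pivot to its final place */
    quick_sort(list, lo, right);             /* sort left partition */
    quick_sort(list, left + 1, hi);          /* sort right partition */
}
```

A call such as quick_sort(a, 0, n - 1) sorts the whole array; note the pivot-equal case is handled by swapping even when left = right, as the pseudocode comments point out.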

Quick Sort A couple of notes about quick sort: ◦ There are better ways to choose the pivot value (such as the median-of-three method). ◦ Also, when the subarrays get small, it becomes more efficient to switch to insertion sort rather than continuing with quick sort.

Efficiency of Quick Sort Assume the list of n items splits in half on every pass, i.e. n = 2^m (so m = log₂ n). The first pass makes (n - 1) comparisons; the list then splits into two halves of about n/2 items each, and the splitting continues: n + 2·(n/2) + 4·(n/4) + 8·(n/8) + … + n·(n/n) = n + n + n + … + n (the list splits m times). The total number of comparisons in the average case is therefore O(n·m) where m = log n; thus, O(n log n). In the worst case it is O(n²).

Bubble Sort vs. Quick Sort If we calculate the Big-O notation, we find that (in the average case): ◦ Bubble Sort: O(n²) ◦ Quick Sort: O(n log₂ n)

Heap Sort Idea: take the items that need to be sorted and insert them into a heap. By calling deleteHeap, we remove the smallest (or largest) element, depending on whether we are working with a min-heap or a max-heap, respectively. Hence, the elements are removed in ascending or descending order. Efficiency: O(n log₂ n)
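The slide describes heap sort in terms of heap insertions and deleteHeap calls; a common in-place variant builds a max-heap inside the array itself and repeatedly swaps the maximum to the end. A sketch in C (names illustrative):

```c
#include <stddef.h>

/* Restore the max-heap property for the subtree rooted at i,
   where only the root may violate it ("sift down"). */
static void sift_down(int a[], size_t n, size_t i)
{
    for (;;) {
        size_t largest = i;
        size_t l = 2 * i + 1, r = 2 * i + 2;
        if (l < n && a[l] > a[largest]) largest = l;
        if (r < n && a[r] > a[largest]) largest = r;
        if (largest == i) break;
        int tmp = a[i]; a[i] = a[largest]; a[largest] = tmp;
        i = largest;
    }
}

/* Heap sort: build a max-heap, then repeatedly move the largest
   element to the end of the array and shrink the heap. */
void heap_sort(int a[], size_t n)
{
    if (n < 2) return;
    for (size_t i = n / 2; i-- > 0; )       /* build the heap */
        sift_down(a, n, i);
    for (size_t end = n - 1; end > 0; end--) {
        int tmp = a[0]; a[0] = a[end]; a[end] = tmp;  /* "deleteHeap" */
        sift_down(a, end, 0);
    }
}
```

Using a max-heap and placing each deleted maximum at the end yields an ascending result without extra storage.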

Efficiency Summary

Sort        Worst Case      Average Case
Insertion   O(n²)           O(n²)
Selection   O(n²)           O(n²)
Bubble      O(n²)           O(n²)
Quick       O(n²)           O(n log₂ n)
Heap        O(n log₂ n)     O(n log₂ n)
Merge       O(n log₂ n)     O(n log₂ n)

Searching
◦ Sequential search
◦ Indexed sequential search
◦ Binary search
◦ Binary tree
◦ Hashing
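These searching topics are covered in later lectures. As a preview, and to show why sorted data matters (Binary Search was mentioned earlier as an algorithm that requires it), here is a sketch of binary search in C (names illustrative):

```c
#include <stddef.h>

/* Binary search in a sorted array: returns the index of target,
   or -1 if it is not present. Makes O(log n) comparisons. */
long binary_search(const int a[], size_t n, int target)
{
    size_t lo = 0, hi = n;              /* search range is a[lo..hi-1] */
    while (lo < hi) {
        size_t mid = lo + (hi - lo) / 2;
        if (a[mid] == target)
            return (long)mid;
        else if (a[mid] < target)
            lo = mid + 1;               /* discard the left half */
        else
            hi = mid;                   /* discard the right half */
    }
    return -1;
}
```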