i206: Lecture 8: Sorting and Analysis of Algorithms

Presentation transcript:

i206: Lecture 8: Sorting and Analysis of Algorithms Marti Hearst Spring 2012

Definition of Big-Oh A running time is O(g(n)) if there exist constants n0 > 0 and c > 0 such that for all problem sizes n > n0, the running time for a problem of size n is at most c*g(n). In other words, c*g(n) is an upper bound on the running time for sufficiently large n. Example: the function f(n) = 8n - 2 is O(n). The running time of algorithm arrayMax is O(n). http://www.cs.dartmouth.edu/~farid/teaching/cs15/cs5/lectures/0519/0519.html

More formally: Let f(n) and g(n) be functions mapping nonnegative integers to real numbers. f(n) is O(g(n)) if there exist positive constants n0 and c such that for all n >= n0, f(n) <= c*g(n). Other ways to say this: f(n) is order g(n); f(n) is big-Oh of g(n); f(n) is Oh of g(n); f(n) ∈ O(g(n)) (set notation).
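A worked instance of the definition above (not on the original slide, but using the slide's own example f(n) = 8n - 2): pick the witnesses c = 8 and n0 = 1.

\[
  f(n) = 8n - 2 \le 8n = c \cdot n \quad \text{for all } n \ge n_0 = 1,
  \qquad \text{so } f(n) \in O(n) \text{ with } c = 8,\ n_0 = 1.
\]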

Analysis Example: Phonebook Given: A physical phone book Organized in alphabetical order A name you want to look up An algorithm in which you search through the book sequentially, from first page to last What is the order of: The best case running time? The worst case running time? The average case running time? What is: A better algorithm? The worst case running time for this algorithm?
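A minimal Python sketch of the sequential lookup described above; the function name and the list-of-tuples phonebook representation are illustrative assumptions, not from the slides.

def sequential_lookup(phonebook, name):
    """Scan entries front to back: O(1) best case (first entry),
    O(n) worst case (last entry or absent), O(n) on average."""
    for entry_name, number in phonebook:
        if entry_name == name:
            return number
    return None

# Example usage (hypothetical data)
book = [("Ada", "555-0100"), ("Grace", "555-0101"), ("Marti", "555-0102")]
print(sequential_lookup(book, "Grace"))  # 555-0101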

Analysis Example (Phonebook) This better algorithm is called Binary Search. What is its running time? First you look in the middle of n elements. Then you look in the middle of n/2 = (1/2)*n elements. Then you look in the middle of (1/2)*(1/2)*n elements. ... Continue until there is only 1 element left. Say you did this m times: (1/2)*(1/2)*...*(1/2)*n = (1/2)^m * n. Then the number of repetitions is the smallest integer m such that (1/2)^m * n <= 1.

Analyzing Binary Search In the worst case, the number of repetitions is the smallest integer m such that (1/2)^m * n <= 1. We can rewrite this as follows: multiply both sides by 2^m to get n <= 2^m, then take the log of both sides to get log2(n) <= m. Since m is the worst-case number of repetitions, the algorithm is O(log n).
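A minimal binary search sketch in Python over a sorted list, illustrating the halving argument above (an illustration under assumed names, not code from the lecture).

def binary_search(sorted_names, target):
    """Halve the search range each step: at most O(log n) comparisons."""
    lo, hi = 0, len(sorted_names) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_names[mid] == target:
            return mid
        elif sorted_names[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # not found

names = ["Ada", "Grace", "Linus", "Marti", "Tim"]
print(binary_search(names, "Marti"))  # 3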

Sorting www.naturalbrands.com

Sorting Putting collections of things in order Numerical order Alphabetical order An order defined by the domain of use E.g. color, yumminess … http://www.hugkiss.com/platinum/candy.shtml

Bubble Sort O(n^2) The dark gray represents a comparison and the white represents a swap. http://www.youtube.com/watch?v=P00xJgWzz2c
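A minimal bubble sort sketch in Python (ascending order); a plain illustration of the comparisons and swaps described above, not the lecture's own code.

def bubble_sort(a):
    """Repeatedly compare adjacent elements and swap when out of order.
    O(n^2) comparisons in the worst and average case."""
    n = len(a)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:                      # comparison
                a[j], a[j + 1] = a[j + 1], a[j]      # swap
    return a

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]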

Insertion Sort O(n^2) Brookshear Figure 5.11, Brookshear Figure 5.19 http://xoax.net/comp/sci/algorithms/Lesson2.php
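A minimal insertion sort sketch in Python, again as an illustration (ascending order, as assumed throughout this lecture).

def insertion_sort(a):
    """Grow a sorted prefix; insert each new element into its place.
    O(n^2) in the worst case, O(n) if the input is already sorted."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]          # shift larger elements right
            j -= 1
        a[j + 1] = key
    return a

print(insertion_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]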

Merge Sort O(n log n) https://en.wikipedia.org/wiki/Mergesort

Merge Sort O(n log n) http://xoax.net/comp/sci/algorithms/Lesson3.php

Merge Sort O(n log n) http://xoax.net/comp/sci/algorithms/Lesson2.php

Merge Sort O(n log n) http://xoax.net/comp/sci/algorithms/Lesson2.php
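A minimal merge sort sketch in Python, mirroring the divide-and-conquer pictures linked above (an illustration, not the lecture's own code).

def merge_sort(a):
    """Split in half, sort each half recursively, then merge: O(n log n)."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

def merge(left, right):
    """Merge two sorted lists in O(n) time."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]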

Merge Sort O(n log n) Merge sort has average- and worst-case performance of O(n log n). Based on how the algorithm works, the running time is described by the recurrence T(n) = 2T(n/2) + n, which solves to O(n log n). (Apply the algorithm to two lists of half the size of the original list, and add the n steps taken to merge the resulting two lists.) In the worst case, the number of comparisons merge sort makes is equal to or slightly smaller than n*ceil(log2 n) - 2^ceil(log2 n) + 1, which is between (n log n - n + 1) and (n log n + n + O(log n)).
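A sketch of solving the recurrence above by repeated expansion, under the simplifying assumptions (not stated on the slide) that n is a power of 2, n = 2^k, and that T(1) = 0.

\begin{align*}
  T(n) &= 2\,T(n/2) + n \\
       &= 4\,T(n/4) + 2n \\
       &= 8\,T(n/8) + 3n \\
       &\;\;\vdots \\
       &= 2^{k}\,T(n/2^{k}) + k\,n = 2^{k}\,T(1) + k\,n = n \log_2 n .
\end{align*}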

Substrings of a String Say you have a long string, such as "supercalifragilisticexpialidocious". Say you want to print out all the substrings of this string, starting from the left, and always retaining the substring you have printed out so far, e.g. "s", "su", "sup", "supe", and so on. Assuming this string is of length n, how many substrings will you print out in terms of big-O notation? If you printed out all possible substrings, starting at any location, and of any length, how many are there?
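A small Python sketch for thinking about the exercise above (not from the slides): the left-anchored prefixes number exactly n, i.e. O(n), while the substrings of any start and any nonempty length number n(n+1)/2, i.e. O(n^2).

s = "supercalifragilisticexpialidocious"
n = len(s)

# Prefixes "s", "su", "sup", ...: exactly n of them -> O(n).
prefixes = [s[:i] for i in range(1, n + 1)]
print(len(prefixes))        # 34

# All contiguous substrings (any start, any nonempty length): n*(n+1)/2 -> O(n^2).
all_substrings = [s[i:j] for i in range(n) for j in range(i + 1, n + 1)]
print(len(all_substrings))  # 34 * 35 // 2 = 595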

Traveling Salesman Problem http://xkcd.com/399/

Summary: Analysis of Algorithms A method for determining, in an abstract way, the asymptotic running time of an algorithm Here asymptotic means as n gets very large Useful for comparing algorithms Useful also for determining tractability Meaning, a way to determine whether the problem is intractable (infeasible in practice) or not Exponential time algorithms are usually intractable. We'll revisit these ideas throughout the rest of the course.