Jessie Zhao Course page: 1

• Final Exam
  ◦ Time: Dec 20th, 2pm
  ◦ Location: TM TMEAST
• Assignment 6 is released!

• Measures of efficiency:
  ◦ Running time
  ◦ Space used (not included in this course)
• Efficiency as a function of input size:
  ◦ Number of data elements (numbers, points)
  ◦ Number of bits in an input number
• Example: Find the factors of a number n
• Example: Determine if an integer n is prime
• Example: Find the max in A[1..n]

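The distinction between the two measures of input size above can be made concrete with a small example. The following trial-division primality test is my own sketch (not from the slides): its loop runs about √n times, which looks modest as a function of the value n but equals 2^(b/2) as a function of the number of bits b of the input, i.e., exponential in the input size.

    import math

    def is_prime(n: int) -> bool:
        """Trial division: about sqrt(n) iterations."""
        if n < 2:
            return False
        for d in range(2, math.isqrt(n) + 1):
            if n % d == 0:
                return False
        return True

    print(is_prime(97))   # True
    print(is_prime(91))   # False (7 * 13)
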
• What operations are counted?
  ◦ Arithmetic (add, subtract, multiply, etc.)
  ◦ Data movement (assign)
  ◦ Control (branch, subroutine call, return)
  ◦ Comparison

• COUNT the number of cycles (running time) as a function of the input size
  ◦ Running time (upper bound): c1 + c5 − c3 − c4 + (c2 + c3 + c4)n
  ◦ Running time (lower bound): c1 + c5 − c3 − c4 + (c2 + c3)n

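The code being counted here is on a slide image that is not part of this transcript. The constants are consistent with a FindMax loop of the following shape, so this Python sketch is an assumption about that code (for a non-empty array, as in A[1..n]), with the per-line costs of the slide's counting model marked in comments.

    def find_max(A):                     # cost  times executed (slide's counting model)
        current_max = A[0]               # c1    1
        for j in range(1, len(A)):       # c2    charged n times (n - 1 iterations plus the final test)
            if A[j] > current_max:       # c3    n - 1
                current_max = A[j]       # c4    between 0 and n - 1
        return current_max               # c5    1

In the worst case (the branch taken every time) the total is c1 + c5 + c2·n + (c3 + c4)(n − 1), which rearranges to exactly the upper bound above.
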
• Best case: A[1] is the largest element.
• Worst case: the elements are sorted in increasing order.
• Average case: ? Depends on the input characteristics.
• What do we use? Worst case or average case is usually used:
  ◦ Worst case is an upper bound; in certain application domains (e.g., air traffic control, surgery) knowing the worst-case time complexity is of crucial importance.
  ◦ Finding the average case can be very difficult; it needs knowledge of the input distribution.
  ◦ Best case is not very useful.

• Running time upper bound: c1 + c5 − c3 − c4 + (c2 + c3 + c4)n
• What are the values of the ci?
  ◦ Machine-dependent
  ◦ Constant with respect to n
• A simpler expression: a + bn
• The running time is Θ(n):
  ◦ a + bn is O(n): ∃ constants C and k such that ∀ n > k, a + bn ≤ Cn
  ◦ a + bn is Ω(n): ∃ constants C and k such that ∀ n > k, a + bn ≥ Cn

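For concreteness, one valid choice of witnesses for these two definitions (my own example, assuming the constants a and b are positive):

    a + bn \le (a + b)\,n \quad \text{for all } n > 1, \qquad \text{so } C = a + b,\ k = 1 \text{ works for } O(n);
    a + bn \ge b\,n \quad \text{for all } n > 1, \qquad \text{so } C = b,\ k = 1 \text{ works for } \Omega(n).
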
• Bounds on running time:
  1. O() is used for upper bounds ("grows no faster than")
  2. Ω() is used for lower bounds ("grows at least as fast as")
  3. Θ() is used for denoting matching upper and lower bounds ("grows as fast as")
• The rules for getting the running time are:
  1. Throw away all terms other than the most significant one.
  2. Throw away the constant factors.
  3. The expression is Θ() of whatever's left.

• Very basic operations
• Used very, very often in real applications
• LOTS of new ideas

• Given an array A[1..n], does there exist a number (key) x?
• Unsorted array: linear search
  ◦ Input: A[1..n]: an array of distinct integers; x: an integer.
  ◦ Output: The location of x in A[1..n], or 0 if x is not found.

  LinearSearch(A, x)
      j = 1
      Loop
          Invariant: x is not in the scanned subarray A[1..j-1].
          Exit when j > n or x = A[j]
          j = j + 1
      End loop
      if j <= n then return j
      else return 0

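A direct Python rendering of this pseudocode (my own sketch; it returns a 1-based position, with 0 standing for "not found", to match the slides' convention):

    def linear_search(A, x):
        """Return the 1-based position of x in A, or 0 if x is not present."""
        j = 1
        # Invariant: x is not in the already-scanned prefix A[1..j-1].
        while j <= len(A) and A[j - 1] != x:
            j += 1
        return j if j <= len(A) else 0

    print(linear_search([7, 3, 9, 1], 9))  # 3
    print(linear_search([7, 3, 9, 1], 4))  # 0
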
Prove its correctness.

• A generic loop has the shape:
    Input
    Loop
        Loop Invariant I
        Exit when E
        ...
    End Loop
    Output

• Step 1. Basis case: Input → I
• Step 2. Inductive step: Assume I(i) is true before the ith iteration; prove it still holds after that iteration, i.e., before the (i+1)th: I(i) ∧ ¬E → I(i+1)
• Step 3. Show the loop terminates and returns the correct result: I ∧ E → Output

• Proof using the loop invariant:
  ◦ Basis case: j = 1. No element has been scanned, so x is not in the scanned subarray.
  ◦ Inductive step: Assume x is not in the scanned subarray A[1..j-1]. Prove x is not in A[1..j] if the loop does not terminate, and prove the output is correct if the loop terminates.
    ▪ If the loop does not terminate: j <= n and x ≠ A[j]. Then x is not in A[1..j].
    ▪ If the loop terminates: j > n or x = A[j].
      If j > n, then j = n+1. By the loop invariant, x is not in A[1..n]. The output 0 is correct.
      If x = A[j], then j is returned. The output j is correct.

• Running time?
  ◦ Outside the loop: constant work, O(1)
  ◦ The loop: O(n) iterations, each doing constant work
  ◦ Overall: O(n)

• Sorted array: can we do better?
• Binary search: use the sorted property to eliminate large parts of the array
  ◦ Input: L[1..n]: a sorted array, L(i) < L(j) if 1 ≤ i < j ≤ n; x: an integer.
  ◦ Output: The location of x in L[1..n], or 0 if x is not found.

  BinarySearch(L, x)
      i = 1, j = n
      Loop
          Invariant: if x is in L[1..n], then x is in L[i..j].
          Exit when j <= i
          mid = ⌊(i+j)/2⌋
          If x ≤ L(mid) then
              j = mid
          Else
              i = mid + 1
          End if
      End loop
      if x = L(i) then return i
      else return 0

• Running time? O(log n)

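The same variant in Python (my own sketch; it narrows the candidate range to a single position and checks it at the end, returning 1-based positions and 0 for "not found" as in the slides):

    def binary_search(L, x):
        """L is sorted in ascending order. Return the 1-based position of x, or 0."""
        i, j = 1, len(L)
        # Invariant: if x is in L at all, it lies in L[i..j] (1-based).
        while i < j:
            mid = (i + j) // 2
            if x <= L[mid - 1]:   # L(mid) in the slides' 1-based notation
                j = mid
            else:
                i = mid + 1
        return i if L and L[i - 1] == x else 0

    print(binary_search([1, 3, 5, 7, 9], 7))  # 4
    print(binary_search([1, 3, 5, 7, 9], 6))  # 0
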
• By preprocessing (sorting) the data into a data structure (a sorted array), we were able to speed up search queries. This is a very common idea in computer science.
• Many other data structures are commonly used: linked lists, trees, hash tables, ...
• CSE 2011: Data Structures
• CSE 4101: Advanced Data Structures

◦ Input: A[1..n]: an array of distinct numbers
◦ Output: A[1..n]: a sorted array, A(i) < A(j) if 1 ≤ i < j ≤ n

• A simple algorithm using FindMax (a Python sketch follows below):
    1. j = n
    2. while (j > 1) {
    3.     maxindex = index of the max of A[1..j]
    4.     swap(A[maxindex], A[j])
    5.     j = j - 1
    6. }
• Proof and running time? O(n²)
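
A Python rendering of this FindMax-based sort (my own, using 0-based indexing; each pass finds the maximum of the unsorted prefix and swaps it into its final position):

    def selection_sort(A):
        """Sort A in place: repeatedly move the max of A[0..j] to position j."""
        j = len(A) - 1
        while j > 0:
            maxindex = max(range(j + 1), key=lambda k: A[k])  # index of the max in A[0..j]
            A[maxindex], A[j] = A[j], A[maxindex]
            j -= 1
        return A

    print(selection_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]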

• We maintain a subset of elements sorted within a list (this is insertion sort):
  ◦ Initially, think of the first element in the array as a sorted list of length one.
  ◦ One at a time, we take an element from the original list and insert it into the sorted list where it belongs. This gives a sorted list that is one element longer than it was before.
  ◦ When the last element has been inserted, the array is completely sorted.
• Each step: insert A[j] into the sorted list A[1..j-1]. (A Python sketch follows below.)

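The slides' own code for this is on an image not captured in the transcript, so the following Python sketch is my own; it is written to match the inner while loop referred to in the correctness proof below.

    def insertion_sort(A):
        """Sort A in place by growing a sorted prefix one element at a time."""
        for j in range(1, len(A)):          # A[0..j-1] is already sorted
            key = A[j]
            k = j
            # Shift larger elements one position to the right without changing their order.
            while k > 0 and A[k - 1] > key:
                A[k] = A[k - 1]
                k -= 1
            A[k] = key                      # Insert the former A[j] at position k
        return A

    print(insertion_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
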
• Loop invariant: at the start of each iteration of the for loop, A[1..j-1] consists of the elements originally in A[1..j-1], but in sorted order.
• Proof:
  ◦ Basis step: j = 2; the invariant holds because A[1..1], a single element, is trivially sorted.

◦ Inductive step: Assume the elements in A[1..j-1] are sorted.
  ▪ The inner while loop moves the elements A[j-1], A[j-2], ..., A[k] one position to the right without changing their order.
  ▪ Then the former A[j] is inserted into the kth position, so that A[k-1] ≤ A[k] ≤ A[k+1].
  ▪ Hence A[1..j] is sorted.

◦ Termination: the loop terminates when j = n+1.
  ▪ By the loop invariant, "A[1..n] consists of the elements originally in A[1..n], but in sorted order."
  ▪ Therefore the output A[1..n] is correctly sorted.

• Let's compute the running time as a function of the input size.
• What is the running time? O(n²)

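A quick worked count of why this is quadratic (my own note, using the standard worst case of a reverse-sorted input): the inner while loop runs j − 1 times for each j, so the total number of comparisons is

    \sum_{j=2}^{n} (j-1) \;=\; \frac{n(n-1)}{2} \;=\; \Theta(n^2).
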
• Bubble sort: O(n²)
• Merge sort, quicksort, heapsort: O(n log n)
• Linear-time sorts (require a certain type of input): counting sort, radix sort, bucket sort.

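The slides only name the linear-time sorts, so as one illustration here is a sketch of standard counting sort (my own code; it assumes the keys are small non-negative integers, which is the "certain type of input" it requires):

    def counting_sort(A, max_key):
        """Sort a list of integers in the range 0..max_key in O(n + max_key) time."""
        counts = [0] * (max_key + 1)
        for x in A:                       # count the occurrences of each key
            counts[x] += 1
        result = []
        for key, c in enumerate(counts):  # emit each key as many times as it appeared
            result.extend([key] * c)
        return result

    print(counting_sort([3, 1, 4, 1, 5, 0], 5))  # [0, 1, 1, 3, 4, 5]
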
• Optimization problems:
  ◦ Find a solution to the given problem that either minimizes or maximizes the value of some parameter.
• Greedy algorithm: select the best choice at each step.
• Is the solution always optimal?

• Example: we want to make change for ANY amount using the fewest number of coins.
• Simple "greedy" algorithm: keep using the largest denomination possible.
  ◦ Works for our coins: 1, 5, 10, 25, 100.
  ◦ Does it always work?
  ◦ Fails for the following coins: 1, 5, 7, 10
  ◦ e.g., greedy gives 14 = 10 + 1 + 1 + 1 + 1 (5 coins), but 14 = 7 + 7 (2 coins) is optimal.

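A short Python sketch of the greedy change-maker (my own code), reproducing both the success on {1, 5, 10, 25, 100} and the failure on {1, 5, 7, 10}:

    def greedy_change(amount, denominations):
        """Repeatedly take the largest coin that still fits."""
        coins = []
        for d in sorted(denominations, reverse=True):
            while amount >= d:
                coins.append(d)
                amount -= d
        return coins

    print(greedy_change(14, [1, 5, 10, 25, 100]))  # [10, 1, 1, 1, 1] -- optimal for these coins
    print(greedy_change(14, [1, 5, 7, 10]))        # [10, 1, 1, 1, 1] -- but 7 + 7 uses fewer coins
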
• Prove the greedy algorithm works for {1, 5, 10, 25, 100}.
  ◦ Lemma 1. A solution using the fewest coins possible has at most three 25s (quarters), two 10s (dimes), one 5 (nickel), and four 1s (cents), and cannot have two 10s and one 5 together.
• Proof by contradiction: if we had more than any of the numbers above, we could replace those coins with fewer coins (e.g., five 1s by one 5, two 5s by one 10, or two 10s and one 5 by one 25).

• Prove the greedy algorithm works for {1, 5, 10, 25, 100}.
• Proof by contradiction:
  ◦ Assume there is an amount n for which some way of making change uses fewer coins than the greedy algorithm.
  ◦ Suppose the two solutions use different numbers of 100s (dollars): say x dollars in the greedy solution and y dollars in the optimal solution.
  ◦ By the greedy choice, x >= y.
  ◦ If x > y, then the optimal solution must make up at least 100 using only {1, 5, 10, 25}. This is impossible by Lemma 1, since at most 3·25 + 2·10 + 4·1 = 99 can be formed under its constraints.
  ◦ Similarly, we can prove that the greedy solution and the optimal solution cannot have different numbers of 25s, 10s, 5s, or 1s, so they use the same number of coins, a contradiction.
  ◦ Q.E.D.

• Understand existing classic algorithms
• Design simple algorithms
• Prove the correctness of an algorithm
  ◦ Loop invariant
• Analyze the time complexity of an algorithm