Asymptotic Behavior Algorithm : Design & Analysis [2]



In the last class…
- Goal of the Course
- Algorithm: the Concept
- Algorithm Analysis: the Contents
- Average and Worst-Case Analysis
- Lower Bounds and the Complexity of Problems

Asymptotic Behavior
- Asymptotic growth rate
- The sets O, Ω and Θ
- Complexity class
- An example: searching an ordered array
  - Improved sequential search
  - Binary search
  - Binary search is optimal

How to Compare Two Algorithms?
- Simplifying the analysis
  - Assume the total number of steps is roughly proportional to the number of basic operations counted (up to a constant coefficient).
  - Only the leading term in the formula is considered.
  - The constant coefficient is ignored.
- Asymptotic growth rate: what matters is behavior for large n, not small n.

Relative Growth Rate (with respect to a function g)
- Ω(g): functions that grow at least as fast as g
- Θ(g): functions that grow at the same rate as g
- O(g): functions that grow no faster than g

The Set “Big Oh”
Definition: Given g: N→R⁺, O(g) is the set of functions f: N→R⁺ such that for some c ∈ R⁺ and some n₀ ∈ N, f(n) ≤ c·g(n) for all n ≥ n₀.
Limit characterization: f ∈ O(g) if lim_{n→∞} f(n)/g(n) = c < ∞.
Note: c may be zero; in that case f ∈ o(g), “little oh”.
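As a numeric sketch of this definition (the function f, the witnesses c = 4 and n₀ = 10, and the class and method names are all illustrative choices, not from the slides), one can check that f(n) = 3n² + 10n ∈ O(n²):

```java
// Numeric check of the Big-Oh definition for f(n) = 3n^2 + 10n, g(n) = n^2:
// with witnesses c = 4 and n0 = 10, f(n) <= c*g(n) holds for all n >= n0.
class BigOhDemo {
    static long f(long n) { return 3 * n * n + 10 * n; }
    static long g(long n) { return n * n; }

    // Returns true if f(n) <= c*g(n) for every n in [n0, limit].
    static boolean witnessHolds(long c, long n0, long limit) {
        for (long n = n0; n <= limit; n++) {
            if (f(n) > c * g(n)) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(witnessHolds(4, 10, 100000)); // true: 10n <= n^2 once n >= 10
        // c = 3 alone never works, since 3n^2 + 10n > 3n^2 for all n >= 1.
        System.out.println(witnessHolds(3, 10, 100000)); // false
    }
}
```

Of course a finite scan is only a sanity check; the definition requires the inequality for all n ≥ n₀, which here follows from 10n ≤ n² for n ≥ 10.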

Example
Let f(n) = n², g(n) = n·lg n. Then:
- f ∉ O(g), since lim_{n→∞} f(n)/g(n) = lim_{n→∞} n²/(n lg n) = lim_{n→∞} n/lg n = lim_{n→∞} 1/(1/(n ln 2)) = ∞ (by L'Hôpital's rule).
- g ∈ O(f), since lim_{n→∞} g(n)/f(n) = lim_{n→∞} (n lg n)/n² = lim_{n→∞} (lg n)/n = 0.

Logarithmic Functions and Powers
Which grows faster, log₂ n or √n? By L'Hôpital's rule (applicable here since both functions tend to infinity and are differentiable):
lim_{n→∞} (log₂ n)/√n = lim_{n→∞} (1/(n ln 2))/(1/(2√n)) = lim_{n→∞} 2/(√n · ln 2) = 0.
So log₂ n ∈ O(√n), and in fact log₂ n ∈ o(√n).

The Result Generalized
- The log function grows more slowly than any positive power of n: lg n ∈ o(n^α) for any α > 0.
- By the way, any power of n grows more slowly than any exponential function with base greater than 1: n^k ∈ o(c^n) for any c > 1.
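A quick numeric illustration of the first claim (α = 1/2 is a sample value, and the class and method names are mine): the ratio (lg n)/√n shrinks toward 0 as n grows.

```java
// Sketch: the ratio (lg n) / sqrt(n) shrinks toward 0 as n grows,
// illustrating lg n in o(n^alpha) for the sample exponent alpha = 1/2.
class LogVsPowerDemo {
    static double ratio(double n) {
        return (Math.log(n) / Math.log(2)) / Math.sqrt(n);
    }

    public static void main(String[] args) {
        for (double n = 1e3; n <= 1e12; n *= 10) {
            System.out.printf("n = %.0e  lg(n)/sqrt(n) = %.6f%n", n, ratio(n));
        }
    }
}
```

The printed ratios decrease monotonically over this range (the ratio is decreasing for all n > e²), dropping below 10⁻⁴ by n = 10¹².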

Factorials and Exponential Functions
n! grows faster than 2ⁿ: n! > 2ⁿ for every integer n ≥ 4, and in fact 2ⁿ ∈ o(n!).

The Sets Ω and Θ
Definition: Given g: N→R⁺, Ω(g) is the set of functions f: N→R⁺ such that for some c ∈ R⁺ and some n₀ ∈ N, f(n) ≥ c·g(n) for all n ≥ n₀.
Limit characterization: f ∈ Ω(g) if lim_{n→∞} f(n)/g(n) > 0. Note: the limit may be infinity.
Definition: Given g: N→R⁺, Θ(g) = O(g) ∩ Ω(g).
Limit characterization: f ∈ Θ(g) if lim_{n→∞} f(n)/g(n) = c with 0 < c < ∞.

How Functions Grow

Algorithm:            1            2            3            4            5
Time function (µs):   33n          46n lg n     13n²         3.4n³        2ⁿ

Input size n     Solution time
10               0.00033 sec  0.0015 sec   0.0013 sec   0.0034 sec   0.001 sec
100              0.0033 sec   0.03 sec     0.13 sec     3.4 sec      4·10¹⁶ yr
1,000            0.033 sec    0.45 sec     13 sec       0.94 hr      3·10²⁸⁷ yr
10,000           0.33 sec     6.1 sec      22 min       39 days      10²⁹⁹⁷ yr
100,000          3.3 sec      1.3 min      1.5 days     108 yr       10³⁰⁰⁹⁰ yr

Time allowed     Maximum solvable input size (approx.)
1 second         30,000       2,000        280          67           19
1 minute         1,800,000    82,000       2,100        260          25

Increasing Computer Speed

Steps performed on        Max feasible       Max feasible input size
input of size n, f(n)     input size s       in t times as much time
lg n                      s₁                 s₁^t
n                         s₂                 t·s₂
n²                        s₃                 √t · s₃
2ⁿ                        s₄                 s₄ + lg t
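One row of this table can be replayed numerically (a sketch; the budget of 10⁶ steps, the factor t = 100, and the names are my choices): for an n²-step algorithm, multiplying the time budget by t multiplies the feasible input size by √t.

```java
// Sketch checking the n^2 row of the table: a time budget t times larger
// raises the feasible input size by a factor sqrt(t).
class SpeedupDemo {
    // Largest n with n*n <= budget (budget measured in steps).
    static long maxSizeQuadratic(long budget) {
        long n = (long) Math.sqrt((double) budget);
        while ((n + 1) * (n + 1) <= budget) n++;   // guard against rounding low
        while (n * n > budget) n--;                // guard against rounding high
        return n;
    }

    public static void main(String[] args) {
        long s = maxSizeQuadratic(1_000_000L);          // budget B
        long sNew = maxSizeQuadratic(100_000_000L);     // budget 100*B
        System.out.println(s + " -> " + sNew);          // size grows by sqrt(100) = 10
    }
}
```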

Sorting an Array of 1 Million Numbers
- Computer A (1000 MIPS = 10⁹ instructions/sec), using insertion sort, taking time 2n²: 2·(10⁶)²/10⁹ = 2000 seconds!
- Computer B (10 MIPS = 10⁷ instructions/sec), using merge sort, taking time 50·n·lg n: 50·10⁶·20/10⁷ ≈ 100 seconds!
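The arithmetic behind the slide can be replayed directly (method names are mine; 1000 MIPS is taken as 10⁹ instructions per second, 10 MIPS as 10⁷):

```java
// Worked arithmetic: computer A (10^9 instr/sec) runs insertion sort at
// 2n^2 instructions; computer B (10^7 instr/sec) runs merge sort at
// 50*n*lg(n) instructions; both sort n = 10^6 numbers.
class SortingSpeedDemo {
    static double insertionSeconds(double n, double ips) {
        return 2 * n * n / ips;
    }

    static double mergeSeconds(double n, double ips) {
        return 50 * n * (Math.log(n) / Math.log(2)) / ips;
    }

    public static void main(String[] args) {
        double n = 1e6;
        System.out.println(insertionSeconds(n, 1e9)); // 2000.0 seconds on A
        System.out.println(mergeSeconds(n, 1e7));     // ~100 seconds on B
    }
}
```

The slower machine with the asymptotically better algorithm wins by a factor of 20, which is the point of the slide.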

Properties of O, Ω and Θ
- Transitivity: if f ∈ O(g) and g ∈ O(h), then f ∈ O(h).
- Symmetry properties:
  - f ∈ O(g) if and only if g ∈ Ω(f)
  - f ∈ Θ(g) if and only if g ∈ Θ(f)
- Order of a sum: O(f + g) = O(max(f, g)).

Complexity Class
Let S be a set of functions f: N→R⁺ under consideration, and define the relation ~ on S by: f ~ g iff f ∈ Θ(g). Then ~ is an equivalence relation, and each set Θ(g) is an equivalence class, called a complexity class. We usually take the simplest possible element as the representative, e.g. Θ(n), Θ(n²), etc.

Searching an Ordered Array
The problem specification:
- Input: an array E containing n entries of numeric type, sorted in non-decreasing order, and a value K.
- Output: an index for which K = E[index], if K is in E; or −1, if K is not in E.

Sequential Search: the Procedure

int seqSearch(int[] E, int n, int K) {
    int ans = -1;                     // assume failure
    for (int index = 0; index < n; index++) {
        if (K == E[index]) {
            ans = index;              // success!
            break;
        }
    }
    return ans;
}

Searching a Sequence
For a given K, there are two possibilities:
- K is in E (say, with probability q); then K may be any one of the E[i] (say, each with equal probability 1/n).
- K is not in E (with probability 1 − q); then K falls in one of the n + 1 gaps gap(0), …, gap(n), where gap(i) is the interval between E[i] and E[i+1], gap(0) lies before E[1], and gap(n) lies after E[n] (say, each with equal probability 1/(n + 1)).

Improved Sequential Search
Since E is sorted, the scan can stop as soon as an entry larger than K is met: no more comparisons are needed.
- Worst-case complexity: n, unchanged.
- Average complexity: roughly n/2. Note that A(n) ∈ Θ(n), so the improvement is only in the constant factor.
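The "roughly n/2" estimate for successful searches can be checked by averaging over all positions of the key (a sketch assuming every successful position is equally likely; the class and method names are mine):

```java
// Sketch: the average number of comparisons of sequential search over all
// n equally likely successful positions is (n+1)/2, i.e. roughly n/2.
class SeqSearchAverageDemo {
    // Comparisons done by sequential search when the key sits at index i (0-based).
    static int comparisons(int i) { return i + 1; }

    static double averageSuccess(int n) {
        long total = 0;
        for (int i = 0; i < n; i++) total += comparisons(i);
        return (double) total / n;
    }

    public static void main(String[] args) {
        System.out.println(averageSuccess(1000)); // (1000 + 1) / 2 = 500.5
    }
}
```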

Divide and Conquer
If we compare K to every j-th entry, we can locate the small section of E that may contain K.
- To locate a section: roughly n/j steps at most.
- To search within a section: j steps at most.
So the worst case is about (n/j) + j; choosing j = √n minimizes this, giving (n/j) + j ∈ Θ(√n).
However, we can also apply the same strategy recursively within the small sections.
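The non-recursive version of this strategy is sometimes called jump search; here is a sketch (the name jumpSearch and the handling of the last partial block are my choices, not from the slides):

```java
// Sketch of the jump strategy described above: probe every j-th entry
// (j = floor(sqrt(n))) to find the block that may hold K, then scan it.
// Roughly n/j + j comparisons, i.e. about 2*sqrt(n) in the worst case.
class JumpSearchDemo {
    static int jumpSearch(int[] E, int K) {
        int n = E.length;
        if (n == 0) return -1;
        int j = Math.max(1, (int) Math.sqrt(n));
        int prev = 0, step = j;
        // Locate the block: advance while the block's last entry is < K.
        while (step < n && E[Math.min(step, n) - 1] < K) {
            prev = step;
            step += j;
        }
        // Linear scan within the block [prev, min(step, n)).
        for (int i = prev; i < Math.min(step, n); i++) {
            if (E[i] == K) return i;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] E = {2, 3, 5, 7, 11, 13, 17, 19, 23, 29};
        System.out.println(jumpSearch(E, 13)); // 5
        System.out.println(jumpSearch(E, 4));  // -1
    }
}
```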

Binary Search

int binarySearch(int[] E, int first, int last, int K) {
    int index;
    if (last < first) {
        index = -1;                       // range is empty: K not found
    } else {
        int mid = (first + last) / 2;
        if (K == E[mid])
            index = mid;
        else if (K < E[mid])
            index = binarySearch(E, first, mid - 1, K);
        else                              // K > E[mid]
            index = binarySearch(E, mid + 1, last, K);
    }
    return index;
}

Worst-Case Complexity of Binary Search
Observation: with each call of the recursive procedure, at most half of the entries are left for further consideration. At most ⌊lg n⌋ halvings can be made before the remaining range shrinks below one entry. So the worst-case complexity of binary search is ⌊lg n⌋ + 1 = ⌈lg(n + 1)⌉ comparisons.
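This bound can be tested by instrumenting binary search to count its three-way probes (a sketch; the counting scheme and the test array of even keys are my choices, made so that odd keys exercise every failure position):

```java
// Sketch: an instrumented binary search counting three-way probes;
// the count never exceeds floor(lg n) + 1 = ceil(lg(n + 1)).
class BinarySearchCountDemo {
    static int count;

    static int search(int[] E, int first, int last, int K) {
        if (last < first) return -1;
        int mid = (first + last) / 2;
        count++;                           // one three-way comparison with E[mid]
        if (K == E[mid]) return mid;
        if (K < E[mid]) return search(E, first, mid - 1, K);
        return search(E, mid + 1, last, K);
    }

    // Worst observed probe count over all successful and failing keys.
    static int worstCase(int n) {
        int[] E = new int[n];
        for (int i = 0; i < n; i++) E[i] = 2 * i;  // even keys succeed, odd fail
        int worst = 0;
        for (int k = -1; k <= 2 * n; k++) {
            count = 0;
            search(E, 0, n - 1, k);
            worst = Math.max(worst, count);
        }
        return worst;
    }

    public static void main(String[] args) {
        int n = 1000;
        int bound = (int) Math.ceil(Math.log(n + 1) / Math.log(2));
        System.out.println(worstCase(n) + " <= " + bound); // 10 <= 10
    }
}
```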

Average Complexity of Binary Search
Observation: for most inputs, the number of comparisons equals or is very close to that of the worst case; in particular, if n = 2^k − 1, all failure positions need exactly k comparisons.
Assumption: all success positions are equally likely (probability 1/n each), and n = 2^k − 1.

Average Complexity of Binary Search
Average complexity for successful search: let s_t be the number of inputs for which the algorithm does exactly t comparisons; for n = 2^k − 1, s_t = 2^(t−1). Then
A(n) = (1/n) Σ_{t=1..k} t·s_t = (1/n) Σ_{t=1..k} t·2^(t−1) = ((k − 1)·2^k + 1)/n ≈ k − 1 = lg(n + 1) − 1.
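The closed form can be checked against a brute-force average over all successful positions (a sketch; the iterative probe counter and the method names are mine):

```java
// Sketch verifying the average-case sum for n = 2^k - 1: exactly s_t = 2^(t-1)
// successful positions need t probes, so the average over the n positions is
// ((k - 1) * 2^k + 1) / n, which is close to k - 1 = lg(n + 1) - 1.
class BinaryAverageDemo {
    // Probe count of binary search for key K in sorted array E.
    static int probes(int[] E, int K) {
        int first = 0, last = E.length - 1, c = 0;
        while (first <= last) {
            int mid = (first + last) / 2;
            c++;
            if (K == E[mid]) return c;
            if (K < E[mid]) last = mid - 1; else first = mid + 1;
        }
        return c;
    }

    static double averageSuccess(int k) {       // brute-force average, n = 2^k - 1
        int n = (1 << k) - 1;
        int[] E = new int[n];
        for (int i = 0; i < n; i++) E[i] = i;
        long total = 0;
        for (int i = 0; i < n; i++) total += probes(E, i);
        return (double) total / n;
    }

    static double formula(int k) {              // ((k - 1) * 2^k + 1) / n
        int n = (1 << k) - 1;
        return ((double) (k - 1) * (1 << k) + 1) / n;
    }

    public static void main(String[] args) {
        System.out.println(averageSuccess(10)); // ~9.01, matches formula(10)
        System.out.println(formula(10));
    }
}
```

For k = 10 (n = 1023) both values are about 9.01, one less than the worst case of 10, as the slide claims.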

Decision Tree
A decision tree for algorithm A and a given input of size n is a binary tree whose nodes are labeled with numbers between 0 and n − 1:
- The root is labeled with the index compared first.
- If the label on a particular node is i, then its left child is labeled with the index compared next if K < E[i], and its right child with the index compared next if K > E[i]; there is no branch for the case K = E[i].

Binary Search Is Optimal
If the number of comparisons in the worst case is p, then the longest path from the root to a leaf has p − 1 edges, so there are at most 2^p − 1 nodes in the tree. On the other hand, there are at least n nodes in the tree (we can prove that for every i ∈ {0, 1, …, n − 1} there exists a node with label i). Conclusion: n ≤ 2^p − 1, that is, p ≥ lg(n + 1).

Binary Search Is Optimal
Claim: for every i ∈ {0, 1, …, n − 1}, there exists a node with label i.
Proof: Suppose otherwise, i.e. some i does not appear in the tree. Make up two inputs E₁ and E₂ with E₁[i] = K and E₂[i] = K′ where K′ > K, and with E₁[j] = E₂[j] for every j ≠ i (0 ≤ j ≤ n − 1), arranging the values so that both arrays remain sorted. Since i does not appear in the tree, the algorithm behaves exactly alike on both inputs and gives the same output, of which at least one is wrong. So A is not a correct algorithm, a contradiction.

Finding the Fake Coin
[Situation] You have 70 coins that are all supposed to be gold coins of the same weight, but you know that one coin is fake and weighs less than the others. You have a balance scale: you can put any number of coins on each side of the scale at one time, and it will tell you whether the two sides weigh the same, or which side is lighter if they don't.
[Problem] Outline an algorithm for finding the fake coin. How many weighings will you do?

Outline of an Algorithm
Split the candidates into three nearly equal groups and weigh two of them; each outcome of a weighing eliminates two of the three groups.
Weighing 1: [1..23] : [24..46], with coins 47..70 set aside.
  < (left lighter): fake in [1..23]; Weighing 2: [1..8] : [9..16]
  = (balanced):     fake in [47..70]; Weighing 2: [47..54] : [55..62]
  > (right lighter): fake in [24..46]; Weighing 2: [24..31] : [32..39]
Continuing in the same way, e.g. in the "=" branch, Weighing 3 is [47..49] : [50..52], [55..57] : [58..60], or [63..65] : [66..68], and Weighing 4 compares single coins, e.g. [47] : [48], [50] : [51], or [53] : [54]. Done! At most 4 weighings suffice (⌈log₃ 70⌉ = 4).
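The strategy can be simulated to confirm the weighing count (a sketch under my own conventions: coins are numbered 0..n−1, the split uses pans of ⌈size/3⌉ coins each, and all names are hypothetical):

```java
// Sketch of the weighing strategy above: repeatedly split the candidate
// range into three nearly equal groups, weigh two of them, and keep the
// group the outcome points to. 70 coins need at most ceil(log3(70)) = 4
// weighings.
class FakeCoinDemo {
    // Returns the number of weighings used to find the light coin `fake`
    // among coins 0..n-1.
    static int findFake(int n, int fake) {
        int lo = 0, hi = n - 1, weighings = 0;
        while (lo < hi) {
            int size = hi - lo + 1;
            int third = (size + 2) / 3;        // ceil(size/3): size of each pan
            int aEnd = lo + third - 1;         // pan A: [lo, aEnd]
            int bEnd = aEnd + third;           // pan B: (aEnd, bEnd]
            weighings++;                       // weigh pan A against pan B
            if (fake <= aEnd) hi = aEnd;                          // A lighter
            else if (fake <= bEnd) { lo = aEnd + 1; hi = bEnd; }  // B lighter
            else lo = bEnd + 1;                                   // balanced
        }
        return weighings;
    }

    // Worst case over all possible positions of the fake coin.
    static int worstWeighings(int n) {
        int worst = 0;
        for (int fake = 0; fake < n; fake++) {
            worst = Math.max(worst, findFake(n, fake));
        }
        return worst;
    }

    public static void main(String[] args) {
        System.out.println(worstWeighings(70)); // 4
    }
}
```

This also explains why the balance beats binary splitting: each weighing has three outcomes, so the information-theoretic bound is log base 3, not log base 2.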

Home Assignment pp.63 –