Algorithm Design and Analysis
Focus: developing algorithms abstractly, independent of programming language, data types, etc.
Think of a stack or queue: not specific to C++, but can be implemented when needed
Addressed in depth during COSC 320
Develop mathematical tools to analyze the costs of algorithms: time and space
Big-O Notation
Big-O notation: a way to formally quantify the complexity (time and/or space cost) of a function
Definition: a function f is said to be "big-oh" of a function g, denoted f = O(g), if there exist constants C, N > 0 such that |f(x)| <= C|g(x)| for all x > N.
In words: eventually, some multiple of the function g will outgrow the function f.
Functions here will typically denote the runtime of an algorithm; the argument will denote the size of the input.
Big-O captures the growth of the algorithm's runtime with respect to the size of the input given.
E.g., how does sorting 100 items compare to sorting 1,000? 10,000? 1,000,000?
Examples
x^2 + x + 1 is O(x^3), but it is also O(x^2)
log(x) is O(x)
x^2 is not O(x)
A function can also be O(1), meaning it does not grow faster than a constant: it is upper-bounded for all x!
"Linear" vs. "quadratic" vs. "cubic": polynomials grow according to their highest power
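To see how the constants in the definition work (a worked check added here, not on the original slide): for all x > 1 we have x^2 + x + 1 <= x^2 + x^2 + x^2 = 3x^2, so taking C = 3 and N = 1 witnesses x^2 + x + 1 = O(x^2).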
Big-O Captures "Growth"
Note: constants don't matter
Note: adding functions with "smaller" growth rates doesn't matter
If f1 is O(g1) and f2 is O(g2), then f1 + f2 is O(max(g1, g2))
Algorithmically: if we run two algorithms, each with its own growth rate, the total time cost is asymptotically only the larger of the two
Put another way: adding the "same" or "lesser" runtime to an algorithm is "free"!
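As a worked illustration (our numbers, not from the slide): if f1(n) = 5n^2 and f2(n) = 100n, then f1(n) + f2(n) <= 5n^2 + 100n^2 = 105n^2 for all n >= 1, so the sum is O(n^2) = O(max(n^2, n)), and the linear term rides along for free.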
Example: Searching a Sorted List
We have seen one method already: "linear search"
Given: a list (or array, or similar structure of data) with the requirement that it is already sorted in either increasing or decreasing order
Algorithm: start at the beginning; while the item is not found, scan through the list
If we find the item, return "true"; otherwise return "false" once we get to the end
Runtime: if there are n items in the list, the total time cost is O(n)
It won't be exactly n, because we need to manage temp variables, function calls, etc.
These things only add a constant cost increase, which is less than n!
Method One: Linear Search
Idea: check each element in the array; stop if you find the one you want
Given array: [1, 15, 6, 40, 35, 99]
Target: 40
i = 0 -> i = 1 -> i = 2 -> i = 3 -> Found! Returns i = 3
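A minimal C++ sketch of this search (the name linearSearch is ours, and it returns the index as in the walkthrough rather than the true/false of the previous slide):

#include <vector>

// Check each element in turn; stop as soon as the target is found.
// Returns the index of target in arr, or -1 if it is not present.
int linearSearch(const std::vector<int>& arr, int target) {
    for (int i = 0; i < static_cast<int>(arr.size()); ++i) {
        if (arr[i] == target) {
            return i;  // found: report the position, e.g. i = 3 above
        }
    }
    return -1;  // reached the end without finding target
}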
Better Method: Binary Search
Given: a sorted array (increasing, for definiteness) and a target x
Algorithm: use three variables, "left", "right", and "middle"
Start "left" at index 0 and "right" at index length - 1
While "left" <= "right":
    Set "middle" to (left + right) / 2
    If array[middle] == x, return true
    Else if array[middle] < x, set "left" to middle + 1
    Else, set "right" to middle - 1
Return false once the range is empty
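This pseudocode translates directly to C++; here is a sketch (binarySearch is our name for it):

#include <vector>

// Search a sorted (increasing) array for x by repeatedly halving the range.
bool binarySearch(const std::vector<int>& arr, int x) {
    int left = 0;
    int right = static_cast<int>(arr.size()) - 1;
    while (left <= right) {
        int middle = left + (right - left) / 2;  // same as (left + right) / 2, but avoids overflow
        if (arr[middle] == x) {
            return true;           // found the target
        } else if (arr[middle] < x) {
            left = middle + 1;     // x can only be in the upper half
        } else {
            right = middle - 1;    // x can only be in the lower half
        }
    }
    return false;  // range is empty: x is not present
}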
Example
First: the array gets sorted! Array: [1, 5, 16, 40, 55, 99]
Target: 55
[Figure: the "bottom", "middle", and "top" markers narrow onto index 4, where 55 is found]
What about target 50? What about target 60?
Cost of Binary Search
On each iteration of the "while" loop, half of the search range is discarded!
If we start with 100 elements, in the worst case the search space evolves (approximately) as: 100 -> 50 -> 25 -> 13 -> 7 -> 4 -> 2 -> 1
Only 8 iterations of the while loop!
In general, for an array of size n, it will take k iterations, where k is the largest integer such that n/2^k >= 1
Simplified: k will be about log2(n)! (For n = 1,000,000, that is only about 20 iterations.)
Cost of sorting: about n*log(n) operations. How? We will see later.
Linear vs. Binary Search
Both are correct
Binary search requires sorting, which takes extra time
But sorting only needs to happen once!
For multiple queries, then, compare n vs. log(n) time per query!
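A rough worked comparison (our numbers, not from the slide): with n = 1,000,000 items and 1,000 queries, repeated linear search costs about 1,000 * n = 10^9 comparisons, while sorting once (about n*log2(n), roughly 2 x 10^7) plus 1,000 binary searches (about 1,000 * log2(n), roughly 2 x 10^4) totals around 2 x 10^7, some 50 times less work.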
Sorting (Slowly)
Algorithm: Bubble Sort
Given: an unsorted array or list
Do
    Set swap flag to false
    For count = 0 through length - 2 (so count + 1 stays in bounds)
        If array[count] is greater than array[count + 1]
            Swap the contents of array[count] and array[count + 1]
            Set swap flag to true
While any elements have been swapped
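In C++, this pseudocode might look like the following sketch (bubbleSort is our name):

#include <utility>
#include <vector>

// Sweep the array repeatedly, swapping adjacent out-of-order pairs,
// until a full pass makes no swaps (the array is then sorted).
void bubbleSort(std::vector<int>& arr) {
    bool swapped = true;  // the "swap flag" from the pseudocode
    while (swapped) {
        swapped = false;
        for (std::size_t count = 0; count + 1 < arr.size(); ++count) {
            if (arr[count] > arr[count + 1]) {
                std::swap(arr[count], arr[count + 1]);
                swapped = true;
            }
        }
    }
}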
Bubble Sort: Time Complexity
Each element only moves one space toward its proper position on each iteration of the for-loop
In the worst case, the for-loop will run about n times for each of the n elements, meaning a cost of O(n * n) = O(n^2) operations
Selection Sort
A (slightly) better approach is selection sort
Algorithm:
Set "position" = 0
Find the smallest element from "position" to the end of the array
Swap that smallest element with the one at "position"
Increment "position" and repeat until "position" = length
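A C++ sketch following these steps (selectionSort is our name):

#include <utility>
#include <vector>

// For each position, find the smallest remaining element and swap it in.
void selectionSort(std::vector<int>& arr) {
    for (std::size_t position = 0; position < arr.size(); ++position) {
        // Find the smallest element from position to the end of the array.
        std::size_t smallest = position;
        for (std::size_t i = position + 1; i < arr.size(); ++i) {
            if (arr[i] < arr[smallest]) {
                smallest = i;
            }
        }
        std::swap(arr[position], arr[smallest]);  // place it at position
    }
}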
Selection Sort
Total complexity: O(n^2)
Proof:
Time to find the smallest of k elements: O(k)
First loop: O(n); second loop: O(n - 1); third loop: O(n - 2); ...
In general: 1 + 2 + ... + n = n(n + 1)/2 = (n^2 + n)/2 = O(n^2)
Recursion
How do we achieve the O(n*log(n)) complexity promised?
Need a new tool: recursive algorithms
An algorithm which calls itself
Example recursive computation: the Fibonacci sequence
The n-th term is defined as the sum of the two previous ones
The first is 1, the second is 1
Formally: f(n) = f(n - 1) + f(n - 2), with f(0) = 1 and f(1) = 1
Recursion
In C++, a recursive solution to compute the n-th Fibonacci number:

int fib(int n) {
    if (n == 1 || n == 0) {
        return 1;  // base cases: f(0) = 1 and f(1) = 1
    }
    return fib(n - 1) + fib(n - 2);  // sum of the two previous terms
}
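A quick check of the corrected function (a hypothetical usage example, not from the slides):

#include <iostream>

int main() {
    // With f(0) = 1 and f(1) = 1 the sequence runs 1, 1, 2, 3, 5, 8, ...
    std::cout << fib(5) << std::endl;  // prints 8
    return 0;
}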