Matt Schierholtz
An algorithm is a method for solving a problem in a finite number of steps.
Algorithm example:
- Error check for the problem
- Solve the problem
- Must terminate
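A minimal sketch in Python (an illustration of the definition, not an algorithm from these slides): Euclid's gcd method solves its problem in a finite number of steps and must terminate, because the second value strictly decreases toward zero.

    def gcd(a: int, b: int) -> int:
        # Greatest common divisor of two non-negative integers (Euclid's algorithm).
        while b != 0:           # each pass replaces (a, b) with (b, a mod b)
            a, b = b, a % b     # a mod b < b, so the loop must terminate
        return a

    print(gcd(252, 105))        # -> 21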
Algorithms vary greatly in the memory space or time they require.
Performance is classified into worst, best, and average cases, described with Big O, Big Θ, and Big Ω notation.
O(f(x)) means the running time grows at most as fast as f(x).
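For reference, the standard definition behind that last statement (not spelled out on the slide): g(x) is O(f(x)) if there are constants C > 0 and x₀ such that |g(x)| ≤ C·|f(x)| for all x ≥ x₀; Ω gives the corresponding lower bound, and Θ means both bounds hold.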
Nested loop example:
    s = 0
    for i = 1 to n
        for j = 1 to i
            s = s + j(i − j + 1)
        next j
    next i
Find the number of steps this algorithm takes.
Compare the efficiency of particular algorithms.
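To find the number of steps, count how many times the inner statement runs: 1 + 2 + … + n = n(n + 1)/2, so the algorithm is Θ(n²). A quick Python check of that count (the function name is mine):

    def nested_loop(n: int):
        s = 0
        steps = 0                          # counts executions of the inner statement
        for i in range(1, n + 1):
            for j in range(1, i + 1):
                s = s + j * (i - j + 1)
                steps += 1
        return s, steps

    n = 100
    _, steps = nested_loop(n)
    print(steps, n * (n + 1) // 2)         # both print 5050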
log_b x = a (meaning b^a = x) is very useful in computer calculations.
For example, log₂ 1024 = 10 and log₂ 1 048 576 = 20.
⌊log₂ x⌋ + 1 is the number of bits used to represent a number x in binary; with base 3 it counts ternary digits, and likewise for any number base you want.
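A quick check of the bit count in Python (base 2; math.log2 is exact enough for these small values):

    import math

    for x in (1, 5, 1024, 1_000_000):
        bits = math.floor(math.log2(x)) + 1    # ⌊log₂ x⌋ + 1
        print(x, bits, x.bit_length())         # the built-in count agrees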
Binary search takes less time than sequential search; it saves time by requiring organization: the array a[1..n] must already be sorted.
Example (index = 0 means x has not been found yet):
    index = 0, bot = 1, top = n
    while (top >= bot and index = 0)
        mid = ⌊(bot + top) / 2⌋
        if a[mid] = x then
            index = mid
        else if a[mid] > x then
            top = mid − 1
        else
            bot = mid + 1
    end while
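A runnable Python sketch of the same search (0-based indexing and a return value of -1 for "not found", instead of the slide's 1-based array and index = 0 sentinel):

    def binary_search(a, x):
        bot, top = 0, len(a) - 1
        while top >= bot:
            mid = (bot + top) // 2       # ⌊(bot + top) / 2⌋
            if a[mid] == x:
                return mid               # found
            elif a[mid] > x:
                top = mid - 1            # discard the upper half
            else:
                bot = mid + 1            # discard the lower half
        return -1                        # x is not in a

    print(binary_search([2, 3, 5, 7, 11, 13], 7))    # -> 3
    print(binary_search([2, 3, 5, 7, 11, 13], 4))    # -> -1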
Merge sort:
- Easier to write than binary search, but takes more time
- Think recursively
- Suppose an efficient algorithm for sorting arrays of length less than k is known
- What can you do with an array of length k?
Sort the smaller parts, then merge them.
(The slide's diagram: the initial number group is split, the parts get sorted, and the sorted parts are merged; a code sketch follows below.)
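A short recursive Python sketch of that idea (illustrative, not the slides' exact code):

    def merge_sort(a):
        if len(a) <= 1:                  # arrays of length 0 or 1 are already sorted
            return a
        mid = len(a) // 2
        left = merge_sort(a[:mid])       # sort the smaller parts...
        right = merge_sort(a[mid:])
        return merge(left, right)        # ...then merge

    def merge(left, right):
        out, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                out.append(left[i]); i += 1
            else:
                out.append(right[j]); j += 1
        return out + left[i:] + right[j:]

    print(merge_sort([38, 27, 43, 3, 9, 82, 10]))    # -> [3, 9, 10, 27, 38, 43, 82]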
Algorithms with exponential order. Example: the Tower of Hanoi.
If it has 64 disks, the number of moves is 2^64 − 1.
For a computer making 10^9 moves per second, this will take about 584 years.
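The arithmetic behind that figure, as a quick Python check (assuming 365.25-day years):

    moves = 2**64 - 1                        # moves needed for 64 disks
    seconds = moves / 10**9                  # at 10^9 moves per second
    years = seconds / (365.25 * 24 * 3600)
    print(int(years))                        # -> 584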
Algorithms that run in polynomial time are class P; these are tractable, even if they take a very long time.
Problems that cannot be solved in polynomial time are intractable.
Class NP (nondeterministic polynomial time): problems whose solutions can be verified in polynomial time.
There is a $1,000,000 prize for anyone who proves whether or not P = NP.
NP-complete: if any one NP-complete problem can be solved in polynomial time, then every NP problem can.
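A rough illustration of the distinction (my example, not from the slides), using subset sum: verifying a proposed answer is fast, but the obvious search tries all 2^n subsets.

    from itertools import combinations

    def verify(nums, subset, target):
        # Polynomial-time check of a proposed certificate (simplified: ignores duplicates).
        return all(x in nums for x in subset) and sum(subset) == target

    def brute_force(nums, target):
        # Exponential-time search over every subset.
        for r in range(len(nums) + 1):
            for combo in combinations(nums, r):
                if sum(combo) == target:
                    return combo
        return None

    nums = [3, 34, 4, 12, 5, 2]
    print(brute_force(nums, 9))      # -> (4, 5)
    print(verify(nums, (4, 5), 9))   # -> True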