CSE 326: Data Structures Lecture #24 The Algorhythmics Steve Wolfman Winter Quarter 2000 Today we’ll look at a pair of interesting and important graph algorithms that also happen to make good use of data structures! The algorithms were developed by Dijkstra and Kruskal.
Today’s Outline Greedy Divide & Conquer Dynamic Programming Randomized Backtracking Since I’m getting filmed today, I need you to sign a release. The form just says that you don’t mind being filmed. Then, we can finish off some graph properties from Wednesday. It’s important to understand what these terms mean! Then, we’ll hit the two algorithms.
Greedy Algorithms Repeat until problem is solved: Measure options according to marginal value Commit to maximum Greedy algorithms are normally fast and simple. Sometimes appropriate as a heuristic solution or to approximate the optimal solution.
Hill-Climbing [figure: a search landscape with the Global Maximum and a Local Maximum labeled; a greedy hill-climber can get stuck at the local maximum]
Greed in Action Best First Search A* Search Huffman Encodings Kruskal’s Algorithm Dijkstra’s Algorithm Prim’s Algorithm Scheduling
Scheduling Problem Given: a group of tasks {T1, …, Tn} each with a duration {d1, …, dn} a single processor without interrupts Select an order for the tasks that minimizes average completion time Example durations: T1 = 1, T2 = 1.5, T3 = 0.5, T4 = 0.3, T5 = 2, T6 = 1.4 Average time to completion: (C1 + C2 + … + Cn) / n, where Ci is the time at which the ith task to run finishes
Greedy Solution Run the shortest remaining task first: sort the tasks by increasing duration and execute them in that order (shortest job first).
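As a concrete illustration (not from the original slides; the function name averageCompletionTime and the main driver are invented for this sketch), here is a short C++ version of the greedy rule applied to the example durations:

#include <algorithm>
#include <cstdio>
#include <vector>
using namespace std;

// Greedy scheduling: run the shortest remaining task first.
// durations[i] is the duration of task Ti (assumed non-empty).
double averageCompletionTime(vector<double> durations) {
  sort(durations.begin(), durations.end());   // shortest job first
  double elapsed = 0.0, totalCompletion = 0.0;
  for (double d : durations) {
    elapsed += d;                             // this task finishes at time 'elapsed'
    totalCompletion += elapsed;
  }
  return totalCompletion / durations.size();
}

int main() {
  vector<double> d = {1.0, 1.5, 0.5, 0.3, 2.0, 1.4};  // the slide's example tasks
  printf("average completion time: %.3f\n", averageCompletionTime(d));
}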
Proof of Correctness Common technique for proving correctness of greedy algorithms is by contradiction: assume there is a non-greedy best way show that making that way more like the greedy solution improves the supposedly best way: contradiction! Proof of correctness for greedy scheduling: suppose a supposedly optimal schedule runs some longer task immediately before a shorter one. Swapping the two leaves every other task’s completion time unchanged, and the pair’s combined completion time drops (the later finish time stays the same, but the earlier finish now belongs to the shorter task). So the average completion time decreases, contradicting optimality; the shortest-job-first order must be optimal.
Divide & Conquer Divide problem into multiple smaller parts Solve smaller parts Solve base cases directly Otherwise, solve subproblems recursively Merge solutions together (Conquer!) Often leads to elegant and simple recursive implementations.
Divide & Conquer in Action Mergesort Quicksort buildHeap buildTree Closest points
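As a concrete instance of the template above, here is a minimal mergesort sketch in C++ (an added illustration, not taken from the slides): divide the range in half, sort each half recursively, and merge the two sorted halves.

#include <vector>
using namespace std;

// Merge the two sorted halves a[lo, mid) and a[mid, hi) back into a.
void mergeHalves(vector<int>& a, int lo, int mid, int hi) {
  vector<int> tmp;
  tmp.reserve(hi - lo);
  int i = lo, j = mid;
  while (i < mid && j < hi) tmp.push_back(a[i] <= a[j] ? a[i++] : a[j++]);
  while (i < mid) tmp.push_back(a[i++]);
  while (j < hi)  tmp.push_back(a[j++]);
  for (int k = 0; k < (int)tmp.size(); ++k) a[lo + k] = tmp[k];
}

// Divide & conquer: base case, two subproblems, then merge (conquer).
void mergesort(vector<int>& a, int lo, int hi) {
  if (hi - lo <= 1) return;        // base case: 0 or 1 elements are already sorted
  int mid = lo + (hi - lo) / 2;    // divide
  mergesort(a, lo, mid);           // solve subproblems recursively
  mergesort(a, mid, hi);
  mergeHalves(a, lo, mid, hi);     // merge solutions together
}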
Closest Points Problem Given: a group of points {(x1, y1), …, (xn, yn)} Return the distance between the closest pair of points [figure: example point set in which the closest pair is 0.75 apart]
Closest Points Algorithm Closest pair is: closest pair on left or closest pair on right or closest pair spanning the middle runtime: checking every left-right pair that might span the middle takes O(n^2), so T(n) = 2T(n/2) + O(n^2), which is O(n^2)
Closest Points Algorithm Refined Closest pair is: closest pair on left or closest pair on right or closest pair in the middle strip, where each strip point is compared only to strip points within δ of it vertically (δ = the smaller of the two recursive answers, so only a constant number of comparisons per point) runtime: with the points presorted by y the combine step is O(n), so T(n) = 2T(n/2) + O(n) = O(n log n)
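Here is a sketch of the refined algorithm in C++ (an added illustration; Point, closestRec, and closestPair are names invented for this sketch). It assumes the points are sorted by x once up front; the strip is re-sorted by y inside each call, which costs an extra log factor (O(n log^2 n)) rather than the O(n log n) you get by presorting by y as well.

#include <algorithm>
#include <cmath>
#include <limits>
#include <vector>
using namespace std;

struct Point { double x, y; };

double dist(const Point& a, const Point& b) { return hypot(a.x - b.x, a.y - b.y); }

// pts[lo, hi) is sorted by x; returns the closest-pair distance in that range.
double closestRec(vector<Point>& pts, int lo, int hi) {
  int n = hi - lo;
  double best = numeric_limits<double>::infinity();
  if (n <= 3) {                                   // base case: brute force
    for (int i = lo; i < hi; ++i)
      for (int j = i + 1; j < hi; ++j) best = min(best, dist(pts[i], pts[j]));
    return best;
  }
  int mid = lo + n / 2;
  double midX = pts[mid].x;
  best = min(closestRec(pts, lo, mid), closestRec(pts, mid, hi));

  // Combine: only points within 'best' of the dividing line can span the middle.
  vector<Point> strip;
  for (int i = lo; i < hi; ++i)
    if (fabs(pts[i].x - midX) < best) strip.push_back(pts[i]);
  sort(strip.begin(), strip.end(),
       [](const Point& a, const Point& b) { return a.y < b.y; });

  // Each strip point is compared only to strip points within 'best' vertically.
  for (size_t i = 0; i < strip.size(); ++i)
    for (size_t j = i + 1;
         j < strip.size() && strip[j].y - strip[i].y < best; ++j)
      best = min(best, dist(strip[i], strip[j]));
  return best;
}

double closestPair(vector<Point> pts) {
  sort(pts.begin(), pts.end(),
       [](const Point& a, const Point& b) { return a.x < b.x; });
  return closestRec(pts, 0, (int)pts.size());
}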
Memoizing/ Dynamic Programming Define problem in terms of smaller subproblems Solve and record solution for base cases Build solutions for subproblems up from solutions to smaller subproblems Can improve runtime of divide & conquer algorithms that have shared subproblems with optimal substructure. Usually involves a table of subproblem solutions.
Dynamic Programming in Action Sequence Alignment Fibonacci numbers All pairs shortest path Optimal Binary Search Tree
Fibonacci Numbers F(n) = F(n - 1) + F(n - 2), F(0) = 1, F(1) = 1

“Divide” & Conquer (recomputes the same subproblems over and over, so it takes exponential time):
int fib(int n) {
  if (n <= 1) return 1;
  else return fib(n - 1) + fib(n - 2);
}

Memoized (each F(i) is computed only once; note the memo table must be grown before it is indexed):
int fib(int n) {
  static vector<int> fibs;           // fibs[i] caches F(i); 0 means “not yet computed”
  if (n <= 1) return 1;
  if ((int)fibs.size() <= n)
    fibs.resize(n + 1, 0);           // grow the table so fibs[n] is a valid index
  if (fibs[n] == 0)
    fibs[n] = fib(n - 1) + fib(n - 2);
  return fibs[n];
}
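The memoized version above works top-down; the table can instead be filled bottom-up, directly following the “build up from smaller subproblems” recipe. A tiny illustrative sketch (fibDP is not a name from the slides):

#include <algorithm>
#include <vector>
using namespace std;

int fibDP(int n) {
  vector<int> f(max(n + 1, 2), 1);   // base cases: f[0] = f[1] = 1
  for (int i = 2; i <= n; ++i)
    f[i] = f[i - 1] + f[i - 2];      // each entry built from smaller subproblems
  return f[n];
}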
Optimal Binary Search Tree Problem Given: a set of words {w1, …, wn} probabilities of each word’s occurrence {p1, …, pn} Produce a Binary Search Tree which includes all the words and has the lowest expected cost: Expected cost = p1·(d1 + 1) + … + pn·(dn + 1) (where di is the depth of word i in the tree, so di + 1 is the number of nodes examined when searching for wi)
The Optimal BST Optimal Substructure Shared Subproblems [figure: example BSTs over a small word set, illustrating that the subtrees of an optimal tree must themselves be optimal, and that different subproblems share the same subtrees] Can an optimal solution possibly have suboptimal subtrees?
Optimal BST Cost Let CLeft,Right be the cost of the optimal subtree containing wLeft through wRight. Then, choosing the best root wr, CLeft,Right is: CLeft,Right = min over Left ≤ r ≤ Right of (CLeft,r-1 + Cr+1,Right) + (pLeft + … + pRight) (the probability sum is added because every word in the range sits one level deeper below the chosen root) Let’s maintain a 2-D table to store values of Ci,j
Optimal BST Algorithm [figure: the 2-D table of Ci,j values for the words a, am, and, egg, if, the, two; entries are filled in diagonal by diagonal, from the one-word ranges (a…a, am…am, …) up to the full range a…two]
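A sketch of the table-filling algorithm in C++, following the recurrence above (an added illustration; optimalBSTCost and the prefix-sum helper are invented names, and this version returns only the optimal cost, not the tree itself). Ranges are processed in order of increasing length, so Ci,r-1 and Cr+1,j are already in the table when Ci,j is computed.

#include <algorithm>
#include <limits>
#include <vector>
using namespace std;

// p[i] = probability of word i (words assumed to already be in sorted order).
double optimalBSTCost(const vector<double>& p) {
  int n = (int)p.size();
  if (n == 0) return 0.0;
  // C[i][j] = cost of the optimal subtree over words i..j (empty ranges cost 0).
  vector<vector<double>> C(n, vector<double>(n, 0.0));
  // prefix[k] = p[0] + ... + p[k-1], so the sum over i..j is prefix[j+1] - prefix[i].
  vector<double> prefix(n + 1, 0.0);
  for (int i = 0; i < n; ++i) prefix[i + 1] = prefix[i] + p[i];

  for (int len = 1; len <= n; ++len) {               // build up from short ranges
    for (int i = 0; i + len - 1 < n; ++i) {
      int j = i + len - 1;
      double best = numeric_limits<double>::infinity();
      for (int r = i; r <= j; ++r) {                 // try each word in i..j as the root
        double left  = (r > i) ? C[i][r - 1] : 0.0;
        double right = (r < j) ? C[r + 1][j] : 0.0;
        best = min(best, left + right);
      }
      C[i][j] = best + (prefix[j + 1] - prefix[i]);  // every word drops one level
    }
  }
  return C[0][n - 1];
}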
To Do Finish Project IV!!!!!! Read Chapter 10 (algorithmic techniques) Work on final review (remember you can and should work with others) That’s it. You should finish off project IV this weekend. Tell us if you’re having trouble! Read chapter 9 on graphs and start reading chapter 10 on algorithmic techniques. We’ll talk about chapter 10 Monday and Wednesday of next week. Finally, start working on that final review. It’s not homework, but it will help you get ready for the final. We’ll have the official final review next week.
Coming Up Course Wrap-up Project IV due (March 7th; that’s tomorrow!) Final Exam (March 13th)