CSE 331: Review August 1, 2013

Main Steps in Algorithm Design
Real world problem → Problem Statement → Problem Definition (precise mathematical definition) → Algorithm → "Implementation" (data structures) → Analysis (correctness / run time)

Stable Matching Problem
Gale-Shapley Algorithm

Stable Marriage Problem
Set of men M and women W
Preferences: each person ranks all potential spouses
Matching: a set of pairs in M x W with no polygamy
Perfect matching: everyone gets married
Instability: a pair (m, w') not matched to each other who both prefer each other to their assigned partners
Stable matching = perfect matching + no instability

Gale-Shapley Algorithm
Initially all men and women are free
While there exists a free woman who can propose
  Let w be such a woman and m the best man she has not yet proposed to
  w proposes to m
  If m is free
    (m, w) get engaged
  Else (m, w') are engaged
    If m prefers w' to w
      w remains free
    Else
      (m, w) get engaged and w' becomes free
Output the engaged pairs as the final output
At most n^2 iterations; each iteration can be implemented in O(1) time
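The woman-proposing algorithm above can be sketched in Python as follows; the dict-based interface and variable names are my own, not from the lecture:

```python
def gale_shapley(woman_prefs, man_prefs):
    """Gale-Shapley with women proposing, as on the slide.

    woman_prefs[w] / man_prefs[m]: lists ranking the other side, best first.
    Returns a dict mapping each man to the woman he ends up engaged to.
    """
    # rank[m][w] = position of w in m's list (lower = more preferred)
    rank = {m: {w: i for i, w in enumerate(prefs)} for m, prefs in man_prefs.items()}
    next_proposal = {w: 0 for w in woman_prefs}   # index of the next man w proposes to
    engaged_to = {}                               # man -> woman
    free_women = list(woman_prefs)
    while free_women:
        w = free_women.pop()
        m = woman_prefs[w][next_proposal[w]]      # best man w has not yet proposed to
        next_proposal[w] += 1
        if m not in engaged_to:                   # m is free: (m, w) get engaged
            engaged_to[m] = w
        elif rank[m][w] < rank[m][engaged_to[m]]: # m prefers w to his current partner
            free_women.append(engaged_to[m])      # his old partner becomes free
            engaged_to[m] = w
        else:
            free_women.append(w)                  # w remains free
    return engaged_to
```

Each woman proposes to each man at most once, giving the n^2 iteration bound from the slide.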

GS algorithm: Firefly Edition
An example run with men Mal, Wash, Simon and women Inara, Zoe, Kaylee

The GS algorithm outputs a stable matching
Lemma 1: GS outputs a perfect matching S
Lemma 2: S has no instability

Proof technique du jour: proof by contradiction
Assume the negation of what you want to prove; after some reasoning, arrive at a contradiction

Two observations
Obs 1: Once m is engaged, he only gets engaged to "better" women
Obs 2: If w proposes to m' first and then to m (or never proposes to m), then she prefers m' to m

Proof of Lemma 2
By contradiction: assume there is an instability (m, w'), i.e.,
  m prefers w' to w (his partner in S)
  w' prefers m to m' (her partner in S)
Note that w' last proposed to m'

Contradiction by Case Analysis
Depending on whether w' had proposed to m or not:
Case 1: w' never proposed to m
  By Obs 2, w' prefers m' to m, contradicting the assumption that w' prefers m to m'

Case 2: w' had proposed to m
Case 2.1: m had accepted w''s proposal
  m is finally engaged to w, so by Obs 1, m prefers w to w'. Contradiction.
Case 2.2: m had rejected w''s proposal
  m was then engaged to some w'' he prefers to w'
  m is finally engaged to w, so by Obs 1 he prefers w to w''
  Thus m prefers w to w'. Contradiction.

Overall structure of the case analysis
Did w' propose to m? If so, did m accept w''s proposal?

Graph Searching BFS/DFS

O(m+n) BFS Implementation
BFS(s)
  CC[s] = T and CC[w] = F for every w ≠ s    (CC stored as an array)
  Set i = 0
  Set L_0 = {s}    (each layer L_i stored as a linked list)
  While L_i is not empty
    L_(i+1) = Ø
    For every u in L_i
      For every edge (u,w)
        If CC[w] = F then CC[w] = T and add w to L_(i+1)
    i++
Input graph given as an adjacency list; the version in KT also computes a BFS tree
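A minimal Python sketch of the layered BFS above; the dict-of-lists adjacency representation is my own packaging:

```python
def bfs_layers(adj, s):
    """Layered BFS from s. adj maps each vertex to its neighbor list.

    Returns the list of BFS layers [L_0, L_1, ...], mirroring the slide."""
    cc = {v: False for v in adj}   # the CC array: "discovered" flags
    cc[s] = True
    layers = [[s]]                 # L_0 = {s}
    while layers[-1]:              # while L_i is not empty
        nxt = []                   # L_(i+1)
        for u in layers[-1]:
            for w in adj[u]:
                if not cc[w]:      # first time we see w
                    cc[w] = True
                    nxt.append(w)
        layers.append(nxt)
    return layers[:-1]             # drop the final empty layer
```

Every edge is examined at most twice and every vertex enters one layer, giving the O(m+n) bound.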

An illustration

O(m+n) BFS implementation using a queue
BFS(s)
  CC[s] = T and CC[w] = F for every w ≠ s    -- O(n)
  Initialize Q = {s}    -- O(1)
  While Q is not empty    -- the body runs at most once per vertex u
    Delete the front element u of Q    -- O(1)
    For every edge (u,w)    -- repeated n_u times, so O(n_u) for vertex u
      If CC[w] = F then CC[w] = T and add w to the back of Q
Total over all vertices: Σ_u O(n_u) = O(Σ_u n_u) = O(m), plus the O(n) initialization

A DFS run using an explicit stack
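The explicit-stack DFS run above can be sketched like this; the visit-order bookkeeping and interface are my own details:

```python
def dfs(adj, s):
    """DFS from s using an explicit stack.

    adj maps each vertex to its neighbor list.
    Returns vertices in the order they are first visited."""
    visited = {v: False for v in adj}
    order = []
    stack = [s]
    while stack:
        u = stack.pop()
        if visited[u]:                # u may be on the stack more than once
            continue
        visited[u] = True
        order.append(u)
        for w in reversed(adj[u]):    # reversed so neighbors are explored in list order
            if not visited[w]:
                stack.append(w)
    return order
```

Unlike BFS's queue, the stack makes the search dive deep along one path before backtracking.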

Topological Ordering

Run of TopOrd algorithm
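The transcript does not reproduce the TopOrd pseudocode itself; as a sketch, a standard topological-ordering algorithm repeatedly outputs a vertex with no remaining incoming edges. The in-degree counting below is one common implementation, not necessarily the lecture's exact version:

```python
def topological_order(adj):
    """Topological sort of a DAG given as adjacency lists (u -> list of successors)."""
    indeg = {v: 0 for v in adj}
    for u in adj:
        for w in adj[u]:
            indeg[w] += 1
    ready = [v for v in adj if indeg[v] == 0]   # vertices with no incoming edges
    order = []
    while ready:
        u = ready.pop()
        order.append(u)
        for w in adj[u]:                        # "delete" u's outgoing edges
            indeg[w] -= 1
            if indeg[w] == 0:
                ready.append(w)
    return order                                # shorter than n iff the graph has a cycle
```

Each edge is decremented exactly once, so this also runs in O(m+n).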

Greedy Algorithms

Interval Scheduling: Maximum Number of Intervals Schedule by Finish Time

End of Semester Blues
Five tasks over Monday through Friday: Project, 331 HW, Exam study, Party!, Write up a term paper
You can only do one thing on any day: what is the maximum number of tasks that you can do?

Schedule by Finish Time
Set A to be the empty set
While R is not empty
  Choose i in R with the earliest finish time
  Add i to A
  Remove all requests that conflict with i from R
Return A* = A
O(n log n) time: sort the intervals so that f(i) ≤ f(i+1), build an array s[1..n] with s[i] = start time of interval i in O(n) time, and do the removal on the fly

The final algorithm
Order tasks by their END time
(Applied to the week's tasks: Project, 331 HW, Exam study, Party!, Write up a term paper)

Proof of correctness uses “greedy stays ahead”

Scheduling to minimize lateness
Same week of tasks (Project, 331 HW, Exam study, Party!, Write up a term paper), but now all the tasks have to be scheduled
GOAL: minimize the maximum lateness

The Greedy Algorithm
(Assume jobs are sorted by deadline: d_1 ≤ d_2 ≤ … ≤ d_n)
f = s
For every i in 1..n
  Schedule job i from s(i) = f to f(i) = f + t_i
  f = f + t_i
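A sketch of the earliest-deadline-first greedy above; the (t_i, d_i) tuple interface and the returned maximum lateness are my own packaging:

```python
def schedule_min_lateness(jobs, start=0):
    """jobs: list of (t_i, d_i) pairs (processing time, deadline).

    Schedules jobs back to back in deadline order from time `start`.
    Returns (schedule, max_lateness) where schedule is a list of (s_i, f_i)."""
    f = start
    schedule = []
    max_late = 0
    for t, d in sorted(jobs, key=lambda j: j[1]):  # earliest deadline first
        s_i, f = f, f + t                          # job runs from f to f + t_i
        schedule.append((s_i, f))
        max_late = max(max_late, f - d)            # lateness is 0 if f_i <= d_i
    return schedule, max_late
```

No idle time is ever introduced, and deadline order means no inversions, the two properties the correctness proof relies on.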

Proof of Correctness uses “Exchange argument”

Proved the following:
  Any two schedules with 0 idle time and 0 inversions have the same max lateness
  The greedy schedule has 0 idle time and 0 inversions
  There is an optimal schedule with 0 idle time and 0 inversions

Shortest Path in a Graph: non-negative edge weights
Dijkstra's Algorithm

Shortest Path problem
Input: a directed graph G=(V,E), edge lengths l_e for each e in E, and a "start" vertex s in V
Output: shortest paths from s to all nodes in V

Dijkstra's shortest path algorithm
Input: directed G=(V,E), l_e ≥ 0, s in V
R = {s}, d(s) = 0
While there is an x not in R with (u,x) in E for some u in R
  For each such candidate w, compute d'(w) = min over edges e=(u,w) with u in R of d(u) + l_e
  Pick the w that minimizes d'(w)
  Add w to R and set d(w) = d'(w)

Dijkstra's shortest path algorithm (formal)
Input: directed G=(V,E), l_e ≥ 0, s in V
S = {s}, d(s) = 0
While there is a v not in S with (u,v) in E for some u in S
  Pick the w that minimizes d'(w)
  Add w to S and set d(w) = d'(w)
At most n iterations, each O(m) time, so the O(mn) bound is trivial; an O(m log n) implementation is possible
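The O(m log n) implementation mentioned above is typically done with a priority queue; here is one heap-based sketch (the lazy deletion of stale heap entries is my choice of detail):

```python
import heapq

def dijkstra(adj, s):
    """adj[u] = list of (v, length) pairs with length >= 0.

    Returns a dict mapping each reachable vertex to its distance from s."""
    d = {s: 0}
    pq = [(0, s)]                       # heap of (tentative distance, vertex)
    while pq:
        du, u = heapq.heappop(pq)
        if du > d.get(u, float('inf')):
            continue                    # stale entry: u was settled with a smaller d
        for v, l in adj[u]:
            if du + l < d.get(v, float('inf')):
                d[v] = du + l           # the d'(v) update from the slide
                heapq.heappush(pq, (d[v], v))
    return d
```

Each edge causes at most one push, so the heap operations give the O(m log n) bound.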

Proved that d’(v) is best when v is added

Minimum Spanning Tree Kruskal/Prim

Minimum Spanning Tree (MST)
Input: a connected graph G=(V,E) with c_e > 0 for every e in E
Output: a tree containing all vertices in V that minimizes the sum of edge costs

Kruskal's Algorithm (Joseph B. Kruskal)
Input: G=(V,E), c_e > 0 for every e in E
T = Ø
Sort the edges in increasing order of cost
Consider the edges in sorted order
  If an edge can be added to T without creating a cycle, add it to T
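A Python sketch of Kruskal's algorithm; a union-find structure stands in for the slide's "without creating a cycle" check (the slide does not specify how that check is implemented):

```python
def kruskal(n, edges):
    """n vertices labeled 0..n-1; edges: list of (cost, u, v) tuples.

    Returns the list of MST edges for a connected graph."""
    parent = list(range(n))           # union-find forest

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    T = []
    for c, u, v in sorted(edges):     # increasing order of cost
        ru, rv = find(u), find(v)
        if ru != rv:                  # u and v in different components: no cycle
            parent[ru] = rv
            T.append((c, u, v))
    return T
```

With union-find, the cycle checks are near-constant time, so sorting dominates at O(m log n).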

Prim's algorithm (Robert Prim)
Similar to Dijkstra's algorithm
Input: G=(V,E), c_e > 0 for every e in E
S = {s}, T = Ø
While S is not the same as V
  Among edges e = (u,w) with u in S and w not in S, pick one with minimum cost
  Add w to S and e to T
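Prim's algorithm sketched with a heap, mirroring its similarity to Dijkstra's; storing each undirected edge in both adjacency lists is my convention:

```python
import heapq

def prim(adj, s):
    """adj[u] = list of (v, cost) pairs; each undirected edge appears in both lists.

    Returns the list of MST edges (u, w, cost) for the component containing s."""
    in_tree = {s}                      # the set S from the slide
    T = []
    pq = [(c, s, v) for v, c in adj[s]]   # crossing edges as (cost, from, to)
    heapq.heapify(pq)
    while pq:
        c, u, w = heapq.heappop(pq)    # cheapest stored crossing edge
        if w in in_tree:
            continue                   # edge no longer crosses the cut
        in_tree.add(w)
        T.append((u, w, c))
        for v, cv in adj[w]:
            if v not in in_tree:
                heapq.heappush(pq, (cv, w, v))
    return T
```

This is the cut property in action: each popped edge is the cheapest one crossing (S, V \ S) at that moment.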

Cut Property Lemma for MSTs
For any cut (S, V \ S) with both sides non-empty, the cheapest crossing edge is in every MST
Assumption: all edge costs are distinct

Divide & Conquer

Sorting Merge-Sort

Sorting
Given n numbers, order them from smallest to largest
Works for any set of elements on which there is a total order

Mergesort algorithm
Input: a_1, a_2, …, a_n
Output: the numbers in sorted order
MergeSort(a, n)
  If n = 2 return min(a_1,a_2); max(a_1,a_2)
  a_L = a_1,…,a_(n/2)
  a_R = a_(n/2+1),…,a_n
  return MERGE( MergeSort(a_L, n/2), MergeSort(a_R, n/2) )

An example run of MergeSort(a, n)

Correctness
By induction on n, with base cases n = 1 (return a_1) and n = 2 (return min(a_1,a_2); max(a_1,a_2))
The inductive step follows from the correctness of MERGE
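Putting the MergeSort and MERGE steps together in Python; this version uses n ≤ 1 as the base case, which subsumes the slide's n = 1 and n = 2 cases:

```python
def merge_sort(a):
    """Recursive mergesort; returns a new sorted list, leaving `a` unchanged."""
    n = len(a)
    if n <= 1:
        return a[:]
    mid = n // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    # MERGE: combine two sorted halves in O(n)
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]   # append whichever half has leftovers
```

The two recursive calls plus the linear merge give T(n) = 2T(n/2) + cn, i.e. O(n log n).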

Counting Inversions Merge-Count

Mergesort-Count algorithm
Input: a_1, a_2, …, a_n
Output: the numbers in sorted order, plus the number of inversions
MergeSortCount(a, n)
  If n = 1 return (0, a_1)
  If n = 2 return (a_1 > a_2, min(a_1,a_2); max(a_1,a_2))
  a_L = a_1,…,a_(n/2)
  a_R = a_(n/2+1),…,a_n
  (c_L, a_L) = MergeSortCount(a_L, n/2)
  (c_R, a_R) = MergeSortCount(a_R, n/2)
  (c, a) = MERGE-COUNT(a_L, a_R)    -- counts crossing inversions while merging, O(n)
  return (c + c_L + c_R, a)
T(2) = c and T(n) = 2T(n/2) + cn, so O(n log n) time
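The same structure extended to count inversions, as in MergeSortCount above; here MERGE-COUNT is inlined into the merge loop:

```python
def merge_sort_count(a):
    """Returns (number of inversions in a, sorted copy of a)."""
    n = len(a)
    if n <= 1:
        return 0, a[:]
    mid = n // 2
    c_left, left = merge_sort_count(a[:mid])
    c_right, right = merge_sort_count(a[mid:])
    # MERGE-COUNT: when an element of the right half is emitted before the
    # remaining left-half elements, it forms len(left) - i crossing inversions
    out, i, j, c = [], 0, 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
            c += len(left) - i
    return c + c_left + c_right, out + left[i:] + right[j:]
```

The inversion count comes for free during the merge, so the O(n log n) bound is unchanged.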

Closest Pair of Points Closest Pair of Points Algorithm

Closest pairs of points
Input: n 2-D points P = {p_1,…,p_n}, with p_i = (x_i, y_i)
Output: the pair of points p and q that are closest, where d(p_i,p_j) = ((x_i - x_j)^2 + (y_i - y_j)^2)^(1/2)

The algorithm
Input: n 2-D points P = {p_1,…,p_n}, with p_i = (x_i, y_i)
Pre-processing: sort P to get P_x and P_y    -- O(n log n)
Closest-Pair(P_x, P_y)
  If n < 4 then find the closest pair by brute force
  Q is the first half of P_x and R is the rest; let x* be the x-coordinate of the rightmost point in Q
  Compute Q_x, Q_y, R_x and R_y    -- O(n)
  (q_0,q_1) = Closest-Pair(Q_x, Q_y)
  (r_0,r_1) = Closest-Pair(R_x, R_y)
  δ = min( d(q_0,q_1), d(r_0,r_1) )
  S = points (x,y) in P s.t. |x - x*| < δ
  return Closest-in-box(S, (q_0,q_1), (r_0,r_1))    -- assume this can be done in O(n)
T(<4) = c and T(n) = 2T(n/2) + cn, so O(n log n) overall (on top of the O(n log n) pre-processing)
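A runnable sketch of the closest-pair recursion above; it returns the minimum distance rather than the pair itself, assumes at least two distinct points, and uses the standard "next 7 points in the strip" bound for Closest-in-box (a detail not spelled out on the slide):

```python
from math import hypot

def closest_pair(points):
    """Returns the minimum distance between any two of the given distinct 2-D points."""
    px = sorted(points)                          # P_x: sorted by x
    py = sorted(points, key=lambda p: p[1])      # P_y: sorted by y

    def brute(pts):
        return min(hypot(p[0] - q[0], p[1] - q[1])
                   for i, p in enumerate(pts) for q in pts[i + 1:])

    def rec(px, py):
        n = len(px)
        if n < 4:
            return brute(px)
        mid = n // 2
        x_star = px[mid - 1][0]                  # x-coordinate of the dividing line
        qx, rx = px[:mid], px[mid:]
        qset = set(qx)
        qy = [p for p in py if p in qset]        # keep y order within each half
        ry = [p for p in py if p not in qset]
        delta = min(rec(qx, qy), rec(rx, ry))
        # the strip S: points within delta of the line, scanned in y order
        strip = [p for p in py if abs(p[0] - x_star) < delta]
        best = delta
        for i, p in enumerate(strip):
            for q in strip[i + 1:i + 8]:         # only 7 y-neighbors can beat delta
                best = min(best, hypot(p[0] - q[0], p[1] - q[1]))
        return best

    return rec(px, py)
```

The strip scan is the Closest-in-box step: O(n) because each strip point is compared with a constant number of neighbors.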

Dynamic Programming

Weighted Interval Scheduling Scheduling Algorithm

Weighted Interval Scheduling
Input: n jobs (s_i, t_i, v_i)
Output: a schedule S such that no two jobs in S conflict
Goal: maximize Σ_{i in S} v_i
Assume: jobs are sorted by their finish time

A recursive algorithm
Compute-Opt(j)
  If j = 0 then return 0
  return max { v_j + Compute-Opt(p(j)), Compute-Opt(j-1) }
The two recursive calls compute OPT(p(j)) and OPT(j-1), so
OPT(j) = max { v_j + OPT(p(j)), OPT(j-1) }
Proof of correctness by induction on j (correct for j = 0)

Exponential Running Time
With p(j) = j - 2, the recursion tree for OPT(5) branches into OPT(4) and OPT(3), which in turn recompute OPT(3), OPT(2), OPT(2), OPT(1), and so on (formal proof: exercise)
Yet there are only 5 distinct OPT values!

Bounding the number of recursions
M-Compute-Opt(j)
  If j = 0 then return 0
  If M[j] is not null then return M[j]
  M[j] = max { v_j + M-Compute-Opt(p(j)), M-Compute-Opt(j-1) }
  return M[j]
Whenever a recursive call is made, an M value is assigned
At most n values of M can be assigned, so O(n) overall

Property of OPT
OPT(j) = max { v_j + OPT(p(j)), OPT(j-1) }
Given OPT(1), …, OPT(j-1), one can compute OPT(j)

Recursion + memory = iteration
Iteratively compute the OPT(j) values:
Iterative-Compute-Opt
  M[0] = 0
  For j = 1,…,n
    M[j] = max { v_j + M[p(j)], M[j-1] }
M[j] = OPT(j); O(n) run time
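The iterative OPT computation above in Python, with p(j) computed by binary search over finish times (the search is my implementation detail; the slide assumes p(j) is given):

```python
from bisect import bisect_right

def max_weight_schedule(jobs):
    """jobs: list of (start, finish, value) triples.

    Returns the maximum total value of a conflict-free subset,
    treating finish <= start as compatible."""
    jobs = sorted(jobs, key=lambda j: j[1])       # sort by finish time
    finishes = [f for _, f, _ in jobs]
    M = [0] * (len(jobs) + 1)                     # M[j] = OPT(j), M[0] = 0
    for j, (s, f, v) in enumerate(jobs, start=1):
        # p(j): number of earlier jobs finishing at or before job j's start
        p = bisect_right(finishes, s, 0, j - 1)
        M[j] = max(v + M[p], M[j - 1])            # the slide's recurrence
    return M[len(jobs)]
```

After the O(n log n) sort, the table fill is O(n log n) with the binary searches (or O(n) if p(j) is precomputed).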

Shortest Path in a Graph Bellman-Ford

Shortest Path Problem
Input: a (directed) graph G=(V,E) where every edge e has a cost c_e (which can be < 0), and t in V
Output: the shortest path from every s to t
With a negative cycle, the "shortest" path can have cost negative infinity, so assume that G has no negative cycle

Recurrence Relation
OPT(i,u) = cost of the shortest path from u to t with at most i edges
OPT(i,u) = min { OPT(i-1,u), min over (u,w) in E of { c_{u,w} + OPT(i-1,w) } }
The first term covers paths using ≤ i-1 edges; the second takes the best path through each neighbor of u
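The recurrence above can be evaluated bottom-up, Bellman-Ford style; this sketch computes OPT(n-1, u) for every u, assuming no negative cycle as stated (the list-of-triples edge format is my own):

```python
def bellman_ford_to_t(vertices, edges, t):
    """edges: list of (u, w, cost) for each directed edge u -> w.

    Returns a dict mapping each u to the shortest-path cost from u to t
    (float('inf') if t is unreachable)."""
    INF = float('inf')
    opt = {u: INF for u in vertices}   # OPT(0, u): only t reaches t with 0 edges
    opt[t] = 0
    for _ in range(len(vertices) - 1):          # i = 1, ..., n-1
        new = dict(opt)                         # carries over OPT(i-1, u)
        for u, w, c in edges:
            if opt[w] + c < new[u]:             # best path through neighbor w
                new[u] = opt[w] + c
        opt = new
    return opt
```

Each of the n-1 rounds scans all m edges, giving the familiar O(mn) running time.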

P vs NP

P vs NP question
P: problems that can be solved by polynomial-time algorithms
NP: problems for which a candidate solution (a witness) can be verified in polynomial time
Is P = NP?
Alternate view of NP: guess a witness and verify it!