CSE 331: Review August 1, 2013
Main Steps in Algorithm Design
Real-world problem → Problem Statement → Problem Definition (precise mathematical definition) → Algorithm → "Implementation" (data structures) → Analysis (correctness / run time)
Stable Matching Problem
Gale-Shapley Algorithm
Stable Marriage Problem
Set of men M and women W
Preferences: each person ranks all potential spouses
Matching: a subset of M x W with no polygamy (each person appears at most once)
Perfect matching: everyone gets married
Instability: a man m and woman w' not matched to each other who each prefer the other to their assigned partner
Stable matching = perfect matching + no instability
Gale-Shapley Algorithm
Initially all men and women are free
While there exists a free woman who can propose
  Let w be such a woman and m the best man she has not yet proposed to
  w proposes to m
  If m is free
    (m, w) get engaged
  Else (m is currently engaged to some w')
    If m prefers w' to w
      w remains free
    Else
      (m, w) get engaged and w' becomes free
Output the engaged pairs as the final output

At most n^2 iterations; each iteration can be implemented in O(1) time.
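To make the pseudocode concrete, here is a minimal Python sketch of the women-proposing version above. The input representation is an assumption, not part of the slides: `w_pref[w]` lists men in w's preference order (best first) and `m_rank[m][w]` gives w's position in m's list (smaller is better).

```python
# Minimal sketch of the (women-proposing) Gale-Shapley algorithm.
# Assumed inputs:
#   w_pref[w]   = list of men in w's order of preference (best first)
#   m_rank[m][w] = position of w in m's preference list (smaller = better)
def gale_shapley(w_pref, m_rank):
    free_women = list(w_pref)                 # all women start free
    next_proposal = {w: 0 for w in w_pref}    # index of the next man w proposes to
    engaged_to = {}                           # engaged_to[m] = w

    while free_women:
        w = free_women.pop()
        m = w_pref[w][next_proposal[w]]       # best man w has not yet proposed to
        next_proposal[w] += 1
        if m not in engaged_to:               # m is free: (m, w) get engaged
            engaged_to[m] = w
        elif m_rank[m][w] < m_rank[m][engaged_to[m]]:
            free_women.append(engaged_to[m])  # m prefers w: his old partner is free
            engaged_to[m] = w
        else:
            free_women.append(w)              # m rejects w: w remains free
    return [(m, w) for m, w in engaged_to.items()]
```

Each proposal costs O(1), and every woman proposes to each man at most once, matching the O(n^2) bound on the slide.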
GS algorithm: Firefly Edition
(Illustration of a run with men Mal, Wash, Simon and women Inara, Zoe, Kaylee; figure not reproduced.)
The GS algorithm outputs a stable matching
Lemma 1: GS outputs a perfect matching S
Lemma 2: S has no instability
Proof technique du jour: proof by contradiction
Assume the negation of what you want to prove; after some reasoning, arrive at a contradiction.
Two observations
Obs 1: Once m is engaged, he only gets engaged to "better" women over time
Obs 2: If w proposes to m' before m (or never proposes to m), then she prefers m' to m
Proof of Lemma 2 (by contradiction)
Assume there is an instability (m, w'): m prefers w' to w (his partner in S), and w' prefers m to m' (her partner in S, the man she last proposed to).
Contradiction by case analysis, depending on whether w' had proposed to m or not.
Case 1: w' never proposed to m.
  By Obs 2, w' prefers m' to m, contradicting the assumption that w' prefers m to m'.
Case 2: w' had proposed to m.
  Case 2.1: m accepted the proposal from w'.
    Since m is finally engaged to w, by Obs 1 m prefers w to w'.
  Case 2.2: m rejected the proposal from w'.
    m was then engaged to some w'' he prefers to w'; m is finally engaged to w, whom he prefers to w'' by Obs 1; hence m prefers w to w'.
Either case contradicts the assumption that m prefers w' to w.
Overall structure of the case analysis
Did w' propose to m? If so, did m accept the proposal from w'?
Graph Searching BFS/DFS
O(m+n) BFS Implementation
BFS(s)
  CC[s] = T and CC[w] = F for every w ≠ s   (array)
  Set i = 0
  Set L_0 = {s}   (linked list)
  While L_i is not empty
    L_{i+1} = Ø
    For every u in L_i
      For every edge (u, w)
        If CC[w] = F then
          CC[w] = T
          Add w to L_{i+1}
    i++

The input graph is given as an adjacency list. The version in KT also computes a BFS tree.
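A minimal Python sketch of this layered BFS, assuming the graph is an adjacency list `adj` mapping each vertex to its list of neighbours (the function name and representation are illustrative, not the course's code):

```python
# Layered O(m+n) BFS from s, assuming adj: vertex -> list of neighbours.
def bfs_layers(adj, s):
    discovered = {v: False for v in adj}   # the CC[] array from the slide
    discovered[s] = True
    layers = [[s]]                          # layers[i] is the layer L_i
    while layers[-1]:
        next_layer = []
        for u in layers[-1]:
            for w in adj[u]:
                if not discovered[w]:
                    discovered[w] = True
                    next_layer.append(w)
        layers.append(next_layer)
    return layers[:-1]                      # drop the final empty layer
```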
An illustration
(BFS run on an 8-vertex example graph; figure not reproduced.)
O(m+n) DFS Implementation
DFS(s)
  CC[w] = F for every w   O(n)
  Initialize a stack Q = {s}   O(1)
  While Q is not empty
    Pop the top element u from Q
    If CC[u] = F then
      CC[u] = T   O(1)
      For every edge (u, w)   repeated n_u times: O(n_u)
        Push w onto Q

Each vertex u has its edges scanned at most once, so the total work is Σ_u O(n_u) = O(Σ_u n_u) = O(m), for O(m+n) overall.
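A matching Python sketch of stack-based DFS, under the same assumed adjacency-list representation as the BFS sketch above:

```python
# DFS with an explicit stack, assuming adj: vertex -> list of neighbours.
def dfs(adj, s):
    explored = {v: False for v in adj}
    stack = [s]
    order = []
    while stack:
        u = stack.pop()                 # take the most recently pushed vertex
        if not explored[u]:
            explored[u] = True
            order.append(u)
            for w in adj[u]:
                stack.append(w)
    return order                        # vertices in DFS discovery order
```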
A DFS run using an explicit stack
(Run on the same 8-vertex example graph; figure not reproduced.)
Topological Ordering
Run of the TopOrd algorithm (figure not reproduced)
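The run itself is shown only as a figure in the slides; as a reminder of the algorithm it traces, here is a minimal sketch that repeatedly outputs a vertex with no incoming edges. The adjacency-list input and names are assumptions, not the course's code.

```python
from collections import deque

# Topological ordering of a DAG: repeatedly output a vertex with no
# incoming edges. Assumes adj maps every vertex to its out-neighbours.
def topological_order(adj):
    in_degree = {v: 0 for v in adj}
    for u in adj:
        for w in adj[u]:
            in_degree[w] += 1
    ready = deque(v for v in adj if in_degree[v] == 0)
    order = []
    while ready:
        u = ready.popleft()
        order.append(u)
        for w in adj[u]:
            in_degree[w] -= 1
            if in_degree[w] == 0:
                ready.append(w)
    return order            # a topological order if the graph is a DAG
```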
Greedy Algorithms
Interval Scheduling: Maximum Number of Intervals Schedule by Finish Time
End of Semester Blues
Tasks for the week (Monday through Friday): 331 Project, 331 HW, exam study, write up a term paper, party!
You can only do one thing on any day: what is the maximum number of tasks that you can do?
Schedule by Finish Time
Set A to be the empty set
While R is not empty
  Choose i in R with the earliest finish time
  Add i to A
  Remove all requests that conflict with i from R
Return A* = A

Implementation: sort intervals so that f(i) ≤ f(i+1) in O(n log n) time, build an array s[1..n] with s[i] = start time of i in O(n) time, and do the removal on the fly.
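A minimal Python sketch of the algorithm with the removal done on the fly, assuming each request is a (start, finish) pair (the representation is an assumption):

```python
# Greedy interval scheduling by earliest finish time.
def schedule_by_finish_time(requests):
    A = []
    last_finish = float("-inf")
    for start, finish in sorted(requests, key=lambda r: r[1]):  # O(n log n) sort
        if start >= last_finish:        # does not conflict with the chosen set
            A.append((start, finish))
            last_finish = finish
    return A
```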
The final algorithm
Order tasks by their END time, then pick greedily. (Solution to the Monday-through-Friday example; figure not reproduced.)
Proof of correctness uses “greedy stays ahead”
Scheduling to Minimize Maximum Lateness
Earliest Deadline First
Scheduling to Minimize Lateness
Same tasks as before (Monday through Friday), but now all the tasks have to be scheduled.
GOAL: minimize the maximum lateness.
The Greedy Algorithm
(Assume jobs are sorted by deadline: d_1 ≤ d_2 ≤ ... ≤ d_n)
f = s
For every i in 1..n do
  Schedule job i from s(i) = f to f(i) = f + t_i
  f = f + t_i
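A minimal Python sketch of this earliest-deadline-first schedule, assuming each job is a (t_i, d_i) pair of processing time and deadline and the schedule starts at time s (the representation is an assumption):

```python
# Earliest Deadline First scheduling; returns the schedule and its max lateness.
def schedule_min_lateness(jobs, s=0):
    f = s
    schedule, max_lateness = [], 0
    for t, d in sorted(jobs, key=lambda job: job[1]):   # sort by deadline d_i
        start, finish = f, f + t
        schedule.append((start, finish))
        max_lateness = max(max_lateness, finish - d)     # lateness of this job
        f = finish
    return schedule, max_lateness
```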
Proof of Correctness uses “Exchange argument”
Proved the following:
Any two schedules with 0 idle time and 0 inversions have the same maximum lateness.
The greedy schedule has 0 idle time and 0 inversions.
There is an optimal schedule with 0 idle time and 0 inversions.
Shortest Path in a Graph: non-negative edge weights
Dijkstra's Algorithm
Shortest Path Problem
Input: directed graph G=(V,E), edge lengths l_e for e in E, and a "start" vertex s in V
Output: shortest paths from s to all nodes in V
(Small example on vertices s, u, w; figure not reproduced.)
Dijkstra's shortest path algorithm
Input: directed G=(V,E), l_e ≥ 0, s in V
R = {s}, d(s) = 0
While there is an x not in R with (u,x) in E for some u in R
  For each such candidate w, d'(w) = min over edges e=(u,w) with u in R of d(u) + l_e
  Pick the w that minimizes d'(w)
  Add w to R
  d(w) = d'(w)
(Worked example with shortest-path distances d(s)=0, d(u)=1, d(w)=2, d(x)=2, d(y)=3, d(z)=4; figure not reproduced.)
Dijkstra's shortest path algorithm (formal)
Input: directed G=(V,E), l_e ≥ 0, s in V
S = {s}, d(s) = 0
While there is a v not in S with (u,v) in E for some u in S   (at most n iterations)
  For each such v, d'(v) = min over e=(u,v) with u in S of d(u) + l_e   (O(m) time)
  Pick the w that minimizes d'(w)
  Add w to S
  d(w) = d'(w)
The O(mn) time bound is trivial; an O(m log n) implementation is possible.
Proved that d'(v) is the true shortest-path distance to v at the moment v is added to S.
Minimum Spanning Tree Kruskal/Prim
Minimum Spanning Tree (MST)
Input: a connected graph G=(V,E) with c_e > 0 for every e in E
Output: a spanning tree (a tree containing all of V) that minimizes the sum of the edge costs
Kruskal's Algorithm (Joseph B. Kruskal)
Input: G=(V,E), c_e > 0 for every e in E
T = Ø
Sort the edges in increasing order of cost
Consider the edges in sorted order:
  If an edge can be added to T without creating a cycle, add it to T
Prim's Algorithm (Robert Prim)
Similar to Dijkstra's algorithm
Input: G=(V,E), c_e > 0 for every e in E
S = {s}, T = Ø
While S is not the same as V
  Among edges e = (u,w) with u in S and w not in S, pick one with minimum cost
  Add w to S and e to T
(Small weighted example; figure not reproduced.)
Cut Property Lemma for MSTs
For any cut (S, V \ S) with S and V \ S non-empty, the cheapest edge crossing the cut is in every MST.
Assumption: all edge costs are distinct.
Divide & Conquer
Sorting Merge-Sort
Sorting
Given n numbers, order them from smallest to largest.
Works for any set of elements on which there is a total order.
Mergesort Algorithm
Input: a_1, a_2, ..., a_n
Output: the numbers in sorted order

MergeSort(a, n)
  If n = 2, return the order min(a_1, a_2); max(a_1, a_2)
  a_L = a_1, ..., a_{n/2}
  a_R = a_{n/2+1}, ..., a_n
  return MERGE( MergeSort(a_L, n/2), MergeSort(a_R, n/2) )
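A minimal Python sketch of MergeSort together with MERGE, assuming the input is a list (and handling any n, not just powers of two):

```python
# MergeSort: split, sort recursively, and merge.
def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

# MERGE: combine two sorted lists into one sorted list in O(n) time.
def merge(left, right):
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]   # append whichever half has elements left
```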
An example run
(MergeSort applied to an 8-element array, showing the recursive splits and merges; figure not reproduced.)
Correctness
By induction on n: the base cases (n = 1: return a_1; n = 2: return min(a_1, a_2); max(a_1, a_2)) are immediate, and the inductive step follows from the correctness of MERGE.
Counting Inversions Merge-Count
Mergesort-Count Algorithm
Input: a_1, a_2, ..., a_n
Output: the numbers in sorted order + the number of inversions

MergeSortCount(a, n)
  If n = 1, return (0, a_1)
  If n = 2, return (a_1 > a_2, min(a_1, a_2); max(a_1, a_2))
  a_L = a_1, ..., a_{n/2}
  a_R = a_{n/2+1}, ..., a_n
  (c_L, a_L) = MergeSortCount(a_L, n/2)
  (c_R, a_R) = MergeSortCount(a_R, n/2)
  (c, a) = MERGE-COUNT(a_L, a_R)   counts crossing inversions while merging, in O(n) time
  return (c + c_L + c_R, a)

T(2) = c and T(n) = 2T(n/2) + cn, so the running time is O(n log n).
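A minimal Python sketch of MergeSortCount: whenever MERGE-COUNT places an element of the right half before the remaining elements of the left half, all of those remaining elements form crossing inversions with it. The list representation is an assumption.

```python
# Sort and count inversions in O(n log n).
def merge_sort_count(a):
    if len(a) <= 1:
        return 0, a
    mid = len(a) // 2
    c_left, left = merge_sort_count(a[:mid])
    c_right, right = merge_sort_count(a[mid:])
    c_cross, merged = merge_count(left, right)
    return c_left + c_right + c_cross, merged

# MERGE-COUNT: merge two sorted halves and count crossing inversions.
def merge_count(left, right):
    out, i, j, crossings = [], 0, 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            crossings += len(left) - i   # right[j] is inverted with all left[i:]
            out.append(right[j]); j += 1
    return crossings, out + left[i:] + right[j:]
```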
Closest Pair of Points Closest Pair of Points Algorithm
Closest Pair of Points
Input: n 2-D points P = {p_1, ..., p_n}, with p_i = (x_i, y_i)
Output: the pair of points p and q that are closest, where d(p_i, p_j) = ((x_i - x_j)^2 + (y_i - y_j)^2)^{1/2}
The Algorithm
Input: n 2-D points P = {p_1, ..., p_n}, with p_i = (x_i, y_i)
Sort P to get P_x and P_y   (O(n log n))
Closest-Pair(P_x, P_y)
  If n < 4, find the closest pair by brute force
  Q is the first half of P_x and R is the rest; let x* be the x-coordinate of the dividing line
  Compute Q_x, Q_y, R_x and R_y   (assume this can be done in O(n))
  (q_0, q_1) = Closest-Pair(Q_x, Q_y)
  (r_0, r_1) = Closest-Pair(R_x, R_y)
  δ = min( d(q_0, q_1), d(r_0, r_1) )
  S = points (x, y) in P such that |x - x*| < δ   (O(n))
  return Closest-in-box(S, (q_0, q_1), (r_0, r_1))

T(<4) = c and T(n) = 2T(n/2) + cn, so the recursion takes O(n log n); together with the initial sort, O(n log n) overall.
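A minimal Python sketch of the divide-and-conquer algorithm, assuming distinct points given as (x, y) tuples (at least two of them); Closest-in-box is realized by comparing each strip point with the next few points in y-order. Names and representation are illustrative.

```python
from math import hypot

# Returns (distance, p, q) for the closest pair of distinct 2-D points.
def closest_pair(points):
    px = sorted(points)                              # P_x: sorted by x
    py = sorted(points, key=lambda p: p[1])          # P_y: sorted by y
    return _closest(px, py)

def _closest(px, py):
    n = len(px)
    if n < 4:                                        # brute force on tiny inputs
        return min(((hypot(p[0] - q[0], p[1] - q[1]), p, q)
                    for i, p in enumerate(px) for q in px[i + 1:]),
                   key=lambda t: t[0])
    mid = n // 2
    x_star = px[mid - 1][0]                          # x-coordinate of the dividing line
    qx, rx = px[:mid], px[mid:]
    q_set = set(qx)
    qy = [p for p in py if p in q_set]               # Q_y and R_y in O(n)
    ry = [p for p in py if p not in q_set]
    best = min(_closest(qx, qy), _closest(rx, ry), key=lambda t: t[0])
    delta = best[0]
    strip = [p for p in py if abs(p[0] - x_star) < delta]   # the set S, in y-order
    for i, p in enumerate(strip):                    # Closest-in-box: compare with
        for q in strip[i + 1:i + 8]:                 # the next few points in y-order
            d = hypot(p[0] - q[0], p[1] - q[1])
            if d < delta:
                delta, best = d, (d, p, q)
    return best
```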
Dynamic Programming
Weighted Interval Scheduling Scheduling Algorithm
Weighted Interval Scheduling
Input: n jobs (s_i, t_i, v_i)
Output: a schedule S such that no two jobs in S conflict
Goal: maximize Σ_{i in S} v_i
Assume: jobs are sorted by their finish time
A recursive algorithm
Compute-Opt(j)
  If j = 0 then return 0
  return max { v_j + Compute-Opt(p(j)), Compute-Opt(j-1) }
  (the two recursive calls compute OPT(p(j)) and OPT(j-1))

OPT(j) = max { v_j + OPT(p(j)), OPT(j-1) }
Proof of correctness by induction on j; correct for j = 0.
Exponential Running Time
With p(j) = j - 2, the recursion tree for OPT(5) recomputes OPT(1), OPT(2) and OPT(3) over and over, even though there are only 5 distinct OPT values. (Recursion-tree figure not reproduced; formal proof left as an exercise.)
Bounding the number of recursions
M-Compute-Opt(j)
  If j = 0 then return 0
  If M[j] is not null then return M[j]
  M[j] = max { v_j + M-Compute-Opt(p(j)), M-Compute-Opt(j-1) }
  return M[j]

Whenever a recursive call does real work, an M value is assigned; at most n values of M can be assigned, so O(n) work overall.
Property of OPT
OPT(j) = max { v_j + OPT(p(j)), OPT(j-1) }
Given OPT(1), ..., OPT(j-1), one can compute OPT(j).
Recursion + memoization = iteration
Iteratively compute the OPT(j) values:

Iterative-Compute-Opt
  M[0] = 0
  For j = 1, ..., n
    M[j] = max { v_j + M[p(j)], M[j-1] }

M[j] = OPT(j), and the run time is O(n).
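A minimal Python sketch of the iterative DP, assuming jobs are (s_i, t_i, v_i) triples with t_i the finish time; p(j) is computed by binary search over the sorted finish times (an assumed implementation detail):

```python
import bisect

# Iterative DP for weighted interval scheduling; returns OPT(n).
def weighted_interval_scheduling(jobs):
    jobs = sorted(jobs, key=lambda job: job[1])         # sort by finish time t_i
    finish = [t for _, t, _ in jobs]
    # p[j] = number of earlier jobs whose finish time is <= job j's start time
    p = [bisect.bisect_right(finish, s) for s, _, _ in jobs]
    M = [0] * (len(jobs) + 1)                           # M[j] = OPT(j)
    for j in range(1, len(jobs) + 1):
        s, t, v = jobs[j - 1]
        M[j] = max(v + M[p[j - 1]], M[j - 1])
    return M[len(jobs)]                                 # value of an optimal schedule
```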
Shortest Path in a Graph Bellman-Ford
Shortest Path Problem
Input: a (directed) graph G=(V,E) in which every edge e has a cost c_e (which can be negative), and a destination t in V
Output: a shortest path from every s to t
(Example with a negative cycle, where the shortest path has cost -∞; figure not reproduced.) Assume that G has no negative cycle.
Recurrence Relation
OPT(i, u) = cost of the shortest path from u to t that uses at most i edges
OPT(i, u) = min { OPT(i-1, u), min over (u,w) in E of { c_{u,w} + OPT(i-1, w) } }
The first term covers paths that use at most i-1 edges; the second takes the best path through each neighbor w.
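A minimal Python sketch of this Bellman-Ford recurrence in its usual space-saving form, keeping a single distance per vertex and relaxing every edge n-1 times; the (u, w, cost) edge-list representation is an assumption:

```python
# Bellman-Ford: shortest-path cost from every vertex to t,
# assuming no negative cycles (edge costs may be negative).
def bellman_ford_to_t(vertices, edges, t):
    INF = float("inf")
    OPT = {u: INF for u in vertices}
    OPT[t] = 0
    for _ in range(len(vertices) - 1):        # paths use at most n-1 edges
        for u, w, cost in edges:
            if OPT[w] + cost < OPT[u]:        # relax the edge (u, w)
                OPT[u] = OPT[w] + cost
    return OPT                                # OPT[u] = cost of a shortest u-t path
```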
P vs NP
P vs NP Question
P: problems that can be solved by polynomial-time algorithms
NP: problems for which a claimed solution (witness) can be verified in polynomial time
Is P = NP?
Alternate view of NP: guess a witness and verify it!