
1 Chapter 4 Greedy Algorithms
Slides by Kevin Wayne. Copyright © 2005 Pearson-Addison Wesley. All rights reserved.

2 4.5 Minimum Spanning Tree

3 Minimum Spanning Tree
Minimum spanning tree. Given a connected graph G = (V, E) with real-valued edge weights c_e, an MST is a subset of the edges T ⊆ E such that T is a spanning tree whose sum of edge weights Σ_{e∈T} c_e is minimized.
Cayley's Theorem. There are n^(n-2) spanning trees of K_n, so we can't solve by brute force.
[Figure: example graph G = (V, E) and an MST T with Σ_{e∈T} c_e = 50.]
Cayley (1821–1895) was one of the great mathematicians of the 19th century, but his 1889 paper did not contain a rigorous proof of Cayley's Theorem.
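
To get a feel for the numbers (an illustrative aside, not from the slides), Cayley's formula can be evaluated directly in Python:

# Cayley's formula: K_n has n^(n-2) spanning trees, so brute-force
# enumeration is hopeless even for modest n.
for n in (4, 10, 100):
    print(f"K_{n}: {float(n ** (n - 2)):.3e} spanning trees")
# K_4: 1.600e+01   K_10: 1.000e+08   K_100: 1.000e+196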

4 Applications
MST is a fundamental problem with diverse applications.
Network design: telephone, electrical, hydraulic, TV cable, computer, road networks.
Approximation algorithms for NP-hard problems: traveling salesperson problem, Steiner tree.
Indirect applications: max bottleneck paths; LDPC codes for error correction; image registration with Renyi entropy; learning salient features for real-time face verification; reducing data storage in sequencing amino acids in a protein; modeling locality of particle interactions in turbulent fluid flows; autoconfig protocol for Ethernet bridging to avoid cycles in a network.
Cluster analysis.
Reference: Ahuja, Magnanti, Orlin, Section 13.2.

5 Greedy Algorithms
Kruskal's algorithm. Start with T = ∅. Consider edges in ascending order of cost. Insert edge e into T unless doing so would create a cycle.
Reverse-Delete algorithm. Start with T = E. Consider edges in descending order of cost. Delete edge e from T unless doing so would disconnect T.
Prim's algorithm. Start with some root node s and greedily grow a tree T from s outward. At each step, add the cheapest edge e to T that has exactly one endpoint in T.
Remark. All three algorithms produce an MST.
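
Reverse-Delete gets no pseudocode later in this deck, so here is a minimal runnable Python sketch (the representation, (u, v, cost) triples and a BFS connectivity check, is my own assumption, not from the slides):

# Reverse-Delete sketch (hypothetical names, assumed input format).
def connected(nodes, edges):
    """Is the graph on `nodes` with `edges` connected? Simple BFS."""
    adj = {v: [] for v in nodes}
    for u, v, _ in edges:
        adj[u].append(v)
        adj[v].append(u)
    start = next(iter(nodes))
    seen, stack = {start}, [start]
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) == len(nodes)

def reverse_delete(nodes, edges):
    """edges: list of distinct (u, v, cost) triples for a connected graph."""
    T = list(edges)
    for e in sorted(edges, key=lambda t: t[2], reverse=True):  # descending cost
        T.remove(e)              # tentatively delete the most expensive edge
        if not connected(nodes, T):
            T.append(e)          # deletion would disconnect T: keep the edge
    return T                     # T is now an MST

Each connectivity check is O(m + n), so this naive sketch runs in O(m(m + n)); it mirrors the algorithm's statement rather than aiming for efficiency.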

6 Greedy Algorithms
Simplifying assumption. All edge costs c_e are distinct.
Cut property. Let S be any subset of nodes, and let e be the min cost edge with exactly one endpoint in S. Then the MST contains e.
Cycle property. Let C be any cycle, and let f be the max cost edge belonging to C. Then the MST does not contain f.
Note: the simplifying assumption guarantees that the MST is unique, so it makes sense to refer to "the MST".
[Figure: e is in the MST; f is not in the MST.]

7 Cycles and Cuts
Cycle. Set of edges of the form a-b, b-c, c-d, …, y-z, z-a.
Ex. Cycle C = 1-2, 2-3, 3-4, 4-5, 5-6, 6-1.
Cutset. A cut is a subset of nodes S. The corresponding cutset D is the subset of edges with exactly one endpoint in S.
Ex. Cut S = {4, 5, 8}; cutset D = 5-6, 5-7, 3-4, 3-5, 7-8.
[Figure: an 8-node example graph illustrating the cycle and the cutset.]

8 Cycle-Cut Intersection
Claim. A cycle and a cutset intersect in an even number of edges.
Pf. (by picture) Traversing the cycle, every crossing from S to V - S must be matched by a crossing back, so the cycle crosses the cutset an even number of times.
Ex. Cycle C = 1-2, 2-3, 3-4, 4-5, 5-6, 6-1; cutset D = 3-4, 3-5, 5-6, 5-7, 7-8; intersection = 3-4, 5-6.
[Figure: cycle C drawn across the cut (S, V - S).]
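
A quick sanity check of the claim on the slide's own example (an illustrative aside; each edge is stored as a sorted tuple so orientation doesn't matter):

# Cycle and cutset from the example above; 6-1 is normalized to (1, 6).
cycle  = {(1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (1, 6)}
cutset = {(3, 4), (3, 5), (5, 6), (5, 7), (7, 8)}
common = cycle & cutset
print(common, len(common) % 2 == 0)   # e.g. {(3, 4), (5, 6)} True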

9 Greedy Algorithms
Simplifying assumption. All edge costs c_e are distinct.
Cut property. Let S be any subset of nodes, and let e be the min cost edge with exactly one endpoint in S. Then the MST T* contains e.
Pf. (exchange argument) Suppose e does not belong to T*, and let's see what happens.
Adding e to T* creates a cycle C in T*.
Edge e is both in the cycle C and in the cutset D corresponding to S ⇒ there exists another edge, say f, that is in both C and D.
T' = T* ∪ {e} - {f} is also a spanning tree.
Since c_e < c_f, cost(T') < cost(T*).
This is a contradiction. ▪
[Figure: exchanging f for e in T*.]

10 Greedy Algorithms
Simplifying assumption. All edge costs c_e are distinct.
Cycle property. Let C be any cycle in G, and let f be the max cost edge belonging to C. Then the MST T* does not contain f.
Pf. (exchange argument) Suppose f belongs to T*, and let's see what happens.
Deleting f from T* splits T* into two components; let S be the node set of one of them.
Edge f is both in the cycle C and in the cutset D corresponding to S ⇒ there exists another edge, say e, that is in both C and D.
T' = T* ∪ {e} - {f} is also a spanning tree.
Since c_e < c_f, cost(T') < cost(T*).
This is a contradiction. ▪
[Figure: exchanging e for f in T*.]

11 Prim's Algorithm: Proof of Correctness
Prim's algorithm. [Jarník 1930, Dijkstra 1957, Prim 1959]
Initialize S = any node.
Apply cut property to S.
Add min cost edge in cutset corresponding to S to T, and add one new explored node u to S.
[Figure: the growing set S.]

12 Implementation: Prim's Algorithm
Implementation. Use a priority queue à la Dijkstra.
Maintain set of explored nodes S.
For each unexplored node v, maintain attachment cost a[v] = cost of cheapest edge from v to a node in S.
O(n^2) with an array; O(m log n) with a binary heap.

Prim(G, c) {
   foreach (v ∈ V) a[v] ← ∞
   Initialize an empty priority queue Q
   foreach (v ∈ V) insert v onto Q
   Initialize set of explored nodes S ← ∅
   while (Q is not empty) {
      u ← delete min element from Q
      S ← S ∪ {u}
      foreach (edge e = (u, v) incident to u)
         if ((v ∉ S) and (c_e < a[v]))
            decrease priority a[v] to c_e
   }
}
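
A runnable Python version of the same idea (a sketch under an assumed input format, {node: [(cost, neighbor), ...]}; Python's heapq has no decrease-key, so instead of the decrease-priority step above it pushes duplicate entries and skips stale ones):

import heapq

def prim(adj, s):
    """adj: {node: [(cost, neighbor), ...]} for an undirected graph.
    Returns (total cost, list of MST edges (u, v, cost))."""
    explored = {s}                          # the set S of explored nodes
    pq = [(c, s, v) for c, v in adj[s]]     # candidate edges leaving S
    heapq.heapify(pq)
    total, tree = 0, []
    while pq and len(explored) < len(adj):
        c, u, v = heapq.heappop(pq)         # cheapest edge leaving S...
        if v in explored:
            continue                        # ...unless it's a stale entry
        explored.add(v)
        total += c
        tree.append((u, v, c))
        for c2, w in adj[v]:
            if w not in explored:
                heapq.heappush(pq, (c2, v, w))
    return total, tree

Lazy deletion keeps the heap size O(m), so the running time is O(m log m) = O(m log n), matching the binary-heap bound on the slide.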

13 Kruskal's Algorithm: Proof of Correctness
Kruskal's algorithm. [Kruskal, 1956] Consider edges in ascending order of weight.
Case 1: If adding e to T creates a cycle, discard e according to the cycle property.
Case 2: Otherwise, insert e = (u, v) into T according to the cut property, where S = set of nodes in u's connected component.
[Figure: Case 1 (e creates a cycle) and Case 2 (e has exactly one endpoint in S).]

14 Implementation: Kruskal's Algorithm
Implementation. Use the union-find data structure.
Build set T of edges in the MST.
Maintain a set for each connected component.
O(m log n) for sorting and O(m α(m, n)) for union-find.
(m ≤ n^2, so log m is O(log n); α(m, n) is essentially a constant.)

Kruskal(G, c) {
   Sort edge weights so that c_1 ≤ c_2 ≤ ... ≤ c_m.
   T ← ∅
   foreach (u ∈ V) make a set containing singleton u
   for i = 1 to m {
      (u, v) = e_i
      if (u and v are in different sets) {   // different connected components?
         T ← T ∪ {e_i}
         merge the sets containing u and v   // merge two components
      }
   }
   return T
}
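
The same algorithm as runnable Python (a sketch; nodes are assumed to be numbered 0..n-1, and the union-find uses path halving plus union by rank):

def kruskal(n, edges):
    """edges: list of (cost, u, v) with nodes 0..n-1. Returns MST edges."""
    parent = list(range(n))
    rank = [0] * n

    def find(x):                           # root of x's component
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(x, y):                       # merge components; False if same
        rx, ry = find(x), find(y)
        if rx == ry:
            return False
        if rank[rx] < rank[ry]:
            rx, ry = ry, rx
        parent[ry] = rx                    # attach shorter tree under taller
        if rank[rx] == rank[ry]:
            rank[rx] += 1
        return True

    T = []
    for c, u, v in sorted(edges):          # ascending order of cost
        if union(u, v):                    # u, v in different components?
            T.append((u, v, c))
    return T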

15 Lexicographic Tiebreaking
To remove the assumption that all edge costs are distinct: perturb all edge costs by tiny amounts to break any ties.
Impact. Kruskal and Prim only interact with costs via pairwise comparisons. If perturbations are sufficiently small, the MST with perturbed costs is the MST with original costs.
Implementation. Can handle arbitrarily small perturbations implicitly by breaking ties lexicographically, according to index. E.g., if all edge costs are integers, perturb the cost of edge e_i by i / n^2.

boolean less(i, j) {
   if      (cost(e_i) < cost(e_j)) return true
   else if (cost(e_i) > cost(e_j)) return false
   else if (i < j)                 return true
   else                            return false
}
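
In Python the same tiebreak falls out of tuple comparison (an illustrative aside, not from the slides): sort on the pair (cost, index) and ties in cost are broken by index, exactly as in less(i, j) above.

# (cost, index) pairs: equal costs 3 and 7 are ordered by index.
edges = [(7, 0), (3, 1), (7, 2), (3, 3)]
print(sorted(edges))   # [(3, 1), (3, 3), (7, 0), (7, 2)]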

16 4.7 Clustering
History. London physician John Snow plotted cholera deaths on a map in the 1850s. Their locations indicated that the clusters were around certain intersections with polluted wells; this exposed both the problem and its solution!
[Figure: outbreak of cholera deaths in London in the 1850s. Reference: Nina Mishra, HP Labs.]

17 Clustering
Clustering. Given a set U of n objects labeled p_1, …, p_n, classify them into coherent groups (e.g., photos, documents, micro-organisms).
Distance function. Numeric value specifying the "closeness" of two objects (e.g., number of corresponding pixels whose intensities differ by some threshold).
Fundamental problem. Divide into clusters so that points in different clusters are far apart.
Routing in mobile ad hoc networks.
Identify patterns in gene expression.
Document categorization for web search.
Similarity searching in medical image databases.
Skycat: cluster 10^9 sky objects into stars, quasars, galaxies (7-dimensional vectors; each coordinate represents radiation in some band of the spectrum; the Sloan Sky Survey aims to cluster the entire visible universe).

18 Clustering of Maximum Spacing
k-clustering. Divide objects into k non-empty groups.
Distance function. Assume it satisfies several natural properties (a semimetric: not necessarily a metric space, since no triangle inequality is required):
d(p_i, p_j) = 0 iff p_i = p_j (identity of indiscernibles)
d(p_i, p_j) ≥ 0 (nonnegativity)
d(p_i, p_j) = d(p_j, p_i) (symmetry)
Spacing. Min distance between any pair of points in different clusters.
Clustering of maximum spacing. Given an integer k, find a k-clustering of maximum spacing.
[Figure: example with a Euclidean distance function, k = 4, spacing indicated.]

19 Greedy Clustering Algorithm
Single-link k-clustering algorithm.
Form a graph on the vertex set U, corresponding to n clusters.
Find the closest pair of objects such that each object is in a different cluster, and add an edge between them.
Repeat n - k times, until there are exactly k clusters.
Key observation. This procedure is precisely Kruskal's algorithm (except we stop when there are k connected components); see the sketch below.
Remark. Equivalent to finding an MST and deleting the k - 1 most expensive edges.
Agglomerative clustering. Build tree bottom-up.
Divisive clustering. Build tree top-down.
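
A minimal Python sketch of single-link k-clustering (function and variable names are my own; the input format, a list of (distance, i, j) triples over points 0..n-1, is an assumption). It is just the Kruskal loop stopped at k components; the first inter-cluster distance it declines to merge is the spacing.

from collections import defaultdict

def single_link(n, dist_edges, k):
    """dist_edges: (d, i, j) triples. Returns (clusters, spacing)."""
    parent = list(range(n))

    def find(x):                           # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    comps, spacing = n, None
    for d, i, j in sorted(dist_edges):     # closest pairs first
        ri, rj = find(i), find(j)
        if ri == rj:
            continue                       # already in the same cluster
        if comps == k:
            spacing = d                    # min distance across clusters
            break
        parent[rj] = ri                    # merge the two clusters
        comps -= 1
    clusters = defaultdict(list)
    for p in range(n):
        clusters[find(p)].append(p)
    return list(clusters.values()), spacing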

20 Greedy Clustering Algorithm: Analysis
Theorem. Let C* denote the clustering C*_1, …, C*_k formed by deleting the k - 1 most expensive edges of an MST. C* is a k-clustering of max spacing.
Pf. Let C denote some other clustering C_1, …, C_k.
The spacing of C* is the length d* of the (k-1)st most expensive edge.
Let p_i, p_j be in the same cluster in C*, say C*_r, but in different clusters in C, say C_s and C_t.
Some edge (p, q) on the p_i-p_j path in C*_r spans two different clusters in C.
All edges on the p_i-p_j path have length ≤ d* since Kruskal chose them.
The spacing of C is ≤ d* since p and q are in different clusters. ▪
[Figure: path from p_i to p_j inside C*_r, crossing from C_s to C_t at edge (p, q).]

21 Extra Slides

22 MST Algorithms: Theory
Deterministic comparison-based algorithms.
O(m log n). [Jarník, Prim, Dijkstra, Kruskal, Boruvka]
O(m log log n). [Cheriton-Tarjan 1976, Yao 1975]
O(m β(m, n)). [Fredman-Tarjan 1987]
O(m log β(m, n)). [Gabow-Galil-Spencer-Tarjan 1986]
O(m α(m, n)). [Chazelle 2000]
Holy grail. O(m).
Notable.
O(m) randomized. [Karger-Klein-Tarjan 1995]
O(m) verification. [Dixon-Rauch-Tarjan 1992]
Euclidean.
2-d: O(n log n) (compute MST of edges in the Delaunay triangulation).
k-d: O(k n^2) (dense Prim).
Notes. Idea for O(m log log n): combine Boruvka + Prim (see CLR 24-2). Chazelle's bound uses soft heaps; the lower bound is tight over an arbitrary semi-group.

23 Dendrogram
Dendrogram. Scientific visualization of a hypothetical sequence of evolutionary events.
Leaves = genes. Internal nodes = hypothetical ancestors.
(Also need an algorithm to order the leaves optimally.)

24 Dendrogram of Cancers in Humans
Tumors in similar tissues cluster together.
[Figure: gene expression heat map; each row is a gene (Gene 1 … Gene n), each column an experiment; red = gene expressed. Reference: Botstein & Brown group.]
Speaker note (slide borrowed with permission from Olga Troyanskaya): "Microarrays allow us to see if a particular gene is on or off in a particular tissue. The colors on this picture correspond to whether or not the gene is expressed. So each row is a gene, and each column is a particular experiment, for example a particular type of tissue. If you see a red spot for some gene [show gene] for breast tumors, that means this gene is expressed in breast cancer. But that same gene is not expressed in the brain, for example."
(Arranging the experiments (columns) for optimal presentation is also an interesting problem.)

