COSC 3101A - Design and Analysis of Algorithms
Lecture 9: Knapsack Problem, Huffman Codes, Introduction to Graphs
Many of these slides are taken from Monica Nicolescu, Univ. of Nevada, Reno

6/29/2004 – Lecture 9, COSC 3101A

The Knapsack Problem
The 0-1 knapsack problem
– A thief robbing a store finds n items: the i-th item is worth v_i dollars and weighs w_i pounds (v_i, w_i integers)
– The thief can only carry W pounds in his knapsack
– Items must be taken entirely or left behind
– Which items should the thief take to maximize the value of his load?
The fractional knapsack problem
– Similar to above
– The thief can take fractions of items

Fractional Knapsack Problem
Knapsack capacity: W
There are n items: the i-th item has value v_i and weight w_i
Goal:
– find x_i, 0 ≤ x_i ≤ 1, i = 1, 2, …, n, such that Σ w_i x_i ≤ W and Σ x_i v_i is maximum

Fractional Knapsack Problem
Greedy strategy 1:
– Pick the item with the maximum value
E.g.:
– W = 1
– w_1 = 100, v_1 = 2
– w_2 = 1, v_2 = 1
– Taking from the item with the maximum value: total value taken = v_1/w_1 = 2/100
– Smaller than what the thief can take if choosing the other item: total value (choose item 2) = v_2/w_2 = 1

Fractional Knapsack Problem
Greedy strategy 2:
– Pick the item with the maximum value per pound v_i/w_i
– If the supply of that item is exhausted and the thief can carry more, take as much as possible from the item with the next greatest value per pound
It is good to order items based on their value per pound

Fractional Knapsack Problem
Alg.: Fractional-Knapsack(W, v[n], w[n])
1. while w > 0 and there are items remaining
2.   pick item i with maximum v_i/w_i
3.   x_i ← min(1, w/w_i)
4.   remove item i from list
5.   w ← w − x_i w_i
(w – the amount of space remaining in the knapsack; initially w = W)
Running time: Θ(n) if items are already ordered; else Θ(n lg n)
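The greedy algorithm above can be sketched in Python; the function name and the value/weight list interface are choices made here, not part of the slides.

```python
def fractional_knapsack(values, weights, capacity):
    """Greedy fractional knapsack: take items in decreasing value-per-pound order."""
    # Sort item indices by value per pound, highest first (the Theta(n lg n) step).
    order = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    total = 0.0
    remaining = capacity           # the "w" of the pseudocode, initially W
    fractions = [0.0] * len(values)
    for i in order:
        if remaining <= 0:
            break
        # Take as much of item i as fits: x_i = min(1, w / w_i).
        x = min(1.0, remaining / weights[i])
        fractions[i] = x
        total += x * values[i]
        remaining -= x * weights[i]
    return total, fractions

# The slides' example: W = 50, items ($60, 10 lb), ($100, 20 lb), ($120, 30 lb).
# Greedy takes items 1 and 2 whole plus 20/30 of item 3, total ≈ 240.
```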

Fractional Knapsack – Example
E.g.: knapsack capacity W = 50
Item 1: $60, 10 pounds – $6/pound
Item 2: $100, 20 pounds – $5/pound
Item 3: $120, 30 pounds – $4/pound
Greedy solution: all of item 1 ($60), all of item 2 ($100), and 20 of the 30 pounds of item 3 ($80), for a total value of $240

Greedy Choice
Items: 1, 2, 3, …, j, …, n
Optimal solution: x_1, x_2, x_3, …, x_j, …, x_n
Greedy solution: x_1’, x_2’, x_3’, …, x_j’, …, x_n’
We know that: x_1’ ≥ x_1
– greedy choice takes as much as possible from item 1
Modify the optimal solution to take x_1’ of item 1
– We have to decrease the quantity taken from some item j: the new x_j is decreased by (x_1’ − x_1) w_1/w_j
Increase in profit: (x_1’ − x_1) v_1
Decrease in profit: (x_1’ − x_1) w_1 v_j/w_j
Increase ≥ decrease: true, since x_1 had the best value/pound ratio

Optimal Substructure
Consider the most valuable load that weighs at most W pounds
If we remove a weight w of item j from the optimal load
⇒ The remaining load must be the most valuable load weighing at most W − w that can be taken from the remaining n − 1 items plus w_j − w pounds of item j

The 0-1 Knapsack Problem
Thief has a knapsack of capacity W
There are n items: the i-th item has value v_i and weight w_i
Goal:
– find x_i ∈ {0, 1}, i = 1, 2, …, n, such that Σ w_i x_i ≤ W and Σ x_i v_i is maximum

0-1 Knapsack – Greedy Strategy
E.g.: knapsack capacity W = 50
Item 1: $60, 10 pounds – $6/pound
Item 2: $100, 20 pounds – $5/pound
Item 3: $120, 30 pounds – $4/pound
Greedy choice (item 1 first): item 1 + item 2 = $160, or item 1 + item 3 = $180
Optimal: item 2 + item 3 = $220
None of the solutions involving the greedy choice (item 1) leads to an optimal solution
– The greedy choice property does not hold

0-1 Knapsack – Dynamic Programming
P(i, w) – the maximum profit that can be obtained from items 1 to i, if the knapsack has size w
Case 1: thief takes item i
  P(i, w) = v_i + P(i − 1, w − w_i)
Case 2: thief does not take item i
  P(i, w) = P(i − 1, w)

0-1 Knapsack – Dynamic Programming
P(i, w) = max {v_i + P(i − 1, w − w_i), P(i − 1, w)}
(first term: item i was taken; second term: item i was not taken)
Fill in a table with rows i = 0, 1, …, n and columns w = 0, 1, …, W; each entry (i, w) is computed from the entry above it, (i − 1, w), and the entry (i − 1, w − w_i) in the previous row

P(i, w) = max {v_i + P(i − 1, w − w_i), P(i − 1, w)}
Example: W = 5
Item  Weight  Value
1     2       12
2     1       10
3     3       20
4     2       15

i\w    0    1    2    3    4    5
0      0    0    0    0    0    0
1      0    0   12   12   12   12
2      0   10   12   22   22   22
3      0   10   12   22   30   32
4      0   10   15   25   30   37

E.g.: P(1, 2) = max{12 + 0, 0} = 12; P(2, 3) = max{10 + 12, 12} = 22;
P(3, 4) = max{20 + 10, 22} = 30; P(3, 5) = max{20 + 12, 22} = 32;
P(4, 4) = max{15 + 12, 30} = 30; P(4, 5) = max{15 + 22, 32} = 37
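The recurrence and table above translate directly into a bottom-up sketch; the function name and the list-based interface are assumptions of this example, not the slides'.

```python
def knapsack_01(values, weights, capacity):
    """Bottom-up DP for 0-1 knapsack.
    P[i][w] = best profit using items 1..i with knapsack size w."""
    n = len(values)
    P = [[0] * (capacity + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        v, wt = values[i - 1], weights[i - 1]
        for w in range(capacity + 1):
            P[i][w] = P[i - 1][w]                    # case 2: item i not taken
            if wt <= w:                              # case 1: item i taken, if it fits
                P[i][w] = max(P[i][w], v + P[i - 1][w - wt])
    # Reconstruct the optimal solution by walking back from P[n][capacity]:
    # a change from the row above means item i was taken.
    taken, w = [], capacity
    for i in range(n, 0, -1):
        if P[i][w] != P[i - 1][w]:
            taken.append(i)
            w -= weights[i - 1]
    return P[n][capacity], sorted(taken)

# The example above: weights (2, 1, 3, 2), values (12, 10, 20, 15), W = 5
# gives profit 37 with items 1, 2 and 4.
```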

Reconstructing the Optimal Solution
Start at P(n, W)
When you go left-up ⇒ item i has been taken
When you go straight up ⇒ item i has not been taken
In the example, the optimal load consists of items 4, 2 and 1

Optimal Substructure
Consider the most valuable load that weighs at most W pounds
If we remove item j from this load
⇒ The remaining load must be the most valuable load weighing at most W − w_j that can be taken from the remaining n − 1 items

Overlapping Subproblems
P(i, w) = max {v_i + P(i − 1, w − w_i), P(i − 1, w)}
E.g.: in the table of subproblems, many entries of row i (shown in grey on the slide) may depend on the same entry P(i − 1, w)

Huffman Codes
Widely used technique for data compression
Assume the data is a sequence of characters
We are looking for an effective way of storing the data

Huffman Codes
Idea:
– Use the frequencies of occurrence of characters to build an optimal way of representing each character
Binary character code
– Uniquely represents a character by a binary string

Character:              a   b   c   d   e   f
Frequency (thousands): 45  13  12  16   9   5

Fixed-Length Codes
E.g.: data file containing 100,000 characters
Character:              a   b   c   d   e   f
Frequency (thousands): 45  13  12  16   9   5
3 bits are needed per character:
a = 000, b = 001, c = 010, d = 011, e = 100, f = 101
Requires: 100,000 · 3 = 300,000 bits

Variable-Length Codes
E.g.: data file containing 100,000 characters
Character:              a   b   c   d   e   f
Frequency (thousands): 45  13  12  16   9   5
Assign short codewords to frequent characters and long codewords to infrequent characters:
a = 0, b = 101, c = 100, d = 111, e = 1101, f = 1100
Requires: (45·1 + 13·3 + 12·3 + 16·3 + 9·4 + 5·4) · 1,000 = 224,000 bits
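The bit count above is just the frequency-weighted sum of codeword lengths; a quick check of the arithmetic, using the frequencies and codewords from the slide:

```python
# Frequencies in thousands and the variable-length codewords from the slide.
freq = {'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}
code = {'a': '0', 'b': '101', 'c': '100', 'd': '111', 'e': '1101', 'f': '1100'}

# Cost = sum over characters of frequency * codeword length, times 1,000.
bits = sum(freq[c] * len(code[c]) for c in freq) * 1000
# 45*1 + 13*3 + 12*3 + 16*3 + 9*4 + 5*4 = 224, so bits = 224,000
```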

Prefix Codes
Prefix codes:
– Codes for which no codeword is also a prefix of some other codeword
– A better name would be “prefix-free codes”
We can achieve optimal data compression using prefix codes
– We will restrict our attention to prefix codes

Encoding with Binary Character Codes
Encoding
– Concatenate the codewords representing each character in the file
E.g.:
– a = 0, b = 101, c = 100, d = 111, e = 1101, f = 1100
– abc = 0 · 101 · 100 = 0101100

Decoding with Binary Character Codes
Prefix codes simplify decoding
– No codeword is a prefix of another ⇒ the codeword that begins an encoded file is unambiguous
Approach
– Identify the initial codeword
– Translate it back to the original character
– Repeat the process on the remainder of the file
E.g.:
– a = 0, b = 101, c = 100, d = 111, e = 1101, f = 1100
– 001011101 = 0 · 0 · 101 · 1101 = aabe
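The decoding approach above (repeatedly match the unique initial codeword) can be sketched as follows; the dictionary interface is an assumption of this example.

```python
def decode(bits, code):
    """Decode a prefix-coded bit string by repeatedly matching
    the unique codeword that begins the remaining input."""
    inverse = {w: c for c, w in code.items()}   # codeword -> character
    out, buf = [], ''
    for b in bits:
        buf += b
        if buf in inverse:        # prefix property: at most one codeword matches
            out.append(inverse[buf])
            buf = ''
    if buf:
        raise ValueError('trailing bits do not form a complete codeword')
    return ''.join(out)

code = {'a': '0', 'b': '101', 'c': '100', 'd': '111', 'e': '1101', 'f': '1100'}
# decode('001011101', code) -> 'aabe', matching the slide's example
```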

Prefix Code Representation
Binary tree whose leaves are the given characters
Binary codeword
– the path from the root to the character, where 0 means “go to the left child” and 1 means “go to the right child”
Length of the codeword
– length of the path from root to the character leaf (depth of node)
[Figure: two code trees over the characters a:45, b:13, c:12, d:16, e:9, f:5 – one for the fixed-length code and one for the optimal variable-length code]

Optimal Codes
An optimal code is always represented by a full binary tree
– Every non-leaf has two children
– Fixed-length code is not optimal, variable-length is
How many bits are required to encode a file?
– Let C be the alphabet of characters
– Let f(c) be the frequency of character c
– Let d_T(c) be the depth of c’s leaf in the tree T corresponding to a prefix code
B(T) = Σ_{c ∈ C} f(c) d_T(c) – the cost of tree T

Constructing a Huffman Code
A greedy algorithm that constructs an optimal prefix code called a Huffman code
Assume that:
– C is a set of n characters
– Each character has a frequency f(c)
– The tree T is built in a bottom-up manner
Idea:
– Start with a set of |C| leaves
– At each step, merge the two least frequent objects: the frequency of the new node = sum of the two frequencies
– Use a min-priority queue Q, keyed on f, to identify the two least frequent objects
Initial queue: f:5, e:9, c:12, b:13, d:16, a:45

Example
Start: f:5, e:9, c:12, b:13, d:16, a:45
Merge f and e → node 14; queue: c:12, b:13, 14, d:16, a:45
Merge c and b → node 25; queue: 14, d:16, 25, a:45
Merge 14 and d:16 → node 30; queue: 25, 30, a:45
Merge 25 and 30 → node 55; queue: a:45, 55
Merge a:45 and 55 → root 100
Resulting codewords: a = 0, c = 100, b = 101, f = 1100, e = 1101, d = 111

Building a Huffman Code
Alg.: HUFFMAN(C)
1. n ← |C|
2. Q ← C
3. for i ← 1 to n − 1
4.   do allocate a new node z
5.      left[z] ← x ← EXTRACT-MIN(Q)
6.      right[z] ← y ← EXTRACT-MIN(Q)
7.      f[z] ← f[x] + f[y]
8.      INSERT(Q, z)
9. return EXTRACT-MIN(Q)
Lines 1–2: O(n); the loop runs n − 1 times with O(lg n) heap operations per iteration
Running time: O(n lg n)
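A minimal sketch of HUFFMAN using Python's `heapq` as the min-priority queue. The tie-breaker counter and the nested-tuple tree representation are choices made here for the sketch; exact codewords can differ across valid Huffman trees when frequencies tie, though the total cost is always the same.

```python
import heapq

def huffman(freq):
    """Build a Huffman code with a min-heap keyed on frequency; O(n lg n).
    Heap entries are (frequency, tie_breaker, tree); leaves are characters,
    internal nodes are (left, right) pairs."""
    heap = [(f, i, c) for i, (c, f) in enumerate(sorted(freq.items()))]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:                       # n - 1 merges
        fx, _, x = heapq.heappop(heap)         # two least frequent objects
        fy, _, y = heapq.heappop(heap)
        heapq.heappush(heap, (fx + fy, count, (x, y)))
        count += 1
    root = heap[0][2]

    code = {}
    def walk(node, path):                      # 0 = left child, 1 = right child
        if isinstance(node, str):
            code[node] = path
        else:
            walk(node[0], path + '0')
            walk(node[1], path + '1')
    walk(root, '')
    return code

freq = {'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}
# huffman(freq) yields codeword lengths 1, 3, 3, 3, 4, 4
# and total cost 224 (thousand bits), matching the slides.
```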

Greedy Choice Property
Lemma: Let C be an alphabet in which each character c ∈ C has frequency f[c]. Let x and y be two characters in C having the lowest frequencies. Then, there exists an optimal prefix code for C in which the codewords for x and y have the same length and differ only in the last bit.

Proof of the Greedy Choice
Idea:
– Consider a tree T representing an arbitrary optimal prefix code
– Modify T to make a tree representing another optimal prefix code in which x and y appear as sibling leaves of maximum depth
⇒ The codewords of x and y will have the same length and differ only in the last bit

Proof of the Greedy Choice (cont.)
a, b – two characters, sibling leaves of maximum depth in T
Assume: f[a] ≤ f[b] and f[x] ≤ f[y]
f[x] and f[y] are the two lowest leaf frequencies, in order
⇒ f[x] ≤ f[a] and f[y] ≤ f[b]
Exchange the positions of a and x (giving T’) and then of b and y (giving T’’)

Proof of the Greedy Choice (cont.)
B(T) − B(T’) = f[x]d_T(x) + f[a]d_T(a) − f[x]d_T’(x) − f[a]d_T’(a)
             = f[x]d_T(x) + f[a]d_T(a) − f[x]d_T(a) − f[a]d_T(x)
             = (f[a] − f[x]) (d_T(a) − d_T(x)) ≥ 0
since f[a] − f[x] ≥ 0 (x is a minimum frequency leaf)
and d_T(a) − d_T(x) ≥ 0 (a is a leaf of maximum depth)

Proof of the Greedy Choice (cont.)
B(T) − B(T’) ≥ 0
Similarly, exchanging y and b does not increase the cost ⇒ B(T’) − B(T’’) ≥ 0
⇒ B(T’’) ≤ B(T), and since T is optimal, B(T) ≤ B(T’’)
⇒ B(T) = B(T’’)
⇒ T’’ is an optimal tree, in which x and y are sibling leaves of maximum depth

Discussion
Greedy choice property:
– Building an optimal tree by mergers can begin with the greedy choice: merging the two characters with the lowest frequencies
– The cost of each merger is the sum of the frequencies of the two items being merged
– Of all possible mergers, HUFFMAN chooses the one that incurs the least cost

Graphs
Applications that involve not only a set of items, but also the connections between them:
– Maps
– Hypertexts
– Circuits
– Schedules
– Transactions
– Matching
– Computer networks

Graphs – Background
Graphs = a set of nodes (vertices) with edges (links) between them
Notations:
– G = (V, E) – graph
– V = set of vertices, |V| = n
– E = set of edges, |E| = m
Types: directed graph, undirected graph, acyclic graph

Other Types of Graphs
A graph is connected if there is a path between every two vertices
A bipartite graph is an undirected graph G = (V, E) in which V = V_1 ∪ V_2 and there are edges only between vertices in V_1 and vertices in V_2
[Figure: a connected graph and a graph that is not connected]

Graph Representation
Adjacency list representation of G = (V, E)
– An array of |V| lists, one for each vertex in V
– Each list Adj[u] contains all the vertices v such that there is an edge between u and v
– Adj[u] contains the vertices adjacent to u (in arbitrary order)
– Can be used for both directed and undirected graphs
[Figure: an undirected graph and its adjacency lists]
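Both representations are easy to sketch in Python. The five-vertex edge list below is an assumption of this example (it matches the standard CLRS adjacency-list figure); any edge list works the same way.

```python
# Assumed example: an undirected graph on vertices 1..5.
edges = [(1, 2), (1, 5), (2, 3), (2, 4), (2, 5), (3, 4), (4, 5)]
n = 5

# Adjacency list: one list per vertex, Theta(V + E) memory.
adj = {u: [] for u in range(1, n + 1)}
for u, v in edges:
    adj[u].append(v)
    adj[v].append(u)           # undirected: edge (u, v) appears in both lists

# Adjacency matrix: Theta(V^2) memory, Theta(1) edge queries.
A = [[0] * (n + 1) for _ in range(n + 1)]
for u, v in edges:
    A[u][v] = A[v][u] = 1      # symmetric for an undirected graph
```

Note how the adjacency lists store each undirected edge twice, so the lengths of all lists sum to 2|E|, as the next slide states.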

Properties of Adjacency-List Representation
Sum of the lengths of all the adjacency lists:
– Directed graph: |E| – edge (u, v) appears only once, in u’s list
– Undirected graph: 2|E| – u and v appear in each other’s adjacency lists, so edge (u, v) appears twice

Properties of Adjacency-List Representation
Memory required: Θ(V + E)
Preferred when the graph is sparse: |E| << |V|²
Disadvantage: no quick way to determine whether there is an edge between vertices u and v
Time to list all vertices adjacent to u: Θ(degree(u))
Time to determine if (u, v) ∈ E: O(degree(u))

Graph Representation
Adjacency matrix representation of G = (V, E)
– Assume vertices are numbered 1, 2, …, |V|
– The representation consists of a |V| × |V| matrix A:
  a_ij = 1 if (i, j) ∈ E, and 0 otherwise
– For an undirected graph, matrix A is symmetric: a_ij = a_ji, i.e., A = Aᵀ

Properties of Adjacency Matrix Representation
Memory required: Θ(V²), independent of the number of edges in G
Preferred when
– the graph is dense: |E| is close to |V|²
– we need to quickly determine if there is an edge between two vertices
Time to list all vertices adjacent to u: Θ(V)
Time to determine if (u, v) ∈ E: Θ(1)

Weighted Graphs
Weighted graphs = graphs for which each edge has an associated weight w(u, v)
w: E → R, a weight function
Storing the weights of a graph:
– Adjacency list: store w(u, v) along with vertex v in u’s adjacency list
– Adjacency matrix: store w(u, v) at location (u, v) in the matrix

Searching in a Graph
Graph searching = systematically follow the edges of the graph so as to visit the vertices of the graph
Two basic graph-searching algorithms:
– Breadth-first search
– Depth-first search
– The difference between them is in the order in which they explore the unvisited edges of the graph
Graph algorithms are typically elaborations of the basic graph-searching algorithms

Breadth-First Search (BFS)
Input:
– A graph G = (V, E) (directed or undirected)
– A source vertex s ∈ V
Goal:
– Explore the edges of G to “discover” every vertex reachable from s, taking the ones closest to s first
Output:
– d[v] = distance (smallest number of edges) from s to v, for all v ∈ V
– A “breadth-first tree” rooted at s that contains all reachable vertices

Breadth-First Search (cont.)
Discover vertices in increasing order of distance from the source s – search in breadth, not depth
– Find all vertices at 1 edge from s, then all vertices at 2 edges from s, and so on

Breadth-First Search (cont.)
Keeping track of progress:
– Color each vertex either white, gray or black
– Initially, all vertices are white
– When discovered, a vertex becomes gray
– After all its adjacent vertices have been discovered, the vertex becomes black
– Use a FIFO queue Q to maintain the set of gray vertices

Breadth-First Tree
BFS constructs a breadth-first tree
– Initially the tree contains only the root (source vertex s)
– When vertex v is discovered while scanning the adjacency list of a vertex u ⇒ vertex v and edge (u, v) are added to the tree
– u is the predecessor (parent) of v in the breadth-first tree
– A vertex is discovered only once ⇒ it has at most one parent

BFS Additional Data Structures
G = (V, E) represented using adjacency lists
color[u] – the color of vertex u, for all u ∈ V
π[u] – predecessor of u
– If u = s (root) or node u has not yet been discovered ⇒ π[u] = NIL
d[u] – the distance from the source s to vertex u
Use a FIFO queue Q to maintain the set of gray vertices
[Figure: a source vertex with neighbors at d = 1 and vertices two edges away at d = 2]

BFS(G, s)
1. for each u ∈ V[G] − {s}
2.   do color[u] ← WHITE
3.      d[u] ← ∞
4.      π[u] ← NIL
5. color[s] ← GRAY
6. d[s] ← 0
7. π[s] ← NIL
8. Q ← ∅
9. ENQUEUE(Q, s)
[Figure: the example graph on vertices r, s, t, u, v, w, x, y after initialization; Q: s]

BFS(V, E, s) (cont.)
10. while Q ≠ ∅
11.   do u ← DEQUEUE(Q)
12.      for each v ∈ Adj[u]
13.        do if color[v] = WHITE
14.           then color[v] ← GRAY
15.                d[v] ← d[u] + 1
16.                π[v] ← u
17.                ENQUEUE(Q, v)
18.      color[u] ← BLACK
[Figure: the example graph after the first iterations; Q: s, then Q: w, r]

Example
Evolution of the queue on the example graph (vertices r, s, t, u, v, w, x, y, source s):
Q: s → Q: w, r → Q: r, t, x → Q: t, x, v → Q: x, v, u → Q: v, u, y → Q: u, y → Q: y → Q: ∅

Analysis of BFS
Lines 1–4 (initializing color, d and π for each u ∈ V − {s}): O(V)
Lines 5–9 (initializing the source and the queue): Θ(1)

Analysis of BFS (cont.)
Each queue operation (lines 10–11, 17): Θ(1)
Scan Adj[u] for all vertices in the graph
– Each vertex is scanned only once, when it is dequeued
– Sum of the lengths of all adjacency lists: Θ(E)
– Scanning operations: O(E)
Total running time for BFS: O(V + E)
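The BFS pseudocode above maps naturally onto `collections.deque` as the FIFO queue. The edge set of the eight-vertex example graph is an assumption here, reconstructed to match the standard CLRS figure behind the slides' queue trace; the color array is implicit (a vertex is "white" until it appears in `d`).

```python
from collections import deque

def bfs(adj, s):
    """Breadth-first search from s; returns distance d and predecessor pi maps.
    A vertex is white while absent from d, gray while in the queue,
    black once dequeued."""
    d = {s: 0}
    pi = {s: None}
    Q = deque([s])
    while Q:
        u = Q.popleft()                 # DEQUEUE
        for v in adj[u]:
            if v not in d:              # v is white: discover it
                d[v] = d[u] + 1
                pi[v] = u
                Q.append(v)             # ENQUEUE
    return d, pi

# Assumed edges for the slides' example graph on r, s, t, u, v, w, x, y:
adj = {
    'r': ['s', 'v'], 's': ['r', 'w'], 't': ['u', 'w', 'x'],
    'u': ['t', 'x', 'y'], 'v': ['r'], 'w': ['s', 't', 'x'],
    'x': ['t', 'u', 'w', 'y'], 'y': ['u', 'x'],
}
# From source s: d(r) = d(w) = 1, d(v) = d(t) = d(x) = 2, d(u) = d(y) = 3.
```

Each vertex is enqueued at most once and each adjacency list is scanned once, giving the O(V + E) total from the analysis above.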

Shortest Paths Property
BFS finds the shortest-path distance from the source vertex s ∈ V to each vertex in the graph
Shortest-path distance = δ(s, u)
– Minimum number of edges in any path from s to u

Readings
Chapter 16
Chapter 22