The Greedy Approach Winter-2004 Young CS 331 D&A of Algo. Greedy.



General idea: given a problem with n inputs, we are required to obtain a subset that maximizes or minimizes a given objective function subject to some constraints.
Feasible solution — any subset that satisfies the constraints.
Optimal solution — a feasible solution that maximizes or minimizes the objective function.

x  Select (A); // based on the objective // function procedure Greedy (A, n) begin solution  Ø; for i  1 to n do x  Select (A); // based on the objective // function if Feasible (solution, x), then solution  Union (solution, x); end; Select: A greedy procedure, based on a given objective function, which selects input from A, removes it and assigns its value to x. Feasible: A boolean function to decide if x can be included into solution vector (without violating any given constraint). Winter-2004 Young CS 331 D&A of Algo. Greedy

About the greedy method: the n inputs are ordered by some selection procedure based on some optimization measure. The method works in stages, considering one input at a time; at each stage, a decision is made as to whether or not a particular input belongs to an optimal solution.

Minimum Spanning Tree (for undirected graphs)
Tree: a tree is a connected graph with no cycles.
Spanning tree: a spanning tree of G is a tree that contains all vertices of G.
Example: G: (figure)

Is G a spanning tree? (figures: one graph marked Yes, one marked No)
Note: a connected graph with n vertices and exactly n − 1 edges is a tree, and hence a spanning tree of itself.
Minimum spanning tree: assign a weight to each edge of G; the minimum spanning tree is then the spanning tree with minimum total weight.

Example: all edges have the same weight.
G: (figure: a graph on vertices 1-8)

DFS (depth-first search) spanning tree: (figure)

BFS (breadth-first search) spanning tree: (figure)

Example: edges have different weights.
G: (figure: a graph on vertices 1-6 with edge weights 10, 11, 14, 16, 18, 19, 21, 33)
DFS spanning tree, BFS spanning tree, and the minimum spanning tree (the one with the least total weight): (figures)

Algorithms: Prim's Algorithm (Minimum Spanning Tree)
Basic idea: start from vertex 1 and let T ← Ø (T will contain all edges of the spanning tree); the next edge to be included in T is the minimum-cost edge (u, v) such that u is in the tree and v is not.
Example: G: (figure: the weighted graph on vertices 1-6 above)

(figures: the growing spanning tree after each step)

(figure: the resulting minimum spanning tree; Cost = 16 + 5 + 6 + 11 + 18 = 56)
Complexity (n = number of vertices, e = number of edges): the algorithm takes O(n) steps, each step takes O(e), and e ≤ n(n − 1)/2, so the straightforward implementation runs in O(n·e) time.

Data structure: Cost (a two-dimensional array).
Cost[i][j] = the weight on the edge if there is an edge between vi and vj; ∞ if there is no edge between vi and vj; 0 if i = j.
Example: G: (figure: the weighted graph on vertices 1-6)

A one-dimensional array Near(j) for each vertex j:
Near(j) = 0 if j is already in the spanning tree; otherwise, Near(j) is a vertex in the tree such that Cost(j, Near(j)) is minimum.

Algorithm:
procedure Prim (Cost, n, T, MinCost)
// Cost, n are input; T, MinCost are output
begin
  Near(1) ← 0;
  for i ← 2 to n do Near(i) ← 1;
  MinCost ← 0;
  for i ← 1 to (n − 1) do
    let j be an index s.t. (Near(j) ≠ 0) and Cost(j, Near(j)) is minimum;
    add edge (j, Near(j)) to T;
    MinCost ← MinCost + Cost(j, Near(j));
    Near(j) ← 0;
    for k ← 1 to n do
      if (Near(k) ≠ 0) and (Cost(k, Near(k)) > Cost(k, j))
        then Near(k) ← j;
end;

Example: G: (figure: the weighted graph on vertices 1-6)

Example (Near values after each inclusion; 0 means the vertex is already in the spanning tree):

            initially  step 1  step 2  step 3  step 4
  Near(1)       0        0       0       0       0
  Near(2)       1        0       0       0       0
  Near(3)       1        2       0       0       0
  Near(4)       1        2       2       0       0
  Near(5)       1        1       1       4       4
  Near(6)       1        2       2       2       0

(figure: the weighted graph on vertices 1-6)
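The Near-array version of Prim's algorithm can be sketched in runnable Python. The graph below is a made-up 4-vertex instance (an assumption for illustration, not the slides' graph); vertices are 0-indexed, so the slides' "vertex 1" is vertex 0 here, and -1 plays the role of Near(j) = 0.

```python
INF = float('inf')

def prim(cost):
    """Prim's MST with the Near array; cost is an n x n matrix with
    INF for missing edges and 0 on the diagonal."""
    n = len(cost)
    near = [0] * n      # Near(j): nearest in-tree vertex (tree starts at 0)
    near[0] = -1        # -1 marks "already in the tree"
    tree, total = [], 0
    for _ in range(n - 1):
        # pick the cheapest edge (j, Near(j)) crossing into the tree
        j = min((v for v in range(n) if near[v] != -1),
                key=lambda v: cost[v][near[v]])
        tree.append((near[j], j))
        total += cost[j][near[j]]
        near[j] = -1
        for k in range(n):              # update Near for outside vertices
            if near[k] != -1 and cost[k][near[k]] > cost[k][j]:
                near[k] = j
    return tree, total

# Made-up 4-vertex example (not the slides' graph):
g = [[0, 1, 4, INF],
     [1, 0, 2, 6],
     [4, 2, 0, 3],
     [INF, 6, 3, 0]]
print(prim(g))  # ([(0, 1), (1, 2), (2, 3)], 6)
```

The two nested O(n) loops give exactly the O(n) steps of O(n) work each that the complexity slide describes.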

Kruskal's Algorithm
Basic idea: don't care whether T is a tree or not in the intermediate stages; as long as including a new edge does not create a cycle, include the minimum-cost edge.
Example: G: (figure: a graph on vertices 1-6 with edge weights 10, 15, 20, 25, 30, 35, 40, 45, 50, 55)
Step 1: sort all the edges:
  (1,2)  10  √
  (3,6)  15  √
  (4,6)  20  √
  (2,6)  25  √
  (1,4)  30  ×  rejected: creates a cycle
  (3,5)  35  √

Step 2: build T by accepting the edges in that order. (figures: the growing forest after each accepted edge)

How to check whether adding an edge will create a cycle:
Maintain a set for each group (initially each node is its own set). Ex: set1, set2, set3 (figure). A new edge whose endpoints are in different groups creates no cycle.
We need a data structure to store the sets so that:
1. the group number can be easily found, and
2. two sets can be easily merged.

Method 1 (straightforward): use an array to store the group number of each element, and use the smallest number in each set as its label.
Example: we use an array (1..n) to store the group number of each element. (figure: nodes 1-10 partitioned into sets 1, 2, 3)

function Find1 (x)
// find the label of the set containing object x
begin
  return set[x];
end;

procedure Merge1 (a, b)
// merge the sets labeled a and b
begin
  i ← a; j ← b;
  if i > j then exchange i and j;
  for k ← 1 to n do
    if set[k] = j then set[k] ← i;
end;

So the complexity of Find1 is O(1), and of Merge1 is O(n).

Method 2 (use trees): represent each set as a tree. If set[i] = i, then i is both the label of a set and the root of the corresponding tree; if set[i] = j ≠ i, then j is the parent of i in some tree.
Example: (figure: trees over nodes 1-10)

while seti  i do i  seti; return i; end; procedure Merge2(a, b) node: 1 2 3 4 5 6 7 8 9 10 function Find2 (x) begin i  x; while seti  i do i  seti; return i; end; procedure Merge2(a, b) if a < b, then setb  a; else seta  b; 1 2 3 4 Winter-2004 Young CS 331 D&A of Algo. Greedy

So the complexity of Find2 is O(n) and of Merge2 is O(1); compared with Method 1, the total complexity does not change. Method 3 improves Merge2 so that Find2 drops to O(log n).
Method 3: in Method 2 we chose the smallest number of a set as its label. When we merge two trees of heights h1 and h2, it is better to balance the height of the merged tree: let the tree with the smaller height become a child of the other tree's root. The merged tree then has height
  max(h1, h2) if h1 ≠ h2
  h1 + 1      if h1 = h2
It can be proved by induction that after an arbitrary sequence of merge operations, a tree containing n nodes has height at most log n.

if heighta = heightb, then heighta  heighta + 1; setb  a; procedure Merge3 (a, b) begin if heighta = heightb, then heighta  heighta + 1; setb  a; else if heighta > heightb, then setb  a; else seta  b; end; It can be proved by induction: After any # of merge operations, a tree with n nodes will have height at most Log n. Winter-2004 Young CS 331 D&A of Algo. Greedy

Kruskal's algorithm:
while (T contains fewer than n − 1 edges) and (E ≠ Ø) do
begin
  choose an edge (v, w) from E of lowest cost;
  delete (v, w) from E;
  if (v, w) does not create a cycle in T
    then add (v, w) to T
    else discard (v, w);
end;

Kruskal's algorithm:
procedure Kruskal (Cost, n, e, T, MinCost)
begin
  construct a heap out of the edge costs;
  i ← 0; MinCost ← 0;
  while (i < (n − 1)) and (heap not empty) do
    delete the minimum-cost edge (u, v) from the heap;
    j ← Find2(u); k ← Find2(v);
    if j ≠ k then
      i ← i + 1;
      T(i, 1) ← u; T(i, 2) ← v;
      MinCost ← MinCost + Cost(u, v);
      Merge3(j, k);
  end while;
  if i ≠ n − 1 then print ("No spanning tree");
end;

Each heap operation takes O(log e), so processing all e edges takes O(e log e).

So the complexity of Kruskal's algorithm is O(e log e).
Comparing Prim's algorithm with Kruskal's algorithm: Prim's complexity is O(n²); Kruskal's complexity is O(e log e). If G is a complete (dense) graph, e is Θ(n²), so Kruskal's complexity becomes O(n² log n) and Prim's algorithm is preferable; if G is a sparse graph, e is close to n and Kruskal's algorithm is preferable.
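Kruskal's procedure, with the heap of edges and the disjoint-set operations inlined, can be sketched in Python. The edge list at the bottom is a made-up 4-vertex instance (not the slides' graph), with 0-indexed vertices.

```python
import heapq

def kruskal(n, edges):
    """edges: list of (cost, u, v). Returns (tree_edges, total_cost),
    or raises ValueError if G has no spanning tree."""
    heap = list(edges)
    heapq.heapify(heap)                       # build the edge heap
    parent = list(range(n))                   # Find2's parent array
    height = [0] * n

    def find(x):                              # Find2
        while parent[x] != x:
            x = parent[x]
        return x

    tree, total = [], 0
    while len(tree) < n - 1 and heap:
        c, u, v = heapq.heappop(heap)         # minimum-cost edge, O(log e)
        j, k = find(u), find(v)
        if j != k:                            # endpoints in different groups: no cycle
            tree.append((u, v))
            total += c
            # Merge3: union by height
            if height[j] == height[k]:
                height[j] += 1; parent[k] = j
            elif height[j] > height[k]:
                parent[k] = j
            else:
                parent[j] = k
    if len(tree) != n - 1:
        raise ValueError("No spanning tree")
    return tree, total

# Made-up example: a square 0-1-2-3 with one diagonal.
es = [(1, 0, 1), (2, 1, 2), (3, 2, 3), (4, 3, 0), (5, 0, 2)]
print(kruskal(4, es))  # ([(0, 1), (1, 2), (2, 3)], 6)
```

Sorting the edges up front would work equally well; the heap simply matches the slides' presentation and lets the loop stop early once n − 1 edges are accepted.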

Dijkstra's Algorithm for Single-Source Shortest Paths
The problem: given a directed graph G = (V, E), a weight for each edge in G, and a source node v0, the goal is to determine the (lengths of the) shortest paths from v0 to all the remaining vertices in G.
Def: length of a path: the sum of the weights of its edges.
Observation: there may be more than one path between w and x (or between y and z), but each individual subpath must be of minimal length in order to form an overall shortest path from v0 to vi. (figure: shortest paths from v0 to vi)

Notation: the cost adjacency matrix Cost, 1 ≤ a, b ≤ |V|:
Cost(a, b) = the cost of the edge from vertex a to vertex b if there is such an edge; 0 if a = b; ∞ otherwise.
s(w) = 1 if the shortest path from v to w has been determined, 0 otherwise.
Dist(w) = the length of the shortest path from v to w found so far.
From(w) = the predecessor of w along the shortest path from v to w.

Example: (figure: a directed graph on v0-v5 with edge weights 3, 10, 15, 20, 30, 35, 45, 50) and its cost adjacency matrix.

Steps in Dijkstra's algorithm:
1. Dist(v0) = 0, From(v0) = v0
2. Dist(v2) = 10, From(v2) = v0
(figures)

3. Dist(v3) = 25, From(v3) = v2
4. Dist(v1) = 45, From(v1) = v3
(figures)

5. Dist(v4) = 45, From(v4) = v0
6. Dist(v5) = ∞
(figures)

Shortest paths from source v0:
  v0 → v2              (length 10)
  v0 → v2 → v3         (length 25)
  v0 → v2 → v3 → v1    (length 45)
  v0 → v4              (length 45)
  (v5 is unreachable)

Dijkstra's algorithm:
procedure Dijkstra (Cost, n, v, Dist, From)
// Cost, n, v are input; Dist, From are output
begin
  for i ← 1 to n do
    s(i) ← 0; Dist(i) ← Cost(v, i); From(i) ← v;
  s(v) ← 1;
  for num ← 1 to (n − 1) do
    choose u s.t. s(u) = 0 and Dist(u) is minimum;
    s(u) ← 1;
    for all w with s(w) = 0 do
      if Dist(u) + Cost(u, w) < Dist(w) then
        Dist(w) ← Dist(u) + Cost(u, w);
        From(w) ← u;
end;

Ex: (figure: a directed graph on vertices 1-5 with edge weights 10, 20, 30, 50, 100) and its cost adjacency matrix.

Steps in Dijkstra's algorithm:
1. Dist(1) = 0, From(1) = 1
2. Dist(5) = 10, From(5) = 1
(figures)

3. Dist(4) = 20, From(4) = 5
4. Dist(3) = 30, From(3) = 1
5. Dist(2) = 35, From(2) = 3
Shortest paths from source 1:
  1 → 3 → 2    (length 35)
  1 → 3        (length 30)
  1 → 5 → 4    (length 20)
  1 → 5        (length 10)
(figures)
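The procedure above translates almost line for line into Python. The 4-vertex matrix below is a made-up instance (the slides' adjacency matrices did not survive the transcript), with 0-indexed vertices and INF in place of ∞.

```python
INF = float('inf')

def dijkstra(cost, src):
    """cost: n x n matrix (INF = no edge, 0 on the diagonal).
    Returns (dist, frm): shortest-path lengths and predecessors."""
    n = len(cost)
    done = [False] * n                 # s(w) in the slides
    dist = list(cost[src])             # Dist initialized from the source row
    frm = [src] * n                    # From(w)
    done[src] = True
    for _ in range(n - 1):
        # choose the closest vertex not yet finalized
        u = min((v for v in range(n) if not done[v]), key=lambda v: dist[v])
        done[u] = True
        for w in range(n):             # relax edges leaving u
            if not done[w] and dist[u] + cost[u][w] < dist[w]:
                dist[w] = dist[u] + cost[u][w]
                frm[w] = u
    return dist, frm

# Made-up directed example (not the slides' graph):
g = [[0, 10, INF, 30],
     [INF, 0, 5, INF],
     [INF, INF, 0, 7],
     [INF, INF, INF, 0]]
d, f = dijkstra(g, 0)
print(d)  # [0, 10, 15, 22]
```

Following `frm` backwards from any vertex to the source reconstructs the shortest path itself, just as the From column does in the traces above.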

Optimal Storage on Tapes
The problem: given n programs to be stored on tape, with lengths l1, l2, …, ln respectively. Suppose the programs are stored in the order i1, i2, …, in, and let tj be the time to retrieve program ij. Assuming the tape is initially positioned at the beginning, tj is proportional to the sum of the lengths of all programs stored in front of program ij, including ij itself.

The goal is to minimize the MRT (mean retrieval time), i.e., we want to minimize (1/n) Σ tj.
Ex: n = 3 programs with lengths (l1, l2, l3) = (5, 10, 3). There are n! = 6 possible orderings:

  order    total retrieval time           MRT
  1 2 3    5 + (5+10) + (5+10+3) = 38     38/3
  1 3 2    5 + (5+3) + (5+3+10) = 31      31/3
  2 1 3    10 + (10+5) + (10+5+3) = 43    43/3
  2 3 1    10 + (10+3) + (10+3+5) = 41    41/3
  3 1 2    3 + (3+5) + (3+5+10) = 29      29/3  ← smallest
  3 2 1    3 + (3+10) + (3+10+5) = 34     34/3

Note: the problem can be solved with a greedy strategy: always store the shortest remaining program first. (The right order can be obtained simply by using any sorting algorithm.)
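The table can be checked, and the greedy answer confirmed, with a short Python sketch using the lengths (5, 10, 3) from the example:

```python
from itertools import permutations

def total_retrieval_time(lengths):
    """Sum of t_j, where t_j = l_1 + ... + l_j in storage order."""
    total, prefix = 0, 0
    for l in lengths:
        prefix += l
        total += prefix
    return total

lengths = [5, 10, 3]
# Brute force over all n! orderings ...
best = min(permutations(lengths), key=total_retrieval_time)
print(best, total_retrieval_time(best))        # (3, 5, 10) 29
# ... agrees with shortest-length-first greedy (just sort):
print(total_retrieval_time(sorted(lengths)))   # 29
```

The brute-force check is O(n!) while the greedy sort is O(n log n), which is exactly the gap the analysis slide quantifies.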

Analysis: trying all combinations is O(n!); the shortest-length-first greedy method is O(n log n).
Shortest-length-first greedy method: sort the programs so that l_i1 ≤ l_i2 ≤ … ≤ l_in, and call this ordering L. Next we show that the ordering L is the best.
Proof by contradiction: suppose the greedy ordering L is not optimal. Then there exists some other permutation I = (i1, i2, …, in) that is optimal, and there exist a < b such that l_ia > l_ib (otherwise I = L).

Interchange ia and ib and call the new list I′:
  I:  …, ia, ia+1, ia+2, …, ib, …
  I′: …, ib, ia+1, ia+2, …, ia, …
In I′, program ia+1 takes (l_ia − l_ib) less time to retrieve than in I; in fact, each of the programs ia+1, …, ib−1 gains (l_ia − l_ib). The retrieval time at position a also decreases by (l_ia − l_ib), while the retrieval time at position b is unchanged, since the same set of programs precedes it. So the total retrieval time of I′ is strictly smaller than that of I. Contradiction!! Therefore the greedy ordering L is optimal.

Knapsack Problem
The problem: given a knapsack with a certain capacity M and n objects to be put into the knapsack, each with a weight wi and a profit pi if put in the knapsack. The goal is to find (x1, x2, …, xn), where 0 ≤ xi ≤ 1, such that Σ pi·xi is maximized subject to Σ wi·xi ≤ M.
Note: all objects can be broken into small pieces, i.e., xi can be any fraction between 0 and 1.

Example: (figure: an instance with capacity M and three objects)
Greedy strategy #1: profits ordered in nonincreasing order; objects are considered in the order (1, 2, 3).

Greedy strategy #2: weights ordered in nondecreasing order; objects are considered in the order (3, 2, 1).
Greedy strategy #3: the ratios p/w ordered in nonincreasing order; objects are considered in the order (2, 3, 1). This strategy yields the optimal solution.
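Greedy strategy #3 can be sketched in Python. The instance at the bottom (M = 20, profits (25, 24, 15), weights (18, 15, 10)) is a standard textbook example standing in for the slides' numbers, which did not survive the transcript; with the p/w ordering it also considers the objects in the order (2, 3, 1).

```python
def fractional_knapsack(capacity, items):
    """items: list of (profit, weight). Returns (max profit, fractions x_i),
    taking objects greedily in order of nonincreasing p/w."""
    order = sorted(range(len(items)),
                   key=lambda i: items[i][0] / items[i][1],
                   reverse=True)
    x = [0.0] * len(items)
    profit, room = 0.0, capacity
    for i in order:
        p, w = items[i]
        take = min(1.0, room / w)   # whole object, or the fraction that fits
        x[i] = take
        profit += p * take
        room -= w * take
        if room == 0:
            break
    return profit, x

# Stand-in instance: M = 20, objects given as (profit, weight).
best, xs = fractional_knapsack(20, [(25, 18), (24, 15), (15, 10)])
print(best)  # 31.5
print(xs)    # [0.0, 1.0, 0.5]
```

Only the last object taken is ever fractional, which is exactly the structure (Case 2) that the optimality proof below relies on.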

Analysis: sort by the ratios p/w such that p1/w1 ≥ p2/w2 ≥ … ≥ pn/wn, and show that this ordering is the best.
Proof by contradiction: given some knapsack instance, suppose the objects are ordered so that p1/w1 ≥ p2/w2 ≥ … ≥ pn/wn, and let the greedy solution be X = (x1, x2, …, xn). We show that this ordering is optimal.
Case 1: every xi = 1; then the solution is clearly optimal.
Case 2: there is a smallest j such that xj < 1; then xi = 1 for i < j, xi = 0 for i > j, and Σ wi·xi = M (the greedy solution fills the knapsack exactly).

yk  xk  yk < xk Assume X is not optimal, and then there exists s.t. and Y is optimal examine X and Y, let yk be the 1st one in Y that yk  xk. yk  xk  yk < xk same Now we increase yk to xk and decrease as many of as necessary, so that the capacity is still M. Winter-2004 Young CS 331 D&A of Algo. Greedy

Let this new solution be Z = (z1, z2, …, zn), where zk = xk and Σ_{i>k} wi (yi − zi) = wk (zk − yk). Then
  Σ pi·zi = Σ pi·yi + (zk − yk) wk (pk/wk) − Σ_{i>k} (yi − zi) wi (pi/wi) ≥ Σ pi·yi,
since pk/wk ≥ pi/wi for all i > k.

So if Σ pi·zi > Σ pi·yi, then Y was not optimal, a contradiction; otherwise Σ pi·zi = Σ pi·yi, and we repeat the same process: Y can be transformed into X, so X is also optimal.

Job Sequencing with Deadlines
The problem: given n jobs, each job i has an integer deadline di > 0 and an integer profit pi > 0. For any job i, the profit pi is earned iff job i is completed by its deadline. Assume each job needs one unit of execution time and there is one machine available.

Goal: find a job processing sequence that maximizes the total profit.
Ex: n = 4, (p1, p2, p3, p4) = (100, 10, 15, 27), (d1, d2, d3, d4) = (2, 1, 2, 1).

  feasible sequence   total profit
  2, 1                10 + 100 = 110
  4, 1                27 + 100 = 127  ← optimal
  2, 3                10 + 15 = 25
  4, 3                27 + 15 = 42
  1, 3                100 + 15 = 115

The greedy solution: consider jobs in order of nonincreasing profits; maintain at each stage a set J of feasible jobs, i.e., jobs that can be run in some sequence in which all jobs in J meet their deadlines.
Example (time line 0-2):

  Job #:      1    4    3    2
  Profit:   100   27   15   10
  Deadline:   2    1    2    1

Question: how do we maintain a set of feasible jobs?
Claim: let J be a set of k jobs and S = (s1, s2, …, sk) a permutation of the jobs in J such that d_s1 ≤ d_s2 ≤ … ≤ d_sk. Then J is feasible iff the jobs in J can be processed in the order S without any job missing its deadline.

Proof:
(⇐) Obvious, by the definition of feasibility.
(⇒) If J is feasible, then there exists an ordering R = (r1, r2, …, rk) in which all jobs meet their deadlines, i.e., d_ri ≥ i. Assume R ≠ S, and let a be the first position at which S and R start to differ, i.e., sa ≠ ra.

Job sa appears later in R, say at position b > a (the first a − 1 entries of S and R coincide). Swap ra with rb (= sa) and call the new ordering R′. In R′ all jobs still meet their deadlines: every job other than ra and sa keeps its position, sa only moves earlier, and the only job in question is ra, now at position b; but d_ra ≥ d_sa ≥ b (the first inequality because S is sorted by deadline, the second because sa met its deadline at position b in R). So R′ is still feasible. Repeating this process shows: if J is feasible, J can be processed in order S.

Ex: to show how it works: (figure: a sequence of swaps transforming a feasible ordering into the deadline-sorted ordering S)

Algorithm:
procedure JS (D, n, J, k)
// D, n are input; J, k are output
// assume the jobs are sorted by profit in nonincreasing order; therefore no
// "profit" appears in the parameters, and D(1) is the deadline of the most
// profitable job
// assume D(i) ≥ 1 for all i
begin
  D(0) ← 0; J(0) ← 0;   // sentinel
  k ← 1; J(1) ← 1;       // the most profitable job is always included
  for i ← 2 to n do
    r ← k;
    while (D(J(r)) > D(i)) and (D(J(r)) ≠ r) do r ← r − 1;
    if (D(J(r)) ≤ D(i)) and (D(i) > r) then
      // insert i into J, and (r + 1) is the position for i
      for l ← k down to (r + 1) by −1 do J(l + 1) ← J(l);
      J(r + 1) ← i;
      k ← k + 1;
end;

JS's algorithm complexity is O(n²).
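The greedy job-sequencing idea can be sketched in Python. For clarity this version re-tests feasibility with the claim above (process the candidate set in deadline order and check every job finishes in time) rather than reproducing JS's in-place insertion; the data at the bottom is the slides' 4-job example.

```python
def job_sequencing(jobs):
    """jobs: list of (profit, deadline), deadline >= 1; each job takes one
    unit of time. Returns (total profit, chosen jobs in processing order)."""
    # consider jobs in order of nonincreasing profit
    order = sorted(range(len(jobs)), key=lambda i: jobs[i][0], reverse=True)
    J = []                                  # chosen jobs, kept sorted by deadline
    for i in order:
        trial = sorted(J + [i], key=lambda j: jobs[j][1])
        # feasible iff, processed in deadline order, every job ends in time
        if all(jobs[j][1] >= t + 1 for t, j in enumerate(trial)):
            J = trial
    return sum(jobs[j][0] for j in J), J

# The slides' example: (profit, deadline) for jobs 1..4 (0-indexed here).
jobs = [(100, 2), (10, 1), (15, 2), (27, 1)]
profit, seq = job_sequencing(jobs)
print(profit)                # 127
print([j + 1 for j in seq])  # [4, 1]  (job 4 first, then job 1)
```

The sort-and-check test costs O(n log n) per job, so this sketch is a bit slower than JS's O(n²) total, but it makes the feasibility claim directly visible in code.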