1 BECO 2004. When can one develop an FPTAS for a sequential decision problem? (with apologies to Gerhard Woeginger) James B. Orlin, MIT, working jointly with Mohamed Mostagir

2 Fully Polynomial Time Approximation Scheme (FPTAS)
INPUT. A sequential decision problem with n stages or decisions. Also, a given accuracy ε.
OUTPUT. A solution that is guaranteed to be within ε of optimal.
RUNNING TIME. Polynomial in the size of the problem and in 1/ε.

3 Goal of this talk
Present Branch and Dominate: a generic method for creating FPTASes.
- Starting point for this research: Woeginger [2000].
- Applies to all problems considered by Woeginger.
- Generalizes results to multiple criteria, as per Angel, Bampis, and Kononov [2003].
- Extends to many dynamic lot-sizing problems and many new problems.

4 Overview of what comes next
Two examples:
- knapsack
- a more complex machine scheduling problem
A generalization to many other FPTASes.

5 Example 1. The Knapsack Problem

6 The Knapsack Problem as a Decision Problem
[Figure: an enumeration tree that branches on x1 = 1 or x1 = 0, then x2 = 1 or x2 = 0, then x3 = 1 or x3 = 0 at successive stages.]

7 Domination
Let x denote a decision node at stage j. Its state is ⟨v, w⟩, where
- v = c1 x1 + … + cj xj
- w = a1 x1 + … + aj xj
x is infeasible if w > b.
Let x' have state ⟨v', w'⟩ at stage j.
Node x dominates node x' at stage j if v ≥ v' and w ≤ w'.

8 A pseudo-polynomial time algorithm
Branch and Dominate (B & D):
- Expand the enumeration tree one stage at a time.
- Eliminate any infeasible nodes.
- Whenever one node dominates another, eliminate the dominated node. (Do this sequentially.)
- The optimum corresponds to the stage-n node with greatest value.
Theorem. Branch and Dominate is pseudo-polynomial for the knapsack problem.
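To make the method concrete, here is a minimal Python sketch of Branch and Dominate for knapsack, under the assumption that a state is a (value, weight) pair for a partial solution; the function name and representation are illustrative, not from the talk.

```python
def branch_and_dominate(values, weights, capacity):
    """Return the optimal knapsack value via stage-by-stage domination."""
    states = [(0, 0)]  # stage 0: the empty partial solution as (value, weight)
    for c, a in zip(values, weights):
        # Branch: x_j = 0 keeps each state; x_j = 1 adds (c, a).
        expanded = states + [(v + c, w + a) for (v, w) in states]
        # Eliminate infeasible nodes (the hard constraint on weight).
        feasible = [(v, w) for (v, w) in expanded if w <= capacity]
        # Eliminate dominated nodes: sort by weight ascending, and keep a
        # state only if its value beats every lighter-or-equal state kept so far.
        feasible.sort(key=lambda s: (s[1], -s[0]))
        states, best_v = [], -1
        for v, w in feasible:
            if v > best_v:
                states.append((v, w))
                best_v = v
    return max(v for v, _ in states)

# The example from slides 9-10:
print(branch_and_dominate([6, 8, 14], [7, 5, 12], 100))  # prints 28
```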

9 A simple example
Maximize 6x1 + 8x2 + 14x3 + …
Subject to 7x1 + 5x2 + 12x3 + … ≤ 100

10 The Knapsack Problem as a Decision Problem
Max 6x1 + 8x2 + 14x3 + … s.t. 7x1 + 5x2 + 12x3 + … ≤ 100
[Figure: the enumeration tree with each node labeled by its (value, weight) state.
Stage 0: (0,0).
Stage 1: x1 = 1 gives (6,7); x1 = 0 gives (0,0).
Stage 2: (14,12), (6,7), (8,5), (0,0).
Stage 3 (partially shown): (28,24), (22,17), (14,12), (8,5), (6,7), (0,0), …]

11 ε-domination
Node x ε-dominates node x' at stage j if v ≥ (1 - ε)v' and w ≤ w'.
The number of undominated states at each stage is O((1/ε) log(n Cmax)).
Theorem. Branch and δ-dominate with δ = ε/n is an FPTAS for the knapsack problem. The running time is O(n^2/ε).
Note: we did not use w ≤ (1 + ε)w', because there is a hard constraint on knapsack weights, and we cannot approximate it. For example, a solution with value $28,000 and weight 1,201 may be infeasible, while one with value $27,800 and weight 1,200 is feasible.
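Under ε-domination, only the pruning step of the sketch above changes. A hedged sketch, reusing the (value, weight) states from the previous block and writing delta for the per-stage tolerance ε/n; the name prune_eps is an assumption:

```python
def prune_eps(expanded, capacity, delta):
    """Keep states so that every discarded state is eps-dominated by a kept one."""
    feasible = [(v, w) for (v, w) in expanded if w <= capacity]
    feasible.sort(key=lambda s: (s[1], -s[0]))  # weight ascending, value descending
    kept, best_v = [], -1.0
    for v, w in feasible:
        # A kept state with weight <= w and value best_v eps-dominates (v, w)
        # whenever best_v >= (1 - delta) * v, so keep only clear improvements.
        if best_v < (1 - delta) * v:
            kept.append((v, w))
            best_v = v
    return kept
```

Each kept value exceeds the previous one by a factor of more than 1/(1 - delta), so the number of kept states per stage grows only logarithmically in the maximum value, matching the bound on the slide.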

12 Outline of proof of ε-optimality
Let x = (x1, x2, …, xn) be the optimal solution.
Let y_j be a partial solution at stage j, for j = 0 to n:
- y_0 = ∅;
- for each j = 1 to n, y_j = (y_{j-1}, x_j), or else y_j is the solution that ε-dominates (y_{j-1}, x_j).
Let x^j = (x1, …, xj). Then y_j (jε)-dominates x^j for each j.
This is the standard construction.

13 [Figure: the standard construction on the enumeration tree. The prefixes x^0, x^1, …, x^5 of the optimal solution are traced alongside surviving nodes y^0, y^1, …, y^5; wherever a prefix of the optimum is ε-dominated, the construction follows the dominating node instead (y^2 ε-dominates x^2; y^4 ε-dominates w^4; y^5 ε-dominates w^5). Total accumulated error: at most nε.]

14 List Scheduling Problems
Scheduling problems in which jobs are constrained to be assigned in sequential order.
[Figure: a list of jobs assigned in order to Machines 1, 2, and 3.]
Finding an optimal list schedule:
- finds the optimal solution for some problems, e.g., minimizing the sum of completion times on K machines;
- can be used as a heuristic for an NP-hard problem.

15 A 2-machine List Scheduling Problem
- same number of jobs on each machine;
- processing-time bounds on machine 1;
- Cj = completion time of job j; Cj is defined correctly for j = 1 to n;
- x ∈ {0, 1}^n.

16 Stage j: after j jobs have been assigned
- M1(j): processing time on machine 1 (at most bj)
- M2(j): processing time on machine 2
- d(j): number of jobs on machine 1 minus number of jobs on machine 2
- z(j): cumulative objective function
Each partial solution x^k has an associated state vector.

17 On moving from Stage j to Stage j+1
Example: moving from Stage 11 to Stage 12, with state components (M1, M2, d, z). Suppose p12 = 3 and the decision is x12 = 1. Then
F11(27, 35, 2, 500; 1) = ⟨30, 35, 3, 530⟩
current state + decision → state at next stage.
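A sketch of the transition function just described, assuming, as the earlier slides suggest, that z accumulates completion times; all names are illustrative:

```python
def transition(state, p, x):
    """Map (current state, processing time p, machine choice x) to the next state."""
    M1, M2, d, z = state
    if x == 1:                # assign the job to machine 1
        M1, d = M1 + p, d + 1
        z += M1               # the objective accumulates the job's completion time
    else:                     # assign the job to machine 2
        M2, d = M2 + p, d - 1
        z += M2
    return (M1, M2, d, z)

# Reproduces the slide: F_11(27, 35, 2, 500; 1) = (30, 35, 3, 530)
print(transition((27, 35, 2, 500), 3, 1))
```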

18 Polynomially Bounded (PB) Components
A component of the state vector is called polynomially bounded (PB) if the number of different values it can take is polynomially bounded in the size of the input.
Example: d(j) takes integer values between -n and n, so the component for d is PB.

19 Monotone Components
A non-PB component is called monotone if, for each stage j, replacing its current value i by i' > i cannot decrease any monotone component of Fj, and the PB components stay unchanged in Fj(S).
Example: components 1, 2, and 4 (M1, M2, and z) are monotone: F11(27, 35, 2, 500; 1) = ⟨30, 35, 3, 530⟩.

20 Domination at Stage j
State ⟨M1, M2, d, z⟩ dominates state ⟨M'1, M'2, d', z'⟩ if
- M1 ≤ M'1, M2 ≤ M'2, and z ≤ z' (≤ for the monotone components), and
- d = d' (= for the PB component).

21 Theorem. B & D is a pseudo-polynomial time algorithm for a list scheduling problem if:
1. there are a fixed number of components of the state vector;
2. each component is either monotone or PB;
3. the objective is to be minimized;
4. any constraint on a monotone component is a strict upper bound;
5. all other constraints involve only PB components;
6. additional technical conditions hold (e.g., all functions can be computed in polynomial time).
Proof. The number of undominated state vectors at each stage is pseudo-polynomial.

22 Moving from Pseudo-polynomial to an FPTAS
We want conditions under which Branch and ε-dominate leads to an FPTAS.
We need to replace domination by ε-domination in all except one of the monotone components.

23 Good and Bad Monotone Components
A monotone component (say component 1) is good if all of the following conditions are satisfied:
1. It has no strict upper bound.
2. It cannot decrease from stage to stage. (Non-example: suppose we keep track of M1 - M2.)
3. Fj is not overly sensitive to small changes in s1; e.g., Fj(s1(1 + ε), s2, …, sk) ≤ (1 + nε) Fj(s1, s2, …, sk).
If it is not good, it is bad.

24 On bad monotone components
Condition 1. Any monotone component on which there is a hard upper bound is bad.
Example: the processing time on machine 1 at stage j is at most bj. So M1 is bad.
Small relative changes in the value of the component can mean the difference between the constraint being satisfied or not.

25 On bad monotone components
Condition 2. Suppose s1 could decrease from stage to stage. In this case, a small relative change in the value of s1 at some stage could have a very large impact on states at later stages, and so it is bad.
Example: suppose we keep track of |M1 - M2|.

26 On Bad Monotone States
Condition 3. A state is bad if a small change in its value can cause a large change in the value at the next stage for some monotone state.
Example: suppose that we are minimizing total tardiness. At Stage 11 the state is (M1, M2, d, tardiness), where M1 and M2 are bad monotone, d is PB, and tardiness is good monotone.

27 ε-Domination at Stage j
State ⟨M1, M2, d, z⟩ ε-dominates state ⟨M'1, M'2, d', z'⟩ if
- M1 ≤ M'1 and M2 ≤ M'2 (bad monotone: plain ≤),
- d = d' (PB: =), and
- z ≤ (1 + ε) z' (good monotone: ≤ up to a (1 + ε) factor).
ε-domination is the same as domination except for the (1 + ε) term on the good monotone states.
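A generic sketch of this ε-domination test, assuming each state component is tagged as good monotone, bad monotone, or PB; the classification tuple below follows slide 26, and the helper name is an assumption:

```python
def eps_dominates(s, t, kinds, eps):
    """True if state s eps-dominates state t, judged component by component."""
    for a, b, kind in zip(s, t, kinds):
        if kind == "pb" and a != b:               # PB components must match exactly
            return False
        if kind == "bad" and a > b:               # bad monotone: plain <=
            return False
        if kind == "good" and a > (1 + eps) * b:  # good monotone: <= up to (1+eps)
            return False
    return True

# The classification from slide 26: (M1, M2, d, tardiness).
kinds = ("bad", "bad", "pb", "good")
print(eps_dominates((27, 35, 2, 500), (28, 35, 2, 496), kinds, 0.01))  # True
```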

28 Theorem. If a list scheduling problem satisfies conditions 1-6 from before, and if at most one monotone component is bad, then Branch and ε-dominate can be used to create an FPTAS.
Note: use ε-domination on all good monotone components, and use domination on the remaining monotone component.

29 A list scheduling problem with outsourcing
Schedule jobs on a single machine.
- Each job j has a due date dj and a maximum tardiness Lj.
- Job j can be outsourced at a cost of cj.
- At most K jobs can be outsourced.
Objective: minimize the weighted sum of the tardinesses of the jobs plus the outsourcing costs, subject to: job j must be completed before dj + Lj.
States at stage j: processing time on the machine (bad monotone), number outsourced (PB), cumulative objective function (good monotone).

30 Theorem. Branch and ε-dominate can be used to create an FPTAS for the list scheduling problem on the previous slide.
Note: if we do not require a list schedule, the previous problem is strongly NP-hard.

31 References
Contrast with Woeginger [2000]:
- very similar in essence;
- fewer abstractions here; more direct focus on the components of the transition function;
- the focus on list scheduling helps to clarify the contribution;
- different in the generalizations that follow.

32 Other references
Rich history of references on FPTASes.
Pioneers (1970s): Horowitz & Sahni, Ibarra & Kim, Sethi, Garey, Johnson, Babat, Gens, Levner, Lawler, Lenstra, Rinnooy Kan, and more.
61 references in the ACM digital guide for FPTAS.
Domination in branch and bound is an old idea, but I don't have early references.

33 Multi-criteria FPTAS
Suppose we have two or more objective criteria.
We say that (c, b) ε-dominates (c', b') if
1. c ≤ (1 + ε) c', and
2. b ≤ (1 + ε) b'.
A Pareto set consists of a maximal set of undominated solutions. The size of an ε-approximate Pareto set is polynomial in the size of the data and in 1/ε for any fixed number of objectives.
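A small sketch of how this bicriteria ε-domination rule filters a set of (c, b) outcomes into an approximate Pareto set; the function is illustrative, not from the talk:

```python
def approx_pareto(points, eps):
    """Filter (c, b) outcomes, discarding any point eps-dominated by a kept one."""
    kept = []
    for c, b in sorted(points):
        if not any(c2 <= (1 + eps) * c and b2 <= (1 + eps) * b for c2, b2 in kept):
            kept.append((c, b))
    return kept

print(approx_pareto([(10, 50), (10.05, 49.9), (30, 20), (60, 5)], 0.01))
# keeps (10, 50), (30, 20), (60, 5); (10.05, 49.9) is eps-dominated by (10, 50)
```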

34 Multi-criteria FPTAS

35 Multiple Criteria FPTAS
Theorem. If a sequential problem is solvable with an FPTAS for one good objective using the results of Theorem 1, then the multiple-criteria version is solvable with an FPTAS for multiple good objectives.
Proof. Same argument as in the single-criterion case.

36 References
Multi-criteria FPTAS:
- Hansen [1979]
- Orlin [1981]
- Warburton [1987]
- Safer and Orlin [1995]
- Papadimitriou and Yannakakis [2000, 2001]
- Angel, Bampis, and Kononov [2003]

37 Machine Scheduling with "crashing"
A 2-machine scheduling problem. Minimize the makespan.
- Budget B.
- Processing job j uses up some of the budget.
- The processing time of job j is pj(b), where $b is the budget used: if b' > b, then pj(b') ≤ pj(b).
Note: there are an exponential number of possible decisions to make at stage j. We will modify B & D.

38 Machine Scheduling with "crashing"
States at stage j: processing time on machine 1 (good monotone), processing time on machine 2 (good monotone), budget used up (bad monotone).
Decisions at stage j: place job j on machine 1 or 2; how much budget to allocate (exponentially many choices).

39 Branching at stage j
b = budget allocated.
Budget allocation as a binary decision process: alongside the branch on xj = 1 or xj = 0, the state carries an interval of budgets, which is repeatedly halved: [0, B] splits into [0, B/2] and [B/2, B]; these split into [0, B/4], [B/4, B/2], [B/2, 3B/4], [3B/4, B]; and so on.
Example: b ∈ [0, B/2].

40 Domination Rule
Stage j, state S, job j assigned to machine 1, budget assigned between L and U.
Suppose that the state Fj(S, 1, L) ε-dominates the state Fj(S, 1, U). Then allocate a budget of L, and stop branching.
That is, during the branching for budget allocation, stop branching at a node denoted [L, U] if allocating the budget L gives a state that ε-dominates the state obtained by allocating U.
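A hedged sketch of this stopping rule on the binary budget tree, assuming integer budgets, a transition function F(state, machine, budget), and a domination predicate such as the eps_dominates test sketched earlier (with its extra arguments bound); the unit-granularity stop and all names are assumptions added for a self-contained sketch:

```python
def branch_budget(state, machine, L, U, F, dominates, out):
    """Recursively halve [L, U]; stop where allocating L dominates allocating U."""
    low, high = F(state, machine, L), F(state, machine, U)
    if U - L <= 1 or dominates(low, high):
        out.append(low)   # allocate the budget L and stop branching at this node
        return
    mid = (L + U) // 2
    branch_budget(state, machine, L, mid, F, dominates, out)
    branch_budget(state, machine, mid, U, F, dominates, out)
```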

41 Lemma and Theorem
Lemma. In the binary expansion of the budget, the number of nodes starting from any state is polynomial in the size of the problem and in 1/ε.
Theorem. Branch and ε-dominate is an FPTAS whenever it satisfies conditions 1 to 6 from before and the number of bad monotone states is at most 1.

42 Lot Sizing
The previous analysis extends to dynamic lot-sizing: Branch and ε-dominate gives an FPTAS, even in the multiple-criteria case.
Extends work by:
- Dada and Orlin [1981]
- Safer and Orlin [1995]
- Van Hoesel and Wagelmans [2001]
- Safer, Orlin, and Dror [2003]

43 On FPTASes
List scheduling problems:
- polynomially bounded states
- monotone states, both good and bad
- constraints
- ε-domination
Extends and simplifies research by Woeginger.