1
Alexander Kononov
Sobolev Institute of Mathematics, Siberian Branch of the Russian Academy of Sciences, Novosibirsk, Russia
2
[Map: Novosibirsk, Russia]
3
How to design a PTAS
Adapted from the novel by P. Schuurman and G. Woeginger
Directed by Alexander Kononov
4
The Harry Potter problem
5
"Could you find a schedule for my new project with the minimal cost?"
"We can do that! Real sorcerers can do everything! And we guess the cost of the project will be $1,000,000."
"Sounds great! Wonderful! Go ahead and determine this schedule! Tomorrow we start my new project!"
"We cannot do that by tomorrow…"
"Real sorcerers can do everything!"
"But finding the schedule is going to take us 23.5 years!"
6
"Tomorrow… $2,000,000."
"But… I want… $1,000,000!"
"Then… after 23.5 years."
"The day after tomorrow… $1,500,000."
"Three days from now… $1,333,333."
"What if I call you up exactly X days from now?"
"$1,000,000·(1 + 1/X)."
7
NP-hard problems
Almost all interesting combinatorial problems are NP-hard.
Nobody knows a polynomial-time exact algorithm for any NP-hard problem.
If there exists a polynomial-time exact algorithm for some NP-hard problem, then there exist polynomial-time exact algorithms for many NP-hard problems.
Most researchers believe that no polynomial-time exact algorithm for NP-hard problems exists.
We have to solve NP-hard problems approximately.
8
Approximation algorithm
An algorithm A is called a ρ-approximation algorithm for problem Π if, for all instances I of Π, it delivers a feasible solution with objective value A(I) such that A(I) ≤ ρ·OPT(I).
9
Polynomial time approximation scheme (PTAS)
An approximation scheme for problem Π is a family of (1+ε)-approximation algorithms A_ε for problem Π over all 0 < ε < 1.
A polynomial time approximation scheme for problem Π is an approximation scheme whose time complexity is polynomial in the input size.
10
Fully polynomial time approximation scheme (FPTAS)
A fully polynomial time approximation scheme for problem Π is an approximation scheme whose time complexity is polynomial in the input size and also polynomial in 1/ε.
11
Remarks
Typical running times of a PTAS: |I|^(2/ε), |I|^(2/ε^10), (|I|^(2/ε))^(1/ε).
Typical running times of an FPTAS: |I|^2/ε, |I|/ε^2, |I|^7/ε^3.
With respect to worst-case approximation, an FPTAS is the strongest possible result that we can derive for an NP-hard problem.
12
P2||C_max
J = {1,…, n} – jobs.
{M_1, M_2} – identical machines.
Job j has processing time p_j > 0 (j = 1,…, n).
Each job has to be executed by one of the two machines.
All jobs are available at time 0 and preemption is not allowed.
Each machine executes at most one job at a time.
The goal is to minimize the maximum job completion time (the makespan).
13
How to get a PTAS
Simplification of the instance I.
Partition of the output space.
Adding structure to the execution of an algorithm A.
[Diagram: Instance I → Algorithm A → Output A(I)]
14
Simplification of the instance I
The first idea is to turn a difficult instance into a more primitive instance that is easier to tackle. Then we use the optimal solution of the primitive instance to get a near-optimal solution of the original instance.
[Diagram: I → (simplify) → I#; solve I# to optimality; translate OPT(I#) back into an approximate solution App for I]
15
Approaches to simplification
Rounding
Merging
Cutting
Aligning
16
Rounding [figure: job lengths on a time axis, before rounding]
17
Rounding [figure: the same jobs after rounding]
18
Merging [figure: jobs before merging]
19
Merging [figure: the jobs after merging]
20
Cutting [figure: a job before cutting]
21
Cutting [figure: the job after cutting]
22
Aligning [figure: jobs before aligning]
23
Aligning [figure: each group of jobs is replaced by equally long jobs of the same total length: 6+6+5+5 = 4·5.5, 5+5+4+4 = 4·4.5, 4+4+3+2+2 = 5·3]
24
P2||C_max – recall the problem statement.
25
Lower bound
L := max{ max_j p_j , (1/2)·Σ_j p_j } ≤ OPT(I).
26
How to simplify an instance (I → I#)
Big = { j ∈ J | p_j ≥ εL }. The new instance I# contains all the big jobs from I.
Small = { j ∈ J | p_j < εL }. Let X = Σ_{j ∈ Small} p_j. The new instance I# contains ⌈X/εL⌉ jobs of length εL.
In other words, the small jobs in I are first glued together to give one long job of length X, and then this long job is cut into chunks of length εL.
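As a concrete illustration, the gluing-and-cutting step can be sketched in Python (a minimal sketch; the function name `simplify_instance` and the list-of-processing-times input format are my assumptions, not part of the slides):

```python
import math

def simplify_instance(p, eps):
    """Build the simplified instance I# from the processing times p.

    L = max(max_j p_j, sum(p)/2) is the lower bound on OPT from the
    previous slide.  Big jobs (p_j >= eps*L) are kept as they are; the
    small jobs are glued together into total length X and then cut into
    ceil(X / (eps*L)) chunks of length eps*L each.
    """
    L = max(max(p), sum(p) / 2)
    big = [x for x in p if x >= eps * L]
    X = sum(x for x in p if x < eps * L)
    chunks = math.ceil(X / (eps * L))
    return big + [eps * L] * chunks
```

For example, with p = [6, 6, 1, 1, 1, 1] and ε = 1/2 we get L = 8, so the four small unit jobs (X = 4) are merged into a single chunk of length εL = 4.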
27
I and I#
The optimal makespan of I# is fairly close to the optimal makespan of I:
OPT(I#) ≤ (1 + ε)·OPT(I).
28
Proof
Let X_i be the total size of all small jobs on machine M_i in an optimal schedule for I.
On M_i, leave every big job where it is in the optimal schedule, and replace the small jobs on M_i by ⌈X_i/εL⌉ chunks of length εL.
⌈X_1/εL⌉ + ⌈X_2/εL⌉ ≥ X/εL, so all chunks of I# are placed.
The load of M_i increases by at most ⌈X_i/εL⌉·εL – X_i ≤ (X_i/εL + 1)·εL – X_i = εL.
Hence OPT(I#) ≤ OPT(I) + εL ≤ (1 + ε)·OPT(I).
29
How to solve the simplified instance
How many jobs are there in instance I#?
p_j ≥ εL for all jobs in I#, and the total length of all jobs in I# is at most p_sum ≤ 2L.
Hence the number of jobs in I# is at most 2L/εL = 2/ε, which is independent of n.
We may simply try all possible schedules: there are at most 2^(2/ε) of them!
The running time is O(2^(2/ε)·n)!
30
How to translate the solution back
Let σ# be an optimal schedule for instance I#.
Let L_i# be the load of machine M_i in σ#.
Let B_i# be the total length of the big jobs on M_i in σ#.
Let X_i# be the total size of the small chunks on M_i in σ#.
Then L_i# = B_i# + X_i#.
31
σ#(I#) → σ(I)
Every big job is put onto the same machine as in schedule σ#.
Reserve an interval of length X_1# + 2εL on machine M_1 and an interval of length X_2# on machine M_2.
Pack small jobs into the reserved interval on machine M_1 until we meet some small job that does not fit in anymore.
Pack the remaining unpacked small jobs into the reserved interval on machine M_2.
32
PTAS
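Putting the pieces together, the simplification-based PTAS can be sketched as follows (a hedged Python sketch, names mine; it returns the optimal makespan of I#, which by the preceding slides is at most (1+ε)·OPT(I), and the translation back to a schedule for I costs at most another O(εL)):

```python
import itertools
import math

def ptas_p2cmax(p, eps):
    """Approximate makespan for P2||Cmax via instance simplification."""
    L = max(max(p), sum(p) / 2)                  # lower bound on OPT(I)
    big = [x for x in p if x >= eps * L]
    X = sum(x for x in p if x < eps * L)
    simplified = big + [eps * L] * math.ceil(X / (eps * L))

    # I# has at most 2/eps jobs, so brute force over all 2^(2/eps)
    # machine assignments is affordable for fixed eps.
    best = float('inf')
    for mask in itertools.product([0, 1], repeat=len(simplified)):
        loads = [0.0, 0.0]
        for job, machine in zip(simplified, mask):
            loads[machine] += job
        best = min(best, max(loads))
    return best
```

Note that the running time O(2^(2/ε)·n) is polynomial in n for every fixed ε, but explodes as ε → 0, which is exactly why this is a PTAS and not an FPTAS.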
33
Structuring the output
The main idea is to cut the output space (i.e. the set of feasible solutions) into lots of smaller regions over which the optimization problem is easy to approximate. Solving the problem separately for each smaller region and taking the best approximate solution over all regions will then yield a globally good approximate solution.
1. Partition.
2. Find representatives.
3. Take the best.
34
Partition [figure: the output space cut into districts; * marks the global optimal solution]
35
Find representatives [figure legend: the global optimal solution; an optimal solution in its district; a representative in its district]
36
Take the best [figure: the same legend; the best representative over all districts is selected]
37
P2||C_max – recall the problem statement.
38
How to define the districts
Big = { j ∈ J | p_j ≥ εL }, Small = { j ∈ J | p_j < εL }.
Let Φ be the set of feasible solutions for I. Every feasible solution σ ∈ Φ specifies an assignment of the n jobs to the two machines.
Define the districts Φ(1), Φ(2), … according to the assignment of the big jobs to the two machines: two feasible solutions σ_1 and σ_2 lie in the same district if and only if σ_1 assigns every big job to the same machine as σ_2 does.
39
Number of districts
The number of big jobs is at most 2L/εL = 2/ε.
The number of different ways of assigning these jobs to two machines is at most 2^(2/ε).
Hence the number of districts is at most 2^(2/ε)!
The number of districts depends only on ε and is independent of the input size!
40
How to find good representatives
The assignment of big jobs to the machines is fixed in Φ(l).
Let OPT(l) be the makespan of the best schedule in Φ(l), and let B_i(l) be the total length of the big jobs assigned to machine M_i.
T := max{ B_1(l), B_2(l) } ≤ OPT(l).
The initial workload of machine M_i is B_i(l). We assign the small jobs one by one to the machines; every time, a job is assigned to the machine with the currently smaller workload.
The resulting schedule σ(l) with makespan A(l) is our representative for the district Φ(l).
41
How close is A(l) to OPT(l)?
1. If A(l) = T, then A(l) = OPT(l).
2. Let A(l) > T. Consider the machine with the higher workload in the schedule σ(l). The last job that was assigned to this machine is a small job, of length at most εL. At the moment when this small job was assigned, the workload of the machine was at most p_sum/2. Hence
A(l) ≤ (p_sum/2) + εL ≤ (1 + ε)·OPT ≤ (1 + ε)·OPT(l).
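The three steps of the output-structuring approach (partition into districts, find a representative per district, take the best representative) can be sketched in Python (a sketch under my naming assumptions, not the slides' own code):

```python
import itertools

def districts_p2cmax(p, eps):
    """(1+eps)-approximate makespan by enumerating big-job assignments."""
    L = max(max(p), sum(p) / 2)               # lower bound on OPT
    big = [x for x in p if x >= eps * L]
    small = [x for x in p if x < eps * L]

    best = float('inf')
    # One district per assignment of the big jobs to the two machines.
    for mask in itertools.product([0, 1], repeat=len(big)):
        loads = [0.0, 0.0]
        for job, machine in zip(big, mask):
            loads[machine] += job
        # Representative: greedily put each small job on the machine
        # with the currently smaller workload.
        for job in small:
            loads[loads.index(min(loads))] += job
        best = min(best, max(loads))          # take the best representative
    return best
```

Since the district containing the global optimum has a representative of makespan at most (1+ε)·OPT, the minimum over all districts inherits the same guarantee.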
42
Structuring the execution of an algorithm
The main idea is to take an exact but slow algorithm A and to interact with it while it is working.
If the algorithm accumulates a lot of auxiliary data during its execution, then we may remove part of this data and clean up the algorithm's memory.
As a result, the algorithm becomes faster.
43
P2||C_max – recall the problem statement.
44
Code of a feasible solution
Let σ_k be a feasible schedule of the first k jobs {1,…, k}.
We encode a feasible schedule σ_k with machine loads L_1 and L_2 by the two-dimensional vector [L_1, L_2].
Let V_k be the vector set corresponding to all feasible schedules of the first k jobs {1,…, k}.
45
Dynamic programming
Input: (J = {1,…, n}, p: J → Z_+)
1) Set V_0 = {[0,0]}, i = 0.
2) While i < n do:
   for every vector [x,y] ∈ V_i put [x + p_{i+1}, y] and [x, y + p_{i+1}] in V_{i+1};
   i := i + 1.
3) Find the vector [x*, y*] ∈ V_n that minimizes the value max{x, y}.
Output: [x*, y*]
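A direct Python transcription of this dynamic program (assuming integer processing times given as a list):

```python
def dp_p2cmax(p):
    """Exact minimum makespan on two identical machines.

    V holds the set of reachable load vectors [L1, L2] after the
    jobs processed so far; each job extends every vector in two ways.
    """
    V = {(0, 0)}                                        # V_0
    for pj in p:
        V = ({(x + pj, y) for (x, y) in V}
             | {(x, y + pj) for (x, y) in V})           # V_{i+1}
    return min(max(x, y) for (x, y) in V)               # best makespan
```

Using a set automatically merges duplicate vectors, which is what keeps |V_i| bounded by (p_sum)² rather than 2^i.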
46
Running time
The coordinates of all vectors are integers in the range from 0 to p_sum.
The cardinality of every vector set V_i is bounded from above by (p_sum)².
The total number of vectors determined by the algorithm is at most n·(p_sum)².
The running time of the algorithm is O(n·(p_sum)²).
The size |I| of the input I satisfies |I| ≥ log(p_sum) = const·ln(p_sum).
Hence the running time of the algorithm is not polynomial in the size of the input!
47
How to simplify the vector sets
[Figure: the square [1, p_sum] × [1, p_sum] is subdivided by the geometric grid 1, Δ, Δ², Δ³, …, Δ^K in each coordinate.]
Δ = 1 + (ε/2n).
K = ⌈log_Δ(p_sum)⌉ = ⌈ln(p_sum)/ln Δ⌉ ≤ ⌈((1 + 2n)/ε)·ln(p_sum)⌉.
48
Trimmed vector set
[Figure: from every box of the grid, at most one vector of V_i is kept; all other vectors in the same box are discarded.]
Δ = 1 + (ε/2n), K = ⌈log_Δ(p_sum)⌉ ≤ ⌈((1 + 2n)/ε)·ln(p_sum)⌉.
49
Algorithm FPTAS
Input: (J = {1,…, n}, p: J → Z_+)
1. Set V_0# = {[0,0]}, i = 0.
2. While i < n do:
   for every vector [x,y] ∈ V_i# put [x + p_{i+1}, y] and [x, y + p_{i+1}] in V_{i+1};
   i := i + 1;
   transform V_i into the trimmed set V_i#.
3. Find the vector [x*, y*] ∈ V_n# that minimizes the value max{x, y}.
Output: [x*, y*]
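A hedged Python sketch of the trimmed dynamic program (names mine; one vector is kept per Δ-box, which by the lemma on the following slides degrades the answer by a factor of at most Δ^n ≤ 1+ε):

```python
import math

def fptas_p2cmax(p, eps):
    """Makespan within a (1+eps) factor of optimum for P2||Cmax."""
    n = len(p)
    delta = 1 + eps / (2 * n)

    def box(v):
        # Index k of the geometric box [delta^k, delta^(k+1)) containing v.
        # Floating-point log is an implementation convenience here.
        return 0 if v == 0 else int(math.log(v, delta))

    V = {(0, 0)}
    for pj in p:
        V = ({(x + pj, y) for (x, y) in V}
             | {(x, y + pj) for (x, y) in V})
        trimmed = {}
        for (x, y) in V:
            trimmed.setdefault((box(x), box(y)), (x, y))  # one vector per box
        V = set(trimmed.values())
    return min(max(x, y) for (x, y) in V)
```

Every surviving vector still encodes a real schedule, so the returned value never falls below OPT; the trimming only risks overshooting, by at most the factor (1 + ε/2n)^n ≤ 1 + ε.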
50
Running time of FPTAS
The trimmed vector set V_i# contains at most one vector in each box, and there are K² boxes.
Hence the running time of the FPTAS is O(n·K²), where n·K² ≤ n·(((1 + 2n)/ε)·ln(p_sum))².
Algorithm FPTAS has a time complexity that is polynomial in the input size and in 1/ε.
51
V_i and V_i#
For every vector [x,y] ∈ V_i there exists a vector [x#, y#] ∈ V_i# such that x# ≤ Δ^i·x and y# ≤ Δ^i·y.
52
The worst-case behavior of the FPTAS
By the relation between V_n and V_n#, the makespan returned by the FPTAS is at most Δ^n·OPT = (1 + ε/2n)^n·OPT ≤ e^(ε/2)·OPT ≤ (1 + ε)·OPT for 0 < ε < 1.
53
Final remarks
Did we consider all approaches? No, of course not!
Approximation Algorithms for NP-hard Problems, edited by D. Hochbaum, PWS Publishing Company, 1997.
V. Vazirani, Approximation Algorithms, Springer-Verlag, Berlin, 2001.
P. Schuurman, G. Woeginger, Approximation Schemes – A Tutorial, chapter of the book Lectures on Scheduling, to appear in 2008.