
1 M Tech Project – First Stage
Improving Branch-and-Price Algorithms for Solving the 1D Cutting Stock Problem
Soumitra Pal [05305015]

2 Agenda
- Cutting stock problem and its formulation
- Generic branch-and-price algorithm
- Implementations by Vance [1994] and others
- Work in the next stage
- Conclusion

3 Cutting stock problem
- Bigger rolls (raws) are available
- Orders for smaller rolls (items) have to be cut
- Minimize the number of raws used

4 IP formulation
- λ_j denotes the number of rolls cut using pattern j
- a_ij denotes the number of times item i is cut in pattern j
- b_i is the order (demand) for item i
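
The model itself appears on the slide only as an image; from the definitions above it is presumably the standard Gilmore-Gomory pattern formulation:

    \min \sum_j \lambda_j
    \text{s.t.}\quad \sum_j a_{ij}\,\lambda_j \ge b_i \qquad \forall i
    \lambda_j \ge 0 \text{ and integer} \qquad \forall j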

5 Solution using branch-and-price
- Branch-and-bound technique of integer programming
- The bound at each node is obtained from the LP relaxation, which is solved by column generation

6 Generic branch-and-price algorithm
- Generate an initial heuristic solution and set it as the incumbent
- Make the root node of the B&B tree and enter it into the queue Q
- While an unexamined node exists in Q:
  - if the node is undiscovered, calculate its lower bound LB by column generation and mark it discovered
  - if LB comes from a feasible (integral) solution X and X < incumbent, update the incumbent to X
  - if LB ≥ incumbent, fathom the node (do nothing further)
  - otherwise branch: create two undiscovered, unexamined child nodes and enter them into Q
  - mark the node examined
- Stop; the incumbent is the solution
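
A minimal Python sketch of this loop (not the project's code; the helpers first_fit_decreasing, make_root_node, solve_lp_by_column_generation, cost and branch are hypothetical, and the discovered/examined bookkeeping of the flowchart is collapsed into evaluating each node when it is dequeued):

    import math
    from collections import deque

    def branch_and_price(instance):
        # Initial heuristic solution becomes the incumbent.
        incumbent = first_fit_decreasing(instance)             # hypothetical helper
        queue = deque([make_root_node(instance)])               # unexamined nodes

        while queue:                                            # unexamined node exists?
            node = queue.popleft()
            # Lower bound at this node from the LP relaxation (column generation).
            lp_value, lp_solution, is_integral = solve_lp_by_column_generation(node)
            lower_bound = math.ceil(lp_value)                   # objective (rolls) is integral

            if is_integral and lp_value < cost(incumbent):
                incumbent = lp_solution                         # update incumbent
            if lower_bound >= cost(incumbent):
                continue                                        # fathom: cannot beat incumbent
            left, right = branch(node, lp_solution)             # branch: two child nodes
            queue.extend([left, right])

        return incumbent                                        # optimal once the queue empties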

7 Column generation in brief
- Start with a few initial feasible columns
- Solve the restricted master problem
- Use the dual solution of the master problem as the profits of a knapsack subproblem to generate a new, better column
- Continue as long as better columns can be found
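
A sketch of this loop in Python, assuming hypothetical routines solve_restricted_master (returns the LP value and the duals) and solve_knapsack (returns the best dual-weighted value and the corresponding pattern). Since every column of the master has cost 1, a pattern improves the master only if its dual value exceeds 1:

    def column_generation(widths, demands, roll_width, initial_columns):
        columns = list(initial_columns)            # few initial feasible columns
        while True:
            # Restricted master LP over the current columns.
            lp_value, duals = solve_restricted_master(columns, demands)        # hypothetical
            # Pricing: knapsack with the duals as item profits.
            best_value, new_column = solve_knapsack(widths, duals, roll_width)  # hypothetical
            if best_value <= 1 + 1e-9:
                break                              # no column with negative reduced cost
            columns.append(new_column)
        return lp_value, columns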

8 BAP implementation variations
- Initial heuristic solution
- Branching rule
- Node selection rule
- Bounds

9 Branching rules: desirable properties
- Excludes the current fractional solution
- Guarantees an integer feasible solution after a finite number of steps
- Branching information must be encodable in the subproblem
- Creates subtrees of roughly equal size
- Keeps the master and the subproblem tractable

10 Conventional branching
- Excludes the fractional solution
- Terminates in a finite number of steps, since there are finitely many variables
- Encoding of the branching information is explained next
- Branch on a variable with fractional value λ_p = α: left branch λ_p ≤ ⌊α⌋, right branch λ_p ≥ ⌈α⌉

11 Subproblem modification
- On the right branch, branching is equivalent to reducing the demand vector and solving the residual problem
- No need to modify the subproblem
- On the left branch, the column must not be regenerated
- This is done by keeping a forbidden list in the subproblem
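
Concretely, under the usual reading of this rule (an assumption; the slide does not spell it out): on the branch λ_p ≥ ⌈α⌉, fix ⌈α⌉ copies of pattern p and solve the residual instance with demand

    b_i' = b_i - \lceil\alpha\rceil\, a_{ip}, \qquad i = 1, \dots, m,

with an unmodified pricing problem, while on the branch λ_p ≤ ⌊α⌋ the column a_p is added to the subproblem's forbidden list.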

12 Problems with conventional branching
- The solution space is not divided equally between the two branches
- Here the solution space is, equivalently, the set of all possible columns
- On the right branch, the number of possible new columns is reduced
- The left branch excludes only one column
- The subproblem may become difficult

13 Branching for BCS (binary cutting stock, demand = 1)
- Branching rule by Vance [1994]: branch on a pair of items l and m, forcing them into the same pattern on one branch and into different patterns on the other (detailed on the next slide)

14 Branching in BCS (2)
- The left branch must include items l and m together
- This is handled by replacing the two items with a single item of combined width
- On the right branch, at most one of them may be included
- This is handled by adding edge constraints to the subproblem
- The resulting subspaces are of roughly equal size

15 Solving the subproblem
- With no edge constraints, solve using the Horowitz-Sahni algorithm
- With non-overlapping edge constraints, solve using a modified HS with Johnson-Padberg bounds
- With overlapping edge constraints, use a general IP solver

16 Horowitz-Sahni algorithm
- Backtracking with bounds
- Items ordered by decreasing profit density (profit/weight)
- In a forward move, try inserting the next item
- Compute an upper bound U; if U > current best, continue the forward move
- Otherwise backtrack (remove the last inserted item) and repeat
- Update the current best when the last item has been considered
- Stop when no more backtracking is possible
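
A compact recursive Python sketch of the same idea (the original algorithm is iterative; this version keeps its ingredients: the decreasing-density ordering, the forward insertion, and a fractional Dantzig-style upper bound):

    def knapsack_branch_and_bound(profits, weights, capacity):
        # Sort items by decreasing profit density, as in Horowitz-Sahni.
        order = sorted(range(len(profits)),
                       key=lambda i: profits[i] / weights[i], reverse=True)
        p = [profits[i] for i in order]
        w = [weights[i] for i in order]
        n, best = len(p), [0]

        def upper_bound(i, cap, value):
            # Greedily fill the remaining capacity, last item taken fractionally.
            ub = value
            while i < n and w[i] <= cap:
                ub += p[i]; cap -= w[i]; i += 1
            if i < n:
                ub += p[i] * cap / w[i]
            return ub

        def search(i, cap, value):
            if i == n:
                best[0] = max(best[0], value)      # all items considered: update best
                return
            if upper_bound(i, cap, value) <= best[0]:
                return                             # bound test fails: backtrack
            if w[i] <= cap:
                search(i + 1, cap - w[i], value + p[i])   # forward move: insert item i
            search(i + 1, cap, value)              # alternative: skip item i

        search(0, capacity, 0)
        return best[0]

    # e.g. knapsack_branch_and_bound([10, 7, 4], [4, 3, 2], 6) returns 14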

17 Johnson-Padberg algorithm
- Solves the knapsack problem with special ordered sets (SOS)
- An SOS is a set of variables of which at most one can be set to 1
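
Written out, the model in question (with S_1, ..., S_K denoting the special ordered sets) is

    \max \sum_i p_i x_i \quad \text{s.t.}\quad \sum_i w_i x_i \le W, \quad \sum_{i \in S_k} x_i \le 1 \;\; (k = 1, \dots, K), \quad x_i \in \{0, 1\}.

The fractional assignment x_1 = W/w_1 on the next slide suggests the procedure computes a bound from the LP relaxation of this model, which is what the modified HS algorithm later uses.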

18 Johnson-Padberg algorithm (2)
- Order the SOSs by their maximum profit/weight ratio
- If w_1 ≥ W, set the corresponding x_1 = W/w_1 and all other x_i = 0
- Otherwise, RECORD x_i for the minimum-weight item of S_1
- Remove from S_1 all items having profit less than p_i; update the remaining items of S_1 as p_j -= p_i, w_j -= w_i, and set W -= w_i
- Repeat
- If multiple items of the same SOS are RECORDED, set the variables as shown in the example

19 JP algorithm Example

20 Modified HS
- Use the SOS ordering
- In a forward move, at most one item from each SOS can be inserted
- The bound is calculated using JP
- On backtracking, the other items in the same SOS are considered next

21 Branching for the general CSP
- Branch on a set of variables
- Needs further exploration

22 Bounding
- Bounding can be used to avoid the tailing-off effect
- When the following condition is satisfied, column generation can be stopped
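
The condition itself appears on the slide as a formula not reproduced in this transcript. A commonly used choice for the cutting stock master, where every column has cost 1, is Farley's bound: with z_RMP the current restricted master optimum and z_SP the optimal pricing (knapsack) value,

    z_{LP} \ge \frac{z_{RMP}}{z_{SP}},

so column generation at a node can be stopped once \lceil z_{RMP}/z_{SP} \rceil = \lceil z_{RMP} \rceil, since the rounded-up node bound can no longer improve. Whether this is the exact condition meant here is an assumption.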

23 Initial heuristic solution
- First Fit Decreasing (FFD)
- Items in decreasing size are fit into existing rolls
- If an item cannot be fit, a new roll is used
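
A small Python sketch of FFD (illustrative only; the representation of rolls and items is an assumption):

    def first_fit_decreasing(item_widths, roll_width):
        rolls = []                         # remaining capacity of each opened roll
        patterns = []                      # items placed in each roll
        for width in sorted(item_widths, reverse=True):   # items in decreasing size
            for k, remaining in enumerate(rolls):
                if width <= remaining:     # fits in the first existing roll that can take it
                    rolls[k] -= width
                    patterns[k].append(width)
                    break
            else:
                rolls.append(roll_width - width)          # cannot fit: open a new roll
                patterns.append([width])
        return patterns

    # e.g. first_fit_decreasing([5, 4, 4, 3, 2], roll_width=10) uses 2 rolls: [5, 4] and [4, 3, 2]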

24 Work in the next stage
- Complete the literature survey: Vance [1998], Vanderbeck [1999], Degraeve and Peeters [2003], Carvalho [1999]
- Comparative study of these approaches
  - they use different sets of experimental instances
  - reported times are on different machines
  - need to gather absolute numbers such as the number of nodes, number of subproblems, etc.

25 Knapsack with a forbidden list
- A dynamic programming algorithm
- KPFL(I, W, S) := max{ KPFL(I-1, W, S), KPFL(I-1, W - w_I, S ∪ {I}) }
- Complexity O(n × W × 2^n × L)
- This needs to be improved
- Core algorithms
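
A memoized Python sketch of this recurrence (illustrative; the set of chosen items is carried in the state, which is where the 2^n factor comes from):

    from functools import lru_cache

    def knapsack_with_forbidden_list(profits, weights, capacity, forbidden):
        # forbidden: collection of item sets (columns) that must not be regenerated
        forbidden = {frozenset(f) for f in forbidden}
        n = len(profits)

        @lru_cache(maxsize=None)
        def kpfl(i, cap, chosen):
            if i == n:
                # A completed pattern is valid only if it is not a forbidden column.
                return float('-inf') if chosen in forbidden else 0
            best = kpfl(i + 1, cap, chosen)                   # skip item i
            if weights[i] <= cap:                             # take item i
                best = max(best, profits[i] +
                           kpfl(i + 1, cap - weights[i], chosen | frozenset([i])))
            return best

        return kpfl(0, capacity, frozenset())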

26 Conclusion
- We explored the solution of the 1D CSP using branch-and-price
- The solution of the subproblem needs to be improved for overall improvement
- Use dynamic programming, etc., to solve the subproblem instead of general IP solvers

