The Dual Problem: Minimization with problem constraints of the form ≥


The Dual Problem: Minimization with problem constraints of the form ≥

Linear programming problems exist in pairs: every maximization problem is associated with a minimization problem, and conversely, every minimization problem is associated with a maximization problem. Given a problem whose objective function is to be maximized, the duality relationship of linear programming lets us write its minimization counterpart, and vice versa. The original linear programming problem is called the primal problem, and the derived problem is called the dual problem.

Thus, the dual problem uses exactly the same parameters as the primal problem, but in different locations. To highlight the comparison, consider the same two problems in matrix notation:

Primal Problem: Minimize Z = cx subject to Ax ≥ b and x ≥ 0
Dual Problem:   Maximize W = yb subject to yA ≤ c and y ≥ 0
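As a quick numerical illustration (a minimal sketch assuming numpy, not part of the original slides), the check below uses the data of Example 3 further down: for any primal-feasible x and dual-feasible y we get y·b ≤ c·x (weak duality), and equality identifies a pair of optimal solutions.

```python
import numpy as np

# Data of Example 3 below: minimize C = 16x1 + 45x2
# subject to 2x1 + 5x2 >= 50, x1 + 3x2 >= 27, x1, x2 >= 0.
A = np.array([[2.0, 5.0],
              [1.0, 3.0]])
b = np.array([50.0, 27.0])
c = np.array([16.0, 45.0])

x = np.array([15.0, 4.0])   # primal-feasible: A @ x >= b componentwise
y = np.array([3.0, 10.0])   # dual-feasible:  y @ A <= c componentwise

assert np.all(A @ x >= b) and np.all(y @ A <= c)
print(y @ b, c @ x)   # 420.0 420.0 -- equal, so both points are in fact optimal
```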

Summary

    Primal                                          Dual
(a) Maximize                                        Minimize
(b) Objective function                              Right-hand side
(c) Right-hand side                                 Objective function
(d) i-th row of input-output coefficients           i-th column of input-output coefficients
(e) j-th column of input-output coefficients        j-th row of input-output coefficients

The procedure for forming the dual problem is summarized in the box below:

Formation of the Dual Problem
Given a minimization problem with ≥ problem constraints:
Step 1. Use the coefficients and constants in the problem constraints and the objective function to form a matrix A with the coefficients of the objective function in the last row.
Step 2. Interchange the rows and columns of matrix A to form the matrix A^T, the transpose of A.
Step 3. Use the rows of A^T to form a maximization problem with ≤ problem constraints.

EXAMPLE 1  Forming the Dual Problem
Minimize C = 40x1 + 12x2 + 40x3
subject to 2x1 + x2 + 5x3 ≥ 20
           4x1 + x2 + x3 ≥ 30
           x1, x2, x3 ≥ 0
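A minimal sketch of Steps 1 through 3 for Example 1 (assuming numpy, and following the textbook convention of placing a 1 in the bottom-right corner of the matrix A):

```python
import numpy as np

# Step 1: coefficients and constants of Example 1, objective row last
# (columns: x1, x2, x3, constant; the bottom-right 1 stands for the objective).
A = np.array([
    [ 2,  1,  5, 20],   # 2x1 +  x2 + 5x3 >= 20
    [ 4,  1,  1, 30],   # 4x1 +  x2 +  x3 >= 30
    [40, 12, 40,  1],   # C = 40x1 + 12x2 + 40x3
])

AT = A.T                # Step 2: interchange rows and columns
print(AT)

# Step 3: read the rows of AT as a maximization problem with <= constraints:
#   Maximize  P = 20y1 + 30y2
#   subject to 2y1 + 4y2 <= 40
#               y1 +  y2 <= 12
#              5y1 +  y2 <= 40
#              y1, y2 >= 0
```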

EXAMPLE 2  Form the dual problem:
Minimize C = 16x1 + 9x2 + 21x3
subject to x1 + x2 + 3x3 ≥ 12
           2x1 + x2 + x3 ≥ 16
           x1, x2, x3 ≥ 0
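A possible answer sketch for Example 2: transposing gives the dual Maximize P = 12y1 + 16y2 subject to y1 + 2y2 ≤ 16, y1 + y2 ≤ 9, 3y1 + y2 ≤ 21, y1, y2 ≥ 0. The cross-check below (assuming scipy, which is not part of the slides) confirms that the primal and dual share the same optimal value.

```python
from scipy.optimize import linprog

# Primal (Example 2): minimize 16x1 + 9x2 + 21x3 with >= constraints.
# linprog expects <= constraints, so multiply the >= rows by -1.
primal = linprog(c=[16, 9, 21],
                 A_ub=[[-1, -1, -3], [-2, -1, -1]],
                 b_ub=[-12, -16])

# Dual: maximize 12y1 + 16y2, i.e. minimize -(12y1 + 16y2).
dual = linprog(c=[-12, -16],
               A_ub=[[1, 2], [1, 1], [3, 1]],
               b_ub=[16, 9, 21])

print(primal.fun, -dual.fun)   # both should print 136.0
```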

Solution of Minimization Problems

EXAMPLE 3  Solution of Minimization Problems

Original Problem (1): Minimize C = 16x1 + 45x2
subject to 2x1 + 5x2 ≥ 50
           x1 + 3x2 ≥ 27
           x1, x2 ≥ 0

Dual Problem (2): Maximize P = 50y1 + 27y2
subject to 2y1 + y2 ≤ 16
           5y1 + 3y2 ≤ 45
           y1, y2 ≥ 0

Corner points of the original problem:
(x1, x2)    C = 16x1 + 45x2
(0, 10)     450
(15, 4)     420
(27, 0)     432
Min C = 420 at (15, 4)

Corner points of the dual problem:
(y1, y2)    P = 50y1 + 27y2
(0, 0)      0
(0, 15)     405
(3, 10)     420
(8, 0)      400
Max P = 420 at (3, 10)

For reasons that will become clear later, we will use the variables x1 and x2 from the original problem as the slack variables in the dual problem:

2y1 + y2 + x1 = 16
5y1 + 3y2 + x2 = 45        (initial system for the dual problem)
-50y1 - 27y2 + P = 0
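As a sketch of the pivoting that the slides carry out in tableau form (assuming numpy; simplex_max is a hypothetical helper written here, not from the slides), the code below takes the initial tableau of this system to the final tableau referred to next:

```python
import numpy as np

def simplex_max(tableau):
    """Pivot a standard-maximization tableau until the bottom (objective) row
    has no negative indicators. Last column is the right-hand side."""
    T = tableau.astype(float)
    while T[-1, :-1].min() < -1e-9:
        col = int(np.argmin(T[-1, :-1]))                     # entering variable
        pos = T[:-1, col] > 0
        ratios = np.where(pos, T[:-1, -1] / np.where(pos, T[:-1, col], 1), np.inf)
        row = int(np.argmin(ratios))                         # leaving variable
        T[row] /= T[row, col]
        for r in range(T.shape[0]):
            if r != row:
                T[r] -= T[r, col] * T[row]
    return T

# Columns: y1, y2, x1, x2, P, RHS (x1 and x2 act as the slack variables).
initial = np.array([
    [  2,   1, 1, 0, 0, 16],
    [  5,   3, 0, 1, 0, 45],
    [-50, -27, 0, 0, 1,  0],
])
final = simplex_max(initial)
print(final)
# The final bottom row is [0, 0, 15, 4, 1, 420]: P = 420, and the entries under
# the slack columns give the primal solution x1 = 15, x2 = 4. The basic rows
# give the dual solution y1 = 3, y2 = 10.
```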

Since all indicators in the bottom row are nonnegative, the solution to the dual problem is
y1 = 3, y2 = 10, x1 = 0, x2 = 0, P = 420
which agrees with our earlier geometric solution. Furthermore, examining the bottom row of the final simplex tableau, we see the same optimal solution to the minimization problem that we obtained directly by the geometric method:
Min C = 420 at x1 = 15, x2 = 4
This is no accident: an optimal solution to a minimization problem can always be obtained from the bottom row of the final simplex tableau for the dual problem. Now we can see that using x1 and x2 as slack variables in the dual problem makes it easy to identify the solution of the original problem.

EXAMPLE 4  Solve the following minimization problem by maximizing the dual:
Minimize C = 40x1 + 12x2 + 40x3
subject to 2x1 + x2 + 5x3 ≥ 20
           4x1 + x2 + x3 ≥ 30
           x1, x2, x3 ≥ 0
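One way to carry Example 4 through, as a sketch: the dual was formed in Example 1, and the code below reuses the hypothetical simplex_max helper from the Example 3 sketch above (so it assumes that helper is already defined).

```python
import numpy as np
# Assumes the simplex_max helper from the earlier sketch is in scope.

# Dual of Example 4 (formed in Example 1), with x1, x2, x3 as slack variables.
# Columns: y1, y2, x1, x2, x3, P, RHS.
initial = np.array([
    [  2,   4, 1, 0, 0, 0, 40],
    [  1,   1, 0, 1, 0, 0, 12],
    [  5,   1, 0, 0, 1, 0, 40],
    [-20, -30, 0, 0, 0, 1,  0],
])
final = simplex_max(initial)
print(final[-1])
# Final bottom row: [0, 0, 5, 10, 0, 1, 320]
#   Max P = 320 at y1 = 4, y2 = 8, so
#   Min C = 320 at x1 = 5, x2 = 10, x3 = 0 (read under the slack columns).
```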

Maximization and minimization with mixed problem constraints

In this section we present a generalized version of the simplex method that will solve both maximization and minimization problems with any combination of ≤, ≥, and = problem constraints. When a constraint is of the form ≤, we introduce a slack variable (unused capacity); when a constraint is of the form ≥, we subtract a surplus variable (excess amount).

In order to use the simplex method on problems with mixed constraints, we turn to an ingenious device called an artificial variable. This variable has no physical meaning in the original problem (which explains the word "artificial") and is introduced solely for the purpose of obtaining a basic feasible solution so that we can apply the simplex method. An artificial variable is introduced into each equation that has a surplus variable. As before, to ensure that we consider only basic feasible solutions, an artificial variable is required to satisfy the nonnegativity constraint.

To prevent an artificial variable from becoming part of an optimal solution to the original problem, a very large "penalty" is introduced into the objective function. This penalty is created by choosing a positive constant M so large that the artificial variable is forced to be 0 in any final optimal solution of the original problem.

Big M Method: Introducing Slack, Surplus, and Artificial Variables to Form the Modified Problem

Step 1. If any problem constraint has a negative constant on the right side, multiply both sides by -1 to obtain a constraint with a nonnegative constant. (If the constraint is an inequality, this reverses the direction of the inequality.)
Step 2. Introduce a slack variable (S) in each ≤ constraint.
Step 3. Introduce a surplus variable (E) and an artificial variable (A) in each ≥ constraint.
Step 4. Introduce an artificial variable (A) in each = constraint.
Step 5. For each artificial variable Ai, subtract MAi from the objective function.

EXAMPLE 5  Find the modified problem for the following linear programming problem. (Do not attempt to solve the problem.)
Maximize P = 2x1 + 5x2 + 3x3
subject to x1 + 2x2 - x3 ≤ 7
           -x1 + x2 - 2x3 ≤ -5
           x1 + 4x2 + 3x3 ≥ 1
           2x1 - x2 + 4x3 = 6
           x1, x2, x3 ≥ 0

SOLUTION  First, we multiply the second constraint by -1 to change the -5 on the right side to 5 (this reverses the inequality):
(-1)(-x1 + x2 - 2x3) ≥ (-1)(-5), that is, x1 - x2 + 2x3 ≥ 5
Next, we introduce the slack, surplus, and artificial variables according to the procedure stated in the box:
x1 + 2x2 - x3 + S1 = 7
x1 - x2 + 2x3 - E1 + A1 = 5
x1 + 4x2 + 3x3 - E2 + A2 = 1
2x1 - x2 + 4x3 + A3 = 6

Finally, we subtract MA1, MA2, and MA3 from the objective function to penalize the artificial variables:
P = 2x1 + 5x2 + 3x3 - MA1 - MA2 - MA3
The modified problem is
Maximize P = 2x1 + 5x2 + 3x3 - MA1 - MA2 - MA3
subject to x1 + 2x2 - x3 + S1 = 7
           x1 - x2 + 2x3 - E1 + A1 = 5
           x1 + 4x2 + 3x3 - E2 + A2 = 1
           2x1 - x2 + 4x3 + A3 = 6
           x1, x2, x3, S1, E1, E2, A1, A2, A3 ≥ 0

After introducing the slack, surplus, and artificial variables, we continue to solve the problem with the following steps:
Step 1. Form the preliminary simplex tableau for the modified problem.
Step 2. Use row operations to eliminate the M's in the bottom row of the preliminary simplex tableau in the columns corresponding to the artificial variables. The resulting tableau is the initial simplex tableau.
Step 3. Solve the modified problem by applying the simplex method to the initial simplex tableau found in Step 2.

EXAMPLE 6  Solve the following linear programming problem using the big M method:
Maximize P = x1 - x2 + 3x3
subject to x1 + x2 ≤ 20
           x1 + x3 = 5
           x2 + x3 ≥ 10
           x1, x2, x3 ≥ 0
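A numerical sketch of Example 6 using the big M steps above (again reusing the hypothetical simplex_max helper from the Example 3 sketch, and substituting a concrete penalty M = 1e6 for the symbolic M):

```python
import numpy as np
# Assumes the simplex_max helper from the earlier sketch is in scope.

M = 1e6   # a concrete "very large" penalty standing in for the symbolic M

# Preliminary tableau. Columns: x1, x2, x3, S1, E1, A1, A2, P, RHS.
T = np.array([
    [ 1,  1,  0, 1,  0, 0, 0, 0, 20],   # x1 + x2 + S1       = 20
    [ 1,  0,  1, 0,  0, 1, 0, 0,  5],   # x1 + x3 + A1       = 5
    [ 0,  1,  1, 0, -1, 0, 1, 0, 10],   # x2 + x3 - E1 + A2  = 10
    [-1,  1, -3, 0,  0, M, M, 1,  0],   # -x1 + x2 - 3x3 + MA1 + MA2 + P = 0
], dtype=float)

# Step 2: eliminate the M's in the artificial-variable columns of the bottom row.
T[-1] -= M * T[1]
T[-1] -= M * T[2]

final = simplex_max(T)   # Step 3: apply the simplex method
print(final[:, -1])
# Expected result: Max P = 10 at x1 = 0, x2 = 5, x3 = 5 (S1 = 15, E1 = A1 = A2 = 0).
```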

Minimization by the big M method

In addition to solving any maximization problem, the big M method can be used to solve minimization problems. To minimize an objective function, we have only to maximize its negative. Furthermore, if M is the minimum value of f, then -M is the maximum value of -f, and conversely. Thus, we can find the minimum value of a function f by finding the maximum value of -f and then changing the sign of the maximum value.

EXAMPLE 7  Production Scheduling: Minimization Problem
A small jewelry manufacturing company employs a person who is a highly skilled gem cutter, and it wishes to use this person at least 6 hours per day for this purpose. On the other hand, the polishing facilities can be used in any amounts up to 10 hours per day. The company specializes in three kinds of semiprecious gemstones, J, K, and L. Relevant cutting, polishing, and cost requirements are listed in the table. How many gemstones of each type should be processed each day to minimize the cost of the finished stones? What is the minimum cost?

                 J       K       L
Cutting          1 hr
Polishing        2 hr    1 hr
Cost per stone   $30     $10
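Since some table entries are missing above, the following is only a formulation skeleton (assuming scipy, which is not part of the slides; the slides would presumably solve this by the big M method after maximizing -C). The cutting hours, polishing hours, and costs below are hypothetical placeholders to be replaced by the actual table values.

```python
from scipy.optimize import linprog

# Hypothetical placeholder data -- replace with the actual table values for J, K, L.
cut  = [1, 1, 1]     # cutting hours per stone
pol  = [2, 1, 1]     # polishing hours per stone
cost = [30, 30, 10]  # dollars per stone

# Let x = (xJ, xK, xL) be the number of stones of each type processed per day.
# Minimize total cost subject to: cutting time >= 6 hours, polishing time <= 10 hours.
res = linprog(c=cost,
              A_ub=[[-h for h in cut],   # -(cut . x) <= -6  encodes  cut . x >= 6
                    pol],                #   pol . x  <= 10
              b_ub=[-6, 10])
print(res.x, res.fun)
```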