Chapter 10: Iterative Improvement

The Design and Analysis of Algorithms
Chapter 10: Iterative Improvement
The Simplex Method

Iterative Improvement: Outline
- Introduction
- Linear Programming
- The Simplex Method
- Standard Form of LP Problem
- Basic Feasible Solutions
- Outline of the Simplex Method
- Example
- Notes on the Simplex Method
- Improvements

Introduction
Iterative improvement is an algorithm design technique for solving optimization problems:
- Start with a feasible solution.
- Repeat the following step until no improvement can be found: change the current feasible solution to a feasible solution with a better value of the objective function.
- Return the last feasible solution as optimal.
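A minimal sketch of this generic scheme in Python (the callables initial, neighbors, and objective are hypothetical placeholders for a concrete problem, not part of the textbook):

```python
# A generic iterative-improvement (local-search) loop.
# 'initial', 'neighbors', and 'objective' are hypothetical callables
# supplied by the concrete problem (e.g., an LP or max-flow instance).

def iterative_improvement(initial, neighbors, objective):
    current = initial
    improved = True
    while improved:
        improved = False
        for candidate in neighbors(current):       # "small" local changes
            if objective(candidate) > objective(current):
                current = candidate                # move to a better feasible solution
                improved = True
                break
    return current                                 # no improving neighbor: stop
```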

Introduction (continued)
Note: Typically, a change to the current solution is "small" (local search).
Major difficulty: local optimum vs. global optimum.

Important Examples
- Simplex method
- Ford-Fulkerson algorithm for the maximum-flow problem
- Maximum matching of graph vertices
- Gale-Shapley algorithm for the stable marriage problem

Linear Programming
A linear programming (LP) problem is to optimize a linear function of several variables subject to linear constraints:

  maximize (or minimize)  c1 x1 + ... + cn xn
  subject to              ai1 x1 + ... + ain xn  ≤ (or ≥ or =)  bi,  i = 1, ..., m
                          x1 ≥ 0, ..., xn ≥ 0

The function z = c1 x1 + ... + cn xn is called the objective function; the constraints x1 ≥ 0, ..., xn ≥ 0 are called non-negativity constraints.

Example
  maximize   3x + 5y
  subject to x + y  ≤ 4
             x + 3y ≤ 6
             x ≥ 0, y ≥ 0

The feasible region is the set of points that satisfy all the constraints.
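As a quick numerical check of this example, one way to solve it is with SciPy's linprog routine; a sketch, assuming SciPy is available (linprog minimizes, so the objective is negated):

```python
# Solving the example LP with SciPy (a sketch, not part of the textbook).
# linprog minimizes, so we maximize 3x + 5y by minimizing -3x - 5y.
from scipy.optimize import linprog

c = [-3, -5]                  # objective coefficients (negated for maximization)
A_ub = [[1, 1],               # x +  y <= 4
        [1, 3]]               # x + 3y <= 6
b_ub = [4, 6]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)        # expected: optimum at x = 3, y = 1 with z = 14
```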

Geometric solution
  maximize   3x + 5y
  subject to x + y  ≤ 4
             x + 3y ≤ 6
             x ≥ 0, y ≥ 0

The objective function improves in the direction of its coefficient vector (3, 5); over this feasible region the maximum, z = 14, is attained at the extreme point (3, 1).

Extreme Point Theorem: Any LP problem with a nonempty bounded feasible region has an optimal solution; moreover, an optimal solution can always be found at an extreme point of the problem's feasible region.

Possible outcomes in solving an LP problem
- Has a finite optimal solution, which may not be unique.
- Unbounded: the objective function of a maximization (minimization) LP problem is unbounded from above (below) on its feasible region.
- Infeasible: there are no points satisfying all the constraints, i.e., the constraints are contradictory.

The Simplex Method
- The simplex method is the classic method for solving LP problems and one of the most important algorithms ever invented.
- Invented by George Dantzig in 1947 (Stanford University).
- Based on the iterative-improvement idea: it generates a sequence of adjacent points of the problem's feasible region with improving values of the objective function until no further improvement is possible.

Standard form of LP problem
- It must be a maximization problem.
- All constraints (except the non-negativity constraints) must be in the form of linear equations.
- All the variables must be required to be nonnegative.

Thus, the general linear programming problem in standard form with m constraints and n unknowns (n ≥ m) is

  maximize   c1 x1 + ... + cn xn
  subject to ai1 x1 + ... + ain xn = bi,  i = 1, ..., m
             x1 ≥ 0, ..., xn ≥ 0

Every LP problem can be represented in such a form.

Example

  Original problem:             Standard form:
  maximize   3x + 5y            maximize   3x + 5y + 0u + 0v
  subject to x + y  ≤ 4         subject to x + y + u = 4
             x + 3y ≤ 6                    x + 3y + v = 6
             x ≥ 0, y ≥ 0                  x ≥ 0, y ≥ 0, u ≥ 0, v ≥ 0

Variables u and v, which transform the inequality constraints into equality constraints, are called slack variables.
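The equivalence of the two forms can also be checked numerically; a sketch using SciPy's linprog on the standard (equality) form, assuming SciPy is available:

```python
# The same example in standard (equality) form, checked with SciPy
# (a sketch; linprog minimizes, so the objective is negated).
from scipy.optimize import linprog

c = [-3, -5, 0, 0]                    # coefficients of x, y, u, v
A_eq = [[1, 1, 1, 0],                 # x +  y + u = 4
        [1, 3, 0, 1]]                 # x + 3y + v = 6
b_eq = [4, 6]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 4)
print(res.x, -res.fun)                # expected: [3, 1, 0, 0] and z = 14;
                                      # both slacks are 0 at the optimum
```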

Basic feasible solutions
A basic solution to a system of m linear equations in n unknowns (n ≥ m) is obtained by setting n - m variables to 0 and solving the resulting system to get the values of the other m variables. The variables set to 0 are called nonbasic; the variables obtained by solving the system are called basic. A basic solution is called feasible if all its (basic) variables are nonnegative.

Example
  x + y + u = 4
  x + 3y + v = 6
(0, 0, 4, 6) is a basic feasible solution (x, y are nonbasic; u, v are basic).
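A small sketch (not from the textbook) that enumerates all basic solutions of this system by choosing which n - m = 2 variables to set to 0 and solving for the rest:

```python
# Enumerating the basic solutions of the example system (a sketch):
#   x + y + u = 4
#   x + 3y + v = 6
# With n = 4 unknowns and m = 2 equations, each basic solution fixes
# n - m = 2 nonbasic variables at 0 and solves for the remaining 2.
from itertools import combinations
import numpy as np

A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 0.0, 1.0]])
b = np.array([4.0, 6.0])
names = ["x", "y", "u", "v"]
n, m = 4, 2

for basic in combinations(range(n), m):          # choose which 2 variables are basic
    B = A[:, basic]
    if abs(np.linalg.det(B)) < 1e-12:            # singular: no basic solution here
        continue
    values = np.zeros(n)
    values[list(basic)] = np.linalg.solve(B, b)  # nonbasic variables stay at 0
    feasible = np.all(values >= -1e-9)
    print([names[i] for i in basic], values, "feasible" if feasible else "infeasible")
```

Among the printed solutions, (0, 0, 4, 6), (0, 2, 2, 0), (4, 0, 0, 2), and (3, 1, 0, 0) are feasible; they correspond to the extreme points of the feasible region.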

Simplex Tableau
  maximize   z = 3x + 5y + 0u + 0v
  subject to x + y + u = 4
             x + 3y + v = 6
             x ≥ 0, y ≥ 0, u ≥ 0, v ≥ 0

Initial tableau (rows labeled by the basic variables u and v; the objective row holds the negated objective coefficients, and the rightmost column holds the right-hand sides):

        x    y    u    v |
  u     1    1    1    0 |  4
  v     1    3    0    1 |  6
  -----------------------|----
       -3   -5    0    0 |  0

Outline of the Simplex Method
Step 0 [Initialization] Present a given LP problem in standard form and set up the initial tableau.
Step 1 [Optimality test] If all entries in the objective row are nonnegative, stop: the tableau represents an optimal solution.
Step 2 [Find entering variable] Select (the most) negative entry in the objective row. Mark its column to indicate the entering variable and the pivot column.

Outline of the Simplex Method (continued)
Step 3 [Find departing variable] For each positive entry in the pivot column, calculate the θ-ratio by dividing that row's entry in the rightmost column by its entry in the pivot column. (If there are no positive entries in the pivot column, stop: the problem is unbounded.) Find the row with the smallest θ-ratio and mark this row to indicate the departing variable and the pivot row.
Step 4 [Form the next tableau] Divide all the entries in the pivot row by its entry in the pivot column. Subtract from each of the other rows, including the objective row, the new pivot row multiplied by the entry in the pivot column of the row in question. Replace the label of the pivot row with the name of the pivot column's variable and go back to Step 1.
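The steps above translate directly into a short tableau implementation. The following is a minimal sketch, assuming a standard-form problem whose slack variables provide an obvious initial basic feasible solution; the function name simplex_tableau and its interface are illustrative, not from the textbook:

```python
# A sketch of the tableau simplex method outlined in Steps 0-4 above.
import numpy as np

def simplex_tableau(A, b, c, basis):
    """Maximize c.x subject to A.x = b, x >= 0, starting from the given basis.

    A: (m, n) constraint matrix, b: (m,) right-hand sides (all >= 0),
    c: (n,) objective coefficients, basis: list of m indices of the
    initial basic variables. Returns (x, z, basis).
    """
    m, n = A.shape
    # Step 0: initial tableau; the last row is the objective row holding -c.
    T = np.zeros((m + 1, n + 1))
    T[:m, :n] = A
    T[:m, n] = b
    T[m, :n] = -np.asarray(c, dtype=float)

    while True:
        # Step 1: optimality test.
        if np.all(T[m, :n] >= -1e-9):
            break
        # Step 2: entering variable = most negative objective-row entry.
        col = int(np.argmin(T[m, :n]))
        # Step 3: departing variable via the minimum theta-ratio.
        positive = T[:m, col] > 1e-9
        if not np.any(positive):
            raise ValueError("problem is unbounded")
        ratios = np.full(m, np.inf)
        ratios[positive] = T[:m, n][positive] / T[:m, col][positive]
        row = int(np.argmin(ratios))
        # Step 4: pivot on (row, col) and relabel the pivot row.
        T[row, :] /= T[row, col]
        for r in range(m + 1):
            if r != row:
                T[r, :] -= T[r, col] * T[row, :]
        basis[row] = col

    x = np.zeros(n)
    x[basis] = T[:m, n]
    return x, T[m, n], basis

# Demo on the chapter's example: maximize 3x + 5y with slacks u, v.
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 0.0, 1.0]])
b = np.array([4.0, 6.0])
c = np.array([3.0, 5.0, 0.0, 0.0])
x, z, basis = simplex_tableau(A, b, c, basis=[2, 3])   # u, v basic initially
print(x, z)   # expected: [3. 1. 0. 0.] and z = 14
```

On this problem the sketch passes through the same basic feasible solutions listed on the next slide, ending at (3, 1, 0, 0) with z = 14.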

Example of Simplex Method
  maximize   z = 3x + 5y + 0u + 0v
  subject to x + y + u = 4
             x + 3y + v = 6
             x ≥ 0, y ≥ 0, u ≥ 0, v ≥ 0

Sequence of basic feasible solutions produced by the method:
  (0, 0, 4, 6)  z = 0
  (0, 2, 2, 0)  z = 10
  (3, 1, 0, 0)  z = 14 (optimal)

Notes on the Simplex Method
- Finding an initial basic feasible solution may pose a problem.
- There is a theoretical possibility of cycling.
- The typical number of iterations is between m and 3m, where m is the number of equality constraints in the standard form.
- Number of operations per iteration: O(nm).
- Worst-case efficiency is exponential.

Improvements
- L. G. Khachiyan introduced the ellipsoid method (1979), a polynomial-time algorithm, O(n^6), that seemed to overcome some of the simplex method's limitations. Disadvantage: it runs with the same complexity on all problems.
- Narendra K. Karmarkar of AT&T Bell Laboratories proposed in 1984 a new, very efficient interior-point algorithm, O(n^3.5). In empirical tests it performs competitively with the simplex method.