Deterministic Operations Research


Deterministic Operations Research: Chapter 14

14.6. Mathematical Programming

Definition of OR: a set of quantitative techniques for solving decision problems using mathematics, statistics, and computers.

Definition of mathematical programming: a class of OR methods for mathematically modeling and solving constrained optimization problems, including:
- Linear programming
- Nonlinear programming
- Integer programming
- Dynamic programming
- Stochastic programming
- ...

14.6. Mathematical Programming

Mathematical programming models are used to find the optimum (best) solution of problems of the form:

Optimize f(X)
subject to gi(X) ≤ bi, i = 1, …, m
where X = (x1, …, xn)

f(X) = objective function; "optimize" means minimize or maximize
X = (x1, …, xn) = decision variables
gi(X) = constraints; these can also be of the form (=) or (≥)
Other restrictions: X ≥ 0, X binary (0-1), or X general integer
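As a concrete illustration of this general form (not part of the original slides), the following minimal Python sketch passes a small made-up objective f and constraint g(X) ≤ b to SciPy's general-purpose solver; the function shapes, the bound b, and the starting point are all illustrative assumptions.

```python
# Minimal sketch (made-up instance): minimize f(X) subject to g(X) <= b and X >= 0.
import numpy as np
from scipy.optimize import minimize

def f(x):                        # objective function f(X) (illustrative)
    return (x[0] - 3) ** 2 + (x[1] - 2) ** 2

def g(x):                        # constraint function g(X) (illustrative)
    return x[0] + x[1]

b = 4.0                          # right-hand side b (illustrative)

# SciPy writes inequality constraints as c(x) >= 0, so g(X) <= b becomes b - g(X) >= 0.
constraints = [{"type": "ineq", "fun": lambda x: b - g(x)}]
bounds = [(0, None), (0, None)]  # X >= 0

result = minimize(f, x0=np.array([0.0, 0.0]), method="SLSQP",
                  bounds=bounds, constraints=constraints)
print(result.x, result.fun)      # optimal decision variables and objective value
```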

14.7. Unconstrained Optimization

No constraints; such problems can be solved by calculus by setting f'(X) = 0. Examples: linear regression, EOQ.

Stationary point: X* is a stationary point of f(X) if f'(X*) = 0 (a necessary, but not sufficient, condition).

Optimum point: X* is a minimum (maximum) point of f(X) if X* is a stationary point and:
f''(X*) > 0 (minimum, sufficient condition)
f''(X*) < 0 (maximum, sufficient condition)
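To make the "set f'(X) = 0" idea concrete, here is a small SymPy sketch (not from the slides) that derives the EOQ order quantity; the symbols K (setup cost), D (demand rate), and h (unit holding cost) are assumed names used only for illustration.

```python
# Sketch of the calculus approach applied to the EOQ example:
# minimize total cost TC(Q) = K*D/Q + h*Q/2 by setting TC'(Q) = 0.
import sympy as sp

Q, K, D, h = sp.symbols("Q K D h", positive=True)
total_cost = K * D / Q + h * Q / 2        # ordering cost + holding cost per period

solutions = sp.solve(sp.diff(total_cost, Q), Q)
Q_star = [s for s in solutions if s.is_positive][0]
print(Q_star)                             # the EOQ formula, sqrt(2*K*D/h)

second_derivative = sp.diff(total_cost, Q, 2).subs(Q, Q_star)
print(sp.simplify(second_derivative))     # positive, so Q_star is a minimum
```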

14.7. Unconstrained Optimization

What if f'(X*) = 0 and f''(X*) = 0? Higher-order derivatives must be taken.
Let n = the order of the first non-zero derivative at X*.
If n is odd, X* is an inflection point.
If n is even and:
f^(n)(X*) > 0: X* is a local minimum (f is locally convex)
f^(n)(X*) < 0: X* is a local maximum (f is locally concave)

14.7. Unconstrained Optimization Example

Find and classify all stationary points of f(x) = x^4 + x^3.

f'(x) = 4x^3 + 3x^2 = x^2(4x + 3) = 0
Stationary points: x = 0 and x = -3/4.

f''(x) = 12x^2 + 6x
f''(-3/4) = 12(9/16) + 6(-3/4) = 9/4 > 0, so x = -3/4 is a local minimum.
f''(0) = 0, so higher-order derivatives are needed at x = 0.

f'''(x) = 24x + 6
f'''(0) = 6 ≠ 0; the first non-zero derivative has odd order (n = 3), so x = 0 is an inflection point.
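A short SymPy sketch (an illustration, not from the original slides) that reproduces this classification by inspecting the first non-zero derivative at each stationary point:

```python
# Classify the stationary points of f(x) = x**4 + x**3 using the
# first non-zero derivative at each point (the rule from the previous slide).
import sympy as sp

x = sp.symbols("x")
f = x**4 + x**3

for point in sp.solve(sp.diff(f, x), x):       # stationary points: 0 and -3/4
    n = 2
    value = sp.diff(f, x, n).subs(x, point)
    while value == 0:                          # keep differentiating while the derivative is zero
        n += 1
        value = sp.diff(f, x, n).subs(x, point)
    if n % 2 == 1:
        kind = "inflection point"
    else:
        kind = "local minimum" if value > 0 else "local maximum"
    print(point, kind)                         # -3/4 -> local minimum, 0 -> inflection point
```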

14.7. Unconstrained Optimization for Functions of Several Variables

Given f(x1, …, xn), set the gradient equal to zero: ∇f(x1, …, xn) = 0.
Solve the resulting system to determine the stationary points X*.
For each stationary point, determine the Hessian matrix H(X*) = ∇²f(X*):
If H(X*) is positive definite, X* is a minimum point.
If H(X*) is negative definite, X* is a maximum point.
If H(X*) is indefinite, X* is a saddle point.

14.7. Unconstrained Optimization Example 2

Find and classify all stationary points of f(x1, x2) = 4x1^2 - 5x1x2 + 3x2^2 - 6x1 + 2.6x2.

∇f(x1, x2) = 0:
∂f/∂x1 = 8x1 - 5x2 - 6 = 0
∂f/∂x2 = -5x1 + 6x2 + 2.6 = 0

Solving this system gives the stationary point X* = (1, 0.4).

14.7. Unconstrained Optimization Example 2

H(f) = ∇²f(x1, x2) =
[  8  -5 ]
[ -5   6 ]

Leading principal minor (k = 1): 8 > 0
Leading principal minor (k = 2): 8(6) - (-5)(-5) = 23 > 0

Since H is positive definite, (x1, x2)* = (1, 0.4) is a minimum point.
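The same conclusion can be checked symbolically; the sketch below (illustrative, not from the slides) solves the gradient system with SymPy and inspects the Hessian's eigenvalues instead of its leading minors:

```python
# Verify Example 2: solve grad f = 0 and test the Hessian of
# f(x1, x2) = 4*x1**2 - 5*x1*x2 + 3*x2**2 - 6*x1 + 2.6*x2.
import sympy as sp

x1, x2 = sp.symbols("x1 x2")
f = 4*x1**2 - 5*x1*x2 + 3*x2**2 - 6*x1 + sp.Rational(13, 5)*x2

gradient = [sp.diff(f, v) for v in (x1, x2)]
stationary_point = sp.solve(gradient, (x1, x2))
print(stationary_point)                      # {x1: 1, x2: 2/5}

H = sp.hessian(f, (x1, x2))
print(H)                                     # Matrix([[8, -5], [-5, 6]])
print(H.eigenvals())                         # both eigenvalues positive -> positive definite -> minimum
```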

14.8.1. Assignment Technique

Given an n×n square matrix of distances/transport costs, the optimal solution is found by the Hungarian Algorithm, which has 2 phases.

Phase I
1. From each row, subtract its smallest number.
2. From each column, subtract its smallest number.
3. For each row with only 1 uncrossed zero, select that zero and cross out the zeros in the same column.
4. For each column with only 1 uncrossed zero, select that zero and cross out the zeros in the same row.
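The two reduction steps of Phase I are easy to express with NumPy; this small sketch (using a made-up cost matrix, not the slide's example) shows them:

```python
# Phase I reductions on a made-up cost matrix: subtract each row's minimum,
# then each column's minimum, so every row and column contains a zero.
import numpy as np

cost = np.array([[4.0, 2.0, 8.0],
                 [4.0, 3.0, 7.0],
                 [3.0, 1.0, 6.0]])   # illustrative values only

reduced = cost - cost.min(axis=1, keepdims=True)   # row reduction
reduced = reduced - reduced.min(axis=0)            # column reduction
print(reduced)
```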

Assignment Technique

Hungarian Algorithm (optimum solution), Phase II:
If Phase I produces n selected zeros, stop. If only m < n zeros are selected, Phase II is needed.

Phase II
1. Cover all zeros with m horizontal or vertical lines.
2. Subtract the minimum uncovered element from all uncovered elements.
3. Add the minimum uncovered element to all elements covered by 2 lines (at line intersections).
4. Repeat Phase I.
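In practice the whole procedure is available as a library routine; the sketch below (with a made-up 4×4 cost matrix, not the slide's data) uses SciPy's assignment-problem solver rather than carrying out Phase II by hand:

```python
# Solve a (made-up) 4x4 assignment problem directly with SciPy's
# built-in solver for the assignment problem.
import numpy as np
from scipy.optimize import linear_sum_assignment

cost = np.array([[9, 2, 7, 8],
                 [6, 4, 3, 7],
                 [5, 8, 1, 8],
                 [7, 6, 9, 4]], dtype=float)   # illustrative values only

rows, cols = linear_sum_assignment(cost)        # optimal row -> column assignment
print(list(zip(rows, cols)))                    # assigned pairs
print(cost[rows, cols].sum())                   # minimum total cost
```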

Assignment Technique Example

A factory has 4 identical machines (A, B, C, D) and 4 storage areas (I, II, III, IV). Given the distance matrix below, determine the best machine-storage assignment.

I II III IV
A 2 6 3 5
B 1
C 4
D

Assignment Technique Example

Subtract the smallest element from each row.

I II III IV
A 2 6 3 5
B 1
C 4
D
Sum

Assignment Technique Example

Subtract the smallest element from each column.

I II III IV
A 4 1 3
B 2
C
D
Sum

Assignment Technique Example

After subtracting the smallest element from each column:

I II III IV
A 3 1
B 4
C 2
D

Assignment Technique Example

I II III IV
A 3 1
B 4
C 2
D

Assignment Technique Example

Cover all zeros with 3 lines, placed in the rows/columns with the maximum number of zeros. Minimum uncovered element = 1.

I II III IV
A 3 1
B 4
C 2
D

Assignment Technique Example

Subtracting and adding the minimum uncovered element (1):

I II III IV
A 2 1
B 5
C 3
D

Assignment Technique Example

Zero-cost assignment: m = 4 zeros are selected, so stop.

I II III IV
A 2 1
B 5
C 3
D

Assignment Example Solution

Optimum machine-storage assignment. Total cost = 2 + 3 + 3 + 1 = 9.

I II III IV
A 2 6 3 5
B 1
C 4
D

End of Chapter 14 Questions?