System Optimization (1)
Liang Yu, Department of Biological Systems Engineering, Washington State University
04.16.2013

Outline
- Background of engineering optimization
- Applications of optimization
- Introduction
- Fundamentals of optimization
- Optimization Toolbox in MATLAB

Optimization in Process Plants

Engineering applications of optimization. Some typical applications from different engineering disciplines:
- Design of water resources systems for maximum benefit
- Design of pumps, turbines, and heat-transfer equipment for maximum efficiency
- Optimum design of chemical processing equipment and plants
- Selection of a site for an industry
- Optimum design of control systems

Application: Metabolic Flux Analysis

Flux Balance Analysis (FBA): in silico simulation, linear programming (LP), genome-scale. Assumes metabolic steady state:
    maximize  ∑ c_i ∙ v_i
    s.t.      S∙v = 0
              lb < v < ub

13C-assisted Metabolic Flux Analysis: in vivo search, nonlinear programming (NLP), simplified model. Assumes metabolic and isotopic steady state:
    minimize  (MDV_exp − MDV_sim)^2
    s.t.      S∙v = 0
              IDV = f(v, IMM, IDV)
              MDV = M∙IDV
              lb < v < ub

IDV: isotopomer distribution vector. MDV: mass distribution vector.
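The FBA formulation above can be sketched with a toy example. This uses SciPy's linprog on a hypothetical three-reaction network; the matrix S, the bounds, and the reaction roles are illustrative assumptions, not taken from the slides:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy network (1 metabolite A x 3 reactions):
#   v1: uptake -> A,  v2: A -> biomass,  v3: A -> byproduct
S = np.array([[1.0, -1.0, -1.0]])       # stoichiometric matrix
c = np.array([0.0, 1.0, 0.0])           # objective weights: maximize biomass flux v2
bounds = [(0, 10), (0, 10), (0, 10)]    # lb <= v <= ub

# linprog minimizes, so negate c to maximize c.v subject to S.v = 0
res = linprog(-c, A_eq=S, b_eq=np.zeros(1), bounds=bounds, method="highs")
print(res.x)  # optimal flux distribution
```

At the optimum the whole uptake flux is routed to biomass (v2 hits the uptake bound), which is the behavior FBA predicts for this toy network.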

Application: optimization of biorefinery configurations. Pham, Viet, and Mahmoud El-Halwagi. "Process synthesis and optimization of biorefinery configurations." AIChE Journal 58.4 (2012).

Part of the branching trees for the production of bio-alcohols from lignocellulosic biomass

Optimization Tree

Introduction

Definition: optimization is the act of obtaining the best result under given circumstances. It can be defined as the process of finding the conditions that give the maximum or minimum value of a function. The goal is either to minimize the effort required or to maximize the desired benefit.

An optimization problem can be linear or nonlinear. Nonlinear optimization is accomplished by numerical search methods, which are applied iteratively until a solution is reached. The search procedure is termed an algorithm.

Introduction. The minimum of f(x) is the same as the maximum of −f(x). An optimum solution must satisfy the constraints together with the optimality condition (at an interior optimum, the derivative must be zero).

Introduction. A linear problem is solved by the Simplex or graphical methods, and its solution lies on the boundary of the feasible region. The solution of a nonlinear problem may lie within the feasible region as well as on its boundary. [Figures: solution of a linear problem; three-dimensional solution of a nonlinear problem]

Introduction. Optimization programming languages:
- GAMS: General Algebraic Modeling System
- LINDO: widely used in business applications
- AMPL: A Mathematical Programming Language
- Others: MPL, ILOG

Software with optimization capabilities: Excel (Solver), MATLAB, MathCAD, Mathematica, Maple, and others.

Statement of an optimization problem. An optimization (mathematical programming) problem can be stated as:

Find X = [x1, x2, ..., xn]^T which minimizes f(X),
subject to the constraints
    g_j(X) ≤ 0,  j = 1, 2, ..., m
    l_j(X) = 0,  j = 1, 2, ..., p

Fundamentals of Optimization
- Single objective function f(x): maximization or minimization
- Design variables x_i, i = 1, 2, 3, ...
- Constraints: inequality and equality
- Optimal points:
  - Local minimum/maximum: a solution x* is a local optimum if no other x in its neighborhood has a better objective value.
  - Global minimum/maximum: a solution x** is a global optimum if no other x in the entire search space has a better objective value.

Example of design variables and constraints used in optimization:
Maximize ... X2, subject to: X1 + X2 ≤ ..., X2 ≤ 50, X1 ≥ 50, X2 ≥ 25, X1 ≥ 0, X2 ≥ 0.

Fundamentals of Optimization. Global versus local optimization: a local optimum is also the global optimum if the function is convex.

Fundamentals of Optimization. A function f is convex if, for any two points X1 and X2, its value along the segment joining them lies on or below the chord: f(λX1 + (1−λ)X2) ≤ λf(X1) + (1−λ)f(X2) for all λ in [0, 1]. Convexity condition: the Hessian (second-order derivative) matrix of f must be positive semidefinite (all eigenvalues positive or zero). [Figures: convex and nonconvex sets; a convex function]
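The eigenvalue test for convexity can be illustrated numerically. A quadratic f(x) = 0.5 xᵀQx + bᵀx has constant Hessian Q, so checking convexity reduces to checking Q's eigenvalues (the matrix below is a made-up example):

```python
import numpy as np

# Hypothetical quadratic: f(x) = 0.5 x^T Q x + b^T x, Hessian is Q everywhere
Q = np.array([[2.0, 0.5],
              [0.5, 1.0]])

eigvals = np.linalg.eigvalsh(Q)     # eigenvalues of the symmetric Hessian
is_convex = bool(np.all(eigvals >= 0))  # positive semidefinite -> convex
print(eigvals, is_convex)
```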

Mathematical Background. The slope, or gradient, of the objective function f represents the direction in which the function will decrease or increase most rapidly. Related tools: the Taylor series expansion, and the Jacobian, the matrix of gradients of f with respect to several variables.

Mathematical Background. Slope: the first-order condition (FOC) provides the function's slope information; at a stationary point the gradient is zero. The Hessian is the matrix of second derivatives of a function of several variables; its definiteness indicates a minimum (positive definite) or a maximum (negative definite). Second-order condition (SOC) for a minimum: the eigenvalues of H(X*) are all positive; equivalently, the determinants of all leading lower-order submatrices of H(X*) are positive.
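Checking the FOC and SOC at a candidate point can be sketched as follows (the function f(x, y) = x² + 3y² − 2x and its derivatives are a made-up example):

```python
import numpy as np

# Hypothetical f(x, y) = x^2 + 3y^2 - 2x
def grad(p):
    x, y = p
    return np.array([2*x - 2, 6*y])        # gradient of f

def hessian(p):
    return np.array([[2.0, 0.0],
                     [0.0, 6.0]])          # constant Hessian of f

x_star = np.array([1.0, 0.0])              # candidate from solving grad = 0
foc_ok = np.allclose(grad(x_star), 0.0)    # first-order condition
eig = np.linalg.eigvalsh(hessian(x_star))  # second-order condition
print(foc_ok, "minimum" if np.all(eig > 0) else "not a minimum")
```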

Optimization Algorithms
- Deterministic: specific rules to move from one iteration to the next, using the gradient and Hessian.
- Stochastic: probabilistic rules are used for subsequent iterations.
- Optimal design: engineering design based on an optimization algorithm.
- Lagrangian method: the sum of the objective function and a linear combination of the constraints.

Optimization Methods: Deterministic
- Direct search: uses objective-function values only to locate the minimum.
- Gradient-based: uses first or second derivatives of the objective function. The minimization form is standard; a maximization problem is handled by minimizing −f(x).

Single-variable techniques:
- Newton-Raphson: a gradient-based technique (uses the FOC).
- Golden-section search: an iterative interval-reduction method.

Multivariable techniques for unconstrained optimization (these often build on single-variable techniques, especially golden section):
- Powell's method: non-gradient-based, fits a quadratic (degree-2) polynomial to the objective.
- Gradient-based: steepest descent (FOC) or least-mean-squares (LMS).
- Hessian-based: conjugate gradient (FOC) and BFGS (SOC).
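The golden-section search mentioned above can be sketched in a few lines. This is a minimal, illustrative implementation (not the toolbox routine), applied to a made-up unimodal function:

```python
import math

def golden_section(f, a, b, tol=1e-6):
    """Minimize a unimodal f on [a, b] by shrinking the interval
    by the golden ratio (~0.618) at every iteration."""
    invphi = (math.sqrt(5) - 1) / 2
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    while b - a > tol:
        if f(c) < f(d):      # minimum lies in [a, d]
            b, d = d, c
            c = b - invphi * (b - a)
        else:                # minimum lies in [c, b]
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

# Hypothetical objective with minimum at x = 1.5
x_min = golden_section(lambda x: (x - 1.5)**2, 0.0, 4.0)
print(x_min)
```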

Optimization Methods: Constrained
- Indirect approach: transform into an unconstrained problem, e.g., the exterior penalty function (EPF) and augmented Lagrange multiplier methods.
- Direct methods: sequential linear programming (SLP), sequential quadratic programming (SQP), the generalized reduced gradient method (GRG), and steepest-descent or LMS gradient methods.
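The exterior penalty idea can be sketched as follows: the constrained problem is replaced by a sequence of unconstrained problems whose penalty weight grows, driving the solution toward feasibility. The objective and constraint below are made-up examples, and SciPy's unconstrained minimizer stands in for any inner solver:

```python
from scipy.optimize import minimize

# Hypothetical problem: minimize (x1-2)^2 + (x2-1)^2  s.t.  x1 + x2 = 2
f = lambda x: (x[0] - 2)**2 + (x[1] - 1)**2
h = lambda x: x[0] + x[1] - 2              # equality constraint h(x) = 0

x = [0.0, 0.0]                             # starting guess
for r in [1.0, 10.0, 100.0, 1e3, 1e4]:     # increasing penalty weight
    penalized = lambda x, r=r: f(x) + r * h(x)**2
    x = minimize(penalized, x).x           # warm-start from previous solution
print(x)
```

As r grows, the unconstrained minimizer approaches the constrained optimum (x1, x2) = (1.5, 0.5) for this example, with constraint violation of order 1/r.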

Advanced Optimization Methods. Global optimization with stochastic techniques:
- Simulated annealing (SA): based on the minimum-energy principle of the crystalline structure of cooling metal.
- Genetic algorithm (GA): based on the survival-of-the-fittest principle of evolutionary theory.
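A minimal simulated-annealing sketch, assuming a one-dimensional made-up test function and Gaussian proposal moves (illustrative only, not a production implementation):

```python
import math
import random

random.seed(0)

def anneal(f, x0, T0=5.0, cooling=0.95, steps=2000):
    """Accept a worse candidate with probability exp(-delta/T);
    the temperature T falls geometrically as the 'metal' cools."""
    x, fx, T = x0, f(x0), T0
    best_x, best_f = x, fx
    for _ in range(steps):
        cand = x + random.gauss(0.0, 0.5)  # random neighbor proposal
        fc = f(cand)
        if fc < fx or random.random() < math.exp(-(fc - fx) / T):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        T *= cooling
    return best_x, best_f

# Hypothetical multimodal function with its global minimum at x = 0
f = lambda x: x**2 + 10.0 * math.sin(x)**2
x_best, f_best = anneal(f, x0=4.0)
print(x_best, f_best)
```

The high early temperature lets the search escape local minima; the geometric cooling schedule gradually turns it into a greedy descent.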

Optimization Toolbox in MATLAB. Key features:
- Interactive tools for defining and solving optimization problems and monitoring solution progress
- Solvers for nonlinear and multiobjective optimization
- Solvers for nonlinear least squares, data fitting, and nonlinear equations
- Methods for solving quadratic and linear programming problems
- Methods for solving binary integer programming problems
- Parallel computing support in selected constrained nonlinear solvers

How to use the Optimization Toolbox
- Optimization functions: the objective can be provided directly in an M-file. Syntax: [x,fval] = fminsearch(fun,x0)
- Optimization Tool graphical user interface (GUI):
  - Define and modify problems quickly
  - Use the correct syntax for optimization functions
  - Import and export from the MATLAB workspace
  - Generate code containing your configuration for a solver and options
  - Change parameters of an optimization during the execution of certain Global Optimization Toolbox functions

Function Optimization. Optimization concerns the minimization or maximization of functions. The standard optimization problem: minimize f(x) subject to equality constraints, inequality constraints, and side constraints. Here f(x) is the objective function, which measures and evaluates the performance of a system, and x is a column vector of design variables, which affect the performance of the system. In a standard problem we minimize the function; maximization is equivalent to minimizing the negative of the objective function.

Function Optimization. Constraints are limitations on the design space and can be linear or nonlinear, explicit or implicit functions: equality constraints, inequality constraints, and side constraints. Most algorithms require inequality constraints in "less than or equal to" form.

Optimization Toolbox Solvers
- Minimizers: attempt to find a local minimum of the objective function near a starting point x0. They address problems of unconstrained optimization, linear programming, quadratic programming, and general nonlinear programming.
- Multiobjective minimizers: attempt either to minimize the maximum value of a set of functions (fminimax) or to find a location where a collection of functions is below some prespecified values (fgoalattain).
- Equation solvers: attempt to find a solution to a scalar- or vector-valued nonlinear equation f(x) = 0 near a starting point x0. Equation solving can be considered a form of optimization because it is equivalent to finding the minimum norm of f(x) near x0.
- Least-squares (curve-fitting) solvers: attempt to minimize a sum of squares, the type of problem that frequently arises in fitting a model to data. These solvers address finding nonnegative solutions, bounded or linearly constrained solutions, and fitting parameterized nonlinear models to data.

Objective function types: linear, quadratic, sum-of-squares (least squares), smooth nonlinear, nonsmooth.

Constraint types: none (unconstrained), bound, linear (including bound), general smooth, discrete (integer).

Select Solvers by Objective and Constraint

Minimization Algorithm

Minimization Algorithm (Cont.)

Equation Solving Algorithms

Least-Squares Algorithms

Implementing the Optimization Toolbox. Most of these optimization routines require the definition of an M-file containing the function f to be minimized; maximization is achieved by supplying the routines with −f. Optimization options passed to the routines change the optimization parameters; default parameters can be changed through an options structure.

Unconstrained Minimization. Consider the problem of finding the set of values [x1, x2]^T that minimizes the objective function. Steps:
1. Create an M-file that returns the function value (the objective function); call it objfun.m.
2. Invoke the unconstrained minimization routine fminunc.

Step 1 – Objective Function

    function f = objfun(x)
    f = exp(x(1))*(4*x(1)^2 + 2*x(2)^2 + 4*x(1)*x(2) + 2*x(2) + 1);

Step 2 – Invoke Routine

    x0 = [-1,1];                             % starting guess
    options = optimset('LargeScale','off');  % optimization parameter settings
    [xmin,feval,exitflag,output] = fminunc('objfun',x0,options);

The left-hand side holds the output arguments; the input arguments are the objective function, the starting guess, and the options structure.

Step 3 – Results

    xmin = ...                % minimum point of the design variables
    feval = ...e-010          % objective function value at the minimum
    exitflag = 1              % exitflag > 0 means a local minimum was found
    output =
        iterations: 7
        funcCount: 40
        stepsize: 1
        firstorderopt: ...e-004
        algorithm: 'medium-scale: Quasi-Newton line search'
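For comparison, the same objective can be minimized outside MATLAB. This is a sketch using SciPy's quasi-Newton (BFGS) minimizer on the identical function and starting point; it converges to the known solution x = [0.5, −1] with objective value 0:

```python
import numpy as np
from scipy.optimize import minimize

# Same objective as objfun.m
objfun = lambda x: np.exp(x[0]) * (4*x[0]**2 + 2*x[1]**2
                                   + 4*x[0]*x[1] + 2*x[1] + 1)

res = minimize(objfun, x0=[-1.0, 1.0], method="BFGS")
print(res.x, res.fun, res.nit)  # minimizer, objective value, iteration count
```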

More on fminunc – Input

    [xmin,feval,exitflag,output,grad,hessian] = fminunc(fun,x0,options,P1,P2,...)

- fun: the objective function.
- x0: the initial guess; it must be a vector whose size is the number of design variables.
- options: sets some of the optimization parameters (more in a few slides).
- P1, P2, ...: additional parameters passed to the objective function.

More on fminunc – Output

    [xmin,feval,exitflag,output,grad,hessian] = fminunc(fun,x0,options,P1,P2,...)

- xmin: vector of the minimum (optimal) point; its size is the number of design variables.
- feval: the objective function value at the optimal point.
- exitflag: shows whether the optimization routine terminated successfully (converged if > 0).
- output: a structure giving more details about the optimization.
- grad: the gradient value at the optimal point.
- hessian: the Hessian value at the optimal point.

Next Class: please bring your laptop and install MATLAB.