L10 Optimal Design: Lagrange Multipliers


L10 Optimal Design: Lagrange Multipliers. Agenda: Homework Review; Meaning & Use of the Lagrange Multiplier; Summary.

Homework 4.44: The problem is now stated as a "minimize." We have only applied the necessary conditions, so we cannot yet conclude that the candidate point is a MIN!

4.44 cont'd: H(x) is negative definite, so the candidate point is not a local min of f (and therefore point A is NOT a max of the original F(x)). Is the problem unbounded?
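A quick numerical sketch of this definiteness test. The Hessian below is a hypothetical placeholder, not the actual 4.44 data; only the eigenvalue-sign logic is the point.

import numpy as np

# Hypothetical Hessian of f(x) = -F(x) at the candidate point (not the 4.44 values).
H = np.array([[-4.0,  1.0],
              [ 1.0, -2.0]])

eigvals = np.linalg.eigvalsh(H)  # symmetric matrix -> real eigenvalues

if np.all(eigvals > 0):
    print("positive definite: candidate is a local min of f")
elif np.all(eigvals < 0):
    print("negative definite: candidate is a local max of f, so NOT a local min")
else:
    print("indefinite/semidefinite: second-order test is inconclusive here")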

Prob 4.54

Gaussian Elimination, Case 2: multiply R1 by -1 and add to R2; multiply R3 by -13 and add to R2.
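The elimination steps named above can be reproduced mechanically; the sketch below uses a made-up augmented matrix (not the 4.54 system) just to show the "multiply a row, add to another" pattern.

import numpy as np

# Hypothetical augmented matrix [A | b] for the Case 2 linear system (not the 4.54 data).
M = np.array([[ 1.0,  2.0,  1.0,  4.0],
              [ 1.0, -1.0,  0.0,  1.0],
              [ 0.0,  1.0,  2.0,  3.0]])

M[1] += -1.0 * M[0]   # multiply R1 by -1 and add the result to R2
M[1] += -13.0 * M[2]  # multiply R3 by -13 and add the result to R2
print(M)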

Prob 4.57

Gaussian Elimination 4.57, Case 1 (u = 0): add R1 to R2; multiply R2 by -1/2 and add to R3. Check feasibility. Back-substitute using R2.

Gaussian Elimination 4.57, Case 2 (s = 0): add R3 to R4. Check feasibility. Back-substitute using R3.
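Back-substitution and the feasibility check can be sketched the same way, again with a hypothetical upper-triangular system rather than the 4.57 numbers.

import numpy as np

# Hypothetical reduced system U x = c obtained after elimination (not the 4.57 data).
U = np.array([[2.0, 1.0, -1.0],
              [0.0, 3.0,  1.0],
              [0.0, 0.0,  2.0]])
c = np.array([3.0, 4.0, 2.0])

x = np.zeros(3)
for i in reversed(range(3)):                      # back-substitution, bottom row first
    x[i] = (c[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]

print("candidate point:", x)
# The feasibility check would then verify g_i(x) <= 0 and multipliers u_i >= 0.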

Prob 4.57

Prob 4.59

Prob 4.59

Prob 4.59

Gaussian Elimination 4.59, Case 4 (s1 = s2 = 0): add R3 to R4. Back-substitute using R3. Check feasibility. Both s1 and s2 = 0.

Prob 4.59: Where is each case located? Case 1, Case 2, Case 3, Case 4.
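The four cases come from the switching conditions u_i g_i = 0: each case fixes either u_i = 0 or g_i = 0 for each inequality. A small sketch of that enumeration for a generic two-inequality problem (not the actual 4.59 functions):

from itertools import product

# Switching conditions u_i * g_i = 0 for two inequalities give 2**2 = 4 cases.
for case, choice in enumerate(product(("u = 0", "g = 0 (active)"), repeat=2), start=1):
    print(f"Case {case}: constraint 1 -> {choice[0]}, constraint 2 -> {choice[1]}")
    # In each case: solve the stationarity equations plus the active constraints
    # (e.g. by Gaussian elimination), then check g_i <= 0 and u_i >= 0.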

Multivariable Optimization: Inequality & Equality Constrained

KKT Necessary Conditions for a Min. Regularity check: the gradients of the active constraints (equalities and active inequalities) are linearly independent.
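For reference, a standard statement of these conditions for min f(x) subject to h_j(x) = 0 and g_i(x) <= 0 (the usual textbook form assumed here):

L(x, v, u) = f(x) + \sum_j v_j h_j(x) + \sum_i u_i g_i(x)

\nabla f(x^*) + \sum_j v_j^* \nabla h_j(x^*) + \sum_i u_i^* \nabla g_i(x^*) = 0

h_j(x^*) = 0, \quad g_i(x^*) \le 0, \quad u_i^* \ge 0, \quad u_i^* g_i(x^*) = 0 \ \text{(switching conditions)}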

Relax both constraints (Prob 4.59)

Constraint Variation Sensitivity Theorem: the instantaneous rate of change of the optimum objective value with respect to relaxing a constraint IS the Lagrange multiplier!
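One common way to write the theorem (sign conventions vary with how the Lagrangian is defined; the form below assumes perturbed constraints h_j(x) = b_j and g_i(x) \le e_i):

\frac{\partial f(b, e)}{\partial b_j} = -v_j^*, \qquad \frac{\partial f(b, e)}{\partial e_i} = -u_i^*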

Practical Use of Multipliers in 4.59: the first-order approximation of the change in f(x) from relaxing a constraint is obtained from a Taylor series expansion. Here f(actual) = 1 versus f(approx) = 0.
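A hedged sketch of that estimate: expanding the optimum value to first order in small changes \delta b_j and \delta e_i of the constraint limits gives

f(b + \delta b,\, e + \delta e) \approx f^* + \sum_j \frac{\partial f}{\partial b_j}\,\delta b_j + \sum_i \frac{\partial f}{\partial e_i}\,\delta e_i = f^* - \sum_j v_j^*\,\delta b_j - \sum_i u_i^*\,\delta e_i

The slide's f(approx) = 0 versus f(actual) = 1 compares this linear estimate with re-solving the perturbed 4.59 problem exactly.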

Summary: Min = -Max, i.e. f(x) = -F(x). Necessary conditions for a min: a KKT point is a CANDIDATE min (sufficient conditions are needed for proof). Use the switching conditions and Gaussian elimination to find KKT points. Lagrange multipliers are the instantaneous rate of change of f(x) with respect to constraint relaxation.