Computational Optimization


Convexity and Unconstrained Optimization (1/26-1/29)

Convex Sets
A set S is convex if the line segment joining any two points in the set is also in the set, i.e., for any x, y ∈ S, λx + (1-λ)y ∈ S for all 0 ≤ λ ≤ 1.
[Figure: examples of convex and non-convex sets]

Proving Convexity
Prove {x : Ax ≤ b} is convex.
For any x, y in the set and 0 ≤ λ ≤ 1, A(λx + (1-λ)y) = λAx + (1-λ)Ay ≤ λb + (1-λ)b = b, so λx + (1-λ)y is also in the set.

You Try
Prove D = {x : ||x|| ≤ 1} is convex.
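A quick numerical sanity check of this definition (not a proof) can be sketched in Python with NumPy; the sample size, dimension, and tolerance below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def in_unit_ball(x):
    """Membership test for D = {x : ||x|| <= 1}."""
    return np.linalg.norm(x) <= 1 + 1e-12  # small tolerance for round-off

# Sample random pairs of points in D and random lambdas in [0, 1];
# every convex combination should remain in D.
for _ in range(1000):
    x = rng.uniform(-1, 1, size=3); x /= max(1.0, np.linalg.norm(x))
    y = rng.uniform(-1, 1, size=3); y /= max(1.0, np.linalg.norm(y))
    lam = rng.uniform()
    assert in_unit_ball(lam * x + (1 - lam) * y)
print("no violations found")
```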

Convex Functions
A function f is (strictly) convex on a convex set S if and only if for any x, y ∈ S, f(λx + (1-λ)y) ≤ λf(x) + (1-λ)f(y) for all 0 ≤ λ ≤ 1; for strict convexity the inequality is strict whenever x ≠ y and 0 < λ < 1.
[Figure: the chord joining (x, f(x)) and (y, f(y)) lies on or above the graph of f between x and y]

Proving Function Convex
Linear functions f(x) = a'x + b are convex (and also concave): f(λx + (1-λ)y) = λf(x) + (1-λ)f(y), so the defining inequality holds with equality.
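As an illustration of the defining inequality, the sketch below (Python/NumPy, with arbitrarily chosen points and functions, not the course's examples) checks that a linear function meets it with equality and a strictly convex quadratic meets it strictly:

```python
import numpy as np

a = np.array([2.0, -1.0])

def f_lin(x):          # linear: both convex and concave
    return a @ x + 3.0

def f_quad(x):         # strictly convex quadratic
    return x @ x

x, y, lam = np.array([1.0, 0.0]), np.array([-2.0, 4.0]), 0.3
z = lam * x + (1 - lam) * y

# Linear function: f(z) equals the convex combination of values.
assert np.isclose(f_lin(z), lam * f_lin(x) + (1 - lam) * f_lin(y))

# Strictly convex quadratic: strict inequality for x != y, 0 < lam < 1.
assert f_quad(z) < lam * f_quad(x) + (1 - lam) * f_quad(y)
```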

You Try

Convexity and Curvature
Convex functions have nonnegative curvature everywhere. Curvature is measured by the second derivative in one dimension and by the Hessian in several. Definiteness properties of the Hessian indicate whether a function is convex.

Review of 1D Optimality Conditions
First Order Necessary Condition: if x* is a local min, then f'(x*) = 0.
If f'(x*) = 0, then x* may be a local min, a local max, or neither (consider f(x) = x³ at x = 0): the condition is necessary but not sufficient.

Theorem
Let f be twice continuously differentiable. f(x) is convex on S if and only if for all x ∈ S, the Hessian at x is positive semi-definite.

Definition
The matrix H is positive semi-definite (p.s.d.) if and only if y'Hy ≥ 0 for every vector y.
The matrix H is positive definite (p.d.) if and only if y'Hy > 0 for every nonzero vector y.
Negative (semi-)definite is defined similarly, with the inequalities reversed.
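To make the quadratic form y'Hy concrete, here is a small Python/NumPy sketch; the matrix H is an arbitrary example, and random sampling illustrates rather than proves definiteness:

```python
import numpy as np

H = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # an example symmetric matrix

rng = np.random.default_rng(1)
for _ in range(1000):
    y = rng.normal(size=2)
    q = y @ H @ y            # the quadratic form y'Hy
    assert q > 0             # consistent with H being positive definite
print("y'Hy > 0 for all sampled nonzero y")
```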

Theorem
Let f be twice continuously differentiable. If the Hessian is positive definite at every x in S, then f(x) is strictly convex on S. (The converse fails: f(x) = x⁴ is strictly convex, yet f''(0) = 0.)

Checking Whether a Matrix H is p.s.d./p.d. Manually

Via Eigenvalues
The eigenvalues of H are the scalars λ satisfying Hv = λv for some nonzero vector v. For a symmetric matrix they are real, and their signs determine definiteness.

Using Eigenvalues
If all eigenvalues are positive, the matrix is positive definite (p.d.).
If all eigenvalues are nonnegative, the matrix is positive semi-definite (p.s.d.).
If all eigenvalues are negative, the matrix is negative definite (n.d.).
If all eigenvalues are nonpositive, the matrix is negative semi-definite (n.s.d.).
Otherwise the matrix is indefinite.
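A sketch of this classification in Python/NumPy (it assumes H is symmetric, so eigvalsh applies; the tolerance is a judgment call for floating-point round-off):

```python
import numpy as np

def classify(H, tol=1e-10):
    """Classify a symmetric matrix by the signs of its eigenvalues."""
    w = np.linalg.eigvalsh(H)        # real eigenvalues of a symmetric matrix
    if np.all(w > tol):
        return "positive definite"
    if np.all(w >= -tol):
        return "positive semi-definite"
    if np.all(w < -tol):
        return "negative definite"
    if np.all(w <= tol):
        return "negative semi-definite"
    return "indefinite"

print(classify(np.array([[2.0, 1.0], [1.0, 2.0]])))   # positive definite
print(classify(np.array([[1.0, 0.0], [0.0, -1.0]])))  # indefinite
```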

Try with Hessians

Line Search
Assume f maps a vector to a scalar, f: Rⁿ → R. From the current point x we search along a direction d, over an interval of stepsizes α, to find the stepsize minimizing φ(α) = f(x + αd).

Review of 1D Optimality Conditions
A line search is a one-dimensional problem, so the 1D conditions apply: at an optimal stepsize α*, φ'(α*) = ∇f(x + α*d)'d = 0.

2nd Derivatives - 1D Case
Sufficient conditions:
If f'(x*) = 0 and f''(x*) > 0, then x* is a strict local min.
If f'(x*) = 0 and f''(x*) < 0, then x* is a strict local max.
Necessary conditions:
If x* is a local min, then f'(x*) = 0 and f''(x*) ≥ 0.
If x* is a local max, then f'(x*) = 0 and f''(x*) ≤ 0.
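For instance, a short Python sketch applying these tests to f(x) = x³ - 3x (my example, not from the slides):

```python
# Classify the stationary points of f(x) = x**3 - 3*x by the 1D tests.
def fp(x):   # f'(x) = 3x^2 - 3, zero at x = -1 and x = 1
    return 3 * x**2 - 3

def fpp(x):  # f''(x) = 6x
    return 6 * x

for x_star in (-1.0, 1.0):
    assert abs(fp(x_star)) < 1e-12          # FONC holds
    kind = "strict local min" if fpp(x_star) > 0 else \
           "strict local max" if fpp(x_star) < 0 else "inconclusive"
    print(f"x* = {x_star}: f''(x*) = {fpp(x_star)} -> {kind}")
# x* = -1 is a strict local max; x* = 1 is a strict local min.
```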

Example
Say we are minimizing a quadratic function whose solution is [8 2]'. Say we are at [0 -1]' and we want to do a line search in direction d = [1 0]'.

Line Search
We are at [0 -1]' and we want to do a line search in direction d = [1 0]'.
[Figure: contours showing the minimizer [8 2]', the current point [0 -1]', the direction d, and the line-search minimum [29/4 -1]']

Descent Directions
If the directional derivative ∇f(x)'d is negative, then a line search along d will decrease the function.
[Figure: contours with the direction d drawn at [0 -1]', pointing toward the minimizer [8 2]']

Example continued
For a quadratic the exact stepsize has a closed form, α* = -∇f(x)'d / (d'Hd); here α* = 29/4.

Example continued
So the new point is x + α*d = [0 -1]' + (29/4)[1 0]' = [29/4 -1]'.
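The course's quadratic did not survive the transcript, so as a stand-in here is the exact-stepsize computation for a generic quadratic f(x) = ½x'Qx - b'x in Python/NumPy; Q and b below are placeholder values (not the slides' example), while x0 and d match the slides:

```python
import numpy as np

Q = np.array([[2.0, 0.0],
              [0.0, 4.0]])        # placeholder positive definite Hessian
b = np.array([4.0, 8.0])          # placeholder linear term
x0 = np.array([0.0, -1.0])        # current point from the slides
d = np.array([1.0, 0.0])          # search direction from the slides

g = Q @ x0 - b                    # gradient of f(x) = 0.5 x'Qx - b'x at x0
assert g @ d < 0, "d must be a descent direction"

alpha = -(g @ d) / (d @ Q @ d)    # exact minimizer of f(x0 + alpha*d)
x1 = x0 + alpha * d
print("stepsize:", alpha, "new point:", x1)
```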

Differentiability and Convexity
For a convex function, the linear approximation at any point underestimates the function.
[Figure: graph of f(x) with the tangent line at (x*, f(x*)) lying below the curve]

Theorem
Assume f is continuously differentiable on a convex set S. f is convex on S if and only if f(y) ≥ f(x) + ∇f(x)'(y - x) for all x, y ∈ S.
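A numerical illustration of this characterization in Python/NumPy; the convex function and the sampled points are my choices, and sampling illustrates rather than proves the claim:

```python
import numpy as np

def f(x):      # a convex quadratic
    return x @ x

def grad(x):
    return 2 * x

rng = np.random.default_rng(2)
for _ in range(1000):
    x, y = rng.normal(size=2), rng.normal(size=2)
    # First-order characterization: the tangent plane underestimates f.
    assert f(y) >= f(x) + grad(x) @ (y - x) - 1e-12
print("linear approximation never overestimated f")
```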

Theorem
Consider the unconstrained problem min f(x). If ∇f(x*) = 0 and f is convex, then x* is a global minimum.
Proof: by the previous theorem, f(y) ≥ f(x*) + ∇f(x*)'(y - x*) = f(x*) for every y.

Unconstrained Optimality Conditions
Basic Problem: (1) min f(x) subject to x ∈ S, where S is an open set, e.g. Rⁿ.

First Order Necessary Conditions
Theorem: Let f be continuously differentiable. If x* is a local minimizer of (1), then ∇f(x*) = 0.

Stationary Points
Note that this condition is not sufficient: ∇f(x*) = 0 also holds at local maxima and at saddle points.

Proof
Assume false, i.e., ∇f(x*) ≠ 0. Then d = -∇f(x*) satisfies ∇f(x*)'d = -||∇f(x*)||² < 0, so d is a descent direction and f(x* + αd) < f(x*) for all sufficiently small α > 0, contradicting the local minimality of x*.

Second Order Sufficient Conditions
Theorem: Let f be twice continuously differentiable. If ∇f(x*) = 0 and ∇²f(x*) is positive definite, then x* is a strict local minimizer of (1).

Proof
Any point x in a neighborhood of x* can be written as x* + αd for some vector d with norm 1 and 0 < α ≤ ε*. Since f is twice continuously differentiable, we can choose ε* such that d'∇²f(x* + tαd)d > 0 for every such d and α and every t ∈ (0, 1). Taylor's theorem then gives f(x* + αd) = f(x*) + α∇f(x*)'d + (α²/2)d'∇²f(x* + tαd)d > f(x*), since ∇f(x*) = 0.

Second Order Necessary Conditions
Theorem: Let f be twice continuously differentiable. If x* is a local minimizer of (1), then ∇f(x*) = 0 and ∇²f(x*) is positive semi-definite.

Proof by contradiction
Assume false, namely that there exists some d such that d'∇²f(x*)d < 0. Since ∇f(x*) = 0, a Taylor expansion gives f(x* + αd) = f(x*) + (α²/2)d'∇²f(x*)d + o(α²) < f(x*) for all sufficiently small α > 0, contradicting the local minimality of x*.

Example
Say we are minimizing the same function as before. Is [8 2]' the minimizer?

Solve FONC
Solve the FONC ∇f(x) = 0 to find the stationary point x*.

Check SOSC
Check that the Hessian at x* is positive definite.

Alternative Argument
The Hessian at every value x is positive definite, so f is strictly convex and the stationary point is the unique global minimum.

You Try
Use the FONC and MATLAB to find the solution of the given problem. Are the SOSC satisfied? Are the SONC? Is f convex?
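The original objective did not survive the transcript, and the slide asks for MATLAB; as a stand-in, this Python/NumPy sketch runs the same workflow (FONC, SOSC/SONC, convexity) on an assumed quadratic that is not the course's problem:

```python
import numpy as np

# Assumed objective (placeholder, not the course's): f(x) = 0.5 x'Qx - b'x.
Q = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])

# FONC: grad f(x) = Qx - b = 0  =>  solve the linear system Qx = b.
x_star = np.linalg.solve(Q, b)
print("stationary point:", x_star)

# SOSC: the Hessian Q must be positive definite at x*.
eigs = np.linalg.eigvalsh(Q)
print("Hessian eigenvalues:", eigs)
print("SOSC satisfied:", bool(np.all(eigs > 0)))   # this also implies SONC

# Convexity: the Hessian is Q everywhere, so f is convex iff Q is p.s.d.
print("f convex:", bool(np.all(eigs >= 0)))
```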