1 Constrained Optimization Rong Jin

2 Outline • Equality constraints • Inequality constraints • Linear Programming • Quadratic Programming

3 Optimization Under Equality Constraints • Maximum Entropy Model • English ‘in’ → French {dans (p1), en (p2), à (p3), au cours de (p4), pendant (p5)}

4 Reducing variables • Represent the variables using only p1 and p4 • The objective function changes accordingly • Solution: p1 = 0.2, p2 = 0.3, p3 = 0.1, p4 = 0.2, p5 = 0.2

5 Maximum Entropy Model for Classification • It is unlikely that we can use the previous simple approach to solve such a general problem • Solution: the Lagrangian

6 Equality Constraints: Lagrangian • Introduce a Lagrange multiplier for the equality constraint • Construct the Lagrangian • Necessary condition: an optimal solution for the original optimization problem has to be a stationary point of the Lagrangian
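The construction on this slide, written out in standard notation (a sketch; the slide's own equations are images and not transcribed): to maximize f(x) subject to g(x) = c,

```latex
\mathcal{L}(x,\lambda) = f(x) - \lambda\,\bigl(g(x) - c\bigr),
\qquad
\nabla_x \mathcal{L} = 0,
\qquad
\frac{\partial \mathcal{L}}{\partial \lambda} = -(g(x) - c) = 0 .
```

The second stationarity equation simply recovers the constraint.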

7 Example: • Introduce a Lagrange multiplier for the constraint • Construct the Lagrangian • Find the stationary points

8 Lagrange Multipliers • Introduce a Lagrange multiplier for each constraint • Construct the Lagrangian for the original optimization problem

9 Lagrange Multiplier • We now have more variables: p1, p2, p3, p4, p5 and λ1, λ2, λ3 • Necessary (first-order) condition: a local/global optimum of the original constrained optimization problem is a stationary point of the corresponding Lagrangian (slide labels: Original Entropy Function, Constraints)

10 Stationary Points for Lagrangian • All probabilities p1, p2, p3, p4, p5 are expressed as functions of the Lagrange multipliers λ

11 Dual Problem • p1, p2, p3, p4, p5 are expressed as functions of the λ's • We can even remove the variable λ3 • Put the necessary conditions together • Still difficult to solve

13 Dual Problem • Dual problem: substitute the expressions for the p's into the Lagrangian • Find the λ's that MINIMIZE the substituted Lagrangian
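The dual construction can be made concrete on a stripped-down version of the problem (an assumption for illustration: only the normalization constraint is kept). For max H(p) s.t. Σ pi = 1, stationarity of L = H(p) + λ(Σ pi − 1) gives pi = e^(λ−1); substituting back, the dual objective is g(λ) = n·e^(λ−1) − λ, which is then MINIMIZED over λ:

```python
import math

def solve_dual(n=5, steps=200, lr=0.1):
    """Minimize the substituted Lagrangian g(lam) = n*exp(lam-1) - lam."""
    lam = 0.0
    for _ in range(steps):
        grad = n * math.exp(lam - 1.0) - 1.0  # g'(lam)
        lam -= lr * grad                      # gradient descent (MINIMIZE)
    p_i = math.exp(lam - 1.0)                 # recover each probability
    return lam, p_i

lam, p_i = solve_dual()
```

The minimizer is λ = 1 − ln n, giving pi = 1/n: the uniform distribution, as expected when only the normalization constraint is imposed.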

14 Dual Problem • Find the λ's such that the above objective function is minimized (slide labels: Original Lagrangian, Substituted Lagrangian, Expression for the p's)

15 Dual Problem • Using the dual problem: constrained optimization → unconstrained optimization • Need to change maximization to minimization • Only valid when the original optimization problem is convex/concave (strong duality) • Primal Problem ↔ Dual Problem: the primal optimum x* corresponds to the dual optimum λ* when the problem is convex/concave

16 Maximum Entropy Model for Classification • Introduce a Lagrange multiplier for each linear constraint

17 Maximum Entropy Model for Classification • Construct the Lagrangian for the original optimization problem (slide labels: Original Entropy Function, Consistency Constraint, Normalization Constraint)

18 Stationary Points • Stationary points: the first derivatives are zero • The sum of the conditional probabilities must be one • Result: a Conditional Exponential Model!
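The Conditional Exponential Model the slide arrives at has the standard maximum-entropy form (symbols here are the usual ones, not necessarily the slide's):

```latex
p_{\lambda}(y \mid x) = \frac{1}{Z_{\lambda}(x)} \exp\Bigl(\sum_i \lambda_i f_i(x, y)\Bigr),
\qquad
Z_{\lambda}(x) = \sum_{y'} \exp\Bigl(\sum_i \lambda_i f_i(x, y')\Bigr),
```

where the fi are the feature functions behind the consistency constraints and Zλ(x) enforces the normalization constraint.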

19 Dual Problem

22 What is wrong?

23 Dual Problem

27 Minimizing L is equivalent to maximizing the log-likelihood
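The equivalence claimed on this slide is the standard maximum-entropy duality result, stated here in generic notation for training pairs (xk, yk):

```latex
\min_{\lambda} \; \mathcal{L}(\lambda)
\;\Longleftrightarrow\;
\max_{\lambda} \; \sum_{k} \log p_{\lambda}(y_k \mid x_k),
```

i.e. minimizing the substituted Lagrangian over the multipliers is the same as maximum-likelihood training of the conditional exponential model.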

28 Support Vector Machine • Has many inequality constraints • Solving the above problem directly could be difficult • Many variables: w, b, and the slack variables • Unable to use a nonlinear kernel function

29 Inequality Constraints: Modified Lagrangian • Introduce a Lagrange multiplier for the inequality constraint • Construct the Lagrangian • Karush-Kuhn-Tucker (KKT) conditions: an optimal solution for the original optimization problem will satisfy the following conditions • Non-negative Lagrange multiplier • Two cases: 1. g(x) = c (constraint active), 2. g(x) > c ⇒ λ = 0
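The KKT conditions sketched on the slide, in standard form for maximizing f(x) subject to g(x) ≥ c:

```latex
\nabla_x \bigl[\, f(x) + \lambda\,(g(x) - c) \,\bigr] = 0,
\qquad
\lambda \ge 0,
\qquad
g(x) \ge c,
\qquad
\lambda \,(g(x) - c) = 0 .
```

The last condition (complementary slackness) encodes the slide's two cases: either the constraint is active, g(x) = c, or it is slack, g(x) > c, and then λ = 0.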

30 Example: • Introduce a Lagrange multiplier for the constraint • Construct the Lagrangian • KKT conditions • Express the objective function using λ • Solution is λ = 3
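A tiny worked instance of the same recipe (a hypothetical example, not the slide's, whose own problem appears only as an image): minimize f(x) = x² subject to x ≥ 1. The Lagrangian is L(x, a) = x² − a(x − 1) with a ≥ 0; minimizing over x gives x = a/2 and the dual g(a) = a − a²/4, which is maximized over a ≥ 0:

```python
def dual(a):
    """Substituted Lagrangian for: min x^2 s.t. x >= 1."""
    x = a / 2.0                  # unconstrained minimizer of L(x, a) over x
    return x * x - a * (x - 1.0)

# Projected gradient ascent on the concave dual, keeping a >= 0.
a = 0.0
for _ in range(1000):
    grad = 1.0 - a / 2.0         # g'(a)
    a = max(0.0, a + 0.1 * grad)

x_star = a / 2.0                 # primal solution recovered from the dual
```

The result is a = 2 and x* = 1: the constraint is active, complementary slackness a·(x* − 1) = 0 holds, and the dual optimum g(2) = 1 equals the primal optimum f(1) = 1.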

33 SVM Model • Lagrange multipliers for the inequality constraints • The Min problem is paired with a Max over the non-negative multipliers

34 SVM Model • Lagrangian for the SVM model • Karush-Kuhn-Tucker conditions

36 Dual Problem for SVM • Express w, b, and the slacks using α and β

37 Dual Problem for SVM • Express w, b, and the slacks using α and β • Finding a solution satisfying the KKT conditions is difficult

38 Dual Problem for SVM • Rewrite the Lagrangian function using only α and β • Simplify using the KKT conditions

39 Dual Problem for SVM • Final dual problem • Maximize → Minimize
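The final dual problem referred to above, in the standard soft-margin form (the slide's exact notation is not transcribed; α are the multipliers on the margin constraints, C the slack penalty):

```latex
\max_{\alpha} \; \sum_i \alpha_i
  - \frac{1}{2} \sum_{i,j} \alpha_i \alpha_j \, y_i y_j \, \langle x_i, x_j \rangle
\quad \text{s.t.} \quad
0 \le \alpha_i \le C, \qquad \sum_i \alpha_i y_i = 0,
```

with the primal solution recovered as w = Σi αi yi xi. Because the data appear only through inner products ⟨xi, xj⟩, a nonlinear kernel K(xi, xj) can be substituted directly.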

40 Quadratic Programming Find Subject to
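A common standard form for the quadratic program on this slide (a sketch; the slide's matrices are not transcribed):

```latex
\min_{x} \;\; \tfrac{1}{2}\, x^{\top} Q x + c^{\top} x
\quad \text{s.t.} \quad
A x \le b,
```

with Q symmetric positive semidefinite so that the problem is convex. The SVM dual is exactly such a QP.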

41 Linear Programming • Find … Subject to … • A very, very useful algorithm: 1300+ papers, 100+ books, 10+ courses, 100s of companies • Main methods: Simplex method, Interior-point method • Most important: how to convert a general problem into the standard form above
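The standard form the slide asks us to convert into is typically (a sketch; the slide's own formula is not transcribed):

```latex
\min_{x} \; c^{\top} x
\quad \text{s.t.} \quad
A x = b, \qquad x \ge 0,
```

and the example slides that follow (max → min, flipping an inequality, handling |x3|) are all conversions into this shape.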

42 Example • Need to change max to min • Find … Subject to …

43 Example • Need to change the direction of the inequality • Find … Subject to …

44 Example • Need to convert the inequality • Find … Subject to …

45 Example • Need to handle |x3| • Find … Subject to …
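The usual way to linearize |x3| (the standard LP trick, sketched here since the slide's formulas are not transcribed) is to split it into a positive and a negative part:

```latex
x_3 = u - v, \qquad u \ge 0, \; v \ge 0, \qquad |x_3| \;\to\; u + v .
```

When |x3| is being minimized, the optimum drives one of u, v to zero, so u + v equals |x3| at the solution.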

