1 Discrete Optimization Shi-Chung Chang

2 Discrete Optimization Lecture #1 Today: Reading Assignments 1. Chapter 1 and the Appendix of [PaS82] 2. Chapter 1 of [GaJ79]

3 Outline:
1. Course Overview
   » A taxonomy of optimization problems
   » Course introduction
   » Requirements and schedule
2. Some Basics of Optimization
   » Local and global optima
   » Feasibility
   » Convexity
   » Convex programming
3. Algorithms and Complexity
   » Problems, algorithms, and complexity
   » Polynomial-time algorithms
   » Intractability
   » NP-complete problems

4 §I.1 Course Overview Introduction to Optimization Problems
Ingredients of an optimization problem:
» A set of independent variables or parameters
» Conditions or restrictions (constraints) on the values of the variables
» A criterion or objective ⇒ find the best solution

5 Example 1: the horse-raising problem (養馬問題): "wanting the horse to be fat, yet also wanting it not to eat grass."
[Figure: amount of grass X fed to the horse; the horse's price P(X) and the grass expense E(X) both grow with X.]

6 A Standard Mathematical Form
max F(x) over x ∈ S
subject to  h_i(x) = 0,  i = 1, …, m
            g_j(x) ≤ 0,  j = 1, …, r
Classifications
(1) By the time factor: static / dynamic
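Purely as an illustration (not part of the lecture), a problem in this standard form can be handed to a numerical solver. A minimal sketch, assuming SciPy is installed, with a made-up objective and constraints; note SciPy's convention is g(x) ≥ 0 for inequalities, the opposite sign of the slide:

```python
# Illustrative sketch only: a made-up instance of the standard form
#   max F(x)  s.t.  h(x) = 0,  g(x) <= 0
# solved with SciPy's general-purpose minimizer.
import numpy as np
from scipy.optimize import minimize

F = lambda x: -(x[0] * x[1])            # maximize x0*x1 -> minimize its negative
h = {'type': 'eq',   'fun': lambda x: x[0] + x[1] - 10}   # equality constraint h(x) = 0
g = {'type': 'ineq', 'fun': lambda x: x[0] - 1}           # SciPy inequality means fun(x) >= 0

res = minimize(F, x0=np.array([1.0, 1.0]), constraints=[h, g])
print(res.x)   # solver should report roughly [5, 5]
```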

7 Example 2: the horse-raising problem, dynamic version (養馬問題)
max over u(t):  P(x(T)) − ∫ c(t) E(u(t)) dt
subject to  ẋ(t) = …  (the growth dynamics)
            a ≤ u(t) ≤ b
            0 ≤ x(t) ≤ R
where c(t) is the grass price factor.
(2) By the nature of the variables
» x continuous ⇒ nonlinear programming
» x in a discrete set S ⇒ discrete (combinatorial) optimization
» both ⇒ mixed integer programming

8 (3) By the nature of the problem functions

Properties of F(x)                        Properties of g(x) & h(x)
Function of a single variable             No constraints
Linear function                           Simple bounds
Sum of squares of linear functions        Linear functions
Quadratic functions                       Sparse linear functions
Sum of squares of nonlinear functions     Smooth nonlinear functions
Smooth function                           Sparse nonlinear functions
Non-smooth function                       Non-smooth nonlinear functions

9 In this course, we will consider problems with discrete variables.
Example: The Traveling Salesman Problem (TSP)
Cities 1, 2, …, n; distance between cities d(i, j)
⇒ Find the shortest path that goes through every city once and only once and comes back to the starting city.
Q: Why is this a discrete optimization problem?
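One way to see the discrete nature (my own illustration, not from the slides): the feasible set is the finite collection of tours, so a tiny instance can be solved by enumerating permutations. The distance matrix below is hypothetical:

```python
# Brute-force TSP on a toy distance matrix: the feasible set is the finite
# set of tours (permutations), which is what makes this problem discrete.
from itertools import permutations

d = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 3],
     [10, 4, 3, 0]]          # hypothetical symmetric distances d(i, j)
n = len(d)

def tour_length(tour):
    # sum the tour edges, closing the tour back to the starting city
    return sum(d[tour[k]][tour[(k + 1) % n]] for k in range(n))

best = min(permutations(range(n)), key=tour_length)
print(best, tour_length(best))
```

Enumeration works only for toy sizes: there are (n − 1)!/2 distinct tours, which is exactly why better algorithms (and complexity theory) matter later in the course.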

10 Course Introduction
» Nonlinear programming
» Convex programming
» Linear programming
» Integer programming

11 Linear Programming
minimize over x:  c'x
subject to  A₁x = b₁
            A₂x ≤ b₂
Fact: an optimal solution occurs at a vertex of the feasible polyhedron
⇒ discrete (combinatorial) nature
⇒ LP serves as a bridge between continuous and discrete optimization
[Figure: a 2-D feasible region in (x₁, x₂) with the cost vector c pointing toward an optimal vertex]
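A minimal numerical sketch of the vertex fact, assuming SciPy is available; the data below are made up for illustration:

```python
# Tiny LP in the slide's form  min c'x  s.t.  A2 x <= b2,  x >= 0.
# The optimum reported by the solver lies at a vertex of the polyhedron.
import numpy as np
from scipy.optimize import linprog

c  = np.array([-1.0, -2.0])                 # minimize c'x
A2 = np.array([[1.0, 1.0], [1.0, 3.0]])     # A2 x <= b2
b2 = np.array([4.0, 6.0])

res = linprog(c, A_ub=A2, b_ub=b2, bounds=[(0, None), (0, None)])
print(res.x)    # the vertex [3, 1], where the two constraints intersect
```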

12 Optimization on Networks
Example: the general minimum-cost flow problem
minimize  Σ_(i,j) c_ij x_ij
subject to  Σ_j x_ij − Σ_j x_ji = b_i,  i = 1, 2, …, n   [flow balance]
            l_ij ≤ x_ij ≤ u_ij                           [flow bounds]
⇒ an LP with special constraint structure
⇒ efficient solution techniques available by exploiting such a structure
⇒ Variations: shortest path problem, minimum spanning tree, maximum flow, minimum-cost flow
[Figure: a small network with arc costs and flow bounds [l_ij, u_ij] on the arcs]
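As a sketch of what exploiting the network structure buys you (not from the slides, and assuming the NetworkX package is installed), a toy minimum-cost flow instance:

```python
# Toy minimum-cost flow: flow balance is encoded via node 'demand'
# (negative at sources, positive at sinks), bounds via edge 'capacity',
# and costs via edge 'weight'.
import networkx as nx

G = nx.DiGraph()
G.add_node("s", demand=-4)          # supplies 4 units
G.add_node("a", demand=0)           # pure transshipment node
G.add_node("t", demand=4)           # needs 4 units
G.add_edge("s", "a", capacity=3, weight=2)
G.add_edge("s", "t", capacity=2, weight=7)
G.add_edge("a", "t", capacity=3, weight=1)

flow = nx.min_cost_flow(G)          # network-simplex style solver
print(flow)                          # {'s': {'a': 3, 't': 1}, 'a': {'t': 3}, ...}
```

The same instance could be written as a plain LP, but the specialized network solver is far faster on large graphs, which is the point the slide makes.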

13 Emphases:
(1) Exploiting special problem structure
(2) Applications in EE & CS
(3) Distributed or parallel algorithms
(4) Foundations for problems with nonlinear objectives or other discrete optimization problems

14 Integer Programming
» Basic techniques for general problems ⇒ NP-completeness
» TSP
» Knapsack problems (min … s.t. …)
» Scheduling problems
» Simulated annealing: an approach to global optimization
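The slide names simulated annealing only in passing; below is a rough, self-contained sketch (my own, not the course's algorithm) of annealing applied to TSP with 2-change moves. The function name `anneal` and the parameter values T0, alpha, and steps are arbitrary illustrative choices:

```python
# Simulated annealing for TSP using 2-change (segment-reversal) moves.
import math, random

def anneal(d, T0=10.0, alpha=0.995, steps=20000):
    n = len(d)
    tour = list(range(n))
    random.shuffle(tour)
    cost = lambda t: sum(d[t[k]][t[(k + 1) % n]] for k in range(n))
    cur, T = cost(tour), T0
    for _ in range(steps):
        i, j = sorted(random.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-change move
        new = cost(cand)
        # accept improvements always, uphill moves with Boltzmann probability
        if new < cur or random.random() < math.exp((cur - new) / T):
            tour, cur = cand, new
        T *= alpha                                             # cool down
    return tour, cur
```

The uphill acceptance is what lets the method escape local optima, which is why the slide files it under approaches to global optimization.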

15 §I.2 Basics of Optimization
Definition of an Optimization Problem
Cost J = A(x), i.e. a mapping A: F → ℝ
An instance of an optimization problem: (F, A)
Feasibility: x ∈ F ⇔ x is feasible
e.g. …
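A toy illustration of the abstract definition (not from the slides): when F is finite, an instance (F, A) can be represented directly and "solved" by enumeration. The particular F and A below are made up:

```python
# An instance is a pair (F, A): F is the feasible set, A maps feasible
# points to costs.  Here both are small enough to enumerate.
F = [(x, y) for x in range(5) for y in range(5) if x + y <= 4]   # feasible set
A = lambda p: (p[0] - 3) ** 2 + (p[1] - 3) ** 2                  # cost mapping A: F -> R

optimum = min(F, key=A)      # solve the instance by brute force
print(optimum, A(optimum))   # (2, 2) with cost 2
```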

16 Neighborhood: given an optimization problem with instance (F, A), a neighborhood is a mapping N: F → 2^F.
Example:
(a) In LP with F ⊆ ℝⁿ, we can define a neighborhood such as N_ε(x) = {y ∈ F : ‖y − x‖ ≤ ε} for some ε > 0.

17 Q: What to do in discrete cases?
Example: the 2-change neighborhood of a tour f
N₂(f) = {g : g ∈ F and g can be obtained from f by removing two edges from the tour and then replacing them with two other edges}
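A minimal sketch of the 2-change idea in code (my own illustration; the function name is made up). Reversing a contiguous segment of the tour is the standard way to delete two edges and reconnect the pieces:

```python
# Enumerate the 2-change neighborhood of a tour given as a list of cities.
def two_change_neighborhood(tour):
    n = len(tour)
    neighbors = []
    for i in range(n - 1):
        for j in range(i + 1, n):
            # reversing tour[i..j] removes two tour edges and adds two others
            neighbors.append(tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:])
    return neighbors

print(len(two_change_neighborhood([0, 1, 2, 3, 4])))   # on the order of n^2 neighbors
```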

18 Local and Global Optima
Example (continuous case): [Figure: a function F(x) on the interval [0, a] with three local minima A, B, C]
⇒ A and C are local minima; B is the global minimum
Q: How to define these for discrete optimization?

19 Definition: Local Optimum
Given an instance (F, A) of an optimization problem and a neighborhood N, f ∈ F is locally optimal with respect to N if A(f) ≤ A(g) for all g ∈ N(f).
Example: a best TSP tour within the 2-change neighborhood may not be the optimal tour.
Definition: Global Optimum
If f ∈ F is locally optimal w.r.t. a neighborhood N and is then also locally optimal w.r.t. any other neighborhood ⇒ f is globally optimal ⇒ N is exact.
Q: How to check with respect to all N?
Example: in TSP, the 2-change neighborhood is not exact, but the n-change neighborhood is for an n-city problem.
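To make the definition concrete, here is a small illustrative sketch (not from the lecture; the function name is my own) of generic local search with respect to a neighborhood N. It stops exactly when f satisfies the local-optimality condition A(f) ≤ A(g) for every g ∈ N(f):

```python
# Generic local search: move to a strictly better neighbor until none exists.
def local_search(f, A, N):
    while True:
        better = [g for g in N(f) if A(g) < A(f)]
        if not better:          # no improving neighbor: f is locally optimal w.r.t. N
            return f
        f = min(better, key=A)  # greedy move to the best neighbor
```

With N = two_change_neighborhood and A = tour_length from the earlier sketches, this becomes 2-opt local search for TSP; the result is a local optimum, which need not be global unless N is exact.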

20 Discrete Optimization Lecture #2 Last time: Course overview; some basics of optimization. Today: Reading Assignments: 1. Chapter 1 of [GaJ79] 2. Sections 2.1~2.3 of [PaS82] 3. Sections 2.4, 2.5, 3.1~3.8 of [Lue84].

21 Outline:
1. Some basics of optimization (cont.)
   » Convexity & convex programming
2. Algorithms and Complexity
   » Problems, algorithms, and complexity
   » Polynomial-time algorithms
   » Intractability
   » NP-complete problems
3. Basic properties of Linear Programs
   » Forms of LP
   » Basic feasible solutions
   » Geometry of LP
4. The simplex method
Homework #1 Due:

22 §I.3 Convexity and Convex Programming
Why are we so interested in convex functions and convex sets? (min J over F)
(1) Globally speaking, if J and F are convex ⇒ minimum points are global minima
(2) Local convexity ⇒ local minimum
Definition: Convex Set
S is convex iff for all x, y ∈ S and λ ∈ [0, 1], λx + (1 − λ)y ∈ S
Facts about convex sets
(1) A hyperplane {x : a'x = b} is a convex set
(2) A half space {x : a'x ≤ b} is a convex set
(3) The intersection of convex sets is convex. (Union of convex sets?)
(4) The contraction of a convex set is convex
(5) An empty set is convex

23 Definition: Convex Functions
f: ℝⁿ → ℝ. If f(λx + (1 − λ)y) ≤ λ f(x) + (1 − λ) f(y) for all x, y and λ ∈ [0, 1], then f is convex.
Facts about convex functions
(1) A quadratic function x'Qx is convex if Q ≥ 0
(2) The linear extrapolation (approximation) at a point underestimates a convex function, i.e. assuming f is differentiable, f(y) ≥ f(x) + ∇f(x)'(y − x) for all x, y
[Figure: a convex curve lying above its tangent line at x]
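A numerical illustration of both facts (my own sketch, assuming NumPy): for the quadratic f(x) = x'Qx with Q positive semi-definite, random points satisfy the convexity inequality and the tangent underestimate:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))
Q = M @ M.T                                  # positive semi-definite by construction
f = lambda x: x @ Q @ x
grad = lambda x: 2 * Q @ x

x, y, lam = rng.standard_normal(3), rng.standard_normal(3), 0.3
# convexity inequality: f(lam*x + (1-lam)*y) <= lam*f(x) + (1-lam)*f(y)
assert f(lam * x + (1 - lam) * y) <= lam * f(x) + (1 - lam) * f(y) + 1e-9
# the linear approximation at x underestimates f at y
assert f(y) >= f(x) + grad(x) @ (y - x) - 1e-9
```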

24 (3) f is convex iff ∇²f(x) is positive semi-definite over ℝⁿ (proof: read by yourself)
(4) A linear combination (with positive coefficients) of convex functions is convex
(5) The level set {x : f(x) ≤ c} is convex for all c if f is convex
⇒ min f(x) s.t. g_j(x) ≤ 0, j = 1, …, r, is a convex programming problem if f and the g_j are convex.
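A quick check of fact (3) for the quadratic example above (illustrative only, assuming NumPy): the Hessian of x'Qx is the constant matrix 2Q, which is positive semi-definite exactly when all its eigenvalues are nonnegative:

```python
import numpy as np

M = np.random.default_rng(1).standard_normal((3, 3))
Q = M @ M.T
hessian = 2 * Q                                        # Hessian of f(x) = x'Qx
print(np.all(np.linalg.eigvalsh(hessian) >= -1e-9))    # True: f is convex
```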

25 Properties of Convex Minimization
(a) S = arg min {f(x) : x ∈ F} is a convex set
Proof: suppose x₁, x₂ ∈ S are two optimal solutions, so f(x₁) = f(x₂) = f*. For any λ ∈ [0, 1], the point λx₁ + (1 − λ)x₂ is feasible (F is convex), and convexity of f gives f(λx₁ + (1 − λ)x₂) ≤ λ f(x₁) + (1 − λ) f(x₂) = f*. By the definition of the minimum the value cannot be smaller than f*, so λx₁ + (1 − λ)x₂ ∈ S.

26 (b) Every local minimum of a convex programming problem is also a global minimum
Proof (by contradiction): suppose x₁ is a local minimum, x₂ is feasible, and f(x₂) < f(x₁). For λ ∈ (0, 1], the point λx₂ + (1 − λ)x₁ is feasible (F is convex) and f(λx₂ + (1 − λ)x₁) ≤ λ f(x₂) + (1 − λ) f(x₁) < f(x₁). Letting λ → 0 gives feasible points arbitrarily close to x₁ with strictly smaller cost, contradicting local optimality of x₁.
⇒ (a) + (b) imply that when the solution is not unique, all solutions are equivalent (same optimal cost).
[Figure: a convex f(x) with a flat set S of minimizers]

