Chapter 1. Introduction

Mathematical programming (optimization) problem:
  min/max $f(x)$
  subject to $g_i(x) \le 0$, $i = 1, \dots, m$
             ($h_j(x) = 0$, $j = 1, \dots, p$), $x \in X \subseteq \mathbb{R}^n$,
where $f, g_i, h_j : \mathbb{R}^n \to \mathbb{R}$.

If $f$, $g_i$, $h_j$ are linear (affine) functions → linear programming problem.
If the solution set (or some of the variables) is restricted to integer points → integer programming problem.
If $f$, $g_i$, $h_j$ (or some of them) are nonlinear functions → nonlinear programming problem.
If we minimize $f(x)$, the functions $f$ and $g_i$ are convex, and there are no equality constraints → convex programming problem.
Linear programming: the problem of optimizing (maximizing or minimizing) a linear (objective) function subject to linear inequality (and equality) constraints.

General form:
  {max, min} $c'x$
  subject to $a_i'x \ge b_i$, $i \in M_1$
             $a_i'x \le b_i$, $i \in M_2$
             $a_i'x = b_i$, $i \in M_3$
             $x_j \ge 0$, $j \in N_1$,   $x_j \le 0$, $j \in N_2$
with $c, a_i, x \in \mathbb{R}^n$. (There may exist variables unrestricted in sign.)

Inner product of two column vectors $x, y \in \mathbb{R}^n$: $x'y = \sum_{i=1}^n x_i y_i$.
If $x'y = 0$ and $x, y \ne 0$, then $x$ and $y$ are said to be orthogonal; in 3-D, the angle between the two vectors is 90 degrees. (Vectors are column vectors unless specified otherwise.)
The big difference from systems of linear equations is the presence of an objective function and of linear inequalities (instead of equalities). This gives much deeper theoretical results and wider applicability than systems of linear equations.

$x_1, x_2, \dots, x_n$: (decision) variables
$b_i$: right-hand side
$a_i'x \{\ge, \le, =\} b_i$: $i$-th constraint
$x_j \{\ge, \le\} 0$: nonnegativity (nonpositivity) constraint
$c'x$: objective function

Other terminology: feasible solution, feasible set (region), free (unrestricted) variable, optimal (feasible) solution, optimal cost, unbounded.
Important submatrix multiplications

Interpretation of the constraints: view them as submatrix multiplications. With $A$ an $m \times n$ matrix, its $i$-th row is $a_i' = e_i'A$, where $e_i$ is the $i$-th unit vector, so the constraints can be written compactly as $Ax \{\ge, \le, =\} b$.
Any LP can be expressed as min $c'x$, $Ax \ge b$

max $c'x$ → min $(-c'x)$, then take the negative of the optimal cost.
$a_i'x \le b_i$ → $-a_i'x \ge -b_i$
$a_i'x = b_i$ → $a_i'x \ge b_i$, $-a_i'x \ge -b_i$
Nonnegativity (nonpositivity) constraints are special cases of inequalities, but they will be handled separately in the algorithms and in duality theory.
The feasible solution set of an LP can always be expressed as $Ax \ge b$ (or $Ax \le b$); such a set is called a polyhedron, a set that can be described as the solution set of finitely many linear inequalities.
We may sometimes use the form max $c'x$, $Ax \le b$ (especially when we study polyhedra).
Standard form problems

Standard form: min $c'x$, $Ax = b$, $x \ge 0$.
Two viewpoints:
- Find optimal (nonnegative) weights in a nonnegative linear combination of the columns of $A$ that yields the vector $b$.
- Find an optimal solution that satisfies the linear equations and nonnegativity.

Reduction to standard form:
- Free (unrestricted) variable: $x_j$ → $x_j^+ - x_j^-$, with $x_j^+, x_j^- \ge 0$.
- $\sum_j a_{ij} x_j \le b_i$ → $\sum_j a_{ij} x_j + s_i = b_i$, $s_i \ge 0$ (slack variable).
- $\sum_j a_{ij} x_j \ge b_i$ → $\sum_j a_{ij} x_j - s_i = b_i$, $s_i \ge 0$ (surplus variable).
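To make the reduction concrete, here is a minimal sketch on a made-up two-variable instance; the data, the variable names, and the use of SciPy's linprog are assumptions for illustration, not part of the lecture.

```python
# A minimal sketch (invented data): min x1 - 2*x2  s.t.  x1 + x2 <= 4,
# x1 - x2 >= -2, x1 >= 0, x2 free.  The free variable x2 is split as
# x2 = x2p - x2m, and slack/surplus variables s1, s2 >= 0 turn the
# inequalities into equalities, giving the standard form min c'x, Ax = b, x >= 0.
import numpy as np
from scipy.optimize import linprog

# Variables in standard form: [x1, x2p, x2m, s1, s2], all >= 0
c = np.array([1.0, -2.0, 2.0, 0.0, 0.0])           # cost of x2 = x2p - x2m
A_eq = np.array([
    [1.0,  1.0, -1.0, 1.0,  0.0],                  # x1 + x2 + s1 = 4
    [1.0, -1.0,  1.0, 0.0, -1.0],                  # x1 - x2 - s2 = -2
])
b_eq = np.array([4.0, -2.0])

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 5)
x1, x2 = res.x[0], res.x[1] - res.x[2]             # recover the original variables
print("optimal cost:", res.fun, " x1 =", x1, " x2 =", x2)
```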
Any (practical) algorithm can solve the LP problem only in equality form (except for the nonnegativity constraints).
A modified form of the simplex method can handle free variables directly (without writing them as the difference of two nonnegative variables). This gives a more sensible interpretation of the behavior of the algorithm.
1.2 Formulation examples
See other examples in the text.

Minimum cost network flow problem
Directed network $G = (N, A)$, $|N| = n$; arc capacity $u_{ij}$, $(i,j) \in A$; unit flow cost $c_{ij}$, $(i,j) \in A$; $b_i$: net supply at node $i$ ($b_i > 0$: supply node, $b_i < 0$: demand node). (We may assume $\sum_{i \in N} b_i = 0$.)
Find a minimum cost transportation plan that satisfies the supply and demand at each node and the arc capacities.

  minimize $\sum_{(i,j) \in A} c_{ij} x_{ij}$
  subject to $\sum_{\{j:(i,j) \in A\}} x_{ij} - \sum_{\{j:(j,i) \in A\}} x_{ji} = b_i$, $i = 1, \dots, n$
             (outflow − inflow = net supply at node $i$: flow conservation constraints;
              some people use inflow − outflow = net supply)
             $x_{ij} \le u_{ij}$, $(i,j) \in A$
             $x_{ij} \ge 0$, $(i,j) \in A$
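As an illustration only, the following sketch builds this LP for a small made-up network (the node-arc incidence matrix becomes the equality constraints) and solves it with SciPy's linprog; the data are invented and SciPy is an assumption of the example.

```python
# A minimal sketch (example data, not from the text): the min-cost network-flow LP
# on a 4-node directed graph, solved as a plain LP.
import numpy as np
from scipy.optimize import linprog

arcs = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]    # (i, j) pairs
cost = np.array([2.0, 4.0, 1.0, 6.0, 3.0])         # c_ij
cap  = np.array([4.0, 3.0, 2.0, 3.0, 5.0])         # u_ij
b    = np.array([5.0, 0.0, 0.0, -5.0])             # net supply b_i, sums to 0

n, m = len(b), len(arcs)
A_eq = np.zeros((n, m))                            # node-arc incidence matrix
for k, (i, j) in enumerate(arcs):
    A_eq[i, k] = 1.0                               # outflow at node i
    A_eq[j, k] = -1.0                              # inflow at node j

res = linprog(cost, A_eq=A_eq, b_eq=b,
              bounds=[(0.0, u) for u in cap])      # 0 <= x_ij <= u_ij
print("min cost:", res.fun)
print("flows:", dict(zip(arcs, np.round(res.x, 3))))
```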
Choosing paths in a communication network ((fractional) multicommodity flow problem)

Multicommodity flow problem: several commodities share the network. For each commodity it is a min cost network flow problem, but the commodities must share the arc capacities. It is a generalization of the min cost network flow problem, with many applications in communication and distribution / transportation systems.
"Several commodities" here means that there is actually one kind of data, but there are multiple origin-destination pairs of nodes (telecom, logistics, ...); each origin-destination pair represents a commodity.
Given a directed telecommunication network with arc set $A$, arc capacity $u_{ij}$ bits/sec, $(i,j) \in A$, unit flow cost $c_{ij}$ per bit, $(i,j) \in A$, and demand $b^{kl}$ bits/sec for traffic from node $k$ to node $l$. Data can be sent using more than one path.
Find paths that route the demands with minimum cost.
Decision variables: $x_{ij}^{kl}$: amount of data with origin $k$ and destination $l$ that traverses link $(i,j) \in A$.
Let $b_i^{kl} = b^{kl}$ if $i = k$; $-b^{kl}$ if $i = l$; $0$ otherwise.

Formulation (flow based formulation):
  minimize $\sum_{(i,j) \in A} \sum_k \sum_l c_{ij} x_{ij}^{kl}$
  subject to $\sum_{\{j:(i,j) \in A\}} x_{ij}^{kl} - \sum_{\{j:(j,i) \in A\}} x_{ji}^{kl} = b_i^{kl}$, $i, k, l = 1, \dots, n$
             (outflow − inflow = net flow at node $i$ for the commodity from node $k$ to node $l$)
             $\sum_k \sum_l x_{ij}^{kl} \le u_{ij}$, $(i,j) \in A$
             (the sum of all commodities traversing link $(i,j)$ must not exceed the capacity of link $(i,j)$)
             $x_{ij}^{kl} \ge 0$, $(i,j) \in A$, $k, l = 1, \dots, n$
Alternative formulation (path based formulation)

Let $K$: set of origin-destination pairs (commodities)
$b^k$: demand of commodity $k \in K$
$P(k)$: set of all possible paths for sending commodity $k \in K$
$P(k;e)$: set of paths in $P(k)$ that traverse arc $e \in A$
$E(p)$: set of links contained in path $p$
Decision variables: $y_p^k$: fraction of commodity $k$ sent on path $p$

  minimize $\sum_{k \in K} \sum_{p \in P(k)} w_p^k y_p^k$
  subject to $\sum_{p \in P(k)} y_p^k = 1$, for all $k \in K$
             $\sum_{k \in K} \sum_{p \in P(k;e)} b^k y_p^k \le u_e$, for all $e \in A$
             $0 \le y_p^k \le 1$, for all $p \in P(k)$, $k \in K$,
where $w_p^k = b^k \sum_{e \in E(p)} c_e$.

If $y_p^k \in \{0, 1\}$, it is a single path routing problem (path selection problem, integer multicommodity flow problem).
The path based formulation has a smaller number of constraints, but an enormous number of variables. It can nevertheless be solved easily by the column generation technique (later). The integer version is more difficult to solve.
Extensions: Network design - also determine the number and type of facilities to be installed on the links (and/or nodes), together with the routing of traffic.
Variations: integer flows; bifurcation of traffic may not be allowed; determine capacities and routing considering rerouting of traffic in case of network failure; robust network design (data uncertainty), ...
Pattern classification (linear classifier)

Given $m$ objects with feature vectors $a_i \in \mathbb{R}^n$, $i = 1, \dots, m$. Each object belongs to one of two classes, and we know the class of each sample object. We want to design a criterion that determines the class of a new object from its feature vector.
We want to find a vector $(x, x_{n+1}) \in \mathbb{R}^{n+1}$ with $x \in \mathbb{R}^n$ such that, if $i \in S$, then $a_i'x \ge x_{n+1}$, and if $i \notin S$, then $a_i'x < x_{n+1}$ (if this is possible).
Find a feasible solution $(x, x_{n+1})$ that satisfies
  $a_i'x \ge x_{n+1}$, $i \in S$
  $a_i'x < x_{n+1}$, $i \notin S$
for all sample objects $i$.
Is this a linear programming problem? (There is no objective function, and there are strict inequalities in the constraints.)
Is strict inequality allowed in LP?

Consider min $x$, $x > 0$ → there is no minimum point; only the infimum of the objective value exists.
If the system has a feasible solution $(x, x_{n+1})$, we can make the difference between the right hand side and the left hand side as large as we like by using the solution $M(x, x_{n+1})$ for $M > 0$ large. Hence, if the system has a solution, there exists a solution that makes the difference at least 1.
Remedy: use
  $a_i'x \ge x_{n+1}$, $i \in S$
  $a_i'x \le x_{n+1} - 1$, $i \notin S$
This is an important problem in data mining, with applications in target marketing, bankruptcy prediction, medical diagnosis, process monitoring, ...
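A minimal sketch of the remedied feasibility system on toy two-dimensional data; the sample points and the use of SciPy's linprog with a zero objective are assumptions made for illustration.

```python
# A minimal sketch (toy data): find (x, x_{n+1}) with a_i'x >= x_{n+1} for class S
# and a_i'x <= x_{n+1} - 1 otherwise.  linprog needs an objective, so we use a
# zero cost vector (pure feasibility).
import numpy as np
from scipy.optimize import linprog

A_S    = np.array([[2.0, 2.0], [3.0, 1.5]])        # sample points in class S
A_notS = np.array([[0.0, 0.5], [0.5, 0.0]])        # sample points not in S
n = 2

# Decision vector: [x_1, x_2, x_{n+1}], all free.
# Class S:      -a_i'x + x_{n+1} <= 0
# Not in S:      a_i'x - x_{n+1} <= -1
A_ub = np.vstack([
    np.hstack([-A_S,    np.ones((len(A_S), 1))]),
    np.hstack([ A_notS, -np.ones((len(A_notS), 1))]),
])
b_ub = np.hstack([np.zeros(len(A_S)), -np.ones(len(A_notS))])

res = linprog(np.zeros(n + 1), A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * (n + 1))     # unrestricted variables
print("separating hyperplane found:", res.success, res.x)
```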
Variations: What if there are many choices of hyperplane? Is there any reasonable criterion for choosing one? What if there is no hyperplane separating the two classes? Do we have to use only one hyperplane? Is the use of nonlinear functions possible? How do we solve these problems? (SVM (support vector machine), convex optimization.) What about more than two classes?
1.3 Piecewise linear convex objective functions

Some problems involving nonlinear functions can be modeled as LPs.
Def: A function $f: \mathbb{R}^n \to \mathbb{R}$ is called a convex function if for all $x, y \in \mathbb{R}^n$ and all $\lambda \in [0, 1]$,
  $f(\lambda x + (1-\lambda) y) \le \lambda f(x) + (1-\lambda) f(y)$.
(The domain may be restricted.) $f$ is called concave if $-f$ is convex.
(Picture: the line segment joining $(x, f(x))$ and $(y, f(y))$ in $\mathbb{R}^{n+1}$ is not below the graph of $f$.)
Def: For $x, y \in \mathbb{R}^n$ and $\lambda_1, \lambda_2 \ge 0$ with $\lambda_1 + \lambda_2 = 1$, the point $\lambda_1 x + \lambda_2 y$ is said to be a convex combination of $x$ and $y$.
More generally, $\sum_{i=1}^k \lambda_i x_i$, where $\sum_{i=1}^k \lambda_i = 1$ and $\lambda_i \ge 0$, $i = 1, \dots, k$, is a convex combination of the points $x_1, \dots, x_k$.
Def: A set $S \subseteq \mathbb{R}^n$ is convex if for any $x, y \in S$ we have $\lambda_1 x + \lambda_2 y \in S$ for any $\lambda_1, \lambda_2 \ge 0$ with $\lambda_1 + \lambda_2 = 1$.
Picture: $\lambda_1 x + \lambda_2 y = \lambda_1 x + (1-\lambda_1) y = y + \lambda_1 (x - y)$, $0 \le \lambda_1 \le 1$
(the line segment joining $x$ ($\lambda_1 = 1$) and $y$ ($\lambda_1 = 0$) lies in $S$).
If we take $\lambda_1 x + \lambda_2 y$ with $\lambda_1 + \lambda_2 = 1$ (but without requiring $\lambda_1, \lambda_2 \ge 0$), it is called an affine combination of $x$ and $y$.
Picture: $\lambda_1 x + \lambda_2 y = \lambda_1 x + (1-\lambda_1) y = y + \lambda_1 (x - y)$, with $\lambda_1$ arbitrary
(the line passing through the points $x$ and $y$).
(Figure: picture of a convex function.)
Relation between convex functions and convex sets

Def: For $f: \mathbb{R}^n \to \mathbb{R}$, define the epigraph of $f$ as epi($f$) $= \{(x, \gamma) \in \mathbb{R}^{n+1} : \gamma \ge f(x)\}$.
Then the previous definition of a convex function is equivalent to epi($f$) being a convex set.
When dealing with convex functions, we frequently consider epi($f$) to exploit the properties of convex sets. We consider operations on functions that preserve convexity and operations on sets that preserve convexity.
Example: Consider $f(x) = \max_{i=1,\dots,m} (c_i'x + d_i)$, $c_i \in \mathbb{R}^n$, $d_i \in \mathbb{R}$
(a maximum of affine functions, called a piecewise linear convex function).
(Figure: the graphs of $c_1'x + d_1$, $c_2'x + d_2$, $c_3'x + d_3$ over $x$, whose upper envelope is $f$.)
Thm: Let $f_1, \dots, f_m : \mathbb{R}^n \to \mathbb{R}$ be convex functions. Then $f(x) = \max_{i=1,\dots,m} f_i(x)$ is also convex.
pf) $f(\lambda x + (1-\lambda) y) = \max_{i=1,\dots,m} f_i(\lambda x + (1-\lambda) y)$
  $\le \max_{i=1,\dots,m} (\lambda f_i(x) + (1-\lambda) f_i(y))$
  $\le \max_{i=1,\dots,m} \lambda f_i(x) + \max_{i=1,\dots,m} (1-\lambda) f_i(y)$
  $= \lambda f(x) + (1-\lambda) f(y)$.  ∎
Min of piecewise linear convex functions

  minimize $\max_{i=1,\dots,m} (c_i'x + d_i)$
  subject to $Ax \ge b$

is equivalent to

  minimize $z$
  subject to $z \ge c_i'x + d_i$, $i = 1, \dots, m$
             $Ax \ge b$
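A minimal sketch of this epigraph reformulation on made-up data; the matrices C, d, A, b below are invented, and SciPy's linprog is assumed as the LP solver.

```python
# A minimal sketch (invented data): minimize max_i (c_i'x + d_i) s.t. A x >= b,
# rewritten with the epigraph variable z as an ordinary LP.
import numpy as np
from scipy.optimize import linprog

C = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])   # rows c_i'
d = np.array([0.0, 0.0, 3.0])                          # offsets d_i
A = np.array([[1.0, 1.0]])                             # Ax >= b, here x1 + x2 >= 1
b = np.array([1.0])

m, n = C.shape
# Variables: [x_1, ..., x_n, z]; minimize z.
cost = np.hstack([np.zeros(n), [1.0]])
# c_i'x + d_i - z <= 0  and  -Ax <= -b  (linprog works with <= constraints)
A_ub = np.vstack([np.hstack([C, -np.ones((m, 1))]),
                  np.hstack([-A, np.zeros((len(b), 1))])])
b_ub = np.hstack([-d, -b])

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * (n + 1))
print("x* =", res.x[:n], "  min of the max =", res.fun)
```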
Q: What can we do about finding the maximum of a piecewise linear convex function?
What about the maximum of a piecewise linear concave function (which can be obtained as a minimum of affine functions)?
And the minimum of a piecewise linear concave function?
A convex function has the nice property that a local minimum point is a global minimum point (when the domain is $\mathbb{R}^n$ or a convex set) (HW later). Hence finding the minimum of a convex function defined over a convex set is usually easy. But finding the maximum of a convex function is difficult: basically, we need to examine all local maximum points.
Similarly, finding the maximum of a concave function is easy, but finding the minimum of a concave function is difficult.
Suppose we have $f(x) \le \ell$ among the constraints, where $f(x)$ is a piecewise linear convex function $f(x) = \max_{i=1,\dots,m} (c_i'x + d_i)$.
⟹ $c_i'x + d_i \le \ell$, $i = 1, \dots, m$.
Q: What about a constraint $f(x) \ge \ell$? Can it be modeled as an LP?
Def: Let $f: \mathbb{R}^n \to \mathbb{R}$ be a convex function and $\alpha \in \mathbb{R}$. The set $C = \{x : f(x) \le \alpha\}$ is called a level set of $f$.
A level set of a convex function is a convex set. (HW)
The solution set of an LP is convex (easy) → a non-convex solution set can't be modeled as an LP.
Problems involving absolute values

  minimize $\sum_{i=1}^n c_i |x_i|$
  subject to $Ax \ge b$    (assume $c_i > 0$)

More direct formulations than the general piecewise linear convex one are possible.

(1) min $\sum_{i=1}^n c_i z_i$
    subject to $Ax \ge b$
               $x_i \le z_i$, $i = 1, \dots, n$
               $-x_i \le z_i$, $i = 1, \dots, n$

(2) min $\sum_{i=1}^n c_i (x_i^+ + x_i^-)$
    subject to $Ax^+ - Ax^- \ge b$
               $x^+, x^- \ge 0$

(We want $x_i^+ = x_i$ if $x_i \ge 0$, $x_i^- = -x_i$ if $x_i < 0$, and $x_i^+ x_i^- = 0$, i.e., at most one of $x_i^+, x_i^-$ is positive in an optimal solution; $c_i > 0$ guarantees that.)
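A minimal sketch of formulation (1) on a made-up instance; the data and the choice of SciPy's linprog are assumptions made for illustration.

```python
# A minimal sketch (toy data): minimize sum_i c_i*|x_i| s.t. Ax >= b,
# using auxiliary variables z_i with x_i <= z_i and -x_i <= z_i.
import numpy as np
from scipy.optimize import linprog

c = np.array([1.0, 2.0])                       # c_i > 0
A = np.array([[1.0, 1.0], [1.0, -2.0]])
b = np.array([1.0, -3.0])
m, n = A.shape

# Variables: [x_1, ..., x_n, z_1, ..., z_n]
cost = np.hstack([np.zeros(n), c])
A_ub = np.vstack([
    np.hstack([-A, np.zeros((m, n))]),         # -Ax <= -b
    np.hstack([np.eye(n), -np.eye(n)]),        #  x_i - z_i <= 0
    np.hstack([-np.eye(n), -np.eye(n)]),       # -x_i - z_i <= 0
])
b_ub = np.hstack([-b, np.zeros(2 * n)])

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * (2 * n))
print("x* =", res.x[:n], "  objective =", res.fun)
```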
Data fitting (regression analysis using the absolute value function)

Given $m$ data points $(a_i, b_i)$, $i = 1, \dots, m$, with $a_i \in \mathbb{R}^n$, $b_i \in \mathbb{R}$. We want to find $x \in \mathbb{R}^n$ that predicts the result $b$ from $a$ via the function $b = a'x$.
We want the $x$ that minimizes the maximum prediction error $|b_i - a_i'x|$ over all $i$:

  minimize $z$
  subject to $b_i - a_i'x \le z$, $i = 1, \dots, m$
             $-b_i + a_i'x \le z$, $i = 1, \dots, m$

(We want an approximate solution $x$ of $a_i'x = b_i$, $i = 1, \dots, m$ ($Ax = b$) ⟹ min $\max_i |b_i - a_i'x|$.)
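A minimal sketch of this minimax (Chebyshev) fit on synthetic data; the data generation and the use of SciPy's linprog are assumptions of the example.

```python
# A minimal sketch (synthetic data): the minimax fit min_x max_i |b_i - a_i'x|
# written as an LP in the variables (x, z).
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
A = np.column_stack([np.ones(20), rng.uniform(0, 10, 20)])   # rows a_i' = (1, t_i)
b = 3.0 + 0.5 * A[:, 1] + rng.normal(0, 0.2, 20)             # noisy observations b_i

m, n = A.shape
cost = np.hstack([np.zeros(n), [1.0]])                       # minimize z
A_ub = np.vstack([np.hstack([-A, -np.ones((m, 1))]),         #  b_i - a_i'x <= z
                  np.hstack([ A, -np.ones((m, 1))])])        # -b_i + a_i'x <= z
b_ub = np.hstack([-b, b])

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * (n + 1))
print("fit coefficients:", res.x[:n], "  max error:", res.fun)
```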
Alternative criterion: minimize $\sum_{i=1}^m |b_i - a_i'x|$, which becomes

  minimize $\sum_{i=1}^m z_i$
  subject to $b_i - a_i'x \le z_i$, $i = 1, \dots, m$
             $-b_i + a_i'x \le z_i$, $i = 1, \dots, m$

($L_p$ norm of $x \in \mathbb{R}^n$ ≡ $\left( \sum_{i=1}^n |x_i|^p \right)^{1/p}$.)
A quadratic error function cannot be modeled as an LP; it requires calculus methods instead (closed form solution).
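For contrast with the previous sketch, the same synthetic setup with the sum-of-absolute-errors criterion; again the data and solver choice are invented for illustration.

```python
# A minimal sketch (synthetic data): the least-absolute-deviations fit
# min_x sum_i |b_i - a_i'x| via one auxiliary variable z_i per data point.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
A = np.column_stack([np.ones(20), rng.uniform(0, 10, 20)])
b = 3.0 + 0.5 * A[:, 1] + rng.normal(0, 0.2, 20)
b[::7] += 5.0                                   # a few outliers; the L1 fit resists them

m, n = A.shape
cost = np.hstack([np.zeros(n), np.ones(m)])     # minimize sum_i z_i
A_ub = np.vstack([np.hstack([-A, -np.eye(m)]),  #  b_i - a_i'x <= z_i
                  np.hstack([ A, -np.eye(m)])]) # -b_i + a_i'x <= z_i
b_ub = np.hstack([-b, b])

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * (n + m))
print("L1 fit coefficients:", res.x[:n])
```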
Special case of piecewise linear objective function: separable piecewise linear objective function.
A function $f: \mathbb{R}^n \to \mathbb{R}$ is called separable if $f(x) = f_1(x_1) + f_2(x_2) + \dots + f_n(x_n)$.
(Figure: one piece $f_i(x_i)$ with breakpoints $a_1 < a_2 < a_3$ on the $x_i$ axis and segment slopes $c_1 < c_2 < c_3 < c_4$; the segment lengths correspond to $x_{1i}, x_{2i}, x_{3i}, x_{4i}$. Such functions can also be used to approximate nonlinear functions.)
Express the variable $x_i$ in the constraints as $x_i \equiv x_{1i} + x_{2i} + x_{3i} + x_{4i}$, where
  $0 \le x_{1i} \le a_1$, $0 \le x_{2i} \le a_2 - a_1$, $0 \le x_{3i} \le a_3 - a_2$, $0 \le x_{4i}$.
In the objective function, use min $c_1 x_{1i} + c_2 x_{2i} + c_3 x_{3i} + c_4 x_{4i}$.
Since we solve a min problem (and the slopes $c_k$ are increasing), it is guaranteed that $x_{ki} > 0$ in an optimal solution implies that the $x_{ji}$, $j < k$, are at their upper bounds.
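A minimal sketch of this segment-splitting trick for a single convex piecewise linear term; the breakpoints, slopes, and the constraint x >= 5 are invented, and SciPy's linprog is an assumption of the example.

```python
# A minimal sketch (invented data): minimize a convex piecewise linear f(x) with
# slopes c_1 < c_2 < c_3 and breakpoints a_1 < a_2, subject to x >= 5,
# by splitting x into segment variables x1 + x2 + x3.
import numpy as np
from scipy.optimize import linprog

slopes = np.array([1.0, 2.0, 4.0])       # c_1 < c_2 < c_3
a1, a2 = 2.0, 4.0                        # breakpoints

# Variables: [x1, x2, x3] with x = x1 + x2 + x3
cost = slopes
A_ub = np.array([[-1.0, -1.0, -1.0]])    # -(x1 + x2 + x3) <= -5, i.e. x >= 5
b_ub = np.array([-5.0])
bounds = [(0.0, a1), (0.0, a2 - a1), (0.0, None)]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print("segments:", res.x, " x =", res.x.sum(), " f(x) =", res.fun)
# Because the slopes increase, the cheaper segments fill up first:
# the optimum here is x1 = 2, x2 = 2, x3 = 1.
```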
1.4 Graphical representation and solution

Let $a \in \mathbb{R}^n$, $b \in \mathbb{R}$. Geometric intuition for the solution sets of
  $\{x : a'x = 0\}$, $\{x : a'x \le 0\}$, $\{x : a'x \ge 0\}$,
  $\{x : a'x = b\}$, $\{x : a'x \le b\}$, $\{x : a'x \ge b\}$.
Geometry in 2-D
(Figure: the line $\{x : a'x = 0\}$ with the vector $a$ drawn normal to it; the half-spaces $\{x : a'x \ge 0\}$ and $\{x : a'x \le 0\}$ lie on the two sides of the line.)
Let $z$ be a (any) point satisfying $a'x = b$. Then
  $\{x : a'x = b\} = \{x : a'x = a'z\} = \{x : a'(x - z) = 0\}$.
Hence $x - z = y$, where $y$ is any solution of $a'y = 0$, and $x = y + z$. Similarly for $\{x : a'x \le b\}$ and $\{x : a'x \ge b\}$.
(Figure: the hyperplane $\{x : a'x = b\}$ as the translate of $\{x : a'x = 0\}$ through the point $z$, with the half-spaces $\{x : a'x \ge b\}$ and $\{x : a'x \le b\}$ on the two sides.)
  min $c_1 x_1 + c_2 x_2$
  s.t. $-x_1 + x_2 \le 1$, $x_1 \ge 0$, $x_2 \ge 0$

The gradient of the objective function is $(c_1, c_2)$.
(Figure: the feasible region in the $(x_1, x_2)$ plane, with the objective vectors $c = (1, 1)$, $c = (1, 0)$, $c = (0, 1)$, $c = (-1, -1)$ and the level lines $\{x : x_1 + x_2 = 0\}$, $\{x : x_1 + x_2 = z\}$.)
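A minimal sketch reproducing this 2-D example numerically for the objective vectors shown in the figure; the use of SciPy's linprog and its status codes is an assumption of this illustration.

```python
# A minimal sketch: min c'x s.t. -x1 + x2 <= 1, x >= 0, solved for several
# objective vectors.  Different choices of c give a unique optimal vertex,
# a whole optimal face, or an unbounded problem (e.g. c = (-1, -1)).
import numpy as np
from scipy.optimize import linprog

A_ub = np.array([[-1.0, 1.0]])       # -x1 + x2 <= 1
b_ub = np.array([1.0])

for c in [(1.0, 1.0), (1.0, 0.0), (0.0, 1.0), (-1.0, -1.0)]:
    res = linprog(np.array(c), A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    if res.status == 3:              # status 3 means the problem is unbounded
        print(c, "-> unbounded")
    else:
        print(c, "-> optimal x =", res.x, " cost =", res.fun)
```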
Representing a complex solution set in 2-D

{ $n$ variables, $m$ equations (the row vectors of the matrix $A$ in $Ax = b$ are linearly independent), nonnegativity, and $n - m = 2$ }
(e.g. $x_1 + x_2 + x_3 = 1$, $x_i \ge 0$ for all $i$)
(Figure: the triangle cut out of the plane $x_1 + x_2 + x_3 = 1$ by the hyperplanes $x_1 = 0$, $x_2 = 0$, $x_3 = 0$, drawn in $(x_1, x_2, x_3)$ space.)
See text sec. 1.5, 1.6 for more background.