Lecture 4 Chapter 3 Improving Search


1 Lecture 4 Chapter 3 Improving Search
Also known as: local improvement, hill climbing, local search, neighborhood search.
Def 3.3 P 83 Improving searches are algorithms that begin at a feasible point and move along a search path of feasible points with improving objective value.

2 Hill Climbing
[Figure: feasible region with objective contours at 100, 200, 300, and 400 rising toward the maximum.]

3 Neighborhood The neighborhood of a point is the set of all nearby points.
I haven't defined "nearby" yet; for continuous search it will mean all points within some small distance of the point.

4 Local & Global Optimum Def 3.5 P 85 Local Optimum – point must be feasible and in a small neighborhood no feasible point is better Def 3.7 P 85 Global Optimum – point must be feasible and no other feasible point is better

5 Improving Search
Def 3.12 P 88 Improving searches move from point X1 to point X2 by taking a step of size λ in the direction ΔX:
X2 = X1 + λ ΔX
where λ > 0 is the step size and ΔX is the direction.
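As a minimal sketch of one such update in Python (the point, direction, and step size are arbitrary illustrations):

```python
import numpy as np

# One improving-search step: X2 = X1 + lambda * delta_X (Def 3.12).
x1 = np.array([0.0, 0.0])       # current point X1
delta_x = np.array([1.0, 1.0])  # search direction delta X
lam = 0.5                       # step size lambda

x2 = x1 + lam * delta_x
print(x2)                       # [0.5 0.5]
```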

6 Simplex Algorithm
The objective value improves at each step.
[Figure: search path along the boundary of the feasible region through numbered points 1 → 2 → 3 → 4.]

7 Interior Point Algorithm
The objective value improves at each step.
[Figure: search path through the interior of the feasible region, points 1 → 2 → 3 → 4.]

8 Improving Direction
Def 3.13 P 90 Improving Direction – the objective function must improve for a sufficiently small step size.
[Figure: objective contours with three marked points.] What are the improving directions at A, B, and C?


10 Feasible Direction
Feasible Direction – for a sufficiently small step size λ, the new point X + λd remains feasible.
[Figure: feasible region with a point A on its boundary.] At the point A, is the direction (0,1) feasible? Is (1,0) feasible?

11 Algorithm 3A: Continuous Improving Search
0. Initialization. Let X0 be a feasible point and set t to 0.
1. Check for local optimum. If no improving feasible direction exists at Xt, stop with Xt as a local optimum.
2. Move direction. Let d be an improving feasible direction at Xt.
3. Step size. Let s be the largest step size that still leads to a better feasible solution.
4. Make your move. Set Xt+1 = Xt + s·d, set t to t + 1, and go to step 1.
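Below is a sketch of Algorithm 3A in Python for an unconstrained minimization, assuming the negative gradient as the improving direction (Def 3.23, later in this lecture) and a simple halving rule standing in for the largest improving step; the objective and starting point are illustrative.

```python
import numpy as np

def improving_search(f, grad, x0, tol=1e-6, max_iter=1000):
    """Sketch of Algorithm 3A for unconstrained minimization."""
    x = np.asarray(x0, dtype=float)           # 0. Initialization
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:           # 1. No improving direction:
            return x                          #    stop at a local optimum
        d = -g                                # 2. Improving direction (min)
        s = 1.0                               # 3. Shrink the step until it
        while f(x + s * d) >= f(x):           #    improves (stand-in for the
            s /= 2                            #    largest improving step)
        x = x + s * d                         # 4. Make the move, go to 1
    return x

# Example: minimize f(x, y) = (x - 2)^2 + (y - 2)^2 starting at (0, 0).
f = lambda z: (z[0] - 2)**2 + (z[1] - 2)**2
grad = lambda z: np.array([2 * (z[0] - 2), 2 * (z[1] - 2)])
print(improving_search(f, grad, [0.0, 0.0]))  # approx. [2. 2.]
```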

12 Gradient
Def 3.19 P 99 The gradient of a function evaluated at a point is the vector of partial derivatives evaluated at that point.
Def 3.20 P 101 The gradient of a function evaluated at a point is the direction of maximum increase.
Let f(X,Y) = X + 2Y. The gradient of f at the point (4,0) is [1, 2]. Why? Because ∂f/∂X = 1 and ∂f/∂Y = 2 at every point.
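Def 3.19 can be checked numerically; a sketch with forward differences (the step h is an arbitrary small value):

```python
def f(x, y):
    return x + 2 * y

def numerical_gradient(f, x, y, h=1e-6):
    # Partial derivatives via forward differences (Def 3.19).
    dfdx = (f(x + h, y) - f(x, y)) / h
    dfdy = (f(x, y + h) - f(x, y)) / h
    return [dfdx, dfdy]

print(numerical_gradient(f, 4, 0))  # approx. [1.0, 2.0]
```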

13 Linear Functions
For linear functions the gradient is constant at all points. For f(X,Y) = 50X + 75Y, what is the direction of maximum increase? It is the gradient (50, 75), or equivalently any positive multiple of it, such as (10, 15), (2, 3), or (1, 1.5).

14 Nonlinear Function
For f(X,Y) = (X − 2)² + (Y − 4)², the level curves are circles centered at (2, 4), and f increases the further you move from the center. The gradient of f is [2(X − 2), 2(Y − 4)], which evaluated at (4, 4) is [4, 0].

15 Important Result
Def 3.21 P 102 Let G(z) (1×n) be the gradient of f(x) evaluated at the point z, and let d (n×1) be a direction. For a sufficiently small step in the direction d:
G(z)d < 0 implies f decreases in the direction d
G(z)d = 0 implies f stays the same (to first order) in the direction d
G(z)d > 0 implies f increases in the direction d
Note that G(z)d is (1×n)(n×1) = 1×1, a scalar. This gives a quick test of whether a direction is an improving direction.

16 Example: f(x,y) = (x − 2)² + (y − 2)²
G(x,y) = [2(x − 2), 2(y − 2)], so G(0,0) = [−4, −4].
d | G(0,0)·d | Change in f
[1, 1] | −8 | Decreases
[1, 0] | −4 | Decreases
[1, −1] | 0 | No change
[−1, −1] | 8 | Increases
[Figure: contours of f, with f(2,2) = 0 at the center and f(0,0) = 8.]
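A short sketch reproducing this table with the G(z)·d test from Def 3.21:

```python
import numpy as np

grad = lambda x, y: np.array([2 * (x - 2), 2 * (y - 2)])
g = grad(0, 0)                              # G(0,0) = [-4, -4]

for d in ([1, 1], [1, 0], [1, -1], [-1, -1]):
    gd = g @ np.array(d)                    # G(z) . d, a scalar
    change = ("decreases" if gd < 0 else
              "increases" if gd > 0 else "no change")
    print(d, gd, change)
```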

17 Improving Directions
Def 3.23 Let G(z) (1×n) be the gradient of f(x) at the point z. If G(z) ≠ 0, then d = G(z) (written as an n×1 column) is an improving direction for a maximization problem, and d = −G(z) is an improving direction for a minimization problem.

18 Example: f(x,y) = (x − 2)² + (y − 2)²
z | G(z) = [2(x − 2), 2(y − 2)]
[0, 0] | [−4, −4]
[2, 0] | [0, −4]
[0, 2] | [−4, 0]
[3, 3] | [2, 2]
[3, 0] | [2, −4]
[Figure: the points (0,0), (2,0), (0,2), and the minimum (2,2) on the contours of f.]
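A sketch verifying Def 3.23 at each point in the table: a small step along d = −G(z) (the step 0.1 is an arbitrary choice) should reduce f at every non-optimal point.

```python
import numpy as np

f = lambda z: (z[0] - 2)**2 + (z[1] - 2)**2
grad = lambda z: np.array([2 * (z[0] - 2), 2 * (z[1] - 2)])

for point in ([0, 0], [2, 0], [0, 2], [3, 3], [3, 0]):
    z = np.array(point, dtype=float)
    d = -grad(z)                   # improving direction for minimization
    step = z + 0.1 * d             # small step along d
    print(point, grad(z), f(step) < f(z))   # True at every point listed
```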

19 Feasible Directions
The question: is a given direction d feasible at the point z? To answer this we must examine the constraints. Three terms mean the same thing: a constraint is active at z, tight at z, or binding at z when it holds with equality at z. Equality constraints are always tight!

20 What are the feasible directions at z = [0, 0]?
Minimize f(x,y) = (x − 2)² + (y − 2)²
subject to x − y ≥ 0, x ≥ 0, 0 ≤ y ≤ 4.
Which of these directions are feasible: [0, 1], [−1, −1], [1, 0], [1, 1]?

21 What are the feasible directions at z = [0, 0]?
Minimize f(x,y) = (x − 2)² + (y − 2)²
subject to x − y ≥ 0, x ≥ 0, y ≥ 0.
All three constraints are active at z = [0, 0], so a direction d is feasible only if it keeps each active constraint satisfied.
Direction | Y/N
[0, 1] | NO (violates x − y ≥ 0)
[−1, −1] | NO (violates x ≥ 0 and y ≥ 0)
[1, 0] | YES
[1, 1] | YES (x − y stays at 0)
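For linear constraints, a direction d is feasible at z exactly when every constraint active at z still holds along d. A sketch that reproduces the table (the matrix form a·x ≥ b is my encoding of the slide's constraints):

```python
import numpy as np

# Constraints written as a . x >= b:  x - y >= 0,  x >= 0,  y >= 0.
A = np.array([[1.0, -1.0], [1.0, 0.0], [0.0, 1.0]])
b = np.array([0.0, 0.0, 0.0])

def is_feasible_direction(z, d, tol=1e-9):
    # d is feasible at z iff a . d >= 0 for every constraint
    # that is active (tight) at z.
    active = np.isclose(A @ z, b, atol=tol)
    return bool(np.all(A[active] @ d >= -tol))

z = np.array([0.0, 0.0])
for d in ([0, 1], [-1, -1], [1, 0], [1, 1]):
    print(d, is_feasible_direction(z, np.array(d, dtype=float)))
# [0, 1] False, [-1, -1] False, [1, 0] True, [1, 1] True
```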

22 To Obtain a Local Optimum
For improving search, you need improving feasible directions to reach a local optimum. Tractable models (the best models for applications) are those in which a local optimum is also a global optimum (see page 109).
Def 3.26 Unimodal Objective Function: whenever x1 and x2 are feasible and f(x2) is better than f(x1), the direction d = x2 − x1 is an improving direction at x1.
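A quick numerical sketch of Def 3.26 on the unimodal function f(x,y) = (x − 2)² + (y − 2)²; the two feasible points are arbitrary choices.

```python
import numpy as np

f = lambda z: (z[0] - 2)**2 + (z[1] - 2)**2   # unimodal (convex)

x1 = np.array([0.0, 0.0])   # f(x1) = 8
x2 = np.array([3.0, 3.0])   # f(x2) = 2, better for minimization
d = x2 - x1                 # Def 3.26 says d improves at x1

print(f(x1 + 0.1 * d) < f(x1))   # True: a small step along d reduces f
```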

23 Unimodal – For Minimization
[Figure: two curves; a unimodal function with a single valley, and a function that is not unimodal, with more than one local minimum.]

24 Linear Functions Are Unimodal
Def 3.28 P 112 If the objective function is unimodal, then every unconstrained local optimum is an unconstrained global optimum.

25 Convex Sets Def 3.29 If x and y are in the set, then all points on the line segment between x and y are also in the set.
[Figure: four example sets, two convex and two not convex.]
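Def 3.29 can be spot-checked numerically by sampling the segment; a sketch (the two example sets are my own, and sampling is only a check, not a proof):

```python
import numpy as np

def segment_stays_inside(in_set, x, y, samples=101):
    # Def 3.29: every point (1 - t) x + t y on the segment is in the set.
    for t in np.linspace(0, 1, samples):
        if not in_set((1 - t) * np.asarray(x) + t * np.asarray(y)):
            return False
    return True

half_plane = lambda p: p[0] + p[1] <= 4                 # convex
two_disks = lambda p: (np.linalg.norm(p) <= 1 or
                       np.linalg.norm(p - 5) <= 1)      # disks at (0,0), (5,5)

print(segment_stays_inside(half_plane, [0, 0], [1, 2]))  # True
print(segment_stays_inside(two_disks, [0, 0], [5, 5]))   # False: leaves the set
```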

26 Discrete Sets Def 3.30 Discrete sets (with more than one point) are not convex: the line segment between two points of the set leaves the set.

27 Linear Constraints Def 3.32 If all constraints are linear, then the feasible set of points is a convex set.
[Figure: a polyhedral feasible region, a convex set.]

28 Local Optimum = Global Optimum
If the objective function is unimodal and the constraint set is convex, then a local optimum is a global optimum. Improving search works for this type of problem.

