CS B553: Algorithms for Optimization and Learning - Univariate optimization


1 CS B553: Algorithms for Optimization and Learning - Univariate optimization

2 [Figure: a univariate function f(x) plotted against x]

3 Key Ideas: critical points; direct methods (exhaustive search, golden section search); root-finding algorithms (bisection; more next time); local vs. global optimization; analyzing errors and convergence rates.

4 [Figure 1: f(x) vs. x, with local maxima, local minima, and an inflection point marked]

5 [Figure 2a: f(x) on an interval [a,b]]

6 Find the critical points and apply the 2nd derivative test. [Figure 2b: f(x) on [a,b] with critical points marked]

7 [Figure 2b, continued]

8 The global minimum must be one of these points: a critical point or an endpoint of [a,b]. [Figure 2c]
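A minimal sketch of this recipe in Python (the cubic f(x) = x^3 - 3x on [-2, 2.5] is an illustrative example, not taken from the slides): solve f'(x) = 0 for the critical points, classify them with the 2nd derivative test, and compare them against the interval endpoints.

```python
# Critical-point recipe on a hand-picked example (not from the slides):
#   f(x)  = x^3 - 3x on [-2, 2.5]
#   f'(x) = 3x^2 - 3  -> critical points at x = -1 and x = +1
#   f''(x)= 6x        -> f''(-1) < 0 (local max), f''(+1) > 0 (local min)

def f(x):
    return x**3 - 3*x

def fsecond(x):
    return 6*x

a, b = -2.0, 2.5
critical_points = [-1.0, 1.0]   # roots of f'(x) = 0, found analytically here

for x in critical_points:
    kind = ("local min" if fsecond(x) > 0 else
            "local max" if fsecond(x) < 0 else "inconclusive")
    print(f"x = {x:+.1f}: f = {f(x):+.3f}, f'' = {fsecond(x):+.1f} -> {kind}")

# The global minimum on [a, b] must be a critical point or an endpoint.
candidates = critical_points + [a, b]
x_star = min(candidates, key=f)
print(f"global minimum on [{a}, {b}]: x = {x_star}, f = {f(x_star)}")
```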

9 Exhaustive grid search. [Figure 3: uniformly spaced sample points on [a,b]]

10 Exhaustive grid search (continued).
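A minimal sketch of exhaustive grid search (the function names and the multimodal test objective are illustrative assumptions, not from the slides): evaluate f on a uniform grid of spacing roughly ε over [a, b] and return the best sample.

```python
# Exhaustive grid search: evaluate f on a uniform grid over [a, b] with spacing ~eps
# and return the best sample. Names and the test function are illustrative.
import math

def grid_search_minimize(f, a, b, eps):
    n = max(1, math.ceil((b - a) / eps))   # number of grid intervals
    best_x, best_f = a, f(a)
    for i in range(1, n + 1):
        x = a + i * (b - a) / n
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Example: a multimodal objective on [0, 10]
f = lambda x: math.sin(3 * x) + 0.1 * (x - 4) ** 2
x_t, f_t = grid_search_minimize(f, 0.0, 10.0, eps=0.01)
print(f"grid minimizer: x_t = {x_t:.3f}, f(x_t) = {f_t:.4f}")
```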

11 Two types of errors: the geometric error |x_t - x*| (distance from the estimate x_t to the true minimizer x*) and the analytical error |f(x_t) - f(x*)| (gap in function value). [Figure 4]

12 Does exhaustive grid search achieve ε/2 geometric error? [Figure: f(x) on [a,b] with grid spacing ε and minimizer x*]

13 Does exhaustive grid search achieve ε/2 geometric error? Not necessarily for multimodal objective functions. [Figure: the geometric error relative to x* can be large]

14 Lipschitz continuity: |f(x) - f(y)| ≤ K|x - y| for all x, y. [Figure 5: f bounded by cones of slope +K and -K]

15 Exhaustive grid search achieves Kε/2 analytical error in the worst case. [Figure 6: grid spacing ε on [a,b]]
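Written out, the bound is a one-line consequence of Lipschitz continuity (assuming f is K-Lipschitz and the grid spacing is ε): the minimizer x* lies within ε/2 of some grid point, and the returned grid point can only do better than that point.

```latex
% Some grid point x_i satisfies |x_i - x^*| \le \varepsilon/2, and the search
% returns x_t = \arg\min_i f(x_i), so f(x_t) \le f(x_i). Hence
\[
  f(x_t) - f(x^*) \;\le\; f(x_i) - f(x^*) \;\le\; K\,|x_i - x^*| \;\le\; \tfrac{K\varepsilon}{2}.
\]
```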

16 Golden section search: maintain a bracket [a,b] with an intermediate point m such that f(m) < f(a), f(b). [Figure 7a]

17 Golden section search: introduce a new point c, giving candidate bracket 1 = [a,m] and candidate bracket 2 = [c,b]. [Figure 7b]

18 Golden section search (continued). [Figure 7b]

19 Golden section search (continued), with the new point c shown. [Figure 7b]

20 Golden section search (continued): the reduced bracket. [Figure 7b]

21 Optimal choice of c is based on the golden ratio: choose c so that (c - a)/(m - c) = φ, where φ = (1 + √5)/2 ≈ 1.618 is the golden ratio. The bracket is then reduced by a factor of φ - 1 ≈ 0.618 at each step.
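A minimal sketch of golden section search in Python (a standard two-interior-point formulation, which I am assuming is equivalent to the single-point bracket update on these slides; the function names and test objective are illustrative): each iteration shrinks the bracket by a factor of φ - 1 ≈ 0.618 and reuses one previously computed interior value, so only one new evaluation of f is needed per step.

```python
import math

def golden_section_minimize(f, a, b, tol=1e-6):
    """Minimize a unimodal f on [a, b] by golden section search."""
    invphi = (math.sqrt(5) - 1) / 2      # phi - 1 = 1/phi ~= 0.618
    c = b - invphi * (b - a)             # lower interior point
    d = a + invphi * (b - a)             # upper interior point
    fc, fd = f(c), f(d)
    while (b - a) > tol:
        if fc < fd:                      # minimum lies in [a, d]
            b, d, fd = d, c, fc          # old lower point becomes the upper one
            c = b - invphi * (b - a)
            fc = f(c)
        else:                            # minimum lies in [c, b]
            a, c, fc = c, d, fd          # old upper point becomes the lower one
            d = a + invphi * (b - a)
            fd = f(d)
    return (a + b) / 2

# Example: a unimodal objective on [0, 4]
f = lambda x: (x - 1.7) ** 2 + 0.5
print(golden_section_minimize(f, 0.0, 4.0))   # ~= 1.7
```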

22 Notes

23 Root finding: find the x-value where f'(x) crosses 0. [Figure 8: f(x) and f'(x)]

24 Bisection: maintain a bracket [a,b] with the invariant sign(g(a)) != sign(g(b)), where g is the function whose root is sought (here g = f'). [Figure 9a]

25 Bisection (continued): evaluate g at the midpoint m. Invariant: sign(g(a)) != sign(g(b)). [Figure 9]

26 Bisection (continued): keep the half-bracket that preserves the sign-change invariant. [Figure 9]

27 Bisection (continued): evaluate g at the new midpoint m. Invariant: sign(g(a)) != sign(g(b)). [Figure 9]

28 Bisection (continued): shrink the bracket again, preserving the invariant. [Figure 9]

29 Bisection (continued): repeat with the next midpoint m. [Figure 9]

30 Bisection converges linearly: the bracket size is reduced by a factor of 0.5 at each iteration, while the invariant sign(g(a)) != sign(g(b)) is maintained. [Figure 9]
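A minimal sketch of the bisection loop (assuming g is continuous on [a, b] and the initial bracket satisfies the sign-change invariant; the function name and tolerance are illustrative), applied here to minimizing a quadratic by finding the root of its derivative.

```python
def bisection_root(g, a, b, tol=1e-8, max_iter=200):
    """Find a root of g on [a, b], assuming sign(g(a)) != sign(g(b))."""
    ga, gb = g(a), g(b)
    if ga * gb > 0:
        raise ValueError("bracket does not satisfy the sign-change invariant")
    for _ in range(max_iter):
        m = 0.5 * (a + b)            # midpoint of the current bracket
        gm = g(m)
        if gm == 0 or (b - a) < tol:
            return m
        if ga * gm < 0:              # sign change in [a, m]: keep the left half
            b, gb = m, gm
        else:                        # otherwise keep the right half [m, b]
            a, ga = m, gm
    return 0.5 * (a + b)

# Example: minimize f(x) = (x - 2)**2 by finding the root of f'(x) = 2*(x - 2)
fprime = lambda x: 2 * (x - 2)
print(bisection_root(fprime, 0.0, 5.0))   # ~= 2.0
```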

31 Next time: root-finding methods with superlinear convergence; practical issues.

