Math for CS, Tutorial 5-6: Function Optimization. Line Search. Taylor Series for R^n. Steepest Descent.



Line search

Line search runs as follows. Let

  φ(α) = f(x_k + α p_k)

be the scalar function of α representing the values of f along the direction p_k. Let (a, b, c) be three values of α bracketing the (constrained) minimizer α*, that is, a < α* < c with φ(b) < φ(a) and φ(b) < φ(c). Then repeating the following bisection step approaches α* arbitrarily closely:

If φ(a) ≥ φ(c), set u = (a + b)/2;
  if φ(u) < φ(b), the new bracket is (a, b, c) = (a, u, b);
  else (a, b, c) = (u, b, c).
If φ(a) < φ(c), set u = (b + c)/2;
  if φ(u) < φ(b), the new bracket is (a, b, c) = (b, u, c);
  else (a, b, c) = (a, b, u).

(Figure: a bracket a < b < c with the trial point u.)
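The bracket-shrinking procedure above can be sketched in Python (a minimal version; the function name `line_search`, the iteration cap, and the quadratic test function are illustrative choices, not from the slides):

```python
def line_search(phi, a, b, c, tol=1e-6, max_iter=200):
    """Shrink the bracket (a, b, c) around the minimizer of phi.

    Assumes a < b < c with phi(b) < phi(a) and phi(b) < phi(c)."""
    for _ in range(max_iter):
        if c - a <= tol:
            break
        if phi(a) >= phi(c):
            u = (a + b) / 2.0          # try the left half
            if phi(u) < phi(b):
                a, b, c = a, u, b
            else:
                a, b, c = u, b, c
        else:
            u = (b + c) / 2.0          # try the right half
            if phi(u) < phi(b):
                a, b, c = b, u, c
            else:
                a, b, c = a, b, u
    return b

# The minimizer of phi(t) = (t - 0.3)^2 on the bracket (0, 0.5, 1) is t = 0.3.
alpha = line_search(lambda t: (t - 0.3) ** 2, 0.0, 0.5, 1.0)
```

Each step replaces the bracket with a strictly smaller one that still contains the minimizer, so the returned midpoint converges to it.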

Taylor Series

The Taylor series for a function f of one variable is

  f(x + h) = Σ_{n=0}^∞ f^(n)(x) h^n / n! = f(x) + f'(x) h + (1/2) f''(x) h^2 + ...,

where f^(n) denotes the n-th derivative of f. For a function of m variables, the corresponding expression is

  f(x + h) = f(x) + ∇f(x)^T h + (1/2) h^T H(x) h + ...,

where ∇f(x) is the gradient and H(x) is the Hessian matrix of second partial derivatives, H_ij = ∂²f/∂x_i∂x_j.
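As a quick numerical illustration (my own example, not from the slides), the partial sums of the one-variable series converge rapidly for e^x, all of whose derivatives equal e^x:

```python
import math

def taylor_exp(x, h, n_terms):
    """Partial sum of the Taylor series of e^x about x:
    every derivative f^(n)(x) is e^x."""
    return sum(math.exp(x) * h ** n / math.factorial(n) for n in range(n_terms))

approx = taylor_exp(0.0, 0.5, 8)   # 8 terms about x = 0
exact = math.exp(0.5)
```

With only 8 terms the remainder is already on the order of h^8 / 8!.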

2D Taylor Series: Example

Consider the elliptic function f(x, y) = (x-1)² + (2y-2)² and find the first three terms of its Taylor expansion.

The gradient is ∇f(x, y) = (2(x-1), 8(y-1))^T and the Hessian is H = diag(2, 8), so around a point (x₀, y₀):

  f(x₀ + h₁, y₀ + h₂) = f(x₀, y₀) + 2(x₀-1) h₁ + 8(y₀-1) h₂ + h₁² + 4 h₂².

Because f is quadratic, these three terms reproduce f exactly.
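A small NumPy check (a sketch; the analytic gradient and Hessian below follow from differentiating the slide's f) confirms that the three Taylor terms reproduce the quadratic f exactly:

```python
import numpy as np

# Elliptic function from the slide and its analytic derivatives:
# grad f = (2(x-1), 8(y-1)),  Hessian = diag(2, 8).
f = lambda x, y: (x - 1) ** 2 + (2 * y - 2) ** 2
grad = lambda x, y: np.array([2 * (x - 1), 8 * (y - 1)])
H = np.diag([2.0, 8.0])

x0, y0 = 0.0, 0.0
h = np.array([0.3, -0.2])

# First three Taylor terms: f + grad.h + 0.5 h^T H h
taylor = f(x0, y0) + grad(x0, y0) @ h + 0.5 * h @ H @ h
exact = f(x0 + h[0], y0 + h[1])
```

Since f has no terms beyond second order, `taylor` and `exact` agree to machine precision for any displacement h.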

Steepest Descent

For the same elliptic function f(x, y) = (x-1)² + (2y-2)², find the first step of steepest descent starting at the origin.

At x₀ = (0, 0)^T the gradient is ∇f(0, 0) = (-2, -8)^T, so the steepest-descent direction is -∇f(0, 0) = (2, 8)^T, and the first step is x₁ = x₀ + α (2, 8)^T for a step size α > 0 still to be chosen.

(Figure: contours of f with the direction -∇f(0) drawn from the origin.)

Steepest Descent (continued)

Now find the step α in the direction of the negative gradient that minimizes the function along that line:

  φ(α) = f(2α, 8α) = (2α - 1)² + (16α - 2)².

Setting φ'(α) = 4(2α - 1) + 32(16α - 2) = 520α - 68 = 0 gives α = 17/130, and therefore

  x₁ = (2α, 8α)^T = (17/65, 68/65)^T ≈ (0.262, 1.046)^T.
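For a quadratic f with Hessian H, the exact line-search step along a direction d is α = (dᵀd)/(dᵀHd) (a standard identity, not shown on the slide); the short check below, assuming NumPy, reproduces α = 17/130:

```python
import numpy as np

grad = lambda p: np.array([2 * (p[0] - 1), 8 * (p[1] - 1)])
Hm = np.diag([2.0, 8.0])           # Hessian of f(x,y) = (x-1)^2 + (2y-2)^2

x0 = np.array([0.0, 0.0])
d = -grad(x0)                      # steepest-descent direction (2, 8)

# Exact step for a quadratic: d.d = 68, d.H.d = 520, so alpha = 68/520 = 17/130.
alpha = (d @ d) / (d @ Hm @ d)
x1 = x0 + alpha * d
```

The same numbers fall out of the derivative condition 520α - 68 = 0 worked on the slide.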

Steepest Descent (continued)

Is x₁ a minimum? No: the minimizer of f is (1, 1)^T, while ∇f(x₁) = (-96/65, 24/65)^T ≠ 0. The next step repeats the procedure from x₁. Because the step size comes from an exact line search, each new gradient is orthogonal to the previous search direction, so the iterates follow the characteristic zig-zag path of steepest descent toward (1, 1)^T.