
1 Iterative Non-Linear Optimization Methods
Earth Red Sea Synthetics (Sanzong Zhang)

2 Outline
1. Overview: Non-linear Problem
2. Taylor Series: Gradient & Hessian
3. Newton Method

3 Non-Linear Optimization Problem
Objective function. Given: f(x). Find: x* such that f(x*) = min f(x), where x ∈ R^N. (Figure: examples for N = 1 and N = 2.)

4 Non-Linear Optimization Problem
Objective function. Given: f(x). Find: x* such that f(x*) = min f(x), where x ∈ R^N. Misfit function: ||d_obs(x,t) − d_syn(x,t)||, the difference between observed and synthetic data; in practice N >> 2. (Figure: observed vs. synthetic traces, time axis in seconds.)

5 Outline
1. Overview: Non-linear Problem
2. Taylor Series: Gradient & Hessian
3. Newton Method

6 Multi-dimensional Taylor Series
f(x) is a non-linear function of x, but we force a quadratic relationship by truncating the Taylor series, neglecting terms beyond quadratic:
f(x0 + Δx) ≈ f(x0) + ∇f(x0)ᵀ Δx + (1/2) Δxᵀ H Δx,
where H is the Hessian. The goal is to find x* by linearizing the resulting equation in Δx = x* − x0.

7 f(x) is minimum at x* if
The misfit gradient vanishes, f'(x*) = 0, and the curvature is positive, f''(x*) > 0. In 2D, a stationary point where the curvatures have mixed signs is a saddle, not a minimum.

8 Take derivative of f(x) to Linearize Taylor Series
Differentiate the truncated Taylor series with respect to Δx and set the result to zero:
∇f(x0 + Δx) ≈ ∇f(x0) + H Δx = 0.

9 Take derivative of f(x) to Linearize Taylor Series
Solving for Δx gives the Newton step: Δx = −H⁻¹ ∇f(x0), i.e. x* ≈ x0 − H⁻¹ ∇f(x0).

10 Take derivative of f(x) to Linearize Taylor Series
Is this the optimal solution? Not necessarily: the actual contours of f(x) are not an ellipse, so the quadratic model is only approximate. We therefore take x0 + Δx as a new starting point and iterate.

11 Iterative Newton Method
Take the derivative of f(x) to linearize the Taylor series at the current point, solve for the step, and repeat: x^(k+1) = x^(k) − H(x^(k))⁻¹ ∇f(x^(k)). Each iteration re-expands the Taylor series about the latest estimate until convergence.

12 Iterative Newton Method (Rosenbrock Example)

13 Newton Code
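The code slide itself is not reproduced in the transcript, so here is a minimal MATLAB sketch of the iterative Newton method applied to the Rosenbrock function of the previous slide; the starting point, iteration cap, and stopping tolerance are illustrative assumptions.

% Newton iterations for the Rosenbrock function
% f(x) = (1 - x1)^2 + 100*(x2 - x1^2)^2
x = [-1.2; 1.0];                                % assumed starting point
for k = 1:50
    x1 = x(1); x2 = x(2);
    g = [-2*(1 - x1) - 400*x1*(x2 - x1^2);      % gradient of f
          200*(x2 - x1^2)];
    H = [2 - 400*(x2 - x1^2) + 800*x1^2, -400*x1;   % Hessian of f
         -400*x1,                          200];
    dx = -H \ g;                                % Newton step: solve H*dx = -g
    x  = x + dx;
    if norm(g) < 1e-8, break, end               % stop when the gradient is tiny
end
disp(x)                                         % approaches the minimizer [1; 1]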

14 Gradient Interpretation
Directional derivative along a unit direction s: df/dα = ∇f(x)ᵀ s, the projection of the gradient of f(x) onto s.

15 Gradient Interpretation
Second directional derivative along s: d²f/dα² = sᵀ H s, the curvature of f(x) along s. A quick numerical check of both formulas is sketched below.
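A small MATLAB check of the two formulas on the Rosenbrock function; the test point and direction are arbitrary illustrative choices, not values from the slides.

% Compare df/da with g'*s and d2f/da2 with s'*H*s for the Rosenbrock function
f  = @(x) (1 - x(1))^2 + 100*(x(2) - x(1)^2)^2;
x0 = [0.5; 0.5];                              % assumed test point
s  = [1; 2] / norm([1; 2]);                   % assumed unit direction
g  = [-2*(1 - x0(1)) - 400*x0(1)*(x0(2) - x0(1)^2); 200*(x0(2) - x0(1)^2)];
H  = [2 - 400*(x0(2) - x0(1)^2) + 800*x0(1)^2, -400*x0(1); -400*x0(1), 200];
h  = 1e-4;                                    % finite-difference step along s
slope = (f(x0 + h*s) - f(x0 - h*s)) / (2*h);          % numerical df/da
curv  = (f(x0 + h*s) - 2*f(x0) + f(x0 - h*s)) / h^2;  % numerical d2f/da2
fprintf('slope: %g vs %g   curvature: %g vs %g\n', slope, g'*s, curv, s'*H*s)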

16 Eigenvalues of 2x2 H Determine Shape of f(x)
Assume H e1 = λ1 e1 and H e2 = λ2 e2. The eigenvectors e1, e2 give the principal axes of the quadratic contours, and the eigenvalues λ1, λ2 give the curvature of f(x) along each axis.

17 Eigenvalues of 2x2 H Determine Shape of f(x)
S.P.D. (both λ > 0): a minimum; small eigenvalues give broad minima.
S.N.D. (both λ < 0): a maximum; small |λ| gives broad maxima.
Symmetric indefinite (mixed signs): a saddle point.
Ill-conditioned (λ1 >> λ2): long, narrow, elongated contours.

18 MATLAB Exercise
Determine whether a given H is SPD, SND, indefinite, or ill-conditioned; one way to do this is sketched below.
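A minimal MATLAB sketch of one way to classify a Hessian; the example matrix and the conditioning threshold are illustrative assumptions, not values from the exercise.

H = [4 1; 1 3];                            % assumed example of a symmetric Hessian
lam = eig((H + H')/2);                     % eigenvalues (symmetrize for safety)
kappa = max(abs(lam)) / min(abs(lam));     % condition number of H
if all(lam > 0)
    fprintf('SPD (minimum), condition number %.3g\n', kappa)
elseif all(lam < 0)
    fprintf('SND (maximum), condition number %.3g\n', kappa)
else
    disp('Symmetric indefinite (saddle point)')
end
if kappa > 1e3                             % assumed threshold for "ill-conditioned"
    disp('Ill-conditioned: contours are strongly elongated')
end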

19 Traveltime Tomography
(Figure: first-arrival rays recorded over a velocity model 6 km deep by 20 km wide, with velocities between 3 km/s and 6 km/s.)

20 Traveltime Tomography
0. Discretize the model into cells with slowness s_j.
1. Forward model d = Lm: the traveltime of ray i is t_i = Σ_j l_ij s_j, where l_ij is the length of ray i in cell j (a toy version is sketched below).
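A toy MATLAB sketch of the forward model t_i = Σ_j l_ij s_j, using straight horizontal rays (one per depth row) so that every nonzero l_ij is just the cell width; the grid size and the linear velocity gradient are illustrative assumptions, and real first-arrival rays would come from ray tracing.

% Traveltimes of straight horizontal rays through a gridded slowness model
nz = 30; nx = 100; dx = 0.2;              % assumed 6 km x 20 km grid, 0.2 km cells
v  = 3 + 3*(0:nz-1)'/(nz-1);              % velocity varying from 3 to 6 km/s with depth
s  = repmat(1./v, 1, nx);                 % slowness s_j in each cell
t  = sum(dx * s, 2);                      % t_i = sum_j l_ij * s_j with l_ij = dx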

21 Traveltime Tomography
1. Forward model: d = Lm, i.e. t_i = Σ_j l_ij s_j.
2. Linearize: a slowness perturbation gives δt_i = Σ_j l_ij δs_j, i.e. δd = L δm.
3. Misfit: ε = ½ ||Lm − d_obs||².

22 Traveltime Tomography
1. Forward model: d = Lm.
2. Linearize: δd = L δm.
3. Misfit: ε = ½ ||Lm − d_obs||².
4. Set the variation of ε with respect to the slowness to zero: Lᵀ(Lm − d_obs) = 0.
5. Gauss-Newton method: m^(k+1) = m^(k) − [LᵀL]⁻¹ Lᵀ(L m^(k) − d_obs). A toy numerical version is sketched below.
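A toy MATLAB sketch of steps 1–5 with a synthetic kernel; the problem size, the random stand-in for L, and the starting model are illustrative assumptions (a real L would come from ray tracing).

% Gauss-Newton inversion of d = L*m for a toy traveltime problem
rng(0)
nrays = 200; ncells = 50;                 % assumed toy problem size
L = rand(nrays, ncells);                  % stand-in for the ray-length kernel l_ij
m_true = 1 ./ (3 + 3*rand(ncells, 1));    % true slowness (velocities of 3-6 km/s)
d = L * m_true;                           % "observed" traveltimes (noise-free)
m = ones(ncells, 1) / 4;                  % starting slowness model (4 km/s)
for k = 1:5
    r  = L*m - d;                         % residual (step 3)
    dm = -(L'*L) \ (L'*r);                % Gauss-Newton step from the normal equations
    m  = m + dm;                          % update (step 5)
end
fprintf('relative model error: %g\n', norm(m - m_true)/norm(m_true))

Because this toy forward model is exactly linear, the loop converges in a single iteration; the iterations matter in practice because the ray paths, and hence L, are updated as the slowness model changes.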

23 Alternative Traveltime Tomography

24 Conventional Least Squares Solution
Given: Lm = d, where L is in general a matrix of huge dimension.
Find: m such that ||Lm − d||² is minimized.
Direct solution (normal equations): m = [LᵀL]⁻¹ Lᵀ d.
Iterative (steepest descent) solution: m^(k+1) = m^(k) − α Lᵀ(L m^(k) − d).
Problem: each prediction L m^(k) requires a finite-difference solve, and L is too big for I/O-bound hardware. Solution: blend and encode the data. A matrix-free sketch of the iterative update is given below.
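A minimal MATLAB sketch of the steepest-descent update, which needs only products with L and Lᵀ and never forms LᵀL; the toy L and d and the exact-line-search choice of the step length α are illustrative assumptions (the slides do not say how α is chosen).

% Steepest descent for min ||L*m - d||^2 using only L*v and L'*v products
L = rand(200, 50); d = L * (rand(50, 1) / 4);   % toy stand-ins for the kernel and data
m = zeros(size(L, 2), 1);                       % starting model
for k = 1:100
    r = L*m - d;                                % residual
    g = L' * r;                                 % gradient L'(L*m - d)
    Lg = L * g;
    alpha = (g'*g) / (Lg'*Lg);                  % exact line-search step length
    m = m - alpha * g;                          % m^(k+1) = m^(k) - alpha * L'(L*m - d)
    if norm(g) < 1e-6 * norm(d), break, end     % stop when the gradient is small
end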

25 Summary
Given: f(m). Find: m* such that f(m*) = min f(m), where m ∈ R^N.
Misfit function: f(m) = Σ_i ||W_i [d_obs(x,t) − d(x,t)]||², with residual r = Lm − d.
Steepest descent: m^(k+1) = m^(k) − α ∇f(m^(k)) = m^(k) − α Lᵀ(L m^(k) − d).
Newton (Gauss-Newton): m^(k+1) = m^(k) − [LᵀL]⁻¹ Lᵀ(L m^(k) − d).

