Iterative Non-Linear Optimization Methods

Presentation transcript:

Iterative Non-Linear Optimization Methods
[Title-slide figures: the Earth; Red Sea synthetics (Sanzong Zhang)]

Outline
1. Overview: Non-linear Problem
2. Taylor Series: Gradient & Hessian
3. Newton Method

Non-Linear Optimization Problem
Objective function. Given f(x), find x* such that f(x*) = min f(x), where x ∈ R^N.
[Figures: example objective functions for N=1 and N=2]

Non-Linear Optimization Problem
In seismic inversion the objective is a misfit function between observed and synthetic data: f(x) = ||d_obs(x,t) - d_syn(x,t)||, where x ∈ R^N and N >> 2.
[Figures: N=1 and N=2 examples; observed vs. synthetic traces over 0-4 s]

Outline
1. Overview: Non-linear Problem
2. Taylor Series: Gradient & Hessian
3. Newton Method

Multi-Dimensional Taylor Series
f(x) is a non-linear function of x, but we force a quadratic relationship by truncation, neglecting terms beyond quadratic:
f(x) ≈ f(x₀) + ∇f(x₀)^T Δx + (1/2) Δx^T H Δx,  where Δx = x - x₀ and H is the Hessian.
The goal is to find x* by linearizing this equation in Δx.
[Figure: contours of f(x) with a step Δx from x₀ toward x*]

f(x) has a minimum at x* if the misfit gradient vanishes, f′(x*) = 0, and the curvature is positive, f″(x*) > 0.
[Figures: f(x) and f′(x) near x*; in 2D a zero-gradient point can also be a saddle]

Take Derivative of f(x) to Linearize the Taylor Series
Setting the gradient of the quadratic approximation to zero:
∇f(x) ≈ ∇f(x₀) + H Δx = 0

Take Derivative of f(x) to Linearize the Taylor Series
Solving for Δx gives
Δx = -H^-1 ∇f(x₀)

Take Derivative of f(x) to Linearize the Taylor Series
Is this the optimal solution? Not necessarily: the actual contours are not an ellipse, so we start over from the new point and iterate.
[Figure: contours with successive steps Δx from x₀ toward x*]

Iterative Newton Method
Re-linearize the Taylor series about each new point and repeat the step:
x^(k+1) = x^(k) - H^-1 ∇f(x^(k))
After each step we ask again: is this the optimal solution?
[Figure: successive steps Δx from x₀ converging to x*]

Iterative Newton Method (Rosenbrock Example)

Newton Code
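
The code itself did not survive the transcript; below is a minimal MATLAB sketch of the iteration just described, applied to the Rosenbrock function with its analytic gradient and Hessian. The starting point and tolerance are illustrative choices, not from the original slide.

% Newton iteration on f(x) = (1 - x1)^2 + 100*(x2 - x1^2)^2, minimum at [1; 1]
f    = @(x) (1 - x(1))^2 + 100*(x(2) - x(1)^2)^2;
grad = @(x) [-2*(1 - x(1)) - 400*x(1)*(x(2) - x(1)^2);
              200*(x(2) - x(1)^2)];
hess = @(x) [2 - 400*(x(2) - 3*x(1)^2), -400*x(1);
             -400*x(1),                  200];
x = [-1.2; 1];                     % standard Rosenbrock starting point
for k = 1:50
    dx = -(hess(x) \ grad(x));     % solve H*dx = -g; never form inv(H)
    x  = x + dx;
    if norm(dx) < 1e-8, break; end % stop once the Newton step stalls
end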

Gradient Interpretation: Directional Derivative
The rate of change of f along a unit direction s is
df(x)/da = ∇f(x)^T s,
the projection of the gradient of f(x) onto s.
[Figure: contours with direction s at x₀, pointing toward x*]

Gradient Interpretation: Directional Curvature
The second derivative of f along s is
d^2 f(x)/da^2 = s^T H s,
the curvature of f(x) along s.
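
Both identities are easy to verify numerically with finite differences; here is a small sketch (the quadratic, H, s, and x0 below are illustrative, not from the slides):

% Check df/da = grad'*s and d2f/da2 = s'*H*s on f(x) = 0.5*x'*H*x
H  = [4 1; 1 2];  s = [1; 1]/sqrt(2);  x0 = [1; -1];  h = 1e-5;
f  = @(x) 0.5*(x'*H*x);
g  = H*x0;                                           % gradient at x0
slope = (f(x0 + h*s) - f(x0 - h*s)) / (2*h);         % df/da along s
curv  = (f(x0 + h*s) - 2*f(x0) + f(x0 - h*s)) / h^2; % d2f/da2 along s
[slope, g'*s]    % both entries agree: projection of gradient onto s
[curv,  s'*H*s]  % both entries agree: curvature of f along s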

Eigenvalues of a 2x2 H Determine the Shape of f(x)
Assume H e₁ = λ₁ e₁ and H e₂ = λ₂ e₂, with eigenvectors e₁, e₂ along the principal axes of the contours.
[Figure: elliptical contours with eigenvector directions at x₀]

Eigenvalues of a 2x2 H Determine the Shape of f(x)
S.P.D. (λ₁, λ₂ > 0): minimum; small eigenvalues give broad minima.
S.N.D. (λ₁, λ₂ < 0): maximum; small-magnitude eigenvalues give broad maxima.
Symmetric indefinite (mixed signs): saddle point.
Ill-conditioned (λ₁ >> λ₂ > 0): long, narrow valley.

MATLAB Exercise
For each given H, determine whether it is S.P.D., S.N.D., indefinite, or ill-conditioned.
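
One way to carry out the exercise is to classify H from the signs and spread of its eigenvalues; the test matrix below is an illustrative example, not one of the matrices from the original slide.

H = [4 1; 1 3];          % try also [-4 1; 1 -3], [4 1; 1 -3], [4 0; 0 4e-6]
lam = eig(H);            % eigenvalues of the symmetric matrix
if all(lam > 0),     disp('S.P.D.  -> minimum')
elseif all(lam < 0), disp('S.N.D.  -> maximum')
else,                disp('indefinite -> saddle point')
end
fprintf('condition number = %g\n', max(abs(lam))/min(abs(lam)));
% a very large condition number flags an ill-conditioned (elongated) surface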

Traveltime Tomography
[Figure: first-arrival rays through a velocity model; depth 0-6 km, offset 0-20 km, velocities ranging from 3 km/s to 6 km/s]

Traveltime Tomography
0. Discretize the model into slowness cells.
1. Forward model d = Lm: the traveltime of ray i is tᵢ = Σⱼ lᵢⱼ sⱼ, where lᵢⱼ is the length of ray i in cell j and sⱼ is the slowness of cell j.

Traveltime Tomography
1. Forward model: d = Lm
2. Linearize: δd = L δm, with ray paths fixed at the current model
3. Misfit: ε = (1/2) ||Lm - d||^2

Traveltime Tomography
1. Forward model: d = Lm
2. Linearize: δd = L δm, with ray paths fixed at the current model
3. Misfit: ε = (1/2) ||Lm - d||^2
4. Set the variation of ε with respect to the slowness to zero: ∂ε/∂m = L^T (Lm - d) = 0
5. Gauss-Newton method: m^(k+1) = m^(k) - (L^T L)^-1 L^T (L m^(k) - d)
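
A toy MATLAB version of steps 1-5 may help fix ideas; the 4-ray, 3-cell geometry and the slowness values are made up for illustration.

L = [2 1 0; 0 3 1; 1 0 2; 1 1 1];  % ray-path lengths l_ij (km), 4 rays x 3 cells
m_true = [1/6; 1/4; 1/3];          % true slownesses (s/km), i.e. 6, 4, 3 km/s
d = L*m_true;                      % observed traveltimes (s)
m = 0.25*ones(3,1);                % starting slowness model
g = L'*(L*m - d);                  % gradient of the misfit (1/2)*||Lm - d||^2
m = m - (L'*L) \ g;                % Gauss-Newton step; exact in one step here
% because d = Lm is linear; with ray bending, L depends on m and the
% linearize/solve cycle must be repeated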

Alternative Traveltime Tomography

Conventional Least Squares Solution
Given: Lm = d, with blended operator L = [L₁; L₂] and data d = [d₁; d₂]; in general L is a matrix of huge dimension.
Find: m such that min ||Lm - d||^2.
Direct solution: m = (L^T L)^-1 L^T d
Iterative (steepest-descent) solution:
m^(k+1) = m^(k) - α L^T (L m^(k) - d) = m^(k) - α [L₁^T (L₁ m^(k) - d₁) + L₂^T (L₂ m^(k) - d₂)]
Problem: L is too big for I/O-bound hardware, and each prediction L m^(k) is a finite-difference solve.
Solution: blend + encode the data.
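
A sketch of the iterative update above, showing how the full gradient splits into a sum over source groups so the two blended operators can be applied independently; the random matrices are stand-ins for the huge FD modeling operators, and the step size is an illustrative guess.

L1 = randn(50, 20);  L2 = randn(50, 20);   % stand-ins for blended operators
m_true = randn(20, 1);
d1 = L1*m_true;  d2 = L2*m_true;           % two blended data sets
m = zeros(20, 1);  alpha = 1e-3;           % illustrative step size
for k = 1:500
    g = L1'*(L1*m - d1) + L2'*(L2*m - d2); % sum of per-group gradients
    m = m - alpha*g;                       % steepest-descent update
end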

Summary
Given: f(m). Find: m* such that f(m*) = min f(m), where m ∈ R^N and the misfit function is
f(m) = Σᵢ ||W [dᵢ(x,t)_obs - dᵢ(x,t)]||^2.
With residual r = Lm - d:
Steepest descent: m^(k+1) = m^(k) - α L^T r
Non-linear Newton (Gauss-Newton): m^(k+1) = m^(k) - (L^T L)^-1 L^T r
[Figure: f(m) along m, comparing the two step directions]
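
The two updates in the summary can be compared side by side on a toy quadratic misfit; all values below are illustrative.

% f(m) = 0.5*||L*m - d||^2: many cheap steps vs. one expensive solve
L = [3 1; 1 2; 0 1];  d = [4; 3; 1];
m_sd = zeros(2,1);  alpha = 0.05;          % steepest descent
for k = 1:100
    r = L*m_sd - d;
    m_sd = m_sd - alpha*(L'*r);            % step along -L'*r
end
m_gn = (L'*L) \ (L'*d);                    % Gauss-Newton: normal equations
[m_sd, m_gn]                               % columns agree after convergence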