Surrogate-based constrained multi-objective optimization

Aerospace design is synonymous with long-running, computationally intensive simulations, which are employed in the search for optimal designs in the presence of multiple, competing objectives and constraints. The difficulty of this search is often exacerbated by numerical ‘noise’ and inaccuracies in simulation data, and by the frailties of complex simulations, which often fail to return a result. Surrogate-based optimization methods can be employed to solve, mitigate, or circumvent the problems associated with such searches. This presentation gives an overview of constrained multi-objective optimization using Gaussian process based surrogates, with an emphasis on dealing with real-world problems. Alex Forrester, 3rd July 2009

Coming up:
– Surrogate model based optimization – the basic idea
– Gaussian process based modelling
– Probability of improvement and expected improvement
– Missing data
– Noisy data
– Constraints
– Multiple objectives

Surrogate model based optimization. The surrogate is used to expedite the search for the global optimum; global accuracy of the surrogate is not a priority. (Flowchart: PRELIMINARY EXPERIMENTS, SAMPLING PLAN → OBSERVATIONS → CONSTRUCT SURROGATE(S) – design sensitivities available? multi-fidelity data? → SEARCH INFILL CRITERION, i.e. optimization using the surrogate(s) – constraints present? noise in data? multiple design objectives? → ADD NEW DESIGN(S), looping back to the observations.)
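The loop in the flowchart can be sketched in a few lines. This is a hypothetical illustration, not code from the presentation: the callables `evaluate`, `build_surrogate` and `search_infill` are placeholders for the simulation, the surrogate construction and the infill-criterion search.

```python
# Minimal sketch of the surrogate-based optimization loop. All function
# arguments are hypothetical placeholders, not the presentation's code.

def surrogate_optimize(sample_plan, evaluate, build_surrogate,
                       search_infill, n_infill):
    # Preliminary experiments / sampling plan -> observations
    X = list(sample_plan)
    Y = [evaluate(x) for x in X]
    for _ in range(n_infill):
        model = build_surrogate(X, Y)   # construct surrogate(s)
        x_new = search_infill(model)    # optimize the infill criterion
        X.append(x_new)                 # add new design
        Y.append(evaluate(x_new))
    best = min(range(len(Y)), key=Y.__getitem__)
    return X[best], Y[best]

# Toy usage: the "simulation" is a cheap quadratic and the "infill
# search" simply proposes the midpoint of the two best designs so far.
def midpoint_infill(model):
    X, Y = model
    a, b = sorted(zip(Y, X))[:2]
    return 0.5 * (a[1] + b[1])

x_best, y_best = surrogate_optimize(
    [0.0, 1.0], lambda x: (x - 0.3) ** 2,
    lambda X, Y: (X, Y), midpoint_infill, n_infill=10)
```

In a real search the infill criterion would be one of the improvement measures discussed later, optimized over the surrogate rather than the raw data.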

Gaussian process based modelling

Building Gaussian process models, e.g. Kriging: sample the function to be predicted at a set of points.

Correlate all points using a Gaussian-type function.

20 Gaussian “bumps” with appropriate widths (chosen to maximize the likelihood of the data) are centred on the sample points.

Multiply by weightings (again chosen to maximize the likelihood of the data).

Add these together to predict the function (figure: Kriging prediction vs. true function).
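The bump-and-weight construction above can be sketched as follows. This is a minimal, hypothetical illustration rather than the presentation's code: the kernel width `theta` is fixed here for brevity (the slides tune it by maximizing the likelihood), and a small nugget is added for numerical stability; the weights then come from solving the correlation system, which makes the model interpolate the samples.

```python
import numpy as np

def gaussian_corr(xa, xb, theta=10.0):
    """Gaussian-type correlation between two 1D point sets."""
    d = xa[:, None] - xb[None, :]
    return np.exp(-theta * d ** 2)

def fit_weights(x, y, theta=10.0, nugget=1e-10):
    """Solve Psi @ w = y for the bump weightings."""
    psi = gaussian_corr(x, x, theta) + nugget * np.eye(len(x))
    return np.linalg.solve(psi, y)

def predict(x_new, x, w, theta=10.0):
    """Weighted sum of Gaussian bumps centred on the sample points."""
    return gaussian_corr(x_new, x, theta) @ w

# Toy usage: interpolate sin(2*pi*x) from 6 samples.
x = np.linspace(0.0, 1.0, 6)
y = np.sin(2 * np.pi * x)
w = fit_weights(x, y)
y_grid = predict(np.linspace(0.0, 1.0, 101), x, w)
```

At the sample points the prediction reproduces the data; between them it blends the bumps smoothly, which is the behaviour the next slides exploit.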

Optimization

Polynomial regression based search (as Devil’s advocate)

Gaussian process prediction based optimization

Gaussian process prediction based optimization (as Devil’s advocate)

But we have error estimates with Gaussian processes

Error estimates are used to construct improvement criteria: the probability of improvement and the expected improvement.

Probability of improvement: a useful global infill criterion. It is not a measure of the size of the improvement, just the chance that there will be one.

Expected improvement: a useful metric of the actual amount of improvement to be expected. It can be extended to constrained and multi-objective problems.
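The two criteria can be written down compactly. The sketch below assumes a surrogate that returns a prediction `y_hat` and an error estimate `s > 0` at a candidate point, with `y_min` the best objective value observed so far (minimization); these names are mine, not the presentation's.

```python
import math

def norm_pdf(u):
    return math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)

def norm_cdf(u):
    return 0.5 * (1.0 + math.erf(u / math.sqrt(2.0)))

def probability_of_improvement(y_hat, s, y_min):
    # Chance the true value beats the current best -- says nothing
    # about how large the improvement would be.
    return norm_cdf((y_min - y_hat) / s)

def expected_improvement(y_hat, s, y_min):
    # First moment of the improvement: the average gain to expect.
    u = (y_min - y_hat) / s
    return s * (u * norm_cdf(u) + norm_pdf(u))
```

Both criteria reward points with either a promising prediction or a large error estimate, which is what gives the infill search its exploitation/exploration balance.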

Missing Data

What if design evaluations fail? No infill point is augmented to the surrogate:
– the model is unchanged
– the optimization stalls
We need to add some information or perturb the model:
– add a random point?
– impute a value based on the prediction at the failed point, so EI goes to zero there?
– use a penalized imputation (prediction + error estimate)?
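The imputation options can be sketched as follows. `predict_with_error` is a hypothetical surrogate interface returning `(y_hat, s)` at a point; it is not code from the presentation.

```python
# Sketch of imputation for a failed design evaluation: rather than
# discarding the failed point, insert a surrogate-based value so the
# model (and hence EI) changes there.

def impute(x_fail, predict_with_error, penalized=True):
    y_hat, s = predict_with_error(x_fail)
    # Plain imputation makes expected improvement zero at x_fail;
    # adding the error estimate penalizes the failed region further,
    # pushing the search away from designs likely to fail.
    return y_hat + s if penalized else y_hat
```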

Aerofoil design problem: two shape functions (f1, f2) are altered. The potential flow solver (VGK) has a ~35% failure rate. A 20-point optimal Latin hypercube is used, with max{E[I(x)]} updates until within one drag count of the optimum.

Results

A typical penalized imputation based optimization

Four variable problem: f1, f2, f3, f4 are varied; the failure rate is 82%.

A typical four variable penalized imputation based optimization. Legend as for the two variable case; red crosses indicate imputed update points. Regions of infeasible geometries are shown in dark blue; blank regions represent flow solver failure.

‘Noisy’ Data

Many data sets are corrupted by noise. We are usually interested in deterministic ‘noise’ – for example, the ‘noise’ in aerofoil drag data due to discretization of the Euler equations.

Failure of interpolation based infill: the surrogate becomes excessively snaky, the error estimates increase, and the search becomes too global.

Regression improves the model: add a regularization constant to the correlation matrix. The last plot of the previous slide is improved.

Failure of regression based infill: regularization assumes error at the sample locations (brought in through the constant λ added to the correlation matrix), which leads to an expectation of improvement at points already sampled. This is OK for stochastic noise, but the search stalls for deterministic simulations.

Use “re-interpolation”: the error due to noise is ignored using a new variance formulation, leaving only the modelling error, and the search proceeds as desired.
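The variance formulation referred to on this slide is not reproduced in the transcript. Reconstructed from the ‘noisy’ computer experiments paper listed in the references – so treat the notation as an assumption rather than a quote of the slide – the re-interpolation error estimate takes the form

s²_ri(x) = σ²_ri [ 1 − ψᵀ (Ψ + λI)⁻¹ Ψ (Ψ + λI)⁻¹ ψ ],

where Ψ is the correlation matrix, ψ is the vector of correlations between the prediction point and the samples, and λ is the regularization constant. This estimate returns to zero at the sample points, so the expected improvement vanishes there and the search is not drawn back to designs that have already been evaluated.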

Two variable aerofoil example: same parameterization as the missing data problem; a coarse mesh causes the ‘noise’.

Interpolation – very global

Regression – stalls

Re-interpolation – searches local basins, but finds the global optimum

Constrained EI

Probability of constraint satisfaction: g(x) is the constraint function, and F = G(x) − g_min is a measure of feasibility, where G(x) is a random variable.

It’s just like the probability of improvement, but with a limit, not a minimum. (Figure legend: probability of satisfaction; prediction of the constraint function; constraint function; constraint limit.)

Constrained probability of improvement: the probability of improvement conditional upon constraint satisfaction. Simply multiply the two probabilities.

Constrained expected improvement: the expected improvement conditional upon constraint satisfaction. Again, a simple multiplication.
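The multiplication can be sketched directly, assuming independent Gaussian models of the objective and the constraint. The symbol names are mine: `(y_hat, s_y)` is the objective prediction and error, `y_min` the best feasible value so far, `(g_hat, s_g)` the constraint prediction and error, and `g_min` the constraint limit (feasible when G(x) < g_min).

```python
import math

def norm_pdf(u):
    return math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)

def norm_cdf(u):
    return 0.5 * (1.0 + math.erf(u / math.sqrt(2.0)))

def constrained_ei(y_hat, s_y, y_min, g_hat, s_g, g_min):
    # Expected improvement on the objective...
    u = (y_min - y_hat) / s_y
    ei = s_y * (u * norm_cdf(u) + norm_pdf(u))
    # ...multiplied by the probability of constraint satisfaction,
    # P[G(x) < g_min], i.e. P[F < 0] with F = G(x) - g_min.
    p_feasible = norm_cdf((g_min - g_hat) / s_g)
    return ei * p_feasible
```

Designs predicted to violate the constraint have their improvement discounted toward zero, so the infill search is steered into the feasible region.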

A 1D example

After one infill point

A 2D example

Multi-objective EI

Pareto optimization: we want to identify a set of non-dominated solutions, which define the Pareto front. We can then formulate an expectation of improvement on the current non-dominated solutions.

Multi-dimensional Gaussian process: consider a two-objective problem. The random variables Y1 and Y2 have a 2D probability density function.

Probability of improving on one point: we need to integrate the 2D pdf.
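When the two objective models are taken as independent Gaussians – an assumption on my part, made so the 2D integral factorises – the probability of improving on a single non-dominated point `(y1_star, y2_star)` is just a product of two 1D CDFs. The full criterion on the next slide integrates under the whole front; this sketch covers only the one-point case.

```python
import math

def norm_cdf(u):
    return 0.5 * (1.0 + math.erf(u / math.sqrt(2.0)))

def prob_improve_on_point(y1_hat, s1, y1_star, y2_hat, s2, y2_star):
    # P[Y1 < y1_star and Y2 < y2_star] for independent Gaussian
    # objectives: the probability of dominating this Pareto point.
    return (norm_cdf((y1_star - y1_hat) / s1) *
            norm_cdf((y2_star - y2_hat) / s2))
```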

Integrating under all non-dominated solutions: the EI is the first moment of this integral about the Pareto front (see book).

A 1D example

Matlab demo

Nowacki beam: a fixed length steel cantilever beam under a 5 kN load.
Variables:
– height
– width
Objectives:
– minimize cross-section area
– minimize bending moment
Constraints:
– area ratio
– bending moment
– buckling
– deflection
– shear

Problem setup: a 10-point optimal Latin hypercube; a Kriging model of each objective and constraint, with parameters tuned by a GA + SQP (using the adjoint of the likelihood); 20 points added at the maximum constrained multi-objective expected improvement.

Sampling plan

Initial trade-off

5 updates

10 updates

15 updates

20 updates

Final trade-off

Summary
– Surrogate based optimization offers answers to, or ways around, many of the problems associated with real-world optimization.
– This seemingly blunt tool must, however, be used with precision, as there are many traps to fall into.
– In a multi-objective context, the use of surrogates is particularly promising.
– There has not been time to cover newer surrogate methods (e.g. blind Kriging), multi-fidelity modelling, or enhancements to EI in terms of its exploitation/exploration trade-off properties.

References
A. I. J. Forrester, A. Sóbester, A. J. Keane, Engineering Design via Surrogate Modelling: A Practical Guide, John Wiley & Sons, Chichester, 240 pages.
A. I. J. Forrester, A. J. Keane, Recent advances in surrogate-based optimization, Progress in Aerospace Sciences, 45, 50–79.
A. I. J. Forrester, A. Sóbester, A. J. Keane, Optimization with missing data, Proc. R. Soc. A, 462(2067).
A. I. J. Forrester, N. W. Bressloff, A. J. Keane, Design and analysis of ‘noisy’ computer experiments, AIAA Journal, 44(10).
All code at

Gratuitous publicity