STRUCTURAL AND MULTIDISCIPLINARY OPTIMIZATION GROUP. INCLUDING UNCERTAINTY MODELS FOR MORE EFFICIENT SURROGATE-BASED DESIGN OPTIMIZATION: The EGO algorithm. Raphael T. Haftka (haftka@ufl.edu), Felipe A. C. Viana (felipeacviana@gmail.com)
BACKGROUND: SURROGATE MODELING Surrogates replace expensive simulations in design optimization. Examples: kriging (KRG), polynomial response surface (PRS), support vector regression (SVR), radial basis neural networks (RBNN). In each case the surrogate prediction ŷ(x) is an estimate of the true response y(x). Forrester AIJ and Keane AJ, Recent advances in surrogate-based optimization, Progress in Aerospace Sciences, Vol. 45, No. 1-3, pp. 50-79, 2009.
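To make the idea concrete, here is a minimal sketch (not from the slides; the 1D test function, the scikit-learn SVR, and the quadratic PRS are illustrative choices) of fitting two surrogate types so that each prediction ŷ(x) serves as an estimate of y(x):

```python
# Illustrative sketch: fit a polynomial response surface (PRS) and a support
# vector regression (SVR) surrogate to a few samples of a 1D test function
# standing in for an expensive simulation.
import numpy as np
from sklearn.svm import SVR

def y(x):  # stand-in for the expensive simulation
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

X = np.linspace(0, 1, 6).reshape(-1, 1)   # small design of experiments
Y = y(X).ravel()

# PRS: a quadratic least-squares fit
prs_coef = np.polyfit(X.ravel(), Y, deg=2)
prs_hat = lambda x: np.polyval(prs_coef, x)

# SVR with an RBF kernel (hyperparameters chosen arbitrarily for the sketch)
svr = SVR(kernel="rbf", C=100.0, epsilon=0.01).fit(X, Y)

x_new = 0.55
print(prs_hat(x_new), svr.predict([[x_new]])[0])  # two estimates y_hat(x) of y(x)
```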
BACKGROUND: UNCERTAINTY Some surrogates also provide an uncertainty estimate: the standard error s(x). Examples: kriging and polynomial response surface. These uncertainty estimates are what EGO uses.
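A possible way to obtain s(x) in practice (an assumption for illustration; the slides do not prescribe a library) is a Gaussian-process/kriging fit that returns the standard error alongside the prediction:

```python
# Illustrative sketch: a kriging-like surrogate whose predict() also returns
# the standard error s(x), via scikit-learn's Gaussian process regressor.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def y(x):  # stand-in for the expensive simulation
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

X = np.linspace(0, 1, 6).reshape(-1, 1)
Y = y(X).ravel()

krg = GaussianProcessRegressor(kernel=RBF(length_scale=0.2),
                               normalize_y=True).fit(X, Y)

y_hat, s = krg.predict(np.array([[0.55]]), return_std=True)
print(y_hat[0], s[0])  # s(x) shrinks toward 0 at sampled points and grows between them
```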
OUTLINE Efficient Global Optimization (EGO). Importing uncertainty models in order to use EGO with surrogates without uncertainty models. Simple multiple-surrogate EGO to benefit from parallel computation.
KRIGING FIT AND PRESENT BEST SOLUTION First we sample the function and fit a kriging model. We note the present best sample (PBS). Jones DR, Schonlau M, Welch W, Efficient global optimization of expensive black-box functions, Journal of Global Optimization, Vol. 13, No. 4, pp. 455-492, 1998.
THE EXPECTED IMPROVEMENT QUESTION Then we ask: of all the points where we might improve, where are we most likely to improve significantly upon the present best sample?
WHAT IS EXPECTED IMPROVEMENT? Consider the point x = 0.8 and the random variable Y, which represents the possible values of the function there. Its mean is the kriging prediction, which is near zero, and its standard deviation is the kriging standard error s(x).
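For reference, the standard expected-improvement formula for minimization (well established in the EGO literature, though not written out on the slide), with prediction ŷ(x), standard error s(x), and present best sample y_PBS, is EI(x) = (y_PBS - ŷ)Φ(u) + s φ(u), where u = (y_PBS - ŷ)/s. A small sketch:

```python
# Expected improvement for minimization, with Y ~ Normal(y_hat(x), s(x)^2).
from scipy.stats import norm

def expected_improvement(y_hat, s, y_pbs):
    if s <= 0.0:              # no uncertainty (e.g., exactly at a sampled point)
        return 0.0
    u = (y_pbs - y_hat) / s
    return (y_pbs - y_hat) * norm.cdf(u) + s * norm.pdf(u)

# Example in the spirit of the slide: at x = 0.8 the kriging prediction is near
# zero; with some spread s and an (assumed) present best sample of -0.5:
print(expected_improvement(y_hat=0.0, s=0.4, y_pbs=-0.5))
```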
EXPLORATION AND EXPLOITATION EGO maximizes E[I(x)] to find the next point to be sampled. The expected improvement balances exploration and exploitation because it can be high either because of high uncertainty or because of a low surrogate prediction. When can we say that the next point is "exploration"?
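A minimal one-iteration sketch (assumed setup: the same illustrative test function and scikit-learn models as above; a dense candidate grid stands in for a proper optimizer of E[I(x)]):

```python
# One EGO step: fit kriging, then pick the candidate that maximizes E[I(x)].
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def y(x):
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

X = np.linspace(0, 1, 6).reshape(-1, 1)
Y = y(X).ravel()
krg = GaussianProcessRegressor(kernel=RBF(0.2), normalize_y=True).fit(X, Y)

def expected_improvement(y_hat, s, y_pbs):
    u = (y_pbs - y_hat) / np.maximum(s, 1e-12)
    ei = (y_pbs - y_hat) * norm.cdf(u) + s * norm.pdf(u)
    return np.where(s > 1e-12, ei, 0.0)

x_cand = np.linspace(0, 1, 1001).reshape(-1, 1)
y_hat, s = krg.predict(x_cand, return_std=True)
ei = expected_improvement(y_hat, s, y_pbs=Y.min())
x_next = x_cand[np.argmax(ei)]  # high EI comes from a low prediction
print(x_next)                   # (exploitation) or a large s(x) (exploration)
```

A rough way to label the choice: if ŷ(x_next) is well above the present best sample but s(x_next) is large, the step is mostly exploration; if ŷ(x_next) is close to or below the present best sample, it is mostly exploitation.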
THE BASIC EGO WORKS WITH KRIGING Considering the root mean square error, eRMS, of each fit [figure: (a) kriging, (b) support vector regression]: why not run EGO with the most accurate surrogate?
BUT SVR DOES NOT HAVE UNCERTAINTY Say we have two surrogates: kriging (KRG), which comes with an uncertainty estimate, and support vector regression (SVR), which has NO uncertainty estimate. But we want to use SVR for EGO. Viana FAC and Haftka RT, Importing Uncertainty Estimates from One Surrogate to Another, 50th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference, Palm Springs, USA, May 4-7, 2009, AIAA-2009-2237.
IMPORTATION AT A GLANCE First, we generate the standard error of kriging.
IMPORTATION AT A GLANCE Then, we combine the prediction from support vector regression with the standard error from kriging.
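A sketch of this combination (an assumed implementation, not the authors' code: the SVR mean is plugged into the expected-improvement formula together with the kriging standard error):

```python
# Import uncertainty: use the SVR prediction as the mean and the kriging
# standard error as s(x) when computing expected improvement.
import numpy as np
from scipy.stats import norm
from sklearn.svm import SVR
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def y(x):
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

X = np.linspace(0, 1, 6).reshape(-1, 1)
Y = y(X).ravel()

svr = SVR(kernel="rbf", C=100.0, epsilon=0.01).fit(X, Y)            # no s(x)
krg = GaussianProcessRegressor(kernel=RBF(0.2), normalize_y=True).fit(X, Y)

x_cand = np.linspace(0, 1, 1001).reshape(-1, 1)
y_hat_svr = svr.predict(x_cand)                   # prediction from SVR
_, s_krg = krg.predict(x_cand, return_std=True)   # standard error from kriging

u = (Y.min() - y_hat_svr) / np.maximum(s_krg, 1e-12)
ei = (Y.min() - y_hat_svr) * norm.cdf(u) + s_krg * norm.pdf(u)
x_next = x_cand[np.argmax(ei)]    # next EGO point driven by the SVR surrogate
```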
BENEFITS OF IMPORTATION OF UNCERTAINTY Run EGO with non-kriging models! Hartman3 function (initially fitted with 20 points). After 20 iterations (i.e., a total of 40 points), improvement (I) over the initial best sample: [results figure].
TWO OTHER DESIGNS OF EXPERIMENTS [figures: results for the first and second additional designs of experiments]
SUMMARY OF THE HARTMAN3 EXERCISE Box plot of the difference between the improvements offered by different surrogates (over 100 DOEs). In 34 DOEs (out of 100) KRG outperforms RBNN, and in those cases the difference between the improvements has a mean of only 0.8%.
EGO WITH MULTIPLE SURROGATES Traditional EGO uses kriging to generate one point at a time. We use multiple surrogates to get multiple points per cycle.
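One possible cycle, sketched under the same assumptions as the examples above (illustrative test function and scikit-learn models; the non-kriging surrogate imports the kriging standard error):

```python
# Multiple-surrogate EGO cycle: each surrogate proposes its own EI maximizer,
# so several new points per cycle can be evaluated in parallel.
import numpy as np
from scipy.stats import norm
from sklearn.svm import SVR
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def y(x):
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

X = np.linspace(0, 1, 6).reshape(-1, 1)
Y = y(X).ravel()
x_cand = np.linspace(0, 1, 1001).reshape(-1, 1)

krg = GaussianProcessRegressor(kernel=RBF(0.2), normalize_y=True).fit(X, Y)
svr = SVR(kernel="rbf", C=100.0, epsilon=0.01).fit(X, Y)
_, s = krg.predict(x_cand, return_std=True)   # kriging standard error, shared

new_points = []
for model in (krg, svr):                      # one EI maximizer per surrogate
    y_hat = model.predict(x_cand)
    u = (Y.min() - y_hat) / np.maximum(s, 1e-12)
    ei = (Y.min() - y_hat) * norm.cdf(u) + s * norm.pdf(u)
    new_points.append(float(x_cand[np.argmax(ei)]))

print(new_points)  # evaluate these in parallel, then refit and repeat
```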
POTENTIAL OF EGO WITH MULTIPLE SURROGATES Hartman3 function (100 DOEs with 20 points). Overall, the surrogates are comparable in performance.
POTENTIAL OF EGO WITH MULTIPLE SURROGATES "krg" runs EGO for 20 iterations, adding one point at a time. "krg-svr" and "krg-rbnn" run 10 iterations, adding two points per iteration. Multiple surrogates offer good results in half the time!