Example II: Linear truss structure


Example II: Linear truss structure
The optimization goal is to minimize the mass of the structure.
Design variables: the cross-section areas of the truss elements.
Inequality constraints: the maximum stress in each element and the maximum displacement at the loading points.
Gradient-based and ARSM optimization perform much better if the constraint equations are formulated separately instead of using the aggregated max_stress and max_disp values as constraints.
Note: Parametric optimizers like optiSLang run a black-box CAE process; the gradients are therefore determined with single-sided or central differences.
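To make the finite-difference remark concrete, here is a minimal sketch (not optiSLang's implementation) of single-sided versus central differencing on a black-box response; `evaluate_mass` is a hypothetical stand-in for a solver call, with assumed element lengths and density.

```python
import numpy as np

def evaluate_mass(a):
    """Hypothetical black-box CAE response: mass of a truss with
    cross-section areas a (stand-in for one solver call)."""
    lengths = np.linspace(1.0, 2.0, a.size)      # assumed element lengths [m]
    return float(np.sum(7850.0 * lengths * a))   # density * length * area

def gradient_fd(f, x, h=1e-6, central=False):
    """Single-sided or central finite-difference gradient.
    Single-sided needs n+1 solver calls per gradient, central needs 2n,
    but central differences are less sensitive to solver noise."""
    g = np.empty_like(x)
    f0 = f(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        if central:
            g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
        else:
            g[i] = (f(x + e) - f0) / h
    return g

a0 = np.full(10, 1.0e-3)   # 10 cross-section areas [m^2]
print(gradient_fd(evaluate_mass, a0))
print(gradient_fd(evaluate_mass, a0, central=True))
```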

Example II: Sensitivity analysis
The MOP indicates only a1, a3, and a8 as important variables for the maximum stress and the displacements, but all inputs are important for the objective function.

Example II: Sensitivity analysis
[MOP filter matrix: inputs a1-a10 versus the responses mass, disp2, disp4, stress1-stress10, max_disp, and max_stress]
For the single stress values used in the constraint equations, each input variable occurs at least twice as an important parameter. A reduction of the number of inputs therefore does not seem possible.
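optiSLang's MOP/CoP filter is metamodel-based; as a much-simplified, hypothetical stand-in, inputs can be screened by rank correlation against each response over the DOE samples. A sketch with toy data of the same shape as the example:

```python
import numpy as np
from scipy.stats import spearmanr

def importance_filter(X, Y, input_names, threshold=0.3):
    """Keep an input if its |Spearman rho| exceeds the threshold for at
    least one response -- a crude stand-in for the MOP filter above."""
    keep = []
    for i, name in enumerate(input_names):
        rhos = []
        for j in range(Y.shape[1]):
            rho, _ = spearmanr(X[:, i], Y[:, j])
            rhos.append(abs(rho))
        if max(rhos) >= threshold:
            keep.append(name)
    return keep

# toy data: 100 samples, 10 inputs, 3 responses
rng = np.random.default_rng(1)
X = rng.random((100, 10))
Y = np.column_stack([X[:, 0] + 0.1 * rng.random(100),  # driven by a1
                     X[:, 2] ** 2,                     # driven by a3
                     X.sum(axis=1)])                   # all inputs matter
print(importance_filter(X, Y, [f"a{i+1}" for i in range(10)], threshold=0.2))
```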

Example II: Gradient-based optimization
Best design with valid constraints: mass = 1595 (19% of the initial mass).
The areas of elements 2, 5, 6, and 10 are set to the minimum; the stresses in the remaining elements reach the maximum value.
153 solver calls (+100 from the DOE).
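The formulation with separately stated constraints can be sketched with scipy's SLSQP (an SQP method in the same family as NLPQL). The truss response is replaced here by a deliberately oversimplified stand-in in which each bar carries a fixed force, so the numbers do not reproduce the results above; material data, lengths, and limits are assumed.

```python
import numpy as np
from scipy.optimize import minimize

RHO, E = 7850.0, 2.1e11            # assumed density and Young's modulus
L = np.linspace(1.0, 2.0, 10)      # assumed element lengths [m]
F = 1.0e4                          # assumed force per bar (toy model) [N]
SIG_MAX, U_MAX = 150.0e6, 0.01     # stress and displacement limits

def mass(a):
    return float(np.sum(RHO * L * a))

# one inequality constraint per element (g >= 0 means feasible) instead
# of a single aggregated max_stress constraint -- as recommended above
cons = [{"type": "ineq", "fun": (lambda a, i=i: SIG_MAX - F / a[i])}
        for i in range(10)]
cons.append({"type": "ineq",
             "fun": lambda a: U_MAX - np.sum(F * L / (E * a))})

# SLSQP approximates the gradients by finite differences by default
res = minimize(mass, x0=np.full(10, 5e-4), method="SLSQP",
               bounds=[(1e-5, 1e-2)] * 10, constraints=cons)
print(res.x, mass(res.x))
```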

Example II: Adaptive response surface
Best design with valid constraints: mass = 1613 (19% of the initial mass).
The areas of elements 2, 6, and … are set to the minimum; 5 and 10 are close to the minimum.
360 solver calls.

Example II: EA (global search)
Best design with valid constraints: mass = 2087 (25% of the initial mass).
392 solver calls.

Example II: EA (local search)
Best design with valid constraints: mass = 2049 (24% of the initial mass).
216 solver calls (+392 from the global search).

Example II: Overview of optimization results

Method        Settings                             Mass   Solver calls   Constraints violated
Initial       -                                    8393   -              -
DOE           LHS                                  3285   100            75%
NLPQL         diff. interval 0.01%, single-sided   1595   153 (+100)     42%
ARSM          defaults (local)                     1613   360            80%
EA (global)   defaults                             2087   392            56%
EA (local)    -                                    2049   216 (+392)     79%
PSO (global)  -                                    2411   400            36%
GA (global)   -                                    2538   381            25%
SDI (local)   -                                    1899   -              70%

NLPQL with a small differentiation interval, started from the best DOE design, is the most efficient method. The local ARSM yields a similar parameter set. EA/GA/PSO with default settings come close to the global optimum; the GA with adaptive mutation shows the minimum constraint violation.

When to use which optimization algorithm
Gradient-based algorithms: the most efficient method if the gradients are accurate enough. Consider their restrictions: local optima, continuous variables only, and sensitivity to noise.
Response surface methods: attractive for a small set of continuous variables (< 15). Adaptive RSM with default settings is the method of choice.
Biological algorithms: GA/EA/PSO copy mechanisms of nature to improve individuals. They are the method of choice if gradient-based methods or the ARSM fail, and they are very robust against numerical noise, non-linearities, a large number of variables, etc.
Note: Limitations and recommendations for using the different strategies are given above. With its gradient-based and response surface algorithms, optiSLang provides high-end, state-of-the-art methods. The ARSM and the genetic/evolutionary algorithms are among the best commercially available algorithms worldwide; they have been continuously developed over the last five years and have proven their functionality and robustness several times. They are usually the main reason for a customer's decision to use optiSLang.

Sensitivity analysis and optimization
1) Start with a sensitivity study using LHS sampling: scan the whole design space (a minimal sampling sketch follows below).
2) Identify the important parameters and responses using CoP/MoP: understand the problem and reduce it.
3) Search for the optima: run an ARSM, gradient-based, or biologically inspired optimization algorithm.
4) Goal: a user-friendly procedure that provides as much automatism as possible.
Note: For identification tasks, the search for a parameter set of the numerical model that gives the best fit between test and simulation (sometimes also called model updating) is one of the most difficult optimization problems. We recommend always performing a sensitivity analysis first, to ensure that the test lies inside the variation space of the simulation (defined by the lower and upper bounds of the varying parameters, i.e. the identification space) and that a set of input variables sensitive to the update criteria can be found. Once a set of sensitive parameters to tune against sensitive update criteria is available, the optimization part, starting from the best design of the sensitivity analysis, is often simple.
See also: J. Will: The Calibration of Measurement and Simulation as Optimization Problem, Proceedings NAFEMS Seminar "Virtual Testing – Simulationsverfahren als integrierter Baustein einer effizienten Produktentwicklung", April 2006, Wiesbaden, Germany, www.dynardo.de
In the example above, 7 different test conditions are identified at the same time. In the beginning, 6 input parameters are varied between physically meaningful lower and upper bounds. First, it could be proven that the tests lie within the variation band of the identification space. Second, the sensitivity of the parameters was checked against different response values (integrals and peak values of the acceleration, displacement, and pressure curves), and a subset of 3 input variables (gas temperature, bag permeability, and efficiency of the airbag opening) against three response values (acceleration integral and peak, pressure integral) is used for the identification.
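A minimal sketch of step 1 with scipy's quasi-Monte Carlo module (assuming scipy >= 1.7); the parameter count and bounds are illustrative, not those of the airbag example.

```python
import numpy as np
from scipy.stats import qmc

# 100 Latin hypercube samples of 6 input parameters (illustrative bounds)
sampler = qmc.LatinHypercube(d=6, seed=42)
unit_samples = sampler.random(n=100)                # points in [0, 1)^6
lower = np.array([0.1, 0.1, 0.1, 200.0, 0.5, 1.0])  # identification space
upper = np.array([1.0, 1.0, 1.0, 400.0, 2.0, 5.0])
designs = qmc.scale(unit_samples, lower, upper)

# each row is one design to be evaluated by the black-box solver
for x in designs[:3]:
    print(x)
```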

Optimization of a large ship vessel
Optimization of the total weight for two load cases with constraints (stresses).
30,000 discrete variables; self-regulating evolutionary strategy; population of 4 with uniform crossover for reproduction; active search for dominant genes with different mutation rates.
Solver: ANSYS. Design evaluations: 3000. Design improvement: > 10%.
Note: The software architecture and the evolutionary algorithms are ready to search for design improvements in very large design spaces. To our knowledge, this is the world's largest published industrial optimization problem. After a moderate number of solver calls (3000), a significant design improvement was achieved. The example is also a very early one, from 2000/2001; problems of this kind probably still need some adjustment of the algorithm, so using the programmer's mode of optiSLang, or consultancy from Dynardo, helps to solve them successfully.
The objective function was the reduction of the total weight. Due to the size of the finite element model and the small number of planned solver calls (3000) compared to the number of variables, a special mix of GA and EA with a self-regulation algorithm was chosen. The mutation was performed on grouped variables (with respect to sensitivity criteria), and some optimizer parameters were controlled by a self-regulating algorithm. After approximately 400 generations, the self-regulation had slowed down the optimization process, so the self-regulation process was varied and restarted at this point; on the best-history plot of the generations, the jump in design improvement after this intersection can be observed clearly. After 3000 solver calls, we stopped with a very successful reduction of the total weight. The whole analysis was done on an SGI workstation with 20 processors; the machine was permanently loaded with other network users, and we successfully recovered from several system and solver crashes within the total analysis time of about three weeks.
A short project description is given in a very early optiSLang paper: J. Will, C. Bucher, J. Riedel: Multidisciplinary non-linear optimization with Optimizing Structural Language – optiSLang, Proceedings 19. CAD-FEM Users' Meeting 2001, Berlin, Germany.
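The self-regulating strategy itself is proprietary; purely as an illustration of the ingredients named above (tiny population, uniform crossover, an adaptive mutation rate), here is a minimal steady-state sketch on integer design variables with a hypothetical weight function and a much smaller variable count.

```python
import numpy as np
rng = np.random.default_rng(0)

def fitness(x):
    """Hypothetical stand-in for the FE weight response (minimize)."""
    return float(np.sum(x))

N_VARS, CHOICES, POP = 300, 8, 4        # far smaller than 30,000 variables
pop = rng.integers(0, CHOICES, size=(POP, N_VARS))
p_mut = 0.05                            # self-regulated mutation rate

for gen in range(500):
    i, j = rng.choice(POP, size=2, replace=False)
    mask = rng.random(N_VARS) < 0.5     # uniform crossover
    child = np.where(mask, pop[i], pop[j])
    mut = rng.random(N_VARS) < p_mut    # mutate with the adaptive rate
    child[mut] = rng.integers(0, CHOICES, size=mut.sum())
    worst = int(np.argmax([fitness(ind) for ind in pop]))
    if fitness(child) < fitness(pop[worst]):
        pop[worst] = child
        p_mut = max(0.01, p_mut * 0.95)  # success: exploit (1/5-rule-like)
    else:
        p_mut = min(0.30, p_mut * 1.02)  # failure: explore more

print("best weight:", fitness(min(pop, key=fitness)))
```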

Optimization of passive safety
Optimization of the passive safety performance (US_NCAP & EURO_NCAP) using the Adaptive Response Surface Method.
3 and 11 continuous variables; weighted objective function.
Solver: MADYMO. Design evaluations: 75. Design improvement: 10%.
Note (continuation from the previous slide): First, if the input parameters scatter, we want to know how large the scatter of the output parameters is; in other words, we want to quantify the uncertainty of the output parameters. Second, if the output parameters are uncertain, we want to know how large the probability is that an output parameter by chance does not meet some design criterion, i.e. how reliable the component is. Third, if the output parameters are uncertain, we want to know which parameters on the input side contribute the most to the uncertainty of the output parameters. Knowing this, we are able to tackle the "evil" drivers efficiently and improve the reliability of the component.
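The adaptive response surface idea can be sketched as a loop of DOE, surrogate fit, surrogate optimization, and trust-region shrinking. This is a much-simplified illustration (quadratic surface without cross terms, hypothetical two-variable solver), not Dynardo's ARSM:

```python
import numpy as np
from scipy.optimize import minimize

def solver(x):
    """Hypothetical expensive CAE response (stand-in)."""
    return float((x[0] - 0.3) ** 2 + 2.0 * (x[1] + 0.5) ** 2)

rng = np.random.default_rng(3)
lo, hi = np.array([-2.0, -2.0]), np.array([2.0, 2.0])

for it in range(6):
    X = lo + (hi - lo) * rng.random((15, 2))           # DOE in current box
    y = np.array([solver(x) for x in X])
    A = np.column_stack([np.ones(len(X)), X, X ** 2])  # quadratic features
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)       # least-squares fit

    def surrogate(x):
        return coef[0] + coef[1:3] @ x + coef[3:5] @ (x ** 2)

    res = minimize(surrogate, (lo + hi) / 2, bounds=list(zip(lo, hi)))
    half = 0.5 * (hi - lo) * 0.6                       # shrink box by 40%
    lo, hi = res.x - half, res.x + half                # re-center on optimum
    print(it, res.x, solver(res.x))
```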

Genetic optimization of spot welds
134 binary variables, torsion loading, stress constraints.
Weak elitism to reach fast design improvement; fatigue-related stress evaluation in all spot welds.
Solver: ANSYS (using an automatic spot-weld meshing procedure). Design evaluations: 200. Design improvement: 47%.
Note: The optimization of spot welds is a binary problem, and genetic algorithms are the method of choice.
See also: [02/1] Will, J., Bucher, C., Riedel, J., Akgün, T.: "Stochastik und Optimierung: Anwendungen genetischer und stochastischer Verfahren zur multidisziplinären Optimierung in der Fahrzeugentwicklung" [Stochastics and optimization: applications of genetic and stochastic methods for multidisciplinary optimization in vehicle development]; VDI-Berichte Nr. 1701, 2002, pp. 863-884.
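A minimal binary GA sketch for a spot-weld-like selection problem. "Weak elitism" is optiSLang terminology; it is only loosely approximated here by carrying over a single elite, and the fitness is a hypothetical placeholder rather than a fatigue evaluation.

```python
import numpy as np
rng = np.random.default_rng(7)

N = 134                                    # binary variables (spot welds)

def fitness(x):
    """Hypothetical placeholder: fewer welds is cheaper, but a toy
    'stress' penalty grows when too few welds remain (minimize)."""
    n_welds = int(x.sum())
    stress = 100.0 / max(n_welds, 1)       # toy stress measure
    penalty = 1e3 * max(0.0, stress - 2.0) # constraint: stress <= 2.0
    return n_welds + penalty

pop = rng.integers(0, 2, size=(20, N))
for gen in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    elite = pop[scores.argmin()].copy()    # single elite carried over
    new = [elite]
    while len(new) < len(pop):
        # binary tournament selection of two parents
        i1, i2 = rng.choice(len(pop), 2, replace=False)
        p1 = pop[i1] if scores[i1] < scores[i2] else pop[i2]
        i3, i4 = rng.choice(len(pop), 2, replace=False)
        p2 = pop[i3] if scores[i3] < scores[i4] else pop[i4]
        mask = rng.random(N) < 0.5         # uniform crossover
        child = np.where(mask, p1, p2)
        flip = rng.random(N) < (1.0 / N)   # bit-flip mutation
        new.append(np.where(flip, 1 - child, child))
    pop = np.array(new)

print("best:", fitness(min(pop, key=fitness)))
```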

Optimization of an oil pan
The intention is to optimize beads to increase the first eigenfrequency of an oil pan by more than 40%. Topology optimization indicated a possible improvement of more than 40%, but the test failed. A sensitivity study and parametric optimization using a parametric CAD design + ANSYS Workbench + optiSLang could solve the task.
Initial design → beads design after topology optimization → beads design after parameter optimization.
Design parameters: 50. Design evaluations: 500. CAE: ANSYS Workbench. CAD: Pro/ENGINEER.
[Veiz, A.; Will, J.: Parametric optimization of an oil pan; Proceedings Weimarer Optimierungs- und Stochastiktage 5.0, 2008]

Multi-criteria optimization strategies
Several optimization criteria are formulated in terms of the input variables x.
Strategy A: only the most important objective function is used as the optimization goal; the other objectives are treated as constraints.
Strategy B: weighting of the single objectives into one scalar objective function.
(A minimal sketch of both formulations follows below.)
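A minimal sketch of strategies A and B on a hypothetical two-objective problem f1, f2, with scipy as the single-objective optimizer; the bound on f2 and the weights are arbitrary choices.

```python
import numpy as np
from scipy.optimize import minimize

f1 = lambda x: (x[0] - 1.0) ** 2 + x[1] ** 2   # most important objective
f2 = lambda x: (x[0] + 1.0) ** 2 + x[1] ** 2   # secondary objective

# Strategy A: optimize f1, keep f2 below a chosen bound (as a constraint)
resA = minimize(f1, x0=[0.0, 0.0], method="SLSQP",
                constraints=[{"type": "ineq",
                              "fun": lambda x: 2.5 - f2(x)}])  # f2 <= 2.5

# Strategy B: weighted sum of the single objectives
w1, w2 = 0.7, 0.3
resB = minimize(lambda x: w1 * f1(x) + w2 * f2(x), x0=[0.0, 0.0])

print("A:", resA.x, "B:", resB.x)
```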

Example: damped oscillator
Objective 1: minimize the maximum amplitude after 5 s. Objective 2: minimize the eigenfrequency.
A DOE scan with 100 LHS samples gives a good overview of the problem. The weighted-objectives approach requires about 1000 solver calls.

Multi-criteria optimization strategies
Strategy C: Pareto optimization.
Note: If the objectives are in conflict, a Pareto frontier exists; only in that case do we recommend multi-criteria (Pareto) optimization. If the objectives are not in conflict, the Pareto frontier will converge into a very small area, but will need many more solver runs than a weighted objective function. The multi-criteria algorithm in optiSLang is one of the three best algorithms worldwide (the other two are available in mF).

Multi-criteria optimization strategies
Design space and objective space: a Pareto frontier exists only for conflicting objectives; for positively correlated objective functions, only one optimum exists.

Multi-criteria optimization strategies
[Figures: conflicting objectives versus correlated objectives in the objective space]

Multi-criteria optimization strategies
Pareto dominance: solution a dominates solution c, since a is better in both objectives. Solution a is indifferent to b, since each solution is better than the other in one objective. (A minimal dominance check in code follows below.)
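The dominance relation translates directly into code; a minimal sketch for minimization problems, reproducing the a/b/c example above:

```python
import numpy as np

def dominates(fa, fb):
    """True if objective vector fa dominates fb (minimization):
    fa is no worse in every objective and strictly better in one."""
    fa, fb = np.asarray(fa), np.asarray(fb)
    return bool(np.all(fa <= fb) and np.any(fa < fb))

a, b, c = (1.0, 2.0), (2.0, 1.0), (2.0, 3.0)
print(dominates(a, c))                   # True: a is better in both
print(dominates(a, b), dominates(b, a))  # False, False: indifferent
```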

Multi-criteria optimization strategies
Pareto optimality: a solution is called Pareto-optimal if there is no decision vector that would improve one objective without causing a degradation in at least one other objective. A solution a is called Pareto-optimal in relation to a set of solutions A if it is not dominated by any other solution in A.
Requirements for ideal multi-objective optimization: find a set of solutions close to the Pareto-optimal solutions (convergence), and find solutions diverse enough to represent the whole Pareto front (diversity).

Multi-criteria optimization strategies
Pareto optimization using evolutionary algorithms: only in the case of conflicting objectives does a Pareto frontier exist and is Pareto optimization recommended (the optiSLang post-processing supports 2 or 3 conflicting objectives). The effort to resolve the Pareto frontier is higher than for optimizing a single weighted objective function (a minimal Pareto-archive sketch follows below).
Note: The picture illustrates the problem with multi-criteria optimization. Searching for one optimal point needs less effort than searching for a set of optimal points: if the user can only afford a certain number of solver calls, spending 200 solver calls on a single-criterion optimization (GA) results in a design far in front of the Pareto frontier obtained with the same number of solver calls. The decision to use a multi-criteria algorithm should therefore be based on the knowledge that the objectives are in conflict.
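A minimal Pareto-archive EA sketch on a classic conflicting two-objective toy problem (Schaffer's f1 = x², f2 = (x-2)²); this is an illustration under assumed settings, not optiSLang's multi-criteria algorithm. The budget of 123 evaluations mirrors the oscillator example on the next slide.

```python
import numpy as np
rng = np.random.default_rng(5)

def objectives(x):
    return np.array([x ** 2, (x - 2.0) ** 2])  # conflicting on [0, 2]

def dominates(fa, fb):
    return bool(np.all(fa <= fb) and np.any(fa < fb))

archive = []                       # non-dominated (x, f) pairs found so far
x = rng.uniform(-4.0, 4.0)
for _ in range(123):               # illustrative solver-call budget
    x_new = np.clip(x + rng.normal(0.0, 0.5), -4.0, 4.0)
    f_new = objectives(x_new)
    if not any(dominates(f, f_new) for _, f in archive):
        # drop archive entries that the new design dominates, then add it
        archive = [(xi, f) for xi, f in archive if not dominates(f_new, f)]
        archive.append((x_new, f_new))
        x = x_new                  # continue the walk from this point

print(f"{len(archive)} non-dominated designs found")
for xi, f in sorted(archive, key=lambda p: p[1][0])[:5]:
    print(f"x={xi:.3f}  f1={f[0]:.3f}  f2={f[1]:.3f}")
```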

Example: damped oscillator
Pareto optimization with an EA gives a good Pareto frontier with 123 solver calls.

Example II: linear truss structure
Anthill plot from the ARSM samples and the resulting Pareto front. For more complex problems, the performance of the Pareto optimization can be improved if a good start population is available; such a population can be taken from selected designs of a previous DOE or of a single-objective optimization.

Optimization algorithms: overview
- Gradient-based algorithms
- Response surface method (RSM): local adaptive RSM and global adaptive RSM
- Biological algorithms: genetic algorithms, evolutionary strategies & particle swarm optimization
- Pareto optimization