Optimization of thermal processes 2007/2008
Maciej Marek
Czestochowa University of Technology, Institute of Thermal Machinery


Lecture 12

Overview of the lecture
Overview of some modern methods of optimization:
- Genetic algorithms (GA)
- Simulated annealing
- Neural-network-based optimization

Genetic algorithms (introduction)
If a design problem is characterised by:
- mixed continuous and discrete variables,
- discontinuous or nonconvex design spaces (feasible regions),
then standard techniques may be inefficient. It is also possible that only the relative optimum closest to the starting point will be found. (Figure: an example of a nonconvex feasible region.)
Genetic algorithms (GA) can in many cases find the global optimum with high probability.

Genetic algorithms (introduction)
Genetic algorithms are based on Darwin's theory of survival of the fittest (natural selection). (Figure: portrait of Charles Darwin.)
- Population of solutions and reproduction: only "good" solutions may reproduce.
- Cross-over: parents produce offspring, so "good" solutions are reproduced in the next generations.
- Mutation: a random change in a solution, which may turn it into a better solution.

Genetic algorithms (introduction)
Characteristics of GA:
- A population of trial design vectors is used for the starting procedure (so the method is less likely to get trapped in a local optimum).
- GA use only the value of the objective function (a direct method).
- Design variables are represented as strings of binary variables, which makes GA naturally applicable to integer programming; continuous variables have to be approximated with discrete ones.
- The objective function value plays the role of fitness.
- In every new generation (iteration): parents are selected at random (from sufficiently good solutions), crossover occurs and a new solution is obtained.
- GA is not just a random search technique: solutions with a better value of the objective function (better fitness) are favoured.

Genetic algorithms – representation of design variables
In GA the design variables are represented as strings of binary digits, 0 and 1 (e.g. a string of length 20). (Table: decimal numbers and their binary equivalents.)
A continuous variable is represented with q binary digits, so the resolution depends on q:
  x = x_min + (x_max − x_min) · (decimal value of the q-digit string) / (2^q − 1)
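A minimal sketch in Python of this decoding, assuming the linear mapping above (the bound names x_lower and x_upper are illustrative):

import math

def decode(bits, x_lower, x_upper):
    # Interpret the q-digit binary string as a decimal number and map it
    # linearly onto [x_lower, x_upper]; resolution = (x_upper - x_lower)/(2**q - 1)
    q = len(bits)
    integer = int("".join(str(b) for b in bits), 2)
    return x_lower + (x_upper - x_lower) * integer / (2 ** q - 1)

# Example: 10 binary digits give a resolution of about 0.001 on [0, 1]
print(decode([1, 0, 1, 1, 0, 0, 1, 0, 1, 1], 0.0, 1.0))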

Genetic algorithms – representation of objective function and constraints
GA finds the solution of an unconstrained problem. To solve a constrained minimization problem, two transformations have to be made:
- transformation into an unconstrained problem, e.g. with the penalty function method:
  minimize  phi(X) = f(X) + r · (penalty terms for the violated constraints),   r – penalty parameter
- transformation into maximization of the fitness function:
  F(X) = phi_max − phi(X),   where phi_max is the largest value of phi in the population.
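A minimal sketch of these two transformations; the quadratic penalty form and the constraint convention g_i(x) <= 0 are assumptions, not taken from the slide:

def penalized(f, constraints, x, r=1000.0):
    # Unconstrained substitute: phi(x) = f(x) + r * sum of squared violations,
    # where only violated constraints (g_i(x) > 0) contribute
    return f(x) + r * sum(max(0.0, g(x)) ** 2 for g in constraints)

def fitness_values(phis):
    # Turn "minimize phi" into "maximize fitness": F_i = phi_max - phi_i
    phi_max = max(phis)
    return [phi_max - phi for phi in phis]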

Genetic algorithms – genetic operators: reproduction
The population consists of K solutions, and every solution has a value of the fitness function: F_1, F_2, ..., F_K. Strings are selected for reproduction with the probability
  p_i = F_i / (F_1 + F_2 + ... + F_K),
so the larger the fitness function, the larger the probability of selection for reproduction.
Note: highly fit individuals live and reproduce, less fit individuals "die". Crossover then occurs between the selected parents.
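A sketch of this fitness-proportionate (roulette-wheel) selection of a single parent:

import random

def select_parent(population, fitnesses):
    # Each string is chosen with probability p_i = F_i / sum_j F_j
    threshold = random.uniform(0.0, sum(fitnesses))
    cumulative = 0.0
    for individual, fit in zip(population, fitnesses):
        cumulative += fit
        if cumulative >= threshold:
            return individual
    return population[-1]  # safeguard against round-off at the upper end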

Genetic algorithms – genetic operators: crossover
A crossover site is selected at random. Parent 1 and Parent 2 exchange the substrings on one side of the site, which produces Offspring 1 and Offspring 2. The new strings are placed in the new population and the process is continued.
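A sketch of single-point crossover on two equal-length bit strings:

import random

def crossover(parent1, parent2):
    site = random.randint(1, len(parent1) - 1)    # crossover site, selected at random
    offspring1 = parent1[:site] + parent2[site:]  # exchange of substrings
    offspring2 = parent2[:site] + parent1[site:]
    return offspring1, offspring2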

Genetic algorithms – genetic operators: mutation
Mutation is an occasional random alteration of a binary digit: in some design vector a random location is chosen and the digit there is flipped, giving a new design vector. Mutation introduces a random change into the genetic material and helps to find the global optimum.
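A sketch of bit-flip mutation (the mutation probability p_m is an illustrative value):

import random

def mutate(bits, p_m=0.01):
    # Each binary digit is flipped with a small probability p_m
    return [1 - b if random.random() < p_m else b for b in bits]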

Simulated annealing (introduction)
Simulated annealing belongs to the random search methods. However, it is designed to move toward the global minimum of the objective function. To see the drawbacks of a "naive" random search method, let's consider the following algorithm (a sketch is given below the list):
1. Choose (at random) an initial starting point X1.
2. Make random moves along each coordinate direction – go to the point X*.
3. If f(X*) > f(X1), reject the point X* and find a new one. Otherwise, accept the point X* as the new starting point X1 and go to step 2.
4. Repeat until the objective function can't be reduced further.
The problem with such an algorithm is that it may get stuck in a local optimum.
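A minimal sketch of this naive random search (the step size and iteration limit are illustrative):

import random

def naive_random_search(f, x, step=0.1, max_iter=10000):
    for _ in range(max_iter):
        trial = [xi + random.uniform(-step, step) for xi in x]  # random move along each coordinate
        if f(trial) <= f(x):  # accept only moves that do not increase the objective function
            x = trial
    return x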

Simulated annealing – naive random search method
(Figure: successive steps of the naive random search on a function with a local and a global optimum. First step: increase of the objective function – point rejected. Second step: point accepted. Third step: point accepted – the search reaches the local optimum and cannot leave it, so the global optimum is never found.)
Thus, in this version of the random search method, if we find a local optimum there is no way to leave this point.

Simulated annealing – the main concept
With the simulated annealing technique, transitions out of a local minimum are possible. A move that reduces the objective function is accepted unconditionally. A move that increases the objective function by Δf is accepted with a probability given by the Metropolis criterion:
  P = exp(−Δf / T),
where Δf is the increase of the objective function and T is the temperature. As T grows, P approaches 1, so the larger the temperature, the less constrained are the movements.
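A sketch of the Metropolis acceptance test:

import math
import random

def accept(delta_f, temperature):
    # Moves that lower the objective function are always accepted;
    # uphill moves are accepted with probability exp(-delta_f / T)
    if delta_f <= 0.0:
        return True
    return random.random() < math.exp(-delta_f / temperature)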

Simulated annealing – the main concept
The algorithm starts with a high temperature (large value of T) and in the subsequent steps the temperature is reduced slowly. The global optimum is found with high probability even for an objective function with many local minima. The change of T is defined by the so-called cooling schedule.
The name of the method is derived from the simulation of thermal annealing of solids (metals). Slow cooling of a heated solid ensures proper solidification with a highly ordered crystalline structure, while rapid cooling causes defects inside the material. The analogy (a sketch of the whole procedure follows):
- simulated annealing – slow cooling – lowest internal energy – global minimum
- naive random search – rapid cooling – high internal energy – local optimum
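A minimal sketch of the whole simulated annealing loop; the geometric cooling schedule T ← alpha·T and all numerical parameters are assumptions, not taken from the lecture:

import math
import random

def simulated_annealing(f, x, t0=100.0, alpha=0.95, steps_per_t=100, t_min=1e-3, step=0.1):
    temperature, best = t0, list(x)
    while temperature > t_min:
        for _ in range(steps_per_t):
            trial = [xi + random.uniform(-step, step) for xi in x]
            delta_f = f(trial) - f(x)
            # Metropolis criterion: uphill moves are allowed with probability exp(-delta_f/T)
            if delta_f <= 0.0 or random.random() < math.exp(-delta_f / temperature):
                x = trial
                if f(x) < f(best):
                    best = list(x)
        temperature *= alpha  # cooling schedule: reduce the temperature slowly
    return best

# Example: a one-dimensional objective function with several local minima
print(simulated_annealing(lambda v: (v[0] ** 2 - 4.0) ** 2 + math.sin(5.0 * v[0]), [3.0]))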

Simulated annealing – some of the features
- The quality of the final solution is not affected by the initial guess (however, the computational time may increase with a worse starting point).
- The objective function doesn't have to be regular (continuous, differentiable).
- The feasible region doesn't have to be convex (the convergence is not influenced by convexity).
- The method can be used to solve mixed-integer, discrete or continuous problems.
- For problems with constraints, a modified objective function may be formulated, just as in the case of genetic algorithms (i.e. the penalty function approach).

Neural-network-based optimization
A neural network is a parallel network of interconnected simple processors (neurons). A neuron accepts a set of inputs from other neurons and computes an output: for a single neuron, the output a is a function of the weighted sum of the inputs, a = f(Σ w_i x_i). The weights w_i are not specified in advance; they are determined in the learning process.
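A sketch of such a single neuron; the sigmoid activation is an assumption, the lecture does not fix a particular activation function:

import math

def neuron(inputs, weights):
    # a = f(sum_i w_i * x_i) with a sigmoid activation f
    weighted_sum = sum(w * x for w, x in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-weighted_sum))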

Neural-network-based optimization
Neurons may be connected to form multilayer networks. (Figure: a network with an input layer, a hidden layer and an output layer.) Such a network may be trained to "solve" specific problems.

Neural-network-based optimization
- The strengths of the various interconnections (the weights) may be considered as the representation of the knowledge contained in the network.
- The network is trained to minimize the error between the actual output of the output layer and the target output for all input patterns.
- The training is simply the selection of the weights w_i; the learning schemes govern how the weights are to be varied to minimize the error.
Possible usage (a sketch follows):
- Train the network on a specific set of input patterns (supply input parameters and the solutions of the corresponding problems).
- Supply input parameters different from those of the training set.
- The network should return the solution of the problem (approximate, at least).
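A toy illustration of this usage, not the lecture's method: a single linear neuron trained with the delta rule on (problem parameter → known optimal solution) pairs and then queried with a parameter outside the training set. A realistic surrogate would be a multilayer network trained by backpropagation; all data and names here are hypothetical:

import random

def train(samples, n_inputs, rate=0.01, epochs=2000):
    # Delta rule for a single linear neuron; the last weight acts as a bias
    weights = [random.uniform(-0.1, 0.1) for _ in range(n_inputs + 1)]
    for _ in range(epochs):
        for x, target in samples:
            out = sum(w * xi for w, xi in zip(weights, x + [1.0]))
            error = target - out
            weights = [w + rate * error * xi for w, xi in zip(weights, x + [1.0])]
    return weights

def predict(weights, x):
    return sum(w * xi for w, xi in zip(weights, x + [1.0]))

# Hypothetical training data: problem parameter -> known optimal design variable
samples = [([p], 2.0 * p + 1.0) for p in [0.0, 0.5, 1.0, 1.5, 2.0]]
weights = train(samples, n_inputs=1)
print(predict(weights, [1.25]))  # a parameter not present in the training set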

Thank you for your attention