Random numbers and optimization techniques. Jorge Andre Swieca School, Campos do Jordão, January 2003 (second lecture).

References: The Nature of Mathematical Modeling, N. Gershenfeld, Cambridge, 1999; Numerical Recipes in C, Second Edition, W.H. Press et al., Cambridge, 1992; Statistical Data Analysis, G. Cowan, Oxford, 1998; Computational Physics, Dean Karlen (online notes), 1998.

Random "O acaso não existe." ("Chance does not exist.") (car sticker)

Random numbers Important task: generate random variables from known probability distributions. Random numbers produced by the computer are generated in a strictly deterministic way: pseudorandom (→ Monte Carlo method). Random number generators: linear congruential generators, x_{n+1} = (a·x_n + c) mod m. Ex.: a = 5, c = 3, m = 16, starting from 0: 0, 3, 2, 13, 4, 7, 6, 1, 8, 11, 10, 5, 12, 15, 14, 9, 0, 3, …
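A minimal sketch of such a linear congruential generator, in Python (the constants a = 5, c = 3, m = 16 are the toy example above; real generators use much larger moduli):

```python
def lcg(seed, a=5, c=3, m=16):
    """Linear congruential generator: x_{n+1} = (a*x_n + c) mod m."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

# the toy example: starting from 0, the full period of 16 values is visited
gen = lcg(0)
seq = [0] + [next(gen) for _ in range(16)]
# seq == [0, 3, 2, 13, 4, 7, 6, 1, 8, 11, 10, 5, 12, 15, 14, 9, 0]
```

Since c and m share no common factor here, the generator attains the maximal period m before repeating.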

Generators Arguments from number theory give good values for a, c and m. A good generator has the longest possible period, and individual elements within a period should follow each other "randomly". Ex. 1: RANDU, a = 65539, m = 2^31, c = 0.
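RANDU's defect can be checked directly: because a = 65539 = 2^16 + 3 and c = 0, one has a² ≡ 6a − 9 (mod 2^31), so every triple of successive outputs satisfies a fixed linear relation. A Python sketch:

```python
def randu(seed, n):
    """RANDU: x_{k+1} = 65539 * x_k mod 2^31 (c = 0; the seed must be odd)."""
    xs, x = [], seed
    for _ in range(n):
        x = (65539 * x) % 2 ** 31
        xs.append(x)
    return xs

xs = randu(1, 3000)
# since 65539 = 2^16 + 3, a^2 = 6a - 9 (mod 2^31), so successive triples obey
# x_{k+2} - 6*x_{k+1} + 9*x_k = 0 (mod 2^31): the triplets lie on few planes
flat = all((xs[k + 2] - 6 * xs[k + 1] + 9 * xs[k]) % 2 ** 31 == 0
           for k in range(len(xs) - 2))
```

The relation confines all triplets (x_k, x_{k+1}, x_{k+2}) to a small number of parallel planes in the unit cube, which is the structure discussed on the next slide.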


“Random numbers fall mainly in the planes”, Marsaglia, Proc. Nat. Acad. Sci. 61, 25, 1968. What is seen in RANDU is present in any multiplicative congruential generator. On 32-bit machines, the maximum number of hyperplanes in d dimensions is: d = 3: 2953; d = 4: 566; d = 6: 120; d = 10: 41. RANDU has far fewer than the maximum!

Generators Ex. 2: Minimal standard generator (Num. Rec. ran0): a = 7^5 = 16807, c = 0, m = 2^31 − 1 = 2147483647. RAN1 and RAN2, given in the first edition, are "at best mediocre".
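A sketch of the minimal standard (Park–Miller) recurrence in Python; Park and Miller published a check value: starting from seed 1, the 10,000th output should be 1043618065.

```python
M = 2 ** 31 - 1  # 2147483647, a Mersenne prime

def ran0(seed):
    """Park-Miller minimal standard: x_{n+1} = 16807 * x_n mod (2^31 - 1)."""
    x = seed
    while True:
        x = (16807 * x) % M
        yield x

g = ran0(1)
for _ in range(9999):
    next(g)
check = next(g)  # the 10,000th value starting from seed 1
```

The C implementations in Numerical Recipes use Schrage's trick to avoid 64-bit overflow; Python's arbitrary-precision integers make that unnecessary here.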

Generators Ex. 3: RANLUX, cernlib (mathlib V115, F. James). "A portable high-quality random number generator for lattice field theory simulations", M. Lüscher, Comp. Phys. Comm. 79, 100, 1994. Period ≈ 10^171. Recommendations: 1. Do some simple tests. 2. Check the results with another generator. There is a large literature on testing random generators; see e.g. the Diehard set of test programs (G. Marsaglia). Functional definition: an algorithm that generates uniform numbers in [0,1] is acceptable if it is not rejected by a set of tests.

Inverse transform Let x be a uniform deviate, i.e. p(x) is the uniform probability distribution on [0,1]. To generate y according to g(y), solve G(y) = x, where G is the cumulative distribution of g: y = G^{-1}(x), computed analytically or numerically. A uniform deviate in [0,1] goes in; a deviate distributed according to g(y) comes out.

Inverse transform Ex. 4: discrete case. f(x=0) = 0.3, f(x=1) = 0.2, f(x=2) = 0.5. Draw u uniform in [0,1): 0.0 ≤ u < 0.3 → x = 0; 0.3 ≤ u < 0.5 → x = 1; 0.5 ≤ u < 1.0 → x = 2. For 2000 deviates, the observed frequencies approach the target probabilities.
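The discrete case can be sketched in Python by walking the cumulative sums (function and variable names here are illustrative):

```python
import random

def sample_discrete(probs, values, rng):
    """Inverse transform for a discrete distribution: walk the cumulative sums."""
    u = rng.random()
    cum = 0.0
    for p, v in zip(probs, values):
        cum += p
        if u < cum:
            return v
    return values[-1]  # guard against floating-point rounding

rng = random.Random(42)
counts = {0: 0, 1: 0, 2: 0}
for _ in range(2000):
    counts[sample_discrete([0.3, 0.2, 0.5], [0, 1, 2], rng)] += 1
# counts should come out close to 600, 400 and 1000
```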

Inverse transform Exponential distribution: the amount of time until a specific event occurs, or the time between independent events (e.g. the time until a part fails).
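For the exponential distribution f(t) = λe^(−λt), the CDF G(t) = 1 − e^(−λt) inverts in closed form, giving t = −ln(1 − u)/λ. A Python sketch:

```python
import math, random

def sample_exponential(lam, rng):
    # CDF G(t) = 1 - exp(-lam*t); inverting G(t) = u gives t = -ln(1-u)/lam
    u = rng.random()
    return -math.log(1.0 - u) / lam

rng = random.Random(0)
times = [sample_exponential(2.0, rng) for _ in range(100000)]
mean = sum(times) / len(times)  # should approach 1/lam = 0.5
```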

Acceptance-rejection method Generate points (x, y) uniformly in the box x_min ≤ x ≤ x_max, 0 ≤ y ≤ y_max, where y_max ≥ max f(x). If (x, y) falls under the curve f, accept the point; else, discard it.
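A Python sketch of the box method, using a Gaussian truncated to [−3, 3] as the target (the bound y_max = 0.4 slightly exceeds the peak value 1/√(2π) ≈ 0.399):

```python
import math, random

def rejection_sample(f, x_min, x_max, y_max, rng):
    """Throw uniform points into the bounding box; keep x when (x, y) is under f."""
    while True:
        x = rng.uniform(x_min, x_max)
        y = rng.uniform(0.0, y_max)
        if y < f(x):
            return x

# illustrative target: a Gaussian truncated to [-3, 3]
f = lambda x: math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)
rng = random.Random(7)
xs = [rejection_sample(f, -3.0, 3.0, 0.4, rng) for _ in range(50000)]
mean = sum(xs) / len(xs)  # symmetric target, so the mean should be near 0
```

The efficiency is the ratio of the area under f to the area of the box, so a tight y_max matters.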


Importance sampling Choose g(x) such that random numbers distributed according to it are easy to generate, with g(x) ≥ f(x). Generate random x according to g(x); generate u uniformly between 0 and g(x); if u < f(x), accept x, if not, reject.
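A sketch of this envelope method in Python: sampling a half-Gaussian f(x) = √(2/π)·e^(−x²/2) under the exponential envelope c·e^(−x) with c = √(2e/π), which dominates f everywhere because (x − 1)² ≥ 0 (the example target and envelope are my choice, not from the slide):

```python
import math, random

def half_gaussian(rng):
    """Sample f(x) = sqrt(2/pi)*exp(-x^2/2), x >= 0, under the envelope
    c*exp(-x) with c = sqrt(2e/pi), which satisfies c*exp(-x) >= f(x)."""
    c = math.sqrt(2.0 * math.e / math.pi)
    while True:
        x = -math.log(1.0 - rng.random())       # x ~ Exp(1): easy to generate
        u = rng.uniform(0.0, c * math.exp(-x))  # uniform between 0 and g(x)
        if u < math.sqrt(2.0 / math.pi) * math.exp(-x * x / 2.0):
            return x

rng = random.Random(3)
xs = [half_gaussian(rng) for _ in range(50000)]
mean = sum(xs) / len(xs)  # half-Gaussian mean is sqrt(2/pi) ~ 0.798
```

Compared with the plain box method, the envelope hugs f where it is large, so far fewer points are wasted in the tails.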

Optimization (minimization) Many strategies for minimization: local search, global search, with or without derivative evaluation, etc. Choose the best parameters by an iterative search starting from an initial guess. Objective: find an acceptable solution. Many solutions may actually be good: other sources of uncertainty are often larger than the differences among the solutions.

Downhill simplex method (Nelder-Mead) The most functionality for the least amount of code. A simplex is a triangle in 2D, a tetrahedron in 3D, etc. Start at a random location and evaluate the function at the D+1 vertices. Iterative procedure that improves the vertex with the highest value of the function at each step: reflect; reflect and grow; reflect and shrink; shrink; shrink towards the minimum. Stop when there is no more improvement. Num. Rec.: amoeba.c.
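The moves above can be sketched compactly in Python (a simplified variant, not the amoeba.c routine itself: only the inside contraction is implemented, and all step parameters are illustrative):

```python
def nelder_mead(f, x0, step=0.5, tol=1e-8, max_iter=500):
    n = len(x0)
    # initial simplex: x0 plus one point offset along each axis (D+1 vertices)
    simplex = [list(x0)] + [
        [x0[j] + (step if j == i else 0.0) for j in range(n)] for i in range(n)
    ]
    for _ in range(max_iter):
        simplex.sort(key=f)
        best, second, worst = simplex[0], simplex[-2], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break  # no more improvement
        # centroid of all vertices except the worst
        cen = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        # reflect the worst vertex through the centroid
        refl = [2.0 * cen[i] - worst[i] for i in range(n)]
        if f(refl) < f(best):
            # reflect and grow
            grow = [3.0 * cen[i] - 2.0 * worst[i] for i in range(n)]
            simplex[-1] = grow if f(grow) < f(refl) else refl
        elif f(refl) < f(second):
            simplex[-1] = refl
        else:
            # shrink the worst vertex towards the centroid
            con = [0.5 * (cen[i] + worst[i]) for i in range(n)]
            if f(con) < f(worst):
                simplex[-1] = con
            else:
                # shrink the whole simplex towards the best vertex
                simplex = [best] + [
                    [0.5 * (p[i] + best[i]) for i in range(n)] for p in simplex[1:]
                ]
    simplex.sort(key=f)
    return simplex[0]

# example: minimum of (x-1)^2 + (y-2)^2 is at (1, 2)
xm = nelder_mead(lambda v: (v[0] - 1.0) ** 2 + (v[1] - 2.0) ** 2, [0.0, 0.0])
```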


Powell’s method A simplex in one dimension is a pair of points. Line minimization: find the minimum of a function along a given direction; a multi-dimensional search is built from a series of line minimizations. Powell’s method updates the directions searched to find a set of directions that don’t interfere with each other: after D line minimizations, the net displacement is a good direction to keep for future minimization if advantageous; it is added to the set of directions used for minimization, replacing the one most similar to it. If the gradient of the function is available → conjugate gradient algorithm.
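The idea can be sketched in Python with golden-section search as the line minimizer; for simplicity this sketch discards the oldest direction rather than the most similar one, and the bracket [−10, 10] is an assumption:

```python
def golden_section(phi, a=-10.0, b=10.0, tol=1e-8):
    """Minimize a one-dimensional function phi on the bracket [a, b]."""
    g = (5 ** 0.5 - 1) / 2
    c, d = b - g * (b - a), a + g * (b - a)
    while abs(b - a) > tol:
        if phi(c) < phi(d):
            b, d = d, c
            c = b - g * (b - a)
        else:
            a, c = c, d
            d = a + g * (b - a)
    return 0.5 * (a + b)

def powell(f, x0, n_iter=20):
    n = len(x0)
    dirs = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    x = list(x0)
    for _ in range(n_iter):
        x_start = list(x)
        for u in dirs:
            # line minimization along direction u
            t = golden_section(lambda t: f([x[i] + t * u[i] for i in range(n)]))
            x = [x[i] + t * u[i] for i in range(n)]
        # the net displacement of the sweep becomes a new search direction
        new_dir = [x[i] - x_start[i] for i in range(n)]
        norm = sum(d * d for d in new_dir) ** 0.5
        if norm < 1e-10:
            break  # no progress: converged
        new_dir = [d / norm for d in new_dir]
        dirs.pop(0)
        dirs.append(new_dir)
        t = golden_section(lambda t: f([x[i] + t * new_dir[i] for i in range(n)]))
        x = [x[i] + t * new_dir[i] for i in range(n)]
    return x

# example: minimum of (x-y)^2 + (x+y-2)^2 is at (1, 1)
xm = powell(lambda v: (v[0] - v[1]) ** 2 + (v[0] + v[1] - 2.0) ** 2, [0.0, 0.0])
```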


Simulated annealing Growth of a crystal: a difficult optimization problem. If the liquid is frozen instantaneously, the atoms are trapped in the configuration they had in the liquid state (an energy barrier separates the glassy from the crystalline state). If the liquid is slowly cooled, the atoms can explore many local arrangements. Thermodynamics: the relative probability of the system being in a state with energy E is the Boltzmann factor exp(−E/kT). The system gets trapped in the lowest-energy configuration; the slower the cooling rate, the more likely it is to find the lowest global energy. At T = 0 the system sits in the lowest state; at T > 0 there is some chance to be in other states.

Simulated annealing Metropolis (1953): a way to update a simulation. Kirkpatrick (1980s): the same idea applied to other hard problems. Implementation: a new state is randomly selected and E_new evaluated; if E_new < E, accept the state; else accept it with probability exp(−(E_new − E)/T). Energy → cost function (simulated annealing). At high temperature any move is accepted; as T → 0 the lowest minima are found. Issues: 1. Selection of trial moves: downhill simplex or conjugate gradient moves may be used, but the Boltzmann factor allows mistakes. 2. Cooling schedule: cooling too fast freezes the system in a bad solution, while cooling too slowly wastes computer time.
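A minimal simulated-annealing sketch in Python with a geometric cooling schedule (all parameter values here are illustrative choices, not prescriptions from the lecture):

```python
import math, random

def simulated_annealing(f, x0, t0=10.0, cooling=0.999, n_steps=20000,
                        step=1.0, seed=1):
    rng = random.Random(seed)
    x, e = x0, f(x0)
    best_x, best_e = x, e
    t = t0
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)   # random trial move
        e_new = f(x_new)
        # downhill moves are always accepted; uphill moves are accepted
        # with the Boltzmann probability exp(-(E_new - E)/T)
        if e_new < e or rng.random() < math.exp(-(e_new - e) / t):
            x, e = x_new, e_new
            if e < best_e:
                best_x, best_e = x, e
        t *= cooling  # geometric cooling schedule
    return best_x

xm = simulated_annealing(lambda x: (x - 3.0) ** 2, 0.0)
```

The `cooling` factor is exactly the trade-off of issue 2: closer to 1 means slower cooling and more function evaluations.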


Genetic algorithms Evolution: a very hard optimization problem. Explore many options in parallel rather than concentrating on trying many changes around a single design. Simulated annealing repeatedly updates one set of search parameters, whereas a G.A. keeps an ensemble of sets of parameters. The G.A. state is given by a population: each member is a complete set of parameters for the function being searched. The population is updated in generations.

Genetic algorithms Update criteria: Fitness: evaluation of the function for each member of the population. Reproduction: a new population (of fixed size) is selected based on fitness; low-fitness parameter sets may disappear. Crossover: members of the ensemble can share parameters. Mutation: changes in the parameters, either random or taking advantage of what is already known to generate good moves.
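The four update steps above can be sketched as a toy genetic algorithm in Python for a one-parameter function, with fitness-based reproduction, blend crossover, and Gaussian mutation (all names and parameter values are illustrative assumptions):

```python
import random

def genetic_algorithm(fitness, n_pop=50, n_gen=100, lo=-10.0, hi=10.0,
                      mut_rate=0.3, mut_scale=0.5, seed=1):
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(n_pop)]
    for _ in range(n_gen):
        # fitness: evaluate every member, best first
        ranked = sorted(pop, key=fitness, reverse=True)
        # reproduction: the fitter half survives; population size stays fixed
        parents = ranked[:n_pop // 2]
        children = []
        while len(parents) + len(children) < n_pop:
            a, b = rng.sample(parents, 2)
            w = rng.random()
            child = w * a + (1.0 - w) * b           # crossover: blend two parents
            if rng.random() < mut_rate:
                child += rng.gauss(0.0, mut_scale)  # mutation: random perturbation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# example: maximize -(x-2)^2, whose maximum is at x = 2
best = genetic_algorithm(lambda x: -(x - 2.0) ** 2)
```

With a multi-parameter function each member would be a vector, and crossover could exchange whole coordinates between parents instead of blending a single one.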
