JM - Introduction to Bioinformatics: Lecture XVI Global Optimization and Monte Carlo Jarek Meller, Division of Biomedical Informatics, Children’s Hospital Research Foundation & Department of Biomedical Engineering, UC

JM - Outline of the lecture
Global optimization and the local minima problem
Physical map assembly, ab initio protein folding and likelihood maximization as examples of global optimization problems
Biased random search heuristics: the Monte Carlo approach
Biological motivations and genetic algorithms

JM - Optimization, steepest descent and local minima
Optimization is a procedure in which an extremum of a function is sought. When the relevant extremum is the minimum of a function, the optimization procedure is called minimization.
[Figure: a function f(x) with a global minimum and a shallower local minimum]
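The trap illustrated in the figure can be reproduced with a minimal steepest descent sketch. The 1-D landscape, step size and starting points below are illustrative assumptions, not from the lecture: pure descent settles in whichever minimum its basin of attraction contains.

```python
def f(x):
    # hypothetical 1-D landscape: a local minimum near x = -0.93
    # and a deeper, global minimum near x = 1.06
    return x**4 - 2*x**2 - 0.5*x

def df(x):
    # analytic derivative of f
    return 4*x**3 - 4*x - 0.5

def steepest_descent(x, lr=0.01, steps=5000):
    # follow the negative gradient; there is no mechanism to escape a basin
    for _ in range(steps):
        x -= lr * df(x)
    return x

x_left = steepest_descent(-2.0)    # trapped in the local minimum
x_right = steepest_descent(2.0)    # happens to reach the global one
```

Which minimum is found depends entirely on the starting point, which is exactly why global optimization needs something beyond local descent.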

JM - Rugged landscapes and local minima (maxima) problem

JM - Algorithmic complexity of global optimization
Polynomial vs. exponential complexity, e.g., n^2 vs. 2^n steps to obtain the optimal solution, where n denotes the overall “size of the input”.
The term “global optimization” is used to refer to optimization problems for which no polynomial-time algorithm that guarantees an optimal solution is known.
In general, global optimization implies that there may be multiple local minima, and thus one is likely to find a local rather than the global optimum.
Let us revisit some of the global optimization problems that we have stumbled upon so far …

The problem of ordering clone libraries with STS markers in the presence of errors
[Figure: hybridization of an STS probe to DNA clones 1–4, summarized as a clone-by-STS matrix]
In the presence of experimental errors the problem leads to a global optimization problem (see Pevzner, Chapter 3).

JM - Heuristic solutions may still provide good probe ordering
The number of “gaps” (blocks of zeros in rows) in the hybridization matrix may be used as a cost function, since hybridization errors typically split blocks of ones (false negatives) or split a gap into two gaps (false positives).
The problem of finding a permutation that minimizes the number of gaps can be cast as a Traveling Salesman Problem (TSP), in which the cities are the columns of the hybridization matrix (plus an additional column of zeros) and the distance between two cities is the number of positions in which the two columns differ (the Hamming distance).
Thus, an efficient algorithm is unlikely in the general case (unless P=NP), and heuristic solutions are sought that provide good probe orderings, at least in most cases (e.g. Alizadeh et al., 1995).
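The gap cost function above can be sketched in a few lines. The 3-clone by 4-probe hybridization matrix is a made-up toy example, not data from the lecture:

```python
def gaps_in_row(row):
    # count maximal zero-blocks strictly between the first and last 1
    ones = [i for i, v in enumerate(row) if v == 1]
    if not ones:
        return 0
    gaps, inside = 0, False
    for v in row[ones[0]:ones[-1] + 1]:
        if v == 0 and not inside:
            gaps, inside = gaps + 1, True
        elif v == 1:
            inside = False
    return gaps

def ordering_cost(matrix, perm):
    # total number of gaps after permuting the probe columns
    return sum(gaps_in_row([row[j] for j in perm]) for row in matrix)

# hypothetical 0/1 hybridization matrix: rows = clones, columns = probes
M = [[1, 1, 0, 0],
     [0, 1, 1, 0],
     [0, 0, 1, 1]]
good = ordering_cost(M, [0, 1, 2, 3])   # consecutive-ones ordering: no gaps
bad = ordering_cost(M, [0, 2, 1, 3])    # scrambled ordering splits the 1-blocks
```

The TSP formulation then seeks the column permutation minimizing this cost, with Hamming distances between adjacent columns as the inter-city distances.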

JM - Profile HMMs and likelihood optimization when states (optimal multiple alignments) are not known

JM - Random biased search: ideas and heuristics
GA, MC, SA (MC with smoothing)
Fitness landscapes
Biological and physical systems solve these “unsolvable” problems: from optimization to biology and back to optimization

JM - Literature watch: 10 years of DNA computing
Adleman LM, Molecular computation of solutions to combinatorial problems, Science 266 (1994)
Braich RS, Chelyapov N, Johnson C, Rothemund PWK, Adleman L, Solution of a 20-variable 3-SAT problem on a DNA computer, Science 296 (2002)

JM - Monte Carlo random search
A simulation technique for conformational sampling and optimization, based on a random search for energetically favourable conformations.
Finding the global (or at least a “good” local) minimum by a biased random walk may take some luck …

JM - Monte Carlo algorithm
The core of the MC algorithm is a heuristic prescription for a plausible pattern of changes in the configurations assumed by the system. Such an elementary “move” depends on the type of the problem; in the realm of protein structure it may be, for instance, a rotation around a randomly chosen backbone bond.
A long series of random moves is generated, with only some of them accepted as “good” moves.
The advantage of the MC method is its generality and a relatively weak dependence on the dimensionality of the system. However, finding a “move” that ensures efficient sampling may be a highly non-trivial problem.

JM - Monte Carlo algorithm
In the standard Metropolis MC, a move is accepted unconditionally if the new configuration results in a better (lower) potential energy. Otherwise it is accepted with a probability given by the Boltzmann factor:
P(accept) = exp(−ΔE / kT)
where ΔE denotes the change in the potential energy associated with a move, T is the temperature and k is the Boltzmann constant.
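The Metropolis acceptance rule is a one-liner. A minimal sketch, with the Boltzmann constant absorbed into the temperature units:

```python
import math
import random

def metropolis_accept(delta_E, T, rng=random.random):
    # downhill moves (delta_E <= 0) are always accepted; uphill moves
    # are accepted with the Boltzmann probability exp(-delta_E / T)
    return delta_E <= 0 or rng() < math.exp(-delta_E / T)
```

For an uphill move with delta_E = T, roughly a fraction exp(−1) ≈ 0.37 of proposals is accepted; raising T pushes that fraction toward 1.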

JM - Climbing mountains more easily: simulated annealing
Increasing the effective “temperature” means a higher probability of accepting moves that increase the energy; thus, the likelihood of escaping from a local minimum may be tuned.
Heating and cooling cycles, in analogy to physical systems.
In the limit of infinitely slow cooling, simulated annealing is guaranteed to find the global minimum.
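Combining the Metropolis rule with a decreasing temperature gives a minimal simulated annealing sketch. The 1-D landscape, the Gaussian move, the geometric cooling schedule and the restart count are all illustrative assumptions:

```python
import math
import random

def f(x):
    # hypothetical 1-D landscape: local minimum near x = -0.93,
    # global minimum near x = 1.06
    return x**4 - 2*x**2 - 0.5*x

def simulated_annealing(x0, T0=5.0, alpha=0.995, steps=4000, seed=0):
    rng = random.Random(seed)
    x, T = x0, T0
    for _ in range(steps):
        x_new = x + rng.gauss(0.0, 0.3)       # random elementary move
        dE = f(x_new) - f(x)
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            x = x_new                          # Metropolis acceptance
        T *= alpha                             # geometric cooling
    return x

# restarting from a few seeds, the best run escapes the left basin
best = min((simulated_annealing(-2.0, seed=s) for s in range(8)), key=f)
```

Started at x = -2.0, plain steepest descent would stay in the left basin; at high T the annealed walk crosses the barrier, and the cooling phase then locks it into the deeper minimum.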

JM - From biology to optimization: genetic algorithms
Genetic algorithm (GA): a class of algorithms inspired by the mechanisms of genetics, which has been applied to global optimization (especially combinatorial optimization problems). It requires the specification of three operations (each typically probabilistic) on objects called “strings” (these could be real-valued vectors).
0. Initialize the population
1. Select parents for reproduction and “evolutionary” operators (e.g. mutation and crossover)
2. Perform the operations to generate an intermediate population and evaluate its fitness (the value of the objective function to be optimized)
3. Select a subpopulation for the next generation (survival of the fittest)
4. Repeat 1-3 until some stopping rule is reached

JM - Genetic algorithm: operators and adaptation
Reproduction - combining strings in the population to create a new string (offspring). Example: taking the 1st character from the 1st parent + the rest of the string from the 2nd parent: [001001] + [111111] ===> [011111]
Mutation - spontaneous alteration of characters in a string. Example: [001001] ===> [101001]
Crossover - combining strings to exchange values, creating new strings in their place. Example: with the crossover location at 2: [001001] & [111111] ===> [001111], [111001]
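The three operators above can be written down directly for bit strings; the function names are illustrative, and the examples match the slide:

```python
import random

def reproduce(a, b):
    # 1st character from the 1st parent + rest of the string from the 2nd
    return a[0] + b[1:]

def mutate(s, rng, p=0.1):
    # flip each character independently with probability p
    return ''.join('10'[int(c)] if rng.random() < p else c for c in s)

def crossover(a, b, point):
    # exchange tails at the crossover location
    return a[:point] + b[point:], b[:point] + a[point:]
```

For instance, crossover('001001', '111111', 2) yields ('001111', '111001'), exactly as in the example above.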

JM - Genetic algorithms for global optimization
The original GA was proposed by John Holland and used crossover and total population replacement. This means that a population of 2N objects (called chromosomes) forms N pairings of parents that produce 2N offspring. The offspring comprise the new generation and become the total population, replacing their parents.
More generally, a population of size N produces an intermediate population of size N+M, from which Ñ are kept to form the new population. One way to choose which Ñ survive is to keep those with the greatest fitness values: survival of the fittest.
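Putting the operators and the generation loop together gives a complete toy GA. This is not Holland's original scheme: tournament selection, truncation survival (keep the N fittest of N parents + N children) and the “count the ones” fitness are illustrative choices for the sketch:

```python
import random

def one_max(s):
    # toy fitness: the number of 1s in the bit string
    return s.count('1')

def genetic_algorithm(n_pop=20, length=16, generations=60, p_mut=0.02, seed=0):
    rng = random.Random(seed)
    pop = [''.join(rng.choice('01') for _ in range(length)) for _ in range(n_pop)]

    def pick():
        # tournament selection: the fitter of two random individuals
        a, b = rng.sample(pop, 2)
        return a if one_max(a) >= one_max(b) else b

    for _ in range(generations):
        children = []
        while len(children) < n_pop:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, length)                  # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = ''.join('10'[int(c)] if rng.random() < p_mut else c
                            for c in child)                 # mutation
            children.append(child)
        # survival of the fittest among parents and offspring (N + M -> N)
        pop = sorted(pop + children, key=one_max, reverse=True)[:n_pop]
    return max(pop, key=one_max)

best = genetic_algorithm()
```

Keeping the fittest of the combined parent and child populations makes the best fitness non-decreasing across generations, which is why the toy run converges quickly to (near) all-ones strings.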
