Heuristic Optimization and Dynamical System Safety Verification
Todd W. Neller, Knowledge Systems Laboratory, Stanford University

1 Outline
- Motivating Problem
- Heuristic Optimization Approach
- Comparative Study of Global Optimization Techniques
- Information-Based Optimization
- Recent Research Results

2 Focus
- Global optimization techniques can be applied powerfully to a class of hybrid system verification problems.
- When each function evaluation is costly, the information it yields should be used intelligently in the course of optimization.

3 Stepper Motor

4 Stepper Motor Safety Verification
Given:
- Bounds on stepper motor system parameters
- Bounds on initial conditions
Verify:
- No stall occurs in any possible acceleration scenario

5 Heuristic Search Landscape
Use simple knowledge of the problem domain to provide a landscape that is helpful to search.

6 Verification through Optimization
- Transform the verification problem into an optimization problem via a heuristic measure of relative safety (a sketch follows below)
- Apply efficient global optimization
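
As a hedged illustration of this transformation (not the talk's actual stepper-motor model), the sketch below assumes a hypothetical `simulate_margin(scenario)` that simulates one acceleration scenario and returns its minimum stall margin; driving that margin toward zero with a global search turns safety refutation into minimization. The plain random search, bounds, and placeholder dynamics are illustrative assumptions.

```python
import numpy as np

def simulate_margin(scenario: np.ndarray) -> float:
    """Hypothetical stand-in: simulate one acceleration scenario and
    return the smallest stall margin observed along the trajectory.
    A margin <= 0 would mean the motor stalled (an unsafe trajectory)."""
    # Placeholder dynamics; a real model would integrate the hybrid system.
    return 1.0 + 0.1 * np.sin(scenario).sum() - 0.05 * np.linalg.norm(scenario)

def refute_safety(bounds, n_samples=10_000, seed=0):
    """Search the scenario space for a counterexample by minimizing the
    heuristic safety measure (here: plain random search for simplicity)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    best_x, best_m = None, np.inf
    for _ in range(n_samples):
        x = rng.uniform(lo, hi)
        m = simulate_margin(x)
        if m < best_m:
            best_x, best_m = x, m
        if m <= 0.0:            # stall found: the safety property is refuted
            return x, m
    return best_x, best_m       # no stall found; best_m is the worst margin seen

# Bounds on the scenario parameters (illustrative values only).
x, m = refute_safety(bounds=[(-1.0, 1.0)] * 4)
```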

7 Comparative Testing
Methods:
- Simulated Annealing: AMEBSA, ASA, SALO
- Multi-Level Single Linkage (MLSL) and variants
- Random Local Optimization (RANDLO)
Test functions:
- Drawn from the optimization literature and method demos
- Used to gain a rough idea of relative strengths
(A minimal simulated annealing sketch follows below.)
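
Since several of the compared methods are simulated annealing variants, here is a minimal, generic simulated annealing sketch (not AMEBSA, ASA, or SALO themselves); the objective, the Gaussian proposal, and the geometric cooling schedule are illustrative assumptions.

```python
import math
import random

def simulated_annealing(f, x0, step=0.1, t0=1.0, alpha=0.995, iters=5_000, seed=0):
    """Generic simulated annealing: accept uphill moves with probability
    exp(-delta / T) and cool the temperature geometrically."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best_x, best_fx = x[:], fx
    t = t0
    for _ in range(iters):
        cand = [xi + rng.gauss(0.0, step) for xi in x]   # Gaussian proposal
        fc = f(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_fx:
                best_x, best_fx = x[:], fx
        t *= alpha                                       # geometric cooling
    return best_x, best_fx

# Illustrative objective: a simple multimodal function.
best, val = simulated_annealing(lambda v: sum(xi * xi - math.cos(3 * xi) for xi in v),
                                x0=[2.0, -2.0])
```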

8 Comparative Study Results
- SALO and RANDLO generally best for functions with many and few minima, respectively
- Local optimization "flattens" and simplifies these search spaces
- Local optimization doesn't always lead to the nearest optimum
- Minima rarely located at the bounds of the search space

9 Global Optimization Results

10 Comparative Study Results (cont.)
- For test functions STEP1 and STEP2, RANDLO and LMLSL performed best for both constrained local optimization procedures.
- SALO: ASA did not search the locally optimized search spaces (f′) efficiently.
- Recent experiments indicate that information-based global optimization performs even better.

11 Global Optimization Results (cont.)

12 Information-Based Approach
- Information-based optimization: previous function evaluations shape a probability distribution over the possible functions.
- Most methods waste this costly information.

13 Information-Based Local Optimization
- Choose an initial point and search radius
- Iterate:
  - Evaluate the point in the sphere where the minimum is most likely, according to the information gained thus far
  - If its value is less than the current center's, make it the new center
(A hedged sketch follows below.)
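
The slide does not spell out the probabilistic model, so the sketch below substitutes a simple inverse-distance-weighted surrogate for "where the minimum is most likely"; treat the surrogate, the fixed radius, and the stopping rule as illustrative assumptions rather than the method described in the talk.

```python
import numpy as np

def info_local_opt(f, x0, radius=1.0, iters=50, candidates=200, seed=0):
    """Information-based local optimization sketch: remember every costly
    evaluation, use a cheap surrogate to pick the most promising point in
    the current sphere, then recenter whenever the new point improves."""
    rng = np.random.default_rng(seed)
    center = np.asarray(x0, dtype=float)
    f_center = f(center)
    X, y = [center.copy()], [f_center]                # remembered evaluations

    def surrogate(p):
        """Inverse-distance-weighted prediction from past evaluations."""
        d = np.linalg.norm(np.array(X) - p, axis=1)
        if d.min() < 1e-12:
            return y[int(d.argmin())]
        w = 1.0 / d ** 2
        return float(w @ np.array(y) / w.sum())

    for _ in range(iters):
        # Sample candidate points within the sphere around the current center.
        dirs = rng.normal(size=(candidates, center.size))
        dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
        cands = center + dirs * rng.uniform(0, radius, size=(candidates, 1))
        best = min(cands, key=surrogate)              # most promising candidate
        f_best = f(best)                              # one costly true evaluation
        X.append(best); y.append(f_best)
        if f_best < f_center:                         # recenter on improvement
            center, f_center = best, f_best
    return center, f_center

# Illustrative use on a smooth bowl with a shifted minimum.
x, fx = info_local_opt(lambda v: float(np.sum((v - 1.5) ** 2)), x0=[0.0, 0.0])
```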

14 Multi-Level Local Optimization
- Each layer of local optimization simplifies the search space for the layer above.
- MLLO-RIQ: perform random (Monte Carlo) optimization of
  - f′′: information-based local optimization of
  - f′: quasi-Newton local optimization of
  - f: the heuristic function
(A layering sketch follows below.)
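
To make the layering concrete, here is a minimal sketch that assumes SciPy's BFGS for the quasi-Newton layer and reuses the `info_local_opt` sketch above for the information-based layer; the Rastrigin-style test objective, parameter choices, and random outer loop are illustrative, not the talk's exact MLLO-RIQ implementation.

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    """Illustrative heuristic function (Rastrigin-style, many local minima)."""
    x = np.asarray(x, dtype=float)
    return float(10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x)))

def f_prime(x):
    """Quasi-Newton local optimization of f, started from x."""
    return minimize(f, x, method="BFGS").fun

def f_double_prime(x):
    """Information-based local optimization of f_prime, started from x
    (reuses info_local_opt from the earlier sketch)."""
    return info_local_opt(f_prime, x, radius=0.5, iters=20, candidates=50)[1]

def mllo_riq(bounds, n_starts=20, seed=0):
    """Outer layer: random (Monte Carlo) optimization of f_double_prime."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    starts = rng.uniform(lo, hi, size=(n_starts, len(bounds)))
    return min(f_double_prime(s) for s in starts)

best_value = mllo_riq(bounds=[(-5.12, 5.12)] * 2)
```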

15 MLLO Example: Rastrigin Function
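
For reference, the standard n-dimensional Rastrigin test function (a common formulation; the exact parameterization in the talk's figure may differ) is

$$f(\mathbf{x}) = 10n + \sum_{i=1}^{n}\bigl(x_i^2 - 10\cos(2\pi x_i)\bigr), \qquad x_i \in [-5.12,\, 5.12],$$

a highly multimodal surface whose many regularly spaced local minima make it a natural demonstration case for multi-level local optimization.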

16 MLLO-RIQ Results
- For our first set of functions, MLLO-RIQ trial results are very encouraging
- The local optimization procedure is not suited to the discontinuous CMMR function
- No startup cost, unlike MLSL or GA

17 Other Work in Progress
- Global Information-Based Optimization
- Information-Based Direction-Set Methods
- Dynamic Search Tuning
Future work:
- Parallel Information-Based Methods
- Expert System for Global Optimization
Main challenge: approximating optimal decision procedures

18 Summary
- Heuristically use domain knowledge to transform the initial safety problem into a global optimization problem
- Information is costly, so use it well in the course of optimization with information-based approaches