Population Based Optimization for Variable Operating Points, Alan L. Jennings & Raúl Ordóñez

Population Based Optimization for Variable Operating Points. Alan L. Jennings & Raúl Ordóñez, Electrical and Computer Engineering, University of Dayton. Outline: Introduction, Method, Examples, Conclusion. Monday, June 6th, 2011, CEC 2011, paper #274.

The Challenge: as a desired parameter changes, smoothly change the other parameters in real time while maintaining local optimality. The Solution: a global search/optimization of the input space to form inverse functions in the output space, using particles and clusters. In short: change y (one dimension) by adjusting x (many dimensions), smoothly and quickly, while maintaining optimality.

Problem Statement: find continuous functions x* = h_i(y_d) such that J = g(x*) is a local minimum and y_d = f(x*) over an interval of y_d. Assumptions: x lies in a compact set; g and f are C^1, deterministic, and time invariant; changes in x are easy to implement; the variables are adequately scaled. Note that regions larger than a point where ∇f = 0 can result in an open domain of h_i.
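The conditions above can be restated as a family of constrained programs, one per desired output value (the interval endpoints y_min, y_max are notation introduced here for "an interval of y_d"; the minimum is only local):

```latex
h_i(y_d) \in \operatorname*{arg\,min}_{x}\; g(x)
\quad \text{s.t.} \quad f(x) = y_d,
\qquad y_d \in [y_{\min},\, y_{\max}],
```

with each inverse function h_i required to be continuous on the whole interval.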

Example Problems: a thermostat (input: set point; output: temperature; cost: energy), combining generators, and an optimal control trajectory for a linear SISO system. A richer instance: input: servo positions at nodes in time; output: crawl distance, jump height, ...; cost: energy, max torque, profile height, ... This method is different from a nominal operating point and from a Pareto-optimal front.

Method Overview. Neural networks: universal approximators that converge in the gradient, and the gradient is simple to obtain. Swarm optimization: agents cover the n-dimensional space with simple motions and allow for clusters. Spline interpolation: uses the known optimal points of the clusters. The pipeline: surrogate function creation (sample the function, train the network, validate the network); swarm optimization (initialize the population; move agents to lower g(x) while keeping f(x); check removal/settling conditions; form clusters); execution (select a cluster h_i, get y_d, evaluate h_i, move x to x*).

Particle Motion. Output gradient: move in its null space. Cost gradient: move opposite to it (projected into the null space). Saturation: all gradients saturate; if the gradients are large, the step length is fixed; if a gradient is small, the step size diminishes; the boundary constraint reduces the step length; a minimum step is required for settling. Particles close to another particle are removed, which quickly reduces the population size.
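The projected, saturated step can be sketched in a few lines. This is a minimal illustration, not the paper's exact update rule: the output f(x) = x1 + x2 and cost g(x) = x1^2 + 2*x2^2 are stand-ins chosen for checkability, and tanh is just one possible saturating function.

```python
import math

def grad_f(x):          # gradient of the output f(x) = x1 + x2 (illustrative)
    return [1.0, 1.0]

def grad_g(x):          # gradient of the cost g(x) = x1^2 + 2*x2^2 (illustrative)
    return [2.0 * x[0], 4.0 * x[1]]

def particle_step(x, max_step=0.1):
    """One particle move: descend g while staying on a level set of f."""
    gf, gg = grad_f(x), grad_g(x)
    nf = math.sqrt(sum(v * v for v in gf))
    u = [v / nf for v in gf]                          # unit output gradient
    along = sum(a * b for a, b in zip(gg, u))
    d = [-(g - along * ui) for g, ui in zip(gg, u)]   # -grad g projected onto null(grad f)
    nd = math.sqrt(sum(v * v for v in d))
    if nd < 1e-12:
        return x                                      # settled: projected gradient vanished
    scale = max_step * math.tanh(nd) / nd             # saturating step length
    return [xi + scale * di for xi, di in zip(x, d)]

x = [2.0, 0.0]                                        # starts on the level set f(x) = 2
for _ in range(300):
    x = particle_step(x)
# x approaches (4/3, 2/3), the minimum of g on the line x1 + x2 = 2
```

The output value f(x) is preserved exactly (up to rounding) because every step lies in the null space of the output gradient; the saturation keeps early steps bounded while letting late steps shrink with the gradient.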

Cluster Formation. A cluster is formed from a settled particle: ascend/descend the output, form a new point, and apply gradient descent. End-of-cluster conditions: the particle doesn't settle, the output decreases/increases, or the particle settles too far away or too close.

Simple Example: a combination of generators. Output: total power out. Cost: a quadratic function. Expected result: each generator carries half the load.
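The expected equal-split result can be checked against the closed-form dispatch. Assuming, as a stand-in for the slide's quadratic cost, g(x) = sum_i a_i*x_i^2 with output f(x) = sum_i x_i = P, setting the incremental costs equal (2*a_i*x_i = lam for all i) gives:

```python
def dispatch(a, total_power):
    """Split total_power across generators with costs a_i * x_i**2
    by equalizing incremental cost: 2 * a_i * x_i = lam for all i."""
    lam = total_power / sum(1.0 / (2.0 * ai) for ai in a)
    return [lam / (2.0 * ai) for ai in a]

# Identical generators share the load equally, as the slide predicts.
print(dispatch([1.0, 1.0], 10.0))   # [5.0, 5.0]
```

With unequal coefficients the split is no longer even, but the marginal costs still match, which is the property the swarm's clusters should recover.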

Complex Examples: combinations of functions with multiple extrema and saddle points, kept 2-dimensional for verification. Expected result: clusters between output extrema. Cost functions tested: quadratic, linear/quadratic, and periodic.

Cluster Evaluation. Verify output accuracy: plot actual vs. desired output for test points. Verify optimality: generate neighbors, plot cost vs. output, and subtract the expected cost. (Shown for the Cluster 1 and Cluster 2 test cases.)

A 5-dimensional, ill-scaled example: 5 different generators with an order-of-magnitude difference between gradients. An exact neural network was used to eliminate that source of error. The result was a single cluster that balanced the incremental cost of all generators, with 0.1% full-range accuracy.
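The balanced-incremental-cost outcome can be checked independently of the swarm. Assuming quadratic generator costs a_i*x_i^2 (a stand-in; the slide does not give the exact cost functions) with coefficients spanning several orders of magnitude:

```python
def balance(a, total):
    """Equal incremental cost: 2*a_i*x_i = lam for all i, sum(x_i) = total."""
    lam = total / sum(1.0 / (2.0 * ai) for ai in a)
    return lam, [lam / (2.0 * ai) for ai in a]

a = [0.01, 0.1, 1.0, 10.0, 100.0]      # ill-scaled: four orders of magnitude
lam, x = balance(a, 100.0)
marginals = [2.0 * ai * xi for ai, xi in zip(a, x)]
# every generator ends up at the same marginal cost lam,
# even though the loads x_i differ by orders of magnitude
```

This is the condition the single converged cluster is reported to satisfy: one common incremental cost across all five generators.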

Failure Modes. The 'kill distance' may end other bifurcation branches (possible bifurcations are direction dependent), so a cluster can end prematurely; the gradient going to zero also ends a cluster. The global optimization parameters may be insufficient. Corners in clusters can impair cubic interpolation; a piecewise-cubic scheme can make the interpolant monotonic. Results are difficult to verify in high dimensions, although testing a cluster is reasonably simple.
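The monotone piecewise-cubic remedy mentioned above can be sketched with Fritsch-Carlson slope limiting; this is a generic illustration of the technique, not the paper's implementation:

```python
def pchip_slopes(x, y):
    """Fritsch-Carlson slopes: zero at local extrema and sign changes,
    so the Hermite interpolant does not overshoot the data."""
    n = len(x)
    h = [x[i + 1] - x[i] for i in range(n - 1)]
    delta = [(y[i + 1] - y[i]) / h[i] for i in range(n - 1)]
    d = [0.0] * n
    for i in range(1, n - 1):
        if delta[i - 1] * delta[i] > 0:          # same sign: weighted harmonic mean
            w1 = 2.0 * h[i] + h[i - 1]
            w2 = h[i] + 2.0 * h[i - 1]
            d[i] = (w1 + w2) / (w1 / delta[i - 1] + w2 / delta[i])
    d[0], d[n - 1] = delta[0], delta[n - 2]      # simple one-sided endpoint slopes
    return d

def hermite(x0, x1, y0, y1, d0, d1, xq):
    """Evaluate the cubic Hermite piece on [x0, x1] at xq."""
    h, t = x1 - x0, (xq - x0) / (x1 - x0)
    t2, t3 = t * t, t * t * t
    return ((2*t3 - 3*t2 + 1) * y0 + (t3 - 2*t2 + t) * h * d0
            + (-2*t3 + 3*t2) * y1 + (t3 - t2) * h * d1)

# Data with a corner: a plateau followed by a rise, where an ordinary
# cubic spline would overshoot but the limited slopes stay in range.
xs, ys = [0.0, 1.0, 2.0, 3.0], [0.0, 0.0, 1.0, 1.0]
d = pchip_slopes(xs, ys)
vals = [hermite(1.0, 2.0, 0.0, 1.0, d[1], d[2], 1.0 + k / 50.0) for k in range(51)]
```

Here the limiter zeroes the slopes at the corner points, so the rising piece stays within the data range instead of oscillating around the corner.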

Questions or Comments? Summary: a global search, using particles and clusters, to find optimal, continuous output-inverse functions; tested to work on many difficult combinations. Future work: apply the method to developmental/resolution-increasing control.