Neural Networks for Optimization William J. Wolfe California State University Channel Islands.

Top 10 Best Jobs
MONEY Magazine and Salary.com researched hundreds of jobs, considering their growth, pay, stress levels, and other factors. These careers ranked highest:
1. Software Engineer
2. College Professor
3. Financial Adviser
4. Human Resources Manager
5. Physician Assistant
6. Market Research Analyst
7. Computer IT Analyst
8. Real Estate Appraiser
9. Pharmacist
10. Psychologist
By Tara Kalwarski, Daphne Mosher, Janet Paskin and Donna Rosato

Neural Models
- Simple processing units, and lots of them
- Highly interconnected
- Variety of connection architectures/strengths
- Exchange excitatory and inhibitory signals
- Learning: changes in connection strengths
- Knowledge: connection strengths/architecture
- No central processor: distributed processing

Simple Neural Model
- a_i: activation
- e_i: external input
- w_ij: connection strength
Assume w_ij = w_ji (a "symmetric" network), so W = (w_ij) is a symmetric matrix.

Net Input
net_i = Σ_j w_ij a_j + e_i

Dynamics
Basic idea: da_i/dt = net_i (each unit's activation moves in the direction of its net input).

Energy
E = −1/2 Σ_i Σ_j w_ij a_i a_j − Σ_i e_i a_i

Lower Energy
da/dt = net = −grad(E) ⇒ the dynamics performs gradient descent on E and seeks lower energy.

Problem: Divergence
The linear dynamics da/dt = Wa + e can grow without bound (e.g. along eigenvectors of W with positive eigenvalues), so the activations may diverge instead of settling.

A Fix: Saturation
Clamp each activation to the interval [0, 1].

Keeps the activation vector inside the hypercube boundaries Encourages convergence to corners
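The dynamics above (net input, gradient flow, and saturation) can be sketched in a few lines of NumPy. This is an illustrative sketch, not code from the presentation; the Euler step size, the 2-unit example network, and all names are my own choices.

```python
import numpy as np

def energy(W, e, a):
    # Lyapunov function: E = -1/2 a^T W a - e^T a
    return -0.5 * a @ W @ a - e @ a

def run(W, e, a0, dt=0.01, steps=2000):
    # Euler integration of da/dt = net = W a + e, with activations
    # clamped to the hypercube [0, 1]^n ("saturation").
    a = a0.copy()
    for _ in range(steps):
        a = np.clip(a + dt * (W @ a + e), 0.0, 1.0)
    return a

# Two mutually inhibitory units: the one with the larger external
# input wins, and the final energy is lower than the initial energy.
W = np.array([[0.0, -1.0], [-1.0, 0.0]])
e = np.array([0.6, 0.4])
a0 = np.array([0.5, 0.5])
a = run(W, e, a0)
```

The state converges to the corner (1, 0) of the hypercube, as the "encourages convergence to corners" point predicts.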

Summary: The Neural Model
- a_i: activation
- e_i: external input
- w_ij: connection strength
- W = (w_ij), with w_ij = w_ji (symmetric)

Example: Inhibitory Networks
- Completely inhibitory
  - w_ij = −1 for all i, j
  - k-winner
- Inhibitory Grid
  - neighborhood inhibition

Traveling Salesman Problem
- Classic combinatorial optimization problem
- Find the shortest "tour" through n cities
- n!/(2n) = (n−1)!/2 distinct tours (each tour is counted n times by rotation and twice by direction)
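The tour count can be checked by brute force for small n: there are n! city orderings, but each closed tour is counted 2n times (n rotations times 2 directions). A small sketch (my own code, not from the presentation):

```python
from itertools import permutations

def distinct_tours(n):
    # Count closed tours of n cities, treating rotations and
    # reversals of an ordering as the same tour.
    seen = set()
    for p in permutations(range(n)):
        variants = []
        for q in (p, p[::-1]):                  # both directions
            for r in range(n):                  # all n rotations
                variants.append(q[r:] + q[:r])
        seen.add(min(variants))                 # canonical representative
    return len(seen)
```

For example, distinct_tours(4) returns 3, matching 4!/(2·4).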

Neural Network Approach

Tours – Permutation Matrices
Tour: CDBA. Permutation matrices correspond to the "feasible" states: each time stop selects exactly one city, and each city is visited exactly once.

Not Allowed

- Only one city per time stop
- Only one time stop per city
⇒ Inhibitory rows and columns

Distance Connections: Inhibit the neighboring cities in proportion to their distances.

Putting it all together:

Feasible Solutions
R^(n^2) = F_0 ⊕ E_c ⊕ E_r ⊕ D
a_ix^proj = a_ix + act_avg − row_x_avg − col_i_avg
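The projection formula on this slide can be written directly in NumPy. This is my reading of it (a double-centering step; the function name is mine): it removes the components along the row-sum and column-sum constraint directions, leaving a matrix whose rows and columns each sum to zero.

```python
import numpy as np

def project(A):
    # a_ix_proj = a_ix + act_avg - row_avg - col_avg
    act_avg = A.mean()                        # average over all entries
    row_avg = A.mean(axis=1, keepdims=True)   # per-row averages
    col_avg = A.mean(axis=0, keepdims=True)   # per-column averages
    return A + act_avg - row_avg - col_avg
```

Applying it twice changes nothing, as expected of a projection.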

E = −1/2 Σ_i Σ_x Σ_j Σ_y a_ix a_jy w_ixjy
  = −1/2 { Σ_i Σ_x Σ_y (−d(x,y)) a_ix (a_(i+1)y + a_(i−1)y)
         + Σ_i Σ_x Σ_j (−1/n) a_ix a_jx
         + Σ_i Σ_x Σ_y (−1/n) a_ix a_iy
         + Σ_i Σ_x Σ_j Σ_y (1/n^2) a_ix a_jy }
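The weight matrix implied by this energy can be assembled explicitly and sanity-checked. The sketch below is my own construction (indexing unit (i, x) as "city x at time stop i", flattened to index i·n + x): it builds W from the distance term between adjacent time stops, the −1/n row and column inhibition, and the uniform 1/n^2 term, then checks that among feasible states a shorter tour has lower energy.

```python
import numpy as np

def build_tsp_weights(d):
    # d: (n, n) symmetric matrix of city distances.
    n = d.shape[0]
    W = np.zeros((n * n, n * n))
    for i in range(n):
        for x in range(n):
            for j in range(n):
                for y in range(n):
                    w = 1.0 / n**2                      # uniform term
                    if i == j:
                        w += -1.0 / n                   # same time stop
                    if x == y:
                        w += -1.0 / n                   # same city
                    if j in ((i + 1) % n, (i - 1) % n):
                        w += -d[x, y]                   # adjacent stops
                    W[i * n + x, j * n + y] = w
    return W

def energy(W, a):
    return -0.5 * a @ W @ a

def tour_state(perm):
    # Flattened permutation matrix: city perm[i] at time stop i.
    n = len(perm)
    a = np.zeros(n * n)
    for i, x in enumerate(perm):
        a[i * n + x] = 1.0
    return a

# Four cities on a unit square: the non-crossing tour 0-1-2-3 is
# shorter than the crossing tour 0-2-1-3, so its energy is lower.
pts = np.array([[0, 0], [0, 1], [1, 1], [1, 0]], dtype=float)
d = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
```

For feasible (permutation-matrix) states the constraint terms are identical, so energy differences come entirely from tour length.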

Research Questions
- Which architecture is best?
- Does the network produce feasible solutions? High-quality solutions? Optimal solutions?
- How do the initial activations affect network performance?
- Is the network similar to "nearest city" or any other traditional heuristic?
- How does the particular city configuration affect network performance?
- Is there any way to understand the nonlinear dynamics?

References:
- Kate A. Smith, "Neural Networks for Combinatorial Optimization: A Review of More Than a Decade of Research," INFORMS Journal on Computing, Vol. 11, No. 1, Winter 1999.
- A. Gee, S. V. B. Aiyer, R. Prager, "An Analytical Framework for Optimization Problems," Neural Networks 6, 1993.
- J. J. Hopfield, D. W. Tank, "Neural Computation of Decisions in Optimization Problems," Biol. Cybern. 52 (1985).

Typical state of the network before convergence.

“Fuzzy Readout”

DEMO 1

Fuzzy Tour Lengths (plot: tour length vs. iteration)
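One plausible way to compute a "fuzzy" tour length from a not-yet-converged activation matrix (my definition; the slide's exact formula is not shown): normalize each row into a soft assignment of time stops to cities, then sum the expected leg lengths between consecutive stops.

```python
import numpy as np

def fuzzy_tour_length(A, d):
    # A: (n, n) activation matrix, row i = time stop i
    # d: (n, n) matrix of city distances
    P = A / A.sum(axis=1, keepdims=True)   # soft city assignment per stop
    n = A.shape[0]
    # expected distance between consecutive stops, wrapping around
    return sum(P[i] @ d @ P[(i + 1) % n] for i in range(n))
```

For a crisp permutation matrix this reduces to the ordinary tour length.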

DEMO 2 Applet by Darrell Long

EXTRA SLIDES

Brain
- Approximately 10^11 neurons
- Neurons are relatively simple
- Approximately 10^4 fan-out
- No central processor
- Neurons communicate via excitatory and inhibitory signals
- Learning is associated with modifications of connection strengths between neurons

With external input e = 1/2 (the k-winner input level for k = 1).

Perfect k-winner performance: e = k − 1/2
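This claim can be checked with the same saturating dynamics used earlier: all-to-all inhibition (w_ij = −1 for i ≠ j) and uniform external input e = k − 1/2. The sketch below uses my own step size, duration, and example activations.

```python
import numpy as np

def k_winner(a0, k, dt=0.01, steps=5000):
    # Completely inhibitory network: w_ij = -1 off-diagonal, 0 diagonal.
    n = len(a0)
    W = -(np.ones((n, n)) - np.eye(n))
    e = np.full(n, k - 0.5)                # external input e = k - 1/2
    a = a0.copy()
    for _ in range(steps):
        a = np.clip(a + dt * (W @ a + e), 0.0, 1.0)
    return a

# The k units with the largest initial activations saturate at 1;
# the rest are driven to 0.
a = k_winner(np.array([0.1, 0.9, 0.3, 0.8, 0.2]), k=2)
```

At a corner with exactly k active units, a winner's net input is −(k−1) + (k − 1/2) = +1/2 and a loser's is −k + (k − 1/2) = −1/2, so exactly these corners are stable.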