Hopfield Neural Networks for Optimization 虞台文 大同大學資工所 智慧型多媒體研究室

Content: Introduction; A Simple Example: the Race Traffic Problem; Example: A/D Converter; Example: Traveling Salesperson Problem

Hopfield Neural Networks for Optimization Introduction 大同大學資工所 智慧型多媒體研究室

Energy Function of a Hopfield NN: E = -1/2 Σ_i Σ_j w_ij v_i v_j - Σ_i I_i v_i (+ a constant), where the quadratic term is the interaction between neurons and the linear term is the interaction with the external constant inputs I_i. Running a Hopfield NN asynchronously, its energy is monotonically non-increasing.
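A short sketch (not from the slides) of why a single asynchronous update cannot raise this energy, assuming symmetric weights w_ij = w_ji with zero self-connections and the usual threshold rule for unipolar neurons:

```latex
% Change in energy when only neuron i updates, with h_i = \sum_j w_{ij} v_j + I_i:
\Delta E \;=\; -\Bigl(\sum_{j} w_{ij} v_j + I_i\Bigr)\,\Delta v_i \;=\; -\,h_i\,\Delta v_i
% The threshold rule sets v_i = 1 when h_i > 0 and v_i = 0 when h_i < 0,
% so \Delta v_i never has the opposite sign of h_i, and therefore \Delta E \le 0.
```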

Solving Optimization Problems Using Hopfield NNs: (1) reformulate the cost of the problem as the energy function of a Hopfield NN; (2) build a Hopfield NN from that energy function; (3) run the NN asynchronously until it settles down; (4) read the answer reported by the NN.
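A minimal Python sketch (an illustration, not code from the slides) of steps (3) and (4): run a discrete Hopfield network with unipolar neurons asynchronously until it settles, given the weight matrix W and external inputs I obtained from the energy function above.

```python
import numpy as np

def energy(W, I, v):
    """E = -1/2 v^T W v - I^T v, the Hopfield energy used above."""
    return -0.5 * v @ W @ v - I @ v

def run_hopfield_async(W, I, v0, max_sweeps=100, rng=None):
    """Update unipolar neurons one at a time until no neuron changes."""
    rng = rng or np.random.default_rng()
    v = v0.astype(float)
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(v)):   # random asynchronous order
            h = W[i] @ v + I[i]             # net input to neuron i
            new_vi = 1.0 if h > 0 else 0.0  # threshold (unipolar) update
            if new_vi != v[i]:
                v[i], changed = new_vi, True
        if not changed:                     # settled: a local minimum of E
            break
    return v
```

Because every accepted update lowers (or preserves) the energy, the loop terminates at a local minimum, whose state is read out as the answer.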

Hopfield Neural Networks for Optimization A Simple Example Race Traffic Problem 大同大學資工所 智慧型多媒體研究室

A Simple Hopfield NN (figure: a two-neuron network with external inputs I_1 and I_2)

The Race Traffic Problem (figure: the problem encoded with two units v_1 and v_2 and their connection weights)

The Race Traffic Problem (figure: the problem mapped onto a two-neuron Hopfield network with its connection weights and external inputs)

The Race Traffic Problem (figure: after asynchronous updates the network settles into a stable state)

The Race Traffic Problem (figure: the same two-neuron network). What happens if we run it synchronously instead?
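A minimal sketch (not from the slides) of why this question matters: under synchronous updates a symmetric two-neuron network can oscillate between two states instead of settling. The weights and inputs below are illustrative assumptions, not the values in the slide figure:

```python
import numpy as np

# Two mutually inhibiting unipolar neurons (illustrative weights/inputs).
W = np.array([[0.0, -1.0],
              [-1.0, 0.0]])
I = np.array([0.5, 0.5])

def step(h):
    return (h > 0).astype(float)

v = np.array([1.0, 1.0])
for t in range(6):            # synchronous: update all neurons at once
    v = step(W @ v + I)
    print(t, v)               # alternates between [0, 0] and [1, 1]
```

Running the same network asynchronously instead drives it to one of the stable states [1, 0] or [0, 1], which is why the Hopfield convergence argument assumes asynchronous updates.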

Hopfield Neural Networks for Optimization Example A/D Converter 大同大學資工所 智慧型多媒體研究室

Reference: Tank, D.W., and Hopfield, J.J., "Simple 'neural' optimization networks: An A/D converter, signal decision circuit, and a linear programming circuit," IEEE Transactions on Circuits and Systems, Vol. CAS-33, pp. 533-541, 1986.

Analog A/D Converter (figure: a 4-bit A/D converter network with output neurons v_0–v_3 and analog input I), using unipolar neurons

A/D Converter Using Unipolar Neurons (the energy function and the derived weights are sketched below)
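The equations on this slide were lost in extraction; the following is a sketch of the standard Tank-Hopfield formulation for a 4-bit converter with analog input x and unipolar outputs V_i in {0, 1}, which these slides appear to follow:

```latex
% Squared reconstruction error plus a term chosen to cancel the diagonal
% (self-connection) contributions that arise when the square is expanded:
E = \tfrac{1}{2}\Bigl(x - \sum_{i=0}^{3} 2^{i} V_i\Bigr)^{2}
    \;-\; \sum_{i=0}^{3} 2^{\,2i-1}\, V_i\,(V_i - 1)

% Matching the expansion to  E = -\tfrac{1}{2}\sum_{i \neq j} w_{ij} V_i V_j - \sum_i I_i V_i
% gives the connection weights and external input currents:
w_{ij} = -\,2^{\,i+j} \quad (i \neq j), \qquad I_i = -\,2^{\,2i-1} + 2^{i}\, x
```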

A/D Converter (figure: the converter network with output neurons v_0–v_3 and external inputs I_0–I_3)

Hopfield Neural Networks for Optimization Example Traveling Salesperson Problem 大同大學資工所 智慧型多媒體研究室

Reference: J. J. Hopfield and D. W. Tank, "'Neural' computation of decisions in optimization problems," Biological Cybernetics, Vol. 52, pp. 141-152, 1985.

Traveling Salesperson Problem

Given n cities with distances d_ij, what is the shortest tour?

Traveling Salesperson Problem

Traveling Salesperson Problem (figure: the distance matrix). Find a minimum-cost Hamiltonian cycle.

Search Space. Find a minimum-cost Hamiltonian cycle. Assume we are given a fully connected graph with n vertices and symmetric costs (d_ij = d_ji). The size of the search space is (n-1)!/2 distinct tours; for example, 10 cities already give 9!/2 = 181,440 tours.

Problem Representation Using NNs (figure: an n-by-n grid of neurons whose axes are labeled City and Time)

Problem Representation Using NNs (figure: the City/Time grid with one neuron "on", meaning the salesperson reaches city 5 at time 3)

Problem Representation Using NNs (figure: the City/Time grid). Goal: find a minimum-cost Hamiltonian cycle.

The Hamiltonian Constraint (figure: the City/Time grid). Goal: find a minimum-cost Hamiltonian cycle. Each row and each column can have only one neuron "on"; for an n-city problem, n neurons will be "on".

Cost Minimization (figure: the City/Time grid). Goal: find a minimum-cost Hamiltonian cycle. The total distance of the valid tour has to be very low, i.e., the sum of the d_ij's along the tour (d_35, d_54, d_42, d_25, d_51 in the figure) should be small.

Indices of Neurons (figure: the City/Time grid). Neuron v_xi is indexed by city x and time i.

Energy Function: two groups of terms, one for Hamiltonian-cycle satisfaction and one for cost minimization.

Energy Function: the constraint terms require each row to have at most one neuron "on", each column to have at most one neuron "on", and exactly n neurons "on" in total.

Energy Function: the cost term is the total distance of the tour.

Energy Function (combining the constraint terms and the cost term; a standard form is sketched below)
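The equations on the preceding energy-function slides were lost in extraction; the following is a sketch of the standard Hopfield-Tank TSP energy, which matches the term-by-term descriptions above (the coefficients A, B, C, D and the v_xi notation follow the 1985 paper; the time index is taken modulo n):

```latex
E = \frac{A}{2}\sum_{x}\sum_{i}\sum_{j \neq i} v_{xi}\,v_{xj}        % each city (row) visited at most once
  + \frac{B}{2}\sum_{i}\sum_{x}\sum_{y \neq x} v_{xi}\,v_{yi}        % each time slot (column) used at most once
  + \frac{C}{2}\Bigl(\sum_{x}\sum_{i} v_{xi} - n\Bigr)^{2}           % exactly n neurons "on"
  + \frac{D}{2}\sum_{x}\sum_{y \neq x}\sum_{i} d_{xy}\,v_{xi}\,\bigl(v_{y,i+1} + v_{y,i-1}\bigr)  % total tour length
```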

Build an NN for TSP: write down the energy function of a generic 2-D neural network and map the TSP energy onto it to obtain the connection weights and external inputs (see the sketch below).
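A minimal sketch (an illustrative assumption, not code from the slides) of this mapping: expanding the TSP energy above and matching it against E = -1/2 Σ w v v - Σ I v yields the weights below. The result can be fed to run_hopfield_async from the earlier sketch; the coefficient values are placeholders that need tuning in practice.

```python
import numpy as np

def tsp_weights(d, A=500.0, B=500.0, C=200.0, D=500.0):
    """Hopfield-Tank weights/biases for the TSP; neuron (x, i) -> index x*n + i.

    d is an n x n symmetric distance matrix. A, B, C, D are illustrative
    penalty coefficients (assumptions); they must be tuned so that valid
    tours dominate the energy landscape.
    """
    n = len(d)
    eq = np.eye(n)                      # eq[a, b] = 1 if a == b (Kronecker delta)
    W = np.zeros((n * n, n * n))
    I = np.full(n * n, C * n)           # external input from the (sum - n)^2 term
    for x in range(n):
        for i in range(n):
            for y in range(n):
                for j in range(n):
                    if (x, i) == (y, j):
                        continue        # no self-connections
                    w  = -A * eq[x, y] * (1 - eq[i, j])   # one neuron per city row
                    w += -B * eq[i, j] * (1 - eq[x, y])   # one neuron per time column
                    w += -C                               # global "n neurons on" term
                    w += -D * d[x][y] * (eq[j, (i + 1) % n] + eq[j, (i - 1) % n])
                    W[x * n + i, y * n + j] = w
    return W, I
```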

Analog Hopfield NN for 10-City TSP

The shortest path

Analog Hopfield NN for 10-City TSP The shortest path

Analog Hopfield NN for 30-City TSP