Adaptive Hopfield Network. Dr. Gürsel Serpen, Associate Professor, Electrical Engineering and Computer Science Department, University of Toledo.


Adaptive Hopfield Network
Dr. Gürsel Serpen, Associate Professor
Electrical Engineering and Computer Science Department
University of Toledo, Toledo, Ohio, USA

Presentation Topics
- Motivation for the research
- Classical Hopfield network (HN)
- Adaptation via gradient descent
- Adaptive Hopfield Network (AHN)
- Static optimization with the AHN
- Results and conclusions
FOR MORE INFO: Serpen et al., forthcoming journal article.

Motivation
- The classical Hopfield neural network (HN) has been shown to have the potential to address a very large spectrum of static optimization problems.
- The classical HN is NOT trainable, which implies that it cannot learn from prior search attempts.
- A hardware realization of the Hopfield network is very attractive for real-time, embedded computing environments.
- Is there a way (e.g., training or adaptation) to incorporate the experience gained from prior search attempts into the network dynamics (weights), helping the network focus on promising regions of the overall search space?

Research Goals
- Propose gradient-descent based procedures to "adapt" the weights and constraint weighting coefficients of the HN.
- Develop an indirect procedure to define "pseudo" values for desired neuron outputs (much as desired output values are defined for hidden-layer neurons in an MLP).
- Develop space-efficient schemes to store the symmetric weight matrix (upper/lower triangular) for large-scale problem instances (see the storage sketch below).
- Apply, through simulation, the adaptive HN algorithm to large-scale static optimization problems.
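To illustrate the triangular-storage idea, here is a minimal sketch (not the authors' implementation; the class and method names are hypothetical) that packs a symmetric N x N weight matrix into the N(N+1)/2 entries of its upper triangle:

```python
import numpy as np

class TriangularWeights:
    """Store a symmetric N x N weight matrix using only N*(N+1)/2 entries."""

    def __init__(self, n):
        self.n = n
        self.data = np.zeros(n * (n + 1) // 2)

    def _index(self, i, j):
        # Map (i, j) with i <= j to its position in the packed upper triangle.
        if i > j:
            i, j = j, i
        return i * self.n - i * (i - 1) // 2 + (j - i)

    def get(self, i, j):
        return self.data[self._index(i, j)]

    def set(self, i, j, value):
        self.data[self._index(i, j)] = value
```

For a network with 10,000 neurons this roughly halves the weight storage relative to a full dense matrix.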

Classical Hopfield Net Dynamics: neuron dynamics, sigmoid activation function, number of neurons (equations on slide).
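The slide equations are not reproduced in the transcript; the standard continuous Hopfield dynamics they refer to take the textbook form below (a hedged reconstruction, not necessarily the exact notation used on the slide):

$$\frac{du_i}{dt} = -\frac{u_i}{\tau} + \sum_{j=1}^{N} w_{ij}\, v_j + I_i, \qquad v_i = f(u_i) = \frac{1}{1 + e^{-\lambda u_i}}, \qquad i = 1,\dots,N,$$

where $u_i$ is the internal state of neuron $i$, $v_i$ its sigmoidal output, $w_{ij}$ the interconnection weights, $I_i$ the external input, $\lambda$ the sigmoid gain, and $N$ the number of neurons.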

Weights (interconnection) redefined: generic Liapunov (energy) function and decomposed weight definitions (equations on slide).
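For reference, the generic Liapunov (energy) function of the classical Hopfield network has the standard form

$$E = -\frac{1}{2}\sum_{i=1}^{N}\sum_{j=1}^{N} w_{ij}\, v_i\, v_j \;-\; \sum_{i=1}^{N} I_i\, v_i,$$

and a decomposition consistent with the constraint weighting coefficients used later in this talk would be $w_{ij} = \sum_k g_k\, w_{ij}^{(k)}$, with one weight component $w_{ij}^{(k)}$ per constraint term and $g_k$ its weighting coefficient. This decomposition is an assumption about the slide's notation, not a quotation of it.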

Adaptive Hopfield Net Block Diagram (figure on slide).

Adaptive Hopfield Net Pseudocode
Initialization:
- Initialize network constraint weighting coefficients.
- Initialize weights.
- Initialize Hopfield net neuron outputs (randomly).
Adaptive Search:
- Relaxation: relax the Hopfield dynamics until convergence to a fixed point.
- Adaptation: relax the adjoint network until convergence to a fixed point; update the weights; update the constraint weighting coefficients.
- Termination criteria: if not satisfied, continue with Adaptive Search.
(A minimal code sketch of this loop follows below.)
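A minimal Python sketch of the outer loop above, assuming the four problem-specific routines (relax_hopfield, relax_adjoint, update_weights, update_coefficients) are supplied by the caller; all names and the termination test are illustrative placeholders, not the authors' code:

```python
import numpy as np

def adaptive_hopfield_search(relax_hopfield, relax_adjoint,
                             update_weights, update_coefficients,
                             n_neurons, max_iters=100, tol=1e-4):
    """Outer adaptive-search loop from the pseudocode slide (illustrative only)."""
    # Initialization
    coeffs = np.ones(3)                         # e.g., three constraint weighting coefficients
    weights = np.random.uniform(-0.1, 0.1, (n_neurons, n_neurons))
    weights = 0.5 * (weights + weights.T)       # keep the weight matrix symmetric
    outputs = np.random.rand(n_neurons)         # random initial neuron outputs

    for _ in range(max_iters):
        # Relaxation: run the Hopfield dynamics to a fixed point
        outputs = relax_hopfield(weights, coeffs, outputs)
        # Adaptation: relax the adjoint network, then adapt parameters
        adjoint = relax_adjoint(weights, coeffs, outputs)
        new_weights = update_weights(weights, outputs, adjoint)
        coeffs = update_coefficients(coeffs, outputs)
        # Termination criterion: stop when the weights stop changing
        if np.max(np.abs(new_weights - weights)) < tol:
            weights = new_weights
            break
        weights = new_weights
    return outputs, weights, coeffs
```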

Hopfield Network Relaxation (relaxation equations on slide).
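One way to realize the relaxation step is simple Euler integration of the continuous dynamics given earlier; the slide does not specify the integration scheme, so the step size, gain, and stopping test below are assumptions:

```python
import numpy as np

def relax_hopfield(weights, external_input, v0,
                   tau=1.0, dt=0.01, gain=1.0, tol=1e-6, max_steps=10000):
    """Euler-integrate the continuous Hopfield dynamics to a fixed point (sketch)."""
    eps = 1e-9
    # Invert the sigmoid to recover internal states from the initial outputs.
    u = np.log((v0 + eps) / (1.0 - v0 + eps)) / gain
    for _ in range(max_steps):
        v = 1.0 / (1.0 + np.exp(-gain * u))            # sigmoid outputs
        du = -u / tau + weights @ v + external_input   # du_i/dt from the dynamics slide
        u = u + dt * du
        if np.max(np.abs(du)) < tol:                   # fixed point reached
            break
    return 1.0 / (1.0 + np.exp(-gain * u))
```

This plays the role of the relax_hopfield placeholder assumed in the loop sketch above.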

Adaptation of Weights: the adjoint Hopfield network (adjoint network dynamics on slide).

Adaptation of Weights: weight update via recurrent backpropagation (update rule on slide).
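The adjoint-network relaxation and weight update of recurrent backpropagation (Pineda/Almeida) take the textbook form below; the slides presumably use a variant of this, so treat the notation as an assumption rather than the exact rule presented:

$$\frac{dz_r}{dt} = -z_r + \sum_{i} f'(u_i^{*})\, w_{ir}\, z_i + e_r, \qquad e_r = \frac{\partial E}{\partial v_r}\bigg|_{v = v^{*}},$$

$$\Delta w_{rs} = -\eta\, z_r^{*}\, f'(u_r^{*})\, v_s^{*},$$

where $v^{*}$ and $u^{*}$ are the fixed-point outputs and internal states of the Hopfield relaxation, $z^{*}$ is the fixed point of the adjoint network, and $\eta$ is the learning rate.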

Adaptation of Constraint Weighting Coefficients: gradient-descent adaptation rule; the error function is problem-specific and redefined (equations on slide).

Adaptation of Constraint Weighting Coefficients (continued): the partial derivative is readily computable; final form of the coefficient update rule (on slide).
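In generic gradient-descent form the coefficient update is

$$\Delta g_k = -\eta_g\, \frac{\partial E_{\mathrm{err}}}{\partial g_k},$$

where $E_{\mathrm{err}}$ is the problem-specific, redefined error function evaluated at the converged network output $v^{*}$, and by the chain rule $\partial E_{\mathrm{err}}/\partial g_k = \sum_i (\partial E_{\mathrm{err}}/\partial v_i^{*})(\partial v_i^{*}/\partial g_k)$. The slide's "readily computable" partial derivative presumably refers to this quantity; the exact final form is on the slide and is not reconstructed here.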

Mapping a Static Optimization Problem: generic and problem-specific parts of the mapping (equations on slide).
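The generic part of such a mapping is the standard one for Hopfield optimization: write the problem's cost and constraints as a weighted quadratic energy and match it to the network's Liapunov function (a hedged restatement of common practice, not the slide's exact derivation):

$$E(v) = \sum_k g_k\, E_k(v) \;\equiv\; -\frac{1}{2}\sum_i\sum_j w_{ij}\, v_i\, v_j - \sum_i I_i\, v_i,$$

so that matching the quadratic and linear terms yields the weights $w_{ij}$ and bias inputs $I_i$ in terms of the constraint weighting coefficients $g_k$. The problem-specific part supplies the individual penalty terms $E_k(v)$ (e.g., the row, column, and tour-length terms for the TSP on the next slide).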

Simulation Study
- Traveling Salesman Problem (TSP); a preliminary study at this time.
- Instances with up to 100 cities simulated.
- Computing resources: Ohio Supercomputing Center.
- Preliminary findings suggest that the theoretical framework is sound and the projections are valid.
- Computational cost (weight matrix size) poses a significant challenge for simulation purposes; addressing it is an ongoing research effort (a rough size estimate follows below).
- The study is currently in progress.
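To put the weight-matrix challenge in concrete terms, assume the standard Hopfield-Tank encoding in which an $n$-city tour uses $n^2$ neurons (the slide does not state the encoding, so this is an assumption):

$$n = 100 \;\Rightarrow\; N = n^{2} = 10{,}000 \text{ neurons}, \qquad N^{2} = 10^{8} \text{ weights} \approx 800\ \text{MB at 8 bytes per weight},$$

while the upper-triangular storage scheme from the Research Goals slide needs only $N(N+1)/2 \approx 5\times 10^{7}$ entries, roughly 400 MB.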

Conclusions
- An adaptation mechanism that modifies the constraint weighting coefficients and the weights of the classical Hopfield network was proposed.
- A mathematical characterization of the adaptive Hopfield network was presented.
- Preliminary simulation results suggest that the proposed adaptation mechanism is effective in guiding the Hopfield network towards high-quality, feasible solutions of large-scale static optimization problems.
- We are also exploring the incorporation of a computationally viable stochastic search mechanism to further improve the quality of solutions computed by the adaptive Hopfield network while preserving its parallel computation capability.

Thank You! Questions?
We gratefully acknowledge the computing resources grant provided by the Ohio Supercomputing Center (USA), which facilitated the simulation study. We also appreciate the support provided by the Kohler Internationalization Awards Program at the University of Toledo, which facilitated this conference presentation.