CS623: Introduction to Computing with Neural Nets (lecture-12) Pushpak Bhattacharyya Computer Science and Engineering Department IIT Bombay

Training of Hopfield Net
– Early training rule proposed by Hopfield
– Rule inspired by the concept of electron spin
– Hebb's rule of learning: if two neurons i and j have activations $x_i$ and $x_j$ respectively, then the weight $w_{ij}$ between the two neurons is directly proportional to the product $x_i \cdot x_j$, i.e. $w_{ij} \propto x_i x_j$

Hopfield Rule
To store a pattern $\langle x_1^*, x_2^*, \ldots, x_n^* \rangle$, make
$$w_{ij} = \frac{1}{n-1} \, x_i^* x_j^*$$
Storing a pattern is equivalent to making that pattern the stable state of the net.
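As a concrete illustration, here is a minimal NumPy sketch of this storage rule; the helper name store_pattern and the example pattern are ours, not from the lecture:

```python
import numpy as np

# Store one bipolar pattern x* with w_ij = x_i* x_j* / (n - 1), w_ii = 0.
def store_pattern(x):
    n = len(x)
    W = np.outer(x, x) / (n - 1)   # w_ij proportional to x_i * x_j
    np.fill_diagonal(W, 0.0)       # no self-connections
    return W

x_star = np.array([1, -1, 1, 1, -1])   # an arbitrary +1/-1 pattern
W = store_pattern(x_star)
```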

Training of Hopfield Net
Establish that $\langle x_1^*, x_2^*, \ldots, x_n^* \rangle$ is a stable state of the net.
To show the stability of $\langle x_1^*, x_2^*, \ldots, x_n^* \rangle$, impress it on the net at $t = 0$.

Training of Hopfield Net
Consider neuron i at $t = 1$:
$$x_i(1) = \operatorname{sgn}\Big(\sum_{j=1,\, j \neq i}^{n} w_{ij} \, x_j(0)\Big)$$

Establishing stability
Substituting the stored weights and $x_j(0) = x_j^*$:
$$net_i = \sum_{j \neq i} w_{ij} \, x_j^* = \sum_{j \neq i} \frac{1}{n-1} \, x_i^* x_j^* x_j^* = \frac{1}{n-1} \, x_i^* (n-1) = x_i^*$$
Since $net_i = x_i^*$ for every $i$, no neuron changes its activation; the stored pattern is a stable state.
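The same stability check can be run in code. A self-contained sketch (the example pattern is arbitrary): store the pattern, impress it at t = 0, update every neuron once, and confirm nothing flips:

```python
import numpy as np

# Store one pattern with w_ij = x_i x_j / (n - 1), then test its stability.
x = np.array([1, -1, 1, 1, -1])
W = np.outer(x, x) / (len(x) - 1)
np.fill_diagonal(W, 0.0)

net = W @ x                         # net_i = sum_{j != i} w_ij x_j = x_i
x_next = np.where(net >= 0, 1, -1)  # sign update (tie broken toward +1)
assert np.array_equal(x_next, x)    # no activation flips: a stable state
```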

Observations
– How much deviation can the net tolerate?
– What if more than one pattern is to be stored?

Storing k patterns
Let the patterns be:
$P_1$: $\langle x_1^1, x_2^1, \ldots, x_n^1 \rangle$
$P_2$: $\langle x_1^2, x_2^2, \ldots, x_n^2 \rangle$
...
$P_k$: $\langle x_1^k, x_2^k, \ldots, x_n^k \rangle$
The generalized Hopfield rule is:
$$w_{ij} = \frac{1}{n-1} \sum_{p=1}^{k} x_i^p \, x_j^p$$
where the superscript $p$ indexes the $p$-th pattern.
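A sketch of the generalized rule, superposing the Hebbian outer product of each pattern; the function name is illustrative:

```python
import numpy as np

# Store k bipolar patterns: w_ij = (1/(n-1)) * sum_p x_i^p x_j^p, w_ii = 0.
def store_patterns(patterns):
    k, n = patterns.shape                  # patterns: k rows of +1/-1 values
    W = (patterns.T @ patterns) / (n - 1)  # superposed Hebbian outer products
    np.fill_diagonal(W, 0.0)
    return W
```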

Storing k patterns
Study the stability of the $q$-th pattern $\langle x_1^q, x_2^q, \ldots, x_n^q \rangle$.
Impress the vector at $t = 0$ and observe the network dynamics.
Looking at neuron i at $t = 1$, we have:
$$x_i(1) = \operatorname{sgn}\Big(\sum_{j \neq i} w_{ij} \, x_j^q\Big)$$

Examining stability of the q-th pattern
$$net_i = \sum_{j \neq i} w_{ij} \, x_j^q = \frac{1}{n-1} \sum_{j \neq i} \sum_{p=1}^{k} x_i^p x_j^p x_j^q = x_i^q + \frac{1}{n-1} \sum_{j \neq i} \sum_{p \neq q} x_i^p x_j^p x_j^q$$
The first term is the signal; the second is the crosstalk from the other $k-1$ stored patterns.

The crosstalk term is small when $k \ll n$; the sign of $net_i$ then remains that of $x_i^q$, so the stored pattern stays stable.

Stability for k memory elements
– Condition for patterns to be stable on a Hopfield net with n neurons: $k \ll n$
– The storage capacity of a Hopfield net is therefore very small
– Hence it is not a practical memory element
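A rough experiment, under the assumption of random bipolar patterns with n = 100 (not from the lecture), that illustrates why $k \ll n$ is needed: as k grows, fewer of the stored patterns remain stable:

```python
import numpy as np

# As k grows relative to n, fewer stored patterns survive a one-step update.
rng = np.random.default_rng(0)
n = 100
for k in (2, 5, 10, 20, 40):
    P = rng.choice([-1, 1], size=(k, n))   # k random bipolar patterns
    W = (P.T @ P) / (n - 1)                # generalized Hopfield rule
    np.fill_diagonal(W, 0.0)
    stable = sum(np.array_equal(np.where(W @ p >= 0, 1, -1), p) for p in P)
    print(f"k={k:3d}: {stable}/{k} patterns stable")
```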

Hopfield Net: Computational Complexity
The Hopfield net is an $O(n^2)$ algorithm, since it has to reach stability in $O(n^2)$ steps.

Hopfield Net
Consider the energy expression
$$E = -\sum_{j} \sum_{i < j} w_{ij} \, x_i x_j$$
E has $n(n-1)/2$ terms, one per unordered pair of neurons. Nature of each term:
– $w_{ij}$ is a real number
– $x_i$ and $x_j$ are each +1 or -1
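In code this energy is a quadratic form; a minimal sketch (the factor 1/2 compensates for each unordered pair appearing twice in $x^\top W x$):

```python
import numpy as np

# E = -sum_{i<j} w_ij x_i x_j for symmetric W with zero diagonal.
def energy(W, x):
    return -0.5 * x @ W @ x   # 1/2: x^T W x counts every pair twice
```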

No. of steps taken to reach stability
$$E_{gap} = E_{high} - E_{low}$$
(Diagram on the slide: an energy axis with $E_{high}$ at the top, $E_{low}$ at the bottom, and $E_{gap}$ spanning them.)

Analysis of the weights and consequent E_high and E_low
Let $w_{ij}$ be any weight, with upper and lower bounds $w_{max}$ and $w_{min}$ respectively:
$$w_{min} \leq w_{ij} \leq w_{max}, \qquad w_{max} > w_{min}$$
Case 1: $w_{max} > 0$, $w_{min} > 0$
$$E_{high} = \tfrac{1}{2} \, w_{max} \, n(n-1), \qquad E_{low} = -\tfrac{1}{2} \, w_{max} \, n(n-1)$$

Continuation of analysis of E_high and E_low
Case 2: $w_{min} < 0$, $w_{max} < 0$
$$E_{high} = \tfrac{1}{2} \, |w_{min}| \, n(n-1), \qquad E_{low} = -\tfrac{1}{2} \, |w_{min}| \, n(n-1)$$
(here $|w_{min}| \geq |w_{max}|$, so $w_{min}$ bounds the magnitude of every term)
Case 3: $w_{max} > 0$, $w_{min} < 0$
$$E_{high} = \tfrac{1}{2} \max(|w_{max}|, |w_{min}|) \, n(n-1), \qquad E_{low} = -\tfrac{1}{2} \max(|w_{max}|, |w_{min}|) \, n(n-1)$$

The energy gap
In general,
$$E_{gap} = E_{high} - E_{low} = \max(|w_{max}|, |w_{min}|) \, n(n-1)$$
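A quick sanity check of this bound on random symmetric weights (a sketch; the parameters are arbitrary):

```python
import numpy as np

# Every state's energy lies within +/- (1/2) max(|w_max|,|w_min|) n(n-1).
rng = np.random.default_rng(3)
n = 10
W = rng.uniform(-1, 1, size=(n, n))
W = (W + W.T) / 2                  # symmetric weights
np.fill_diagonal(W, 0.0)
bound = 0.5 * max(abs(W.max()), abs(W.min())) * n * (n - 1)
for _ in range(1000):
    x = rng.choice([-1, 1], size=n)
    assert abs(-0.5 * x @ W @ x) <= bound
```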

To find ΔE_min
$$\Delta E_p = (x_p^{initial} - x_p^{final}) \cdot net_p$$
where $\Delta E_p$ is the change in energy due to the $p$-th neuron changing activation.
$$|\Delta E_p| = |(x_p^{initial} - x_p^{final}) \cdot net_p| = 2 \, |net_p|$$
(since $x_p$ flips between +1 and -1), where
$$net_p = \sum_{j=1,\, j \neq p}^{n} w_{pj} \, x_j$$
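A numerical check of the identity $|\Delta E_p| = 2|net_p|$, using a random symmetric weight matrix (all names and parameters are illustrative):

```python
import numpy as np

# Flipping neuron p changes E by (x_p_initial - x_p_final) * net_p = 2 x_p net_p.
rng = np.random.default_rng(1)
n = 8
W = rng.normal(size=(n, n))
W = (W + W.T) / 2                  # symmetric weights
np.fill_diagonal(W, 0.0)           # no self-connections
x = rng.choice([-1, 1], size=n)

energy = lambda W, x: -0.5 * x @ W @ x

p = 3                              # flip an arbitrary neuron
net_p = W[p] @ x
x2 = x.copy()
x2[p] = -x[p]
dE = energy(W, x2) - energy(W, x)
assert np.isclose(abs(dE), 2 * abs(net_p))
```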

To find ΔE_min
$\big[\sum_{j=1,\, j \neq p}^{n} w_{pj} x_j\big]_{min}$ is determined by the precision of the machine computing it. For example, assuming 7 places after the decimal point, the magnitude cannot be lower than $10^{-7}$ [it can be 0, but that is not of concern to us, since the net will then continue in the same state].
Thus $\Delta E_{min}$ is a constant independent of n, determined by the precision of the machine.

Final observation: O(n^2)
The minimum energy change $\Delta E_{min}$ is a constant independent of n. Hence, in the worst case, the number of steps taken to cover the energy gap is at most
$$\frac{\max(|w_{max}|, |w_{min}|) \, n(n-1)}{\Delta E_{min}}$$
Thus stability has to be attained in $O(n^2)$ steps.
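An illustrative run, assuming random symmetric weights (not from the lecture), that counts asynchronous flips until the net settles; the count stays well below $n^2$ in practice:

```python
import numpy as np

# Count single-neuron flips until no neuron wants to change. Termination is
# guaranteed: each flip does not increase E and strictly lowers it when
# net_p != 0, and E is bounded below.
rng = np.random.default_rng(2)
n = 50
W = rng.normal(size=(n, n))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)
x = rng.choice([-1, 1], size=n)

flips, changed = 0, True
while changed:
    changed = False
    for p in range(n):
        desired = 1 if W[p] @ x >= 0 else -1
        if desired != x[p]:
            x[p] = desired
            flips += 1
            changed = True
print(f"stable after {flips} flips (n^2 = {n * n})")
```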

Hopfield Net for Optimization
Optimization problem
– Maximizes or minimizes a quantity
Hopfield net used for optimization
– Hopfield net and the Traveling Salesman Problem
– Hopfield net and the Job Scheduling Problem

The essential idea of the correspondence
In optimization problems, we have to minimize a quantity. The Hopfield net minimizes its energy. This is the correspondence.