COMP305. Part I. Artificial Neural Networks.

Topic 3. Learning Rules of the Artificial Neural Networks.

Hebb’s rule (1949). Hebb conjectured that a particular type of use-dependent modification of the connection strength of synapses might underlie learning in the nervous system.

Hebb’s rule (1949). Hebb introduced a neurophysiological postulate: “…When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells, such that A’s efficiency, as one of the cells firing B, is increased.”

Hebb’s rule (1949). The simplest formalisation of Hebb’s rule is to increase the weight of a connection at each successive instant:

$$w_{ji}^{k+1} = w_{ji}^{k} + \Delta w_{ji}^{k}, \quad (1)$$

where

$$\Delta w_{ji}^{k} = C \, a_{i}^{k} \, X_{j}^{k}. \quad (2)$$

Hebb’s rule (1949). In equations (1) and (2), $w_{ji}^{k}$ is the weight of the connection at instant $k$; $w_{ji}^{k+1}$ is the weight of the connection at the following instant $k+1$; $\Delta w_{ji}^{k}$ is the increment by which the weight of the connection is enlarged; $C$ is a positive coefficient which determines the learning rate; $a_{i}^{k}$ is the input value from the presynaptic neuron at instant $k$; and $X_{j}^{k}$ is the output of the postsynaptic neuron at the same instant $k$.

Hebb’s rule (1949). Thus, the weight of a connection changes at the next instant only if the input via this connection and the resulting output are both non-zero at instant $k$.

Hebb’s rule (1949). Equation (2) emphasises the correlational nature of a Hebbian synapse; it is sometimes referred to as the activity product rule.
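As a minimal sketch of equations (1) and (2), assuming a single threshold neuron with binary output (the function and variable names here are illustrative, not from the lecture):

```python
def neuron_output(weights, inputs, theta=1.0):
    """Threshold activation: X = 1 if the weighted input sum reaches theta, else 0."""
    return 1 if sum(w * a for w, a in zip(weights, inputs)) >= theta else 0

def hebb_update(weights, inputs, X, C=1.0):
    """Activity product rule (2): each weight grows by C * a_i * X (it never decreases)."""
    return [w + C * a * X for w, a in zip(weights, inputs)]
```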

Hebb’s rule (1949). Hebb’s original learning rule (2) referred exclusively to excitatory synapses, and it has the unfortunate property that it can only increase synaptic weights. As the connections drive into saturation, the distinctive performance of different neurons in a network is washed out.
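To see this saturation numerically, here is a self-contained sketch (the pattern and parameters are illustrative) of what repeated presentation of a single pattern does under the plain rule:

```python
# Five presentations of one pattern under the plain Hebb rule (C = 1, theta = 1):
# the weights on active inputs only ever grow, driving the synapses into saturation.
weights = [1.0, 1.0, 1.0, 1.0]
pattern = [1, 0, 1, 0]
for t in range(5):
    X = 1 if sum(w * a for w, a in zip(weights, pattern)) >= 1.0 else 0  # X = 1 each time
    weights = [w + a * X for w, a in zip(weights, pattern)]              # rule (2) with C = 1
print(weights)  # [6.0, 1.0, 6.0, 1.0] -- w1 and w3 increase without bound
```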

Hebb’s rule (1949). However, when the Hebbian rule is augmented by a normalisation rule, e.g. keeping constant the total strength of the synapses upon a given neuron, it tends to “sharpen” a neuron’s predisposition “without a teacher”, causing its firing to become better and better correlated with a cluster of stimulus patterns.

Normalised Hebb’s rule. The Hebbian increment (2) is followed by a normalisation step, e.g. rescaling the weight vector of the neuron to unit Euclidean length:

$$w_{ji}^{k+1} = \frac{w_{ji}^{k} + \Delta w_{ji}^{k}}{\sqrt{\sum_{i} \left( w_{ji}^{k} + \Delta w_{ji}^{k} \right)^{2}}}. \quad (3)$$

Hebb’s rule plays an important role in studies of ANN algorithms much “younger” than the rule itself, such as unsupervised learning and self-organisation.
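As a sketch of one update instant under (2)–(3), assuming the Euclidean (unit-length) normalisation used in the worked example below (names are illustrative):

```python
import math

def normalised_hebb_update(weights, inputs, X, C=1.0):
    """Hebbian increment (2) followed by normalisation (3):
    the updated weight vector is rescaled to unit Euclidean length."""
    updated = [w + C * a * X for w, a in zip(weights, inputs)]
    norm = math.sqrt(sum(w * w for w in updated))
    return [w / norm for w in updated]
```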

Normalised Hebb in practice. Consider a single neuron with four input units a1…a4, weights w1…w4, threshold θ = 1 and binary output X (X = 1 if the weighted input sum reaches θ, else X = 0), trained with learning rate C = 1. The pattern a = (1, 0, 1, 0) is presented at every instant; after each Hebbian increment the weight vector is renormalised.

t = 0: the initial weights (1, 1, 1, 1) are normalised to (0.5, 0.5, 0.5, 0.5). The input gives a weighted sum of 1 ≥ θ, so X = 1; the increment yields (1.5, 0.5, 1.5, 0.5), normalised to (0.67, 0.22, 0.67, 0.22).

t = 1: the weighted sum is 1.34 ≥ θ, so X = 1; the update yields (1.67, 0.22, 1.67, 0.22), normalised to (0.70, 0.09, 0.70, 0.09).

t = 2: X = 1 again; (1.70, 0.09, 1.70, 0.09) is normalised to (0.71, 0.04, 0.71, 0.04).

t = 3: X = 1; (1.71, 0.04, 1.71, 0.04) is normalised to (0.71, 0.02, 0.71, 0.02).

t = 4: X = 1; (1.71, 0.02, 1.71, 0.02) is normalised to (0.71, 0.01, 0.71, 0.01). The weights have effectively converged: STOP.

[Figure: over iterations 1–5, w1 = w3 rise from 0.5 towards 0.71 ≈ 1/√2 while w2 = w4 decay from 0.5 towards 0.]

Normalised Hebb in practice. Test: presenting the complementary pattern a = (0, 1, 0, 1) gives a weighted sum of 0.02 < θ, so X = 0: “I do not know you…”. Without a teacher, the neuron has become selective for the pattern it was repeatedly exposed to.
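Putting the pieces together, here is a self-contained sketch that reproduces the trajectory above (it assumes the Euclidean normalisation and the test pattern (0, 1, 0, 1) inferred from the slides):

```python
import math

def output(weights, inputs, theta=1.0):
    # Threshold neuron: fire (1) iff the weighted input sum reaches theta.
    return 1 if sum(w * a for w, a in zip(weights, inputs)) >= theta else 0

def step(weights, inputs, C=1.0):
    # One instant: compute X, apply the Hebbian increment (2), renormalise (3).
    X = output(weights, inputs)
    updated = [w + C * a * X for w, a in zip(weights, inputs)]
    norm = math.sqrt(sum(w * w for w in updated))
    return [w / norm for w in updated]

weights = [0.5, 0.5, 0.5, 0.5]   # (1, 1, 1, 1) after the initial normalisation
pattern = [1, 0, 1, 0]
for t in range(5):
    weights = step(weights, pattern)
    print(t, [round(w, 2) for w in weights])
# 0 [0.67, 0.22, 0.67, 0.22]
# 1 [0.7, 0.09, 0.7, 0.09]
# 2 [0.71, 0.04, 0.71, 0.04]
# 3 [0.71, 0.02, 0.71, 0.02]
# 4 [0.71, 0.01, 0.71, 0.01]

print(output(weights, [0, 1, 0, 1]))  # 0 -- the neuron does not recognise this pattern
```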