
1 COMP305. Part I. Artificial neural networks.

2 The McCulloch-Pitts Neuron (1943). McCulloch and Pitts demonstrated that “…because of the all-or-none character of nervous activity, neural events and the relations among them can be treated by means of the propositional logic”.

3 The McCulloch-Pitts Neuron – Thus, the McCulloch-Pitts neuron operates on a discrete time scale, t = 0, 1, 2, 3, …, i.e. it is a discrete-time machine.

4 The McCulloch-Pitts Neuron – The input value a_i^t from the i-th presynaptic neuron at any instant t may be equal to either 0 or 1 only; the neuron is a binary unit operating in discrete time.

5 The McCulloch-Pitts Neuron – The weights of the connections w_i are +1 for an excitatory connection and -1 for an inhibitory connection.

6 The McCulloch-Pitts Neuron – There is an excitation threshold θ associated with the neuron.

7 The McCulloch-Pitts Neuron. The output X^{t+1} of the neuron at the following instant t+1 is defined according to the firing rule shown below.
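The rule itself appeared on the slide only as an image; the following reconstruction is consistent with the worked examples and the absolute-inhibition remark on slide 9, so treat the exact form as an assumption:

X^{t+1} = \begin{cases} 1, & \text{if } S^t \ge \theta \text{ and no inhibitory input is active at instant } t, \\ 0, & \text{otherwise,} \end{cases}

where S^t is the instant total input defined on the next slide.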

8 The McCulloch-Pitts Neuron. In the MP neuron, we shall call the instant total input S^t the instant state of the neuron.
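The defining formula for S^t was also shown as an image; a reconstruction consistent with the arithmetic in the example slides (e.g. S = 0×(-1) + 0×1 + 1×1) is:

S^t = \sum_{i} w_i a_i^t ,

i.e. the weighted sum of the instantaneous inputs.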

9 The McCulloch-Pitts Neuron. The absolute-inhibition condition in the rule means that activity of a single inhibitory input, i.e. an input via a connection with negative weight w_i = -1, absolutely prevents excitation of the neuron at that instant.

10 Activation function. The output X^{t+1} is a function of the state S^t of the neuron; therefore it may also be written as a function of discrete time, X^{t+1} = g(S^t), where g is the threshold activation function.
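The slide's definition of g did not survive extraction; a standard threshold form consistent with the rule reconstructed above (again an assumption as to the exact notation) is:

g(S^t) = \begin{cases} 1, & S^t \ge \theta, \\ 0, & S^t < \theta, \end{cases}

with the additional convention that an active inhibitory input forces the output to 0 regardless of S^t.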

11 MP-neuron example.

12 MP-neuron example. Input 1. 1) a_1 = 0, a_2 = 0, a_3 = 1. 2) All inhibitory connections are silent. 3) S = 0×(-1) + 0×1 + 1×1 = 1 < θ. 4) S < θ => X = 0.

13 MP-neuron example. Input 2. 1) a_1 = 0, a_2 = 1, a_3 = 1. 2) All inhibitory connections are silent. 3) S = 0×(-1) + 1×1 + 1×1 = 2 = θ. 4) S = θ => X = 1.

14 MP-neuron example. Input 3. 1) a_1 = 1, a_2 = 1, a_3 = 1. 2) An inhibitory connection is activated. 3) X = 0.
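A minimal Python sketch of the example neuron, assuming (as the worked inputs suggest) one inhibitory input with weight -1, two excitatory inputs with weight +1, and threshold θ = 2; the function name and structure are illustrative only:

def mp_neuron(inputs, weights, theta):
    """McCulloch-Pitts unit: binary inputs, weights of +1/-1, absolute inhibition."""
    # Absolute inhibition: any active input on a negative-weight connection blocks firing.
    if any(a == 1 and w < 0 for a, w in zip(inputs, weights)):
        return 0
    # Instant state: weighted sum of the instantaneous inputs.
    s = sum(w * a for w, a in zip(weights, inputs))
    return 1 if s >= theta else 0

weights = [-1, 1, 1]   # a1 inhibitory, a2 and a3 excitatory (assumed from the slides)
theta = 2              # excitation threshold assumed from the worked examples

print(mp_neuron([0, 0, 1], weights, theta))  # Input 1: S = 1 < theta -> 0
print(mp_neuron([0, 1, 1], weights, theta))  # Input 2: S = 2 = theta -> 1
print(mp_neuron([1, 1, 1], weights, theta))  # Input 3: inhibition    -> 0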

15 MP-neuron as a binary unit. Simple logical functions can be implemented directly with a single McCulloch-Pitts unit. The output value 1 can be associated with the logical value true and 0 with the logical value false. Now, let us demonstrate how weights and thresholds can be set to yield neurons which realise the logical functions AND, OR and NOT.

16 MP-neuron logic. “AND”: the output fires if a_1 and a_2 both fire. (Diagram: inputs a_1 and a_2 feeding an “AND” unit.)

17 MP-neuron logic. “OR”: the output fires if a_1 or a_2 or both fire. (Diagram: inputs a_1 and a_2 feeding an “OR” unit.)

18 MP-neuron logic. “NOT”: the output fires if a_1 does NOT fire, and vice versa. (Truth table: a_1 = 1 => X = 0; a_1 = 0 => X = 1.)
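The weight and threshold settings for the three gates appeared only in the slide diagrams; the values below are the standard McCulloch-Pitts choices and should be read as an assumption (reusing the hypothetical mp_neuron function from the example above):

# AND: two excitatory inputs, fires only when both are active (assumed theta = 2).
AND = lambda a1, a2: mp_neuron([a1, a2], weights=[1, 1], theta=2)

# OR: two excitatory inputs, fires when at least one is active (assumed theta = 1).
OR = lambda a1, a2: mp_neuron([a1, a2], weights=[1, 1], theta=1)

# NOT: one inhibitory input; with theta = 0 the unit fires by default,
# and absolute inhibition silences it when the input is active.
NOT = lambda a1: mp_neuron([a1], weights=[-1], theta=0)

for a1 in (0, 1):
    for a2 in (0, 1):
        print(a1, a2, AND(a1, a2), OR(a1, a2))
    print(a1, NOT(a1))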