
Neural Network Computing, Lecture No. 1 (L. Manevitz, all rights reserved)

McCulloch-Pitts Neuron. The activation of a McCulloch-Pitts neuron is binary: it either fires (output 1) or stays silent (output 0). Neurons are connected by directed, weighted paths, and each neuron has a fixed threshold: it fires exactly when the weighted sum of its inputs reaches the threshold.
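As a concrete reference for the slides that follow, here is a minimal Python sketch of such a neuron; the name mp_neuron and the "fire when the sum reaches the threshold" convention are illustrative assumptions, not taken from the slides.

def mp_neuron(inputs, weights, theta):
    # Fire (output 1) if the weighted input sum reaches the threshold theta.
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net >= theta else 0

For example, mp_neuron([1, 1], [1, 1], 1.5) returns 1, matching the last row of the AND table two slides below.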

Architecture

Theorem. We can model any function or phenomenon that can be represented as a logic function.
- First step: we show that a single neuron can perform the simple logic functions AND, OR, and NOT.
- Second step: we use these simple neurons as building blocks. (Recall that every logic function is representable in DNF form.)

AND (weights w1 = w2 = 1, threshold 1.5):
0*1 + 0*1 = 0, 0 < 1.5 → output 0
0*1 + 1*1 = 1, 1 < 1.5 → output 0
1*1 + 0*1 = 1, 1 < 1.5 → output 0
1*1 + 1*1 = 2, 2 > 1.5 → output 1

OR (weights w1 = w2 = 1, threshold 0.9):
0*1 + 0*1 = 0, 0 < 0.9 → output 0
0*1 + 1*1 = 1, 1 > 0.9 → output 1
1*1 + 0*1 = 1, 1 > 0.9 → output 1
1*1 + 1*1 = 2, 2 > 0.9 → output 1

NOT (weight −1, threshold −0.5):
1 * (−1) = −1, −1 < −0.5 → output 0
0 * (−1) = 0, 0 > −0.5 → output 1
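The three tables above can be checked mechanically. A quick verification, reusing the mp_neuron sketch from the first slide, with exactly the weights and thresholds given on the slides:

# Print x1, x2 followed by AND, OR, and NOT x1, using the slide parameters.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2,
              mp_neuron([x1, x2], [1, 1], 1.5),   # AND, threshold 1.5
              mp_neuron([x1, x2], [1, 1], 0.9),   # OR, threshold 0.9
              mp_neuron([x1], [-1], -0.5))        # NOT x1, threshold -0.5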

DNF. Recall disjunctive normal form: every logic function can be written as an OR of ANDs of inputs and negated inputs. For example, XOR(x1, x2) = (x1 AND NOT x2) OR (NOT x1 AND x2).
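To illustrate the building-block step, here is a sketch (again assuming the mp_neuron helper above; the name mp_xor is illustrative) that wires the DNF of XOR out of the AND, OR, and NOT units from the previous slides:

def mp_xor(x1, x2):
    # Layer 1: the two conjuncts of the DNF.
    not_x1 = mp_neuron([x1], [-1], -0.5)
    not_x2 = mp_neuron([x2], [-1], -0.5)
    a = mp_neuron([x1, not_x2], [1, 1], 1.5)   # x1 AND NOT x2
    b = mp_neuron([not_x1, x2], [1, 1], 1.5)   # NOT x1 AND x2
    # Layer 2: OR the conjuncts together.
    return mp_neuron([a, b], [1, 1], 0.9)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, mp_xor(x1, x2))   # prints the XOR truth table

Note that this needs two layers; a later slide shows why a single neuron cannot do it.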

Biases and Thresholds. We can replace the threshold with a bias: a bias acts exactly like a weight on a connection from a unit whose activation is always 1. Firing when w·x > θ is the same as firing when w·x + b > 0 with bias b = −θ.
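Concretely, the AND gate above can be rewritten with a bias in place of its threshold: append a constant input of 1 with weight −1.5 and compare against 0. A quick check with the mp_neuron sketch:

for x1 in (0, 1):
    for x2 in (0, 1):
        # Inputs (x1, x2, 1); the third weight, -1.5, plays the role of -theta.
        print(x1, x2, mp_neuron([x1, x2, 1], [1, 1, -1.5], 0))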

Perceptron (training scheme).
Loop: take an example and apply it to the network. If the answer is correct, return to Loop. If incorrect, go to Fix.
Fix: adjust the network weights using the input example. Go to Loop.

Perceptron Algorithm. Let w be arbitrary.
Choose: pick an example x from F+ ∪ F−.
Test: if x ∈ F+ and w·x > 0, go to Choose;
      if x ∈ F+ and w·x ≤ 0, go to Fix-plus;
      if x ∈ F− and w·x < 0, go to Choose;
      if x ∈ F− and w·x ≥ 0, go to Fix-minus.
Fix-plus: w ← w + x; go to Choose.
Fix-minus: w ← w − x; go to Choose.
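A runnable sketch of this algorithm in Python, assuming the positive and negative example sets are given as lists of vectors; the names perceptron_train, f_plus, f_minus are illustrative:

import random

def perceptron_train(f_plus, f_minus, max_steps=10_000):
    examples = [(x, +1) for x in f_plus] + [(x, -1) for x in f_minus]
    w = [0.0] * len(examples[0][0])               # "let w be arbitrary"
    for _ in range(max_steps):
        # Test: collect the examples that currently fail.
        bad = [(x, s) for x, s in examples
               if s * sum(wi * xi for wi, xi in zip(w, x)) <= 0]
        if not bad:
            return w                              # every example passes Test
        x, s = random.choice(bad)                 # Choose
        w = [wi + s * xi for wi, xi in zip(w, x)] # Fix-plus / Fix-minus
    return w

For x ∈ F− the failing condition s·(w·x) ≤ 0 is exactly w·x ≥ 0, and adding s·x subtracts x, so the one-line update collapses Fix-plus and Fix-minus.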

Perceptron Algorithm. Conditions for the algorithm's convergence:
Condition 1: the examples are linearly separable with a margin, i.e., there exist a unit vector w* and a δ > 0 such that w*·x > δ for every x ∈ F.
Condition 2: we choose F to be a set of unit vectors.

Geometric viewpoint

Perceptron Algorithm. Based on these conditions, the number of times we enter Fix is finite. Proof sketch follows. [Slide figure: the world of examples, split into positive and negative examples.]

Perceptron Algorithm-Proof. We replace the threshold with a bias, and we assume F is a set of unit vectors.

Perceptron Algorithm-Proof. We reduce what we have to prove by eliminating the negative examples: replace each x ∈ F− by its negation −x and place it among the positive examples. This is sound because w·x ≥ 0 holds exactly when w·(−x) ≤ 0, so Fix-minus on x is the same as Fix-plus on −x. From now on, all examples are positive.

Perceptron Algorithm-Proof. The numerator: each Fix step replaces w by w + x with w*·x > δ, so starting from w = 0, after n changes:
w·w* > n·δ.

Perceptron Algorithm-Proof. The denominator: a Fix step occurs only when w·x ≤ 0, and |x| = 1, so each change gives
|w + x|² = |w|² + 2·w·x + |x|² ≤ |w|² + 1.
After n changes: |w|² ≤ n.

Perceptron Algorithm-Proof. Combining the two bounds: the cosine of the angle between w and w* (recall |w*| = 1) satisfies
cos θ = (w·w*) / |w| > n·δ / √n = √n·δ.
Since a cosine is at most 1, √n·δ ≤ 1, so n ≤ 1/δ²: n is finite.
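A small numerical sanity check of this bound, under the stated assumptions (unit-length positive examples with margin δ against a unit vector w*); everything here is illustrative scaffolding, not from the slides:

import math, random

random.seed(0)
w_star, delta = (1.0, 0.0), 0.2
examples = []
while len(examples) < 50:                     # unit vectors with margin delta
    ang = random.uniform(0, 2 * math.pi)
    x = (math.cos(ang), math.sin(ang))
    if x[0] * w_star[0] + x[1] * w_star[1] > delta:
        examples.append(x)

w, fixes = [0.0, 0.0], 0
while True:
    bad = [x for x in examples if w[0] * x[0] + w[1] * x[1] <= 0]
    if not bad:
        break
    x = random.choice(bad)
    w = [w[0] + x[0], w[1] + x[1]]            # Fix step
    fixes += 1

print(fixes, "<=", 1 / delta ** 2)            # the proof's bound: n <= 1/delta^2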

Example: AND. Train the perceptron on the AND function with a bias input fixed at 1, starting from arbitrary weights; each wrong answer triggers a Fix step, and so on until all four cases are right. [The step-by-step weight table on the slide is not recoverable from the transcript.]

AND, bipolar solution. The same run with bipolar encoding (inputs and targets in {−1, +1}): a few steps are wrong, training continues, and the run ends in success. [Step-by-step table not recoverable from the transcript.]
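A runnable stand-in for the kind of run these two slides tabulate, reusing the perceptron_train sketch above with a bias coordinate appended; the bipolar encoding is an assumed reading of "Bi Polar":

# AND in bipolar encoding, bias input fixed at 1 as the third coordinate.
f_plus  = [(1, 1, 1)]                              # the only true case
f_minus = [(-1, -1, 1), (-1, 1, 1), (1, -1, 1)]    # the three false cases
w = perceptron_train(f_plus, f_minus)
print("learned (w1, w2, bias):", w)
for x in f_plus + f_minus:
    net = sum(wi * xi for wi, xi in zip(w, x))
    print(x, "->", 1 if net > 0 else 0)            # reproduces the AND table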

Problem: XOR. Suppose a single perceptron computed XOR, with weights w1, w2 and threshold θ. Then:
from input (0,0), output 0: 0 ≤ θ;
from (1,0), output 1: w1 > θ;
from (0,1), output 1: w2 > θ;
from (1,1), output 0: w1 + w2 should be small enough so that w1 + w2 ≤ θ.
But the middle two lines give w1 + w2 > 2θ ≥ θ, a contradiction!!!

Linear Separation. Every perceptron determines a classification of its vector inputs, given by a hyperplane: inputs on one side are accepted, inputs on the other rejected. Two-dimensional examples: OR and AND are linearly separable; XOR is not possible, since no line separates {(0,1), (1,0)} from {(0,0), (1,1)}.
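The two-dimensional claims can be checked by brute force. The sketch below searches a grid of candidate weights and thresholds for a single neuron computing each gate; finding none for XOR is consistent with (though of course weaker than) the contradiction argument above. All names are illustrative:

import itertools

def computes(target, w1, w2, theta):
    return all((x1 * w1 + x2 * w2 > theta) == target(x1, x2)
               for x1 in (0, 1) for x2 in (0, 1))

grid = [k / 2 for k in range(-6, 7)]               # -3.0, -2.5, ..., 3.0
for name, f in [("AND", lambda a, b: bool(a and b)),
                ("OR",  lambda a, b: bool(a or b)),
                ("XOR", lambda a, b: a != b)]:
    found = any(computes(f, *p) for p in itertools.product(grid, repeat=3))
    print(name, "linearly separable on this grid:", found)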

Linear Separation in Higher Dimensions. In higher dimensions the perceptron still computes a linear separation, but it is hard to tell by inspection which predicates are separable. Example: "convex" versus "connected" figures: convexity can be handled by a perceptron with local sensors; connectedness cannot. (Note: define local sensors.)