Neural Network Computing: Lecture No. 1
McCulloch-Pitts Neuron
The activation of a McCulloch-Pitts neuron is binary: the unit either fires (1) or does not fire (0). McCulloch-Pitts neurons are connected by directed, weighted paths, and each neuron has a fixed threshold: the neuron fires when its weighted input sum reaches the threshold.
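As a minimal illustration, such a unit can be sketched in Python (the function name mp_neuron and the >= firing rule are choices made for this sketch, not notation from the lecture):

    def mp_neuron(inputs, weights, threshold):
        # McCulloch-Pitts unit: binary activation, fixed threshold.
        # Fires (returns 1) iff the weighted input sum reaches the threshold.
        net = sum(x * w for x, w in zip(inputs, weights))
        return 1 if net >= threshold else 0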
Architecture
Theorem
Any function or phenomenon that can be represented as a logic function can be modeled by such neurons. In the first step we show that a single neuron can compute simple logic functions such as AND, OR, and NOT. In the second step we use these simple neurons as building blocks for arbitrary logic functions. (Recall that every logic function is representable in DNF.)
AND
With weights w1 = w2 = 1 and threshold 1.5:
0*1 + 0*1 = 0, 0 < 1.5 → output 0
0*1 + 1*1 = 1, 1 < 1.5 → output 0
1*1 + 0*1 = 1, 1 < 1.5 → output 0
1*1 + 1*1 = 2, 2 > 1.5 → output 1
OR
With weights w1 = w2 = 1 and threshold 0.9:
0*1 + 0*1 = 0, 0 < 0.9 → output 0
0*1 + 1*1 = 1, 1 > 0.9 → output 1
1*1 + 0*1 = 1, 1 > 0.9 → output 1
1*1 + 1*1 = 2, 2 > 0.9 → output 1
NOT
With weight -1 and threshold -0.5:
1*(-1) = -1, -1 < -0.5 → output 0
0*(-1) = 0, 0 > -0.5 → output 1
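A quick check of all three gates with the thresholds from the slides (a sketch; the helper name fires is mine and mirrors the unit defined earlier):

    def fires(inputs, weights, threshold):
        # Binary threshold unit: 1 iff the weighted sum reaches the threshold.
        return 1 if sum(x * w for x, w in zip(inputs, weights)) >= threshold else 0

    for x1 in (0, 1):
        for x2 in (0, 1):
            a = fires((x1, x2), (1, 1), 1.5)   # AND: fires only on (1, 1)
            o = fires((x1, x2), (1, 1), 0.9)   # OR: fires unless (0, 0)
            print(x1, x2, "AND:", a, "OR:", o)
    for x in (0, 1):
        print("NOT", x, "=", fires((x,), (-1,), -0.5))  # weight -1, threshold -0.5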
DNF
DNF (disjunctive normal form): every logic function can be written as an OR of ANDs of literals, where a literal is a variable or its negation:
f(x1, ..., xn) = (l11 ∧ ... ∧ l1k) ∨ ... ∨ (lm1 ∧ ... ∧ lmk)
For example, XOR(x1, x2) = (x1 ∧ ¬x2) ∨ (¬x1 ∧ x2).
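To illustrate the building-block step, the units above can be composed into XOR directly from its DNF (a sketch; the gate wiring and helper names are mine):

    def fires(inputs, weights, threshold):
        return 1 if sum(x * w for x, w in zip(inputs, weights)) >= threshold else 0

    def xor(x1, x2):
        # DNF: (x1 AND NOT x2) OR (NOT x1 AND x2), each gate one MP unit.
        n1 = fires((x1,), (-1,), -0.5)       # NOT x1
        n2 = fires((x2,), (-1,), -0.5)       # NOT x2
        t1 = fires((x1, n2), (1, 1), 1.5)    # x1 AND NOT x2
        t2 = fires((n1, x2), (1, 1), 1.5)    # NOT x1 AND x2
        return fires((t1, t2), (1, 1), 0.9)  # t1 OR t2

    assert [xor(a, b) for a, b in ((0, 0), (0, 1), (1, 0), (1, 1))] == [0, 1, 1, 0]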
Biases and Thresholds
We can replace the threshold with a bias: a neuron with threshold θ fires when w·x ≥ θ, which is the same as firing when w·x + b ≥ 0 with b = -θ. A bias therefore acts exactly like a weight on a connection from a unit whose activation is always 1.
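The equivalence is easy to check numerically; a sketch for the AND unit (threshold 1.5 versus bias -1.5 on a constant-1 input):

    def fires(inputs, weights, threshold):
        return 1 if sum(x * w for x, w in zip(inputs, weights)) >= threshold else 0

    for x in ((0, 0), (0, 1), (1, 0), (1, 1)):
        with_threshold = fires(x, (1, 1), 1.5)           # threshold form
        with_bias = fires(x + (1,), (1, 1, -1.5), 0.0)   # bias as extra weight
        assert with_threshold == with_bias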
Perceptron
Loop: take an example and apply it to the network. If the answer is correct, return to Loop. If incorrect, go to Fix.
Fix: adjust the network weights using the input example. Go to Loop.
Perceptron Algorithm
Let w be arbitrary.
Choose: choose an example x ∈ F⁺ ∪ F⁻.
Test: if x ∈ F⁺ and w·x > 0, go to Choose.
If x ∈ F⁺ and w·x ≤ 0, go to Fix plus.
If x ∈ F⁻ and w·x < 0, go to Choose.
If x ∈ F⁻ and w·x ≥ 0, go to Fix minus.
Fix plus: w := w + x; go to Choose.
Fix minus: w := w - x; go to Choose.
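A sketch of the algorithm in Python (F_plus and F_minus are my names for the positive and negative example sets, given as tuples; a fixed step budget replaces the open-ended loop):

    import random

    def perceptron(F_plus, F_minus, steps=10000):
        # Choose / Test / Fix plus / Fix minus, as on the slide.
        w = [0.0] * len((F_plus + F_minus)[0])           # "let w be arbitrary"
        for _ in range(steps):
            x = random.choice(F_plus + F_minus)          # Choose
            dot = sum(wi * xi for wi, xi in zip(w, x))   # Test
            if x in F_plus and dot <= 0:                 # Fix plus: w := w + x
                w = [wi + xi for wi, xi in zip(w, x)]
            elif x in F_minus and dot >= 0:              # Fix minus: w := w - x
                w = [wi - xi for wi, xi in zip(w, x)]
        return w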
Perceptron Algorithm
Conditions for the algorithm's convergence:
Condition 1: the examples are linearly separable with a margin, i.e. there exist a unit vector w* and δ > 0 such that w*·x ≥ δ for every x ∈ F.
Condition 2: the examples are bounded; we choose F to be a set of unit vectors.
Geometric viewpoint
Perceptron Algorithm
Based on these conditions, the number of times we enter Fix (the number of weight changes) is finite.
Proof: (Figure: the world of examples, divided into positive examples and negative examples.)
Perceptron Algorithm - Proof
We replace the threshold with a bias, and we assume F is a set of unit vectors.
Perceptron Algorithm - Proof
We simplify what must be proved by eliminating the negative examples: each negative example is replaced by its negation, added to the positive examples. A single test, w·x > 0 for every example, then suffices.
Perceptron Algorithm - Proof
The numerator: consider the quantity w*·w_n / |w_n|. Each fix replaces w by w + x, so w*·w grows by w*·x ≥ δ. After n changes (starting from w_0 = 0): w*·w_n ≥ nδ.
Perceptron Algorithm - Proof
The denominator: a fix happens only when w·x ≤ 0, so |w + x|² = |w|² + 2w·x + |x|² ≤ |w|² + 1, since |x| = 1. After n changes: |w_n|² ≤ n.
Perceptron Algorithm - Proof
From the numerator: w*·w_n ≥ nδ. From the denominator: |w_n| ≤ √n. Since w* is a unit vector, w*·w_n ≤ |w_n|, so nδ ≤ √n, giving n ≤ 1/δ². Hence n is finite.
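A small numeric check of the bound (a sketch; the data, w* = (1, 0), and δ = 0.5 are constructed by hand for this illustration):

    import math, random

    # After the reduction, all examples are positive unit vectors satisfying
    # w* . x >= delta for w* = (1, 0): angles within 60 degrees of w*.
    delta = 0.5
    examples = [(math.cos(a), math.sin(a)) for a in (-1.0, -0.5, 0.0, 0.5, 1.0)]

    w, fixes = [0.0, 0.0], 0
    for _ in range(10000):
        x = random.choice(examples)
        if sum(wi * xi for wi, xi in zip(w, x)) <= 0:    # wrong: Fix
            w = [wi + xi for wi, xi in zip(w, x)]
            fixes += 1
    print("fixes:", fixes, "bound 1/delta^2:", 1 / delta ** 2)  # fixes <= 4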
Example - AND
Train the perceptron on the AND function, treating the bias as an extra weight from a constant-1 input. Run through the examples; whenever the answer is wrong, adjust the weights by the example, and continue until every example is answered correctly.
AND - Bipolar Solution
With bipolar coding (inputs and targets in {-1, +1} instead of {0, 1}), the same procedure is run: after a wrong answer the weights are fixed, training continues, and the run quickly ends in success, with all four examples classified correctly.
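A sketch of the bipolar run (zero initial weights and learning rate 1 are my choices):

    # Perceptron on AND with bipolar coding: inputs and targets in {-1, +1}.
    data = [((-1, -1), -1), ((-1, 1), -1), ((1, -1), -1), ((1, 1), 1)]
    w, b = [0.0, 0.0], 0.0
    for epoch in range(10):
        wrong = 0
        for x, t in data:
            y = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else -1
            if y != t:                        # wrong: fix weights toward target
                w = [wi + t * xi for wi, xi in zip(w, x)]
                b += t
                wrong += 1
        if wrong == 0:                        # success: all four correct
            print("success after", epoch, "epochs; w =", w, "b =", b)
            break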
Problem
For XOR the threshold should be small enough that w1 > θ and w2 > θ (so that (1, 0) and (0, 1) fire), yet (1, 1) must not fire, so w1 + w2 ≤ θ. Since also θ ≥ 0 (so that (0, 0) does not fire), w1 + w2 > 2θ ≥ θ. Contradiction!!!
Linear Separation
Every perceptron determines a classification of its vector inputs by a hyperplane: inputs on one side are accepted, inputs on the other rejected. Two-dimensional examples: the boundary is the line w1x1 + w2x2 = θ; OR and AND are linearly separable, while XOR is not possible.
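For contrast, the same training loop never settles on XOR; a sketch (bipolar targets and a fixed epoch budget chosen for the illustration):

    # Perceptron on XOR: no line separates {(0,1), (1,0)} from {(0,0), (1,1)}.
    data = [((0, 0), -1), ((0, 1), 1), ((1, 0), 1), ((1, 1), -1)]
    w, b = [0.0, 0.0], 0.0
    for epoch in range(100):
        wrong = 0
        for x, t in data:
            y = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else -1
            if y != t:
                w = [wi + t * xi for wi, xi in zip(w, x)]
                b += t
                wrong += 1
        if wrong == 0:
            break
    print("errors in final epoch:", wrong)    # stays above 0: not separable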
Linear Separation in Higher Dimensions
In higher dimensions the classification is still given by linear separation, but it is hard to tell by inspection which predicates are separable. Example: of the predicates Connected and Convex, convexity can be handled by a perceptron with local sensors, while connectedness cannot. (Note: local sensors still need to be defined.)