Slide 1

Neural Networks in Electrical Engineering
Prof. Howard Silver
School of Computer Sciences and Engineering
Fairleigh Dickinson University, Teaneck, New Jersey
Slide 2

[Figure: Biological Neuron - soma, axon, and dendrites; axons from other neurons connect across synaptic gaps to the dendrites of this neuron]
Slide 6

Example of a Supervised Learning Algorithm - the Perceptron

Steps in applying the perceptron:
1. Initialize the weights and bias to 0.
2. Set the learning rate alpha (0 < α <= 1) and the threshold θ.
3. For each input pattern:
   - Compute, for each output, y_in = b + x1*w1 + x2*w2 + x3*w3 + ...
   - Set y = -1 for y_in < -θ, y = 0 for -θ <= y_in <= θ, y = 1 for y_in > θ.
   - If the jth output y_j is not equal to t_j, set
     w_ij(new) = w_ij(old) + α * x_i * t_j
     b_j(new) = b_j(old) + α * t_j
     (else no change in w_ij and b_j).
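The steps above can be sketched in code. The course's programs are MATLAB (e.g. nnet11i), so the following is a Python translation; the AND example and the default α and θ values are illustrative assumptions, not taken from the slides.

```python
import numpy as np

def train_perceptron(X, T, alpha=1.0, theta=0.2, epochs=100):
    """Perceptron training following the slides (alpha, theta values assumed)."""
    n_in, n_out = X.shape[1], T.shape[1]
    W = np.zeros((n_in, n_out))      # initialize the weights to 0
    b = np.zeros(n_out)              # initialize the bias to 0
    for _ in range(epochs):
        changed = False
        for x, t in zip(X, T):
            y_in = b + x @ W         # y_in = b + x1*w1 + x2*w2 + ...
            # y = -1 below -theta, 0 inside [-theta, theta], 1 above theta
            y = np.where(y_in > theta, 1, np.where(y_in < -theta, -1, 0))
            for j in range(n_out):
                if y[j] != t[j]:     # update only when output j is wrong
                    W[:, j] += alpha * x * t[j]
                    b[j] += alpha * t[j]
                    changed = True
        if not changed:              # no updates in a full pass: converged
            break
    return W, b

# Example (assumed): learn the AND function with bipolar inputs and targets.
X = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]])
T = np.array([[1], [-1], [-1], [-1]])
W, b = train_perceptron(X, T)
```

With these inputs the loop converges in two passes to W = [1, 1] and b = -1, which separates the single +1 pattern from the rest.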
Slide 7

Perceptron Applied to Character Recognition

Neural net inputs x1 to x25; binary target output string (t1 to t26).
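The 25 inputs suggest each character is presented as a flattened 5x5 pixel grid, with one target slot per letter of the alphabet. A small Python sketch of that encoding (the 5x5 grid, the bipolar coding, and the "A" bitmap are assumptions for illustration):

```python
import numpy as np

# Hypothetical 5x5 bitmap of the letter "A" (the grid size is an assumption
# consistent with the slide's 25 inputs).
A_bitmap = [
    "01110",
    "10001",
    "11111",
    "10001",
    "10001",
]
# Flatten to the 25 inputs x1..x25, using bipolar values (+1 = pixel on).
x = np.array([1 if c == "1" else -1 for row in A_bitmap for c in row])

# Target string t1..t26: one slot per letter, +1 for "A", -1 elsewhere.
t = -np.ones(26)
t[0] = 1   # index 0 corresponds to "A"
```

Each training pair (x, t) can then be fed to the perceptron update rule from the previous slide.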
Slide 13

Signal Classification with Perceptron
Slide 15

[Figure: signal classification example, SN = 5]
Slide 19

Classification of Three Sinusoids of Different Frequency

>> nnet11i
Enter frequency separation in pct. (del) 100
Number of samples per cycle (xtot) 64
Enter number of training epochs (m) 100
Final outputs after training:
100
010
001
Enter signal to noise ratio (SN) - zero to exit 1
Classification of signals embedded in noise
100
010
001

[Plots: Signals and Noisy Signals]
Slide 20

>> nnet11i
Enter frequency separation in pct. (del) 10
Number of samples per cycle (xtot) 64
Enter number of training epochs (m) 100
Final outputs after training:
100
010
001
Enter signal to noise ratio (SN) - zero to exit 10
Classification of signals embedded in noise
100
010
001

[Plots: Signals and Noisy Signals]
Slide 21

>> nnet11i
Enter frequency separation in pct. (del) 5
Number of samples per cycle (xtot) 64
Enter number of training epochs (m) 500
Final outputs after training:
100
010
001
Enter signal to noise ratio (SN) - zero to exit 10
Classification of signals embedded in noise
0?0
010
001

[Plots: Signals and Noisy Signals]
Slide 22

>> nnet11i
Enter frequency separation in pct. (del) 1
Number of samples per cycle (xtot) 1000
Enter number of training epochs (m) 10000
Final outputs after training:
100
010
001
Enter signal to noise ratio (SN) - zero to exit 100
Classification of signals embedded in noise
100
0?0
001

[Plots: Signals and Noisy Signals]
Slide 23

Unsupervised Learning
Slide 24

Kohonen Learning Steps
1. Initialize the weights (e.g. to random values). Set the neighborhood radius R and the learning rate α.
2. Repeat the steps below until convergence or a maximum number of epochs is reached. For each input pattern X = [x1 x2 x3 ...]:
   - Compute a "distance" D(j) = (w1j - x1)^2 + (w2j - x2)^2 + (w3j - x3)^2 + ... for each cluster (i.e. all j), and find jmin, the value of j corresponding to the minimum D(j).
   - For each j "in the neighborhood of" jmin, set w_ij(new) = w_ij(old) + α[x_i - w_ij(old)] for all i.
   - Decrease α (linearly or geometrically) and reduce R (at a specified rate) if R > 0.
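These steps can be sketched as follows - a Python translation (the course code is MATLAB), with a 1-D linear neighborhood and assumed values for α, R, and the decay schedules:

```python
import numpy as np

def kohonen(X, n_units, alpha=0.5, R=1, epochs=100, seed=0):
    """Kohonen learning per the slides (1-D map; schedule values assumed)."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-0.1, 0.1, size=(X.shape[1], n_units))  # random initial weights
    for epoch in range(epochs):
        for x in X:
            # D(j) = (w1j - x1)^2 + (w2j - x2)^2 + ... for every unit j
            D = ((W - x[:, None]) ** 2).sum(axis=0)
            jmin = int(np.argmin(D))
            for j in range(n_units):
                if abs(j - jmin) <= R:          # j "in the neighborhood of" jmin
                    W[:, j] += alpha * (x - W[:, j])
        alpha *= 0.95                            # decrease alpha geometrically
        if R > 0 and epoch % 20 == 19:
            R -= 1                               # reduce R at a specified rate
    return W

# Example (assumed): four 2-D patterns at the corners of a square.
X = np.array([[1, 1], [1, -1], [-1, -1], [-1, 1]], dtype=float)
W = kohonen(X, n_units=4)
```

Because each update is a convex step toward the input, every weight stays within the range spanned by the inputs and the initial values.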
Slide 25

NNET19.M
Slide 28

The Traveling Salesman Problem

Two input neurons - one each for the x and y coordinates of the cities.
Distance function: D(j) = (w1j - x1)^2 + (w2j - x2)^2
Example - 4 cities on the vertices of a square.
Slide 29

The coordinates of the cities: the vertices (1, 1), (1, -1), (-1, -1), (-1, 1).

Weight matrices from three different runs:

W = [  1   1  -1  -1
       1  -1  -1   1 ]

W = [  1  -1  -1   1
       1   1  -1  -1 ]

W = [ -1   1   1  -1
      -1  -1   1   1 ]

In every run each column (w1j, w2j) lands on one city, and consecutive columns are adjacent corners - each run finds a valid tour of the square, differing only in starting corner and direction.
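A Python sketch of how such runs could be produced - a Kohonen net whose units form a ring (circular neighborhood), so the unit order read from W defines a tour. The learning-rate schedule, epoch counts, and seed are assumptions; the course's actual code is MATLAB.

```python
import numpy as np

# Kohonen ring for the 4-city square example (schedule values assumed).
cities = np.array([[1, 1], [1, -1], [-1, -1], [-1, 1]], dtype=float)
rng = np.random.default_rng(1)
W = rng.uniform(-0.1, 0.1, size=(2, 4))          # column j holds (w1j, w2j)
alpha, R = 0.5, 1
for epoch in range(200):
    for x in cities[rng.permutation(4)]:
        D = ((W - x[:, None]) ** 2).sum(axis=0)  # D(j) = (w1j-x1)^2 + (w2j-x2)^2
        jmin = int(np.argmin(D))
        for j in range(4):
            # circular ("ring") distance, so neighbors wrap around the tour
            if min((j - jmin) % 4, (jmin - j) % 4) <= R:
                W[:, j] += alpha * (x - W[:, j])
    alpha *= 0.97                                # decrease alpha geometrically
    if epoch == 50:
        R = 0                                    # shrink the neighborhood
# Each column of W is the learned position of one ring unit; reading the
# columns in order gives the proposed tour.
```

Different random seeds change the starting corner and direction of the tour, which matches the three different weight matrices shown above.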
Slide 30

100 "Randomly" Located Cities
0.1 < α < 0.25, 100 epochs, initial R = 2