What is a neural network? A collection of interconnected neurons that compute and propagate signals. The components of a neural network include neurons, synapses, and activation functions. Neural networks modeled mathematically or in software are referred to as artificial neural networks (ANNs). Neural networks achieve their functionality through learning/training.
[Figure: a single perceptron. Inputs x1 … xn (plus a bias input x0) are multiplied by weights w0, w1, w2, …, wn and summed (Σ); a threshold unit then outputs 1 if y > 0, and -1 otherwise.]
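The thresholded weighted sum in the figure can be written as a few lines of code. This is a minimal sketch (the function name and example values are illustrative, not from the slides):

```python
import numpy as np

def perceptron(x, w, w0):
    """Single perceptron: weighted sum of the inputs, then a hard threshold."""
    y = w0 + np.dot(w, x)      # w0 is the bias weight (its input x0 is fixed at 1)
    return 1 if y > 0 else -1  # threshold unit from the figure

# Example with two inputs: 0.1 + 0.5*1.0 + 0.25*(-2.0) = 0.1 > 0, so output is 1
print(perceptron(np.array([1.0, -2.0]), np.array([0.5, 0.25]), 0.1))
```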
» The key to neural networks is their weights. Weights determine the importance of each signal and essentially determine the network's output. » Training cycles adjust the weights to improve the performance of the network. The performance of an ANN is critically dependent on its training. » An untrained network is basically useless. » Different training algorithms suit different network topologies, and some also lend themselves to parallelization.
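One concrete way training cycles adjust weights is the classic perceptron learning rule: nudge the weights toward the target whenever the output is wrong. A hedged sketch (function name, learning rate, and the OR-gate example are illustrative assumptions, not from the slides):

```python
import numpy as np

def train_perceptron(data, epochs=10, lr=0.1):
    """Perceptron-rule training sketch: repeat over the data, updating
    weights only on mistakes. Assumes 2 inputs and targets in {-1, 1}."""
    w, w0 = np.zeros(2), 0.0
    for _ in range(epochs):                  # training cycles adjust the weights
        for x, target in data:
            y = 1 if w0 + np.dot(w, x) > 0 else -1
            if y != target:                  # update only when the output is wrong
                w += lr * (target - y) * x
                w0 += lr * (target - y)
    return w, w0

# Learn logical OR (linearly separable, so the rule converges)
or_data = [(np.array([0., 0.]), -1), (np.array([0., 1.]), 1),
           (np.array([1., 0.]), 1), (np.array([1., 1.]), 1)]
w, w0 = train_perceptron(or_data)
```

After a few passes the learned weights classify all four OR patterns correctly, illustrating why an untrained (all-zero) network is useless but a few training cycles suffice for a simple task.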
» Network topologies describe how neurons are connected to one another. » Many artificial neural networks (ANNs) use topologies that are highly parallelizable.
[Figure: a feedforward network. Inputs x1, x2, …, x960 feed a layer of hidden units, which in turn feed the outputs.]
» Training and evaluating each node in a large network can take an incredibly long time » However, neural networks are "embarrassingly parallel" ˃ Computations for each node are generally independent of all other nodes
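The node-level independence can be shown directly: within a layer, each neuron reads only the layer's input and its own weights, so all neurons can be evaluated concurrently. A minimal sketch (the thread-pool approach and tanh activation are illustrative assumptions; a real implementation would batch this as a matrix product):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def neuron(args):
    """One neuron's output: depends only on x and this neuron's own w, b."""
    w, b, x = args
    return np.tanh(np.dot(w, x) + b)

def layer_parallel(W, b, x):
    """Evaluate every neuron of one layer concurrently: no neuron
    ever reads another neuron's output, so no synchronization is needed."""
    with ThreadPoolExecutor() as pool:
        return np.array(list(pool.map(neuron, [(W[i], b[i], x) for i in range(len(W))])))
```

The parallel result matches the usual vectorized computation `np.tanh(W @ x + b)` exactly, which is what "embarrassingly parallel" means here: the work splits into independent pieces with no communication between them.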
Typical loop structure of ANN training:
» For each training session
˃ For each training example in the session
+ For each layer (forward and backward)
– For each neuron in the layer
» For all weights of the neuron
> For all bits of the weight value
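The nesting above can be sketched as literal loops. This hypothetical helper just counts how many weight visits the structure implies (the innermost "bit" level matters only for bit-serial hardware, so it is folded into a single weight visit here):

```python
def count_weight_visits(num_sessions, num_examples, layer_sizes):
    """Walk the nested training loops and count weight visits.
    layer_sizes lists neurons per layer, e.g. [2, 3, 1]."""
    visits = 0
    for session in range(num_sessions):                        # training sessions
        for example in range(num_examples):                    # examples per session
            for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):  # each layer
                for neuron in range(n_out):                    # neurons in the layer
                    for weight in range(n_in):                 # weights of the neuron
                        visits += 1
    return visits

# A 2-3-1 network has 2*3 + 3*1 = 9 weights, each visited once per example
print(count_weight_visits(1, 1, [2, 3, 1]))
```

The product of all these loop bounds is what makes sequential training so slow, and the inner loops (layers aside) are exactly the levels that parallel hardware can unroll.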
● Multilayer feedforward networks: a layered network in which each layer of neurons outputs only to the next layer. ● Feedback networks: a single layer of neurons whose outputs loop back to the inputs. ● Self-organizing maps (SOM): form a mapping from a high-dimensional space to a lower-dimensional space (one layer of neurons). ● Sparse distributed memory (SDM): a special form of two-layer feedforward network that works as an associative memory.
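The first topology is the easiest to write down: a forward pass where each layer's output is consumed only by the next layer. A minimal sketch (function name and tanh activation are illustrative assumptions):

```python
import numpy as np

def feedforward(x, weights, biases):
    """Multilayer feedforward pass: activations flow strictly forward,
    layer by layer, with no feedback connections."""
    a = x
    for W, b in zip(weights, biases):  # one (W, b) pair per layer
        a = np.tanh(W @ a + b)         # this layer's output feeds only the next
    return a
```

A feedback network, by contrast, would re-apply a single layer to its own output until it settles, which is why the two topologies need different training and parallelization strategies.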
A SOM produces a low-dimensional map of the training samples. It uses a neighborhood function to preserve the topological properties of the input space, which makes SOMs useful for visualizing high-dimensional data.
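One SOM update step makes the neighborhood function concrete: find the best-matching unit (BMU) for an input, then pull it and its neighbors on the low-dimensional grid toward that input. A hedged sketch (the Gaussian neighborhood and the parameter values are common choices, assumed here rather than taken from the slides):

```python
import numpy as np

def som_step(codebook, grid, x, lr=0.5, sigma=1.0):
    """One SOM update: move the BMU and its map-grid neighbors toward x.
    codebook: (units, dim) weight vectors; grid: (units, k) map coordinates."""
    bmu = np.argmin(np.linalg.norm(codebook - x, axis=1))  # best-matching unit
    d = np.linalg.norm(grid - grid[bmu], axis=1)           # distance on the map grid
    h = np.exp(-d**2 / (2 * sigma**2))                     # neighborhood function
    return codebook + lr * h[:, None] * (x - codebook)
```

Because the update strength decays with distance on the *map* grid (not in input space), nearby units end up representing nearby inputs, which is exactly the topology-preserving property that makes SOMs useful for visualization.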