Neural Network Toolbox
Khaled A. Al-Utaibi
Outline
- Neuron Model
- Transfer Functions
- Network Architecture
- Neural Network Models
- Feed-forward Network
- Training & Simulation
- Example 1: Majority Function
- Example 2: Handwritten Digit Recognition

ICS583: Pattern Recognition, 2009-2010
Neuron Model
Neuron Model
[Diagram: inputs and weights feed a weighted sum together with a bias; the sum passes through a transfer function to produce the output]
Transfer Functions
The Neural Network Toolbox includes many transfer functions. Three of the most commonly used are:
- Hard-Limit Transfer Function (hardlim)
- Linear Transfer Function (purelin)
- Log-Sigmoid Transfer Function (logsig)
Transfer Functions: Hard-Limit Transfer Function
[Plot] a = hardlim(n): a = 0 for n < 0, a = 1 for n >= 0
Transfer Functions: Linear Transfer Function
[Plot] a = purelin(n): a = n
Transfer Functions: Log-Sigmoid Transfer Function
[Plot] a = logsig(n): a = 1 / (1 + e^(-n))
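The three transfer functions above can be evaluated directly in MATLAB; a minimal sketch plotting each over a range of inputs:

```matlab
% Evaluate the three common transfer functions over a range of inputs
n = -5:0.1:5;
a1 = hardlim(n);   % 0 for n < 0, 1 for n >= 0
a2 = purelin(n);   % identity: a = n
a3 = logsig(n);    % 1 ./ (1 + exp(-n))
plot(n, a1, n, a2, n, a3);
legend('hardlim', 'purelin', 'logsig');
```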
Network Architecture: Single Layer of Neurons
Network Architecture: Multiple Layers of Neurons
Neural Network Models
MATLAB provides several models of neural networks (general and special purpose):
- Feed-forward back-propagation network
- Elman back-propagation network
- Cascade-forward back-propagation network
- Pattern recognition network
- Fitting network
- Self-Organizing Map (SOM) network
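Each model in the list above has its own creation function in the 2009-era toolbox. A hedged sketch (the simplified `(P, T, size)` argument forms shown here may vary by toolbox version, and the data is illustrative):

```matlab
P = rand(3, 10);             % 3 inputs, 10 samples
T = rand(1, 10);             % 1 target per sample

net_ff  = newff(P, T, 5);    % feed-forward back-propagation
net_elm = newelm(P, T, 5);   % Elman back-propagation
net_cf  = newcf(P, T, 5);    % cascade-forward back-propagation
net_pr  = newpr(P, T, 5);    % pattern recognition
net_fit = newfit(P, T, 5);   % fitting
net_som = newsom(P, [4 4]);  % self-organizing map (4x4 grid)
```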
Feed-Forward Network
newff creates a feed-forward back-propagation network. Several other neural network models in MATLAB are special cases of this model (e.g. the pattern recognition and fitting networks).
Syntax:
    network_name = newff(arguments)
Feed-Forward Network: Arguments

Argument(s)            Description
P                      Input vectors
T                      Target vectors
[S1 S2 ... SN-1]       Size of the ith layer
{TF1, TF2, ..., TFN}   Transfer function of the ith layer
BTF                    Back-propagation network training function
BLF                    Back-propagation weight/bias learning function
IPF                    Input processing functions
OPF                    Output processing functions
DDF                    Data division function
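A sketch of a newff call using the argument list above; the sizes and function names here are illustrative choices, not the only options:

```matlab
P = rand(3, 20);                       % 3 inputs, 20 samples
T = rand(1, 20);                       % 1 target per sample
net = newff(P, T, 5, ...               % one hidden layer of 5 neurons
            {'tansig', 'purelin'}, ... % hidden/output transfer functions
            'trainlm', 'learngdm');    % training and learning functions
```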
Feed-Forward Network
The output layer size is determined from T. Input and output processing functions transform the inputs and outputs into a form better suited for network use; for example, they can:
- Re-encode unknown input/output values as numerical values
- Remove redundant input and output vectors
- Normalize input/output values
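The processing functions can be inspected on the network object; a sketch assuming the toolbox's usual defaults, which line up with the three transformations listed above:

```matlab
net = newff(rand(3, 10), rand(1, 10), 5);
net.inputs{1}.processFcns
% typically {'fixunknowns', 'removeconstantrows', 'mapminmax'}:
% fixunknowns re-encodes unknown (NaN) values as numbers,
% removeconstantrows drops redundant (constant) rows, and
% mapminmax normalizes values into [-1, 1]
```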
Feed-Forward Network
MATLAB provides several data division functions that split the input data into three sets (using different strategies and different percentages for each set):
- Training set
- Validation set
- Testing set
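The division strategy and percentages can be set on the network object; a sketch using random division (the 70/15/15 split is an illustrative choice):

```matlab
net = newff(rand(3, 20), rand(1, 20), 5);
net.divideFcn = 'dividerand';        % divide samples at random
net.divideParam.trainRatio = 0.70;   % 70% training set
net.divideParam.valRatio   = 0.15;   % 15% validation set
net.divideParam.testRatio  = 0.15;   % 15% testing set
```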
Training the Network
Syntax:
    [ret_vals] = train(arguments)
Arguments:

Argument(s)   Description
net           Neural network to be trained
P             Network inputs
T             Network targets
Pi            Initial input delay conditions
Ai            Initial layer delay conditions
Training the Network
Returned Values:

Value   Description
net     Trained neural network
tr      Training record (iterations & performance)
Y       Network outputs
E       Network errors
Pf      Final input delay conditions
Af      Final layer delay conditions
Simulating the Network
Syntax:
    [ret_vals] = sim(arguments)
Arguments:

Argument(s)   Description
net           Neural network to be simulated
P             Network inputs
Pi            Initial input delay conditions
Ai            Initial layer delay conditions
T             Network targets
Simulating the Network
Returned Values:

Value   Description
Y       Network outputs
Pf      Final input delay conditions
Af      Final layer delay conditions
E       Network errors
Perf    Network performance
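A sketch combining train and sim with the full argument and return lists above (the data and network size are illustrative; Pi and Ai are passed as [] since a static network has no delays):

```matlab
P = rand(2, 50);
T = sum(P, 1);                     % target: sum of the two inputs
net = newff(P, T, 4);
[net, tr] = train(net, P, T);      % tr holds the training record
[Y, Pf, Af, E, perf] = sim(net, P, [], [], T);
```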
Example 1: Majority Function

P1  P2  P3 | T
 0   0   0 | 0
 0   0   1 | 0
 0   1   0 | 0
 0   1   1 | 1
 1   0   0 | 0
 1   0   1 | 1
 1   1   0 | 1
 1   1   1 | 1

% initialize network inputs
inputs = [0 0 0 0 1 1 1 1; ...
          0 0 1 1 0 0 1 1; ...
          0 1 0 1 0 1 0 1];
% initialize network targets
targets = [0 0 0 1 0 1 1 1];
Example 1: Majority Function

% initialize network inputs
inputs = [0 0 0 0 1 1 1 1; ...
          0 0 1 1 0 0 1 1; ...
          0 1 0 1 0 1 0 1];
% initialize network targets
targets = [0 0 0 1 0 1 1 1];
% create a feed-forward network with a hidden
% layer of 3 neurons
net = newff(inputs, targets, 3, ...
            {'logsig', 'purelin'});
% train the network
net = train(net, inputs, targets);
% simulate the network
outputs = sim(net, inputs);
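A possible follow-up, not in the original slides: threshold the simulated outputs and compare them with the targets to check what the network learned.

```matlab
inputs  = [0 0 0 0 1 1 1 1; 0 0 1 1 0 0 1 1; 0 1 0 1 0 1 0 1];
targets = [0 0 0 1 0 1 1 1];
net = newff(inputs, targets, 3, {'logsig', 'purelin'});
net = train(net, inputs, targets);
% round each output to 0/1 and measure the fraction of correct rows
predicted = round(sim(net, inputs));
accuracy  = sum(predicted == targets) / numel(targets)
```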
Example 2: Handwritten Digit Recognition
We are given a set of 1000 samples of different handwritten digits (0, 1, ..., 9), where each digit is represented as a binary image of 28x28 pixels.
Example 2: Handwritten Digit Recognition
We would like to use the MATLAB Neural Network Toolbox to design a neural network that recognizes handwritten digits. The pattern recognition network (newpr) is suitable for this purpose.
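A hedged sketch of the newpr workflow for this task. The variable names, hidden-layer size, and placeholder random data are assumptions; in practice each 28x28 image would be reshaped into a 784-element column of the input matrix, with one-hot columns as targets.

```matlab
images = double(rand(784, 1000) > 0.5);       % placeholder binary images
labels = full(ind2vec(randi(10, 1, 1000)));   % placeholder one-hot targets
net = newpr(images, labels, 20);              % 20 hidden neurons
net = train(net, images, labels);
outputs = sim(net, images);
% predicted class = index of the largest output in each column
[vals, predicted] = max(outputs);
```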
Example 2: Handwritten Digit Recognition

% initialize network inputs (each row shown is one flattened image)
inputs = [0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0;
          0 0 0 0 0 0 1 1 1 0 0 1 0 1 0 0 1 1 1 0 0 0 0 0 0];
Example 2: Handwritten Digit Recognition

% initialize network targets
targets = [0 1; ...
           1 0; ...
           0 0; ...
           0 0];
Questions?