© IT Acumens.COM
OBJECTIVE

To demonstrate the use of neural networks in the field of character and pattern recognition by simulating a network that uses the backpropagation algorithm for alphanumeric recognition, and a bidirectional associative memory for pattern recognition in the application of associating names with phone numbers.
REQUIREMENTS

PLATFORM: Windows 9x/XP
LANGUAGE USED: Microsoft VC++
DEVELOPMENT TOOL: Microsoft Visual Studio
GUI DESIGN: Microsoft Foundation Classes
OTHER DESIGN TOOLS: SmartDraw
MODULES

Character Recognition (alphanumeric)
- 7 Segment Display
- Look Up Table
- Back Propagation Algorithm

Pattern Recognition
- Bi-directional Associative Memory model
SYNOPSIS

The character recognition in this project deals with the identification of alphanumeric characters created through user interaction. To highlight the importance of neural networks in this scenario, two methods of automated character recognition were developed:
- Back propagation network
- Bidirectional associative memory
Character Recognition

The basic function of this module is to implement neural-network-based alphanumeric recognition. It has three sub-modules. The first uses the pattern generated by the user through a GUI-based 7 segment display. The second uses a 5 x 7 grid drawn in the GUI to capture the user's input; this pattern is taken as input and the corresponding character is recognized. The third sub-module is the neural network implementation itself: it uses a backpropagation network for both alphabet and numeric recognition. The GUI created for this module allows the user to train the network with specific input patterns.
7 Segment Display

This method cannot display letters such as Q, W, R, Y, K, Z, X, V, N, M.
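A 7-segment display drives only seven on/off segments per character, which is why the letters listed above cannot be shown. The snippet below is a minimal plain C++ sketch of such an encoding table; the segment bit assignments and the set of characters included are illustrative assumptions, not taken from the project's MFC code.

```cpp
#include <cstdint>
#include <iostream>
#include <map>

// Hypothetical 7-segment encoding: bits 0..6 correspond to segments a..g.
// Only characters that can be rendered on seven segments get an entry;
// letters such as Q, W, K, M, X have none, mirroring the limitation above.
static const std::map<char, std::uint8_t> kSegments = {
    {'0', 0b0111111}, {'1', 0b0000110}, {'2', 0b1011011},
    {'3', 0b1001111}, {'4', 0b1100110}, {'5', 0b1101101},
    {'A', 0b1110111}, {'C', 0b0111001}, {'E', 0b1111001},
    {'F', 0b1110001}, {'H', 0b1110110}, {'L', 0b0111000},
};

bool displayable(char c) { return kSegments.count(c) != 0; }

int main() {
    for (char c : {'A', 'E', '3', 'Q', 'W', 'M'})
        std::cout << c << (displayable(c) ? " can" : " cannot")
                  << " be shown on a 7-segment display\n";
}
```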
Look Up Table Method

This method is simpler and faster, but the user has to create the pattern.
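The look-up table approach compares the 5 x 7 grid drawn by the user against stored reference patterns and returns the matching character. Below is a minimal plain C++ sketch of the idea; the two stored templates are made-up placeholders, not the project's actual pattern data.

```cpp
#include <bitset>
#include <iostream>
#include <map>

// Each character is stored as a 5x7 grid flattened to 35 bits (row by row).
// The templates for 'I' and 'T' below are placeholders for illustration only.
static const std::map<char, std::bitset<35>> kTable = {
    {'I', std::bitset<35>("11111001000010000100001000010011111")},
    {'T', std::bitset<35>("11111001000010000100001000010000100")},
};

// Exact-match lookup: returns '?' when the drawn pattern is not in the table.
char lookup(const std::bitset<35>& drawn) {
    for (const auto& entry : kTable)
        if (entry.second == drawn) return entry.first;
    return '?';
}

int main() {
    std::bitset<35> user("11111001000010000100001000010000100");
    std::cout << "Recognised as: " << lookup(user) << "\n";  // prints 'T'
}
```

The exact-match table is what makes the method fast, and also why the user has to reproduce the stored pattern exactly.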
Back Propagation Network

For the recognition of alphabets, a network with 35 nodes in the input layer, 50 nodes in the hidden layer and 26 nodes in the output layer is used. For the recognition of numbers, the same network with 10 output nodes is used. The output of the backpropagation network is interpreted as a classification decision.
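As a rough illustration of the 35-50-26 topology described above, the following plain C++ sketch builds the network with random initial weights and runs a single forward pass with sigmoid activations. It is only a sketch under those assumptions, not the project's MFC implementation; the "on" grid cells and the weight range are arbitrary.

```cpp
#include <cmath>
#include <cstdlib>
#include <iostream>
#include <vector>

const int kInputs = 35, kHidden = 50, kOutputs = 26;  // 10 outputs for digits

double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// One fully connected layer: out[j] = sigmoid(bias[j] + sum_i w[j][i] * in[i])
std::vector<double> layer(const std::vector<double>& in,
                          const std::vector<std::vector<double>>& w,
                          const std::vector<double>& bias) {
    std::vector<double> out(w.size());
    for (size_t j = 0; j < w.size(); ++j) {
        double sum = bias[j];
        for (size_t i = 0; i < in.size(); ++i) sum += w[j][i] * in[i];
        out[j] = sigmoid(sum);
    }
    return out;
}

int main() {
    // Random initial weights in [-0.5, 0.5]; training would adjust these.
    auto rnd = [] { return (std::rand() / (double)RAND_MAX) - 0.5; };
    std::vector<std::vector<double>> w1(kHidden, std::vector<double>(kInputs));
    std::vector<std::vector<double>> w2(kOutputs, std::vector<double>(kHidden));
    std::vector<double> b1(kHidden), b2(kOutputs);
    for (auto& row : w1) for (auto& v : row) v = rnd();
    for (auto& row : w2) for (auto& v : row) v = rnd();

    std::vector<double> grid(kInputs, 0.0);  // the 5x7 grid, flattened to 35 inputs
    grid[2] = grid[7] = grid[12] = 1.0;      // a few "on" cells for illustration

    auto hidden = layer(grid, w1, b1);
    auto output = layer(hidden, w2, b2);

    // The classification decision is the output node with the largest activation.
    int best = 0;
    for (int k = 1; k < kOutputs; ++k) if (output[k] > output[best]) best = k;
    std::cout << "Recognised letter: " << char('A' + best) << "\n";
}
```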
BPN for Numeric Recognition
Training the network

A back-propagation network typically starts with a random set of weights. The network adjusts its weights each time it sees an input-output pair. Each pair requires two stages:
- a forward pass
- a backward pass

The forward pass involves presenting a sample input to the network and letting activations flow until they reach the output layer.
Training the network

In the backward pass, the network's actual output (from the forward pass) is compared with the target output, and error estimates are computed for the output units. The weights connected to the output units are adjusted in order to reduce these errors. The error estimates of the output units are then used to derive error estimates for the hidden layers. Finally, errors are propagated back to the connections stemming from the input units.
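Below is a minimal sketch of that backward pass for a single input-output pair, assuming sigmoid units, a mean-squared error, and a toy 3-4-2 network so the code stays short. None of it is taken from the original project; the activation values in main() stand in for the result of an earlier forward pass.

```cpp
#include <vector>

// One gradient-descent step for a tiny 3-4-2 network (sizes shrunk for brevity).
// in/hidden/out hold the activations produced by the forward pass.
void backwardPass(const std::vector<double>& in,
                  const std::vector<double>& hidden,
                  const std::vector<double>& out,
                  const std::vector<double>& target,
                  std::vector<std::vector<double>>& wHiddenOut,  // [out][hidden]
                  std::vector<std::vector<double>>& wInHidden,   // [hidden][in]
                  double eta) {
    // 1. Error estimates (deltas) for the output units.
    std::vector<double> deltaOut(out.size());
    for (size_t k = 0; k < out.size(); ++k)
        deltaOut[k] = (target[k] - out[k]) * out[k] * (1.0 - out[k]);

    // 2. Error estimates for the hidden units, derived from the output deltas.
    std::vector<double> deltaHidden(hidden.size());
    for (size_t j = 0; j < hidden.size(); ++j) {
        double sum = 0.0;
        for (size_t k = 0; k < out.size(); ++k) sum += deltaOut[k] * wHiddenOut[k][j];
        deltaHidden[j] = sum * hidden[j] * (1.0 - hidden[j]);
    }

    // 3. Adjust the weights to reduce the error, layer by layer back to the inputs.
    for (size_t k = 0; k < out.size(); ++k)
        for (size_t j = 0; j < hidden.size(); ++j)
            wHiddenOut[k][j] += eta * deltaOut[k] * hidden[j];
    for (size_t j = 0; j < hidden.size(); ++j)
        for (size_t i = 0; i < in.size(); ++i)
            wInHidden[j][i] += eta * deltaHidden[j] * in[i];
}

int main() {
    // Pretend forward-pass activations for the toy 3-4-2 network.
    std::vector<double> in = {1.0, 0.0, 1.0};
    std::vector<double> hidden = {0.6, 0.4, 0.7, 0.5};
    std::vector<double> out = {0.8, 0.3};
    std::vector<double> target = {1.0, 0.0};
    std::vector<std::vector<double>> wHO(2, std::vector<double>(4, 0.1));
    std::vector<std::vector<double>> wIH(4, std::vector<double>(3, 0.1));
    backwardPass(in, hidden, out, target, wHO, wIH, 0.5);
}
```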
Alphabet Recognition - BPN
Numeric Recognition - BPN
Pattern Recognition

The function of this module is to simulate a bi-directional associative memory (BAM) model for the application of associating names with phone numbers. The module defines the names and phone numbers. When the user enters a name with the wrong spelling, the network takes the given pattern, finds the name most closely associated with it, and displays the result. This module demonstrates the use of the BAM model in the field of pattern recognition.
Implementing the BAM network

The BAM network consists of two layers: the input layer is the X layer and the output layer is the Y layer. The X layer represents the names and the Y layer represents the phone numbers.
- X layer: 30 units, with 6 bits per character in the name.
- Y layer: 42 units, with 6 bits per character in the phone number.
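The 6-bits-per-character figure suggests an encoding along the lines of the sketch below: each character becomes 6 bits, each bit becomes a bipolar value (+1 or -1), so a 5-character name fills the 30 X-layer units and a 7-character phone number fills the 42 Y-layer units. The character-to-code mapping and the example name and number are assumptions made purely for illustration.

```cpp
#include <iostream>
#include <string>
#include <vector>

// Encode a string as a bipolar vector with 6 bits per character.
// The mapping (A=1..Z=26, digits following) is an assumed scheme, not
// necessarily the one used in the original project.
std::vector<int> toBipolar(const std::string& text, size_t length) {
    std::vector<int> v;
    for (size_t i = 0; i < length; ++i) {
        int code = 0;                              // 0 acts as padding
        if (i < text.size()) {
            char c = text[i];
            if (c >= 'A' && c <= 'Z') code = c - 'A' + 1;
            else if (c >= '0' && c <= '9') code = c - '0' + 27;
        }
        for (int bit = 5; bit >= 0; --bit)           // 6 bits per character
            v.push_back((code >> bit) & 1 ? 1 : -1); // 1 -> +1, 0 -> -1
    }
    return v;
}

int main() {
    std::vector<int> x = toBipolar("HELEN", 5);   // X layer: 5 chars * 6 bits = 30 units
    std::vector<int> y = toBipolar("4261717", 7); // Y layer: 7 chars * 6 bits = 42 units
    std::cout << "X units: " << x.size() << ", Y units: " << y.size() << "\n";
}
```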
Process of simulating the network

Generating the network: allocating sufficient memory for the network layers X and Y.
Initializing the application: finding the bipolar values for both the names and the phone numbers.
Calculating the weights: weight = input * output for the particular unit (see the sketch below).
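The weight rule above (weight = input * output, accumulated over every stored name/number pair) is the standard BAM outer-product rule. The small sketch below assumes bipolar vectors produced by an encoding like the one shown earlier; the stored pairs in main() are tiny made-up examples.

```cpp
#include <vector>

// BAM weight matrix: W[i][j] = sum over stored pairs of x[i] * y[j]
// (the "weight = input * output" rule, accumulated per pair).
std::vector<std::vector<int>> buildWeights(
        const std::vector<std::vector<int>>& xs,   // bipolar name vectors
        const std::vector<std::vector<int>>& ys) { // bipolar phone-number vectors
    size_t nx = xs[0].size(), ny = ys[0].size();
    std::vector<std::vector<int>> w(nx, std::vector<int>(ny, 0));
    for (size_t p = 0; p < xs.size(); ++p)
        for (size_t i = 0; i < nx; ++i)
            for (size_t j = 0; j < ny; ++j)
                w[i][j] += xs[p][i] * ys[p][j];
    return w;
}

int main() {
    // Two stored pairs with tiny 4-unit / 3-unit vectors for illustration.
    std::vector<std::vector<int>> xs = {{1, -1, 1, -1}, {-1, -1, 1, 1}};
    std::vector<std::vector<int>> ys = {{1, 1, -1}, {-1, 1, 1}};
    auto w = buildWeights(xs, ys);   // 4x3 weight matrix
    (void)w;
}
```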
Process of simulating the network

Propagating signals between layers: this involves adjusting the output of the Y layer so that the correct association with the elements of the X layer is found.
Output: the output of the network is the association of names with phone numbers.
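Propagating signals between the layers amounts to repeatedly multiplying the X layer by the weight matrix to update Y, multiplying Y by the transpose to update X, and thresholding to bipolar values until neither layer changes, so a slightly misspelled name can settle onto its stored association. The sketch below uses toy sizes; the rule of keeping a unit's previous value when its net input is exactly zero is a common BAM convention assumed here rather than stated on the slide.

```cpp
#include <iostream>
#include <vector>

// Bipolar threshold: +1 above zero, -1 below, keep the previous value at zero.
int sgn(int net, int previous) { return net > 0 ? 1 : (net < 0 ? -1 : previous); }

// Bidirectional recall: X -> Y through W, then Y -> X through W transposed,
// repeated until neither layer changes (a stable association is reached).
void recall(const std::vector<std::vector<int>>& w,
            std::vector<int>& x, std::vector<int>& y) {
    bool changed = true;
    while (changed) {
        changed = false;
        for (size_t j = 0; j < y.size(); ++j) {            // forward pass X -> Y
            int net = 0;
            for (size_t i = 0; i < x.size(); ++i) net += x[i] * w[i][j];
            int v = sgn(net, y[j]);
            if (v != y[j]) { y[j] = v; changed = true; }
        }
        for (size_t i = 0; i < x.size(); ++i) {             // backward pass Y -> X
            int net = 0;
            for (size_t j = 0; j < y.size(); ++j) net += y[j] * w[i][j];
            int v = sgn(net, x[i]);
            if (v != x[i]) { x[i] = v; changed = true; }
        }
    }
}

int main() {
    // Weights from one stored pair x=(1,-1,1), y=(1,-1): W[i][j] = x[i]*y[j].
    std::vector<std::vector<int>> w = {{1, -1}, {-1, 1}, {1, -1}};
    std::vector<int> x = {1, 1, 1};   // "misspelled" probe: one unit flipped
    std::vector<int> y = {1, 1};      // arbitrary starting state for Y
    recall(w, x, y);                  // x settles to {1,-1,1}, y to {1,-1}
    std::cout << "Recovered pattern:";
    for (int v : x) std::cout << ' ' << v;
    std::cout << '\n';
}
```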
SCREEN SHOT - Bidirectional Associative Memory