
Essential components of the implementation are:
- Formation of the network and a weight initialization routine
- Pixel analysis of images for symbol detection
- Loading routines for training input images and the corresponding desired output characters in files
- Loading and saving routines for the trained network (weight values)
- Routines for converting characters to binary Unicode and back
- Error, output and weight calculation routines

Multi-Layered Perceptron Network

This is perhaps the most popular network architecture in use today. Each unit performs a biased weighted sum of its inputs and passes this activation level through an activation function to produce its output; the units are arranged in a layered feedforward topology. The network thus has a simple interpretation as a form of input-output model, with the weights and thresholds (biases) as the free parameters of the model. Such networks can model functions of almost arbitrary complexity, with the number of layers and the number of units in each layer determining the function's complexity. Important issues in Multilayer Perceptron (MLP) design include specifying the number of hidden layers and the number of units in each layer.
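As a minimal sketch, the computation each unit performs might look as follows (a sigmoid activation and the function name are illustrative assumptions, not taken from the project):

    import math

    def unit_output(inputs, weights, bias):
        """Biased weighted sum of the inputs, passed through a sigmoid activation."""
        weighted_sum = sum(x * w for x, w in zip(inputs, weights)) + bias
        return 1.0 / (1.0 + math.exp(-weighted_sum))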

1. Network Formation

The MLP network implemented for the purpose of this project is composed of 3 layers: one input, one hidden and one output.
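A sketch of how the network formation and weight initialization routine might look (the layer sizes and the weight bias range are illustrative assumptions, not values from the project):

    import random

    def form_network(n_input, n_hidden, n_output, weight_bias=0.5):
        """Create the two weight matrices of a 3-layer MLP, initialized
        with random values within the specified weight bias range."""
        rand = lambda: random.uniform(-weight_bias, weight_bias)
        w_input_hidden = [[rand() for _ in range(n_input)] for _ in range(n_hidden)]
        w_hidden_output = [[rand() for _ in range(n_hidden)] for _ in range(n_output)]
        return w_input_hidden, w_hidden_output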

2. Symbol Image Detection

- The process of image analysis to detect character symbols by examining pixels is the core part of input set preparation in both the training and testing phases.
- The procedure assumes that the input image is composed of characters only.

i. Determining character lines

- Enumerating the character lines in a character image is essential for delimiting the bounds within which detection can proceed. Detecting the next character in an image then does not necessarily involve scanning the whole image all over again.

The line detection steps (sketched in code after this list):
1. Start at the first x and first y pixel of the image, pixel(0,0), and set the number of lines to 0.
2. Scan up to the width of the image on the same y-component:
   - If a black pixel is detected, register y as the top of the first line.
   - If not, continue to the next pixel.
   - If no black pixel is found up to the width, increment y and reset x to scan the next horizontal line.
3. Start at the top of the line found and the first x-component, pixel(0, line_top).
4. Scan up to the width of the image on the same y-component:
   - If no black pixel is detected, register y-1 as the bottom of the first line and increment the number of lines.
   - If a black pixel is detected, increment y and reset x to scan the next horizontal line.
5. Start below the bottom of the last line found and repeat steps 1-4 to detect subsequent lines.
6. If the bottom of the image (the image height) is reached, stop.
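A minimal sketch of this scan, assuming the image is represented as a 2D list of booleans indexed as image[y][x] with True for a black pixel (the representation and function name are illustrative):

    def find_character_lines(image, width, height):
        """Return (top, bottom) pixel bounds for each character line,
        following steps 1-6 above."""
        lines = []
        y = 0
        while y < height:
            # Steps 1-2: scan rows until a black pixel marks a line top.
            if any(image[y][x] for x in range(width)):
                top = y
                # Steps 3-4: advance until a row with no black pixel.
                while y < height and any(image[y][x] for x in range(width)):
                    y += 1
                lines.append((top, y - 1))  # y-1 is the line bottom
            y += 1  # step 5: continue below the last line found
        return lines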

ii. Detecting individual symbols

- Detection of individual symbols involves scanning character lines for orthogonally separable images composed of black pixels.
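Under the same illustrative image representation as above, a sketch of detecting the horizontal bounds of each symbol within a line by scanning columns:

    def find_symbols_in_line(image, width, line_top, line_bottom):
        """Return (left, right) column bounds of each symbol in a line,
        where a symbol is a maximal run of columns containing black pixels."""
        def column_has_black(x):
            return any(image[y][x] for y in range(line_top, line_bottom + 1))

        symbols = []
        x = 0
        while x < width:
            if column_has_black(x):
                left = x
                while x < width and column_has_black(x):
                    x += 1
                symbols.append((left, x - 1))
            x += 1
        return symbols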

iii. Line and character boundary detection

- The detected character bound might not be the actual bound of the character in question. This issue arises from the height and bottom alignment irregularities of printed alphabetic symbols: a line top does not necessarily mean the top of every character, and a line bottom might not mean the bottom of every character either.
- Hence a confirmation of the top and bottom of each character is also performed.

An optional confirmation algorithm implemented in the project (sketched in code below) is:
1. Start at the top of the current line and the left of the character.
2. Scan up to the right of the character:
   - If a black pixel is detected, register y as the confirmed top.
   - If not, continue to the next pixel.
   - If no black pixels are found, increment y and reset x to scan the next horizontal line.
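A sketch of the top confirmation under the same assumed image representation (a symmetric scan upward from the line bottom would confirm the bottom):

    def confirm_top(image, left, right, line_top, line_bottom):
        """Scan the character's columns downward from the line top and
        return the first row containing a black pixel as the confirmed top."""
        for y in range(line_top, line_bottom + 1):
            if any(image[y][x] for x in range(left, right + 1)):
                return y
        return line_top  # fallback if no black pixel is found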

3. Symbol Image Matrix Mapping

The next step is to map the symbol image into a corresponding two-dimensional binary matrix. An important issue to consider here is deciding the size of the matrix. If all the pixels of the symbol are mapped into the matrix, one would certainly acquire all the distinguishing pixel features of the symbol and minimize overlap with other symbols.

Mapping symbol images onto a binary matrix
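A sketch of the mapping, downsampling the symbol's bounding box onto a fixed-size binary matrix by nearest-pixel sampling (the 20x20 matrix size and the image representation are illustrative assumptions):

    def map_to_matrix(image, left, top, right, bottom, size=20):
        """Map a symbol's bounding box onto a size x size binary matrix
        by sampling the nearest source pixel for each matrix cell."""
        width = right - left + 1
        height = bottom - top + 1
        matrix = [[0] * size for _ in range(size)]
        for i in range(size):
            for j in range(size):
                y = top + (i * height) // size
                x = left + (j * width) // size
                matrix[i][j] = 1 if image[y][x] else 0
        return matrix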

4. Training

The training routine implements the following basic algorithm (a condensed sketch follows the list):
1. Form the network according to the specified topology parameters.
2. Initialize the weights with random values within the specified weight bias value.
3. Load the trainer set files (both input images and desired output text).
4. Analyze the input image and map all detected symbols into linear arrays.
5. Read the desired output text from file and convert each character to a binary Unicode value, stored separately.
6. For each character:
   - Calculate the output of the feed-forward network.
   - Compare it with the desired output corresponding to the symbol and compute the error.
   - Back-propagate the error across each link to adjust the weights.
7. Move to the next character and repeat step 6 until all characters are visited.
8. Compute the average error over all characters.
9. Repeat steps 6 through 8 for the specified number of epochs:
   - Is the error threshold reached? If so, abort the iteration.
   - If not, continue iterating.
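A condensed, self-contained sketch of steps 1-9 for a 3-layer network. The learning rate, epoch count, error threshold and bias handling are illustrative assumptions; samples are (linearized symbol matrix, binary Unicode target) pairs as prepared by the loading routines:

    import math
    import random

    def sigmoid(v):
        return 1.0 / (1.0 + math.exp(-v))

    def train(samples, n_input, n_hidden, n_output,
              weight_bias=0.5, rate=0.1, epochs=300, error_threshold=0.01):
        """Sketch of the training loop. Biases are folded in as an extra
        weight on a constant +1 input for brevity."""
        rand = lambda: random.uniform(-weight_bias, weight_bias)
        # Steps 1-2: form the network and initialize the weights.
        w_ih = [[rand() for _ in range(n_input + 1)] for _ in range(n_hidden)]
        w_ho = [[rand() for _ in range(n_hidden + 1)] for _ in range(n_output)]

        for epoch in range(epochs):  # step 9
            total_error = 0.0
            for x, target in samples:  # steps 6-7
                # Feed-forward pass.
                xb = list(x) + [1.0]
                hidden = [sigmoid(sum(w * v for w, v in zip(row, xb))) for row in w_ih]
                hb = hidden + [1.0]
                output = [sigmoid(sum(w * v for w, v in zip(row, hb))) for row in w_ho]
                # Error terms (deltas) for the output layer.
                delta_o = [(t - o) * o * (1.0 - o) for t, o in zip(target, output)]
                # Back-propagate the deltas to the hidden layer.
                delta_h = [h * (1.0 - h) *
                           sum(d * w_ho[k][j] for k, d in enumerate(delta_o))
                           for j, h in enumerate(hidden)]
                # Weight updates across each link.
                for k in range(n_output):
                    for j in range(n_hidden + 1):
                        w_ho[k][j] += rate * delta_o[k] * hb[j]
                for j in range(n_hidden):
                    for i in range(n_input + 1):
                        w_ih[j][i] += rate * delta_h[j] * xb[i]
                total_error += sum((t - o) ** 2 for t, o in zip(target, output))
            # Step 8: average error; step 9: threshold check.
            if total_error / len(samples) < error_threshold:
                break
        return w_ih, w_ho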

5. Testing

- The testing phase of the implementation is simple and straightforward. Since the program is coded in modular parts, the same routines that were used to load, analyze and compute the network parameters of input vectors in the training phase can be reused in the testing phase as well.
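A sketch of recognition at test time, reusing the feed-forward computation and converting the network's output vector back to a character via the binary Unicode conversion listed among the essential components (the output-bit encoding and the 0.5 threshold are illustrative assumptions):

    import math

    def recognize(x, w_ih, w_ho):
        """Feed a symbol's linearized matrix through the trained network
        and decode the output vector as a binary Unicode character."""
        sigmoid = lambda v: 1.0 / (1.0 + math.exp(-v))
        xb = list(x) + [1.0]
        hidden = [sigmoid(sum(w * v for w, v in zip(row, xb))) for row in w_ih] + [1.0]
        output = [sigmoid(sum(w * v for w, v in zip(row, hidden))) for row in w_ho]
        bits = "".join("1" if o >= 0.5 else "0" for o in output)
        return chr(int(bits, 2))  # binary Unicode value -> character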

Thank You