COMPARISON OF IMAGE ANALYSIS FOR THAI HANDWRITTEN CHARACTER RECOGNITION Olarik Surinta, Chatklaw Jareanpon Department of Management Information System


COMPARISON OF IMAGE ANALYSIS FOR THAI HANDWRITTEN CHARACTER RECOGNITION Olarik Surinta, Chatklaw Jareanpon Department of Management Information System Faculty of Informatics Mahasarakham University

Home Introduction Data Pre-Processing Method Training Method Conclusion Experimental Result ICIIP 2006 INTRODUCTION This article presents a comparison of image analysis for Thai handwritten character recognition. Our methodology consists of three steps: –Data Pre-Processing Method –Training Method –Pattern Recognition Method

INTRODUCTION Data Preprocessing Method –The data preprocessing method is based on feature extraction: it finds the edge of the character image, described by a group of Fourier Descriptors (FD).

INTRODUCTION Training Method –The training method is based on the Back-Propagation Neural Network and the Robust C-Prototype technique, which process Thai handwritten character images to identify the centroid of each character, resulting in 44 prototypes.

INTRODUCTION There are 44 Thai characters: ก ข ฃ ค ฅ ฆ ง จ ฉ ช ซ ฌ ญ ฎ ฏ ฐ ฑ ฒ ณ ด ต ถ ท ธ น บ ป ผ ฝ พ ฟ ภ ม ย ร ล ว ศ ษ ส ห ฬ อ ฮ

INTRODUCTION Pattern Recognition Method –The pattern recognition method uses the Back-Propagation Neural Network and the Robust C-Prototype technique to identify an unknown image (a Thai handwritten character).

DATA PREPROCESSING METHOD Character-images are images of Thai handwritten characters, captured by scanning and stored as digital data. Each character is one bitmap file in grayscale with 256 levels. Figure 1 A prototype character-image.

Binarization Binarization converts a gray-level image to a black-and-white image and extracts the object component from the background; the scheme checks every pixel of the image. Figure 2 An example of the binarization scheme.
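The per-pixel check described above can be sketched as follows. This is a minimal illustration, not the paper's implementation; the threshold value 128 is an assumption, since the slides do not state how the threshold is chosen.

```python
# Minimal binarization sketch: each pixel of a 256-level grayscale image
# is compared against a threshold; object (dark stroke) pixels become 1,
# background pixels become 0. Threshold 128 is an assumed value.
def binarize(gray_image, threshold=128):
    """gray_image: 2-D list of intensities in 0..255."""
    return [[1 if pixel < threshold else 0 for pixel in row]
            for row in gray_image]

# A dark vertical stroke (low intensity) on a light background:
image = [[250, 40, 245],
         [245, 35, 250],
         [250, 30, 240]]
binary = binarize(image)
# the middle column (the stroke) maps to 1, the background to 0
```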

Binarization (A) (B) (C) Figure 3 The diagram of extracting the object from the background component in the image.

Edge Detection Edge detection is an important image processing phase. This paper uses the chain code technique to detect the image's edge. The direction is classified into 8 categories: Figure 4 Chain code with 8 directions.
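The 8-direction chain code can be sketched as below. The direction numbering (0 = east, proceeding counter-clockwise) is one common convention; the slides do not specify which numbering the paper uses, so treat it as an assumption.

```python
# 8-direction chain code sketch: the move from each boundary pixel to the
# next is encoded as a digit 0..7. Numbering convention (assumed):
# 0 = east, 2 = north, 4 = west, 6 = south, odd codes are diagonals.
DIRECTIONS = {        # code: (dx, dy), with y growing downward
    0: (1, 0),  1: (1, -1),  2: (0, -1),  3: (-1, -1),
    4: (-1, 0), 5: (-1, 1),  6: (0, 1),   7: (1, 1),
}

def chain_code(boundary):
    """Encode an ordered list of boundary pixels (x, y) as chain-code digits."""
    inverse = {v: k for k, v in DIRECTIONS.items()}
    return [inverse[(x1 - x0, y1 - y0)]
            for (x0, y0), (x1, y1) in zip(boundary, boundary[1:])]

# Tracing a small square clockwise in image coordinates:
square = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]
print(chain_code(square))  # [0, 6, 4, 2]
```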

Edge Detection Once the edge of the image has been discovered, as shown in figure 3, the process needs to find the character line. Each coordinate is represented by a complex number, s(k) = x(k) + j·y(k). Figure 5 Coordinates represented in the character image.

Fourier Descriptors Fourier features describe the edge of the object from its boundary coordinates s(k), k = 0, 1, …, N-1, where N is the number of boundary points. Each point is represented as a complex number. Therefore, the DFT can be derived as below:

Fourier Descriptors From the above formula, the coefficient vector is calculated automatically. This vector is one-dimensional, with size 1×10 (or, in general, 1×n). Figure 6 Fourier Descriptors of an image.
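The descriptor computation described above can be sketched as follows: each boundary point becomes the complex number s(k) = x(k) + j·y(k), the DFT of the sequence is taken, and only the first few coefficients are kept (the 1×10 vector mentioned above). Any normalisation the paper applies to the coefficients is not reproduced here.

```python
import cmath

# Fourier-descriptor sketch: DFT of the complex boundary sequence,
# truncated to the first n_coeffs coefficients.
def fourier_descriptors(boundary, n_coeffs=10):
    s = [complex(x, y) for x, y in boundary]      # s(k) = x(k) + j*y(k)
    N = len(s)
    a = [sum(s[k] * cmath.exp(-2j * cmath.pi * u * k / N)
             for k in range(N)) / N               # DFT coefficient a(u)
         for u in range(N)]
    return a[:n_coeffs]

# Descriptors of a small square boundary:
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
desc = fourier_descriptors(square, n_coeffs=4)
# a(0) is the centroid of the boundary points: (0.5 + 0.5j)
```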

TRAINING METHOD The training method is based on –the Back-Propagation Neural Network and –the Robust C-Prototype technique

Robust C-Prototype RCP is applied in the grouping phase to estimate the C prototypes automatically, using a loss function on the squared distance to reduce the effect of noise. The definition can be expressed as:

Robust C-Prototype The diagram of solving the problem by RCP is shown in figure 7. Figure 7 RCP algorithm.
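To illustrate the idea of a robust prototype update, a hedged sketch follows. The paper's actual loss function is not given in these slides, so the robust weight w(d²) = 1/(1 + d²) used here is an assumed stand-in; the point is only that distant (noisy) samples pull the prototype far less than in plain c-means.

```python
# Hedged sketch of one robust prototype-update step (1-D features).
# The weight function w(d^2) = 1/(1 + d^2) is an ASSUMPTION, not the
# paper's loss; it down-weights points far from the current prototype.
def robust_update(points, prototype):
    """Return the robustly weighted centroid of the points."""
    num = den = 0.0
    for x in points:
        d2 = (x - prototype) ** 2      # squared distance to prototype
        w = 1.0 / (1.0 + d2)           # robust weight (assumed form)
        num += w * x
        den += w
    return num / den

# An outlier at 100 barely moves the prototype away from the cluster near 1:
points = [0.9, 1.0, 1.1, 100.0]
new_proto = robust_update(points, prototype=1.0)
```

With a plain (unweighted) mean the outlier would drag the estimate to about 25.75; the robust update keeps it close to 1.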

Back-Propagation Neural Network Back-propagation neural networks use a learning algorithm in which the error moves from the output layer back to the input layer. Figure 8 The Neural Network Layer.

Back-Propagation Neural Network A Back-Propagation neural network consists of several neuron layers, with each neuron of one layer connected to each neuron of the next layer. In the training phase, the correct class for each record is known (this is termed supervised training), so the output nodes can be assigned "correct" values: "1" for the node corresponding to the correct class, and "0" for the others.

Back-Propagation Neural Network The network processes the records in the training data one at a time, using the weights and functions in the hidden layers, and then compares the resulting outputs against the desired outputs. The error terms are then used to adjust the weights in the hidden layers so that, hopefully, the output values will be closer to the "correct" values the next time around.

Back-Propagation Neural Network Errors are then propagated back through the system, causing it to adjust the weights before the next record is processed. During the training of a network, the same set of data is processed many times as the connection weights are continually refined.
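The forward-pass / backward-pass cycle described above can be shown in miniature. A single sigmoid neuron stands in for the full network (an illustrative simplification, not the paper's architecture); one gradient step moves the output closer to the target.

```python
import math

# Toy backpropagation step: forward pass, compare output with target,
# compute the error gradient, adjust the weights, and observe that the
# next output is closer to the target. One sigmoid neuron for brevity.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_step(weights, inputs, target, lr=0.5):
    # forward pass
    out = sigmoid(sum(w * x for w, x in zip(weights, inputs)))
    # backward pass: gradient of the squared error w.r.t. each weight
    delta = (out - target) * out * (1.0 - out)
    new_weights = [w - lr * delta * x for w, x in zip(weights, inputs)]
    return new_weights, out

weights = [0.2, -0.4]
inputs, target = [1.0, 1.0], 1.0
weights, out1 = train_step(weights, inputs, target)
_, out2 = train_step(weights, inputs, target)
# out2 is closer to the target than out1 after one weight update
```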

EXPERIMENTAL RESULT The data in this experiment consist of 2 sets: –The first set, the "learning set", contains 4,400 characters. –The second set, the "test set", contains 440 characters. All data are Thai handwritten characters written by 100 persons.

EXPERIMENTAL RESULT Comparison between Robust C-Prototype and Back-Propagation neural network:

Topic                        Robust C-Prototype   Back-Propagation neural network
Time of learning             1.5 hours            2.45 hours
Accuracy on the "test set"   91.5%                88%

CONCLUSION This paper examines Thai handwritten character recognition, comparing the Robust C-Prototype and Back-Propagation neural network methods. In our experiments, both methods achieve accuracy above 85%.

Figure 9 The character images after the adjustment scheme.