Combining Neural Networks and Context-Driven Search for On-Line, Printed Handwriting Recognition in the Newton. Larry S. Yaeger, Brandyn J. Webb, and Richard F. Lyon.

Presentation transcript:

Combining Neural Networks and Context-Driven Search for On-Line, Printed Handwriting Recognition in the Newton Larry S. Yaeger, Brandyn J. Webb, and Richard F. Lyon

Overview The recognition pipeline: (x, y) points and pen-lifts feed a Tentative Segmentation stage, which produces Character Segmentation Hypotheses; the Neural Net Classifier turns these into Character Class Hypotheses; a Context Search then combines those hypotheses into Words.
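The pipeline above can be sketched as three composed stages. This is a minimal illustration only; every function body here is a dummy placeholder, not the Newton implementation.

```python
# A sketch of the overview pipeline as composed stages. All bodies
# are placeholders standing in for the real segmenter, classifier,
# and context search.

def tentative_segmentation(points):
    """(x, y) points + pen-lifts -> character segmentation hypotheses."""
    # Placeholder: treat each pen-down stroke as one segment hypothesis.
    return [stroke for stroke in points]

def classify(segment):
    """Segmentation hypothesis -> character class hypotheses (scores)."""
    return {"a": 0.6, "o": 0.3}          # dummy class scores

def context_search(class_hypotheses):
    """Character class hypotheses -> a word, via dictionary context."""
    # Placeholder: just take the top-scoring class at each position.
    return "".join(max(h, key=h.get) for h in class_hypotheses)

strokes = [[(0, 0), (1, 1)], [(2, 0), (3, 1)]]
word = context_search(classify(s) for s in tentative_segmentation(strokes))
print(word)   # "aa"
```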

Classifier Backpropagation Network Multiple Inputs –20 x 9 Stroke Feature –14 x 14 Image –5 x 1 Stroke Count –1 x 1 Aspect Ratio All input branches run in parallel and are combined at the output layer.
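A forward pass through such a multi-input network might look like the following sketch. The hidden width, class count, and weight scales are assumptions chosen for illustration; only the input sizes come from the slide.

```python
import numpy as np

rng = np.random.default_rng(0)

# Input sizes from the slide; each representation gets its own branch.
IN_SIZES = {"stroke_feature": 20 * 9, "image": 14 * 14,
            "stroke_count": 5, "aspect_ratio": 1}
HIDDEN, N_CLASSES = 72, 95   # assumed hidden width and class count

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden-layer weight matrix per branch, plus a shared output layer
# that sees the concatenation of all branch activations.
W_hidden = {name: rng.normal(0, 0.1, (size, HIDDEN))
            for name, size in IN_SIZES.items()}
W_out = rng.normal(0, 0.1, (HIDDEN * len(IN_SIZES), N_CLASSES))

def forward(inputs):
    """Run every input branch in parallel, combine at the output layer."""
    hidden = [sigmoid(inputs[name] @ W_hidden[name]) for name in IN_SIZES]
    return sigmoid(np.concatenate(hidden) @ W_out)

x = {name: rng.random(size) for name, size in IN_SIZES.items()}
scores = forward(x)      # one score per character class
print(scores.shape)      # (95,)
```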

Picking the Top Choices Character recognition is inherently ambiguous. Problem: if the full error is propagated back, the network gives one high answer but no secondary answers. Solution: normalize the error before propagating it back.
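One way to realize this idea is to scale down the error at the non-target outputs so the network is not forced to drive every competing class to zero, leaving room for plausible secondary answers. The scaling factor `d` and its placement here are assumptions, a sketch of the slide's idea rather than the exact scheme used in the Newton work.

```python
import numpy as np

def normalized_error(output, target, d=0.1):
    """Scale the error at zero-target outputs by d so it does not
    swamp the error at the one-hot target (illustrative sketch)."""
    err = target - output
    # Keep the full error at the target output; attenuate the rest.
    return np.where(target == 1.0, err, d * err)

out = np.array([0.70, 0.40, 0.05])
tgt = np.array([1.0, 0.0, 0.0])
print(normalized_error(out, tgt))   # [ 0.3   -0.04  -0.005]
```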

Ambiguity Example [Slide images of handwritten samples that can plausibly be read as more than one character string.]

Removing Ambiguity Use the context: look up possible words in a dictionary, and use word spacing to help. Sentence structure could also be used, but this system does not use it.

LVQ Selection of Networks An LVQ (Learning Vector Quantization) network is used to select an appropriate BP network for a given context. Example: driving in heavy traffic on the freeway vs. driving in town.
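The selection step itself is simple: each BP network is labeled by one or more LVQ codebook vectors, and the network whose nearest prototype wins is used to classify the input. The sizes and the one-prototype-per-network setup below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed setup: 3 specialist BP networks, one LVQ prototype each.
N_NETS, DIM = 3, 8
prototypes = rng.random((N_NETS, DIM))   # one codebook vector per network

def select_network(x):
    """LVQ selection: the prototype nearest to the input wins, and the
    BP network it labels handles the input."""
    dists = np.linalg.norm(prototypes - x, axis=1)
    return int(np.argmin(dists))

x = rng.random(DIM)
winner = select_network(x)
print(winner)   # index of the BP network chosen for this context
```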

Reasons for Combining Networks Allows a separation of tasks –A single BP network must be carefully trained to adapt to different situations –It would be very difficult to add a new subtask to one BP network The hope is that this technique will scale better to more complex tasks.

Training Manual Construction –Train the BP networks for their sub-tasks separately, then train the LVQ network to select the correct network. Two-Stage Construction –Use the LVQ network to partition the training set into classes, then train a BP network for each class.

Training (cont.) Simultaneous Construction –Instead of the standard distance measure, use the output error of the BP networks as the distance measure. For each case, the LVQ network is adjusted so that the winner moves closer to the input, and each case that passes through a BP network is used to adjust that network's weights.

Project Goals Create the network described and implement all three training methods. Compare the results to those obtained from a single large BP network.