EE459 Neural Networks: Examples of Using Neural Networks
Kasin Prakobwaitayakit, Department of Electrical Engineering, Chiangmai University

APPLICATIONS Two examples of real-life applications of neural networks for pattern classification: –RBF networks for face recognition –Feedforward networks for handwritten digit recognition

FACE RECOGNITION The problem: –Face recognition of persons from a known group in an indoor environment. The approach: –Learn face classes over a wide range of poses using an RBF network.

Dataset Sussex database (University of Sussex) –100 images of 10 people (8-bit grayscale, resolution 384 × 287) –for each individual, 10 images of the head in different poses, from face-on to profile –designed to assess the performance of face recognition techniques when pose variations occur

Datasets (Sussex) All ten images for classes 0–3 from the Sussex database, nose-centred and subsampled to 25×25 before preprocessing

Approach: Face-unit RBF A face-unit RBF neural network is trained to recognize a single person. Training uses images of the person to be recognized as positive evidence, together with selected confusable images of other people as negative evidence.

Network Architecture Input layer contains 25×25 = 625 inputs, which represent the (normalized) pixel intensities of an image. Hidden layer contains p + a neurons: –p hidden pro neurons (receptors for positive evidence) –a hidden anti neurons (receptors for negative evidence) Output layer contains two neurons: –One for the particular person. –One for all the others. The output is discarded if the absolute difference of the two output neurons is smaller than a parameter R (see the sketch below).
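A minimal NumPy sketch of this forward pass and the rejection rule, assuming the Gaussian hidden neuron model described on the following slides; the function and variable names (rbf_face_unit, centers, spreads, W) are illustrative, not taken from the original system.

```python
import numpy as np

def rbf_face_unit(x, centers, spreads, W, R=0.3):
    """Forward pass of one face-unit RBF network (illustrative sketch).

    x        -- flattened 25x25 input image, shape (625,)
    centers  -- RBF centers of the p pro and a anti neurons, shape (p+a, 625)
    spreads  -- Gaussian spreads, shape (p+a,)
    W        -- hidden-to-output weights, shape (p+a, 2)
    R        -- rejection threshold on the output difference
    Returns 0 ("this person"), 1 ("someone else"), or None (rejected).
    """
    # Gaussian RBF activations of the hidden (pro + anti) neurons
    dists = np.linalg.norm(centers - x, axis=1)
    h = np.exp(-dists**2 / (2.0 * spreads**2))
    # Linear output layer: two scores
    y = h @ W
    if abs(y[0] - y[1]) < R:
        return None  # output discarded: evidence too ambiguous
    return int(np.argmax(y))
```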

[Figure: RBF architecture for one face unit. Input units feed non-linear RBF hidden units (trained unsupervised), which feed linear output units (trained supervised).]

Hidden Layer Hidden nodes can be: –Pro neurons: evidence for that person. –Anti neurons: negative evidence. The number of pro neurons equals the number of positive examples in the training set. For each pro neuron there are either one or two anti neurons. Hidden neuron model: Gaussian RBF function.

The Parameters Centers: –of a pro neuron: the corresponding positive example –of an anti neuron: the negative example which is most similar to the corresponding pro neuron, with respect to the Euclidean distance. Spread: average distance of the center vector from all other centers. If c_h is the center of hidden node h, σ_h its spread, and H the total number of hidden nodes, then: σ_h = (1/(H−1)) Σ_{h′≠h} ‖c_h − c_{h′}‖ Weights: determined using the pseudo-inverse method. An RBF network with 6 pro neurons, 12 anti neurons, and R = 0.3 discarded 23 percent of the images of the test set and correctly classified 96 percent of the non-discarded images. A sketch of these computations follows.
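A hedged sketch of how these parameters could be computed with NumPy: the spread of each hidden node is the average distance of its center from all other centers, and the linear output weights are solved in one step with the pseudo-inverse. Variable names are illustrative assumptions.

```python
import numpy as np

def fit_rbf_parameters(centers, X_train, T_train):
    """Fit spreads and output weights of a face-unit RBF network (sketch).

    centers -- stacked pro/anti centers, shape (H, 625)
    X_train -- training images, shape (N, 625)
    T_train -- two-column targets (person vs. others), shape (N, 2)
    """
    H = len(centers)
    # Spread of node h: average distance of its center from all other centers
    pairwise = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=2)
    spreads = pairwise.sum(axis=1) / (H - 1)  # self-distance is 0
    # Hidden activations for all training images
    dists = np.linalg.norm(X_train[:, None, :] - centers[None, :, :], axis=2)
    Phi = np.exp(-dists**2 / (2.0 * spreads**2))  # shape (N, H)
    # Pseudo-inverse method for the linear output weights: W = Phi^+ T
    W = np.linalg.pinv(Phi) @ T_train  # shape (H, 2)
    return spreads, W
```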

HANDWRITTEN DIGIT RECOGNITION

Dataset and NN architecture 9,298 examples used: 7,291 for training and 2,007 for testing. Both training and test sets contain ambiguous and unclassifiable examples. 256 input neurons (16×16 pixel images). 10 output neurons (each represents a numeral). A feedforward NN with 3 hidden layers is used.

The Idea –Shared weights: all neurons in a feature map share the same weights (but not the biases). In this way, all neurons detect the same feature at different positions in the input image. The detected features are combined to achieve shift-invariant feature detection. –This is combined with layers implementing subsampling, which decrease the resolution and the sensitivity to distortions.

NN Architecture

Convolutional NN –If a neuron in the feature map fires, this corresponds to a match with the template. –Neurons of the feature map react to the same pattern at different positions in the input image. –For neurons in the feature map that are one neuron apart (in the matrix representation of the feature map), their templates in the input image are two pixels apart. Thus the input image is undersampled, and some position information is eliminated. –A similar 2-to-1 undersampling occurs as one goes from H1 to H2. The rationale is that although high resolution may be needed to detect a feature, its exact position need not be determined at equally high precision. A sketch of such a shared-weight feature map follows.
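The NumPy sketch below illustrates this mechanism (it is not the original implementation): one 5×5 weight template is shared by every neuron of the map, and moving one neuron in the map moves the template two pixels in the input, giving the 2-to-1 undersampling just described. Note that with this simple no-padding arithmetic a 16×16 input yields a 6×6 map; the border handling needed to obtain the 8×8 maps of the following slides is omitted here.

```python
import numpy as np

def feature_map(image, template, bias, stride=2):
    """One shared-weight feature map with 2-to-1 undersampling (sketch).

    image    -- 2-D input, e.g. shape (16, 16), values in [-1, 1]
    template -- shared 5x5 weight matrix (identical for every neuron)
    bias     -- a single bias; the slides allow per-neuron biases,
                so this is a simplification
    """
    k = template.shape[0]
    out_size = (image.shape[0] - k) // stride + 1
    out = np.empty((out_size, out_size))
    for i in range(out_size):
        for j in range(out_size):
            # Neurons one apart in the map see input squares two
            # pixels apart: same weights, shifted receptive field.
            patch = image[i*stride:i*stride + k, j*stride:j*stride + k]
            out[i, j] = np.tanh(np.sum(template * patch) + bias)
    return out
```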

A Feature Map

A Sub-sampling Map

Architecture Input layer: 256 neurons with input values in the range [-1, 1]. Hidden layer H1: consists of 12 feature maps H1.1, …, H1.12. Feature map: –It consists of 8×8 neurons. –Each neuron in the feature map has the same incoming weights, but is connected to a square at a unique position in the input image. This square is called a template.

Architecture Hidden layer H2: consists of 12 sub-sampling maps H2.1, …, H2.12. Sub-sampling map: –Consists of 4×4 neurons. –Each neuron of the sub-sampling map is connected to a 5×5 square of neurons in each of 8 of the 12 feature maps H1.j. –All neurons of the sub-sampling map share the same 25 weights.

Architecture Hidden layer H3: –Consists of 30 neurons. –H3 is completely connected to the sub-sampling layer (H2). Output layer: consists of 10 neurons, numbered 0, …, 9; the neuron with the highest activation value is chosen, and the recognized digit equals that neuron's number. A Dutch master's thesis on Le Cun's shared-weights NNs: D. de Ridder, "Shared weights neural networks in image analysis".
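As a quick sanity check on the layer sizes quoted in these slides, the short script below tallies the units per layer; the comments only restate figures from the slides.

```python
# Units per layer of the LeNet-style digit recognizer described above
layers = {
    "input":  16 * 16,     # 256 pixel intensities in [-1, 1]
    "H1":     12 * 8 * 8,  # 12 feature maps of 8x8 neurons = 768
    "H2":     12 * 4 * 4,  # 12 sub-sampling maps of 4x4 neurons = 192
    "H3":     30,          # fully connected to H2
    "output": 10,          # one neuron per digit 0..9
}
for name, n in layers.items():
    print(f"{name:>6}: {n} units")

# Weight sharing keeps the parameter count small: each feature map
# needs only one 5x5 template (25 weights), however many neurons it has.
```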

Le Cun NN Architecture

Training –Backpropagation was used to learn the weights. –The hyperbolic tangent was chosen as the neuron model (illustrated below). –After training, the number of misclassified patterns was 10 on the training set and 102 on the test set.
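For completeness, a generic textbook illustration (not code from the original system) of the tanh neuron model and the derivative that backpropagation needs at each neuron.

```python
import numpy as np

def tanh_forward(z):
    """Neuron model: activation a = tanh(z)."""
    return np.tanh(z)

def tanh_backward(a):
    """Derivative of tanh written in terms of its own output:
    d tanh(z)/dz = 1 - tanh(z)**2 = 1 - a**2.
    Backpropagation multiplies the error arriving at a neuron by this."""
    return 1.0 - a**2
```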