Lecture 5: Some Examples. PMR5406 Redes Neurais e Lógica Fuzzy.


APPLICATIONS
Two examples of real-life applications of neural networks for pattern classification:
– RBF networks for face recognition
– FF networks for handwritten digit recognition

FACE RECOGNITION
The problem:
– Face recognition of persons from a known group in an indoor environment.
The approach:
– Learn face classes over a wide range of poses using an RBF network.
Based on the PhD thesis of Jonathan Howell, University of Sussex.

Dataset
Sussex database (University of Sussex):
– 100 images of 10 people (8-bit grayscale, resolution 384 x 287)
– for each individual, 10 images of the head in different poses, from face-on to profile
– designed to assess the performance of face recognition techniques when pose variations occur

Robustness to shift variance and scale variance

Datasets (Sussex)
All ten images for classes 0-3 from the Sussex database, nose-centred and subsampled to 25x25 before preprocessing.

Pre-Processing
Raw data can be used, but results improve with pre-processing. Possible approaches:
– Difference of Gaussians (DoG) pre-processing.
– Gabor pre-processing.
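
To make the DoG step concrete, here is a minimal sketch of Difference-of-Gaussians filtering; the function name and the sigma values are illustrative assumptions, not taken from the slides.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dog_preprocess(image, sigma_narrow=1.0, sigma_wide=1.6):
    """Band-pass filter an image by subtracting two Gaussian blurs."""
    img = image.astype(float)
    return gaussian_filter(img, sigma_narrow) - gaussian_filter(img, sigma_wide)

# Example: filter a 25x25 patch (the subsampled face size used above)
patch = np.random.rand(25, 25)
filtered = dog_preprocess(patch)
```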

Some justification

DoG

DoG masks

Some examples:

Binarisation (1)

Binarisation (2)

Gabor Pre-Processing


Gabor Masks
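
For reference, a common way to generate Gabor masks like these: a Gaussian envelope multiplied by an oriented sinusoid. The parameterisation is the standard one (orientation theta, wavelength lam, spread sigma, aspect ratio gamma); the specific values are illustrative assumptions.

```python
import numpy as np

def gabor_mask(size=15, theta=0.0, lam=8.0, sigma=4.0, gamma=0.5, psi=0.0):
    """Return a size x size Gabor kernel: a Gaussian-windowed sinusoid."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_t = x * np.cos(theta) + y * np.sin(theta)    # rotate coordinates
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(x_t**2 + (gamma * y_t)**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * x_t / lam + psi)
    return envelope * carrier

# A bank of masks at four orientations
bank = [gabor_mask(theta=t) for t in (0, np.pi/4, np.pi/2, 3*np.pi/4)]
```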

Approach: Face-Unit RBF
A face-unit RBF network is trained to recognise a single person. Training uses images of the person to be recognised as positive evidence, together with selected confusable images of other people as negative evidence.

Network Architecture
Input layer: 25x25 = 625 inputs, representing the (normalised) pixel intensities of an image.
Hidden layer: p+a neurons:
– p hidden "pro" neurons (receptors for positive evidence)
– a hidden "anti" neurons (receptors for negative evidence)
Output layer: two neurons:
– one for the particular person
– one for all the others.
The output is discarded if the absolute difference of the two output neurons is smaller than a parameter R; a sketch of this rule follows.
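
A minimal sketch of the rejection rule, assuming the two output activations have already been computed (the names are mine; R = 0.3 is the value used in the experiment reported below).

```python
def face_unit_decide(y_person, y_others, R=0.3):
    """Return 'person', 'other', or None (output discarded) for one image."""
    if abs(y_person - y_others) < R:
        return None  # difference below R: too ambiguous, discard
    return "person" if y_person > y_others else "other"
```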

RBF Architecture for One Face Recognition Unit
(Diagram: input units feed non-linear RBF units, which feed linear output units; the RBF layer is trained unsupervised, the output layer supervised.)


Hidden Layer
Hidden nodes can be:
– Pro neurons: evidence for that person.
– Anti neurons: negative evidence.
The number of pro neurons equals the number of positive examples in the training set. For each pro neuron there is either one or two anti neurons. Hidden neuron model: Gaussian RBF.

The Parameters
Centres:
– of a pro neuron: the corresponding positive example.
– of an anti neuron: the negative example most similar to the corresponding pro neuron, with respect to the Euclidean distance.
Spread: the spread $\sigma_h$ of hidden node $h$ is the average distance of its centre $\mathbf{c}_h$ from all the other centres. With $H$ hidden nodes in total:

$$\sigma_h = \frac{1}{H-1} \sum_{j \neq h} \lVert \mathbf{c}_h - \mathbf{c}_j \rVert$$

Weights: determined using the pseudo-inverse method.
An RBF network with 6 pro neurons, 12 anti neurons, and R equal to 0.3 discarded 23 per cent of the images of the test set and classified correctly 96 per cent of the non-discarded images.
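
A minimal sketch of these parameter choices, assuming the centres are stacked as rows of a matrix and a Gaussian activation of the form exp(-d^2 / (2 sigma^2)); all names are mine.

```python
import numpy as np

def spreads(C):
    """sigma_h = average Euclidean distance from centre h to the other centres."""
    H = len(C)
    d = np.linalg.norm(C[:, None, :] - C[None, :, :], axis=-1)  # H x H distances
    return d.sum(axis=1) / (H - 1)  # self-distance is zero, so divide by H-1

def rbf_activations(X, C, sigma):
    """Gaussian RBF activations of every hidden node for each input row of X."""
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2 * sigma**2))

def train_output_weights(X, C, sigma, T):
    """Pseudo-inverse solution for the linear output layer: W = pinv(Phi) T."""
    Phi = rbf_activations(X, C, sigma)
    return np.linalg.pinv(Phi) @ T  # T has one column per output neuron
```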

Handwritten Digit Recognition Using Convolutional Networks
Developed by Yann LeCun while working at AT&T.

HANDWRITTEN DIGIT RECOGNITION

Convolutional Network
A convolutional network is a multilayer perceptron designed specifically to recognise two-dimensional shapes with a high degree of invariance to translation, scaling, skewing, and other forms of distortion.

Characteristics (1)
1. Feature extraction: each neuron takes its synaptic inputs from a local receptive field in the previous layer, extracting local features.
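
A sketch of what a single feature map computes, assuming 'valid' convolution with a 5x5 kernel and a tanh squashing function (the non-linearity is my assumption; the slides do not specify it here).

```python
import numpy as np

def feature_map(image, kernel, bias=0.0):
    """Slide a local receptive field over the image, one output per position."""
    k = kernel.shape[0]
    out_h, out_w = image.shape[0] - k + 1, image.shape[1] - k + 1
    out = np.empty((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            # every neuron sees only a k x k patch of the previous layer
            out[i, j] = (image[i:i+k, j:j+k] * kernel).sum() + bias
    return np.tanh(out)

# A 28x28 input with a 5x5 receptive field yields a 24x24 feature map
fmap = feature_map(np.random.rand(28, 28), np.random.randn(5, 5) * 0.1)
assert fmap.shape == (24, 24)
```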

Characteristics (2)
2. Feature mapping: each computational layer is composed of multiple feature maps, and within each feature map the neurons share weights. This promotes:
– shift invariance,
– a reduction in the number of free parameters.
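
Back-of-the-envelope arithmetic for that reduction, using the first convolutional layer of the digit recogniser described below (4 feature maps of 24x24 neurons, 5x5 receptive fields plus one bias each).

```python
maps, out_h, out_w, k = 4, 24, 24, 5
connections = maps * out_h * out_w * (k * k + 1)  # each neuron has 26 inputs
free_params = maps * (k * k + 1)                  # one shared kernel+bias per map
print(connections, free_params)                   # 59904 connections, 104 parameters
```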

Characteristics (3)
3. Subsampling: each convolutional layer is followed by a computational layer that performs local averaging and subsampling. This reduces the sensitivity to shifts and other forms of distortion.
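
A minimal sketch of one subsampling map: non-overlapping 2x2 averaging followed by a trainable weight, bias, and sigmoid, as described above (names are mine).

```python
import numpy as np

def subsample(fmap, weight=1.0, bias=0.0):
    """Halve each spatial dimension by averaging 2x2 blocks, then squash."""
    h, w = fmap.shape
    blocks = fmap.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return 1.0 / (1.0 + np.exp(-(weight * blocks + bias)))  # sigmoid

# A 24x24 feature map becomes 12x12
out = subsample(np.random.rand(24, 24))
assert out.shape == (12, 12)
```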

Architecture (0)

Architecture (1)
Input layer: 28x28 sensory nodes.
First hidden layer: convolution, with 4 feature maps of 24x24 neurons. Each neuron is assigned a receptive field of 5x5.
Second hidden layer: subsampling and local averaging, with 4 feature maps of 12x12 neurons. Each neuron has a 2x2 receptive field, a trainable weight, a trainable bias, and a sigmoid activation function.

Architecture (2)
Third hidden layer: convolution, with 12 feature maps of 8x8 neurons. Each neuron may have synaptic connections from several feature maps in the previous hidden layer.
Fourth hidden layer: subsampling and averaging, with 12 feature maps of 4x4 neurons.

Architecture (3)
Output layer: convolution, with one neuron for each character. Each neuron is connected to a receptive field of 4x4.

Architecture (4)
Convolution -> subsampling -> convolution -> subsampling -> ...
At each convolutional or subsampling layer, the number of feature maps increases while the spatial resolution decreases. The network has approximately 100,000 synaptic connections but only about 2,600 free parameters.
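
A quick shape trace of the alternating pipeline, assuming 'valid' 5x5 convolutions and 2x2 subsampling as in the preceding slides.

```python
layers = [
    ("input",        1, 28),
    ("convolution",  4, 28 - 5 + 1),  # 4 feature maps of 24x24
    ("subsampling",  4, 24 // 2),     # 4 feature maps of 12x12
    ("convolution", 12, 12 - 5 + 1),  # 12 feature maps of 8x8
    ("subsampling", 12, 8 // 2),      # 12 feature maps of 4x4
]
for name, maps, size in layers:
    print(f"{name:>12}: {maps} feature map(s) of {size}x{size}")
```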