Face Recognition Using Face Unit Radial Basis Function Networks Ben S. Feinstein Harvey Mudd College December 1999

Original Project Proposal
Try to reproduce published results for RBF neural nets performing face recognition.

Recap of RBF Networks
Neuron responses are "locally-tuned" or "selective" for some range of the input space. Biologically plausible: cochlear stereocilia cells in the human ear exhibit a locally-tuned response to frequency. The network contains one hidden layer of radial neurons, usually Gaussian functions, whose outputs feed an output layer of linear neurons.
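To make the structure concrete, here is a minimal sketch (not the project's actual code) of a Gaussian hidden unit feeding a linear output neuron; all names and parameter values are illustrative.

// Minimal sketch of the RBF forward pass described above: a hidden layer of
// Gaussian "locally-tuned" units feeding a linear output neuron.
#include <cmath>
#include <vector>

// Response of one radial neuron: peaks when x is at the center and falls off
// with distance, controlled by the width sigma.
double gaussianUnit(const std::vector<double>& x,
                    const std::vector<double>& center, double sigma) {
    double dist2 = 0.0;
    for (size_t i = 0; i < x.size(); ++i) {
        double d = x[i] - center[i];
        dist2 += d * d;
    }
    return std::exp(-dist2 / (2.0 * sigma * sigma));
}

// One linear output neuron: a weighted sum of the hidden-layer responses.
double rbfOutput(const std::vector<double>& x,
                 const std::vector<std::vector<double>>& centers,
                 const std::vector<double>& sigmas,
                 const std::vector<double>& weights) {
    double sum = 0.0;
    for (size_t j = 0; j < centers.size(); ++j)
        sum += weights[j] * gaussianUnit(x, centers[j], sigmas[j]);
    return sum;
}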

Recap of RBF Networks (2)

Face Unit Network Architecture
First proposed in June 1995 by Dr. A. J. Howell, School of Cognitive and Computing Sciences, Univ. of Sussex, UK. A face unit is structured to recognize only one person, using a hybrid RBF architecture. The network has two linear outputs: one indicating a positive ID of the person, the other a negative ID.
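As an illustration only (the project's FaceUnit class may be organized differently), a face unit can be pictured as a set of radial centers drawn from the target person tied to the + output, a set of centers drawn from other people tied to the - output, and a weight vector for each of the two linear outputs:

// Hypothetical sketch of the data one face unit might carry.
#include <vector>

struct FaceUnitSketch {
    std::vector<std::vector<double>> posCenters;  // centers from the target person
    std::vector<std::vector<double>> negCenters;  // centers from other people
    std::vector<double> posSigmas, negSigmas;     // widths of the radial neurons
    std::vector<double> wPos, wNeg;               // weights of the +ID and -ID linear outputs
};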

Face Unit Architecture (2)
A p+a face unit network has p radial neurons linked to the + output and a neurons linked to the - output.
Challenges:
– Bitmap faces are high-dimensional.
– How do we reduce the dimensionality of the problem, extracting only the relevant information?

Gabor Wavelet Analysis
Answer: use 2D Gabor wavelets, a class of orientation- and position-selective functions. In this case, they reduce the dimensionality from 10,000 (a 100x100 pixel sample) to 126. Biologically plausible: cells in the visual cortex respond selectively to stimulation that is both local in retinal position and local in angle of orientation.
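A minimal sketch of one 2D Gabor mask, assuming the standard form of a Gaussian envelope multiplied by a sinusoidal carrier at orientation theta; the sizes, wavelengths, and widths used in the project are not reproduced here.

// Build one Gabor mask (sine or cosine phase) at orientation theta.
#include <cmath>
#include <vector>

const double PI = 3.14159265358979323846;

std::vector<std::vector<double>> gaborMask(int size, double theta,
                                           double sigma, double lambda,
                                           bool cosinePhase) {
    std::vector<std::vector<double>> mask(size, std::vector<double>(size));
    double half = (size - 1) / 2.0;
    for (int y = 0; y < size; ++y) {
        for (int x = 0; x < size; ++x) {
            // Rotate coordinates so the carrier runs along orientation theta.
            double xr =  (x - half) * std::cos(theta) + (y - half) * std::sin(theta);
            double yr = -(x - half) * std::sin(theta) + (y - half) * std::cos(theta);
            double envelope = std::exp(-(xr * xr + yr * yr) / (2.0 * sigma * sigma));
            double carrier  = cosinePhase ? std::cos(2.0 * PI * xr / lambda)
                                          : std::sin(2.0 * PI * xr / lambda);
            mask[y][x] = envelope * carrier;
        }
    }
    return mask;
}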

Approach to Problem
Sample data:
– 10 people x 10 poses per person, ranging from 0° (head-on) to 90° (side profile) = 100 sample images.
– All images are 384x287 pixel grayscale Sun rasterfiles, courtesy of the Univ. of Sussex face database.
– 5 men and 5 women in the sample set, mostly Caucasian.

Approach to Problem (2) Example of images for 1 person...

Approach to Problem (3)
Preprocessing:
– Used a 100x100 pixel window around the pixel at the tip of the nose. Wrote the NosePicker Java app to display images and save manually clicked nose coordinates.
– Used Gabor orientations (0°, 60°, 120°) with sine and cosine masks = 6 functions.
– Calculated the 6 Gabor masks on 1 99x99, 4 51x51, and 16 25x25 pixel subsamples: 6 masks x 21 windows = 126 coefficients.
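An illustrative sketch of how the 126-element feature vector could be assembled: 21 sampling windows (1 at 99x99, 4 at 51x51, 16 at 25x25) times 6 Gabor masks (3 orientations x sine/cosine) = 126 coefficients. The inner-product correlation shown here, and the assumption that each mask matches its window size, are simplifications, not the project's exact preprocessing code.

#include <vector>

using Image = std::vector<std::vector<double>>;

// One coefficient: correlate a Gabor mask with an image window of the same size.
double gaborCoefficient(const Image& window, const Image& mask) {
    double sum = 0.0;
    for (size_t y = 0; y < window.size(); ++y)
        for (size_t x = 0; x < window[y].size(); ++x)
            sum += window[y][x] * mask[y][x];
    return sum;
}

// 21 windows x 6 masks = 126 coefficients per 100x100 face sample.
// In practice one set of 6 masks per window size would be needed; here the
// masks are assumed to match the current window's size.
std::vector<double> extractFeatures(const std::vector<Image>& windows,  // 21 windows
                                    const std::vector<Image>& masks) {  // 6 masks
    std::vector<double> features;
    for (const Image& w : windows)
        for (const Image& m : masks)
            features.push_back(gaborCoefficient(w, m));
    return features;
}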

Approach to Problem (4)
Preprocessing: sampling windows and orientations (figure).

Approach to Problem (5)
Network Setup/Training:
– All input vectors were unit normalized, and the unit-normalized Gaussian function was used.
– For each p+a face unit network, a fixed set of p poses was used to center the + neurons.
– For each + neuron, the a/p nearest unique negative input vectors are used to center a/p - neurons.
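A minimal sketch of this setup, under the assumption that "nearest" means smallest Euclidean distance: inputs are scaled to unit length, and for each positive center the closest not-yet-used negative examples become centers of the - neurons.

#include <algorithm>
#include <cmath>
#include <set>
#include <utility>
#include <vector>

using Vec = std::vector<double>;

// Scale a vector to unit Euclidean length.
void unitNormalize(Vec& v) {
    double norm = 0.0;
    for (double x : v) norm += x * x;
    norm = std::sqrt(norm);
    if (norm > 0.0)
        for (double& x : v) x /= norm;
}

double dist2(const Vec& a, const Vec& b) {
    double s = 0.0;
    for (size_t i = 0; i < a.size(); ++i) s += (a[i] - b[i]) * (a[i] - b[i]);
    return s;
}

// For each positive center, pick the negPerPos nearest unused negative vectors.
std::vector<Vec> pickNegativeCenters(const std::vector<Vec>& posCenters,
                                     const std::vector<Vec>& negatives,
                                     size_t negPerPos) {
    std::vector<Vec> chosen;
    std::set<size_t> used;
    for (const Vec& c : posCenters) {
        std::vector<std::pair<double, size_t>> order;
        for (size_t i = 0; i < negatives.size(); ++i)
            if (!used.count(i)) order.push_back({dist2(c, negatives[i]), i});
        std::sort(order.begin(), order.end());
        for (size_t k = 0; k < negPerPos && k < order.size(); ++k) {
            used.insert(order[k].second);
            chosen.push_back(negatives[order[k].second]);
        }
    }
    return chosen;
}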

Approach to Problem (6)
Network Setup/Training, cont.:
– Setting appropriate widths for the + and - neurons remains a problem.
– Linear output weights are computed by finding the pseudoinverse of A, the matrix of hidden-neuron outputs for each input. Since we want Aw = d, we need w = A^-1 d; the singular value decomposition (SVD) method is used to approximate A^-1 (the pseudoinverse), since A is singular.
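The project used a published SVD routine in C (from Numerical Recipes). Purely for illustration, the same weight solve can be sketched with the Eigen C++ library, whose SVD solver returns the minimum-norm least-squares solution w = A+ d:

#include <Eigen/Dense>
#include <iostream>

int main() {
    // A: hidden-layer responses, one row per training input, one column per
    // radial neuron. d: desired output for each training input.
    Eigen::MatrixXd A(4, 3);
    A << 1.0, 0.2, 0.1,
         0.3, 1.0, 0.4,
         0.1, 0.5, 1.0,
         0.2, 0.3, 0.9;
    Eigen::VectorXd d(4);
    d << 1.0, 1.0, 0.0, 0.0;

    // SVD-based solve: handles A being singular or non-square by discarding
    // negligible singular values, giving the pseudoinverse solution.
    Eigen::JacobiSVD<Eigen::MatrixXd> svd(A, Eigen::ComputeThinU | Eigen::ComputeThinV);
    Eigen::VectorXd w = svd.solve(d);
    std::cout << "output weights:\n" << w << std::endl;
    return 0;
}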

Approach to Problem (7)
Network Setup/Training, cont.:
– The advantage is effectively instantaneous "training": unlike gradient descent, training is no longer an iterative process.
– We only need to find the pseudoinverse and perform a matrix-vector multiplication to calculate the linear output weight vector.

Results
Currently have tested 3+6 and 6+12 networks. Selection of neuron widths remains a problem, with manual tweaking necessary for good results. The 3+6 network performs about as well as a random classifier.

Results (2)
The 6+12 network performed better (see below):

        Correct   Pro      Anti
  Min   37.8%     0%       37.2%
  Max   95.1%     100%     98.7%
  Avg   72.6%     55.0%    73.5%

Results (3)
Compare with Dr. Howell's published results (see below):

  Avg correct: 89%
  Min pro: 50%     Min anti: 83%
  Max pro: 100%    Max anti: 100%

These results are better; however, Dr. Howell used a more complex preprocessing scheme, yielding input vectors of dimension 510 versus our 126.

Future Work
Devise an algorithm to choose appropriate neuron widths for the + and - neurons, or experiment with other radial basis functions that don't need widths, such as the thin-plate spline. Implement a network of face units whose output indicates a face's identity, instead of just an affirmative or negative response.
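For reference, a sketch of the width-free basis mentioned above, assuming the usual thin-plate spline form phi(r) = r^2 log r, which depends only on the distance to the center and so needs no per-neuron width:

#include <cmath>
#include <vector>

double thinPlateSpline(const std::vector<double>& x,
                       const std::vector<double>& center) {
    double r2 = 0.0;
    for (size_t i = 0; i < x.size(); ++i) {
        double d = x[i] - center[i];
        r2 += d * d;
    }
    if (r2 == 0.0) return 0.0;          // phi(0) = 0 by convention
    return 0.5 * r2 * std::log(r2);     // r^2 log r, written via log(r^2)/2
}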

Future Work (2)
Implement a confidence threshold to automatically discard low-confidence results. Expand the Gabor preprocessing scheme to yield more coefficients.
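One hypothetical way the confidence threshold could work (not part of the project code): accept a face unit's decision only when its + and - outputs differ by more than some margin.

#include <cmath>

enum class Decision { Accept, Reject, Unsure };

Decision classifyWithThreshold(double posOutput, double negOutput, double margin) {
    if (std::fabs(posOutput - negOutput) < margin)
        return Decision::Unsure;                       // too close to call: discard
    return posOutput > negOutput ? Decision::Accept : Decision::Reject;
}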

What Code Was Written?
Wrote a C++ RBFNet class and rbf app to implement an RBF net with n-dimensional input and 1 linear output neuron.
– Uses k-means clustering, the global first-nearest-neighbor heuristic, and gradient descent.
Wrote a C++ FaceUnit class and face_net app to implement a scalable face unit network.
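For illustration only (the RBFNet implementation itself is not reproduced here), a compact k-means pass of the kind used to place RBF centers:

#include <cstddef>
#include <limits>
#include <vector>

using Vec = std::vector<double>;

static double dist2(const Vec& a, const Vec& b) {
    double s = 0.0;
    for (size_t i = 0; i < a.size(); ++i) s += (a[i] - b[i]) * (a[i] - b[i]);
    return s;
}

// Returns k cluster centers after a fixed number of Lloyd iterations.
// Assumes data.size() >= k; centers are seeded with the first k points.
std::vector<Vec> kMeansCenters(const std::vector<Vec>& data, size_t k, int iters = 20) {
    std::vector<Vec> centers(data.begin(), data.begin() + k);
    for (int it = 0; it < iters; ++it) {
        std::vector<Vec> sums(k, Vec(data[0].size(), 0.0));
        std::vector<size_t> counts(k, 0);
        for (const Vec& x : data) {
            // Assign each point to its nearest center.
            size_t best = 0;
            double bestD = std::numeric_limits<double>::max();
            for (size_t c = 0; c < k; ++c) {
                double d = dist2(x, centers[c]);
                if (d < bestD) { bestD = d; best = c; }
            }
            for (size_t i = 0; i < x.size(); ++i) sums[best][i] += x[i];
            ++counts[best];
        }
        // Move each center to the mean of its assigned points.
        for (size_t c = 0; c < k; ++c)
            if (counts[c] > 0)
                for (size_t i = 0; i < sums[c].size(); ++i)
                    centers[c][i] = sums[c][i] / counts[c];
    }
    return centers;
}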

What Code Was Written? (2)
Wrote a Java app to display images and save manually clicked nose coordinates. Wrote a C++ program to perform image sampling and Gabor wavelet preprocessing. Wrote Perl scripts to generate input files. Hope to soon have a Perl script to automatically run input files and compile performance results.

Acknowledgments
Dr. A. J. Howell, School of Cognitive and Computing Sciences, Univ. of Sussex, UK
– Provided Gabor data and sample face images.
Dr. Robert Oostenveld, Dept. of Medical Physics and Clinical Neurophysiology, University Nijmegen, The Netherlands
– Provided a C routine for SVD pseudoinverse calculation.

Acknowledgments (2)
Numerical Recipes Software, Numerical Recipes in C: The Art of Scientific Computing
– Used their published singular value decomposition routine in C.
And last, but not least… Prof. Keller
– Invaluable guidance and advice regarding this project.