Traffic Sign Recognition Using Artificial Neural Network Radi Bekker

Motivation for ANN  Von Neumann machines are based on sequential processing: one processing unit carrying out many operations per second.  Neural networks are based on the parallel architecture of animal brains: slow, parallel, and complicated, but well suited to pattern matching.  Pattern matching can solve many problems for which algorithms do not exist or are very complicated.

The human brain  Consists of about 10^11 neurons.  Neurons are connected by around 10^15 connections.  Neurons send impulses to each other through these connections, and these impulses make the brain work.  Dendrites are responsible for input.  The axon is responsible for output.

Artificial neural network (ANN)  The network is constructed from layers of artificial neurons.  There are an input layer, an output layer, and any number of hidden (internal) layers.  Each neuron in one layer is connected to every neuron in the next layer.
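As an illustration (not from the original slides), this fully connected layered structure can be written down as one weight matrix per pair of adjacent layers; the layer sizes below are hypothetical.

```python
import numpy as np

# Hypothetical layer sizes for illustration: input, two hidden layers, output.
layer_sizes = [4, 5, 3, 2]

# One weight matrix per pair of adjacent layers: W[j, i] is the weight of the
# connection from neuron i in the previous layer to neuron j in the next layer,
# so every neuron in one layer is connected to every neuron in the next.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((n_out, n_in))
           for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]

for k, W in enumerate(weights):
    print(f"layer {k} -> layer {k + 1}: weight matrix of shape {W.shape}")
```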

Artificial Neuron  Many inputs, like dendrites.  One output, like the axon.  Each neuron receives a signal from the neurons in the previous layer.  The weighted inputs are summed and passed through a limiting function that scales the output to a fixed range of values.  The output of the limiter is then broadcast to all of the neurons in the next layer.
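A minimal sketch of a single artificial neuron, assuming a sigmoid as the limiting function (the slides do not name a specific one): the inputs are weighted, summed, and squashed into a fixed output range.

```python
import numpy as np

def sigmoid(x):
    """Limiting function: scales the output to the fixed range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def neuron_output(inputs, weights, bias=0.0):
    """Sum the weighted inputs and pass the result through the limiting function."""
    return sigmoid(np.dot(weights, inputs) + bias)

# Example: three inputs (like dendrites) producing one output (like the axon).
x = np.array([0.5, 0.1, 0.9])
w = np.array([0.4, -0.6, 0.2])
print(neuron_output(x, w, bias=0.1))
```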

Training: Back Propagation (1)  The most common learning algorithm is called Back Propagation (BP).  A BP network learns by example: we must provide a learning set that consists of input examples and the known-correct output for each case.  The method adjusts the weights between the neurons to solve a particular problem.  The BP learning process works in small iterative steps: one of the example cases is applied to the network, and the network produces some output based on the current state of its synaptic weights.  This output is compared to the known-good output, and a mean-squared error signal is calculated.
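The slides do not give the formula for the error signal; a common choice, used here as an assumption, is the mean-squared error between the known-good output and the network's output.

```python
import numpy as np

def mean_squared_error(target, output):
    """Mean-squared error between the known-correct output and the network output."""
    target, output = np.asarray(target, dtype=float), np.asarray(output, dtype=float)
    return float(np.mean((target - output) ** 2))

# Example: a 3-neuron output layer compared against its known-good target.
print(mean_squared_error([1.0, 0.0, 0.0], [0.8, 0.2, 0.1]))
```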

Training: Back Propagation (2)  The error value is then propagated backwards through the network, and small changes are made to the weights in each layer.  The whole process is repeated for each of the example cases, then back to the first case again, and so on.  The cycle is repeated until the overall error value drops below some pre-determined threshold.  At this point we say that the network has learned the problem "well enough".
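A compact sketch of this iterative process, assuming one hidden layer, sigmoid activations, plain gradient descent, and a toy XOR learning set (none of these details come from the slides): each example case is applied in turn, the error is propagated backwards, and the cycle repeats until the overall error drops below a threshold.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(1)

# Toy learning set: input examples and the known-correct output for each case.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# Assumed sizes: 2 inputs, 4 hidden neurons, 1 output neuron.
W1, b1 = rng.standard_normal((2, 4)), np.zeros(4)
W2, b2 = rng.standard_normal((4, 1)), np.zeros(1)
lr, threshold = 0.5, 0.01

for cycle in range(20000):
    total_error = 0.0
    for x, t in zip(X, T):                     # apply one example case at a time
        h = sigmoid(x @ W1 + b1)               # hidden layer output
        o = sigmoid(h @ W2 + b2)               # network output
        err = t - o
        total_error += float(np.mean(err ** 2))

        # Propagate the error backwards and make small changes to the weights.
        delta_o = err * o * (1 - o)
        delta_h = (delta_o @ W2.T) * h * (1 - h)
        W2 += lr * np.outer(h, delta_o)
        b2 += lr * delta_o
        W1 += lr * np.outer(x, delta_h)
        b1 += lr * delta_h

    if total_error / len(X) < threshold:       # overall error below the threshold
        print(f"learned the problem 'well enough' after {cycle + 1} cycles")
        break
```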

My Network  Input layer: 10,000 neurons.  Hidden layers: 3 hidden layers with 10 neurons each.  Output layer: 16 neurons, one for each of 16 traffic signs.  Training: the network was trained for 2000 cycles.
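The slides do not say which tool was used to build this network. As one hedged way to reproduce a similar configuration, scikit-learn's MLPClassifier can be given the same hidden layer sizes and iteration count (the activation and solver choices here are assumptions; the input and output layer sizes are inferred from the data).

```python
from sklearn.neural_network import MLPClassifier

# Three hidden layers with 10 neurons each, trained for up to 2000 cycles.
# The 10,000 input neurons and 16 output classes come from the training data:
# each row of X_train is a 10,000-element image vector, y_train holds the 16 labels.
net = MLPClassifier(hidden_layer_sizes=(10, 10, 10),
                    activation="logistic",   # sigmoid-like limiting function
                    solver="sgd",
                    max_iter=2000)

# Hypothetical usage (X_train, y_train, X_test are not defined here):
# net.fit(X_train, y_train)
# predictions = net.predict(X_test)
```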

Image Filtering  Resizing the image to 100x100 pixels.  Converting the image to black and white.  Rescaling the image matrix to values between 0 and 1.  Constructing a 10,000-element vector from the columns of the image matrix.
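A minimal sketch of this preprocessing pipeline using Pillow and NumPy (the original implementation is not shown in the slides; the file name below is hypothetical).

```python
import numpy as np
from PIL import Image

def image_to_vector(path):
    """Turn a traffic-sign image into a 10,000-element input vector."""
    img = Image.open(path)
    img = img.resize((100, 100))                   # resize to 100x100
    img = img.convert("L")                         # convert to black and white
    matrix = np.asarray(img, dtype=float) / 255.0  # rescale to values in [0, 1]
    return matrix.flatten(order="F")               # stack the columns into one vector

# Hypothetical usage:
# x = image_to_vector("stop_sign.jpg")  # x.shape == (10000,)
```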

Results  Good results for trained images.  Bad results for real picture images.  When the network was constructed to identify only 5 images, better results were achieved.  Contrast and brightness adjustments in some cases contributed to correct sign recognition.

Conclusions  ANN is good for small problems and networks.  ANN does not scale well to big networks.  A bigger network needs more training time.  It is hard to find good network configurations.  ANN is a good method for solving hard computational problems.  More research on the human brain could help in constructing better ANNs.