Ranga Rodrigo February 8, 2014

Slides from: Doug Gray, David Poole

The Artificial Neuron

Introduction The basic building block of an artificial neural network is the artificial neuron. The neuron sums its weighted inputs; if this sum exceeds a threshold value, the neuron fires and a signal is transmitted via the axon to other neurons. In this lecture, we learn about the artificial neuron.

Artificial Neuron (figure: inputs x1, x2, x3, …, xD with weights w1, w2, w3, …, wD, together with a bias input x0 = 1 weighted by w0, feed a summing unit Σ; the sum a passes through an activation function f to produce the output y.) That is, a = Σ_{d=0}^{D} w_d x_d and y = f(a).

Activation Functions (figure: plots of common activation functions f.)
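The slide's plots are not reproduced here, so the exact set of functions shown is an assumption; the following is a sketch of the activation functions most commonly presented alongside the perceptron and ADALINE.

```python
import numpy as np

def step(a):
    """Heaviside step: 1 if a >= 0, else 0 (used by the perceptron)."""
    return np.where(a >= 0, 1.0, 0.0)

def sigmoid(a):
    """Logistic sigmoid: smoothly squashes a into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-a))

def identity(a):
    """Identity activation: output equals the weighted sum (ADALINE)."""
    return a
```

Each takes the weighted sum a = Σ w_d x_d and returns the neuron's output y = f(a).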

Perceptron The perceptron is a single-layer NN with a step activation function. Because of its activation function, the perceptron takes only two different output values, so it classifies the vectors applied at its input into one of two classes.
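A minimal sketch of the perceptron's forward pass, assuming a step activation and the bias handled as in the diagram above (weight w0 with x0 = 1); the weight values in the usage note are illustrative.

```python
import numpy as np

def perceptron_output(w, x):
    """w: weights (w0, w1, ..., wD); x: input (x1, ..., xD).
    Returns class 1 if the weighted sum is non-negative, else 0."""
    a = w[0] + np.dot(w[1:], x)   # weighted sum, including bias w0 * 1
    return 1 if a >= 0 else 0
```

For example, with w = (-1.5, 1, 1) the perceptron computes the logical AND of two binary inputs, since the sum exceeds the threshold only when both inputs are 1.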

Question Sketch the perceptron for two-dimensional (2-D) data of the form . How many weight parameters are there to be learned in this case?

Learning Learning means adjusting the weights. We adjust the weights by presenting a set of input vectors with known desired (target) values, one at a time. If the desired value and the output of the NN differ, there is an error, and we may adjust the weights. We repeat the process until the sum of errors becomes smaller than a threshold.

Objective Function (Error) Here we consider the sum of squared errors as the objective function, with an identity activation function. Given a training set comprising a set of input vectors x_1, …, x_N together with a corresponding set of target vectors t_1, …, t_N, we minimize the error function E(w) = (1/2) Σ_{n=1}^{N} (y(x_n, w) − t_n)², where t_n is the true value for the nth input vector and y(x_n, w) is the output of the NN for the nth input vector.
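The error function above can be sketched directly; the variable names here are illustrative, and the inputs are assumed to carry a leading column of ones for the bias term x0 = 1.

```python
import numpy as np

def sse(w, X, t):
    """Sum-of-squared-errors objective E(w) = 1/2 * sum_n (y_n - t_n)^2.
    X: (N, D+1) inputs with a leading column of ones; t: (N,) targets."""
    y = X @ w                     # identity activation: y_n = sum_d w_d * x_nd
    return 0.5 * np.sum((y - t) ** 2)
```

When the weights fit the data exactly, E(w) = 0; any mismatch between outputs and targets increases the error quadratically.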

With the identity activation function, the output of the NN for the nth input vector is y(x_n, w) = Σ_{d=0}^{D} w_d x_{nd}, where x_{nd} is the dth component of the nth input vector.

Gradient Descent Rule Given a single training pattern, the weights are updated using the Widrow-Hoff learning rule: w_d ← w_d + η (t_n − y(x_n, w)) x_{nd}, where η is the learning rate.
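The update rule can be sketched as follows; the learning rate, epoch count, and toy data are illustrative choices, not values from the slides.

```python
import numpy as np

def widrow_hoff_step(w, x, t, eta=0.1):
    """One Widrow-Hoff (delta/LMS) update for a single training pattern:
    w <- w + eta * (t - y) * x, with identity activation y = w . x."""
    y = np.dot(w, x)              # identity activation output
    return w + eta * (t - y) * x

# Cycling through the patterns drives the weights toward the
# least-squares solution; here the targets satisfy t = 1 + 2*x exactly.
w = np.zeros(2)
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])   # leading 1 = bias input
t = np.array([1.0, 3.0, 5.0])
for _ in range(200):
    for x_n, t_n in zip(X, t):
        w = widrow_hoff_step(w, x_n, t_n)
```

After training, w is close to (1, 2), recovering the line that generated the targets.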

Homework Plot the activation functions shown in slide 4. Slide 12 shows the perceptron algorithm; what are the expressions that fill the blanks in this flow chart? Write out the perceptron algorithm shown in slide 12.