Artificial Neural Networks


Artificial Neural Networks, Part 2/3 – Perceptron
Berrin Yanikoglu
Slides modified from Neural Network Design by Hagan, Demuth and Beale

Perceptron: a single artificial neuron that computes its weighted input and applies a threshold activation function. It separates the input space into two categories by the hyperplane wTx + b = 0.
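Such a neuron can be sketched in a few lines of NumPy (a minimal illustration, not the slides' code; the example weights w = [1 2] and bias b = -2 are taken from the two-input case that follows):

```python
import numpy as np

def hardlim(n):
    """Threshold (hard-limit) activation: 1 if n >= 0, else 0."""
    return 1 if n >= 0 else 0

def perceptron(w, b, p):
    """Single-neuron perceptron: a = hardlim(w^T p + b)."""
    return hardlim(np.dot(w, p) + b)

w = np.array([1.0, 2.0])
b = -2.0
print(perceptron(w, b, np.array([2.0, 2.0])))   # 1: point on the positive side
print(perceptron(w, b, np.array([-1.0, 0.0])))  # 0: point on the negative side
```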

Two-Input Case: a = hardlim(n) = hardlim([1 2]p - 2). Decision boundary: all points p for which wTp + b = 0. Taking the boundary point p = [2 0]T, we see from [1 2]p + b = 0 that b = -2.
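Solving for the bias from a known boundary point can be checked numerically (a small sketch using the slide's numbers):

```python
import numpy as np

w = np.array([1.0, 2.0])
p_boundary = np.array([2.0, 0.0])  # a point stated to lie on the boundary
b = -np.dot(w, p_boundary)         # solve w^T p + b = 0  ->  b = -w^T p
print(b)                           # -2.0, matching the slide
assert np.isclose(np.dot(w, p_boundary) + b, 0.0)
```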

Decision Boundary (advanced): since wT·p = ||w|| ||p|| cos θ, the projection of p onto w is ||p|| cos θ = wT·p / ||w||. All points on the decision boundary have the same inner product with the weight vector (wT·p = -b). Therefore they have the same projection onto the weight vector, so they must lie on a line orthogonal to the weight vector.
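The invariance claim is easy to verify: any two points on the boundary have the same inner product with w, hence the same projection onto it (a sketch using the w and b from the two-input case; the two boundary points are my own picks):

```python
import numpy as np

w = np.array([1.0, 2.0])
b = -2.0
# two different points satisfying w^T p + b = 0
p1 = np.array([2.0, 0.0])
p2 = np.array([0.0, 1.0])
for p in (p1, p2):
    assert np.isclose(np.dot(w, p), -b)           # same inner product (= -b)
    proj = np.dot(w, p) / np.linalg.norm(w)       # same projection onto w
    print(proj)
```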

Decision Boundary: The weight vector is orthogonal to the decision boundary. The weight vector points in the direction of the inputs that should produce an output of 1, so the vectors with positive output lie on the same side of the boundary as w. If w pointed in the opposite direction, the dot products of all input vectors would have the opposite sign; this would give the same partition of the space, but with opposite labels. The bias determines the position of the boundary: solve wTp + b = 0 using one point on the decision boundary to find b.
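The "opposite direction" remark can be checked directly: negating both w and b flips every label while the boundary itself stays put (the test points, chosen off the boundary, are illustrative):

```python
import numpy as np

def hardlim(n):
    return 1 if n >= 0 else 0

w, b = np.array([1.0, 2.0]), -2.0
points = [np.array([2.0, 2.0]), np.array([-1.0, 0.0]), np.array([3.0, 0.0])]
for p in points:
    a = hardlim(np.dot(w, p) + b)          # original orientation
    a_flipped = hardlim(np.dot(-w, p) - b)  # w and b negated
    print(a, a_flipped)                    # opposite labels for each point
```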

An Illustrative Example

Boolean OR: the input–output pairs (p, t) are ([0 0]T, 0), ([0 1]T, 1), ([1 0]T, 1), ([1 1]T, 1). Given these pairs, can you find (manually) the weights of a perceptron that does the job?

Boolean OR Solution: 1) Pick an admissible decision boundary. 2) The weight vector should be orthogonal to the decision boundary. 3) Pick a point on the decision boundary to find the bias.
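As one concrete solution found by this recipe (hand-picked for illustration, not necessarily the weights on the slide): a boundary orthogonal to w = [2 2] passing through [0.5 0] classifies all four OR patterns correctly:

```python
import numpy as np

def hardlim(n):
    return 1 if n >= 0 else 0

w = np.array([2.0, 2.0])  # orthogonal to a boundary separating [0 0] from the rest
b = -1.0                  # from w^T p + b = 0 at the boundary point [0.5, 0]
for p, t in [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]:
    a = hardlim(np.dot(w, np.array(p, float)) + b)
    assert a == t          # every OR pattern is classified correctly
print("OR solved")
```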

Multiple-Neuron Perceptron: the weights of one neuron form one row of the weight matrix W; that is, the i-th row of W is wiT, where wi is the weight vector of neuron i. For S neurons and R inputs, W is an S×R matrix.

Multiple-Neuron Perceptron: Each neuron will have its own decision boundary. A single neuron can classify input vectors into two categories. An S-neuron perceptron can potentially classify input vectors into 2^S categories. E.g., two neurons with inputs from R2 can divide the input space into four regions.
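For example, two neurons whose boundaries are the two coordinate axes split R2 into four quadrants, each producing a distinct output pattern (an illustrative sketch; the axis-aligned weights are my own choice):

```python
import numpy as np

def hardlim(n):
    return (n >= 0).astype(int)

# Weights of each neuron in one row of W, as on the slide
W = np.array([[1.0, 0.0],   # neuron 1: boundary x1 = 0
              [0.0, 1.0]])  # neuron 2: boundary x2 = 0
b = np.array([0.0, 0.0])

for p in [(-1, -1), (-1, 1), (1, -1), (1, 1)]:
    a = hardlim(W @ np.array(p, float) + b)
    print(p, "->", a)  # four distinct output patterns = four regions
```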

Perceptron Limitations

Perceptron Limitations: A single-layer perceptron can only learn linearly separable problems. The Boolean AND function is linearly separable, whereas the Boolean XOR function is not.
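This limitation can be demonstrated with the perceptron learning rule itself: training converges on AND but keeps cycling on XOR (a minimal sketch; the helper train_perceptron and the epoch cap are my own illustrative choices, not the slides' code):

```python
import numpy as np

def train_perceptron(X, t, epochs=20):
    """Perceptron learning rule: w += e*p, b += e, with e = t - hardlim(w^T p + b)."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        errors = 0
        for p, target in zip(X, t):
            a = 1 if np.dot(w, p) + b >= 0 else 0
            e = target - a
            w, b = w + e * p, b + e
            errors += abs(e)
        if errors == 0:
            return w, b, True   # converged: all patterns classified correctly
    return w, b, False          # still misclassifying after all epochs

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
print(train_perceptron(X, [0, 0, 0, 1])[2])  # AND: converges (separable)
print(train_perceptron(X, [0, 1, 1, 0])[2])  # XOR: never converges
```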

Perceptron Limitations: (figures) a linear decision boundary vs. linearly inseparable problems.

Perceptron Limitations: For a linearly non-separable problem, would it help to use more layers of neurons? What could be the learning rule for each neuron? Yes! Solution: multilayer networks and the backpropagation learning algorithm.

Perceptrons (in this context of limitations, the word refers to single-layer perceptrons) can learn many Boolean functions: AND, OR, NAND, NOR, but not XOR. A multi-layer perceptron can solve the XOR problem. AND: inputs x1, x2 with fixed x0 = 1, weights w0 = -0.8, w1 = 0.5, w2 = 0.5.
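The AND weights shown above (w0 = -0.8, w1 = 0.5, w2 = 0.5, with x0 = 1) can be verified against the full truth table:

```python
def and_neuron(x1, x2):
    # n = w0*x0 + w1*x1 + w2*x2 with the bias input x0 fixed at 1
    n = -0.8 + 0.5 * x1 + 0.5 * x2
    return 1 if n >= 0 else 0

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, and_neuron(x1, x2))  # fires only for (1, 1)
```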

More than one layer of perceptrons (with a hard-limiting activation function) can learn any Boolean function. However, a learning algorithm for multi-layer perceptrons was not developed until much later: the backpropagation algorithm, which replaces the hard limiter in the perceptron with a sigmoid activation function.
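For reference, the sigmoid that replaces the hard limiter is smooth and differentiable, which is exactly what backpropagation needs (a standard definition, not code from the slides):

```python
import numpy as np

def sigmoid(n):
    """Smooth, differentiable replacement for hardlim."""
    return 1.0 / (1.0 + np.exp(-n))

def sigmoid_deriv(n):
    s = sigmoid(n)
    return s * (1.0 - s)  # derivative used by the backpropagation weight updates

print(sigmoid(0.0))        # 0.5: the soft version of the threshold at n = 0
print(sigmoid_deriv(0.0))  # 0.25: maximal slope at the threshold
```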

Outline: So far we have seen how a single neuron with a threshold activation function separates the input space into two. We also talked about how more than one node may indicate convex (open or closed) regions. Next: the backpropagation algorithm to learn the weights automatically.