G53MLE | Machine Learning | Dr Guoping Qiu


Lecture 2: Perceptron

Perceptron – Basics
The perceptron is a type of artificial neural network (ANN).

Perceptron – Operation
It takes a vector of real-valued inputs, calculates a linear combination of these inputs, and then outputs 1 if the result is greater than some threshold and -1 otherwise.
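The operation above can be sketched in a few lines of Python (an illustrative helper, not part of the slides); the threshold is folded into a bias weight w0 paired with a constant input x0 = 1, as the training-algorithm slide also does:

```python
def perceptron_output(weights, x):
    # weights[0] is the bias w0, paired with a constant input x0 = 1;
    # the remaining weights are combined linearly with the inputs x.
    s = weights[0] + sum(w * xi for w, xi in zip(weights[1:], x))
    return 1 if s > 0 else -1

print(perceptron_output([0.2, 1.0, -1.0], [1, 1]))   # weighted sum 0.2 -> 1
print(perceptron_output([0.2, 1.0, -1.0], [-1, 1]))  # weighted sum -1.8 -> -1
```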

Perceptron – Decision Surface
The perceptron can be regarded as representing a hyperplane decision surface in the n-dimensional feature space of instances. It outputs +1 for instances lying on one side of the hyperplane and -1 for instances lying on the other side. This hyperplane is called the decision surface.
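In symbols, the decision surface is the set of points where the weighted sum equals zero; for the two-input case this reads:

```latex
w_0 + w_1 x_1 + w_2 x_2 = 0, \qquad
o(x_1, x_2) =
\begin{cases}
+1 & \text{if } w_0 + w_1 x_1 + w_2 x_2 > 0 \\
-1 & \text{otherwise}
\end{cases}
```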

Perceptron – Decision Surface
In 2-dimensional space the decision surface is a line: a perceptron with weights w0, w1, w2 and inputs x1, x2 outputs o = +1 on one side of the line and o = -1 on the other. [Figure: a line separating the o = +1 and o = -1 regions of the (x1, x2) plane]

Perceptron – Representation Power
The decision surface is linear, so a perceptron can only solve linearly separable problems. [Figure: a line separating a cluster of '+' instances from a cluster of '-' instances]

Perceptron – Representation Power
A perceptron can represent many boolean functions. Assume boolean values of +1 (true) and -1 (false).

AND (w0 = -0.8, w1 = 0.5, w2 = 0.5):

 x1   x2   D
 -1   -1   -1
 -1   +1   -1
 +1   -1   -1
 +1   +1   +1

[Figure: decision surface separating (+1,+1) from (-1,-1), (-1,+1) and (+1,-1)]
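A quick check (illustrative Python, not part of the slides) that the AND weights above reproduce the truth table:

```python
def output(w0, w1, w2, x1, x2):
    # Threshold unit: +1 if the weighted sum is positive, -1 otherwise.
    return 1 if w0 + w1 * x1 + w2 * x2 > 0 else -1

# AND truth table with +1 = true, -1 = false (from the slide).
truth_table = {(-1, -1): -1, (-1, 1): -1, (1, -1): -1, (1, 1): 1}
for (x1, x2), d in truth_table.items():
    assert output(-0.8, 0.5, 0.5, x1, x2) == d
print("AND weights verified")
```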

Perceptron – Representation Power
A perceptron can represent many boolean functions. Assume boolean values of +1 (true) and -1 (false).

OR (w0 = ?, w1 = ?, w2 = ?):

 x1   x2   D
 -1   -1   -1
 -1   +1   +1
 +1   -1   +1
 +1   +1   +1

[Figure: decision surface separating (-1,-1) from (-1,+1), (+1,-1) and (+1,+1)]

Perceptron – Representation Power
Separate the objects from the rest. [Figure: 16 numbered points in the (x1, x2) plane, with elliptical blobs marking the objects to be separated from the remaining points]

Perceptron – Representation Power
Some problems are linearly non-separable. XOR:

 x1   x2   D
 -1   -1   -1
 -1   +1   +1
 +1   -1   +1
 +1   +1   -1

It does not matter where you place the line (decision surface): it is impossible to divide the space so that D = +1 lies on one side and D = -1 on the other. A perceptron cannot solve such a problem! [Figure: the four points (-1,-1), (-1,+1), (+1,-1), (+1,+1) in the (x1, x2) plane, with no line separating the two classes]
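The non-separability claim can also be probed numerically (an illustrative sketch rather than a proof; the geometric argument above is the real one): a brute-force search over a grid of candidate weights finds none that classifies all four XOR patterns.

```python
import itertools

# XOR truth table with +1 = true, -1 = false.
xor_table = {(-1, -1): -1, (-1, 1): 1, (1, -1): 1, (1, 1): -1}
grid = [i / 10 for i in range(-20, 21)]  # candidate weights in [-2, 2], step 0.1

def classifies_all(w0, w1, w2):
    # True only if this weight triple gets every XOR pattern right.
    return all((1 if w0 + w1 * x1 + w2 * x2 > 0 else -1) == d
               for (x1, x2), d in xor_table.items())

solutions = [w for w in itertools.product(grid, repeat=3) if classifies_all(*w)]
print(len(solutions))  # 0 -- no weight triple on the grid separates XOR
```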

Perceptron – Training Algorithm
Separate the objects from the rest (w0 = ?, w1 = ?, w2 = ?). We are given the training sample (experience) pairs (X, D); how can we determine the weights that will produce the correct +1 and -1 outputs for the given training samples?

Perceptron – Training Algorithm
Training sample pairs (X, d), where X is the input vector and d is the input vector's classification (+1 or -1), are iteratively presented to the network for training, one at a time, until the process converges.

Perceptron – Training Algorithm
The procedure is as follows:
1. Set the weights to small random values, e.g., in the range (-1, 1).
2. Present X and calculate the output o = sign(w0·x0 + w1·x1 + … + wn·xn), where x0 = 1 (constant).
3. Update the weights: wi ← wi + η(d - o)·xi.
4. Repeat by going to step 2.
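The procedure can be sketched as follows (a minimal Python illustration, assuming the standard perceptron rule wi ← wi + η(d - o)·xi from Mitchell, Chapter 4; the function and variable names are my own):

```python
import random

def train_perceptron(samples, eta=0.1, max_epochs=100, seed=0):
    """samples: list of ((x1, x2), d) pairs with d in {+1, -1}."""
    rng = random.Random(seed)
    w = [rng.uniform(-1, 1) for _ in range(3)]  # w0 (bias), w1, w2
    for _ in range(max_epochs):
        converged = True
        for (x1, x2), d in samples:
            o = 1 if w[0] + w[1] * x1 + w[2] * x2 > 0 else -1
            if o != d:
                converged = False
                # Perceptron rule, with x0 = 1 as the constant bias input.
                for i, xi in enumerate((1, x1, x2)):
                    w[i] += eta * (d - o) * xi
        if converged:  # a full sweep with no errors: stop
            break
    return w

# Train on the OR samples; OR is linearly separable, so this converges.
or_samples = [((-1, -1), -1), ((-1, 1), 1), ((1, -1), 1), ((1, 1), 1)]
w = train_perceptron(or_samples)
print(all((1 if w[0] + w[1] * x1 + w[2] * x2 > 0 else -1) == d
          for (x1, x2), d in or_samples))  # True
```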

Perceptron – Training Algorithm
Example: initial weights w0 = 0.5, w1 = 0.5, w2 = 0.5, with training samples:

 x1   x2   D
 -1   -1   -1
 -1   +1   +1
 +1   -1   +1
 +1   +1   +1

Perceptron – Training Algorithm
Convergence theorem: the perceptron training rule will converge (finding a weight vector that correctly classifies all training samples) within a finite number of iterations, provided the training examples are linearly separable and a sufficiently small learning rate η is used.

Further Reading
T. M. Mitchell, Machine Learning, McGraw-Hill International Edition, 1997, Chapter 4.

Tutorial/Exercise Questions
1. What are the weight values of a perceptron having the following decision surfaces? [Figure: two decision boundaries (a) and (b) in the (x1, x2) plane, with axis intercepts 1, -1.5 and 2]
2. Design two-input perceptrons implementing the following boolean functions: AND, OR, NAND, NOR.
3. A single-layer perceptron is incapable of learning simple functions such as XOR (exclusive OR). Explain why this is the case (hint: use the decision boundary).

Tutorial/Exercise Questions
A single-layer perceptron is as follows. [Figure: a two-input perceptron with weights w1 and w2]
1. Write down and plot the equation of the decision boundary of this device.
2. Change the values of w1 and w2 so that the perceptron can separate the following two-class patterns:
Class 1 patterns: (1, 2), (1.5, 2.5), (1, 3)
Class 2 patterns: (2, 1.5), (2, 1)