Mehdi Ghayoumi, MSB rm 132. Office hours: Thursday, 11-12 a.m. Machine Learning.

HW 1 is available; it is due Wednesday, 02-18.

A neural network isn't magic. It's not going to guess anything correctly unless we teach it how!

The process is as follows:
1. Provide the perceptron with inputs for which there is a known answer.
2. Ask the perceptron to guess an answer.
3. Compute the error.
4. Adjust all the weights according to the error.
5. Return to step 1 and repeat!
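A minimal sketch of this loop in Python (the function name, data format, and learning-rate default are illustrative assumptions, not from the slides):

import random

def train_perceptron(data, lr=0.1, epochs=100):
    # data: list of (inputs, target) pairs, with targets in {0, 1} (illustrative format)
    n_inputs = len(data[0][0])
    weights = [random.uniform(-1, 1) for _ in range(n_inputs)]
    bias = random.uniform(-1, 1)
    for _ in range(epochs):
        for inputs, target in data:
            # Step 2: ask the perceptron to guess an answer.
            s = sum(w * x for w, x in zip(weights, inputs)) + bias
            guess = 1 if s >= 0 else 0
            # Step 3: compute the error.
            error = target - guess
            # Step 4: adjust all the weights according to the error.
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
            bias += lr * error
    return weights, bias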

Perceptron Learning Rule. Update weights by:

w_ij ← w_ij + η (r_i − o_i) x_j

where η is the "learning rate", r_i is the teacher-specified output for unit i, o_i is the actual output of unit i, and x_j is the j-th input.

This is equivalent to the rules:
– If the output is correct, do nothing.
– If the output is high, lower the weights on active inputs.
– If the output is low, increase the weights on active inputs.

Since the perceptron uses a linear threshold function, it is searching for a linear separator that discriminates the classes. (Figure: candidate separating lines between the two classes, plotted against o2 and o3.)

Perceptron convergence theorem: if (1) the data is linearly separable, and (2) a set of weights exists that is consistent with the data, then the perceptron algorithm will eventually converge to a consistent set of weights.

Perceptron cycling theorem: if the data is not linearly separable, then the perceptron algorithm will eventually repeat a set of weights and threshold at the end of some epoch and therefore enter an infinite loop.

At each step the perceptron applies the update:

w(t+1) = w(t) + ρ y(t) x(t)   if y(t) w(t)^T x(t) ≤ 0
w(t+1) = w(t)                 otherwise

where ρ is the parameter controlling convergence, and x(t) denotes the point considered in the t-th iteration. The class labels y(t) are equal to −1 and +1 for the two classes ω2 and ω1, respectively. Pseudocode for this scheme is given below.
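A sketch of that pseudocode in Python, assuming NumPy and the definitions above (the function name and epoch cap are illustrative):

import numpy as np

def perceptron(X, y, w, rho, max_epochs=1000):
    # X: (N, d) array of augmented data points; y: labels in {-1, +1}
    # w: initial weight vector; rho: the convergence (learning-rate) parameter
    for _ in range(max_epochs):
        mis_clas = 0
        for x_t, y_t in zip(X, y):
            # A point is misclassified when y(t) * w^T x(t) <= 0.
            if y_t * np.dot(w, x_t) <= 0:
                w = w + rho * y_t * x_t   # the update from the slide
                mis_clas += 1
        if mis_clas == 0:   # all points classified correctly: converged
            break
    return w, mis_clas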

Example: generate four 2-dimensional data sets Xi, i = 1,...,4, each containing data vectors from two classes. In all Xi's, the first class (denoted −1) contains 100 vectors uniformly distributed in the square [0, 2]×[0, 2]. The second class (denoted +1) contains another 100 vectors uniformly distributed in the squares [3, 5]×[3, 5], [2, 4]×[2, 4], [0, 2]×[2, 4], and [1, 3]×[1, 3] for X1, X2, X3, and X4, respectively. Each data vector is augmented with a third coordinate that equals 1.

1. Plot the four data sets and notice that as we move from X1 to X3 the classes approach each other but remain linearly separable. In X4 the two classes overlap.

2. Run the perceptron algorithm for each Xi, i = 1,...,4, with learning rate parameters 0.01 and 0.05 and initial estimate for the parameter vector [1, 1, −0.5]^T.
3. Run the perceptron algorithm for X3 with learning rate 0.05, using as initial estimates for w both [1, 1, −0.5]^T and [1, 1, 0.5]^T. Each run reports the final weight vector w, the number of iterations iter, and the number of misclassified points (e.g., mis_clas = 25).
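A compact sketch of this experiment, reusing the perceptron function above (the random seed and printing are illustrative):

import numpy as np

rng = np.random.default_rng(0)   # illustrative seed

def make_dataset(low, high):
    # Class -1: 100 vectors uniform in [0,2]x[0,2]; class +1: 100 vectors
    # uniform in the given square; augment each vector with a third coordinate of 1.
    c1 = rng.uniform([0, 0], [2, 2], size=(100, 2))
    c2 = rng.uniform(low, high, size=(100, 2))
    X = np.hstack([np.vstack([c1, c2]), np.ones((200, 1))])
    y = np.hstack([-np.ones(100), np.ones(100)])
    return X, y

squares = [([3, 3], [5, 5]), ([2, 2], [4, 4]), ([0, 2], [2, 4]), ([1, 1], [3, 3])]
datasets = [make_dataset(lo, hi) for lo, hi in squares]

# Step 2: every dataset, both learning rates, initial estimate [1, 1, -0.5]^T.
w0 = np.array([1.0, 1.0, -0.5])
for i, (X, y) in enumerate(datasets, start=1):
    for rho in (0.01, 0.05):
        w, mis = perceptron(X, y, w0.copy(), rho)
        print(f"X{i}, rho={rho}: w={w}, mis_clas={mis}")

# Step 3: X3 with rho = 0.05 from two different initial estimates.
X3, y3 = datasets[2]
for w_init in ([1.0, 1.0, -0.5], [1.0, 1.0, 0.5]):
    w, mis = perceptron(X3, y3, np.array(w_init), 0.05)
    print(f"X3 from {w_init}: w={w}, mis_clas={mis}")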

Feeding data through the net: (1 × 0.25) + (0.5 × (−1.5)) = 0.25 + (−0.75) = −0.5.

Linear neuron: Price of meal = 850. The price is a weighted sum of the portions: (portions of fish × price per portion of fish) + (portions of chips × price per portion of chips) + (portions of beer × price per portion of beer).
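A one-function sketch of the linear neuron; the portion counts and per-portion prices below are made-up numbers chosen so the total comes to 850:

def linear_neuron(inputs, weights):
    # A linear neuron outputs the weighted sum of its inputs.
    return sum(x * w for x, w in zip(inputs, weights))

# Hypothetical meal: 2 portions of fish at 150, 5 of chips at 50, 3 of beer at 100.
portions = [2, 5, 3]
per_portion_price = [150, 50, 100]
print(linear_neuron(portions, per_portion_price))   # 850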

NEW WEIGHT = WEIGHT + ΔWEIGHT
ΔWEIGHT = ERROR * INPUT, so:
NEW WEIGHT = WEIGHT + ERROR * INPUT
Scaling the step size by a learning constant:
NEW WEIGHT = WEIGHT + ERROR * INPUT * LEARNING CONSTANT
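The final form of the rule as one line of Python (names mirror the slide; the default learning constant is an illustrative choice):

def update_weight(weight, error, inp, learning_constant=0.01):
    # NEW WEIGHT = WEIGHT + ERROR * INPUT * LEARNING CONSTANT
    return weight + error * inp * learning_constant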

ERROR = DESIRED OUTPUT − GUESS OUTPUT
(Table: the error for each combination of Desired and Guess outputs.)

A perceptron computing logical AND: inputs x and y each have weight +1, and a bias input fixed at 1 has weight −2, so sum = x + y − 2. Output: if sum < 0: 0, else: 1.

Truth table for logical AND:
x  y  |  x & y
0  0  |  0
0  1  |  0
1  0  |  0
1  1  |  1

Learning Boolean AND
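A quick check that the training loop sketched earlier can learn AND, reusing the illustrative train_perceptron from above:

# The four labeled examples for logical AND.
and_data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
weights, bias = train_perceptron(and_data)
for inputs, target in and_data:
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    guess = 1 if s >= 0 else 0
    print(inputs, "->", guess, "(target:", target, ")")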

A perceptron computing logical OR: inputs x and y each have weight +1, and a bias input fixed at 1 has weight −1, so sum = x + y − 1. Output: if sum < 0: 0, else: 1.

Truth table for logical OR:
x  y  |  x | y
0  0  |  0
0  1  |  1
1  0  |  1
1  1  |  1
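And a minimal sketch of the two fixed-weight units from these slides:

def threshold_unit(x, y, bias_weight):
    # Weights on x and y are +1; a bias input fixed at 1 carries bias_weight.
    s = x * 1 + y * 1 + 1 * bias_weight
    return 0 if s < 0 else 1   # if sum < 0: 0, else: 1

for x in (0, 1):
    for y in (0, 1):
        print(x, y, "AND:", threshold_unit(x, y, -2), "OR:", threshold_unit(x, y, -1))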

Thank you!