CS621: Artificial Intelligence
Lecture 17: Feedforward Networks
(Lecture 16 was on Adaptive Hypermedia: Debraj, Kekin and Raunak)
Pushpak Bhattacharyya
Computer Science and Engineering Department, IIT Bombay
Machine Learning Basics
Learning from examples: e1, e2, e3, …, en are positive examples and f1, f2, f3, …, fn are negative examples of a concept C.
[Figure: the positive examples ei lie inside the boundary of the concept C; the negative examples fi lie outside it.]
Classification of Learning Paradigms
[Figure: taxonomy of learning paradigms. The top-level paradigms are Statistical, Knowledge Based, Learning from Analogy, and Learning from Examples; Learning from Examples further divides into Neural Networks and Decision Trees.]
Example: Loan Reliability Detection (Feature Vector)
Features for deciding if a person is reliable for granting a loan (a sketch of the encoding follows the list):
Age – numerical
Gender – categorical
Education – categorical
Salary – numerical
Family background – categorical
Loan history – categorical
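To make the feature-vector idea concrete, here is a minimal sketch of encoding these mixed numerical/categorical features as a single numeric vector. The field names, value sets, and the one-hot encoding are illustrative assumptions, not part of the lecture.

```python
# Illustrative loan applicant; field names and value sets are assumptions.
applicant = {
    "age": 35,                      # numerical, used as-is
    "gender": "F",                  # categorical
    "education": "graduate",        # categorical
    "salary": 52000,                # numerical, used as-is
    "family_background": "stable",  # categorical
    "loan_history": "good",         # categorical
}

# Assumed value sets for one-hot encoding the categorical features.
CATEGORIES = {
    "gender": ["M", "F"],
    "education": ["school", "graduate", "postgraduate"],
    "family_background": ["stable", "unstable"],
    "loan_history": ["good", "bad", "none"],
}

def to_vector(a):
    """Concatenate numerical features with one-hot codes of categorical ones."""
    vec = [float(a["age"]), float(a["salary"])]
    for field, values in CATEGORIES.items():
        vec.extend(1.0 if a[field] == v else 0.0 for v in values)
    return vec

print(to_vector(applicant))   # one numeric feature vector per applicant
```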
Kolmogorov Theorem, 1965 (informal statement): any Yes/No function can be computed by a 3-layer network of simple Yes/No computing elements.
3-Layer NN for XOR
[Figure: 3-layer network computing XOR = XY' + X'Y. The output unit (θ = 0.5) receives weights 1 and 1 from two hidden units computing XY' and X'Y; each hidden unit has θ = 0.5, with input weights 1 and -1 for XY' and -1 and 1 for X'Y.]
A popular universal dataset for testing learning algorithms: the IRIS data
Sepal Length  Sepal Width  Petal Length  Petal Width  Class
5.1           3.5          1.4           0.2          setosa
4.9           3.0          1.4           0.2          setosa
6.3           2.9          5.6           1.8          virginica
6.9           3.1          4.9           1.5          versicolor
5.5           2.3          4.0           1.3          versicolor
5.7           2.8          4.1           1.3          versicolor
6.3           3.3          6.0           2.5          virginica
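The table above shows only a few rows; the full dataset has 150 examples, 50 per class. A minimal sketch of loading and inspecting it, assuming scikit-learn is available (load_iris is its standard helper):

```python
from sklearn.datasets import load_iris

iris = load_iris()
print(iris.feature_names)   # sepal/petal length and width, in cm
print(iris.target_names)    # ['setosa' 'versicolor' 'virginica']
print(iris.data.shape)      # (150, 4)
print(iris.data[0], iris.target_names[iris.target[0]])  # first example
```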
Machine Learning Basics (contd.)
Training: arrive at a hypothesis h based on the data seen. Testing: present new data to h and test its performance (a sketch of this protocol follows).
[Figure: the hypothesis h approximating the target concept c.]
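A minimal sketch of the train/test protocol on the IRIS data, assuming scikit-learn; the 1-nearest-neighbour classifier is an illustrative choice for the hypothesis h, not one prescribed by the lecture.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
# Training data is seen by the learner; test data is held out.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

h = KNeighborsClassifier(n_neighbors=1).fit(X_train, y_train)  # training
print("test accuracy:", h.score(X_test, y_test))               # testing
```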
Feedforward Network
Limitations of perceptron
Non-linear separability is all-pervading, and a single perceptron does not have enough computing power. E.g., XOR cannot be computed by a perceptron, as the sketch below demonstrates.
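The sketch below runs perceptron-style updates on the four XOR examples; since no hyperplane separates them, the error never reaches zero (the 1000-epoch cap is an assumption added only to stop the loop).

```python
import numpy as np

# The four XOR examples: not linearly separable.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])

w, b = np.zeros(2), 0.0
for epoch in range(1000):
    errors = 0
    for xi, ti in zip(X, y):
        out = 1 if w @ xi + b > 0 else 0
        if out != ti:              # misclassified: perceptron update
            w += (ti - out) * xi
            b += (ti - out)
            errors += 1
    if errors == 0:                # would mean a separating hyperplane exists
        print("converged at epoch", epoch)
        break
else:
    print("no convergence in 1000 epochs; final w =", w, ", b =", b)
```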
Solutions:
1. Tolerate error (e.g., the pocket algorithm used by connectionist expert systems): try to get the best possible hyperplane using only perceptrons.
2. Use higher-order surfaces, e.g., degree-2 surfaces such as a parabola.
3. Use a layered network.
Pocket Algorithm
The algorithm evolved in 1985 and essentially uses the PTA (perceptron training algorithm).
Basic idea: always preserve the best weights obtained so far in the "pocket"; change the pocketed weights only when better ones are found, i.e., when the changed weights result in reduced error. A sketch follows.
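Below is a minimal sketch of the pocket idea layered on ordinary PTA updates; it is an illustrative reconstruction, not necessarily the exact 1985 formulation.

```python
import numpy as np

def pocket(X, y, epochs=200, seed=0):
    """PTA with a 'pocket': keep the weights with the fewest errors seen so far."""
    rng = np.random.default_rng(seed)
    Xb = np.hstack([X, np.ones((len(X), 1))])       # absorb bias as a weight
    w = np.zeros(Xb.shape[1])

    def n_errors(w):
        return int(np.sum((Xb @ w > 0).astype(int) != y))

    pocket_w, pocket_err = w.copy(), n_errors(w)
    for _ in range(epochs):
        i = rng.integers(len(Xb))                    # random training example
        out = 1 if Xb[i] @ w > 0 else 0
        if out != y[i]:
            w = w + (y[i] - out) * Xb[i]             # ordinary PTA update
            err = n_errors(w)
            if err < pocket_err:                     # better? update the pocket
                pocket_w, pocket_err = w.copy(), err
    return pocket_w, pocket_err

# On XOR the best any single hyperplane can do is 1 error out of 4.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])
w, e = pocket(X, y)
print("pocket weights:", w, "errors:", e)
```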
XOR using 2 layers
A non-linearly-separable function expressed as a linearly separable function of individual linearly separable functions: XOR(x1, x2) = x1·x̄2 + x̄1·x2, where x1·x̄2, x̄1·x2, and the OR that combines them are each linearly separable (checked below).
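A quick truth-table check of this decomposition (the variable names h1, h2 are illustrative):

```python
# Verify XOR(x1, x2) = x1·x̄2 + x̄1·x2 on all four inputs.
for x1 in (0, 1):
    for x2 in (0, 1):
        h1 = x1 and not x2        # x1·x̄2: linearly separable
        h2 = (not x1) and x2      # x̄1·x2: linearly separable
        print(x1, x2, "->", int(bool(h1 or h2)))  # equals x1 XOR x2
```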
Example: XOR
[Figure: calculation of XOR. Output unit with threshold θ = 0.5 and weights w1 = 1, w2 = 1, taking as inputs the outputs of the units computing x1·x̄2 and x̄1·x2.]
[Figure: calculation of x̄1·x2. Threshold unit with θ = 1 and weights w1 = -1, w2 = 1.5 on inputs x1, x2.]
Example: XOR (complete network)
[Figure: the full two-layer threshold network. Output unit: θ = 0.5 with weights 1, 1 from the hidden units. Hidden units: x1·x̄2 with θ = 1 and input weights (1.5, -1); x̄1·x2 with θ = 1 and input weights (-1, 1.5).]
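A sketch that evaluates the network above in NumPy, using the thresholds and weights as read off the reconstructed diagram (the "fire when net input reaches θ" convention is an assumption):

```python
import numpy as np

def step(net, theta):
    """Threshold unit: fire (1) when the net input reaches theta."""
    return (net >= theta).astype(int)

# Hidden unit 1 computes x1·x̄2: weights (1.5, -1), θ = 1.
# Hidden unit 2 computes x̄1·x2: weights (-1, 1.5), θ = 1.
W_hidden = np.array([[1.5, -1.0],
                     [-1.0, 1.5]])

for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    h = step(W_hidden @ np.array(x, dtype=float), 1.0)  # hidden layer, θ = 1
    out = step(np.array([1.0, 1.0]) @ h, 0.5)           # output unit, θ = 0.5
    print(x, "->", int(out))                            # prints XOR(x1, x2)
```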
Some Terminology
A multilayer feedforward neural network has:
an input layer,
a hidden layer (which assists computation), and
an output layer.
Output units and hidden units are called computation units. A sketch of a forward pass through such a network follows.
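A minimal sketch of one forward pass through such a network; the layer sizes, sigmoid activation, and random weights are illustrative assumptions, not trained values.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))   # input layer (4 features) -> hidden layer (3 units)
W2 = rng.normal(size=(3, 1))   # hidden layer -> output layer (1 unit)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([5.1, 3.5, 1.4, 0.2])   # e.g. one IRIS feature vector
hidden = sigmoid(x @ W1)             # hidden units: computation units
output = sigmoid(hidden @ W2)        # output unit: also a computation unit
print(output)
```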