
Slide 1: INFO331 Machine learning. Neural networks. Supervised learning in neural networks. MLP and BP
Textbook: N. Kasabov, Foundations of Neural Networks, Fuzzy Systems, and Knowledge Engineering, MIT Press, 1996 (section 2.11, pp. 146-155; section 3.7.3, pp. 218-221; section 4.2, pp. 267-282; catch-up reading: pp. 251-266)

Slide 2: Machine learning
- Issues in machine learning
- Learning from static versus learning from dynamic data
- Incremental learning
- On-line learning, adaptive learning
- Life-long learning
- Cognitive learning processes in humans

Slide 3: Inductive learning
- Learning from examples
- Inductive decision trees and the ID3 algorithm
- Information gain evaluation (see the sketch below)
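
The information-gain criterion used by ID3 can be made concrete with a few lines of code. The sketch below is not from the textbook: the toy weather-style records and attribute names are invented for illustration; ID3 would split on the attribute with the largest gain.

```python
# A minimal sketch of the information-gain calculation used by ID3.
# The toy records and attribute names are illustrative, not from the textbook.
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    total = len(labels)
    return -sum((n / total) * log2(n / total) for n in Counter(labels).values())

def information_gain(examples, attribute, target="class"):
    """Entropy reduction obtained by splitting `examples` on `attribute`."""
    base = entropy([e[target] for e in examples])
    remainder = 0.0
    for value in {e[attribute] for e in examples}:
        subset = [e[target] for e in examples if e[attribute] == value]
        remainder += len(subset) / len(examples) * entropy(subset)
    return base - remainder

# Toy data: ID3 would pick the attribute with the highest gain ("windy" here).
data = [
    {"outlook": "sunny", "windy": "no",  "class": "play"},
    {"outlook": "sunny", "windy": "yes", "class": "stay"},
    {"outlook": "rain",  "windy": "yes", "class": "stay"},
    {"outlook": "rain",  "windy": "no",  "class": "play"},
]
print(information_gain(data, "outlook"), information_gain(data, "windy"))
```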

Slide 4: Other methods of machine learning
- Learning by doing
- Learning from advice
- Learning by analogy
- Case-based learning and reasoning
- Template-based learning (Kasabov and Clarke) - Iris example

Slide 5: Learning fuzzy rules from data
- Cluster-based methods
- Fuzzy template-based method (Kasabov, 96), pp. 218-219
- Wang's method (pp. 220-221); a simplified sketch follows below
- Advantages and disadvantages
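
As a rough illustration of extracting fuzzy rules from data pairs, here is a simplified sketch loosely in the spirit of Wang's method as summarised on the slide: partition each variable into triangular fuzzy regions, generate one candidate rule per data pair from the highest-membership regions, and keep only the strongest rule when antecedents clash. The partitioning, the region labels and the toy data are assumptions, not the textbook's worked example.

```python
import numpy as np

def triangular_regions(lo, hi, labels):
    """Evenly spaced triangular membership functions over [lo, hi]."""
    centres = np.linspace(lo, hi, len(labels))
    half = centres[1] - centres[0]
    def membership(x):
        return {lab: max(0.0, 1.0 - abs(x - c) / half) for lab, c in zip(labels, centres)}
    return membership

labels = ["Small", "Medium", "Large"]      # assumed region labels
mu_x = triangular_regions(0.0, 1.0, labels)   # input partition (assumed)
mu_y = triangular_regions(0.0, 1.0, labels)   # output partition (assumed)

def learn_rules(pairs):
    """One candidate rule per (x, y) pair; conflicting antecedents keep the highest-degree rule."""
    rules = {}   # antecedent label -> (consequent label, rule degree)
    for x, y in pairs:
        ax, dx = max(mu_x(x).items(), key=lambda kv: kv[1])
        cy, dy = max(mu_y(y).items(), key=lambda kv: kv[1])
        degree = dx * dy
        if ax not in rules or degree > rules[ax][1]:
            rules[ax] = (cy, degree)
    return rules

data = [(0.1, 0.15), (0.5, 0.45), (0.9, 0.8), (0.55, 0.5)]   # toy (x, y) pairs
for ant, (cons, deg) in learn_rules(data).items():
    print(f"IF x is {ant} THEN y is {cons}   (degree {deg:.2f})")
```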

Slide 6: Supervised learning in neural networks
- Supervised learning in neural networks
- Perceptrons
- Multilayer perceptrons (MLP) and the backpropagation algorithm
- MLP as universal approximators
- Problems and features of the MLP

Slide 7: Supervised learning in neural networks
- The learning principle is to provide the input values and the desired output values for each of the training examples.
- The neural network changes its connection weights during training.
- Calculate the error (see the sketch below):
  - training error: how well a NN has learned the data
  - test error: how well a trained NN generalises over new input data
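
A minimal sketch of the two error measures, assuming a mean-squared-error criterion; `predict` stands in for any trained network's forward pass and is not a specific library call.

```python
import numpy as np

def mse(targets, outputs):
    """Mean squared error between desired and actual outputs."""
    targets, outputs = np.asarray(targets), np.asarray(outputs)
    return float(np.mean((targets - outputs) ** 2))

def evaluate(predict, x_train, y_train, x_test, y_test):
    """`predict` is a placeholder for a trained network's forward pass."""
    train_error = mse(y_train, predict(x_train))   # how well the NN has learned the data
    test_error = mse(y_test, predict(x_test))      # how well it generalises to new data
    return train_error, test_error
```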

Slide 8: Perceptrons
- fig. 4.8

Slide 9: Perceptrons
- fig. 4.9

Slide 10: Perceptrons
- fig. 4.10 (a code sketch of the perceptron learning rule follows below)
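
Figures 4.8-4.10 are in the textbook. As a stand-in, the sketch below implements the classic perceptron learning rule (hard-threshold unit, weight update w += lr * (d - y) * x) on the linearly separable AND function; the learning rate and epoch count are arbitrary choices.

```python
import numpy as np

def train_perceptron(inputs, targets, lr=0.1, epochs=20):
    """Single threshold unit trained with the perceptron learning rule."""
    x = np.c_[inputs, np.ones(len(inputs))]          # append a constant bias input
    w = np.zeros(x.shape[1])
    for _ in range(epochs):
        for xi, d in zip(x, targets):
            y = 1 if xi @ w > 0 else 0               # hard-threshold activation
            w += lr * (d - y) * xi                   # update only when the output is wrong
    return w

# Linearly separable AND function: a single perceptron can learn it exactly.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
d = np.array([0, 0, 0, 1])
w = train_perceptron(X, d)
print([1 if np.r_[xi, 1] @ w > 0 else 0 for xi in X])   # expected: [0, 0, 0, 1]
```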

Slide 11: MLP and the backpropagation algorithm
- fig. 4.11

Slide 12: MLP and the backpropagation algorithm
- fig. 4.12

Slide 13: MLP and the backpropagation algorithm
- fig. 4.13 (a code sketch of backpropagation follows below)
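
Figures 4.11-4.13 are in the textbook. As a stand-in, here is a minimal numpy sketch of backpropagation for a one-hidden-layer MLP with sigmoid units trained on XOR; the layer sizes, learning rate, epoch count and random seed are arbitrary choices, and the outputs should end up close to the XOR targets.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# XOR is not linearly separable, so a hidden layer is required.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)   # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)   # hidden -> output weights
lr = 0.5

for _ in range(10000):
    # forward pass
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    # backward pass: propagate the output error towards the input layer
    d_out = (Y - T) * Y * (1 - Y)            # squared-error gradient times sigmoid derivative
    d_hid = (d_out @ W2.T) * H * (1 - H)
    # gradient-descent weight updates
    W2 -= lr * H.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_hid;  b1 -= lr * d_hid.sum(axis=0)

Y = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
print(np.round(Y, 2).ravel())                # should be close to 0, 1, 1, 0
```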

Slide 14: MLPs as statistical tools
- An MLP with one hidden layer can approximate any continuous function to any desired accuracy (Hornik et al., 1989); see the sketch below
- MLPs are multivariate non-linear regression models
- MLPs can learn conditional probabilities
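
The approximation result is an existence theorem; as a practical illustration (not part of the slides), a one-hidden-layer MLP used as a non-linear regressor can be fitted to a smooth target such as sin. The sketch assumes scikit-learn is available; the hidden-layer size, activation and iteration budget are arbitrary.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(200, 1))          # sample points in [0, 1]
y = np.sin(2 * np.pi * X).ravel()             # smooth target function

# One hidden layer of 20 tanh units, fitted as a non-linear regressor.
net = MLPRegressor(hidden_layer_sizes=(20,), activation="tanh",
                   max_iter=5000, random_state=0)
net.fit(X, y)

grid = np.linspace(0, 1, 5).reshape(-1, 1)
print(np.round(net.predict(grid), 2))                    # network outputs on a small grid
print(np.round(np.sin(2 * np.pi * grid).ravel(), 2))     # target values for comparison
```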

Slide 15: Problems and features of the MLP
- How to choose the number of hidden nodes
- Catastrophic forgetting
- Introducing hints in neural networks
- Overfitting (overlearning)

Slide 16: Problems and features of the MLP
- Catastrophic forgetting
- fig. 4.14 (a small demonstration follows below)
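
A small demonstration of catastrophic forgetting (the toy two-task data and the use of scikit-learn's MLPClassifier are assumptions, not the textbook's example): train on task A, then continue training on task B only; because the shared weights are overwritten, accuracy on task A typically collapses.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(3)

def two_class_task(c0, c1, n=100):
    """Toy 2-D task: class 0 clustered around c0, class 1 around c1."""
    X = np.vstack([rng.normal(c0, 0.3, size=(n, 2)), rng.normal(c1, 0.3, size=(n, 2))])
    y = np.r_[np.zeros(n), np.ones(n)]
    return X, y

XA, yA = two_class_task([0, 0], [2, 2])   # task A
XB, yB = two_class_task([5, 5], [7, 7])   # task B, in a different input region

# warm_start=True lets a second call to fit() continue from the current weights.
net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                    warm_start=True, random_state=0)
net.fit(XA, yA)
print("accuracy on task A after training on A:", net.score(XA, yA))

net.fit(XB, yB)   # further training on task B only, with no rehearsal of task A
print("accuracy on task A after training on B:", net.score(XA, yA))
```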

Slide 17: Problems and features of the MLP
- Introducing hints
- fig. 4.15

Slide 18: Problems and features of the MLP
- Overfitting
- fig. 4.16 (a small demonstration follows below)
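
A small demonstration of how overfitting shows up in the error measures (the noisy 1-D data set, scikit-learn's MLPRegressor and the chosen hidden-layer sizes are assumptions): as capacity grows, the training error typically keeps shrinking while the test error stops improving or rises.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=(60, 1))
y = np.sin(2 * np.pi * X).ravel() + rng.normal(scale=0.3, size=60)   # noisy target

X_train, y_train, X_test, y_test = X[:40], y[:40], X[40:], y[40:]

# Growing the hidden layer increases capacity; compare both error measures.
for hidden in (2, 10, 100, 400):
    net = MLPRegressor(hidden_layer_sizes=(hidden,), max_iter=20000,
                       random_state=0).fit(X_train, y_train)
    train_err = mean_squared_error(y_train, net.predict(X_train))
    test_err = mean_squared_error(y_test, net.predict(X_test))
    print(f"hidden={hidden:4d}  train MSE={train_err:.3f}  test MSE={test_err:.3f}")
```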

