© N. Kasabov, Foundations of Neural Networks, Fuzzy Systems, and Knowledge Engineering, MIT Press, 1996

INFO331: Machine learning. Neural networks. Supervised learning in neural networks. MLP and BP
(Textbook: section 2.11, pp ; section , pp ; section 4.2, pp ; catch-up reading: pp )
Machine learning
- Issues in machine learning
- Learning from static versus learning from dynamic data
- Incremental learning
- On-line learning, adaptive learning
- Life-long learning
- Cognitive learning processes in humans
Inductive learning
- Learning from examples
- Inductive decision trees and the ID3 algorithm
- Information gain evaluation
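The information gain evaluation used by ID3 can be sketched briefly: the gain of an attribute is the entropy of the class labels minus the weighted entropy of the subsets produced by splitting on that attribute. A minimal illustration (the toy weather data below is hypothetical, not from the textbook):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(examples, attribute, label):
    """Entropy reduction from splitting `examples` (list of dicts) on `attribute`."""
    base = entropy([e[label] for e in examples])
    n = len(examples)
    gain = base
    for value in {e[attribute] for e in examples}:
        subset = [e[label] for e in examples if e[attribute] == value]
        gain -= (len(subset) / n) * entropy(subset)
    return gain

# Hypothetical toy data: "outlook" perfectly predicts "play",
# so splitting on it removes all uncertainty.
data = [
    {"outlook": "sunny",    "play": "no"},
    {"outlook": "sunny",    "play": "no"},
    {"outlook": "overcast", "play": "yes"},
    {"outlook": "rain",     "play": "yes"},
]
print(round(information_gain(data, "outlook", "play"), 3))  # → 1.0
```

ID3 grows the tree greedily, choosing at each node the attribute with the highest gain over the examples that reach that node.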
Other methods of machine learning
- Learning by doing
- Learning from advice
- Learning by analogy
- Case-based learning and reasoning
- Template-based learning (Kasabov and Clarke): the Iris example
Learning fuzzy rules from data
- Cluster-based methods
- Fuzzy template-based method (Kasabov, 1996), pp
- Wang's method (pp )
- Advantages and disadvantages
Supervised learning in neural networks
- Supervised learning in neural networks
- Perceptrons
- Multilayer perceptrons (MLP) and the backpropagation algorithm
- MLPs as universal approximators
- Problems and features of the MLP
Supervised learning in neural networks
- The learning principle is to provide the input values and the desired output values for each of the training examples.
- The neural network changes its connection weights during training.
- Two errors are calculated: the training error (how well the NN has learned the training data) and the test error (how well the trained NN generalises over new input data).
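The two error measures above can be made concrete with mean squared error over (input, desired output) pairs; the model and data below are hypothetical stand-ins, not from the textbook:

```python
def mse(model, examples):
    """Mean squared error of `model` over (input, desired_output) pairs."""
    return sum((model(x) - d) ** 2 for x, d in examples) / len(examples)

# A trained mapping (stand-in for a neural network) and two data sets:
model = lambda x: 0.9 * x
train = [(1.0, 1.0), (2.0, 2.0)]   # examples seen during training
test  = [(3.0, 3.1), (4.0, 3.8)]   # new examples, measuring generalisation

train_error = mse(model, train)    # how well the data was learned
test_error  = mse(model, test)     # how well the model generalises
```

A large gap between a low training error and a high test error is the usual symptom of overfitting, discussed later in this section.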
Perceptrons
- fig. 4.8
- fig. 4.9
- fig. 4.10
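The figures above illustrate the perceptron; its learning rule itself is short enough to sketch. This is the classical Rosenblatt update for a single threshold unit (my illustration, not code from the textbook), shown learning the linearly separable AND function:

```python
def train_perceptron(examples, epochs=10, lr=0.1):
    """Rosenblatt perceptron rule: w <- w + lr * (d - y) * x, for a threshold unit."""
    n_inputs = len(examples[0][0])
    w = [0.0] * n_inputs
    b = 0.0
    for _ in range(epochs):
        for x, d in examples:
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = d - y                                   # 0 when correct
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# The AND function is linearly separable, so the rule converges:
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
predict = lambda x: 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
```

For linearly separable data the perceptron convergence theorem guarantees this procedure terminates with a separating weight vector; for non-separable problems such as XOR it does not, which motivates the multilayer networks that follow.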
MLP and the backpropagation algorithm
- fig. 4.11
- fig. 4.12
- fig. 4.13
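The backpropagation algorithm shown in the figures can be sketched for a one-hidden-layer MLP of sigmoid units: compute the output, form the output delta from the error and the sigmoid derivative o(1 - o), propagate deltas back through the output weights, and adjust all weights by gradient descent. A minimal sketch (my illustration, not the textbook's code), trained on XOR:

```python
import math, random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x, W1, b1, W2, b2):
    """One hidden layer of sigmoid units, one sigmoid output."""
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b) for row, b in zip(W1, b1)]
    o = sigmoid(sum(w * hi for w, hi in zip(W2, h)) + b2)
    return h, o

def train(examples, hidden=3, epochs=2000, lr=0.5, seed=0):
    rnd = random.Random(seed)
    W1 = [[rnd.uniform(-1, 1) for _ in examples[0][0]] for _ in range(hidden)]
    b1 = [rnd.uniform(-1, 1) for _ in range(hidden)]
    W2 = [rnd.uniform(-1, 1) for _ in range(hidden)]
    b2 = rnd.uniform(-1, 1)
    for _ in range(epochs):
        for x, d in examples:
            h, o = forward(x, W1, b1, W2, b2)
            # output delta: error times sigmoid derivative o(1 - o)
            delta_o = (d - o) * o * (1 - o)
            # hidden deltas: output delta propagated back through W2
            delta_h = [delta_o * W2[j] * h[j] * (1 - h[j]) for j in range(hidden)]
            for j in range(hidden):
                W2[j] += lr * delta_o * h[j]
                for i in range(len(x)):
                    W1[j][i] += lr * delta_h[j] * x[i]
                b1[j] += lr * delta_h[j]
            b2 += lr * delta_o
    return W1, b1, W2, b2

# XOR is not linearly separable, so a single perceptron fails,
# but a one-hidden-layer MLP can learn it:
xor = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
W1, b1, W2, b2 = train(xor)
```

The example-by-example weight updates shown here are the stochastic (on-line) variant of backpropagation; batch variants accumulate the deltas over all examples before updating.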
MLPs as statistical tools
- An MLP with one hidden layer can approximate any continuous function to any desired accuracy (Hornik et al., 1989)
- MLPs are multivariate non-linear regression models
- MLPs can learn conditional probabilities
Problems and features of the MLP
- How to choose the number of hidden nodes
- Catastrophic forgetting
- Introducing hints in neural networks
- Overfitting (overlearning)
Problems and features of the MLP
- Catastrophic forgetting: fig. 4.14
- Introducing hints: fig. 4.15
- Overfitting: fig. 4.16
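A common practical response to overfitting (a standard technique, not specific to the textbook) is early stopping: monitor the error on a held-out validation set alongside the training error, and stop training when the validation error stops improving, even though the training error is still falling. A minimal sketch over a precomputed sequence of per-epoch validation errors (the numbers are hypothetical):

```python
def early_stop(val_errors, patience=3):
    """Return the epoch with the best validation error, scanning until the
    error has failed to improve for `patience` consecutive epochs."""
    best, best_epoch, waited = float("inf"), 0, 0
    for epoch, err in enumerate(val_errors):
        if err < best:
            best, best_epoch, waited = err, epoch, 0
        else:
            waited += 1
            if waited >= patience:
                break   # overfitting detected: stop training
    return best_epoch

# Validation error falls, then turns upward after epoch 3 while the
# training error would keep decreasing -- the classic overfitting pattern:
val = [0.9, 0.6, 0.4, 0.35, 0.38, 0.41, 0.45, 0.5]
print(early_stop(val))  # → 3
```

The weights saved at the returned epoch are then used as the final network, trading a slightly higher training error for better generalisation.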