Slide 1
AI – CS289 Machine Learning – Labs
Machine Learning – Lab 4
2nd November 2006
Dr Bogdan L. Vrusias
b.vrusias@surrey.ac.uk
Slide 2
Instructions

The following slides demonstrate the capabilities of machine learning. The examples are taken from:
–Negnevitsky, M., "Artificial Intelligence: A Guide to Intelligent Systems", 2nd edn., Addison Wesley, Harlow, England, 2005.

Download and unzip the following file, which contains all the examples:
http://www.booksites.net/download/negnevitsky2/student_files/matlab/0321204662_matlab.zip
or alternatively:
http://www.cs.surrey.ac.uk/teaching/cs289/lecturenotes/0321204662_matlab.zip

To run the examples, use the Current Directory window (on the left-hand side of Matlab) to navigate to the directory where you downloaded and unzipped the files, then simply type the name of the file (case sensitive) without the ".m" extension.
–E.g. to run Perceptron_AND.m, type Perceptron_AND in Matlab's command window.
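For example, assuming the archive was unzipped into a folder such as C:\cs289\matlab (a placeholder path – use your own location), the commands typed in Matlab's command window would look like this:

    % Navigate to the folder that contains the unzipped example files
    % (the path below is only a placeholder).
    cd('C:\cs289\matlab')

    % Run an example by typing its file name, case sensitive, without ".m"
    Perceptron_AND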
Slide 3
The perceptron: learning linearly separable functions

Filename: Perceptron_AND.m
Matlab command: Perceptron_AND
Problem: a two-input perceptron is required to perform the logical operation AND.

Run the file, follow the instructions, READ THE COMMENTS at each step, and observe the following:
–The graph should show you how well the perceptron separates the two categories (category 0 and category 1).
–The input pairs ([0,0], [0,1], [1,0], [1,1]) are classified correctly after training.
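Perceptron_AND.m itself is the definitive reference; purely as a rough illustration of what the demo does, the lines below are a minimal sketch using the classic Neural Network Toolbox functions newp, train and sim (the input ranges and default parameters are assumptions, not values taken from the lab file):

    % Minimal sketch: a two-input hard-limit perceptron learning AND
    P = [0 0 1 1; 0 1 0 1];      % the four input pairs, one per column
    T = [0 0 0 1];               % AND targets
    net = newp([0 1; 0 1], 1);   % one perceptron neuron, inputs in [0,1]
    net = train(net, P, T);      % perceptron learning rule
    Y = sim(net, P)              % should reproduce T after training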
Slide 4
The perceptron: learning linearly separable functions

Filename: Perceptron_OR.m
Matlab command: Perceptron_OR
Problem: a two-input perceptron is required to perform the logical operation OR.

Run the file, follow the instructions, READ THE COMMENTS at each step, and observe the following:
–The graph should show you how well the perceptron separates the two categories (category 0 and category 1).
–Observe how the categorisation line changes at every step of the training.
–The input pairs ([0,0], [0,1], [1,0], [1,1]) are classified correctly after training.
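Compared with the AND case, only the target vector changes. As a sketch of one way to watch the separating line move during training (again assuming the classic toolbox functions, here including plotpv and plotpc; this is not the code inside Perceptron_OR.m):

    % Sketch: train on OR one epoch at a time and redraw the decision line
    P = [0 0 1 1; 0 1 0 1];
    T = [0 1 1 1];                      % OR targets
    net = newp([0 1; 0 1], 1);
    net.trainParam.epochs = 1;          % one pass per call, so each step can be plotted
    for step = 1:10
        net = train(net, P, T);
        plotpv(P, T);                   % the four input points with their classes
        plotpc(net.IW{1,1}, net.b{1});  % the current separating line
        drawnow;
    end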
Slide 5
The perceptron: an attempt to learn linearly non-separable functions

Filename: Perceptron_XOR.m
Matlab command: Perceptron_XOR
Problem: a two-input perceptron is required to perform the logical operation XOR.

Run the file, follow the instructions, READ THE COMMENTS at each step, and observe the following:
–The graph should show you how well the perceptron separates the two categories (category 0 and category 1).
–Observe how the categorisation line changes at every step of the training. Does the system learn?
–The input pairs ([0,0], [0,1], [1,0], [1,1]) are NOT classified correctly after the training.
–Can you tell why?
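The reason is linear separability: no single straight line can put [0,0] and [1,1] on one side and [0,1] and [1,0] on the other, so a one-neuron perceptron has no solution to converge to. A quick way to see the failure, sketched with the same assumed toolbox calls as before (not the contents of Perceptron_XOR.m):

    % Sketch: a single perceptron cannot learn XOR, however long it trains
    P = [0 0 1 1; 0 1 0 1];
    T = [0 1 1 0];                 % XOR targets -- not linearly separable
    net = newp([0 1; 0 1], 1);
    net.trainParam.epochs = 100;   % plenty of passes, but it never converges
    net = train(net, P, T);
    Y = sim(net, P)                % at least one of the four outputs remains wrong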
Slide 6
Back-propagation algorithm

Filename: XOR_bp.m
Matlab command: XOR_bp
Problem: a three-layer back-propagation network is required to perform the logical operation Exclusive-OR.

Run the file, follow the instructions, READ THE COMMENTS at each step, and observe the following:
–The graph should show you how the training proceeds. The blue line represents the error, which drops over time (iterations).
–Observe how there are now two lines separating the categories (one from each of the two hidden neurons).
–The input pairs ([0,0], [0,1], [1,0], [1,1]) are classified correctly after training.
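XOR_bp.m is the authoritative implementation. Purely as an illustration of the algorithm the slide describes, here is a minimal from-scratch sketch of a 2-2-1 sigmoid network trained by back-propagation (the learning rate, epoch count and random initialisation are assumptions, not values from the lab file):

    % Minimal from-scratch back-propagation for XOR (two hidden sigmoid neurons)
    P = [0 0 1 1; 0 1 0 1];  T = [0 1 1 0];
    rand('state', 0);                              % old-style seeding, for reproducibility
    W1 = rand(2,2) - 0.5;  b1 = rand(2,1) - 0.5;   % hidden-layer weights and biases
    W2 = rand(1,2) - 0.5;  b2 = rand(1,1) - 0.5;   % output-layer weights and bias
    lr  = 0.5;                                     % assumed learning rate
    sig = @(x) 1 ./ (1 + exp(-x));                 % logistic activation
    for epoch = 1:5000
        H  = sig(W1*P + b1*ones(1,4));             % forward pass: hidden activations
        Y  = sig(W2*H + b2*ones(1,4));             % forward pass: network output
        dY = (T - Y) .* Y .* (1 - Y);              % output delta (error times derivative)
        dH = (W2' * dY) .* H .* (1 - H);           % delta propagated back to the hidden layer
        W2 = W2 + lr * dY * H';   b2 = b2 + lr * sum(dY, 2);
        W1 = W1 + lr * dH * P';   b1 = b1 + lr * sum(dH, 2);
    end
    round(Y)   % usually prints the XOR targets 0 1 1 0 (training can occasionally stall in a local minimum)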
Slide 7
Competitive learning

Filename: Competitive.m
Matlab command: Competitive
Problem: a single-layer competitive network is required to classify a set of two-element input vectors into four natural classes.

Run the file, follow the instructions, READ THE COMMENTS at each step, and observe the following:
–The graph should show you the coordinates of the input data vectors (red) together with the weight vectors (blue).
–Observe the labels given to each cluster.
–The test vectors (green) are then labelled according to the labelled cluster they fall closest to.
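Competitive.m is the file to study; as a rough sketch of the idea (assuming the classic toolbox functions newc, train, sim and vec2ind, and using stand-in random data rather than the lab's four-class data set), a competitive layer can be built and inspected like this:

    % Sketch: a 4-neuron competitive layer clustering two-element input vectors
    P = rand(2, 40);                                % stand-in data (the lab file has its own)
    net = newc([0 1; 0 1], 4, 0.1);                 % 4 competitive neurons, Kohonen learning rate 0.1
    net.trainParam.epochs = 50;
    net = train(net, P);                            % unsupervised: no target vector
    plot(P(1,:), P(2,:), 'r+');  hold on;           % input vectors in red
    plot(net.IW{1,1}(:,1), net.IW{1,1}(:,2), 'bo'); % learned weight vectors (cluster centres) in blue
    classes = vec2ind(sim(net, P))                  % winning neuron, i.e. cluster label, for each input

The lab file goes further than this sketch: it labels the clusters and then classifies the green test vectors against them.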