Intelligent Database Systems Lab 國立雲林科技大學 National Yunlin University of Science and Technology Rival-Model Penalized Self-Organizing Map Yiu-ming Cheung.


1 Rival-Model Penalized Self-Organizing Map
Yiu-ming Cheung and Lap-tak Law, IEEE Transactions on Neural Networks, Vol. 18, No. 1, 2007, pp. 289-295.
Presenter: Wei-Shen Tai  Advisor: Professor Chung-Chian Hsu  2007/3/1

2 N.Y.U.S.T. I. M. Intelligent Database Systems Lab
Outline
 Introduction
 Overview of the SOM
 RPSOM
 Experimental results
 Conclusion
 Comments

3 Motivation
Learning rate problem
 A small initial learning rate tends to freeze the models at certain locations of the input space at an early training stage.
 A relatively large initial value:
If it is reduced very slowly, the map can learn the topology of the inputs well with a small quantization error, but convergence needs a large number of iterations and becomes quite time-consuming.
If it is reduced too quickly, the map is likely to be trapped in a locally suboptimal solution, finally leading to a large quantization error.
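The tradeoff above concerns how fast a monotonically decreasing learning rate α(t) is reduced. As an illustration only (these specific schedules and constants are not from the paper), two common decreasing forms:

```python
import math

def lr_slow(t, a0=0.5, T=10_000):
    # Slowly decreasing (linear) schedule: small quantization error,
    # but many iterations are needed before the map converges.
    return a0 * (1 - t / T)

def lr_fast(t, a0=0.5, tau=100.0):
    # Quickly decreasing (exponential) schedule: fast convergence,
    # but the map risks being trapped in a suboptimal solution.
    return a0 * math.exp(-t / tau)
```

By iteration 1000, the exponential schedule has essentially vanished while the linear one is still near its initial value, which is exactly the dilemma the slide describes.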

4 Objective
Rival-model penalized self-organizing map (RPSOM)
 It does not require specifying a decreasing function for the learning rate.
 It uses a constant learning rate to circumvent the awkward selection of a monotonically decreasing learning-rate function, yet still achieves map convergence.

5 RPSOM
Concept
 RPSOM adaptively chooses several rivals of the BMU and penalizes their associated models a little away from the input.
 The rivals then have a better chance to become the BMU of other inputs.
Rival
 A map neuron that belongs to the first k nearest map neurons of an input but is not in the one-neighborhood of the BMU, where k is the number of one-neighborhood neurons in the adopted neighborhood topology.
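The rival definition above can be sketched in code. This is a minimal illustration, assuming a rectangular map whose one-neighborhood is the set of neurons at grid (Chebyshev) distance 1 from the BMU; the function name and layout are hypothetical, not from the paper:

```python
import numpy as np

def find_rivals(x, weights, coords):
    """Among the k nearest neurons to input x (k = size of the BMU's
    one-neighborhood), return those NOT in the BMU's one-neighborhood."""
    d = np.linalg.norm(weights - x, axis=1)   # distances in input space
    order = np.argsort(d)
    bmu = order[0]
    # One-neighborhood: neurons at grid (Chebyshev) distance 1 from the BMU.
    grid_d = np.abs(coords - coords[bmu]).max(axis=1)
    one_hood = set(np.where(grid_d == 1)[0])
    k = len(one_hood)
    # The first k nearest neurons include the BMU itself, so B = {c2, ..., ck}.
    B = order[1:k]
    return [i for i in B if i not in one_hood], bmu
```

A neuron that is close to the input in the data space but far from the BMU on the map grid is exactly what gets flagged as a rival.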

6 RPSOM learning algorithm (1/2)
Step 1)
 For a set M of mn weight vectors in an m×n map, initialize all the weight vectors w_i ∈ M, i = 1, 2, ..., mn, and set the winning frequency of each neuron, written as n_i, to 1.
Step 2)
 For each input vector x(t), find its k nearest neurons, where k is the number of one-neighborhood neurons in the adopted neighborhood topology.
 The BMU, neuron c, is found by (1); similarly, the second- to kth-nearest neurons are found, with subscript indices denoted c_2, c_3, ..., c_k. Let B = {c_2, c_3, ..., c_k}.
Step 3)
 Increment the winning frequency of the BMU: n_c = n_c + 1.
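Steps 1-3 can be sketched as follows; the random initialization, map size, and variable names are illustrative assumptions, and the BMU criterion is the usual nearest-weight-vector rule implied by (1):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, dim = 4, 4, 2
weights = rng.random((m * n, dim))   # Step 1: initialize the mn weight vectors
win_freq = np.ones(m * n)            # winning frequency n_i of each neuron is 1

def k_nearest(x, weights, k):
    """Step 2: indices of the k nearest neurons to x; index 0 is the BMU c."""
    d = np.linalg.norm(weights - x, axis=1)
    return np.argsort(d)[:k]

idx = k_nearest(np.array([0.5, 0.5]), weights, k=4)
c, B = idx[0], idx[1:]               # BMU c and B = {c2, ..., ck}
win_freq[c] += 1                     # Step 3: increment n_c
```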

7 RPSOM learning algorithm (2/2)
Step 4)
 Identify the set R of rival neurons, i.e., those belonging to the first k nearest neurons but not to the one-neighborhood of the BMU.
Step 5)
 Update the weight vectors of the BMU and its neighbors, except the rivals, using the neighborhood function h_ci(t) of the BMU neuron c.
Step 6)
 Penalize the rivals.
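The transcript omits the update and penalization equations of Steps 5 and 6 (they were rendered as images in the slides). A sketch of one iteration under stated assumptions: the standard SOM attraction update with a constant learning rate α, a Gaussian neighborhood as the assumed form of h_ci(t), and a symmetric repulsion term for the rivals. This is not the paper's exact formulation:

```python
import numpy as np

def rpsom_step(x, weights, coords, bmu, neighbors, rivals,
               alpha=0.05, sigma=1.0):
    """Steps 4-6 (sketch): attract the BMU and its neighbors (rivals are
    assumed already excluded from `neighbors`) toward x, and push the
    rival models a little away from x."""
    # Assumed Gaussian neighborhood h_ci(t) over map-grid distance.
    grid_d = np.linalg.norm(coords - coords[bmu], axis=1)
    h = np.exp(-grid_d**2 / (2 * sigma**2))
    for i in [bmu, *neighbors]:
        weights[i] += alpha * h[i] * (x - weights[i])   # Step 5: pull toward x
    for r in rivals:
        weights[r] -= alpha * (x - weights[r])          # Step 6: push away from x
    return weights
```

The penalization is what distinguishes RPSOM from the plain SOM update: rivals are displaced away from the current input, freeing them to win for other inputs later.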

8 Two measurements to evaluate the training performance of the SOM
Quantization error Q
 The average distance from each input to its BMU.
Neuron utilization U
 The percentage of map neurons that are the BMU of one or more training data.
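Both measures follow directly from their definitions above; a minimal sketch (function names are illustrative):

```python
import numpy as np

def quantization_error(data, weights):
    """Q: average distance from each input to its BMU."""
    d = np.linalg.norm(data[:, None, :] - weights[None, :, :], axis=2)
    return d.min(axis=1).mean()

def neuron_utilization(data, weights):
    """U: fraction of neurons that are the BMU of at least one input."""
    d = np.linalg.norm(data[:, None, :] - weights[None, :, :], axis=2)
    bmus = d.argmin(axis=1)
    return len(np.unique(bmus)) / len(weights)
```

A low Q with a low U can indicate "dead" neurons that never win, which is why the paper reports both.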

9 Experimental results
[Figures: quantization error Q and neuron utilization U, compared under linear initialization and random initialization.]

10 Conclusion
A novel RPSOM learning algorithm
 It does not need a prespecified decreasing function for the learning rate.
 It achieves a much lower quantization error and a higher neuron utilization than the conventional SOM and the two-phase training SOM algorithms.

11 Comments
Advantage
 A novel idea for circumventing the awkward selection of a monotonically decreasing learning-rate function in the SOM.
 It also increases the chance for rival neurons to become the BMU of other inputs.
Drawback
 The authors did not compare results across different values of k (the number of one-neighborhood neurons).
Application
 SOM-related applications.

