Intelligent Database Systems Lab, 國立雲林科技大學 National Yunlin University of Science and Technology. Presentation transcript: Looking inside self-organizing map ensembles with resampling and negative correlation learning.


1 Looking inside self-organizing map ensembles with resampling and negative correlation learning. Alexandra Scherbart, Tim W. Nattkemper. Neural Networks, Vol. 24, 2011, pp. 130–141. Presenter: Wei-Shen Tai, 2010/12/22.

2 Outline: Introduction, Methods, Results, Discussion, Conclusions, Comments.

3 Motivation The dilemma of ensemble learning methods  Balancing diversity against single-learner accuracy: maximizing diversity may worsen the prediction performance of every single learner, while minimizing the prediction error of each ensemble member leads to very similar networks. [Slide figure: ensemble members f1, f2, f3 mapping input x to output y.]

4 Objective Negative correlation learning (NCL) in EL  Allows a balance between single-network accuracy and diversity, controlled by the co-operation of the neural networks. [Slide figure: members f1, f2, f3 combined into the ensemble output; ensemble error of the EL vs. error of an individual network.]

5 Negative correlation learning (NCL) Additional penalty term  Balances the accuracy of the individual networks against the quantified ambiguity (diversity). γ is a parameter controlling the penalty for a high correlation of the individual networks' errors (i.e., low diversity).
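The slide's equation is lost in the transcript. As a minimal sketch of the standard NCL objective (per training example, simple-average ensemble; function and variable names are illustrative, not the authors' code), each member i minimizes its squared error plus γ times a penalty that rewards deviation from the ensemble mean:

```python
import numpy as np

def ncl_errors(preds, target, gamma):
    """Per-member NCL error for one training example.

    preds  : array of the M individual network outputs f_i(x)
    target : the desired value y
    gamma  : penalty weight; gamma = 0 recovers independent training
    """
    f_ens = preds.mean()  # simple-average ensemble output
    # Penalty p_i = (f_i - f_ens) * sum_{j != i} (f_j - f_ens)
    #            = -(f_i - f_ens)**2, since the deviations sum to zero.
    penalty = -(preds - f_ens) ** 2
    return 0.5 * (preds - target) ** 2 + gamma * penalty
```

A larger γ tolerates higher individual error in exchange for members that spread out around the ensemble mean, which is exactly the accuracy/diversity trade-off the slide describes.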

6 Penalty functions  Two penalty functions are derived from NCL.  Each ensemble member minimizes its penalty function during training. [Slide equations not preserved in the transcript.]

7 Proposed architecture 1) Random initialization of each node. Two resampling methods, bagging (bootstrap aggregating) and the random subspace method (RSM), enforce differences between the ensemble members  RSM draws a random subset of the features. 2) A SOM is regarded as an EL. The intra-member (i.e., intra-SOM) diversity is an intrinsic property of the SOM, since a single network can itself be seen as an ensemble. 3) Under NCL, the ensemble members interact and are forced to follow different trajectories in hypothesis space. [Slide figure: ensemble members f1, f2, f3 mapping x to y.]
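The two resampling steps in point 1) can be sketched as follows (a generic illustration of bagging plus RSM, not the authors' implementation; function name and parameters are assumptions):

```python
import numpy as np

def bagging_rsm_samples(X, y, n_members, k, rng=None):
    """Draw one (bootstrap sample, feature subset) per ensemble member.

    X : (n_samples, n_features) data matrix
    y : (n_samples,) targets
    k : number of features retained by the random subspace method (RSM)
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    samples = []
    for _ in range(n_members):
        rows = rng.integers(0, n, size=n)            # bagging: n rows with replacement
        cols = rng.choice(d, size=k, replace=False)  # RSM: random k-feature subset
        samples.append((X[np.ix_(rows, cols)], y[rows], cols))
    return samples
```

Each member is then trained only on its own (rows, features) view of the data, which implicitly enforces diversity between the SOMs before NCL adds its explicit pressure.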

8 Results [Result figures not preserved in the transcript.]

9 Discussion  A low number of nodes per network: increasing the size of the networks leads to a loss in generalization (ensemble) performance, since every net becomes too specialized to its presented subtask.  Small neighborhood size σ: individual nodes are forced to specialize locally, and single-member accuracy is improved by forcing diversity among the individual predictors.  A small k: predictors are no longer capable of extracting the information relevant for modeling the particular subtask at hand.

10 Conclusions A novel SOM-based EL  Introduces the concepts of negative correlation learning (NCL) into the field of SOM ensemble learning. Diversity at the intra-SOM and inter-SOM levels  An explicit diversity-forcing effect is produced by NCL at the inter-SOM level.  The combination of the two resampling methods, bagging and RSM, enforces diversity between SOMs implicitly.  The implicit diversity inside the SOMs is controlled by the width σ of the Gaussian neighborhood function.
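The σ-controlled intra-SOM diversity mentioned above comes from the standard Gaussian neighborhood of SOM training, which can be sketched as (an illustrative helper, assumed names, not the authors' code):

```python
import numpy as np

def gaussian_neighborhood(grid, winner, sigma):
    """Neighborhood weights h_ci = exp(-||r_i - r_c||^2 / (2 sigma^2)).

    grid   : (n_nodes, 2) node coordinates on the SOM lattice
    winner : index of the best-matching unit (BMU) c
    sigma  : neighborhood width; a small sigma updates only nodes near
             the BMU, so nodes specialize locally (higher intra-SOM diversity)
    """
    d2 = ((grid - grid[winner]) ** 2).sum(axis=1)  # squared lattice distances
    return np.exp(-d2 / (2.0 * sigma ** 2))
```

Shrinking σ narrows these weights, which is the mechanism behind the slide's claim that σ controls the implicit diversity inside each SOM.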

11 Comments Advantage  The proposed model resolves the conflict between the accuracy of a single ensemble member and the diversity of the ensemble. Drawback  The intra-SOM diversity is increased by decreasing the neighborhood size σ, but the authors did not explain how to increase the inter-SOM diversity during training.  Only the error between the ensemble output and the prediction of a single member is considered in the two penalty functions; the above-mentioned inter-SOM diversity is ignored in both of them.  A small map size and a small neighborhood size lower the accuracy of each SOM. Application  SOM-based ensemble learning.

