Intelligent Database Systems Lab, National Yunlin University of Science and Technology (國立雲林科技大學)
Growing Hierarchical Tree SOM: An unsupervised neural network with dynamic topology
Authors: Alberto Forti, Gian Luca Foresti (Neural Networks, 2006)
Advisor: Dr. Hsu Presenter: Zih-Hui Lin
Outline
─ Motivation
─ Objective
─ GHTSOM
─ Conclusions
Motivation
The SOM has two main limitations:
─ (1) the need to define the size and the topology of the map a priori
─ (2) the difficulty of detecting cluster boundaries on the trained map
Objective
This paper introduces the GHTSOM (Growing Hierarchical Tree SOM) to overcome both limitations.
GHTSOM
─ Training process: a tree of identical SOMs is constructed.
─ Clustering process: each level of the tree is considered, and self-organization is used to group neurons into classes.
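The tree-of-SOMs structure above can be sketched in code. This is a minimal illustration, not the authors' implementation; the class and attribute names (`GHTSOMNode`, `weights`, `children`) are hypothetical.

```python
class GHTSOMNode:
    """One node of the GHTSOM tree (hypothetical sketch).

    Each node is a small SOM; a neuron that attracts enough input
    patterns grows its own child SOM, yielding a tree of identical SOMs.
    """

    def __init__(self, weights):
        self.weights = weights    # list of neuron weight vectors of this SOM
        self.children = {}        # neuron index -> child GHTSOMNode

    def is_leaf(self):
        # A node with no grown neurons is a leaf of the hierarchy.
        return not self.children
```

A neuron of the root that grows would get an entry in `children`, so clustering can walk the tree level by level.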
GHTSOM - Training process
Growth phase → each neuron of the last hierarchical layer grows (if it can), creating a triangle SOM
─ a neuron is assumed to grow only if it is activated by at least three patterns
Training phase → each new SOM is trained with its own local training set
─ standard SOM training
─ input presentation to the leaves
─ fine-tuning: each triangle SOM is trained by adapting only the winner; the neighbourhood function is not used
─ deletion of nodes: neurons with zero or low activity are removed; if two neurons have zero activity, the entire sub-network is deleted and the father neuron is blocked from growing
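The three rules on this slide (growth, winner-only fine-tuning, deletion) can be sketched as small functions. This is a hedged illustration under the slide's assumptions; function names, the learning rate, and the data layout are my own choices, not from the paper.

```python
import math


def can_grow(patterns_won, min_activity=3):
    """Growth rule: a leaf neuron spawns a child triangle SOM (three
    neurons) only if at least min_activity patterns activate it."""
    return len(patterns_won) >= min_activity


def fine_tune(children, x, lr=0.05):
    """Fine-tuning: only the winning child neuron is adapted toward the
    input; no neighbourhood function is used."""
    i = min(range(len(children)), key=lambda j: math.dist(children[j], x))
    children[i] = [w + lr * (xk - w) for w, xk in zip(children[i], x)]
    return children


def should_delete_subnet(activity):
    """Deletion rule: if two (of the three) child neurons have zero
    activity, the whole sub-network is deleted and the father neuron
    is blocked from further growth."""
    return sum(1 for a in activity if a == 0) >= 2
```

For example, `fine_tune([[0.0], [1.0], [2.0]], [0.1])` moves only the first (winning) neuron slightly toward `0.1` and leaves the other two untouched.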
GHTSOM - Clustering process
[Figure: two input vectors presented to a pair of neurons; each input vector selects one winner neuron, leaving the other as the loser.]
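The winner/loser selection the slide's figure depicts is standard competitive selection: the neuron whose weight vector is closest to the input wins. A minimal sketch (the function name is mine):

```python
import math


def winner(neurons, x):
    """Return the index of the winning neuron: the one whose weight
    vector has minimal Euclidean distance to the input x. All other
    neurons are the losers for this input."""
    return min(range(len(neurons)), key=lambda i: math.dist(neurons[i], x))


neurons = [[0.0, 0.0], [1.0, 1.0]]
winner(neurons, [0.1, 0.0])   # -> 0: the first neuron wins
winner(neurons, [0.9, 1.0])   # -> 1: now the second neuron wins
```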
Experimental results
Conclusions
This paper presented a new hierarchical growing neural network model, the Growing Hierarchical Tree SOM (GHTSOM). As a result, the model does not need user-defined thresholds.
My opinion
Advantages
─ Users do not need to supply parameters (map size, thresholds, ...).
─ The problem of cluster boundary detection is solved.
Drawbacks
….
Applications
…..