On Rival Penalization Controlled Competitive Learning for Clustering with Automatic Cluster Number Selection
Author: Yiu-ming Cheung. TKDE, 2005, pp. 1583-1588.
Advisor: Dr. Hsu
Presenter: Ai-Chen Liao
Intelligent Database Systems Lab, 國立雲林科技大學 National Yunlin University of Science and Technology
Outline
─ Motivation
─ Objective
─ Method: RPCL, RPCCL
─ Experimental Results
─ Conclusion
─ Personal Opinions
Motivation
The k-means algorithm has at least two major drawbacks:
─ It suffers from the dead-unit problem.
─ If the number of clusters is misspecified, i.e., k is not equal to the true cluster number k*, the performance of the k-means algorithm deteriorates rapidly.
RPCL addresses these problems, but its performance is sensitive to the value of the delearning rate.
Objective
─ We concentrate on the RPCL algorithm and propose a novel technique to circumvent the selection of the delearning rate.
─ We further investigate RPCL and present a mechanism to dynamically control the strength of rival penalization.
Method ─ RPCL
Advantage: RPCL can automatically select the correct cluster number by gradually driving redundant seed points far away from the input dense regions.
Drawback: RPCL is sensitive to the delearning rate.
Idea: in an election campaign, the competition between the leading candidates (e.g., A with 40% and B with 35% of the vote) is far more intense than that between A and a trailing candidate C with 5%. By analogy, the closer a rival seed point is to the winner, the more strongly it should be penalized.
Method ─ RPCL
For each input, the nearest cluster center (the winner) is moved closer to the input, the second-nearest center (the rival) is moved away from it, and all other centers remain unchanged.
Method ─ RPCL
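A minimal Python sketch of the update rule just described (not the author's code; winner and rival are taken simply as the nearest and second-nearest seed points, whereas the original RPCL additionally weights this selection by each unit's winning frequency, and the default rates are the values used later in the experiments):

```python
import numpy as np

def rpcl_update(x, centers, alpha_c=0.001, alpha_r=0.0001):
    """One RPCL step: the winner moves toward the input, the rival is pushed away."""
    dists = np.linalg.norm(centers - x, axis=1)
    winner, rival = np.argsort(dists)[:2]               # nearest and second-nearest centers
    centers[winner] += alpha_c * (x - centers[winner])  # winner: move closer to x
    centers[rival] -= alpha_r * (x - centers[rival])    # rival: move away from x (delearning)
    return centers
```

Repeated over many inputs, the small but persistent delearning drives redundant seed points away from the dense regions, which is how RPCL selects the cluster number automatically; the difficulty is that the result hinges on the choice of αr.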
Method ─ RPCCL
Method ─ RPCCL
This penalization control mechanism determines how strongly the rival is penalized by comparing the distance between the winner and the rival with the distance between the winner and the current input.
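A minimal sketch of this dynamic penalization (again in Python and not the author's code; it assumes the rival's delearning strength is the winner's learning rate scaled by min(winner-rival distance, winner-input distance) / winner-rival distance, so a rival sitting close to the winner is penalized at full strength):

```python
import numpy as np

def rpccl_update(x, centers, alpha_c=0.001):
    """One RPCCL step: the rival's penalization strength is set dynamically."""
    dists = np.linalg.norm(centers - x, axis=1)
    winner, rival = np.argsort(dists)[:2]
    d_wr = np.linalg.norm(centers[winner] - centers[rival])  # winner-rival distance
    d_wx = np.linalg.norm(centers[winner] - x)               # winner-input distance
    p = min(d_wr, d_wx) / d_wr if d_wr > 0 else 0.0          # penalization strength in [0, 1]
    centers[winner] += alpha_c * (x - centers[winner])       # winner: move closer to x
    centers[rival] -= alpha_c * p * (x - centers[rival])     # rival: move away, scaled by p
    return centers
```

Because the penalization strength is derived from distances that are already available, no separate delearning rate αr has to be chosen.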
Experimental Results
─ RPCL: learning rate αc = 0.001, delearning rate αr = 0.0001
─ Number of seed points: 30
─ Audience image: 128 × 128 pixels
─ Epochs: 50
Figures: original Audience image, RPCL result, RPCCL result.
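A hedged sketch of how such a run could be set up with the update functions above (it assumes the image is clustered on per-pixel values, which is my reading of the slide rather than something it states; the function and variable names are mine):

```python
import numpy as np

def run_experiment(pixels, k=30, epochs=50, update=rpccl_update):
    """Cluster pixel vectors with k seed points for a fixed number of epochs.

    `pixels` is an (n_pixels, n_features) array, e.g. 128*128 rows for the
    Audience image; `update` is rpcl_update or rpccl_update from the sketches above.
    """
    rng = np.random.default_rng(0)
    centers = pixels[rng.choice(len(pixels), size=k, replace=False)].astype(float)
    for _ in range(epochs):
        for x in pixels[rng.permutation(len(pixels))]:  # one epoch = one pass over all pixels
            centers = update(x, centers)
    return centers
```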
Conclusion
RPCCL circumvents the difficult selection of the delearning rate.
Personal Opinions
Advantages:
─ RPCCL can automatically select the correct cluster number.
─ The proposed technique circumvents the selection of the delearning rate.
Drawback: the method requires k >= k*, i.e., the number of seed points must be at least the true cluster number.
Application: clustering…
K-means example
1. Given: {2, 4, 10, 12, 3, 20, 30, 11, 25}, k = 2
2. Randomly assign means: m1 = 3, m2 = 4
Iteration 1: k1 = {2, 3}, k2 = {4, 10, 12, 20, 30, 11, 25}, m1 = 2.5, m2 = 16
Iteration 2: k1 = {2, 3, 4}, k2 = {10, 12, 20, 30, 11, 25}, m1 = 3, m2 = 18
Iteration 3: k1 = {2, 3, 4, 10}, k2 = {12, 20, 30, 11, 25}, m1 = 4.75, m2 = 19.6
Iteration 4: k1 = {2, 3, 4, 10, 11, 12}, k2 = {20, 30, 25}, m1 = 7, m2 = 25
…..
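A short Python sketch that reproduces these iterations (a plain Lloyd-style k-means on the same data; the function and variable names are mine):

```python
def kmeans_1d(data, means, iters=5):
    """Simple 1-D k-means: assign each point to its nearest mean, then recompute the means."""
    for _ in range(iters):
        clusters = [[] for _ in means]
        for x in data:
            nearest = min(range(len(means)), key=lambda j: abs(x - means[j]))
            clusters[nearest].append(x)
        means = [sum(c) / len(c) if c else m for c, m in zip(clusters, means)]
        print(clusters, means)
    return means

kmeans_1d([2, 4, 10, 12, 3, 20, 30, 11, 25], means=[3, 4])
```

Each printed line matches one iteration above (the clusters appear in input order rather than sorted).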
Dead-unit problem
1. Given: {2, 4, 10, 12, 3, 20, 30, 11, 25}, k = 3
2. Randomly assign means: m1 = 30, m2 = 25, m3 = 10
A seed point that rarely or never wins the competition is never updated and becomes a dead unit. The heuristic Frequency Sensitive Competitive Learning (FSCL) algorithm alleviates this by reducing the winning chance of frequent winners so that every seed point eventually gets updated.
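A minimal sketch of the FSCL idea (not from the paper; it assumes the common "conscience" form in which each unit's distance to the input is weighted by its relative winning frequency, so units that win often become less likely to win again and no unit stays dead):

```python
import numpy as np

def fscl_update(x, centers, wins, alpha_c=0.001):
    """One FSCL step: winner selection is biased against frequent winners."""
    dists = np.linalg.norm(centers - x, axis=1)
    freq = wins / wins.sum()                            # relative winning frequency of each unit
    winner = int(np.argmin(freq * dists))               # frequency-weighted winner selection
    centers[winner] += alpha_c * (x - centers[winner])  # only the winner moves toward x
    wins[winner] += 1
    return centers, wins
```

Here `wins` would be initialized to ones (e.g., `np.ones(k)`) so the frequency weighting is defined from the first input onward.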