

1 Intelligent Database Systems Lab 國立雲林科技大學 National Yunlin University of Science and Technology Advisor: Dr. Hsu Graduate: Sheng-Hsuan Wang Authors: Giuseppe Patanè, Marco Russo Department of Information Management Fully Automatic Clustering System IEEE Transactions on Neural Networks, vol. 13, no. 6, November 2002

2 Outline
Motivation
Objective
Introduction
VQ
Previous Works: ELBG
FACS
Results
Conclusion
Personal Opinion
Review

3 Motivation
Can clustering be made fully automatic?
Can the number of computations per iteration be kept low?

4 Objective
This paper presents the fully automatic clustering system (FACS). The objective is the automatic calculation of a codebook of the right dimension, given a fixed desired error. To save on the number of computations per iteration, greedy techniques are adopted.

5 Introduction
Cluster analysis (CA, or clustering).
Vector quantization (VQ).
Data are grouped into groups (or cells); each cell is represented by a vector called a codeword, and the set of codewords is called the codebook.
The difference between CA and VQ: both group data into a certain number of groups so that a loss (or error) function is minimized.

6 Clustering and VQ

7 VQ – Definition
The objective of VQ is the representation of a set X of feature vectors in R^K by a set Y = {y_1, ..., y_NC} of reference vectors (codewords) in R^K.

8 VQ – Quantization Error (QE)
Square error (SE): d(x, y) = sum_k (x_k - y_k)^2.
Weighted square error (WSE): d_W(x, y) = sum_k w_k (x_k - y_k)^2.
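The two error measures can be written directly; a minimal sketch (the function names are mine, not the paper's):

```python
import numpy as np

def squared_error(x, y):
    """SE: sum of squared component-wise differences."""
    return float(np.sum((np.asarray(x) - np.asarray(y)) ** 2))

def weighted_squared_error(x, y, w):
    """WSE: each squared difference scaled by a per-component weight w_k."""
    return float(np.sum(np.asarray(w) * (np.asarray(x) - np.asarray(y)) ** 2))

print(squared_error([1.0, 2.0], [0.0, 0.0]))                  # 1 + 4 = 5.0
print(weighted_squared_error([1.0, 2.0], [0.0, 0.0], [2.0, 1.0]))  # 2*1 + 1*4 = 6.0
```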

9 VQ – Nearest Neighbor Condition (NNC)
Nearest neighbor condition (NNC): given a fixed codebook Y, the NNC consists of assigning each input vector to the nearest codeword.

10 VQ – Centroid Condition (CC)
Centroid condition (CC): given a fixed partition S, the CC gives the procedure for finding the optimal codebook: each codeword becomes the centroid of the vectors in its cell.
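Taken together, the NNC and the CC form one iteration of a generalized Lloyd scheme. A minimal NumPy sketch of the two conditions (function names and the empty-cell handling are my own choices):

```python
import numpy as np

def nnc_partition(X, Y):
    """NNC: assign each input vector to its nearest codeword (squared Euclidean)."""
    d = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=2)  # (n, Nc) distance table
    return d.argmin(axis=1)

def cc_codebook(X, labels, Y_old):
    """CC: replace each codeword by the centroid of its cell
    (an empty cell keeps its old codeword)."""
    Y = Y_old.copy()
    for i in range(len(Y)):
        members = X[labels == i]
        if len(members):
            Y[i] = members.mean(axis=0)
    return Y

X = np.array([[0., 0.], [0., 1.], [10., 10.], [10., 11.]])
Y = np.array([[0., 0.], [10., 10.]])
labels = nnc_partition(X, Y)        # [0, 0, 1, 1]
Y_new = cc_codebook(X, labels, Y)   # [[0, 0.5], [10, 10.5]]
```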

11 Previous Works: ELBG
The starting point of the research reported in this paper was the authors' previous work, the ELBG [39]:
1. Initialization.
2. Partition calculation, according to the NNC (6).
3. Termination condition check.
4. ELBG-block execution.
5. New codebook calculation, according to the CC (9).
6. Return to Step 2.
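Without Step 4 (the ELBG-block, which is what distinguishes ELBG from plain LBG), the loop above can be sketched as follows; the relative-improvement threshold `eps` is an assumed termination condition, not taken from the paper:

```python
import numpy as np

def lbg(X, Y, eps=1e-3, max_iter=100):
    """Plain LBG/GLA loop: NNC partition, termination check, CC update.
    The ELBG-block (Step 4) is omitted in this sketch."""
    Y = Y.copy()
    prev = None
    for _ in range(max_iter):
        d = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)                        # Step 2: NNC
        err = d[np.arange(len(X)), labels].mean()        # MQE of current partition
        if prev is not None and prev - err <= eps * prev:  # Step 3: termination
            break
        prev = err
        for i in range(len(Y)):                          # Step 5: CC
            members = X[labels == i]
            if len(members):
                Y[i] = members.mean(axis=0)
    return Y, err

X = np.array([[0., 0.], [0., 1.], [10., 10.], [10., 11.]])
Y, err = lbg(X, np.array([[1., 1.], [9., 9.]]))
# converges to the cell centroids [[0, 0.5], [10, 10.5]]
```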

12 A. ELBG-Block
The basic idea of the ELBG-block:
A low-distortion cell is joined with a cell adjacent to it.
A high-distortion cell is split into two smaller ones.
If we define the mean distortion per cell as the total distortion divided by the number of codewords, cells whose distortion falls below this mean are candidates for union, and cells above it are candidates for splitting.

13 A. ELBG-Block

14 A. ELBG-Block

15 A. ELBG-Block
1) SoCA (shift of codeword attempt): the codeword to shift is looked for in a stochastic way.

16 A. ELBG-Block
Splitting: both new codewords are placed on the principal diagonal of the cell; in this sense, the two codewords are near each other. Some local rearrangements are then executed.
Union: the low-distortion cell is joined with a cell adjacent to it.
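Placing the two new codewords on the principal diagonal can be sketched like this; the offset fraction `frac` is an assumption of mine, not the paper's exact placement rule:

```python
import numpy as np

def split_on_diagonal(cell_points, frac=0.25):
    """Place two new codewords on the principal diagonal of the cell's
    bounding box, symmetric about its centre."""
    lo = cell_points.min(axis=0)          # "lower" corner of the bounding box
    hi = cell_points.max(axis=0)          # "upper" corner
    y1 = lo + frac * (hi - lo)            # partway along the diagonal
    y2 = hi - frac * (hi - lo)            # mirrored about the centre
    return y1, y2

pts = np.array([[0., 0.], [4., 2.], [2., 4.], [4., 4.]])
y1, y2 = split_on_diagonal(pts)           # y1 = [1, 1], y2 = [3, 3]
```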

17 A. ELBG-Block
2) Mean quantization error estimation and eventual SoC: after the shift we have a new codebook (Y') and a new partition (S'), so the new MQE can be calculated. If it is lower than the value before the SoCA, the shift is confirmed; otherwise, it is rejected.

18 B. Considerations Regarding the ELBG
Insertions are effected in the regions where the error is higher; deletions where the error is lower.
Operations are executed locally.
Several insertions or deletions can be effected during the same iteration, always working locally.

19 FACS
Introduction: a CA/VQ technique whose objective is to automatically find the codebook of the right dimension.
In FACS, growing or shrinking the codebook happens smartly: new codewords are inserted where the QE is higher and eliminated where the error is lower.
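A schematic sketch of this size-adaptation logic (an illustration under my own assumptions, not the paper's exact procedure; in particular the perturbed-centroid insertion and the deterministic cell choices are simplifications):

```python
import numpy as np

def facs_size_step(X, Y, desired_err, rng=None):
    """One schematic adaptation step: insert a codeword where the QE is
    highest when the MQE exceeds the target; delete one where the QE is
    lowest when the MQE is already below the target."""
    rng = rng if rng is not None else np.random.default_rng(0)
    d = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=2)
    labels = d.argmin(axis=1)
    point_err = d[np.arange(len(X)), labels]
    per_cell = np.array([point_err[labels == i].sum() for i in range(len(Y))])
    if point_err.mean() > desired_err:              # smart growing
        worst = per_cell.argmax()
        new = X[labels == worst].mean(axis=0) + rng.normal(0.0, 1e-3, X.shape[1])
        Y = np.vstack([Y, new])                     # perturbed copy of the centroid
    elif len(Y) > 1:                                # smart reduction
        Y = np.delete(Y, per_cell.argmin(), axis=0)
    return Y

X = np.array([[0., 0.], [0., 1.], [10., 10.], [10., 11.]])
Y_grown = facs_size_step(X, np.array([[5., 5.]]), desired_err=0.5)      # grows to 2
Y_shrunk = facs_size_step(X, np.array([[0., 0.5], [10., 10.5]]), 10.0)  # shrinks to 1
```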

20 FACS iteration

21 Smart growing phase.

22 p versus the number of iterations

23 Smart reduction phase.

24 FACS
The cell to eliminate is chosen with a probability that is a decreasing function of its distortion.
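One simple way to realize such a probability is an inverse-distortion roulette wheel; the inverse weighting is my assumption here, since the slide only requires some decreasing function of distortion:

```python
import numpy as np

def deletion_probabilities(cell_distortions, eps=1e-12):
    """Selection probabilities that decrease with cell distortion."""
    inv = 1.0 / (np.asarray(cell_distortions, dtype=float) + eps)  # eps guards zeros
    return inv / inv.sum()

def pick_cell_to_delete(cell_distortions, rng=None):
    """Roulette-wheel draw: low-distortion cells are the likeliest to go."""
    rng = rng if rng is not None else np.random.default_rng(0)
    p = deletion_probabilities(cell_distortions)
    return int(rng.choice(len(p), p=p))

p = deletion_probabilities([1.0, 2.0, 4.0])   # lowest distortion -> highest p
idx = pick_cell_to_delete([1.0, 2.0, 4.0])
```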

25 Behavior of FACS Versus the Number of Iterations and Termination Condition

26 Discussion about outliers

27 Results
Introduction.
Comparison with ELBG.
Comparison with GNG and GNG-U.
Comparison with FOSART.
Comparison with the Competitive Agglomeration algorithm.
Classification.

28 B. Comparison with ELBG

29 C. Comparison With GNG and GNG-U
GNG and GNG-U insert codewords until either a prefixed number is reached or a "performance measure" is fulfilled. In FACS's case, the stopping criterion is the desired quantization error.

30 D. Comparison With FOSART
FOSART belongs to the family of ART algorithms. It is also used for tasks of VQ.

31 E. Comparison With the Competitive Agglomeration algorithm.

32 F. Classification
Comparison between FACS and the GCS algorithm on a supervised classification problem: the two spirals.
Mode 1: the input consists of 194 2-D vectors representing the two spirals; the output is the related membership class (0 or 1). The WSE is employed.
Mode 2: the clustering phase uses only the input part of the patterns, and employs the SE.
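For reference, a common construction of the 194-point two-spirals benchmark; the exact radius/angle formulas below follow the widely used CMU formulation and are an assumption, not taken from this paper:

```python
import numpy as np

def two_spirals(n_per_spiral=97):
    """Generate the two-spirals benchmark: 97 points per class,
    the second spiral being the point reflection of the first."""
    i = np.arange(n_per_spiral)
    angle = i * np.pi / 16.0
    radius = 6.5 * (104 - i) / 104.0
    x = radius * np.sin(angle)
    y = radius * np.cos(angle)
    X = np.vstack([np.column_stack([x, y]),
                   np.column_stack([-x, -y])])  # mirrored second spiral
    labels = np.concatenate([np.zeros(n_per_spiral), np.ones(n_per_spiral)])
    return X, labels

X, labels = two_spirals()   # X.shape == (194, 2)
```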

33 F. Classification (cont.)

34 Conclusion
FACS is a new algorithm for CA/VQ that is able to autonomously find the number of codewords once the desired quantization error is specified.
In comparison with previous similar works, a significant improvement in running time has been obtained.
Further studies will be made regarding the use of different distortion measures.

35 Personal Opinion
The starting point of the research reported in this paper was the authors' previous work, the ELBG.
The QE is a key index.

36 Review
Clustering vs. VQ.
Previous works: ELBG.
FACS: smart growing, smart reduction.

