A fast nearest neighbor classifier based on self-organizing incremental neural network (SOINN)
Neural Networks (NN, 2008)
Presenter: Lin, Shu-Han
Authors: Shen Furao, Osamu Hasegawa
Outline: Introduction, Motivation, Objective, Methodology, Experiments, Conclusion, Comments
Introduction - self-organizing incremental neural network (SOINN): when an input lies too far from its nearest node (beyond the similarity threshold), it is inserted as a new node; each node serves as a prototype.
Introduction - SOINN: the winner and the second winner are connected by an edge (link), and every edge carries an age.
Introduction - SOINN: edges whose age grows too old (exceeds the age threshold) are removed.
Introduction - SOINN: the network is run two times; a node is inserted where the accumulated error is large, and the insertion is cancelled if it turns out to be of no use (it does not reduce the error).
Introduction - SOINN: in each run, outliers are deleted; nodes without neighbors are assumed to lie in low-density regions and are removed.
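The insertion decision described in these introduction slides can be sketched as follows. This is a minimal numpy sketch following the standard SOINN rule (a node's similarity threshold is the distance to its farthest neighbor, or to its nearest other node if it has no neighbors); the `nodes` array / `neighbors` list layout and all names are assumptions for illustration, not the authors' code.

```python
import numpy as np

def similarity_threshold(i, nodes, neighbors):
    """Threshold of node i: distance to its farthest neighbor, or to its
    nearest other node when it has no neighbors (the SOINN rule)."""
    dists = np.linalg.norm(nodes - nodes[i], axis=1)
    if neighbors[i]:
        return max(dists[j] for j in neighbors[i])
    dists[i] = np.inf                      # ignore the node itself
    return dists.min()

def should_insert(x, nodes, neighbors):
    """An input is 'too far' (and becomes a new node) when its distance to
    the winner or to the second winner exceeds that node's threshold."""
    dists = np.linalg.norm(nodes - x, axis=1)
    w1, w2 = np.argsort(dists)[:2]         # winner and second winner
    too_far = (dists[w1] > similarity_threshold(w1, nodes, neighbors) or
               dists[w2] > similarity_threshold(w2, nodes, neighbors))
    return too_far, w1, w2
```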
Motivation: The SOINN classifier (the authors' first work, 2005) uses six user-determined parameters, does not address noise, produces too many prototypes, and relies on unsupervised learning. Their second work (2007) discusses these weaknesses.
Objectives: Propose an improved version of SOINN, ASC (Adjusted SOINN Classifier). Faster: fewer prototypes, which speeds up both the training phase and the classification phase. Classifier: the 1-NN rule over prototypes. Supports incremental learning. One layer: the settings are easier to understand and there are fewer parameters. More stable: with the help of k-means.
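A minimal sketch of the 1-NN (prototype) rule the objectives refer to: classification simply returns the label of the nearest stored prototype, which is why keeping few prototypes speeds up the classification phase. The function name and toy data are illustrative assumptions.

```python
import numpy as np

def classify(x, prototypes, labels):
    """1-NN rule over the learned prototypes: the query takes the label
    of its nearest prototype (Euclidean distance)."""
    dists = np.linalg.norm(prototypes - x, axis=1)
    return labels[int(np.argmin(dists))]

# Toy usage: with only a handful of prototypes the lookup stays cheap.
prototypes = np.array([[0.0, 0.0], [5.0, 5.0]])
labels = np.array([0, 1])
print(classify(np.array([4.2, 4.8]), prototypes, labels))   # -> 1
```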
Methodology - Adjusted SOINN: when an input lies too far from its nearest node (beyond the similarity threshold), a new node is created; each node represents a cluster.
Methodology - Adjusted SOINN: the winner and the second winner are linked by an edge, and every edge carries an age.
Methodology - Adjusted SOINN: the winner and its neighbors are updated, moving toward the input.
Methodology - Adjusted SOINN: edges whose age is too old (age > a_d, the age threshold) are removed.
Methodology - Adjusted SOINN: delete outliers; nodes without neighbors are assumed to lie in low-density regions and are removed.
Methodology - Adjusted SOINN: every lambda learning iterations, the deletion of isolated nodes is carried out.
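Putting the Adjusted SOINN slides together, here is a minimal sketch of the one-layer online loop: link winner and second winner, age the winner's other edges, drop edges older than the threshold, nudge the winner and its neighbors toward the input, and every lambda inputs delete isolated nodes. Function names, learning rates, and the bookkeeping layout are assumptions made for illustration, not the paper's exact procedure.

```python
import numpy as np

def train_adjusted_soinn(data, labels, age_max=50, lam=100, eps_w=0.1, eps_n=0.01):
    """One-layer online loop; starts from the first two samples as nodes."""
    nodes = [data[0].astype(float).copy(), data[1].astype(float).copy()]
    node_labels = [labels[0], labels[1]]
    edges = {}                                  # (i, j) with i < j -> age
    for t, (x, y) in enumerate(zip(data[2:], labels[2:]), start=1):
        P = np.asarray(nodes)
        d = np.linalg.norm(P - x, axis=1)
        w1, w2 = np.argsort(d)[:2]
        # The "too far" test from the earlier sketch would go here: if it
        # fires, x is added as a new node (with label y) instead.
        key = (min(w1, w2), max(w1, w2))
        edges[key] = 0                          # connect / reset the link age
        for e in [e for e in edges if w1 in e and e != key]:
            edges[e] += 1                       # age the winner's other edges
            if edges[e] > age_max:
                del edges[e]                    # age too old: remove the edge
        neigh = {j for e in edges if w1 in e for j in e} - {w1}
        nodes[w1] += eps_w * (x - nodes[w1])    # move the winner toward x
        for j in neigh:
            nodes[j] += eps_n * (x - nodes[j])  # move its neighbors slightly
        if t % lam == 0 and edges:              # every lambda inputs:
            linked = {j for e in edges for j in e}
            keep = [i for i in range(len(nodes)) if i in linked]
            remap = {old: new for new, old in enumerate(keep)}
            nodes = [nodes[i] for i in keep]    # delete isolated (outlier) nodes
            node_labels = [node_labels[i] for i in keep]
            edges = {(remap[a], remap[b]): g for (a, b), g in edges.items()}
    return np.asarray(nodes), np.asarray(node_labels)
```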
Methodology - k-means: with the help of k-means clustering (k = number of neurons), the resulting prototypes are adjusted, under the assumption that each node should lie near a centroid of its class.
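A minimal sketch of this k-means adjustment, assuming it is applied class by class with each class's SOINN nodes as the initial centers (k = number of neurons of that class); the per-class grouping and the fixed number of Lloyd iterations are assumptions for illustration.

```python
import numpy as np

def kmeans_adjust(prototypes, proto_labels, X, y, n_iter=10):
    """Move each prototype to the centroid of the training samples it wins
    within its own class (a few Lloyd iterations seeded by the SOINN nodes)."""
    prototypes = prototypes.astype(float).copy()
    for c in np.unique(y):
        idx_p = np.where(proto_labels == c)[0]     # this class's prototypes
        Xc = X[y == c]
        for _ in range(n_iter):
            d = np.linalg.norm(Xc[:, None, :] - prototypes[idx_p][None, :, :], axis=2)
            assign = np.argmin(d, axis=1)          # nearest same-class prototype
            for j, p in enumerate(idx_p):
                members = Xc[assign == j]
                if len(members):
                    prototypes[p] = members.mean(axis=0)
    return prototypes
```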
Methodology - noise reduction: with the help of the k-Edited Neighbors Classifier (ENC), k = ?, a node whose label differs from the majority vote of its k neighbors is deleted; such nodes are assumed to be generated by noise.
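A minimal sketch of this edited-neighbors noise-reduction step: a prototype is deleted when its label disagrees with the majority vote of its k nearest fellow prototypes. The slide leaves k unspecified, so it stays a free parameter here; integer class labels are assumed.

```python
import numpy as np

def edit_noisy_prototypes(prototypes, labels, k=3):
    """Keep only prototypes whose label matches the majority vote of their
    k nearest other prototypes; the rest are assumed to come from noise."""
    keep = []
    for i, p in enumerate(prototypes):
        d = np.linalg.norm(prototypes - p, axis=1)
        d[i] = np.inf                                # exclude the prototype itself
        nn = np.argsort(d)[:k]
        majority = np.bincount(labels[nn]).argmax()  # assumes integer labels
        if labels[i] == majority:
            keep.append(i)
    return prototypes[keep], labels[keep]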
Methodology - center cleaning: delete a neuron if it has never been the nearest neuron to a sample of another class; such neurons are assumed to lie in the central part of their class.
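One reading of the center-cleaning slide, sketched below: for every training sample, the nearest prototype carrying a different label is marked as boundary-relevant; prototypes that are never marked are assumed to sit in the central part of their class and are deleted. This interpretation, and all names used, are assumptions; the paper's exact criterion may differ.

```python
import numpy as np

def center_clean(prototypes, proto_labels, X, y):
    """Delete prototypes that have never been the nearest other-class
    prototype to any training sample (central-part nodes)."""
    boundary = np.zeros(len(prototypes), dtype=bool)
    for x, lab in zip(X, y):
        other = np.where(proto_labels != lab)[0]   # prototypes of other classes
        if other.size == 0:
            continue
        d = np.linalg.norm(prototypes[other] - x, axis=1)
        boundary[other[int(np.argmin(d))]] = True  # nearest other-class prototype
    return prototypes[boundary], proto_labels[boundary]
```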
Experiments - artificial datasets (figures show the dataset, the Adjusted SOINN prototypes, and the ASC prototypes): on the first two datasets ASC matches Adjusted SOINN's error while being faster; on the remaining two datasets ASC achieves better error and is faster.
Experiments - real datasets: results are reported as compression ratio (%) and speed-up ratio (%).
Experiments - comparison with other prototype-based classification methods: Nearest Subclass Classifier (NSC), k-Means Classifier (KMC), k-NN Classifier (NNC), Learning Vector Quantization (LVQ).
Conclusions: ASC learns the number of nodes needed to determine the decision boundary; it is an incremental neural network, is robust to noisy training data, classifies quickly, and needs fewer parameters (three).
Comments: Advantage - the method improves many aspects of the earlier work, and a previous paper demonstrates the weaknesses they set out to fix. Drawback - no suggestions are given for setting the parameters. Application - a step from unsupervised learning to supervised learning.