A Hybrid Self-Organizing Neural Gas Network
James Graham and Janusz Starzyk
School of EECS, Ohio University, Stocker Center, Athens, OH 45701 USA
IEEE World Congress on Computational Intelligence (WCCI'08), June 1-6, 2008, Hong Kong

Introduction
Self-Organizing Networks
– Useful for representation building in unsupervised learning
– Useful for clustering, visualization, and feature maps
– Numerous applications in surveillance, traffic monitoring, flight control, rescue missions, reinforcement learning, etc.
Some Types of Self-Organizing Networks
– Traditional Self-Organizing Map (SOM)
– Parameterless SOM
– Neural Gas Network
– Growing Neural Gas (GNG)
– Self-Organizing Neural Gas (SONG)

Description of the approach - Fritzke's GNG Network
Algorithm highlights:
1) GNG starts with a set A of two units a and b at random positions w_a and w_b in R^n.
2) In the set A, find the two nearest neighbors s_1 and s_2 to the input signal x.
3) Connect s_1 and s_2 with an edge and set the edge age to zero.
4) Adjust the positions of s_1 and its neighborhood by a constant times (x - w_s1): ε_b for s_1 and ε_n for the neighborhood.
5) Remove edges in the neighborhood that are older than a_max.
6) Insert a new node every λ cycles between the node with the greatest error and its nearest neighbor.
7) Reduce the error of the node with the maximum error and its nearest neighbor by α%, and add the removed error to the new node.
8) Reduce the error of all nodes by a constant (d) times their current error.
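As a concrete illustration, steps 2-5 of this loop can be sketched in a few lines of NumPy. This is a minimal sketch, not Fritzke's reference implementation: the container layout (node array, error array, edge set, age dict) is an assumption, and the periodic node insertion and global error decay (steps 6-8) are omitted.

```python
import numpy as np

def gng_step(nodes, errors, edges, ages, x,
             eps_b=0.05, eps_n=0.0006, a_max=88):
    """One GNG adaptation step for a single input sample x.

    nodes  : (N, dim) array of reference vectors (modified in place)
    errors : (N,) accumulated errors (modified in place)
    edges  : set of frozenset({i, j}) node-index pairs
    ages   : dict mapping edge -> age
    """
    # step 2: find the two nearest nodes s1, s2 to the input x
    d2 = np.sum((nodes - x) ** 2, axis=1)
    s1, s2 = (int(i) for i in np.argsort(d2)[:2])

    # age every edge emanating from the winner s1
    for e in [e for e in edges if s1 in e]:
        ages[e] += 1

    # accumulate the winner's error
    errors[s1] += d2[s1]

    # step 4: move s1 and its topological neighbours toward x
    nodes[s1] += eps_b * (x - nodes[s1])
    for e in edges:
        if s1 in e:
            n = next(iter(e - {s1}))
            nodes[n] += eps_n * (x - nodes[n])

    # step 3: connect s1 and s2 and reset the edge age
    e12 = frozenset((s1, s2))
    edges.add(e12)
    ages[e12] = 0

    # step 5: drop edges that grew older than a_max
    for e in [e for e in edges if ages[e] > a_max]:
        edges.discard(e)
        ages.pop(e)
```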

Example
Example of Fritzke's network results for 40,000 iterations with the following constants: ε_b = 0.05, ε_n = 0.0006, a_max = 88, λ = 200, α = 0.5, d = …

Description of the approach - Proposed Hybrid SONG Network
Algorithm highlights:
1) SONG starts with a random, pre-generated network of a fixed size.
2) Connections get "stiffer" with age, making their weights harder to change.
3) Error is calculated after the node position updates rather than before.
4) Weight adjustment and error distribution are functions of distance rather than of arbitrary, hard-to-set constants.
5) Edge connections are removed only under the following conditions:
– When a connection is added and the node has a long connection more than 2× its average connection length, the long edge is removed.
– When a node is moved and has at least 2 connections (after attaching to its destination node), its longest connection is removed.
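The first edge-removal rule can be illustrated with a short hedged sketch; the helper name and data layout below are illustrative assumptions, not the paper's code.

```python
import numpy as np

def prune_long_edges(node, nodes, edges, factor=2.0):
    """Drop edges of `node` longer than `factor` x its mean edge length."""
    incident = [e for e in edges if node in e]
    if len(incident) < 2:
        return
    lengths = np.array([np.linalg.norm(nodes[min(e)] - nodes[max(e)])
                        for e in incident])
    mean_len = lengths.mean()
    for e, length in zip(incident, lengths):
        if length > factor * mean_len:
            edges.discard(e)
```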

Description of the approach - Modification of the new-data neighborhood
[Figure: "force" calculations based on the distances |w_1 - x|, |w_2 - x|, …, |w_N - x|, |w_s - x| between the new data point x, the nearest neighbor s, and its neighborhood. A connection to a distant neighbor (more than 2× the mean length) is removed, and a node is removed if it becomes orphaned. Each affected node receives a weight adjustment, an error increase, and an age increase by 1.]
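The slide only states that the weight adjustment, error increase, and age increment are driven by the distances |w_i - x| and by connection stiffness; the Gaussian weighting and the 1/(1 + age) stiffness factor in the sketch below are assumptions used to illustrate the idea, not the paper's exact formula.

```python
import numpy as np

def adjust_neighborhood(x, w, ages, sigma=1.0):
    """Pull the nearest node and its neighbourhood toward the data point x.

    x    : (dim,) new data point
    w    : (k, dim) weights of the nearest node and its neighbours (in place)
    ages : (k,) connection ages (0 for the nearest node itself)
    """
    dist = np.linalg.norm(w - x, axis=1)           # |w_i - x|
    pull = np.exp(-dist ** 2 / (2 * sigma ** 2))   # distance-based weighting
    stiffness = 1.0 / (1.0 + ages)                 # older ("stiffer") edges move less
    w += (pull * stiffness)[:, None] * (x - w)
    return dist                                    # reused for the error increase
```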

Description of the approach - Node replacement
– Select the node s_k with the minimum error E_sk.
– Spread E_sk to the s_k neighborhood.
[Figure: the minimum-error node s_k is moved toward the neighborhood of the maximum-error node s_q.]

Description of the approach - Node replacement (cont.)
– Select the node s_k with the minimum error E_sk.
– Spread E_sk to the s_k neighborhood.
– Insert s_k into the neighborhood of s_q (the maximum-error node) using the neighborhood weights.
– Remove the longest connection.
– Spread half of the s_q neighborhood error to s_k.
[Figure: the longest connection of s_q is removed after s_k is inserted.]
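A hedged sketch of this relocation step follows. The container layout, the averaging used to place the relocated node, and the 50/50 error split are illustrative interpretations of the slide, not the paper's exact procedure.

```python
import numpy as np

def relocate_min_error_node(nodes, errors, edges):
    """Move the minimum-error node s_k into the neighbourhood of the
    maximum-error node s_q, following the node-replacement slides."""
    s_k = int(np.argmin(errors))                 # node to be moved
    s_q = int(np.argmax(errors))                 # maximum-error node

    # spread E_sk over the old neighbourhood of s_k, then detach s_k
    old_edges = [e for e in edges if s_k in e]
    old_nbrs = [next(iter(e - {s_k})) for e in old_edges]
    if old_nbrs:
        errors[old_nbrs] += errors[s_k] / len(old_nbrs)
    errors[s_k] = 0.0
    edges.difference_update(old_edges)

    # insert s_k into the neighbourhood of s_q using the neighbourhood weights
    new_nbrs = [next(iter(e - {s_q})) for e in edges if s_q in e]
    nodes[s_k] = nodes[[s_q] + new_nbrs].mean(axis=0)
    edges.add(frozenset((s_k, s_q)))
    for n in new_nbrs:
        edges.add(frozenset((s_k, n)))

    # remove the longest connection of s_q and hand half of its error
    # over to the relocated node
    incident = [e for e in edges if s_q in e]
    longest = max(incident,
                  key=lambda e: np.linalg.norm(nodes[min(e)] - nodes[max(e)]))
    edges.discard(longest)
    errors[s_k] = 0.5 * errors[s_q]
    errors[s_q] *= 0.5
```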

Results Initial network structure with 1 random connection per node (for 200 nodes)

Results (cont.) Structure resulting from 1 initial random connection.

Results (cont.) Connection equilibrium reached for 1 initial connection.

Results (cont.) Structure resulting from 16 initial random connections.

Results (cont.) Connection equilibrium for 16 initial connections.

Video of Network Progression: Hybrid SONG Network vs. Fritzke GNG Network

Results (cont.) - 2-D comparison with the SOM network
Salient features of the SOM algorithm:
– The SOM network starts as a predefined grid and is adjusted over many iterations.
– Connections are fixed; nodes are not inserted, moved, or relocated out of their preexisting grid.
– Weight adjustments occur over the entire grid and are controlled by the weighted distance to the data point.
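For reference, the standard SOM update that this comparison relies on can be written in a few lines. This is the textbook rule, not part of SONG; variable names are illustrative.

```python
import numpy as np

def som_step(weights, grid, x, lr=0.1, sigma=1.0):
    """One SOM step: weights (N, dim) codebook, grid (N, 2) fixed grid coordinates."""
    winner = np.argmin(np.sum((weights - x) ** 2, axis=1))
    grid_d2 = np.sum((grid - grid[winner]) ** 2, axis=1)
    h = np.exp(-grid_d2 / (2 * sigma ** 2))      # neighbourhood function on the fixed grid
    weights += lr * h[:, None] * (x - weights)   # pull every node toward the data point
```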

Growing SONG Network
– The number of nodes in SONG can be obtained automatically.
– The SONG network starts with a few randomly placed nodes and builds itself up until an equilibrium is reached between the network size and the error.
– A node is added every λ cycles if MaxError > AveError + Constant.
– Equilibrium appears to be reached at ~200 nodes.
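The growth rule stated above amounts to a simple test; the sketch below is illustrative, and the λ and margin values are placeholders rather than the paper's settings.

```python
def maybe_grow(step, errors, lam=200, margin=1.0):
    """Return True if a node should be inserted at this step: every `lam`
    cycles, while the maximum node error still exceeds the average error
    plus a constant margin."""
    return step % lam == 0 and errors.max() > errors.mean() + margin
```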

Growing SONG Network (cont.)
– Error handling in the growing SONG network was modified.
– The error is "reset" and recomputed after equilibrium is reached.
– The network continues to learn, reaching a new equilibrium.
– Approximation accuracy varies from run to run.

Growing SONG Network (cont.) The results of a growing SONG network run (on the right) compared to the simpler static approach (on the left).

Other Applications - Sparsely connected hierarchical sensory network
– The major features of the SONG algorithm, such as the weight adjustment, error calculation, and neighborhood selection, are used in building self-organizing, sparsely connected hierarchical networks.
– The sparse hierarchical network is locally connected based on the neurons' firing correlation.
– Feedback and time-based correlation are used for invariant object recognition.
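A hedged sketch of correlation-based local wiring: neurons in a layer are connected when their firing activity over a window is sufficiently correlated. The thresholding scheme below is an assumption used for illustration, not the paper's exact rule.

```python
import numpy as np

def correlation_wiring(activity, threshold=0.7):
    """activity: (T, N) firing traces of N neurons over T time steps."""
    corr = np.corrcoef(activity.T)               # (N, N) pairwise correlations
    n = corr.shape[0]
    # connect neuron pairs whose firing correlation exceeds the threshold
    return {frozenset((i, j))
            for i in range(n) for j in range(i + 1, n)
            if corr[i, j] > threshold}
```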

Other Applications - Sparsely connected hierarchical sensory network (cont.)
[Figure: correlation/PDF example and wiring area.]

Other Applications - Sparsely connected hierarchical network (cont.)
– Correlation-based wiring
– Sparse hierarchical representations
– Declining neurons' activations

Conclusions
The SONG algorithm is more biologically plausible than Fritzke's GNG algorithm. Specifically:
– Weight and error adjustments are not parameter based.
– Connections become stiffer with age rather than being removed at a maximum age, as in Fritzke's method.
– The network has all neurons from the beginning.
SONG approximates the data distribution faster than the other methods tested.
Connectivity between neurons is obtained automatically and depends on the parameter that controls edge removal and on the network size.
The number of neurons can be obtained automatically in growing SONG to achieve the desired accuracy.

Future Work
– Adapt the SONG algorithm to large input spaces (high dimensionality, e.g. images).
– Adapt the SONG algorithm to a hierarchical network, with possible applications in feature extraction, representation building, and shape recognition.
– Insert new nodes as needed to reduce error.
– Optimize the network design.

Questions?