Growing Curvilinear Component Analysis (GCCA)


Growing Curvilinear Component Analysis (GCCA)
G. Cirrincione, University of Picardie Jules Verne, Lab. LTI, Amiens, France; University of Nottingham, Dept. of Electrical Eng., Nottingham, UK (exin@u-picardie.fr)
V. Randazzo, Politecnico di Torino, DET, Torino, Italy (vincenzorandazzo@msn.com)

Outline: Introduction, CCA, GCCA algorithm, Analysis of bridges, Simulations, Application, Conclusions

Introduction: continuous learning from an online data stream.

Introduction: dimensionality-reduction networks for a stationary input data stream
- linear: PCA (Oja, 1992); neural PCA: GHA (1989), APEX (1996)
- nonlinear, VQ + projection fusion: VQP (1993), OVI-NG (2006)
- connected graph: online ISOMAP (2004), GNLG-NG (2006), TRNMap (2008), RBF-NDR (2011)
- incremental SOM: GNG-U (1997), DASH (2005), DSOM (2010)
- autoencoders (2006)
For a nonstationary input, few methods apply: linear PCA (Oja, 1992) and, nonlinear, DSOM (2010). Real-time pattern recognition (fault diagnosis, novelty detection, intrusion detection, speech recognition, text recognition, computer vision, scene analysis, face recognition) calls for a forgetting network.

What is the goal?
A SOM with a low constant learning rate keeps memory of past data; a SOM or DSOM with a high constant learning rate is memoryless.

What is the goal? Bridges that act as a memory of the nonstationarity.

What is the goal? When the input distribution is nonstationary (a jump), a bridge connects a neuron of the old region to the new data x. Each neuron carries a local threshold T; ordinary edges between the first and second winner are created by Competitive Hebbian Learning.

Curvilinear Component Analysis: a dynamic fusion of local vector quantization and projection.
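As a reminder of what the gradient steps below refer to, CCA minimizes a weighted mismatch between data-space and latent-space distances; the standard cost and its per-point stochastic update (with the other point's projection fixed) are:

```latex
E = \frac{1}{2} \sum_{i} \sum_{j \neq i} \left( X_{ij} - Y_{ij} \right)^{2} F_{\lambda}(Y_{ij}),
\qquad
X_{ij} = \lVert \mathbf{x}_i - \mathbf{x}_j \rVert, \quad
Y_{ij} = \lVert \mathbf{y}_i - \mathbf{y}_j \rVert,
```
```latex
\Delta \mathbf{y}_i = \alpha(t)\, F_{\lambda}(Y_{ij})\, \frac{X_{ij} - Y_{ij}}{Y_{ij}}\, (\mathbf{y}_i - \mathbf{y}_j),
```

where F_λ is a decreasing neighbourhood function (e.g. a step function on the latent distance), so only nearby latent distances are matched.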

GCCA algorithm. Start: initialize the network with two neurons (the initial seed), defined in the data space and projected to the latent space.
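To fix ideas, here is a minimal sketch of the state the algorithm maintains; the class and field names (and the 2-D latent space) are assumptions for illustration, not from the paper.

```python
import numpy as np

class GCCANet:
    """Minimal sketch of the GCCA state: data-space weights, latent
    projections, local thresholds, aged edges, and bridge flags."""

    def __init__(self, x0, x1, latent_dim=2):
        x0, x1 = np.asarray(x0, dtype=float), np.asarray(x1, dtype=float)
        t = np.linalg.norm(x1 - x0)
        self.W = np.vstack([x0, x1])            # initial seed: two neurons
        self.Y = np.zeros((2, latent_dim))      # latent projections
        self.Y[1, 0] = t                        # seed projected at its true distance
        self.T = np.array([t, t])               # local thresholds
        self.edges = {(0, 1): 0}                # edge -> age
        self.bridges = set()                    # edges flagged as bridges
```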

For each new datum x0: compute the distances from all existing neurons (the candidate neurons) and sort them as d1 ≤ … ≤ dn. Test whether Tw1 > d1, i.e. whether x0 falls within the local threshold of the first winner w1. If not, x0 is novel: create a new neuron in the X space whose weight vector is x0.
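The threshold test can be sketched as follows (a hypothetical helper; the convention is that the "Tw1 > d1?" test failing marks x0 as novel):

```python
import numpy as np

def novelty_test(W, T, x0):
    """Sort distances from x0 to all candidate neurons; x0 is novel when the
    first winner's local threshold does not cover it (Tw1 <= d1)."""
    d = np.linalg.norm(W - np.asarray(x0, dtype=float), axis=1)
    order = np.argsort(d)
    w1, w2 = int(order[0]), int(order[1])
    return T[w1] <= d[w1], w1, w2
```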

The new neuron is linked to the first winner w1 by a bridge, a special edge that flags a possible nonstationarity (jump).

The new neuron's threshold is set to its distance from the winner: Tx0 = ‖x0 − w1‖.

Projection with extrapolation (latent space): the initial latent position y0 is obtained by triangulation.

Projection with extrapolation (latent space): y0 is triangulated from the projections of the first two winners, y_w1 and y_w2, at distances d1 and d2, taking the solution farthest from the third winner y_w3.

Then the projections of the first and second winner are kept fixed and y0 is adjusted according to the CCA gradient flow for one iteration (extrapolation).
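A sketch of the triangulation and of the single extrapolation step, assuming a 2-D latent space and a step neighbourhood function; function names and rates are illustrative, not from the paper.

```python
import numpy as np

def triangulate(y1, y2, d1, d2, y3):
    """Place the new latent point at distances d1 from y1 and d2 from y2,
    choosing the circle intersection farthest from the third winner y3."""
    y1, y2, y3 = (np.asarray(v, dtype=float) for v in (y1, y2, y3))
    v = y2 - y1
    d = np.linalg.norm(v)
    a = (d1**2 - d2**2 + d**2) / (2 * d)
    h = np.sqrt(max(d1**2 - a**2, 0.0))      # 0 when the circles barely touch
    base = y1 + a * v / d
    n = np.array([-v[1], v[0]]) / d          # unit normal to the winners' segment
    c1, c2 = base + h * n, base - h * n
    return c1 if np.linalg.norm(c1 - y3) >= np.linalg.norm(c2 - y3) else c2

def cca_extrapolate(y0, winners_y, data_dists, lam=np.inf, alpha=0.1):
    """One CCA gradient iteration on y0 with the winners' projections fixed."""
    y0 = np.asarray(y0, dtype=float)
    for y_j, X_ij in zip(winners_y, data_dists):
        Y_ij = np.linalg.norm(y0 - y_j)
        if 0 < Y_ij < lam:                   # step neighbourhood F_lambda
            y0 = y0 + alpha * (X_ij - Y_ij) * (y0 - y_j) / Y_ij
    return y0
```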

Recap of the novelty branch: new data x0, compute and sort the distances; if the test Tw1 > d1 fails, create a new neuron at x0, link it to w1 with a bridge, set Tx0 = ‖x0 − w1‖, triangulate the initial y0, and adjust y0 with one CCA gradient iteration while the two winners stay fixed.

If instead Tw1 > d1 holds and the connection w1–w2 is not a bridge, Competitive Hebbian Learning (input space) applies: the first and second winner are linked, if not already linked.


Edge aging (neuron pruning): the ages of the edges emanating from the first winner are incremented by one, and the age of the edge between the first and second winner is reset to 0.
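The CHL bookkeeping can be sketched as follows; the pruning age `max_age` is an assumed parameter, not from the paper.

```python
def chl_link_and_age(edges, w1, w2, max_age=50):
    """Link the two winners (age 0), increment the ages of the first winner's
    other edges, and prune any edge whose age exceeds max_age."""
    key = (min(w1, w2), max(w1, w2))
    for e in list(edges):                 # copy: we may delete while iterating
        if w1 in e and e != key:
            edges[e] += 1
            if edges[e] > max_age:
                del edges[e]              # pruning; isolated neurons can then go
    edges[key] = 0                        # create / refresh the winner-winner edge
    return edges
```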

Soft Competitive Learning (input space): the winner w1 and its linked neighbors are adapted toward the input x0.

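The soft competitive update can be sketched as follows; the two learning rates are assumed values, not from the paper.

```python
import numpy as np

def scl_update(W, edges, w1, x0, alpha_win=0.1, alpha_nbr=0.01):
    """Move the first winner strongly, and its linked neighbours weakly,
    toward the input x0."""
    x0 = np.asarray(x0, dtype=float)
    W[w1] += alpha_win * (x0 - W[w1])
    for e in edges:
        if w1 in e:
            nbr = e[0] if e[1] == w1 else e[1]
            W[nbr] += alpha_nbr * (x0 - W[nbr])
    return W
```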

Neuron thresholding (input space): the local thresholds Tw1 and Tw2 of the two winners are recomputed.

Projection with interpolation (latent space): detect the neurons whose latent position lies within the sphere centered at the first winner's projection with radius λ (the λ-neurons).

The winner's projection is kept fixed and the λ-neurons are adjusted according to the CCA gradient flow for one iteration (interpolation).
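The interpolation step can be sketched as follows; the step neighbourhood and the rate `alpha` are assumptions for illustration.

```python
import numpy as np

def cca_interpolate(Y, W, w1, lam=1.0, alpha=0.05):
    """Adjust every lambda-neuron (latent distance to y_w1 below lam) by one
    CCA gradient iteration, keeping the winner's projection fixed."""
    y1 = Y[w1].copy()
    for i in range(len(Y)):
        if i == w1:
            continue
        Y_i1 = np.linalg.norm(Y[i] - y1)
        if 0 < Y_i1 < lam:                          # a lambda-neuron
            X_i1 = np.linalg.norm(W[i] - W[w1])     # data-space distance to match
            Y[i] += alpha * (X_i1 - Y_i1) * (Y[i] - y1) / Y_i1
    return Y
```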

If the connection w1–w2 is a bridge and w1 is the bridge tail, the bridge is turned into an ordinary link: the new region has been colonized.

If w1 is not the bridge tail, the winner is doubled: a new neuron w1new is created using Hard Competitive Learning (HCL).

Neuron doubling (seed): w1 is doubled into w1new by HCL; w1 and w1new are linked, and both thresholds are set to their mutual distance: Tw1 = Tw1new = ‖w1 − w1new‖.

Projection with extrapolation (latent space): the latent position y_w1new is obtained by triangulation.

y_w1 and y_w2 are kept fixed, and y_w1new, initialized farthest from the third winner, is adjusted according to the CCA gradient flow for one iteration.
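The data-space part of the doubling step can be sketched as follows; the HCL rate `eta` is an assumed parameter.

```python
import numpy as np

def double_winner(W, T, edges, w1, x0, eta=0.5):
    """Create w1new by a hard-competitive step of w1 toward x0, link the pair,
    and set both thresholds to their mutual distance."""
    w_new = W[w1] + eta * (np.asarray(x0, dtype=float) - W[w1])
    W = np.vstack([W, w_new])
    new = len(W) - 1
    edges[(w1, new)] = 0
    t = np.linalg.norm(W[w1] - w_new)
    T = np.append(T, t)
    T[w1] = t                      # T_w1 = T_w1new = ||w1 - w1new||
    return W, T, edges, new
```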

Complete flow chart:
1. Initialize the network with two neurons (initial seed).
2. For each new datum x0, compute the distances from all candidate neurons and sort them as d1 ≤ … ≤ dn.
3. If the test Tw1 > d1 fails: create a new neuron at x0, link it to w1 with a bridge, set Tx0 = ‖x0 − w1‖, triangulate the initial y0, and run one CCA gradient iteration on y0 with the two winners fixed.
4. Otherwise, if the connection w1–w2 is a bridge: if w1 is the bridge tail, turn the bridge into a link; else double w1 into w1new via HCL, link w1 and w1new, set Tw1 = Tw1new = ‖w1 − w1new‖, triangulate y_w1new, and run one CCA iteration with y_w1 and y_w2 fixed.
5. Otherwise: apply CHL (link the winners, reset their edge age, age and prune the other edges of w1), apply SCL to w1 and its linked neighbors, recompute Tw1 and Tw2, detect the λ-neurons, and run one CCA iteration on them with w1's projection fixed.
6. Return to step 2.
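Putting the data-space part of the flow chart together: a deliberately simplified single-iteration sketch (latent-space steps, doubling, and edge aging omitted; the SCL rate and all simplifications are assumptions).

```python
import numpy as np

def gcca_iteration(W, T, edges, bridges, x0, alpha=0.1):
    """One simplified GCCA iteration: threshold test, then either bridge
    confirmation plus CHL/SCL on the winners, or a new neuron with a bridge."""
    x0 = np.asarray(x0, dtype=float)
    d = np.linalg.norm(W - x0, axis=1)
    order = np.argsort(d)
    w1, w2 = int(order[0]), int(order[1])
    key = (min(w1, w2), max(w1, w2))
    if T[w1] > d[w1]:                    # x0 inside the winner's threshold
        if key in bridges:
            bridges.discard(key)         # bridge confirmed: turn it into a link
        edges[key] = 0                   # CHL link / age reset
        W[w1] += alpha * (x0 - W[w1])    # SCL on the winner
    else:                                # novelty: new neuron + bridge
        W = np.vstack([W, x0])
        new = len(W) - 1
        T = np.append(T, d[w1])          # T_x0 = ||x0 - w1||
        edges[(w1, new)] = 0
        bridges.add((w1, new))
    return W, T, edges, bridges
```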

Analysis of bridges, nonstationary input (jump): the jump is detected by the bridges.


Analysis of bridges, nonstationary input (smooth displacement): bridges may appear in the newly covered area, but none appear in the overlap area between times t1 and t2. The density of bridges is proportional to the displacement velocity of the distribution.

Analysis of the bridge length. A long bridge admits two interpretations: if the new neuron is later doubled and linked, it signals novelty detection; if it remains without links, it is an outlier and the bridge is pruned.
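The two readings of a long bridge can be written as a simple decision; the length threshold is a hypothetical parameter, not from the paper.

```python
def interpret_bridge(length, neuron_got_linked, long_threshold=1.0):
    """A long bridge whose end neuron is later linked signals novelty; a long
    bridge whose end neuron stays isolated marks an outlier (prune the bridge)."""
    if length <= long_threshold:
        return "ordinary"
    return "novelty" if neuron_got_linked else "outlier: prune the bridge"
```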

Simulations: data space and latent space.

Simulations with a simplified onCCA: no seeds, no bridges, no double neurons.

Simplified onCCA: good versus bad initial conditions.

Simplified onCCA: noisy data.

Application: bearing accelerated-life-test experiments from the NASA data repository, provided by the FEMTO-ST Institute, Besançon, France. The data are 2155 five-dimensional vectors whose components are statistical features extracted from the measurements of four vibration transducers installed in the kinematic chain of an electrical motor. The framework is nonstationary, evolving from the healthy state to a double fault in the inner race of one bearing and in the ball of another. The intrinsic dimensionality is 3.

Application: simplified onCCA.

Application, simplified onCCA: novelty detection by means of the outlier frequency; the fault onset is detected.

Application: GCCA.

Application: latent space.


Conclusions: GCCA exploits a local equivalence between Euclidean and geodesic distances and coordinates seeds and bridges; it is both topology preserving and distance preserving, colonizes nonstationary regions, and detects outliers.

Aaron Nimzowitsch (1886-1935)