Improving the Separability of a Reservoir Facilitates Learning Transfer
David Norton and Dan Ventura
Computer Science Department, Brigham Young University

Liquid State Machine
A type of reservoir computing that uses a recurrent spiking neural network (SNN) as the reservoir (the "liquid").

Liquid State Machine
- Input is transformed into a series of spikes.
- These spikes are introduced into the liquid.
- Snapshots of the liquid's state are taken (state vectors).
- State vectors are introduced as input to the readout function.
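
A minimal sketch of this pipeline, assuming a toy rate-based liquid with tanh units rather than the authors' spiking reservoir; the sizes, constants, and snapshot schedule here are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_RES = 4, 64                                            # input neurons, liquid neurons
W_in = rng.normal(0.0, 1.0, (N_RES, N_IN))                     # input -> liquid weights
W_res = rng.normal(0.0, 1.0 / np.sqrt(N_RES), (N_RES, N_RES))  # recurrent liquid weights

def run_liquid(spikes, W=W_res, snapshot_every=25):
    """spikes: (n_steps, N_IN) binary spike trains.
    Returns the concatenated state vectors (snapshots of the liquid)."""
    x = np.zeros(N_RES)
    snapshots = []
    for t in range(spikes.shape[0]):
        # leaky, saturating update driven by input spikes and recurrent activity
        x = np.tanh(0.9 * x + W_in @ spikes[t] + W @ x)
        if (t + 1) % snapshot_every == 0:
            snapshots.append(x.copy())          # one snapshot = one state vector
    return np.concatenate(snapshots)
```

A simple linear readout (for example, scikit-learn's LogisticRegression) would then be fit on these concatenated state vectors, matching the readout step on the slide.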

Properties of LSMs
- Exploit the power of recurrent spiking neural networks.
- The reservoir is created randomly; typically the reservoir itself is not trained.
- Acceptable results typically require creating at least hundreds of candidate liquids.
- When an acceptable reservoir is found, it often transfers well to different problems.
- Acceptable reservoirs work by "effectively separating" different classes of input.

Classification Problems
Frequency Recognition
- 4 input neurons, 5 classes
- Identify different combinations of slow and fast input neurons (see the input sketch below)
Pattern Recognition
- 8 input neurons; 4, 8, and 12 classes
- Identify different spiking patterns based on templates
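
To make the frequency-recognition task concrete, here is a hedged sketch of input generation: each input neuron fires as a Poisson process at either a "slow" or a "fast" rate. The Poisson encoding and the specific rates are assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def frequency_input(fast_mask, n_steps=100, slow_hz=10.0, fast_hz=50.0, dt=0.001):
    """fast_mask: 4 booleans, True where that input neuron is 'fast'.
    Returns (n_steps, 4) binary spike trains drawn from Poisson processes."""
    rates = np.where(fast_mask, fast_hz, slow_hz)     # firing rate per neuron (Hz)
    # a spike occurs in a step with probability rate * dt
    return (rng.random((n_steps, len(rates))) < rates * dt).astype(float)

# One class is one slow/fast combination, e.g. fast neurons 0 and 3:
spikes = frequency_input([True, False, False, True])
```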

Learning Transfer with Traditional LSMs
[Figure: mean accuracy and maximum accuracy]

Training the Reservoir
- Randomly create a reservoir in the traditional sense.
- Adjust the architecture of the reservoir until it can "effectively separate".
- Training is driven by the separation property rather than by error (sketched below).
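
A sketch of the loop this slide describes, assuming the run_liquid() helper from the pipeline sketch above and the separation() helper from the metric sketch below. Fresh random draws stand in for the paper's architectural adjustments, and the threshold is an assumption; the point is that the loop only ever consults separation, never readout error.

```python
import numpy as np

rng = np.random.default_rng(2)

def search_reservoir(inputs, labels, n_res=64, threshold=1.5, max_iters=200):
    """Keep proposing reservoirs until one separates the sample well enough."""
    best_W, best_sep = None, -np.inf
    for _ in range(max_iters):
        # a fresh random draw stands in for adjusting the architecture
        W = rng.normal(0.0, 1.0 / np.sqrt(n_res), (n_res, n_res))
        states = np.stack([run_liquid(s, W=W) for s in inputs])
        sep = separation(states, labels)
        if sep > best_sep:
            best_W, best_sep = W, sep
        if best_sep >= threshold:
            break            # reservoir "effectively separates": stop adjusting
    return best_W, best_sep
```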

Learning Transfer after Reservoir Training
[Figure: mean accuracy and maximum accuracy]

Separation
Inter-class distance: $c_d(t) = \frac{1}{N^2} \sum_{m=1}^{N} \sum_{n=1}^{N} \lVert \mu(O_m(t)) - \mu(O_n(t)) \rVert_2$
Intra-class variance: $c_v(t) = \frac{1}{N} \sum_{m=1}^{N} \rho(O_m(t))$
Separation: $\mathrm{Sep}_\Psi(O(t)) = \frac{c_d(t)}{c_v(t) + 1}$
(N is the number of classes; $O_m(t)$ is the set of state vectors for class m)

Separation
Center of mass for class m: $\mu(O_m(t)) = \frac{1}{|O_m(t)|} \sum_{o_m \in O_m(t)} o_m$
Average variance within class m: $\rho(O_m(t)) = \frac{1}{|O_m(t)|} \sum_{o_m \in O_m(t)} \lVert \mu(O_m(t)) - o_m \rVert_2$
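
Putting the two slides together, a direct numpy transcription of the metric as defined above: mean pairwise distance between class centers of mass, divided by mean intra-class variance plus one.

```python
import numpy as np

def separation(states, labels):
    """Separation of a set of state vectors, following the definitions above.
    states: (n_samples, d) state vectors; labels: (n_samples,) class ids."""
    labels = np.asarray(labels)
    classes = np.unique(labels)
    # center of mass mu(O_m) for each class m
    mu = {m: states[labels == m].mean(axis=0) for m in classes}
    # inter-class distance c_d: mean pairwise distance between centers of mass
    c_d = np.mean([np.linalg.norm(mu[m] - mu[n]) for m in classes for n in classes])
    # intra-class variance c_v: mean of rho(O_m), the average distance of
    # each state vector to its own class center
    rho = [np.linalg.norm(states[labels == m] - mu[m], axis=1).mean() for m in classes]
    c_v = np.mean(rho)
    return c_d / (c_v + 1.0)
```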

Separation
[Figure]

Separation
[Figure: accuracy vs. separation] The correlation coefficient between separation and accuracy is 0.6876.

Separation Driven Synaptic Modification (SDSM)
Problems with liquids:
- Too little distance between centers of mass
- Too much variance within classes
Solutions:
- Strengthen weak synapses, weaken strong synapses
- Strengthen strong synapses, weaken weak synapses
Chaos:
- Increase chaos in the liquid
- Decrease chaos in the liquid

SDSM
- Weight update: [equation]
- Effect of separation: [equation]
- Relative synaptic strength: [equation]
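
The update equations themselves are not recoverable from this transcript, so the following is only an illustrative sketch of separation-driven synaptic scaling, not the authors' rule. It encodes the idea from the previous slide: change synapse strengths to raise chaos when class centers are too close, and to lower chaos when intra-class variance dominates. Which strengthening pattern produces more chaos, and the c_d < c_v trigger, are assumptions.

```python
import numpy as np

def sdsm_step(W, c_d, c_v, lr=0.01):
    """One hedged SDSM-style step on the recurrent weights W
    (assumption: polarizing magnitudes raises chaos, homogenizing lowers it)."""
    mean_mag = np.abs(W).mean()              # baseline for 'relative synaptic strength'
    strong = np.sign(np.abs(W) - mean_mag)   # +1 for strong synapses, -1 for weak ones
    if c_d < c_v:
        # centers of mass too close: strengthen strong / weaken weak synapses
        W = W + lr * strong * np.sign(W)
    else:
        # too much variance within classes: strengthen weak / weaken strong synapses
        W = W - lr * strong * np.sign(W)
    return W
```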

SDSM
- Differentiating classes of input (increase chaos): [equation]
- Activity of neuron i: [equation]
- Decreasing variance within classes (decrease chaos): [equation]

Learning Transfer with SDSM
[Figure: mean accuracy and maximum accuracy]

Learning Transfer with Traditional LSMs (repeated for comparison)
[Figure: mean accuracy and maximum accuracy]

Conclusions
- The reservoir in an LSM can be used effectively on different problems.
- SDSM successfully trains a reservoir without compromising this ability of the reservoir.
- SDSM exhibits learning transfer across:
  - differing numbers of classes
  - naïve translation from one input representation to another

Future Work
- Compare a greater number of problems with greater diversity.
- Experiment with alternative "reservoir-quality-assessing" metrics, such as statistical complexity.
- Provide a large spectrum of training data (from a variety of problems) to train a single reservoir.

Questions