A Neural Net For Terrain Classification

A Neural Net For Terrain Classification
Jackie Soenneker

Overview of SOM
Self-Organizing Map (SOM) neural net:
- Has a grid of neurons
- Each neuron has a weight vector
- For each input vector there is a “winning” neuron
- The winning neuron and its neighbors are adjusted to better match the input
(Figure: numbered grid of neurons)
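The update described above can be sketched in a few lines of NumPy. This is an illustrative sketch, not code from the presentation; the array names, the 4x6 grid, and the simple threshold neighborhood are assumptions.

```python
import numpy as np

def som_step(weights, positions, x, lr, neighbor_radius):
    """One SOM update: find the winning neuron for input x, then pull the
    winner and its grid neighbors toward x.

    weights   : (n_neurons, n_features) weight vector per neuron
    positions : (n_neurons, 2) grid coordinates of each neuron
    x         : (n_features,) one input vector
    """
    # Winning neuron = the one whose weight vector is closest to x.
    winner = np.argmin(np.linalg.norm(weights - x, axis=1))
    # Neighborhood = neurons within neighbor_radius of the winner on the grid.
    grid_dist = np.linalg.norm(positions - positions[winner], axis=1)
    in_hood = grid_dist <= neighbor_radius
    # Move the winner and its neighbors toward the input.
    weights[in_hood] += lr * (x - weights[in_hood])
    return weights

# Illustrative setup: a 4x6 grid and 5-feature terrain vectors (both assumptions).
rng = np.random.default_rng(0)
weights = rng.random((24, 5))
positions = np.array([[r, c] for r in range(4) for c in range(6)], float)
weights = som_step(weights, positions, rng.random(5), lr=0.9, neighbor_radius=2.0)
```

In practice som_step would be called once per input vector, cycling over the training set many times.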

Dimensions & Distance Functions
Dimension: how many neurons to use
- Default is 4x6 (24 neurons)
- I’m using twice as many neurons as terrain classes
Distance Function: how far apart are two neurons?
- Link Distance (default) – number of links between the neurons
- Euclidean Distance – straight-line distance between the neurons
- Manhattan Distance – “follow the grid” distance between the neurons’ positions
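For illustration, the three distance functions could be computed roughly as follows. The position tuples and the adjacency dictionary used for link distance are assumptions, not details from the slides.

```python
from collections import deque
import numpy as np

def euclidean_dist(p, q):
    """Straight-line distance between two neuron positions on the grid."""
    return float(np.linalg.norm(np.asarray(p) - np.asarray(q)))

def manhattan_dist(p, q):
    """'Follow the grid' distance: sum of absolute coordinate differences."""
    return float(np.sum(np.abs(np.asarray(p) - np.asarray(q))))

def link_dist(adjacency, i, j):
    """Number of links between neurons i and j: shortest path length in the
    neuron-connection graph, found with a breadth-first search."""
    seen, queue = {i}, deque([(i, 0)])
    while queue:
        node, d = queue.popleft()
        if node == j:
            return d
        for nxt in adjacency[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, d + 1))
    return float("inf")

# Tiny example: three neurons in a row, connected 0-1-2.
adjacency = {0: [1], 1: [0, 2], 2: [1]}
print(euclidean_dist((0, 0), (2, 0)))   # 2.0
print(manhattan_dist((0, 0), (1, 1)))   # 2.0
print(link_dist(adjacency, 0, 2))       # 2
```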

Topologies
Topology: how are the neurons connected?
- Topology doesn’t seem to affect learning very much
- Hextop is the default and the one I’m using
- Options: Hextop, Gridtop, Randtop
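A rough sketch of how the three layouts differ, assuming 2-D neuron positions; the spacing formulas are illustrative, not the toolbox's exact ones.

```python
import numpy as np

def gridtop(rows, cols):
    """Rectangular grid: neurons at integer (row, col) positions."""
    return np.array([[c, r] for r in range(rows) for c in range(cols)], float)

def hextop(rows, cols):
    """Hexagonal grid: odd rows shifted half a unit and rows packed closer
    together, so each neuron has up to six equidistant neighbors."""
    pos = []
    for r in range(rows):
        for c in range(cols):
            pos.append([c + 0.5 * (r % 2), r * np.sqrt(3) / 2])
    return np.array(pos)

def randtop(rows, cols, seed=0):
    """Random layout: neuron positions scattered irregularly over the plane."""
    rng = np.random.default_rng(seed)
    return rng.random((rows * cols, 2)) * [cols, rows]
```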

Learning Phases
SOM learning has two phases.
Ordering Phase (first phase):
- Large learning rate
- Quickly fits the neurons to the general distribution of the input space
- Two Ordering Phase parameters:
  - Learning rate – 0.9 (default)
  - Number of steps – 1,000 (default); 2,000 works better
- The number of OP steps should probably grow proportionally to the number of neurons

Learning Phases cont.
Tuning Phase (second phase):
- Small learning rate
- Fine-tunes the neurons to fit the input space more precisely
- One Tuning Phase parameter:
  - Learning rate – 0.02 (default)
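Taken together, the two phases amount to a schedule for the learning rate and the neighborhood size. Below is a sketch of such a two-phase training loop; the exact decay formulas, the random sampling of inputs, and the 8,000 tuning steps are assumptions modeled on this description, not details from the slides.

```python
import numpy as np

def train_som(weights, positions, data,
              ordering_lr=0.9, ordering_steps=2000,
              tuning_lr=0.02, tuning_steps=8000,
              init_radius=3.0):
    """Two-phase SOM training.

    Ordering phase: large learning rate and wide neighborhood, decayed quickly
    so the map roughly matches the input distribution.
    Tuning phase: small, slowly decaying learning rate and a tight neighborhood
    to fine-tune each neuron.
    """
    rng = np.random.default_rng(0)
    total = ordering_steps + tuning_steps
    for step in range(1, total + 1):
        x = data[rng.integers(len(data))]            # pick a random input vector
        if step <= ordering_steps:                   # ordering phase
            frac = 1.0 - step / ordering_steps
            lr = tuning_lr + (ordering_lr - tuning_lr) * frac
            radius = 1.0 + (init_radius - 1.0) * frac
        else:                                        # tuning phase
            lr = tuning_lr * ordering_steps / step
            radius = 1.0
        # Winner-take-most update (same rule as the earlier sketch).
        winner = np.argmin(np.linalg.norm(weights - x, axis=1))
        hood = np.linalg.norm(positions - positions[winner], axis=1) <= radius
        weights[hood] += lr * (x - weights[hood])
    return weights
```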

Summary
- Dimension: twice as many neurons as terrain classes
- Distance Function: Link Distance (default)
- Topology: Hextop (default)
- OP Learning Rate: 0.9 (default)
- OP Steps: 2,000 (probably increase with more nodes)
- TP Learning Rate: 0.02 (default)