TEMPLATE DESIGN © 2008 www.PosterPresentations.com

Self Organized Neural Networks Applied to Animal Communication

Objective

The main objective of this project is to extract features, using MATLAB, from animal sounds produced in situations of mating, danger, and foraging. An application was developed in Java to train several SOFMs on those data, allowing us to discern the animal sounds and the situations in which they are used. We have made the following contributions in this project:

On self-organizing feature maps:
- Input presentation order
- A history of winning neurons
- Taking account of the context in which the sounds are made
- Correlation as a means of finer-level clustering
- Sammon's projection, introduced to provide better initialization of the SOFM

On feature selection and ordering:
- Matrix features for the joint time-frequency domain
- Normalization of input data for better efficiency
- Pre-processing of the input to remove noise

On animal sounds:
- Humpback whale
- Bottlenose dolphin
- Coyote

Performance

The addition of a history to the SOM greatly improved performance. With a history, the SOM was able to reach a stable state within 50 iterations, and the map preserved the topological differences of the input space. Without a history, the SOM took much longer than 50 iterations (100+) to reach a stable state, and the overall quality was lower. Shown below are maps that display the quality and topology of the SOM by computing the average difference each neuron has with its neighbors.

Conclusion

In conclusion, self-organizing feature maps were successful in categorizing and identifying the danger, hunger, and mating calls of coyotes, humpback whales, and bottlenose dolphins. The history function greatly improves performance during the training phase.
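The neighbor-difference quality maps described above can be computed directly from a trained weight grid. The sketch below is a minimal Python illustration (the project's application was written in Java; the grid layout and the use of Euclidean differences here are assumptions):

```python
import numpy as np

def neighbor_difference_map(weights):
    """Average weight-vector difference between each neuron and its
    4-connected grid neighbors (a U-matrix-style quality/topology map).

    weights: array of shape (rows, cols, dim) of SOFM weight vectors.
    Returns an array of shape (rows, cols) of average neighbor differences;
    low values indicate a smooth, topology-preserving region of the map.
    """
    rows, cols, _ = weights.shape
    quality = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            diffs = []
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    diffs.append(np.linalg.norm(weights[r, c] - weights[nr, nc]))
            quality[r, c] = np.mean(diffs)
    return quality
```

Plotting this array as an image gives the kind of quality map shown on the poster for the with-history and without-history cases.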
Correlation increases the success rate of identifying unknown sounds by 15-20%. The blended distance not only significantly improves the computational time of our self-organizing feature maps but can also be used in a number of other applications. We are currently writing three papers for publication: the main paper will generalize our findings and be submitted to IEEE, while the other papers will specialize in either the SOFM contributions or animal communication and be submitted to conferences.

Background

A Self-Organizing Feature Map (SOFM) is a type of unsupervised neural network designed to map a high-dimensional space onto a low-dimensional space that can be easily understood and visualized. The map consists of input nodes to which high-dimensional feature vectors are presented. SOFMs are trained on exemplary patterns in two stages: a competitive stage and a weight-update stage.

Matthew Bradley (1), Kay Jantharasorn (2), Keith Jones (1); Advisor: Dr. Mohamed Zohdy (3)
(1) Oakland University; (2) University of Michigan-Flint; (3) Department of Electrical and Computer Engineering, Oakland University

Acknowledgements

We would like to thank the National Science Foundation and Oakland University for giving us this great opportunity to explore and participate in research and learn more about research careers. We would also like to thank our advisor, Dr. Mohamed Zohdy, for his guidance throughout this research, and Doug Hunter for his scientific journals, articles related to humpback whales, and humpback whale data.

Metrics between the Euclidean and Manhattan distances are useful for obtaining best matching units when the data has a rotational bias.
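The two training stages described under Background, a competitive stage that selects the best matching unit and a weight-update stage that pulls neighboring weights toward the input, can be sketched as below. The Gaussian neighborhood and the exponential decay schedules are illustrative assumptions; the project's actual Java implementation, history function, and parameter values are not given in this transcript.

```python
import numpy as np

def som_train_step(weights, x, t, alpha0=0.5, sigma0=2.0, tau=50.0):
    """One SOFM iteration: competitive stage, then weight-update stage.

    weights: (rows, cols, dim) grid of neuron weight vectors
    x:       (dim,) input feature vector
    t:       current iteration (drives the decay schedules)
    """
    rows, cols, _ = weights.shape

    # Competitive stage: best matching unit (BMU) by Euclidean distance.
    dists = np.linalg.norm(weights - x, axis=2)
    bmu = np.unravel_index(np.argmin(dists), dists.shape)

    # Decaying learning rate alpha(t) and neighborhood width (assumed schedules).
    alpha = alpha0 * np.exp(-t / tau)
    sigma = sigma0 * np.exp(-t / tau)

    # Weight-update stage: W <- W + alpha(t) * beta(t) * (X - W),
    # where beta(t) is a Gaussian over grid distance from the BMU.
    rr, cc = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    grid_d2 = (rr - bmu[0]) ** 2 + (cc - bmu[1]) ** 2
    beta = np.exp(-grid_d2 / (2.0 * sigma ** 2))
    weights += alpha * beta[:, :, None] * (x - weights)
    return bmu
```

Repeatedly presenting the feature vectors to this step, with alpha(t) and the neighborhood shrinking over time, lets the grid settle into a stable, topology-preserving state.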
A blended Euclidean-Manhattan distance metric, controlled by a parameter lambda, is proposed to approximate the traditional Lp metric (for p between 1 and 2) at considerably lower computational cost.

Distance Selection

There are several different ways to compute distance (shown to the right on the poster):
- 1-norm (Manhattan)
- 2-norm (Euclidean)
- p-norm
- Infinity norm

References

Kohonen, T. Self-Organizing Maps. New York: Springer-Verlag, 1997.
Payne, Roger S., and Scott McVay. "Songs of Humpback Whales." Science 173 (1971).
Germano, Tom. 23 March, June.
Hunter, Doug. Professor Emeritus, Biology, Oakland University.
Nsour, Ahmad, and Mohamed Zohdy. "Self Organized Learning Applied to Global Positioning System (GPS) Data." Oakland University.
FindSounds. Comparisonics. 2 June.

When testing the accuracy of the maps, it was found that the SOFM using features from the frequency domain performed best overall at approximately 74.6%, followed by the joint time-frequency domain at 63.5% and the time domain at 55.5%.

Weight-update notation: W_ij are the neuron weights, alpha(t) is the learning rate, beta(t) is the neighborhood function, and X is the input vector.

Abstract

In this project, self-organizing feature maps are trained to categorize animal communication sounds into danger, hunger, and mating calls for humpback whales, bottlenose dolphins, and coyotes. Features are extracted in the time domain, frequency domain, and joint time-frequency domain from audio files of animal communication. Unknown calls are then fed into the map to identify the sound. Several contributions have been made to improve the quality, speed, and accuracy of the maps. Correlation has been implemented as the activation function to improve the accuracy of identifying sounds. A history function that updates previous winning nodes has been implemented to decrease the time necessary to complete the training phase. Sammon mapping has also been proposed to provide better initialization of the weight vectors.
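The blended metric's exact formula is an image lost from this transcript. A minimal sketch, assuming a convex combination of the 1-norm and 2-norm weighted by lambda (an assumption, not the poster's stated form), is:

```python
import numpy as np

def blended_distance(x, w, lam):
    """Convex blend of Manhattan (1-norm) and Euclidean (2-norm) distance.

    lam = 1.0 gives pure Manhattan, lam = 0.0 pure Euclidean; intermediate
    values roughly track the Lp metric for p between 1 and 2, without the
    fractional powers that make a true p-norm computationally expensive.
    """
    d = np.abs(np.asarray(x) - np.asarray(w))
    return lam * d.sum() + (1.0 - lam) * np.sqrt((d ** 2).sum())
```

Because it needs only absolute differences, one square root, and no fractional exponents, a blend like this is much cheaper per comparison than evaluating a genuine p-norm for every candidate neuron.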
In addition, several types of distance metrics are tested for finding best matching units, including a blended Manhattan-Euclidean distance that has been found useful.

Feature Selection

(Figure panels: Frequency Domain; Frequency Domain, Filtered; Time Domain; Time/Frequency.)

We extracted data from the frequency domain, time domain, and joint time-frequency domain of each sound from each animal. We filtered the frequency domain to minimize noise. Several features from the time and frequency domains were selected as vectors, while the joint time-frequency domain was selected as a matrix.
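As a rough sketch of this extraction step (the actual MATLAB features, window sizes, and filters are not listed in this transcript, so the descriptors below are illustrative assumptions), time- and frequency-domain features for one clip might look like:

```python
import numpy as np

def extract_features(signal, rate):
    """Toy time- and frequency-domain features for one audio clip.

    signal: 1-D array of audio samples; rate: sample rate in Hz.
    The specific descriptors are illustrative, not the project's features.
    """
    signal = np.asarray(signal, dtype=float)

    # Time domain: mean energy and zero-crossing rate.
    energy = float(np.mean(signal ** 2))
    zcr = float(np.mean(np.abs(np.diff(np.sign(signal)))) / 2.0)

    # Frequency domain: magnitude spectrum of the real FFT.
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / rate)
    peak_freq = float(freqs[np.argmax(spectrum)])
    centroid = float((freqs * spectrum).sum() / spectrum.sum())

    return np.array([energy, zcr, peak_freq, centroid])
```

Feature vectors like these, after normalization, are what would be presented to the SOFM's input nodes; the joint time-frequency case would instead produce a matrix (e.g. a spectrogram).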