Software Simulation of a Self-organizing Learning Array System. Janusz Starzyk & Zhen Zhu, School of EECS, Ohio University.

Theme. SOLAR = Self-organizing Learning Array. Introduction to SOLAR; software simulation; performance of SOLAR.

Introduction to SOLAR. SOLAR: an artificial neural network (ANN) with a self-organizing structure and re-configurable hardware.

Introduction to SOLAR. Basic frame of SOLAR: a fixed lattice of processing units (neurons). Self-organization: interconnections among the units are refined during learning.
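Not on the original slides: a minimal Python sketch of one way such a fixed lattice with learnable connections could be represented. All names here (Neuron, SolarArray, inputs, operation, threshold) are hypothetical, not the authors' code.

from dataclasses import dataclass, field

@dataclass
class Neuron:
    inputs: list = field(default_factory=list)  # indices of the signals this neuron reads
    operation: str = "IDENT"                    # transformation chosen during learning
    threshold: float = 0.0                      # learned cut point

class SolarArray:
    def __init__(self, n_features, n_neurons):
        # Fixed lattice: every neuron exists up front; self-organization only
        # changes each neuron's input connections and parameters.
        self.neurons = [Neuron() for _ in range(n_neurons)]
        self.n_features = n_features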

Software Simulation - SOLAR. Simulation tasks: pre-processing of input data to SOLAR; behavior of a single neuron; network structure; classification; assembly of various networks.

Software Simulation - SOLAR. Inputs & outputs of SOLAR.

Software Simulation - SOLAR. Real-world input data features X: incomplete set – data missing; symbolic – unacceptable to neural computation; unbalanced weighting – needs to be equalized. Pre-processing: calculate default substitutes for missing data; assign continuous values to all symbols; rescale.
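The following is a rough Python sketch of these pre-processing steps, under assumptions of my own (rank codes for symbols, a [0, 1] target range); it is not the authors' implementation.

import numpy as np

def encode_and_rescale(columns, lo=0.0, hi=1.0):
    """columns: list of feature columns; symbolic columns get rank codes."""
    out = []
    for col in columns:
        if any(isinstance(v, str) for v in col):
            # Assign continuous values to all symbols, e.g. by sorted rank.
            codes = {s: i for i, s in enumerate(sorted(set(col)))}
            col = [codes[v] for v in col]
        col = np.asarray(col, dtype=float)
        span = col.max() - col.min()
        # Rescale each feature to the common range [lo, hi].
        col = lo + (hi - lo) * (col - col.min()) / (span if span else 1.0)
        out.append(col)
    return np.stack(out, axis=1)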

Software Simulation - SOLAR. Missing data problem: find defaults for the missing items in each individual input so as to minimize the Mahalanobis distance. Separate the known items X_k and the missing items X_m, X = [X_k, X_m]. Compute the covariance matrix and its inverse. Partition the matrix. Compute the default X_m.
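A minimal Python sketch of this recipe: the defaults for the missing items X_m are chosen to minimize the Mahalanobis distance, using the partitioned inverse covariance matrix. Variable names are mine; mu and cov are assumed to be estimated from the training data.

import numpy as np

def default_missing(x, missing, mu, cov):
    """x: one input vector; missing: boolean mask of the missing items X_m."""
    m = np.asarray(missing)
    k = ~m                              # mask of the known items X_k
    P = np.linalg.inv(cov)              # inverse of the covariance matrix
    # Partition the inverse matrix into blocks over missing/known items.
    P_mm = P[np.ix_(m, m)]
    P_mk = P[np.ix_(m, k)]
    # Minimizing (x - mu)^T P (x - mu) over X_m gives the default values:
    x = np.array(x, dtype=float)
    x[m] = mu[m] - np.linalg.solve(P_mm, P_mk @ (x[k] - mu[k]))
    return x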

Software Simulation - SOLAR. Inputs & outputs of a single SOLAR neuron.

Software Simulation - SOLAR. Behavior of a single SOLAR neuron: the output is a selected function of the input(s). Unary input operations: O = Y(I_1) or O = Y(I_2). Binary input operations: O = Y(I_1, I_2). All the operations are redesigned arithmetic operations: linear or non-linear, with a fixed input/output range.

Software Simulation - SOLAR. Unary input operations: identity function Y = IDENT(x); half function Y = HALF(x); logarithm function Y = NLOG2(x); exponential function Y = NEXP2(x). Binary input operations: addition function Y = NADD(x1, x2); subtraction function Y = NSUB(x1, x2).
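The exact formulas are not reproduced on this slide, so the Python block below is only a guess at plausible bounded versions of these operations, assuming an input/output range of [0, R]; none of these definitions come from the source.

import numpy as np

R = 255.0  # assumed input/output range, not specified in the source

IDENT = lambda x: x                                          # identity
HALF  = lambda x: x / 2.0                                    # halving
NLOG2 = lambda x: R * np.log2(1.0 + x) / np.log2(1.0 + R)    # scaled logarithm
NEXP2 = lambda x: R * (2.0 ** (x / R) - 1.0)                 # scaled exponential
NADD  = lambda x1, x2: (x1 + x2) / 2.0                       # bounded addition
NSUB  = lambda x1, x2: abs(x1 - x2)                          # bounded subtraction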

Software Simulation - SOLAR. Example: plot of Y = NLOG2(x).

Software Simulation - SOLAR. How does a neuron learn from training data and process testing data? Each neuron chooses an operation and a threshold; the whole input space is cut into 2 parts (subspaces).
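A minimal sketch (not the authors' algorithm) of this step in Python: the neuron evaluates each candidate operation on the training data and keeps the operation/threshold pair whose cut gives the purest two subspaces.

import numpy as np

def train_neuron(values_per_op, labels):
    """values_per_op: dict mapping operation name -> transformed training values;
       labels: integer class labels of the training points."""
    best = None
    for op, v in values_per_op.items():
        for t in np.unique(v):
            left, right = labels[v <= t], labels[v > t]
            if len(left) == 0 or len(right) == 0:
                continue
            # Score the cut by how many points the majority class explains
            # in each of the two subspaces.
            purity = (np.bincount(left).max() + np.bincount(right).max()) / len(labels)
            if best is None or purity > best[0]:
                best = (purity, op, t)
    return best  # (score, chosen operation, chosen threshold)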

Software Simulation - SOLAR. Neuron learning: neurons learn from each other and generate more complicated cuts of the input space.

Software Simulation - SOLAR. Neuron learning: in order to effectively separate different classes, a neuron may choose from different configuration options. For each processing unit, inputs, a clock, a function and 1 threshold are selected.

Software Simulation - SOLAR. Classification: on each individual testing input data point, some or all of the neurons are active in classification. Neurons are activated by input clocks. Each neuron saves classification probabilities based on its subspace division. Example:

          subspace 1   subspace 2
class 1   60%          10%
class 2   10%          80%
class 3   30%          10%
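As a sketch of one possible representation (names are mine), each neuron could store the per-subspace class probabilities estimated from the training points that fall on either side of its threshold:

import numpy as np

def subspace_probabilities(values, labels, threshold, n_classes):
    probs = []
    for mask in (values <= threshold, values > threshold):
        counts = np.bincount(labels[mask], minlength=n_classes).astype(float)
        probs.append(counts / max(counts.sum(), 1.0))
    return np.vstack(probs)  # row 0: subspace 1, row 1: subspace 2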

Software Simulation - SOLAR. Classification: for each testing input data point, some neurons have sufficient knowledge from learning and become eligible; they vote on the classification of this point.
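A minimal voting sketch under my own assumptions (the eligibility rule here, a simple training-count cutoff, is a placeholder for whatever criterion the simulator actually uses):

import numpy as np

def classify_point(neuron_probs, neuron_support, n_classes, min_support=5):
    """neuron_probs: the probability row each active neuron selected for this point;
       neuron_support: number of training points behind that row."""
    votes = np.zeros(n_classes)
    for p, n in zip(neuron_probs, neuron_support):
        if n >= min_support:   # neuron has sufficient knowledge: it is eligible
            votes += p
    return int(np.argmax(votes))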

Software Simulation - SOLAR. Classification: several independent SOLAR networks form an ensemble to vote on the same problem.
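The ensemble step can be sketched as a plain majority vote over the independently trained networks (again an illustration, not the authors' code):

from collections import Counter

def ensemble_vote(network_predictions):
    """network_predictions: one predicted class label per SOLAR network."""
    return Counter(network_predictions).most_common(1)[0][0]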

Performance Evaluation - SOLAR. An Australian credit card data set [1] is used to evaluate SOLAR performance: 14 input features, 690 individuals, 2 classes. This data set is a typical classification problem and has been used to test other classic classification algorithms [2].

Performance Evaluation - SOLAR. Divide the data set randomly into 10 groups. Run the simulation 10 times, each time using 1 group for testing and the remaining 9 for training. Average the resulting classification rates. Both a single SOLAR network and a SOLAR ensemble were tested.
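A sketch of this evaluation protocol in Python; train_and_test stands in for the SOLAR simulator and is assumed to return the classification rate on the held-out group.

import numpy as np

def ten_fold_rate(X, y, train_and_test, seed=0):
    idx = np.random.default_rng(seed).permutation(len(y))
    folds = np.array_split(idx, 10)          # 10 random groups
    rates = []
    for i, test in enumerate(folds):
        train = np.concatenate([f for j, f in enumerate(folds) if j != i])
        rates.append(train_and_test(X[train], y[train], X[test], y[test]))
    return float(np.mean(rates))             # average classification rate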

Performance Evaluation - SOLAR. Miss rates of the compared methods:

Method    Miss Rate    Method     Miss Rate
CAL5      0.131        Naivebay   0.151
DIPOL92   -            CASTLE     0.148
Logdisc   0.141        ALLOC80    -
SMART     0.158        CART       0.145
C4.5      -            NewID      0.181
IndCART   0.152        CN2        -
Bprop     0.154        LVQ        0.197
Discrim   0.141        Kohonen    -
RBF       0.145        Quadisc    0.207
Baytree   0.171        Default    0.440
ITrule    0.137
AC2       0.181        SOLAR      0.183
k-NN      0.181        ensemble   0.135

Performance Evaluation - SOLAR. Conclusion: although SOLAR was not designed for any particular purpose, it works well on several classification problems. SOLAR behaviors are observed in this simulation.

References
[1] Y. Liu, X. Yao and T. Higuchi, "Evolutionary Ensembles with Negative Correlation Learning," IEEE Trans. on Evolutionary Computation, Vol. 4, No. 4, Nov. 2000.
[2] D. Michie, D. J. Spiegelhalter, and C. C. Taylor, "Machine Learning, Neural and Statistical Classification," London, U.K.: Ellis Horwood Ltd., 1994.