OPTIMIZED INTERCONNECTIONS IN PROBABILISTIC SELF-ORGANIZING LEARNING
Janusz Starzyk, Mingwei Ding, Haibo He
School of EECS, Ohio University, Athens, OH


1 OPTIMIZED INTERCONNECTIONS IN PROBABILISTIC SELF-ORGANIZING LEARNING
Janusz Starzyk, Mingwei Ding, Haibo He
School of EECS, Ohio University, Athens, OH
February 14-16, 2005, Innsbruck, Austria

2 OUTLINE
- Introduction
- Self-organizing neural network structure
- Optimal and fixed input weights
  - Optimal weights
  - Binary weights
- Simulation results
  - Financial data analysis
  - Power quality classification
- Hardware platform development
- Conclusion

3 Self-Organizing Learning Array (SOLAR)
SOLAR characteristics:
- Entropy-based self-organization
- Dynamic reconfiguration
- Local and sparse interconnections
- Online input selection
- Feature neurons and merging neurons

4 SOLAR Hardware Structure

5 Neuron Structure

6 Self-organization
Each neuron has the ability to self-organize according to the information it receives:
- Functionality: choose internal arithmetic and logic functions
- Input selection: choose input connections

7 Input selection
- Merging neurons receive inputs from previous layers
- The probability that the received information is correct varies from input to input
- Two input selection strategies: random selection and greedy selection
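The slide names the two strategies without detail. A minimal sketch of the two policies, assuming each candidate input carries an estimated correctness probability and that greedy selection ranks candidates by that estimate (an assumption; in SOLAR the greedy criterion is entropy-based):

```python
import random

def random_selection(candidates, k, seed=0):
    # Pick k candidate inputs uniformly at random.
    rng = random.Random(seed)
    return rng.sample(list(candidates), k)

def greedy_selection(candidates, accuracy, k):
    # Pick the k candidates with the highest estimated correctness.
    return sorted(candidates, key=lambda c: accuracy[c], reverse=True)[:k]

inputs = ["n1", "n2", "n3", "n4"]
acc = {"n1": 0.55, "n2": 0.80, "n3": 0.62, "n4": 0.71}  # hypothetical estimates
greedy_selection(inputs, acc, 2)  # ["n2", "n4"]
```

Random selection is cheap and hardware-friendly; greedy selection trades extra bookkeeping for better-informed merging neurons.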

8 Input Weighting
Weighted signal merging.
(Figure: input signals S_1, S_2, ..., S_n, each carrying noise n_0, merged through weights W_1, W_2, ..., W_n into the output.)

9 Input Weighting (cont'd)
Objective function: maximize the energy/noise ratio. Setting the gradient of the objective function to 0 for i = 1, 2, ..., n yields the optimum weights.
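The slide's formulas were embedded as images. A sketch of the standard maximal-ratio result this setup points to, assuming a linear merge y = sum_i w_i S_i with independent noise energies n_i^2 (the slides use a common n_0):

```latex
R(w) = \frac{\left(\sum_{i=1}^{n} w_i S_i\right)^2}{\sum_{i=1}^{n} w_i^2 n_i^2},
\qquad
\frac{\partial R}{\partial w_i} = 0
\;\Rightarrow\;
w_i \propto \frac{S_i}{n_i^2}, \quad i = 1, 2, \dots, n.
```

Each weight grows with its input's signal strength and shrinks with its noise energy, so strong, clean inputs dominate the merged output.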

10 Optimum Input Weighting
Two-class classification problem:
- Each neuron receives a recognition rate p from the previous layer
- When p = 0.5 the input carries the least information: the data can belong to either class
- When p = 0 or p = 1 the input carries the most information: the class is known with certainty
- Define the signal/noise ratio in terms of p

11 Optimum Input Weighting (cont'd)
Using the optimization result, solve for the optimum weight and the weighted output. The weighted output represents our belief that the result belongs to class 1.

12 Optimum Input Weighting (cont'd)
Example: consider 3 inputs to a neuron with correct classification probabilities p_i. The estimated output probability p_out for various input probabilities is given in the table on the slide.
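The slide's p_out table was an image. As an illustration of how three input probabilities can be merged into one output probability, assuming independent inputs combined by log-odds weighting (a standard choice for this setup; the paper's exact rule may differ):

```python
import math

def combine(ps):
    # Sum the log-odds of each input's class-1 probability,
    # then map the total back to a probability.
    z = sum(math.log(p / (1 - p)) for p in ps)
    return 1 / (1 + math.exp(-z))

combine([0.5, 0.5, 0.5])  # 0.5: uninformative inputs stay uninformative
combine([0.9, 0.9, 0.9])  # close to 1: agreeing confident inputs reinforce
```

This reproduces the qualitative behavior of slide 10: p = 0.5 inputs contribute nothing, while inputs near 0 or 1 dominate the output belief.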

13 Binary Weighting
- A simplified selection algorithm is desired for hardware implementation
- Choose 0 or 1 as the weight for every connected input
- The resulting equation can be used to study the effect of adding or removing connections of different signal strength

14 Binary Weighting (cont'd)
- A stronger connection has probability P_max; a weaker connection has probability P_mix
- Criterion for adding the weaker connection
(Figure: gain of information P_comb for different P_max and P_mix, with the threshold for adding a new connection marked.)

15 Binary Weighting (cont'd)
From the previous results, the selection criterion for binary weighting can be established.
(Figure: threshold for adding a weaker connection; for example, with P_max = 0.69 the weaker connection is added only when P_mix > 0.60, and otherwise P_comb = P_max.)
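The decision rule recoverable from this slide can be sketched as follows; the threshold value is the single pair quoted for P_max = 0.69, and the rest of the threshold curve lives in the slide's figure:

```python
def add_weaker_connection(p_mix, threshold):
    # Add the weaker connection (binary weight 1) only if its accuracy
    # clears the threshold set by the stronger connection's P_max.
    return p_mix > threshold

# Pair quoted on the slide: for P_max = 0.69 the threshold is 0.60.
add_weaker_connection(0.65, threshold=0.60)  # True: add the connection
add_weaker_connection(0.55, threshold=0.60)  # False: keep P_comb = P_max
```

A hard threshold like this is what makes the scheme hardware-friendly: no multipliers are needed, only a comparison per candidate connection.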

16 Simulation results
Case I: Prediction of financial performance
- Based on the S&P Research Insight database
- More than 10,000 companies included
- Training and testing on a 3-year period
- 192 features extracted
- Kernel PCA used to reduce the 192 features to 13-15
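The slide mentions kernel PCA only by name. A minimal numpy sketch of RBF-kernel PCA reducing a feature matrix to a handful of components (the gamma value and component count here are illustrative, not the paper's settings):

```python
import numpy as np

def kernel_pca(X, n_components=14, gamma=0.01):
    # RBF (Gaussian) kernel matrix between all training rows.
    sq = np.sum(X * X, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    # Center the kernel in feature space.
    n = len(X)
    J = np.ones((n, n)) / n
    Kc = K - J @ K - K @ J + J @ K @ J
    # Keep the top eigenvectors; scale them into projection scores.
    w, v = np.linalg.eigh(Kc)
    idx = np.argsort(w)[::-1][:n_components]
    return v[:, idx] * np.sqrt(np.maximum(w[idx], 1e-12))

X = np.random.default_rng(0).normal(size=(50, 192))  # 192 raw features
Z = kernel_pca(X, n_components=14)
Z.shape  # (50, 14)
```

Reducing 192 correlated financial ratios to 13-15 nonlinear components keeps the SOLAR input layer small while preserving most of the discriminative structure.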

17 Simulation results (cont'd)
Training and testing data structure and test results: classification performance by test year (tables shown on the slide).

18 Simulation results (cont'd)
Case II: Power quality disturbance classification problem
(Photos: people spill out onto Madison Avenue in New York after the blackout hit, 4:00 pm, August 14, 2003, CNN report; cars stopped about three-quarters of the way up the first hill of the Magnum XL200 ride at Cedar Point Amusement Park in Sandusky, Ohio, August 15, 2003, CNN report.)
THE COST: According to the North American Electric Reliability Council (NERC)

19 Simulation results (cont'd)
Formulation of the problem:
- Wavelet multiresolution analysis (MRA) is used for feature vector construction
- 7-class classification problem: undisturbed sinusoid (normal), swell, sag, harmonics, outage, sag with harmonic, swell with harmonic
- 200 cases of each class were generated for training and another 200 cases for testing
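The MRA feature construction can be illustrated with a pure-numpy Haar decomposition: the feature vector collects the energy of the detail coefficients at each scale plus the final approximation energy (the authors' wavelet and feature choices may differ):

```python
import numpy as np

def haar_mra_features(x, levels=3):
    # One feature per scale: energy of the Haar detail coefficients,
    # plus the energy of the final approximation.
    a = np.asarray(x, dtype=float)
    feats = []
    for _ in range(levels):
        if len(a) % 2:
            a = a[:-1]
        d = (a[0::2] - a[1::2]) / np.sqrt(2.0)  # detail at this scale
        a = (a[0::2] + a[1::2]) / np.sqrt(2.0)  # approximation for next scale
        feats.append(float(np.sum(d * d)))
    feats.append(float(np.sum(a * a)))
    return feats

t = np.linspace(0, 1, 256, endpoint=False)
sig = np.sin(2 * np.pi * 50 * t)          # undisturbed sinusoid
feats = haar_mra_features(sig, levels=4)  # 5-element feature vector
```

Because the Haar steps are orthonormal, the features partition the signal energy across scales, which is exactly what separates harmonics, sags, and swells from the undisturbed sinusoid.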

20 Simulation results (cont'd)
Reference [16]: T. K. Abdel-Galil et al., "Power Quality Disturbance Classification Using the Inductive Inference Approach," IEEE Transactions on Power Delivery, vol. 19, no. 4, October 2004.

21 Hardware Development
Xilinx Virtex XCV1000 FPGA

22 Conclusion
- Input selection strategy
- Optimum weighting scheme theory
- Simple binary weighting for practical use
- Search criteria for useful connections
- Application studies
- Hardware platform design

23 Questions?