

1 OPTIMIZED INTERCONNECTIONS IN PROBABILISTIC SELF-ORGANIZING LEARNING
Janusz Starzyk, Mingwei Ding, Haibo He
School of EECS, Ohio University, Athens, OH
February 14-16, 2005, Innsbruck, Austria

2 OUTLINE
- Introduction
- Self-organizing neural network structure
- Optimal and fixed input weights
  - Optimal weights
  - Binary weights
- Simulation results
  - Financial data analysis
  - Power quality classification
- Hardware platform development
- Conclusion

3 Self-Organizing Learning Array (SOLAR)
SOLAR characteristics:
- Entropy-based self-organization
- Dynamic reconfiguration
- Local and sparse interconnections
- Online input selection
- Feature neurons and merging neurons

4 SOLAR Hardware Structure

5 Neuron Structure

6 Self-organization
- Each neuron has the ability to self-organize according to received information
- Functionality: choose internal arithmetic and logic functions
- Input selection: choose input connections

7 Input selection
- Merging neurons receive inputs from previous layers
- The probability that the received information is correct varies from input to input
- Two input selection strategies: random selection and greedy selection
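The two selection strategies can be sketched as follows. The correctness probabilities and the selection size k below are illustrative assumptions, not values from the slides; greedy selection is taken to mean picking the inputs whose received information is most likely correct.

```python
import random

def random_selection(candidates, k, rng=random):
    # Random strategy: pick k candidate inputs uniformly at random.
    return rng.sample(list(candidates), k)

def greedy_selection(candidates, correctness, k):
    # Greedy strategy: pick the k inputs with the highest estimated
    # probability of carrying correct information.
    return sorted(candidates, key=lambda c: correctness[c], reverse=True)[:k]

# Hypothetical correctness probabilities reported by previous-layer neurons.
correctness = {"in0": 0.55, "in1": 0.72, "in2": 0.51, "in3": 0.64}
print(greedy_selection(correctness.keys(), correctness, 2))  # ['in1', 'in3']
```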

8 Input Weighting
Weighted signal merging: each input signal S_i carries noise of energy n_0 and is scaled by a weight w_i before merging.
[Figure: n inputs S_1, S_2, ..., S_n, each with noise n_0, merged through weights w_1, w_2, ..., w_n.]

9 Input Weighting (cont'd)
Objective function: maximize the energy/noise ratio of the merged signal.
Setting the gradient of the objective function to zero for i = 1, 2, ..., n yields the optimal weights.
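The slide's equations did not survive extraction. A numeric check of the optimality condition, under the assumption (consistent with the previous slide) that each input carries signal S_i and independent noise of energy n_0: the ratio F(w) = (sum_i w_i S_i)^2 / (n_0^2 sum_i w_i^2) is maximized at w proportional to S. The signal values and noise level below are illustrative.

```python
import numpy as np

def energy_noise_ratio(w, S, n0):
    # F(w) = (sum_i w_i S_i)^2 / (n0^2 * sum_i w_i^2)
    return (w @ S) ** 2 / (n0 ** 2 * (w @ w))

S = np.array([1.0, 2.0, 0.5])   # illustrative signal amplitudes
n0 = 0.3                         # common noise energy (assumption)
w_opt = S                        # stationary point: w proportional to S
                                 # (the scale is irrelevant: F is scale-invariant)

# No random w should beat w_opt (Cauchy-Schwarz bound: F <= S.S / n0^2).
rng = np.random.default_rng(0)
best = energy_noise_ratio(w_opt, S, n0)
for _ in range(1000):
    w = rng.normal(size=3)
    assert energy_noise_ratio(w, S, n0) <= best + 1e-12
print(round(best, 4))  # 58.3333
```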

10 Optimum Input Weighting
- Two-class classification problem
- Each neuron receives a recognition rate p from the previous layer
- p = 0.5 carries the least information: the data can be either class
- p = 0 or p = 1 carries the most information: the class is known for sure
- A signal/noise ratio is defined in terms of p

11 Optimum Input Weighting (cont'd)
Using the optimization result, the optimum weight and the weighted output are obtained; solving for the output probability gives our belief that the result belongs to class 1.

12 Optimum Input Weighting (cont'd)
Example: consider 3 inputs to a neuron with correct classification probabilities p_i. The estimated output probability p_out for various input probabilities is as follows.
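The combination formula on the slide did not survive extraction. One standard rule consistent with the setup (independent inputs, weights proportional to the log-odds of each p_i) is the product rule below; treating it as the slide's exact rule is an assumption, and the specific p_i values are illustrative.

```python
import math

def p_out(ps):
    # Naive-Bayes combination of independent class-1 beliefs p_i:
    #   p_out = prod(p_i) / (prod(p_i) + prod(1 - p_i))
    # Equivalent to summing log-odds weights w_i = ln(p_i / (1 - p_i)).
    num = math.prod(ps)
    den = num + math.prod(1 - p for p in ps)
    return num / den

print(round(p_out([0.7, 0.6, 0.8]), 4))  # 0.9333 -- combined belief beats each input
print(round(p_out([0.5, 0.5, 0.5]), 4))  # 0.5    -- p = 0.5 carries no information
```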

13 Binary weighting
- A simplified selection algorithm is desired for hardware implementation
- Choose 0 or 1 as the weight for every connected input
- The resulting equation can be used to study the effect of adding or removing connections of different signal strength

14 Binary weighting (cont'd)
- Consider a stronger connection with probability P_max and a weaker connection with probability P_mix
- Criterion for adding the weaker connection: the combined probability P_comb must gain information over P_max alone
[Figure: gain of information for different P_max and P_mix; threshold for adding a new connection.]
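The slide's plot and combination equation did not survive extraction. The sketch below shows how such a criterion can be evaluated numerically once a combination rule is fixed: combine P_max with a weaker P_mix and measure the gain of information 1 - H(P_comb) over 1 - H(P_max). The product rule used here is a stand-in assumption; under it any P_mix > 0.5 yields positive gain, so the slide's stricter 0.60 threshold reflects a criterion this transcript does not preserve.

```python
import math

def info(p):
    # Information carried by a belief p: 1 - H(p), with H the binary entropy.
    if p in (0.0, 1.0):
        return 1.0
    return 1.0 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

def combine(p_max, p_mix):
    # Assumed combination rule (product rule for independent inputs).
    num = p_max * p_mix
    return num / (num + (1 - p_max) * (1 - p_mix))

p_max = 0.69  # value quoted on the slide
for p_mix in (0.55, 0.60, 0.65):
    p_comb = combine(p_max, p_mix)
    print(p_mix, round(p_comb, 4), round(info(p_comb) - info(p_max), 4))
```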

15 Binary weighting (cont'd)
From the previous results, a selection criterion for binary weighting can be established: for P_max = 0.69, a weaker connection is added only when P_mix > 0.60.
[Figure: threshold for adding a weaker connection at P_max = 0.69.]

16 Simulation results
Case I: prediction of financial performance
- Based on the S&P Research Insight database
- More than 10,000 companies included
- Training and testing on a 3-year period
- 192 features extracted
- Kernel PCA used to reduce the 192 features to 13-15
(Fig. from http://goldennumber.net/stocks.htm)
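The dimensionality-reduction step can be sketched as a minimal kernel PCA in NumPy. The RBF kernel, its width gamma, the component count, and the random stand-in data are all assumptions; the slides only state that kernel PCA reduced 192 features to 13-15.

```python
import numpy as np

def kernel_pca(X, n_components, gamma=0.01):
    # RBF kernel matrix: K_ij = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    # Center the kernel matrix in feature space.
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    # Project onto the top eigenvectors of the centered kernel.
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    return vecs[:, idx] * np.sqrt(np.abs(vals[idx]))

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 192))      # stand-in: 50 samples x 192 extracted features
Z = kernel_pca(X, n_components=14)  # reduce to ~13-15 features, per the slide
print(Z.shape)                       # (50, 14)
```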

17 Simulation results (cont'd)
Training and testing data structure: [figure not recovered]
Test results:

Test year    2001    2002    2003
Performance  0.5846  0.5962  0.5577

18 Simulation results (cont'd)
Case II: power quality disturbance classification problem
- People spill out onto Madison Avenue in New York after the blackout hit (4:00 pm, August 14, 2003, CNN report)
- Cars stopped about three-quarters of the way up the first hill of the Magnum XL-200 ride at Cedar Point Amusement Park in Sandusky, Ohio (August 15, 2003, CNN report)
- The cost, according to the North American Electric Reliability Council (NERC): [figure not recovered]

19 Simulation results (cont'd)
Formulation of the problem:
- Wavelet multiresolution analysis (MRA) is used for feature vector construction
- 7-class classification problem: undisturbed sinusoid (normal); swell; sag; harmonics; outage; sag with harmonic; swell with harmonic
- 200 cases of each class were generated for training and another 200 cases for testing
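MRA-based feature construction can be sketched with a Haar wavelet decomposition: the energy of the detail coefficients at each level forms the feature vector. The choice of Haar, the decomposition depth, and the synthetic sag signal are assumptions; the slides do not name the wavelet used.

```python
import numpy as np

def haar_mra_features(x, levels):
    # One orthonormal Haar DWT step per level; the feature for each level
    # is the energy of its detail band, plus the final approximation energy.
    x = np.asarray(x, dtype=float)
    feats = []
    for _ in range(levels):
        approx = (x[0::2] + x[1::2]) / np.sqrt(2)
        detail = (x[0::2] - x[1::2]) / np.sqrt(2)
        feats.append(float(np.sum(detail**2)))
        x = approx
    feats.append(float(np.sum(x**2)))
    return feats

# Illustrative disturbance: a 60 Hz sinusoid with a voltage sag in the middle.
t = np.arange(1024) / 3840.0
v = np.sin(2 * np.pi * 60 * t)
v[400:600] *= 0.5  # sag
print([round(f, 2) for f in haar_mra_features(v, 5)])
```

Because the Haar steps above are orthonormal, the level energies sum to the total signal energy, so the feature vector is a true energy decomposition across scales.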

20 Simulation results (cont'd)
Reference [16]: T. K. Abdel-Galil et al., "Power Quality Disturbance Classification Using the Inductive Inference Approach," IEEE Transactions on Power Delivery, vol. 19, no. 4, October 2004.

21 Hardware Development: XILINX VIRTEX XCV1000

22 Conclusion
- Input selection strategy
- Optimum weighting scheme theory
- Simple binary weighting for practical use
- Search criteria for useful connections
- Application studies
- Hardware platform design

23 Questions?

