Fourth International Symposium on Neural Networks (ISNN) June 3-7, 2007, Nanjing, China A Hierarchical Self-organizing Associative Memory for Machine Learning.

Presentation transcript:

Fourth International Symposium on Neural Networks (ISNN), June 3-7, 2007, Nanjing, China
A Hierarchical Self-organizing Associative Memory for Machine Learning
Janusz A. Starzyk, Ohio University; Haibo He, Stevens Institute of Technology; Yue Li, O2 Micro Inc.

2/23 Outline
• Introduction
• Associative learning algorithm
• Memory network architecture and operation
• Simulation analysis
• Conclusion and future research

3/23 Introduction: A biological point of view
Memory is a critical component for understanding and developing natural intelligent machines/systems. The question is: how?
Source: "The Computational Brain" by P. S. Churchland and T. J. Sejnowski

4/23 Introduction: self-organizing learning array (SOLAR)
Characteristics:
* Self-organization
* Sparse and local interconnections
* Dynamically reconfigurable
* Online data-driven learning
[Figure: a SOLAR neuron and its connections to other neurons: nearest-neighbour neuron, remote neurons, system clock. ID: information deficiency; II: information index.]

5/23 Introduction: from SOLAR to associative memory (AM)
Characteristics:
• Self-organization
• Sparse and local interconnections
• Feedback propagation
• Information inference
• Hierarchical organization
• Robust and self-adaptive
• Capable of both hetero-associative (HA) and auto-associative (AA) operation
[Figure: SOLAR is feed-forward only; the associative memory adds feed-backward connections to the feed-forward ones.]

6/23 Outline
• Introduction
► Associative learning algorithm
• Memory network architecture and operation
• Simulation analysis
• Conclusion and future research

7/23 Basic learning element
Self-determination of the function value: [equation shown in the slide figure]
An example: [worked example shown in the slide figure]

8/23 Signal strength (SS)
Signal strength (SS) = |signal value − logic threshold|, with SS in the range [0, 1]
• Provides a coherent way to determine when to trigger an association
• Helps to resolve multiple feedback signals
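A minimal sketch (not from the slides) of how signal strength can be computed and used to pick the winning feedback signal; the normalization of signal values to [0, 1] and the 0.5 logic threshold are assumptions here:

```python
def signal_strength(value, threshold=0.5):
    """SS = |signal value - logic threshold|; values assumed normalized to [0, 1]."""
    return abs(value - threshold)

def resolve_feedback(feedback_values, threshold=0.5):
    """Resolve multiple feedback signals by keeping the one with the largest SS."""
    return max(feedback_values, key=lambda v: signal_strength(v, threshold))

# Example: of three conflicting feedback values, the one farthest from the
# threshold (i.e., the strongest) wins.
print(resolve_feedback([0.55, 0.10, 0.70]))  # -> 0.1
```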

9/23 Three types of associations
• IOA: input-only association
• OOA: output-only association
• INOUA: input-output association

10/23 Probability-based associative learning algorithm
• Case 1: given the values of both inputs, decide the output value.

11/23 Probability-based associative learning algorithm
• Case 2: given the value of one input and an undefined output, decide the value of the other input. For instance: [example shown in the slide figure]

12/23 Probability-based associative learning algorithm
• Case 3: given the value of the output, decide the values of both inputs.

13/23 Probability-based associative learning algorithm
• Case 4: given the values of one input and the output, decide the other input value. For instance: [example shown in the slide figure]
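The slides do not give the underlying formulas, so the following is only a toy sketch of the idea behind the four cases above: a two-input element keeps joint counts over observed (input1, input2, output) triples and completes whichever fields are undefined with the most probable consistent values. The binary signal values and the counting scheme are assumptions, not the paper's algorithm.

```python
from collections import Counter
from itertools import product

class AssociativeElement:
    """Toy two-input associative element: stores joint counts over binary
    (in1, in2, out) triples and fills in undefined values with the most
    probable consistent completion (a sketch, not the paper's algorithm)."""

    def __init__(self):
        self.counts = Counter()

    def train(self, in1, in2, out):
        self.counts[(in1, in2, out)] += 1

    def complete(self, in1=None, in2=None, out=None):
        """Fill any None fields with the most probable values given the
        defined ones; covers cases 1-4 from the slides."""
        best, best_count = None, -1
        for triple in product([0, 1], repeat=3):
            if in1 is not None and triple[0] != in1:
                continue
            if in2 is not None and triple[1] != in2:
                continue
            if out is not None and triple[2] != out:
                continue
            if self.counts[triple] > best_count:
                best, best_count = triple, self.counts[triple]
        return best

# Example: learn two associations, then query with different known fields.
elem = AssociativeElement()
for _ in range(10):
    elem.train(1, 1, 1)
for _ in range(5):
    elem.train(0, 1, 0)
print(elem.complete(in1=1, in2=1))  # case 1: both inputs known -> (1, 1, 1)
print(elem.complete(out=0))         # case 3: only output known -> (0, 1, 0)
```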

14/23 Outline
• Introduction
• Associative learning algorithm
► Memory network architecture and operation
• Simulation analysis
• Conclusion and future research

15/23 Network operations
[Figure: feed-forward operation and feedback operation through the network depth, starting from the input data.]

16/23 Memory operation
[Figure: input data contains defined and undefined signals; undefined signals are recovered, and conflicts are resolved based on signal strength (SS).]

17/23 Outline
• Introduction
• Associative learning algorithm
• Memory network architecture and operation
► Simulation analysis
• Conclusion and future research

18/23 Hetero-associative memory: Iris database classification
Iris database: 3 classes, 4 numeric attributes, 150 instances
An N-bit sliding-bar coding mechanism is applied to the features and to the class identity labels
In our simulation: N = 80, L = 20, M = 30
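A rough sketch of an N-bit sliding-bar encoder is given below. The transcript does not spell out the exact roles of N, L and M in the paper's coding scheme, so treating L as the bar length and scaling the bar position over the feature range is an assumption:

```python
import numpy as np

def sliding_bar_encode(x, x_min, x_max, n_bits=80, bar_len=20):
    """Encode a scalar feature as an n_bits binary vector containing a
    contiguous run of bar_len ones whose start position slides with x."""
    positions = n_bits - bar_len                        # possible start offsets
    frac = (x - x_min) / (x_max - x_min)                # normalize to [0, 1]
    start = int(round(min(max(frac, 0.0), 1.0) * positions))
    code = np.zeros(n_bits, dtype=int)
    code[start:start + bar_len] = 1
    return code

# Example: encode an iris petal length of 4.2 cm, assuming a 1.0-6.9 cm range.
code = sliding_bar_encode(4.2, 1.0, 6.9)
print(code.sum(), code.argmax())  # 20 ones; the bar position encodes the value
```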

19/23 Neuron association pathway Classification accuracy: 96%

20/23 Auto-associative memory: panda image recovery
Original image: 64 x 64 binary panda image (one code value for a black pixel, another for a white pixel)
Test 1: 30% missing pixels; error: %
Test 2: half of the image blocked; error: 2.42%
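The slides report the recovery error as a percentage; a simple way to measure it (my assumption of the metric) is the fraction of pixels that differ between the recovered and original binary image:

```python
import numpy as np

def pixel_error_rate(original, recovered):
    """Fraction of mismatched pixels between two equally sized binary images."""
    original = np.asarray(original)
    recovered = np.asarray(recovered)
    return float(np.mean(original != recovered))

# Example: a random 64x64 binary image with 100 pixels flipped.
rng = np.random.default_rng(0)
image = rng.integers(0, 2, size=(64, 64))
recovered = image.flatten()
flipped = rng.choice(image.size, size=100, replace=False)
recovered[flipped] ^= 1
recovered = recovered.reshape(64, 64)
print(f"{pixel_error_rate(image, recovered):.2%}")  # 100/4096 = 2.44%
```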

21/23 Outline
• Introduction
• Associative learning algorithm
• Memory network architecture and operation
• Simulation analysis
► Conclusion and future research

22/23 Conclusion and future research
• Hierarchical associative memory architecture
• Probabilistic information processing, transmission, association and prediction
• Self-organization
• Self-adaptation
• Robustness

23/23 It's all about designing natural intelligent machines!
Future research:
• Multiple-input (>2) association mechanism
• Dynamic self-reconfiguration
• Hardware implementation
• Facilitating goal-driven learning
• Spatio-temporal memory organization
How far are we? "Brain on silicon" will not just be a dream or science fiction in the future!
[Figure: 3DANN. Picture source: and Irvine Sensors Corporation (Costa Mesa, CA)]