An Associative Memory based on a Mixed-Signal Cellular Neural Network
Michael Flynn, Daniel Weyer

Motivation: Weight Lookup Memory
 Design of an Associative Memory
 Implementation with a Mixed-Signal Neural Network
[Figure: weight lookup memory; a neuron ID placed on the bus selects the corresponding entry in the weight table, which is returned as the retrieved weight]

Data Retrieval Concept
 Associative memory can recover corrupted bits
 Neuron states represent bits of stored bit patterns → bit patterns are composed of address bits and data bits
 Treat data bits as corrupted bits → provide address bits and let the neural network dynamics retrieve the correct data bits
[Figure: pattern layout a_m … a_3 a_2 a_1 | d_n … d_2 d_1, with the address bits serving as the neuron ID and the data bits as the associated weight]
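To make the address/data split concrete, here is a minimal Python sketch; only the 8 + 4 bit split comes from the slides, while the split_pattern helper and the high-address/low-data bit ordering are illustrative assumptions:

```python
def split_pattern(pattern, n_addr=8, n_data=4):
    """Split a bit pattern into address bits (neuron ID) and data bits.

    `pattern` is an integer holding n_addr + n_data bits; the address
    field is assumed to occupy the high-order bits, matching the
    a_m..a_1 | d_n..d_1 layout (a hypothetical convention).
    """
    data = pattern & ((1 << n_data) - 1)   # low-order data bits
    addr = pattern >> n_data               # high-order address bits
    return addr, data

# Example with one of the stored patterns from the simulation slide:
addr, data = split_pattern(0xC54)          # 0xC54 = 1100 0101 0100
print(f"address = {addr:08b}, data = {data:04b}")
```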

Neural Network Architecture
 Architectural approach: Cellular Neural Network → every neuron is connected to neighboring neurons only → limited interconnectivity is good for VLSI implementation
 “Data neurons” encircled by “address neurons”
[Figure: grid of neurons; the legend distinguishes “address bit neurons” from “data bit neurons”]
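As a sketch of what "neighboring neurons only" means for the weight matrix, the following builds a connectivity mask for a 5 x 5 grid; the 8-neighborhood and the grid shape are assumptions, since the slides only state that connections are local:

```python
import numpy as np

def cnn_neighbor_mask(rows=5, cols=5):
    """Connectivity mask for a cellular neural network on a rows x cols grid.

    mask[i, j] = 1 iff neurons i and j are distinct grid neighbors.
    An 8-neighborhood (including diagonals) is assumed here; the design
    may use a different neighborhood definition.
    """
    n = rows * cols
    mask = np.zeros((n, n), dtype=int)
    for i in range(n):
        ri, ci = divmod(i, cols)
        for j in range(n):
            rj, cj = divmod(j, cols)
            if i != j and abs(ri - rj) <= 1 and abs(ci - cj) <= 1:
                mask[i, j] = 1
    return mask

mask = cnn_neighbor_mask()                 # 25-neuron network, as in the simulation
print(mask.sum(axis=1).reshape(5, 5))      # 3/5/8 neighbors for corner/edge/interior nodes
```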

Bit Mapping on the Neuron States
 Example: 12-bit pattern, 8 address bits + 4 data bits
[Figure: mapping of the 12 pattern bits onto nodes of the neuron network]

Data Retrieval Process
 Example: 12-bit pattern, 8 address bits + 4 data bits
① initialize states of “address neurons”
② update network state (synchronous updates)
③ read out final states of “data neurons”
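A minimal behavioral model of this three-step loop, assuming bipolar (+1/−1) neuron states and address neurons that stay clamped during the updates; both are assumptions, as the slides do not spell out these details:

```python
import numpy as np

def retrieve(W, addr_bits, addr_idx, data_idx, n_steps=10):
    """Hopfield-style synchronous retrieval with bipolar (+1/-1) states.

    W:         weight matrix programmed with the stored patterns
    addr_bits: bipolar values for the address neurons (the neuron ID)
    addr_idx:  indices of the address neurons; data_idx: data neurons
    """
    x = np.ones(W.shape[0])                # arbitrary start for data neurons
    x[addr_idx] = addr_bits                # (1) initialize address neurons
    for _ in range(n_steps):
        x_new = np.sign(W @ x)             # (2) synchronous update of all nodes
        x_new[x_new == 0] = 1              # break ties toward +1
        x_new[addr_idx] = addr_bits        # keep the address applied (clamped)
        if np.array_equal(x_new, x):       # network has settled
            break
        x = x_new
    return x[data_idx]                     # (3) read out the data neurons
```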

Circuit Implementation
 Mixed-signal implementation of the neuron transfer function
[Figure: neuron built from weight multipliers and a comparator; each input x_i is multiplied by its weight w_i to produce a current I_i, the currents are summed (Σ I_i), and the comparator resolves the sign of the sum; each weight multiplier is driven by sgn(w_n) and |w_n|]
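In equation form, the circuit appears to realize the standard sign-of-weighted-sum neuron; this reconstruction is inferred from the figure labels rather than stated on the slide:

```latex
% Transfer function inferred from the circuit: each weight multiplier
% produces a current I_i from input x_i and weight w_i (sign and
% magnitude applied separately), and the comparator takes the sign of
% the summed currents to form the next neuron state.
x_j^{+} \;=\; \operatorname{sgn}\!\Bigl(\sum_{i} I_i\Bigr),
\qquad
I_i \;=\; \operatorname{sgn}(w_i)\,\lvert w_i \rvert\, x_i \;=\; w_i\, x_i
```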

Programming of the Memory
 Bit patterns are held in the connection weights of the network
 Repetitive application of Hebb’s rule to store multiple patterns
 Storage of the k-th bit pattern:
   w_ij(k) = w_ij(k−1) + 1 if bits i(k) and j(k) are equal
   w_ij(k) = w_ij(k−1) − 1 if bits i(k) and j(k) have opposite values
 Contributions to w_ij are likely to cancel each other out if the stored bit patterns are disparate, i.e. weakly correlated → store bit patterns only if a certain level of correlation is given
[Figure: two neurons i and j linked by the connection weight w_ij]
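With a bipolar ±1 encoding of bits, the two update cases collapse to adding the product of the two bit values, which gives this compact sketch; the encoding, the zeroed diagonal, and the connectivity masking are our assumptions:

```python
import numpy as np

def program_weights(patterns, mask):
    """Store bipolar (+1/-1) patterns by repeated application of Hebb's rule.

    For each pattern k, w_ij(k) = w_ij(k-1) + 1 when bits i and j agree
    and w_ij(k) = w_ij(k-1) - 1 when they differ; for +-1 bits both
    cases equal adding the product x_i * x_j.
    patterns: array of shape (k, n); mask: 0/1 CNN connectivity mask.
    """
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for x in patterns:
        W += np.outer(x, x)        # +1 if bits equal, -1 if opposite
    np.fill_diagonal(W, 0)         # no self-connections (our assumption)
    return W * mask                # keep neighbor connections only
```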

Simulation Setup
 Storage of 12-bit patterns in a 25-neuron network
[Figure: simulation testbench with mixed-signal neurons, a multiplexer for node initialization, and digital blocks for weight programming]

Simulation Results
 Example: stored 12-bit patterns 0xC54, 0x835, 0xE8A, 0x0E6
[Figure: simulated waveforms of the clock, init signal, applied address, and data bits 1–4, annotated with the programming phase followed by alternating initialization and update & retrieval phases]

Memory Capacity
 Example: 12-bit patterns, 8 address bits + 4 data bits → increased storage capacity if address patterns differ in about half of their bit positions
 d: Hamming distance between address bit patterns
[Figure: capacity plot comparing distance constraints: no restriction, 1 ≤ d ≤ 7, 2 ≤ d ≤ 6, 3 ≤ d ≤ 5, and d = 4]

Outlook: Modular Memory Composition
 “Filter” bit patterns to ensure desired pattern distances → optimize overall storage capacity
[Figure: modular composition in which a pattern filter routes incoming bit patterns to memory modules]
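A minimal sketch of such a distance filter; the accept/reject policy and the 3 ≤ d ≤ 5 bounds are illustrative, borrowed from one constraint set on the capacity slide, and the actual filtering mechanism is not specified:

```python
def hamming(a, b):
    """Hamming distance between two equal-width bit patterns given as ints."""
    return bin(a ^ b).count("1")

def distance_filter(new_addr, stored_addrs, d_min=3, d_max=5):
    """Accept a new address only if its Hamming distance to every already
    stored address lies within [d_min, d_max]."""
    return all(d_min <= hamming(new_addr, a) <= d_max for a in stored_addrs)

# Address fields (upper 8 bits) of the patterns from the simulation slide:
stored = [0xC5, 0x83, 0xE8, 0x0E]
print(distance_filter(0x2A, stored))   # candidate 0x2A checked against all four
```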