Ch6: AM and BAM

6.1 Introduction

AM: Associative Memory
BAM: Bidirectional Associative Memory

。 Memory: store and recall information. Memories can be pre-stored or trained. An AM retrieves a stored pattern in one shot; a BAM retrieves it through many iterations.

。 Categories of memory:
LAM (Location Addressable Memory) -- given an address, access the content stored at that address.
CAM (Content Addressable Memory) -- given pieces of information (a key), find the address where the complete information is stored, e.g., a hash table.

AM (Associative Memory) -- given a datum, (i) find or recover the original noise-free data; (ii) retrieve all the related data.
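A toy contrast of the first two categories (illustrative Python, not from the slides): a list is location-addressed, while a dict behaves like a hash-table CAM, retrieving a complete record from a content key.

```python
# LAM: content is fetched by its address (here, a list index).
lam = ["alpha", "beta", "gamma"]
print(lam[1])        # address 1 -> "beta"

# CAM: a key derived from the content retrieves the complete record (hash table).
cam = {"beta": {"id": 1, "value": "beta", "note": "complete stored record"}}
print(cam["beta"])   # content key -> full stored information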

Matrix Representation

。 The stored associations are held in a memory matrix built from the pattern pairs; the standard form (used again in Section 6.3) is

$$W = \sum_k \mathbf{y}_k \mathbf{x}_k^T, \qquad \text{recall: } \mathbf{y} = W\mathbf{x}.$$

。 Example: input vector, memory matrix, and recall (the slide's worked vectors are not reproduced here). Fault tolerance: recall still succeeds when the input is a corrupted version of a stored pattern.

6.3 Types of AM

Input vectors $\mathbf{x}_k$, output vectors $\mathbf{y}_k$:
Autoassociative memory - if $\mathbf{y}_k = \mathbf{x}_k$; e.g., color correction, color constancy.
Heteroassociative memory - if $\mathbf{y}_k \neq \mathbf{x}_k$; e.g., i, space transforms: Fourier transforms; ii, dimensionality reduction: PCA.

Interpolative associative memory - if the input deviates from a stored pattern, the output deviates correspondingly: $\mathbf{x}_k + \mathbf{d} \mapsto \mathbf{y}_k + \mathbf{e}$.

Compute: the AM mapping is defined as $W = \sum_k \mathbf{y}_k \mathbf{x}_k^T$ (output times input transposed). Suppose the $\mathbf{x}_k$ are orthonormal vectors; then $W\mathbf{x}_j = \sum_k \mathbf{y}_k (\mathbf{x}_k^T \mathbf{x}_j) = \mathbf{y}_j$.

a. Continuous-valued input patterns
i, Ideal pattern retrieval (noise-free): a stored input recalls its output exactly, e.g., $W\mathbf{x}_k = \mathbf{y}_k$.
ii, Noisy input: $\mathbf{x} = \mathbf{x}_k + \mathbf{n}$ (additive noise $\mathbf{n}$), giving $W\mathbf{x} = \mathbf{y}_k + W\mathbf{n}$.
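A minimal NumPy sketch of this linear associator (the vectors are illustrative, not the slides' values): it builds $W$ from orthonormal inputs, shows exact recall, and shows the $W\mathbf{n}$ term that a noisy input adds.

```python
import numpy as np

# Orthonormal input patterns (illustrative) and arbitrary output patterns.
x1 = np.array([1.0, 0.0, 0.0])
x2 = np.array([0.0, 1.0, 0.0])
y1 = np.array([2.0, -1.0])
y2 = np.array([0.5, 3.0])

# AM mapping: W = sum_k y_k x_k^T (outer product of each output with its input).
W = np.outer(y1, x1) + np.outer(y2, x2)

print(W @ x1)             # exact recall: [ 2. -1.], which is y1
print(W @ x2)             # exact recall: [ 0.5  3.], which is y2

# Noisy input: W(x1 + n) = y1 + W n, so the residual below equals W @ n.
n = 0.1 * np.random.randn(3)
print(W @ (x1 + n) - y1)
```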

b. Binary-valued input patterns: {0, 1}
1. Memory matrix: $W = \sum_k \tilde{\mathbf{y}}_k \tilde{\mathbf{x}}_k^T$, where $\tilde{\mathbf{x}}_k = 2\mathbf{x}_k - 1$ and $\tilde{\mathbf{y}}_k = 2\mathbf{y}_k - 1$, i.e., $W$ is formed from bipolar-valued patterns {-1, 1}.
2. Threshold vector: $\theta_i = \sum_j w_{ij}$, $i = 1, \dots, N$ (row sum of $W$).
3. Retrieval: $O_i = 1$ if $2(WI)_i \geq \theta_i$ and $O_i = 0$ otherwise, where $I$: input vector, $O$: output vector; equivalently, threshold the bipolar product $W(2I - 1)$ at zero.
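A NumPy sketch of these three steps. The slides' exact threshold convention is not recoverable from the transcript, so this assumes the algebraically equivalent rule $O = \mathrm{step}(2WI - \theta)$ with $\theta$ the row sum:

```python
import numpy as np

def build_memory(inputs, outputs):
    """Memory matrix W = sum_k y~_k x~_k^T, from bipolar versions of 0/1 patterns."""
    W = np.zeros((outputs.shape[1], inputs.shape[1]))
    for x, y in zip(inputs, outputs):
        W += np.outer(2 * y - 1, 2 * x - 1)   # bipolar outer product
    return W

def recall(W, I):
    """Retrieval: O_i = 1 iff 2(W I)_i >= theta_i, theta = row sum of W."""
    theta = W.sum(axis=1)                     # row-sum threshold vector
    return (2 * (W @ I) >= theta).astype(int)
```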

。 Example: auto-association memory with two I/O pairs (the slide's pattern vectors are not reproduced here). Since this is auto-AM, each output pattern equals its input pattern.
i, Memory matrix: formed from the bipolar versions of the two stored patterns.

ii, Threshold vector: the row sums of the memory matrix.
1. First test pattern: after the nonlinear (threshold) processing, the output pattern is the stored pattern.
2. Second test pattern: likewise, retrieval recovers the corresponding stored output pattern.
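The actual patterns do not survive the transcript; continuing the sketch above with hypothetical patterns (auto-AM, so outputs equal inputs) shows the same behavior:

```python
# Hypothetical stored patterns (orthogonal in bipolar form), auto-AM.
patterns = np.array([[1, 0, 1, 0, 1, 0],
                     [1, 1, 0, 0, 1, 1]])
W = build_memory(patterns, patterns)

print(recall(W, patterns[0]))            # -> [1 0 1 0 1 0], the stored pattern
noisy = np.array([1, 0, 1, 0, 1, 1])     # first pattern with its last bit flipped
print(recall(W, noisy))                  # -> [1 0 1 0 1 0], corrected
```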

6.4 BAM

Characteristics: a BAM stores pairs of patterns in a two-layer recurrent network. Recall sends a pattern forward through the weight matrix $W$ and the result backward through $W^T$, and the two layers keep exchanging signals until a stable pair emerges.

For autoassociative memory, the stored outputs equal the stored inputs, so $W$ is symmetric ($W = W^T$) and the BAM reduces to a single-layer recurrent network of Hopfield type.

‧ Output: each pass applies a threshold nonlinearity to the weighted sums, $B = \phi(WA)$ in the forward direction and $A = \phi(W^T B)$ in the backward direction (the slide's worked numbers are not reproduced here).


Second trial: the forward and backward passes are repeated; recall stops when neither layer's output changes between successive passes.
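A minimal recall sketch under stated assumptions: bipolar {-1, +1} coding, ties thresholded to +1, and hypothetical pattern pairs (none of these choices come from the slides).

```python
import numpy as np

def bipolar_sign(x):
    """Threshold to {-1, +1}; a net input of 0 maps to +1 by convention here."""
    return np.where(x >= 0, 1, -1)

def bam_recall(W, a, max_iters=20):
    """BAM recall: alternate B = phi(W A) and A = phi(W^T B) until the pair is stable."""
    b = bipolar_sign(W @ a)
    for _ in range(max_iters):
        a_new = bipolar_sign(W.T @ b)
        b_new = bipolar_sign(W @ a_new)
        if np.array_equal(a_new, a) and np.array_equal(b_new, b):
            break                          # stable (resonant) pair reached
        a, b = a_new, b_new
    return a, b

# Hypothetical bipolar pattern pairs; W = sum_k b_k a_k^T stores them.
A = np.array([[1, -1, 1, -1], [1, 1, -1, -1]])
B = np.array([[1, 1, -1], [-1, 1, 1]])
W = sum(np.outer(bk, ak) for ak, bk in zip(A, B))

a_noisy = np.array([1, -1, 1, 1])          # first A-pattern with one bit flipped
print(bam_recall(W, a_noisy))              # recovers (A[0], B[0]) in this run
```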

6.4.2 Energy Function

Dynamic system: a system whose state changes with time. State: a collection of adaptable quantitative and qualitative items that characterize the system, e.g., weights, data flows, and so on.
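The slide's energy expression is not recoverable from the transcript; the standard BAM energy (Kosko's form, stated here as background) is

$$E(A, B) = -B^T W A = -\sum_i \sum_j b_i \, w_{ij} \, a_j,$$

and each forward or backward update either lowers $E$ or leaves it unchanged. Since $E$ is bounded below, recall must converge to a stable pair.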
