
An Illustrative Example

Apple/Banana Sorter

Prototype Vectors
Measurement vector: p = [shape; texture; weight]
Shape: {1 : round ; -1 : elliptical}
Texture: {1 : smooth ; -1 : rough}
Weight: {1 : > 1 lb. ; -1 : < 1 lb.}
Prototype banana: p1 = [-1; 1; -1] (elliptical, smooth, < 1 lb.)
Prototype apple: p2 = [1; 1; -1] (round, smooth, < 1 lb.)
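A minimal sketch of this encoding in NumPy; the prototype values follow from the feature definitions above (a banana is elliptical, smooth, and under a pound; an apple is round, smooth, and under a pound):

```python
import numpy as np

# Feature encoding: [shape, texture, weight]
#   shape:   1 = round,  -1 = elliptical
#   texture: 1 = smooth, -1 = rough
#   weight:  1 = > 1 lb., -1 = < 1 lb.
p_banana = np.array([-1, 1, -1])  # elliptical, smooth, light
p_apple  = np.array([ 1, 1, -1])  # round, smooth, light
```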

Perceptron

Two-Input Case: Decision Boundary
For a single-neuron perceptron a = hardlims(Wp + b), the decision boundary is the set of inputs where Wp + b = 0: a line in the two-input case, and a plane (or hyperplane) in general.

Apple/Banana Example
The decision boundary should separate the two prototype vectors. The weight vector should be orthogonal to the decision boundary and should point toward the prototype that is to produce an output of 1. The bias determines the position of the boundary, as in the sketch below.
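A sketch of the resulting perceptron, assuming the usual choice for this example: the boundary is the plane p1 = 0 (the shape axis), oriented so the banana side produces +1, which gives W = [-1, 0, 0] and b = 0. These particular weights are an assumption consistent with the description above, not taken verbatim from the slides:

```python
def hardlims(n):
    """Symmetric hard limit: +1 if n >= 0, else -1."""
    return 1 if n >= 0 else -1

# Assumed design: boundary is the p1 = 0 plane, oriented toward the banana.
W = np.array([-1, 0, 0])
b = 0

def perceptron(p):
    """Return +1 (banana) or -1 (apple)."""
    return hardlims(W @ p + b)
```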

Testing the Network
Test inputs: banana, apple, and a "rough" banana (worked out below).
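Running the three test cases through the sketch above (again under the assumed weights), only the shape element enters the decision, so even a rough banana is still classified as a banana:

```python
p_rough_banana = np.array([-1, -1, -1])  # elliptical, rough, light

print(perceptron(p_banana))        # 1  -> banana
print(perceptron(p_apple))         # -1 -> apple
print(perceptron(p_rough_banana))  # 1  -> banana (shape alone decides)
```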

Hamming Network

Properties of the Hamming Network
- It was designed for binary pattern-recognition problems: each input element is either 1 or -1.
- It has a feedforward layer and a recurrent layer, with the same number of neurons in each.
- Its purpose is to decide which prototype vector is closest to the input vector.
- The feedforward layer computes the inner product between each prototype pattern and the input pattern: the rows of its weight matrix W are set to the prototype patterns, and it uses the linear transfer function (purelin).
- Each element of the bias vector equals R, the number of elements in the input vector; this ensures that the input to the recurrent layer is always positive.
- The recurrent layer, known as the "competitive layer," has one neuron per prototype pattern; its initial inputs are the outputs of the feedforward layer.
- When the recurrent layer converges, only one neuron has a nonzero output, and that neuron indicates the prototype closest to the input vector. A sketch of both layers follows.
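A sketch of the two layers just described, in the same NumPy style; the value eps = 0.5 in the competitive layer is an assumed choice (any 0 < eps < 1/(S-1) works):

```python
def poslin(n):
    """Positive linear transfer function: max(0, n), elementwise."""
    return np.maximum(0, n)

# Feedforward layer: rows of W1 are the prototypes, bias elements equal R.
W1 = np.stack([p_banana, p_apple])   # shape (S, R) with S = 2, R = 3
b1 = np.full(2, 3)                   # each bias element equals R = 3

# Recurrent (competitive) layer: 1 on the diagonal, -eps off it.
eps = 0.5                            # assumed; must satisfy eps < 1/(S-1)
W2 = np.array([[1, -eps], [-eps, 1]])

def hamming(p, max_iter=100):
    """Return index of the prototype closest to p (0 = banana, 1 = apple)."""
    a = W1 @ p + b1                  # purelin feedforward output
    for _ in range(max_iter):
        a_next = poslin(W2 @ a)
        if np.array_equal(a_next, a):   # converged: one nonzero neuron left
            break
        a = a_next
    return int(np.argmax(a))
```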

Feedforward Layer for Banana/Apple Recognition
The rows of W1 are the two prototypes, and each bias element equals R = 3:
W1 = [-1 1 -1; 1 1 -1], b1 = [3; 3], a1 = W1 p + b1

Recurrent Layer
a2(t+1) = poslin(W2 a2(t)), where W2 has 1s on the diagonal and -ε elsewhere, with ε < 1/(S-1); S is the number of neurons in the recurrent layer.

Hamming Operation: First Layer
Input ("rough" banana): p = [-1; -1; -1]
a1 = W1 p + b1 = [1; -1] + [3; 3] = [4; 2]

Hamming Operation: Second Layer
Taking ε = 1/2 (any ε < 1 works for S = 2):
a2(0) = [4; 2]
a2(1) = poslin([4 - 1; 2 - 2]) = [3; 0]
a2(2) = poslin([3 - 0; -1.5 + 0]) = [3; 0] (converged; the first neuron, the banana, wins)
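Tracing the rough banana through the sketch above reproduces these numbers: the feedforward layer gives [4, 2], and the competition drives the apple neuron to zero so the banana neuron wins:

```python
print(W1 @ p_rough_banana + b1)   # [4 2]
print(hamming(p_rough_banana))    # 0 -> banana
```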

Hopfield Network

Properties of the Hopfield Network
- The neurons are initialized with the input vector, and the network iterates until the output converges.
- The network has converged when the output stops changing; if the network works correctly, the final output is one of the prototype vectors.
- The satlins transfer function is used in the recurrent layer.
- Computing W in general is complex; for now we simply choose weights and biases so that the element in which the prototypes differ is strengthened, while the elements they have in common are weakened and pulled toward their shared values. One such choice is sketched below.
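A sketch of one such design, adapted to the banana/apple prototypes: the shape element (where the prototypes differ) gets a gain above 1, while the shared texture and weight elements get small gains plus biases that pull them toward +1 and -1. The specific numbers (1.2, 0.2, ±0.9) are an assumption modeled on the textbook apple/orange version of this example, not taken from the slides:

```python
def satlins(n):
    """Symmetric saturating linear transfer function: clip to [-1, 1]."""
    return np.clip(n, -1, 1)

# Assumed design: amplify the differing element, pull shared ones to +1 / -1.
W_hop = np.diag([1.2, 0.2, 0.2])
b_hop = np.array([0.0, 0.9, -0.9])

def hopfield(p, max_iter=100):
    """Iterate until the output stops changing; return the final pattern."""
    a = p.astype(float)
    for _ in range(max_iter):
        a_next = satlins(W_hop @ a + b_hop)
        if np.allclose(a_next, a):
            break
        a = a_next
    return a
```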

Apple/Banana Problem
Test with the "rough" banana, p = [-1; -1; -1]: the network converges to the banana prototype [-1; 1; -1]. (Banana)
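Under the assumed weights above, the rough banana settles onto the banana prototype in a few iterations: the shape and weight elements saturate at -1 immediately, and the texture element climbs from -1 through 0.7 to +1:

```python
print(hopfield(p_rough_banana))   # [-1.  1. -1.] -> the banana prototype
```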

Summary
Perceptron
- Feedforward network
- Linear decision boundary
- One neuron for each decision
Hamming Network
- Competitive network
- First layer: pattern matching (inner product)
- Second layer: competition (winner-take-all)
- # neurons = # prototype patterns
Hopfield Network
- Dynamic associative memory network
- Network output converges to a prototype pattern
- # neurons = # elements in each prototype pattern