1
Lecture 7: INSTAR AND OUTSTAR NETWORKS
Course: H0434/Jaringan Syaraf Tiruan (Artificial Neural Networks), Year: 2005, Version: 1
2
Learning Outcomes
By the end of this lecture, students are expected to be able to: relate the instar and outstar rules to associative networks.
3
Outline
Instar Network Architecture
Outstar Network Architecture
Learning Rules
4
INSTAR RECOGNITION NETWORK
5
INSTAR OPERATION
The instar will be active when ${}_1\mathbf{w}^T \mathbf{p} \ge -b$, or $\|{}_1\mathbf{w}\| \, \|\mathbf{p}\| \cos\theta \ge -b$.
For normalized vectors, the largest inner product occurs when the angle between the weight vector and the input vector is zero: the input vector is equal to the weight vector. The rows of a weight matrix represent patterns to be recognized.
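The recognition condition above can be checked numerically. A minimal Python sketch, assuming a single hardlim instar; the function name, weight row, and bias value are illustrative, not from the slides:

```python
import numpy as np

def instar_output(w, p, b):
    """Single instar: fires (1) when the inner product w.p is at least -b."""
    return 1 if np.dot(w, p) >= -b else 0

# Normalized prototype stored in the weight row.
w = np.array([1.0, 0.0])
p_match = np.array([1.0, 0.0])   # same direction: cos(theta) = 1
p_ortho = np.array([0.0, 1.0])   # orthogonal: cos(theta) = 0

b = -0.5   # with this bias, the instar fires only when w.p >= 0.5
print(instar_output(w, p_match, b))  # 1: pattern recognized
print(instar_output(w, p_ortho, b))  # 0: pattern rejected
```

Because both vectors are normalized, the inner product equals $\cos\theta$, so the bias directly sets how close an input must be to the stored pattern.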
6
INSTAR RULE
Hebb with decay: $w_{ij}(q) = w_{ij}(q-1) + \alpha \, a_i(q) \, p_j(q) - \gamma \, w_{ij}(q-1)$
Modify it so that learning and forgetting occur only when the neuron is active (Instar Rule):
$w_{ij}(q) = w_{ij}(q-1) + \alpha \, a_i(q) \big( p_j(q) - w_{ij}(q-1) \big)$
Vector form: ${}_i\mathbf{w}(q) = {}_i\mathbf{w}(q-1) + \alpha \, a_i(q) \big( \mathbf{p}(q) - {}_i\mathbf{w}(q-1) \big)$
7
GRAPHICAL REPRESENTATION
For the case where the instar is active ($a_i = 1$):
${}_i\mathbf{w}(q) = (1 - \alpha) \, {}_i\mathbf{w}(q-1) + \alpha \, \mathbf{p}(q)$
For the case where the instar is inactive ($a_i = 0$):
${}_i\mathbf{w}(q) = {}_i\mathbf{w}(q-1)$
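The two cases can be verified with a short sketch of the instar update; the weight, input, and $\alpha = 0.5$ are arbitrary illustrative values:

```python
import numpy as np

def instar_update(w, p, a, alpha=0.5):
    """Instar rule: w(q) = w(q-1) + alpha * a * (p - w(q-1))."""
    return w + alpha * a * (p - w)

w = np.array([1.0, 0.0])
p = np.array([0.0, 1.0])

w_active = instar_update(w, p, a=1)    # moves halfway toward p
w_inactive = instar_update(w, p, a=0)  # unchanged

print(w_active)    # [0.5 0.5]
print(w_inactive)  # [1. 0.]
```

With $a_i = 1$ the weight vector moves a fraction $\alpha$ of the way toward the input; with $a_i = 0$ it stays put, exactly as the two cases above state.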
8
EXAMPLE
9
TRAINING
First Iteration ($a = 1$):
10
Further Training
$a(2) = \mathrm{hardlim}\big( w^0 p^0(2) + \mathbf{W} \mathbf{p}(2) - 2 \big) = 1$ (orange)
$a(3) = \mathrm{hardlim}\big( w^0 p^0(3) + \mathbf{W} \mathbf{p}(3) - 2 \big) = 1$ (orange)
Orange will now be detected if either set of sensors works.
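The orange example can be simulated end to end. This sketch assumes the values typically used in this example: sight weight $w^0 = 3$, bias 2, orange measurement vector $\mathbf{p} = [1, -1, -1]$, and $\alpha = 1$; treat these as assumptions rather than the slide's exact numbers:

```python
import numpy as np

hardlim = lambda n: 1 if n >= 0 else 0

w0, b = 3.0, 2.0                      # sight weight and threshold (assumed)
W = np.zeros(3)                       # measurement weights, initially untrained
p_orange = np.array([1., -1., -1.])   # shape, texture, weight sensors (assumed)
alpha = 1.0

# Iteration 1: sight sensor works (p0 = 1), so the instar fires and learns.
a1 = hardlim(w0 * 1 + W @ p_orange - b)
W = W + alpha * a1 * (p_orange - W)   # instar rule pulls W onto p_orange

# Iteration 2: sight sensor fails (p0 = 0); measurements alone now suffice.
a2 = hardlim(w0 * 0 + W @ p_orange - b)
print(a1, a2)   # 1 1
print(W)        # [ 1. -1. -1.]
```

After one active iteration the weight row equals the orange's measurement pattern, so either the sight sensor or the measurement sensors can trigger detection on their own.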
11
Kohonen Rule
${}_i\mathbf{w}(q) = {}_i\mathbf{w}(q-1) + \alpha \big( \mathbf{p}(q) - {}_i\mathbf{w}(q-1) \big)$, for $i \in X(q)$
Learning occurs when the neuron's index $i$ is a member of the set $X(q)$. We will see in Chapter 14 that this can be used to train all neurons in a given neighborhood.
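A sketch of the Kohonen rule, where the set $X(q)$ is supplied as a hypothetical neighborhood of winning row indices; the weight matrix, input, and $\alpha = 0.5$ are illustrative:

```python
import numpy as np

def kohonen_update(W, p, winners, alpha=0.5):
    """Kohonen rule: only rows whose index is in X(q) move toward p."""
    W = W.copy()
    for i in winners:
        W[i] = W[i] + alpha * (p - W[i])
    return W

W = np.array([[ 1.0,  0.0],
              [ 0.0,  1.0],
              [-1.0,  0.0]])
p = np.array([0.0, -1.0])

# Train neuron 1 and its neighbor 2: a hypothetical neighborhood X(q) = {1, 2}.
W_new = kohonen_update(W, p, winners={1, 2})
print(W_new)   # row 0 unchanged; rows 1 and 2 moved halfway toward p
```

Unlike the plain instar rule, membership in $X(q)$ replaces the neuron's own output as the gate on learning, which is what lets a whole neighborhood learn together.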
12
OUTSTAR RECALL NETWORK
13
Outstar Operation
Suppose we want the outstar to recall a certain pattern $\mathbf{a}^*$ whenever the input $p = 1$ is presented to the network. Let $\mathbf{W} = \mathbf{a}^*$. Then, when $p = 1$:
$\mathbf{a} = \mathrm{satlins}(\mathbf{W} p) = \mathrm{satlins}(\mathbf{a}^* \cdot 1) = \mathbf{a}^*$
and the pattern is correctly recalled. The columns of a weight matrix represent patterns to be recalled.
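A minimal recall sketch, assuming a satlins transfer function, a single scalar input $p$, and an illustrative stored pattern:

```python
import numpy as np

satlins = lambda n: np.clip(n, -1.0, 1.0)   # symmetric saturating linear transfer

a_star = np.array([1.0, -1.0, 1.0])   # pattern to be recalled (illustrative)
W = a_star.reshape(3, 1)              # store the pattern as the weight column

p = 1.0
a = satlins(W * p).ravel()
print(a)   # [ 1. -1.  1.] -- the stored pattern is recalled
```

Because the pattern sits in a column, a single scalar stimulus reproduces the whole vector, which is the mirror image of the instar, where patterns sit in rows.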
14
Outstar Rule
For the instar rule we made the weight decay term of the Hebb rule proportional to the output of the network. For the outstar rule we make the weight decay term proportional to the input of the network:
$w_{ij}(q) = w_{ij}(q-1) + \alpha \, a_i(q) \, p_j(q) - \gamma \, p_j(q) \, w_{ij}(q-1)$
If we make the decay rate $\gamma$ equal to the learning rate $\alpha$:
$w_{ij}(q) = w_{ij}(q-1) + \alpha \big( a_i(q) - w_{ij}(q-1) \big) p_j(q)$
Vector form: $\mathbf{w}_j(q) = \mathbf{w}_j(q-1) + \alpha \big( \mathbf{a}(q) - \mathbf{w}_j(q-1) \big) p_j(q)$
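The vector form of the outstar rule can be sketched as follows; $\alpha = 0.5$ and the target pattern are illustrative values:

```python
import numpy as np

def outstar_update(w_col, a, p_j, alpha=0.5):
    """Outstar rule: w_j(q) = w_j(q-1) + alpha * (a - w_j(q-1)) * p_j."""
    return w_col + alpha * (a - w_col) * p_j

w = np.zeros(3)
a_star = np.array([1.0, -1.0, 1.0])

# The column learns only while its input p_j is nonzero.
w = outstar_update(w, a_star, p_j=1.0)   # moves halfway toward a*
print(w)                                  # [ 0.5 -0.5  0.5]
w = outstar_update(w, a_star, p_j=0.0)   # input off: no change
print(w)                                  # [ 0.5 -0.5  0.5]
```

Note the symmetry with the instar rule: there the output $a_i$ gated learning of a row toward the input; here the input $p_j$ gates learning of a column toward the output.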
15
Example - Pineapple Recall
16
Definitions
17
Iteration 1 a = 1
18
Convergence
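The convergence behavior can be illustrated with repeated outstar updates: with the stimulus held at $p = 1$, the weight column approaches the target pattern geometrically, shrinking the error by a factor $(1 - \alpha)$ per step. The pineapple measurement pattern $[-1, -1, 1]$ and $\alpha = 0.5$ are assumptions for illustration:

```python
import numpy as np

alpha = 0.5
a_star = np.array([-1.0, -1.0, 1.0])  # assumed pineapple measurement pattern
w = np.zeros(3)

# Repeated outstar updates with the stimulus present (p = 1):
for q in range(20):
    w = w + alpha * (a_star - w) * 1.0

print(np.allclose(w, a_star))   # True: the weights converge to a*
```

After $q$ steps the remaining error is $(1 - \alpha)^q \, \mathbf{a}^*$, so any $0 < \alpha \le 1$ drives the weights to the recalled pattern; $\alpha = 1$ would converge in a single step.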