Notes Lect 17 : Auto-associative Memory - Hopfield
[Figure : Hopfield model of auto-associative memory - four fully connected neurons 1, 2, 3, 4 with connection weights wij between every pair of units, external inputs I1, I2, I3, I4 and outputs V1, V2, V3, V4]
Each unit is connected to every other unit in the network but not to itself.
The connection weight from neuron j to neuron i is given by a number wij. The collection of all such numbers is represented by the weight matrix W, which is square and symmetric, i.e., wij = wji for i, j = 1, 2, . . . , m.
Each unit has an external input Ii which leads to a modification in the computation of the net input to that unit.
The weights are determined from the patterns to be stored, wij = Σk xi^k xj^k (for i ≠ j), where xi^k is the i-th component of pattern Xk.
Each unit acts as both an input and an output unit, like in the linear associator shown below.
[Figure : Linear associator network - input neurons x1 . . . xn, output neurons y1 . . . ym, connection weights w11 . . . wnm from every input unit to every output unit, inputs on the left, outputs on the right]
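The sketch below is a minimal illustration (not from the notes) of the structure just described, written in Python with numpy : a symmetric Hebbian weight matrix with zero diagonal, and an asynchronous update that combines the weighted feedback from the other units with an external input Ii. The two stored patterns are arbitrary examples chosen only to exercise the code.

import numpy as np

def hopfield_weights(patterns):
    # W = sum of outer products of the stored bipolar patterns
    W = sum(np.outer(x, x) for x in patterns).astype(float)
    np.fill_diagonal(W, 0.0)           # no self-connection : w_ii = 0
    return W                           # square and symmetric : w_ij = w_ji

def update(V, W, I=None, sweeps=5):
    # asynchronous updates : V_i <- sign( sum_j w_ij V_j + I_i )
    V = np.array(V, dtype=float)
    I = np.zeros_like(V) if I is None else np.asarray(I, dtype=float)
    for _ in range(sweeps):
        for i in range(len(V)):
            net = W[i] @ V + I[i]
            if net != 0:               # keep the previous state when the net input is 0
                V[i] = 1.0 if net > 0 else -1.0
    return V

# usage : store two 4-bit bipolar patterns and recall from a noisy start
patterns = [np.array([1, -1, 1, -1]), np.array([-1, -1, 1, 1])]
W = hopfield_weights(patterns)
print(update([1, 1, 1, -1], W))        # settles to the first stored pattern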
1. Auto-associative Memory (auto-correlators)
In the previous section, the structure of the Hopfield model has been
explained. It is an auto-associative memory model which means patterns,
rather than associated pattern pairs, are stored in memory. In this
section, the working of an auto-associative memory (auto-correlator) is
illustrated using some examples.
Working of an auto-correlator :
• How to Store Patterns : Example

Consider the three bipolar patterns

A1 = ( -1, 1 , -1 , 1 )
A2 = ( 1, 1 , 1 , -1 )
A3 = ( -1, -1 , -1 , 1 )

The outer products of the patterns with themselves are

[A1]^T 4x1 [A1] 1x4 =    1  -1   1  -1
                        -1   1  -1   1
                         1  -1   1  -1
                        -1   1  -1   1

[A2]^T 4x1 [A2] 1x4 =    1   1   1  -1
                         1   1   1  -1
                         1   1   1  -1
                        -1  -1  -1   1

[A3]^T 4x1 [A3] 1x4 =    1   1   1  -1
                         1   1   1  -1
                         1   1   1  -1
                        -1  -1  -1   1

and the connection matrix, with elements t i j , is their sum

T = [A1]^T [A1] + [A2]^T [A2] + [A3]^T [A3] =    3   1   3  -3
                                                 1   3   1  -1
                                                 3   1   3  -3
                                                -3  -1  -3   3
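The connection matrix above can be checked with a few lines of numpy (a sketch, using the same three patterns) :

import numpy as np

A1 = np.array([-1,  1, -1,  1])
A2 = np.array([ 1,  1,  1, -1])
A3 = np.array([-1, -1, -1,  1])

T = np.outer(A1, A1) + np.outer(A2, A2) + np.outer(A3, A3)
print(T)
# [[ 3  1  3 -3]
#  [ 1  3  1 -1]
#  [ 3  1  3 -3]
#  [-3 -1 -3  3]]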
• Retrieval of a Stored Pattern : Example (ref. previous slide)

Present the stored pattern A2 = ( 1, 1, 1, -1 ) to the network and compute, for each component j, the weighted sum αj = Σi ai t i,j :

i = 1 2 3 4
α1 = Σi ai t i,1 = 1x3 + 1x1 + 1x3 + (-1)x(-3) = 10        →   1
α2 = Σi ai t i,2 = 1x1 + 1x3 + 1x1 + (-1)x(-1) = 6         →   1
α3 = Σi ai t i,3 = 1x3 + 1x1 + 1x3 + (-1)x(-3) = 10        →   1
α4 = Σi ai t i,4 = 1x(-3) + 1x(-1) + 1x(-3) + (-1)x3 = -10 →  -1

Therefore aj^new = ƒ ( αj , aj^old ) for j = 1 , 2 , . . . , p , where ƒ ( α , β ) is the two-parameter bipolar threshold function : 1 if α > 0 , -1 if α < 0 , and β (the old value) if α = 0 :

a1^new = ƒ ( 10 , 1 )   = 1
a2^new = ƒ ( 6 , 1 )    = 1
a3^new = ƒ ( 10 , 1 )   = 1
a4^new = ƒ ( -10 , -1 ) = -1

The updated vector ( 1, 1, 1, -1 ) is A2 itself, so the stored pattern is a stable state of the auto-correlator and is correctly recalled.
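The recall step can be sketched in the same way : compute αj = Σi ai t i,j and pass each component through the two-parameter threshold ƒ. The matrix T is rebuilt from the three example patterns so the snippet is self-contained.

import numpy as np

A1, A2, A3 = (np.array(v) for v in
              ([-1, 1, -1, 1], [1, 1, 1, -1], [-1, -1, -1, 1]))
T = sum(np.outer(A, A) for A in (A1, A2, A3))

def f(alpha, beta_old):
    # two-parameter bipolar threshold : keep the old value when alpha == 0
    return 1 if alpha > 0 else (-1 if alpha < 0 else beta_old)

def recall(a, T):
    alpha = a @ T                                 # alpha_j = sum_i a_i t_ij
    return np.array([f(alpha[j], a[j]) for j in range(len(a))])

print(recall(A2, T))    # [ 1  1  1 -1] : A2 is a stable stored pattern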
• Recognition of Noisy Patterns (ref. previous slide)

Consider a noisy presentation A' = ( 1, 1, 1, 1 ), a distorted version of the stored pattern A2. Comparing Hamming distances, A' is closest to A2 : it differs from A2 in one component, from A1 in two and from A3 in three. Presenting A' to the auto-correlator :

i = 1 2 3 4
α1 = Σi ai t i,1 = 1x3 + 1x1 + 1x3 + 1x(-3) = 4        →   1
α2 = Σi ai t i,2 = 1x1 + 1x3 + 1x1 + 1x(-1) = 4        →   1
α3 = Σi ai t i,3 = 1x3 + 1x1 + 1x3 + 1x(-3) = 4        →   1
α4 = Σi ai t i,4 = 1x(-3) + 1x(-1) + 1x(-3) + 1x3 = -4 →  -1

Therefore aj^new = ƒ ( αj , aj^old ) for j = 1 , 2 , . . . , p gives

a1^new = ƒ ( 4 , 1 )  = 1
a2^new = ƒ ( 4 , 1 )  = 1
a3^new = ƒ ( 4 , 1 )  = 1
a4^new = ƒ ( -4 , 1 ) = -1

The recalled vector ( 1, 1, 1, -1 ) is A2 , the stored pattern nearest to the noisy input A'.
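A short check of the noisy-recall step (a sketch rebuilding the example's T) : the weighted sums for A' contain no zeros, so a plain sign threshold already yields the nearest stored pattern.

import numpy as np

A1, A2, A3 = (np.array(v) for v in
              ([-1, 1, -1, 1], [1, 1, 1, -1], [-1, -1, -1, 1]))
T = sum(np.outer(A, A) for A in (A1, A2, A3))

A_noisy = np.array([1, 1, 1, 1])
for name, A in (("A1", A1), ("A2", A2), ("A3", A3)):
    print(name, "differs in", int(np.sum(A_noisy != A)), "component(s)")

alpha = A_noisy @ T                    # [ 4  4  4 -4]
print(np.where(alpha > 0, 1, -1))      # [ 1  1  1 -1] : the nearest pattern A2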
Definition : If the associated pattern pairs (X, Y) are different and the model recalls a pattern Y given a pattern X, or vice-versa, then it is termed a hetero-associative memory.
Bidirectional Associative Memory (BAM) Operations

The original correlation matrix of the BAM is

M0 = Σ (i = 1 to N) [Xi]^T [Yi]

where Xi = ( xi1 , xi2 , . . . , xin ) and Yi = ( yi1 , yi2 , . . . , yip ) are the N stored bipolar pattern pairs.
• Retrieve the Nearest of a Pattern Pair, given any pair
(ref : previous slide)

Example
Retrieve the nearest of the stored (Ai , Bi) pattern pairs, given any starting pair ( α , β ). Recall proceeds by passing the pattern back and forth through M and M^T : compute β' = φ ( α M ), then α' = φ ( β' M^T ), and repeat until a stable pair ( αF , βF ) is reached. The bipolar threshold φ maps a vector F = ( f1 , f2 , . . . , fn ) to G = ( g1 , g2 , . . . , gn ) with

gi = 1             if fi > 0
gi = -1 (bipolar)  if fi < 0
gi = previous gi   if fi = 0

Kosko has proved that this process will converge for any correlation matrix M.
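A minimal sketch of these BAM operations, with two made-up 4/3-bit pattern pairs (nothing here comes from the notes' examples) : build M as the sum of outer products and bounce a pattern between M and M^T through the threshold φ until the pair stops changing.

import numpy as np

def phi(F, prev):
    # bipolar threshold : 1 if f > 0, -1 if f < 0, previous value if f == 0
    return np.where(F > 0, 1, np.where(F < 0, -1, prev))

def bam_matrix(X, Y):
    return sum(np.outer(x, y) for x, y in zip(X, Y))

def bam_recall(alpha, M, max_cycles=20):
    # the initial "previous" beta is arbitrary; it only matters if a net input is 0
    beta = phi(alpha @ M, np.ones(M.shape[1], dtype=int))
    for _ in range(max_cycles):
        new_alpha = phi(beta @ M.T, alpha)
        new_beta = phi(new_alpha @ M, beta)
        if np.array_equal(new_alpha, alpha) and np.array_equal(new_beta, beta):
            break                      # bidirectional stable state reached
        alpha, beta = new_alpha, new_beta
    return alpha, beta

X = [np.array([ 1, -1,  1, -1]), np.array([-1,  1,  1,  1])]
Y = [np.array([ 1,  1, -1]),     np.array([-1,  1,  1])]
M = bam_matrix(X, Y)
print(bam_recall(X[0], M))             # returns (X[0], Y[0]) for these pairs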
• Addition and Deletion of Pattern Pairs

Given the stored pattern pairs (X1 , Y1) , . . . , (Xn , Yn) , a new pair (X' , Y') is added to memory by adding its outer product to the correlation matrix :

Mnew = X1^T Y1 + X2^T Y2 + . . . . + Xn^T Yn + X'^T Y'

Similarly, if a stored pair (Xj , Yj) is to be deleted from the correlation matrix M , then the new correlation matrix Mnew is given by

Mnew = M - ( Xj^T Yj )
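In code, adding or deleting a pair is just adding or subtracting its outer product (a sketch with arbitrary bipolar vectors) :

import numpy as np

X = [np.array([ 1, -1,  1, -1]), np.array([-1,  1,  1,  1])]
Y = [np.array([ 1,  1, -1]),     np.array([-1,  1,  1])]
M = sum(np.outer(x, y) for x, y in zip(X, Y))

# add a new pair (X', Y')
X_new, Y_new = np.array([-1, -1, 1, 1]), np.array([1, -1, 1])
M_added = M + np.outer(X_new, Y_new)

# delete the stored pair (Xj, Yj), here j = 0
M_deleted = M_added - np.outer(X[0], Y[0])
print(M_added)
print(M_deleted)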
Energy Function for BAM

Note : A system that changes with time is a dynamic system. There are two types of dynamics in a neural network : during the training phase it iteratively updates its weights, and during the production (recall) phase it asymptotically converges to the solution patterns. The state is a collection of qualitative and quantitative items that characterize the system, e.g., weights and data flows. The energy function (or Lyapunov function) is a bounded function of the system state that decreases with time; the system solution corresponds to the minimum of this energy.
Let a pair (A , B) define the state of a BAM. To store a pattern, the value of the energy function for that pattern has to occupy a local minimum point in the energy landscape. The energy of a state (A , B) is

E (A , B) = - A M B^T

For the particular case A = B, it reduces to the Hopfield auto-associative energy function.

We wish to retrieve the nearest of the stored (Ai , Bi) pairs when any pair ( α , β ) is presented as the initial condition to the BAM. The neurons change their states until a bidirectional stable state ( Af , Bf ) is reached. Kosko has shown that such a stable state is reached for any matrix M and that it corresponds to a local minimum of the energy function; each cycle of decoding lowers the energy E , where the energy of any point ( α , β ) is given by

E = - α M β^T

If the energy E = - Ai M Bi^T evaluated at the coordinates of the pair (Ai , Bi) does not constitute a local minimum, then the pair cannot be recalled, even though one starts with α = Ai. Thus Kosko's encoding method does not ensure that the stored pairs are at a local minimum.
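A small sketch of the energy E (A , B) = - A M B^T on the same made-up pairs used in the earlier sketches : the stored pairs sit at low energy, and flipping a single component of B raises it.

import numpy as np

X = [np.array([ 1, -1,  1, -1]), np.array([-1,  1,  1,  1])]
Y = [np.array([ 1,  1, -1]),     np.array([-1,  1,  1])]
M = sum(np.outer(x, y) for x, y in zip(X, Y))

def energy(A, B, M):
    return -int(A @ M @ B)

print(energy(X[0], Y[0], M))           # -14
print(energy(X[1], Y[1], M))           # -14

B_flip = Y[0].copy()
B_flip[0] *= -1                        # flip one component of Y1
print(energy(X[0], B_flip, M))         # -2 : higher energy away from the stored pair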
• Example 1 : Kosko's BAM for Retrieval of an Associated Pair

The working of Kosko's BAM for retrieval of an associated pair : start with X3 and hope to retrieve the associated pair Y3.

Consider the pattern pairs

A1 = ( 1 0 0 0 0 1 )   B1 = ( 1 1 0 0 0 )
A2 = ( 0 1 1 0 0 0 )   B2 = ( 1 0 1 0 0 )
A3 = ( 0 0 1 0 1 1 )   B3 = ( 0 1 1 1 0 )

and their bipolar forms (0 replaced by -1)

X1 = ( 1 -1 -1 -1 -1 1 )    Y1 = ( 1 1 -1 -1 -1 )
X2 = ( -1 1 1 -1 -1 -1 )    Y2 = ( 1 -1 1 -1 -1 )
X3 = ( -1 -1 1 -1 1 1 )     Y3 = ( -1 1 1 1 -1 )

The correlation matrix is

M = X1^T Y1 + X2^T Y2 + X3^T Y3 =    1   1  -3  -1   1
                                     1  -3   1  -1   1
                                    -1  -1   3   1  -1
                                    -1  -1  -1   1   3
                                    -3   1   1   3   1
                                    -1   3  -1   1  -1

Suppose we start with α = X3, and we hope to retrieve the associated pair Y3. The calculations for the retrieval of Y3 yield :

α M = ( -1 -1 1 -1 1 1 ) M = ( -6 6 6 6 -6 )
φ ( α M ) = β' = ( -1 1 1 1 -1 )
β' M^T = ( -5 -5 5 -3 7 5 )
φ ( β' M^T ) = ( -1 -1 1 -1 1 1 ) = α'
α' M = ( -1 -1 1 -1 1 1 ) M = ( -6 6 6 6 -6 )
φ ( α' M ) = β'' = ( -1 1 1 1 -1 ) = β'

A bidirectional stable state has been reached : αF = α' = X3 and βF = β' = ( -1 1 1 1 -1 ) = Y3 , so the associated pair (X3 , Y3) is correctly recalled.
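The retrieval above can be reproduced with a few lines of numpy (a sketch using exactly the bipolar pairs of Example 1) :

import numpy as np

X = np.array([[ 1, -1, -1, -1, -1,  1],
              [-1,  1,  1, -1, -1, -1],
              [-1, -1,  1, -1,  1,  1]])
Y = np.array([[ 1,  1, -1, -1, -1],
              [ 1, -1,  1, -1, -1],
              [-1,  1,  1,  1, -1]])
M = sum(np.outer(x, y) for x, y in zip(X, Y))

def phi(F, prev):
    return np.where(F > 0, 1, np.where(F < 0, -1, prev))

alpha = X[2]                                   # start with X3
print(alpha @ M)                               # [-6  6  6  6 -6]
beta = phi(alpha @ M, np.ones(5, dtype=int))
print(beta)                                    # [-1  1  1  1 -1] = Y3
print(phi(beta @ M.T, alpha))                  # [-1 -1  1 -1  1  1] = X3 (stable)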
• Example 2 : Incorrect Recall by Kosko's BAM

Consider the pattern pairs

A1 = ( 1 0 0 1 1 1 0 0 0 )   B1 = ( 1 1 1 0 0 0 0 1 0 )
A2 = ( 0 1 1 1 0 0 1 1 1 )   B2 = ( 1 0 0 0 0 0 0 0 1 )
A3 = ( 1 0 1 0 1 1 0 1 1 )   B3 = ( 0 1 0 1 0 0 1 0 1 )

and their bipolar forms

X1 = ( 1 -1 -1 1 1 1 -1 -1 -1 )   Y1 = ( 1 1 1 -1 -1 -1 -1 1 -1 )
X2 = ( -1 1 1 1 -1 -1 1 1 1 )     Y2 = ( 1 -1 -1 -1 -1 -1 -1 -1 1 )
X3 = ( 1 -1 1 -1 1 1 -1 1 1 )     Y3 = ( -1 1 -1 1 -1 -1 1 -1 1 )

The correlation matrix is

M = X1^T Y1 + X2^T Y2 + X3^T Y3 =   -1   3   1   1  -1  -1   1   1  -1
                                     1  -3  -1  -1   1   1  -1  -1   1
                                    -1  -1  -3   1  -1  -1   1  -3   3
                                     3  -1   1  -3  -1  -1  -3   1  -1
                                    -1   3   1   1  -1  -1   1   1  -1
                                    -1   3   1   1  -1  -1   1   1  -1
                                     1  -3  -1  -1   1   1  -1  -1   1
                                    -1  -1  -3   1  -1  -1   1  -3   3
                                    -1  -1  -3   1  -1  -1   1  -3   3
Let us start with α = X2 and hope to retrieve the associated pair Y2. The calculations yield :

α M = ( 5 -19 -13 -5 1 1 -5 -13 13 )
φ ( α M ) = ( 1 -1 -1 -1 1 1 -1 -1 1 ) = β'
β' M^T = ( -11 11 5 5 -11 -11 11 5 5 )
φ ( β' M^T ) = ( -1 1 1 1 -1 -1 1 1 1 ) = α'
α' M = ( 5 -19 -13 -5 1 1 -5 -13 13 )
φ ( α' M ) = ( 1 -1 -1 -1 1 1 -1 -1 1 ) = β'' = β'

αF = α' = ( -1 1 1 1 -1 -1 1 1 1 ) = X2
βF = β' = ( 1 -1 -1 -1 1 1 -1 -1 1 ) ≠ Y2

But β' is not Y2. Thus the vector pair (X2 , Y2) is not recalled correctly by Kosko's decoding process.
Check with the energy function : the energy at the stored pair is E (X2 , Y2) = - X2 M Y2^T = -71. Consider the point Y2' one Hamming distance away from Y2, obtained by changing its fifth component from -1 to 1 :

Y2' = ( 1 -1 -1 -1 1 -1 -1 -1 1 )

Its energy E (X2 , Y2') = - X2 M Y2'^T = -73 is lower than E (X2 , Y2) , which confirms that (X2 , Y2) does not lie at a local minimum of the energy function and so its recall cannot be guaranteed.
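The two energy values can be verified directly (a sketch using the Example 2 pairs) :

import numpy as np

X = np.array([[ 1, -1, -1,  1,  1,  1, -1, -1, -1],
              [-1,  1,  1,  1, -1, -1,  1,  1,  1],
              [ 1, -1,  1, -1,  1,  1, -1,  1,  1]])
Y = np.array([[ 1,  1,  1, -1, -1, -1, -1,  1, -1],
              [ 1, -1, -1, -1, -1, -1, -1, -1,  1],
              [-1,  1, -1,  1, -1, -1,  1, -1,  1]])
M = sum(np.outer(x, y) for x, y in zip(X, Y))

def energy(A, B):
    return -int(A @ M @ B)

Y2_prime = Y[1].copy()
Y2_prime[4] = 1                        # flip the fifth component of Y2
print(energy(X[1], Y[1]))              # -71
print(energy(X[1], Y2_prime))          # -73 : lower, so (X2, Y2) is not a local minimum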
Multiple Training Encoding Strategy

Kosko's BAM, with the correlation matrix M = Σ Xi^T Yi , proceeds to retrieve the nearest pair given any pair ( α , β ) with the help of the recall equations. However, Kosko's encoding method does not ensure that the stored pairs are at a local minimum and hence may result in incorrect recall, as seen above.

The multiple training encoding strategy augments the correlation matrix so that a chosen pair (Ai , Bi) can be recalled. To recall the pair using multiple training of order q, the matrix M is augmented with

P = ( q - 1 ) Xi^T Yi

where (Xi , Yi) are the bipolar form of (Ai , Bi). The new value of the energy function E evaluated at (Ai , Bi) then becomes

E' (Ai , Bi) = - Ai M Bi^T - ( q - 1 ) Ai Xi^T Yi Bi^T
For the pair (X2 , Y2) of the example above,

X2 = ( -1 1 1 1 -1 -1 1 1 1 )   Y2 = ( 1 -1 -1 -1 -1 -1 -1 -1 1 )

choose q = 2, so that P = X2^T Y2. The augmented correlation matrix becomes

M = X1^T Y1 + 2 X2^T Y2 + X3^T Y3 =   -2   4   2   2   0   0   2   2  -2
                                       2  -4  -2  -2   0   0  -2  -2   2
                                       0  -2  -4   0  -2  -2   0  -4   4
                                       4  -2   0  -4  -2  -2  -4   0   0
                                      -2   4   2   2   0   0   2   2  -2
                                      -2   4   2   2   0   0   2   2  -2
                                       2  -4  -2  -2   0   0  -2  -2   2
                                       0  -2  -4   0  -2  -2   0  -4   4
                                       0  -2  -4   0  -2  -2   0  -4   4
Starting again with α = X2 , the calculations for the retrieval of Y2 now yield :

α M = ( 14 -28 -22 -14 -8 -8 -14 -22 22 )
φ ( α M ) = ( 1 -1 -1 -1 -1 -1 -1 -1 1 ) = β'
β' M^T = ( -16 16 18 18 -16 -16 16 18 18 )
φ ( β' M^T ) = ( -1 1 1 1 -1 -1 1 1 1 ) = α'
α' M = ( 14 -28 -22 -14 -8 -8 -14 -22 22 )
φ ( α' M ) = ( 1 -1 -1 -1 -1 -1 -1 -1 1 ) = β'' = β'

αF = α' = ( -1 1 1 1 -1 -1 1 1 1 ) = X2
βF = β' = ( 1 -1 -1 -1 -1 -1 -1 -1 1 ) = Y2

With the augmented matrix, the pair (X2 , Y2) is correctly recalled.
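A sketch reproducing this multiple-training recall : weight the second pair with q2 = 2 when building M and check that (X2 , Y2) is now a bidirectional stable state.

import numpy as np

X = np.array([[ 1, -1, -1,  1,  1,  1, -1, -1, -1],
              [-1,  1,  1,  1, -1, -1,  1,  1,  1],
              [ 1, -1,  1, -1,  1,  1, -1,  1,  1]])
Y = np.array([[ 1,  1,  1, -1, -1, -1, -1,  1, -1],
              [ 1, -1, -1, -1, -1, -1, -1, -1,  1],
              [-1,  1, -1,  1, -1, -1,  1, -1,  1]])
q = [1, 2, 1]                                  # multiple training weights
M = sum(qi * np.outer(x, y) for qi, x, y in zip(q, X, Y))

def phi(F, prev):
    return np.where(F > 0, 1, np.where(F < 0, -1, prev))

alpha = X[1]                                   # start with X2
print(alpha @ M)                               # [ 14 -28 -22 -14  -8  -8 -14 -22  22]
beta = phi(alpha @ M, np.ones(9, dtype=int))
print(np.array_equal(beta, Y[1]))              # True : Y2 is retrieved
print(np.array_equal(phi(beta @ M.T, alpha), X[1]))   # True : stable at (X2, Y2)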
Note : The previous slide showed that the pattern pair (X2 , Y2) is correctly recalled using the augmented correlation matrix

M = X1^T Y1 + 2 X2^T Y2 + X3^T Y3

but the same matrix M cannot correctly recall the other pattern pair (X1 , Y1) , as shown below.
X1 = ( 1 -1 -1 1 1 1 -1 -1 -1 )   Y1 = ( 1 1 1 -1 -1 -1 -1 1 -1 )

Start with α = X1 :

α M = ( -6 24 22 6 4 4 6 22 -22 )
φ ( α M ) = ( -1 1 1 1 1 1 1 1 -1 ) = β'
β' M^T = ( 16 -16 -18 -18 16 16 -16 -18 -18 )
φ ( β' M^T ) = ( 1 -1 -1 -1 1 1 -1 -1 -1 ) = α'
α' M = ( -14 28 22 14 8 8 14 22 -22 )
φ ( α' M ) = ( -1 1 1 1 1 1 1 1 -1 ) = β'' = β'

αF = α' = ( 1 -1 -1 -1 1 1 -1 -1 -1 ) ≠ X1
βF = β' = ( -1 1 1 1 1 1 1 1 -1 ) ≠ Y1

Thus, the pattern pair (X1 , Y1) is not correctly recalled using the augmented correlation matrix M.
Now observe in the next slide that all three pairs can be correctly recalled.
With multiple training weights qi chosen suitably for every pair, all three pattern pairs become bidirectionally stable and are correctly recalled (a small search for such weights is sketched below) :

Recall of pattern pair (X1 , Y1) :
αF = ( 1 -1 -1 1 1 1 -1 -1 -1 ) = X1
βF = ( 1 1 1 -1 -1 -1 -1 1 -1 ) = Y1

Recall of pattern pair (X2 , Y2) :
αF = ( -1 1 1 1 -1 -1 1 1 1 ) = X2
βF = ( 1 -1 -1 -1 -1 -1 -1 -1 1 ) = Y2

Recall of pattern pair (X3 , Y3) :
αF = ( 1 -1 1 -1 1 1 -1 1 1 ) = X3
βF = ( -1 1 -1 1 -1 -1 1 -1 1 ) = Y3
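The weights that achieve this are not listed here, but they can be found by a small search. The sketch below tries small integer weights q = (q1 , q2 , q3) until every pair (Xi , Yi) is a bidirectional stable state of M = Σ qi Xi^T Yi ; the weights it prints are simply the first ones the search hits, not necessarily those used in the original notes.

import numpy as np
from itertools import product

X = np.array([[ 1, -1, -1,  1,  1,  1, -1, -1, -1],
              [-1,  1,  1,  1, -1, -1,  1,  1,  1],
              [ 1, -1,  1, -1,  1,  1, -1,  1,  1]])
Y = np.array([[ 1,  1,  1, -1, -1, -1, -1,  1, -1],
              [ 1, -1, -1, -1, -1, -1, -1, -1,  1],
              [-1,  1, -1,  1, -1, -1,  1, -1,  1]])

def phi(F, prev):
    return np.where(F > 0, 1, np.where(F < 0, -1, prev))

def all_pairs_stable(q):
    M = sum(qi * np.outer(x, y) for qi, x, y in zip(q, X, Y))
    for x, y in zip(X, Y):
        if not (np.array_equal(phi(x @ M, y), y) and
                np.array_equal(phi(y @ M.T, x), x)):
            return False
    return True

for q in product(range(1, 6), repeat=3):
    if all_pairs_stable(q):
        print("weights recalling all three pairs :", q)   # first solution found : (2, 2, 1)
        break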
• Algorithm (for the Multiple Training Encoding Strategy)

Algorithm Mul_Tr_Encode ( N , Xi , Yi , qi ) where
    N : number of stored pattern pairs
    ( Xi , Yi ) : the bipolar pattern pairs , i = 1 , 2 , . . . , N
    qi : the multiple training weight attached to pair i

Step 1   Initialize the correlation matrix M to the null matrix , M ← [ 0 ]
Step 2   Compute the correlation matrix M as
         For i ← 1 to N
             M ← M ⊕ [ qi ∗ Transpose ( Xi ) ⊗ ( Yi ) ]
         end
         ( symbols : ⊕ matrix addition , ⊗ matrix multiplication , ∗ scalar multiplication )
Step 3   Read input bipolar pattern A
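A sketch of the same steps in Python (the input pattern of Step 3 is passed as an argument instead of being read interactively, and ties in the final threshold are resolved to +1 here) :

import numpy as np

def mul_tr_encode(X_pairs, Y_pairs, q):
    # Step 1 : initialize the correlation matrix M to the null matrix
    M = np.zeros((len(X_pairs[0]), len(Y_pairs[0])), dtype=int)
    # Step 2 : accumulate the weighted outer products qi * Xi^T Yi
    for qi, Xi, Yi in zip(q, X_pairs, Y_pairs):
        M += qi * np.outer(Xi, Yi)
    return M

def recall(A, M):
    # Step 3 onward : one forward pass through M, thresholded to bipolar form
    F = np.asarray(A) @ M
    return np.where(F >= 0, 1, -1)

# usage with the pairs and weights (q2 = 2) from the worked example
X = [np.array([ 1, -1, -1,  1,  1,  1, -1, -1, -1]),
     np.array([-1,  1,  1,  1, -1, -1,  1,  1,  1]),
     np.array([ 1, -1,  1, -1,  1,  1, -1,  1,  1])]
Y = [np.array([ 1,  1,  1, -1, -1, -1, -1,  1, -1]),
     np.array([ 1, -1, -1, -1, -1, -1, -1, -1,  1]),
     np.array([-1,  1, -1,  1, -1, -1,  1, -1,  1])]
M = mul_tr_encode(X, Y, q=[1, 2, 1])
print(recall(X[1], M))                 # recovers Y2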