Memory and learning of sequential patterns by nonmonotone neural networks

M Morita - Neural Networks, 1996 - Elsevier
Conventional neural network models for temporal association generally do not work well in the absence of synchronizing neurons, because their dynamical properties are fundamentally unsuitable for storing sequential patterns, no matter what storage or learning algorithm is used. The present article describes a nonmonotone neural network (NNN) model in which sequential patterns are stored by being embedded in a trajectory attractor of the dynamical system and are recalled stably and smoothly without synchronization: during recall, the network state moves successively along the trajectory. A simple and natural learning algorithm for the NNN is also presented, in which one only has to vary the input pattern gradually and modify the synaptic weights according to a kind of covariance rule; the network state then follows slightly behind the input pattern, and its trajectory develops into an attractor after a small number of repetitions. Copyright © 1996 Elsevier Science Ltd.