On the problem of local minima in backpropagation

M Gori, A Tesi - IEEE Transactions on Pattern Analysis and Machine …, 1992 - cs.cmu.edu
Supervised learning in multilayered neural networks (MLN's) has recently been proposed
through the well-known backpropagation (BP) algorithm. This is a gradient method that can
get stuck in local minima, as simple examples can show. In this paper, some conditions on
the network architecture and the learning environment, which ensure the convergence of the
BP algorithm, are proposed. It is proven in particular that the convergence holds if the
classes are linearly separable. In this case, the experience gained in several experiments …
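The abstract's central observation, that a plain gradient method can converge to a suboptimal stationary point depending on initialization, can be illustrated with a toy sketch (not taken from the paper): gradient descent on a one-dimensional non-convex function with both a local and a global minimum. The function and learning rate below are illustrative choices, not values from Gori and Tesi.

```python
# Toy illustration: gradient descent on f(w) = w^4 - 3w^2 + w, which has
# a shallow local minimum near w ~ 1.13 and a deeper global minimum near
# w ~ -1.30. The outcome depends entirely on the starting point.

def f(w):
    return w**4 - 3 * w**2 + w

def grad(w):
    # Analytic derivative of f
    return 4 * w**3 - 6 * w + 1

def descend(w, lr=0.01, steps=2000):
    # Plain (full-batch) gradient descent with a fixed learning rate
    for _ in range(steps):
        w -= lr * grad(w)
    return w

w_stuck = descend(1.5)     # starts in the basin of the local minimum
w_good  = descend(-1.5)    # starts in the basin of the global minimum
print(w_stuck, w_good)
print(f(w_stuck) > f(w_good))  # the "stuck" run ends at a higher loss
```

Both runs satisfy the first-order optimality condition (the gradient vanishes), yet only one reaches the global minimum; this is the failure mode the paper analyzes, and the conditions it proposes (e.g. linear separability of the classes) rule it out for the BP setting.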