Backpropagation for linearly-separable patterns: A detailed analysis
IEEE International Conference on Neural Networks, 1993•ieeexplore.ieee.org
A sufficient condition for learning without local minima in multilayered networks is proposed. A fundamental assumption on the network architecture is removed. It is proved that the conclusions drawn by M. Gori and A. Tesi (IEEE Trans. Pattern Anal. Mach. Intell., vol. 14, no. 1, pp. 76-86, 1992) also hold provided that the weight matrix associated with the hidden and output layers is pyramidal and has full rank. The analysis is carried out using least mean squares (LMS)-threshold cost functions, which allow the identification of spurious and structural local minima.