Improving the convergence of backpropagation by opposite transfer functions
M Ventresca, HR Tizhoosh - The 2006 IEEE International Joint …, 2006 - ieeexplore.ieee.org
The backpropagation algorithm is a very popular approach to learning in feed-forward multi-layer perceptron networks. However, in many scenarios the time required to adequately learn the task is considerable. Many existing approaches have improved the convergence rate by altering the learning algorithm. We present a simple alternative approach inspired by opposition-based learning that simultaneously considers each network transfer function and its opposite. The effect is an improvement in convergence rate over traditional backpropagation learning with momentum. We use four common benchmark problems to illustrate the improvement in convergence time.
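The abstract does not define what an "opposite" transfer function is. As a hedged sketch only: in the opposition-based learning literature, a common convention takes the opposite of the logistic sigmoid f(x) = 1/(1 + e^(-x)) to be its reflection f̄(x) = f(-x) = 1 - f(x). The snippet below illustrates that identity; the function names and the choice of definition are assumptions, not taken from the paper.

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid, a standard MLP transfer function."""
    return 1.0 / (1.0 + math.exp(-x))

def opposite_sigmoid(x: float) -> float:
    """Hypothetical 'opposite' transfer function: the reflected
    sigmoid, f_bar(x) = f(-x) = 1 - f(x).  This definition is an
    assumption about the paper's construction, not stated in the
    abstract."""
    return 1.0 - sigmoid(x)

# The abstract says the method considers each transfer function and its
# opposite simultaneously during learning; here we only verify the
# reflection identity f_bar(x) == f(-x) numerically.
for x in (-2.0, 0.0, 3.5):
    assert abs(opposite_sigmoid(x) - sigmoid(-x)) < 1e-12
```

Under this convention, swapping a unit's transfer function for its opposite is cheap (it flips the sign of the unit's pre-activation response), which is presumably what makes evaluating both candidates during training practical.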