Advanced convolutional neural network with feedforward inhibition
L Liu, S Yang, D Shi - 2019 International Conference on Machine Learning and Cybernetics, 2019 - ieeexplore.ieee.org
A convolutional neural network is a multi-layer neural network with robust pattern recognition ability. However, when the activation function is the sigmoid, the convolutional neural network suffers from the gradient vanishing problem. This paper first analyzes the gradient vanishing problem; then, based on the balance between excitation and inhibition observed in neuroscience, it proposes using feed-forward inhibition to reduce activation values and remove the scale effect of the weights, so that the model converges faster while retaining its nonlinear fitting ability. The results show that the improved convolutional neural network can effectively relieve the gradient vanishing problem.
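The snippet does not spell out the exact form of the feed-forward inhibition. As a minimal sketch, assuming the inhibition amounts to subtracting the layer's mean pre-activation (reducing the activation level) and dividing by its standard deviation (removing the weight scale) before the sigmoid, the effect on the sigmoid gradient can be illustrated as follows; the function `ff_inhibit` and its parameters are hypothetical, not the paper's formulation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # Derivative of the sigmoid: s * (1 - s); peaks at 0.25 when x = 0
    # and decays toward 0 as |x| grows, which is the vanishing regime.
    s = sigmoid(x)
    return s * (1.0 - s)

def ff_inhibit(z, eps=1e-12):
    # Hypothetical feed-forward inhibition: subtracting the mean reduces
    # the activation value, and dividing by the standard deviation removes
    # the overall scale of the weights. eps guards against division by zero.
    return (z - z.mean()) / (z.std() + eps)

rng = np.random.default_rng(0)
# Large pre-activations drive the sigmoid into saturation.
z = rng.normal(5.0, 1.0, size=10_000)

g_plain = sigmoid_grad(z).mean()              # tiny: sigmoid is saturated
g_inhib = sigmoid_grad(ff_inhibit(z)).mean()  # sigmoid stays in its sensitive region

# Scaling the weights scales z by the same factor, and the divisive
# term cancels it, so the inhibited pre-activations are unchanged.
scale_free = np.allclose(ff_inhibit(3.0 * z), ff_inhibit(z))
```

Under this reading, the inhibition keeps the sigmoid near its high-gradient region and makes the pre-activations invariant to a uniform rescaling of the weights, which is consistent with the abstract's claims about faster convergence and removing the scale effect.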