Online efficient learning with quantized KLMS and L1 regularization

B Chen, S Zhao, S Seth… - The 2012 International Joint Conference on Neural Networks (IJCNN), 2012 - ieeexplore.ieee.org
In a recent work, we proposed the quantized kernel least mean square (QKLMS) algorithm, which is quite effective for sequentially learning a nonlinear mapping online with a slowly growing radial basis function (RBF) structure. In this paper, in order to further reduce the network size, we propose a sparse QKLMS algorithm, which is derived by adding a sparsity-inducing l1-norm penalty on the coefficients to the squared error cost. Simulation examples show that the new algorithm works efficiently and yields a much sparser network while preserving a desirable performance.
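The combination described in the abstract can be sketched as follows: an online kernel LMS filter whose dictionary growth is limited by quantization (a new input within radius epsilon of an existing center updates that center's coefficient instead of adding a node), with an l1 penalty realized as a per-step soft-threshold shrink that drives small coefficients to zero so their centers can be pruned. This is a minimal illustrative sketch, not the paper's implementation; all parameter names (eta, epsilon, lam, sigma) are assumptions, not the authors' notation.

```python
import numpy as np

def gaussian_kernel(x, c, sigma):
    # Gaussian RBF kernel between input x and center c
    return np.exp(-np.sum((x - c) ** 2) / (2.0 * sigma ** 2))

class SparseQKLMS:
    """Illustrative sketch of QKLMS with an l1 (soft-threshold) shrink on the
    coefficients. Hyperparameter names are hypothetical, not from the paper."""

    def __init__(self, eta=0.5, epsilon=0.5, lam=1e-3, sigma=1.0):
        self.eta = eta          # LMS step size
        self.epsilon = epsilon  # quantization radius
        self.lam = lam          # l1 penalty weight
        self.sigma = sigma      # RBF kernel width
        self.centers = []       # dictionary of RBF centers
        self.alphas = []        # corresponding coefficients

    def predict(self, x):
        return sum(a * gaussian_kernel(x, c, self.sigma)
                   for a, c in zip(self.alphas, self.centers))

    def update(self, x, d):
        x = np.asarray(x, dtype=float)
        e = d - self.predict(x)  # prediction error on the new sample
        if self.centers:
            dists = [np.linalg.norm(x - c) for c in self.centers]
            j = int(np.argmin(dists))
        if self.centers and dists[j] <= self.epsilon:
            # Quantization: merge the update into the nearest existing center
            self.alphas[j] += self.eta * e
        else:
            # Otherwise grow the network by one RBF unit
            self.centers.append(x)
            self.alphas.append(self.eta * e)
        # l1 step: soft-threshold all coefficients, then prune zeroed units
        thr = self.eta * self.lam
        self.alphas = [np.sign(a) * max(abs(a) - thr, 0.0) for a in self.alphas]
        keep = [i for i, a in enumerate(self.alphas) if a != 0.0]
        self.centers = [self.centers[i] for i in keep]
        self.alphas = [self.alphas[i] for i in keep]
        return e

# Usage: learn y = sin(x) from a stream of noisy-free samples
rng = np.random.default_rng(0)
model = SparseQKLMS(eta=0.5, epsilon=0.5, lam=1e-3, sigma=1.0)
for _ in range(500):
    x = rng.uniform(-3.0, 3.0, size=1)
    model.update(x, np.sin(x[0]))
```

The soft-threshold step is the standard proximal operator of the l1 norm; centers whose coefficients are shrunk exactly to zero are dropped, which is how the penalty produces a sparser network on top of the quantization.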