Coefficient Structure of Kernel Perceptrons and Support Vector Reduction
International Work-Conference on the Interplay Between Natural and Artificial …, 2007, Springer
Abstract
Support Vector Machines (SVMs) with few support vectors are quite desirable, as they can be applied quickly to new, unseen patterns. In this work we shall study the coefficient structure of the dual representation of SVMs constructed for nonlinearly separable problems through kernel perceptron training. We shall relate these coefficients to the margin of their support vectors (SVs) and to the number of iterations in which these SVs take part. These considerations lead to a remove-and-retrain procedure for building SVMs with a small number of SVs, in which SVs with suitably small or large coefficients are taken out of the training sample. Besides providing a significant SV reduction, our method's computational cost is comparable to that of a single SVM training.
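The sketch below illustrates the two ideas the abstract refers to: a kernel perceptron trained in its dual form, where each coefficient counts the iterations (mistakes) in which a pattern takes part, and a remove-and-retrain step that drops SVs with extreme coefficients before retraining. It is a minimal illustration under stated assumptions, not the authors' implementation: the RBF kernel, the quantile thresholds low_q/high_q, the number of epochs, and all function names are hypothetical choices made for the example.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=0.5):
    """Gaussian RBF kernel matrix between the rows of X1 and X2 (assumed kernel)."""
    sq = (
        np.sum(X1 ** 2, axis=1)[:, None]
        + np.sum(X2 ** 2, axis=1)[None, :]
        - 2.0 * X1 @ X2.T
    )
    return np.exp(-gamma * sq)

def kernel_perceptron(K, y, epochs=10):
    """Train a kernel perceptron in dual form.

    K: precomputed kernel matrix over the training sample.
    y: labels in {-1, +1}.
    Returns the dual coefficients alpha; alpha[i] counts how many
    mistaken iterations pattern i took part in (its SVs are alpha > 0).
    """
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(epochs):
        for i in range(n):
            # f(x_i) = sum_j alpha_j * y_j * K(x_j, x_i)
            f_i = np.sum(alpha * y * K[:, i])
            if y[i] * f_i <= 0:        # mistake: update the dual coefficient
                alpha[i] += 1.0
    return alpha

def remove_and_retrain(X, y, low_q=0.05, high_q=0.95, epochs=10):
    """Hypothetical remove-and-retrain sketch: drop SVs whose coefficients
    fall below the low or above the high quantile, then retrain on the
    reduced sample. The quantile thresholds are illustrative only."""
    alpha = kernel_perceptron(rbf_kernel(X, X), y, epochs)
    sv = alpha > 0
    lo, hi = np.quantile(alpha[sv], [low_q, high_q])
    keep = ~sv | ((alpha >= lo) & (alpha <= hi))
    X_red, y_red = X[keep], y[keep]
    alpha_red = kernel_perceptron(rbf_kernel(X_red, X_red), y_red, epochs)
    return X_red, y_red, alpha_red
```

In this toy version the retained model is evaluated as f(x) = sum_i alpha_i y_i K(x_i, x) over the reduced sample, so removing patterns directly shrinks the cost of applying the classifier to new inputs, which is the motivation stated in the abstract.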