A new maximum margin algorithm for one-class problems and its boosting implementation
Q Tao, G Wu, J Wang - Pattern Recognition, 2005 - Elsevier
In this paper, each one-class problem is regarded as trying to estimate a function that is positive on a desired slab and negative on the complement. The main advantage of this viewpoint is that the loss function and the expected risk can be defined to ensure that the slab can contain as many samples as possible. Inspired by the nature of SVMs, the intuitive margin is also defined. As a result, a new linear optimization problem to maximize the margin and some theoretically motivated learning algorithms are obtained. Moreover, the proposed algorithms can be implemented by boosting techniques to solve nonlinear one-class classifications.
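The abstract describes maximizing a margin for one-class data via a linear optimization problem. The sketch below is not the paper's exact slab formulation; it is a minimal linear-programming one-class learner in the same spirit, which maximizes the functional margin `rho` by which training samples are separated from the origin under an L-infinity bound on the weights. The function name `fit_one_class_lp` and the specific constraints are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch (assumed formulation, not the authors' exact problem):
# maximize rho  subject to  w . x_i >= rho for all i,  -1 <= w_j <= 1.
import numpy as np
from scipy.optimize import linprog

def fit_one_class_lp(X):
    """Fit a linear one-class decision rule by LP.

    Returns (w, rho); a point x is accepted when w . x >= rho.
    """
    n, d = X.shape
    # Decision variables: [w_1 .. w_d, rho]; linprog minimizes, so c = -rho.
    c = np.zeros(d + 1)
    c[-1] = -1.0
    # Constraint  w . x_i - rho >= 0  rewritten as  -(w . x_i) + rho <= 0.
    A_ub = np.hstack([-X, np.ones((n, 1))])
    b_ub = np.zeros(n)
    bounds = [(-1.0, 1.0)] * d + [(None, None)]  # weights bounded, rho free
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    w, rho = res.x[:-1], res.x[-1]
    return w, rho

# Usage: a cluster of points away from the origin is separated with rho > 0.
rng = np.random.default_rng(0)
X = rng.normal(loc=3.0, scale=0.3, size=(50, 2))
w, rho = fit_one_class_lp(X)
accepted = X @ w >= rho - 1e-9
```

As in the paper's viewpoint, the learned rule is positive on the region containing the samples and negative on its complement; the paper further combines such linear rules via boosting to handle nonlinear one-class problems.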