Confidence Classifiers with Guaranteed Accuracy or Precision
Proceedings of the Twelfth Symposium on Conformal
and Probabilistic Prediction with Applications, PMLR 204:513-533, 2023.
Abstract
In many situations, probabilistic predictors have
replaced conformal classifiers. The main reason is
arguably that the set predictions of conformal
classifiers, with the accompanying significance
level, are hard to interpret. In this paper, we
demonstrate how conformal classification can be used
as a basis for a classifier with reject
option. Specifically, we introduce and evaluate two
algorithms that are able to exactly estimate the
accuracy or precision obtained for a set of test
instances in a classifier-with-reject scenario. In the
empirical investigation, the suggested algorithms
are shown to clearly outperform both calibrated and
uncalibrated probabilistic predictors.
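
As a rough illustration of the general idea (not the
paper's two algorithms), the following Python sketch
shows how a standard inductive conformal classifier
can be turned into a classifier with reject option: a
test instance receives a prediction only when its
prediction set at significance level epsilon is a
singleton, and is rejected otherwise. The
nonconformity measure (one minus the predicted
probability of a label) and all names here are
illustrative assumptions, not taken from the paper.

import numpy as np

def calibration_scores(probs_cal, y_cal):
    # Nonconformity score: one minus the predicted
    # probability of the true class.
    return 1.0 - probs_cal[np.arange(len(y_cal)), y_cal]

def predict_or_reject(probs_test, cal_scores, epsilon=0.05):
    # Predict only when the conformal prediction set at
    # significance level epsilon is a singleton; reject
    # (return None) otherwise.
    n = len(cal_scores)
    results = []
    for p in probs_test:
        scores = 1.0 - p  # nonconformity of each candidate label
        # Conformal p-value per label, computed against
        # the calibration scores.
        pvals = [(np.sum(cal_scores >= s) + 1) / (n + 1)
                 for s in scores]
        region = [label for label, pv in enumerate(pvals)
                  if pv > epsilon]
        results.append(region[0] if len(region) == 1 else None)
    return results

Given held-out calibration data (probs_cal, y_cal)
from any probabilistic model, cal_scores =
calibration_scores(probs_cal, y_cal) followed by
predict_or_reject(probs_test, cal_scores) then yields
a prediction or a rejection for each test instance.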