EAT: Towards Long-Tailed Out-of-Distribution Detection
DOI: https://doi.org/10.1609/aaai.v38i14.29508
Keywords: ML: Semi-Supervised Learning, ML: Multi-class/Multi-label Learning & Extreme Classification
Abstract
Despite recent advancements in out-of-distribution (OOD) detection, most current studies assume a class-balanced in-distribution training dataset, which is rarely the case in real-world scenarios. This paper addresses the challenging task of long-tailed OOD detection, where the in-distribution data follows a long-tailed class distribution. The main difficulty lies in distinguishing OOD data from samples belonging to the tail classes, as the ability of a classifier to detect OOD instances is not strongly correlated with its accuracy on the in-distribution classes. To overcome this issue, we propose two simple ideas: (1) Expanding the in-distribution class space by introducing multiple abstention classes. This approach allows us to build a detector with clear decision boundaries by training on OOD data using virtual labels. (2) Augmenting the context-limited tail classes by overlaying images onto the context-rich OOD data. This technique encourages the model to pay more attention to the discriminative features of the tail classes. We provide a clue for separating in-distribution and OOD data by analyzing gradient noise. Through extensive experiments, we demonstrate that our method outperforms the current state-of-the-art on various benchmark datasets. Moreover, our method can be used as an add-on for existing long-tail learning approaches, significantly enhancing their OOD detection performance. Code is available at: https://github.com/Stomach-ache/Long-Tailed-OOD-Detection.
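To make the two ideas in the abstract concrete, here is a minimal PyTorch-style sketch. It is not the authors' released implementation (see the repository linked above); the function names `abstention_loss` and `overlay_tail_on_ood`, the uniform virtual-label assignment, and the square-patch placement are illustrative assumptions.

```python
import torch
import torch.nn.functional as F


def abstention_loss(logits, labels, is_ood, num_id_classes, num_abstention):
    """Cross-entropy over an expanded label space of
    num_id_classes + num_abstention outputs: in-distribution samples keep
    their real labels, while OOD samples receive a virtual label drawn
    uniformly from the abstention classes (one plausible assignment scheme,
    not necessarily the paper's exact one)."""
    virtual = num_id_classes + torch.randint(
        num_abstention, labels.shape, device=labels.device)
    targets = torch.where(is_ood, virtual, labels)
    return F.cross_entropy(logits, targets)


def overlay_tail_on_ood(tail_imgs, ood_imgs, region_frac=0.5):
    """CutMix-style augmentation: paste a square region of each tail-class
    image onto a context-rich OOD image, so the background context comes
    from OOD data while the pasted region carries the tail-class features.
    The region size and placement here are illustrative choices."""
    _, _, h, w = tail_imgs.shape
    ph, pw = int(h * region_frac), int(w * region_frac)
    top = torch.randint(0, h - ph + 1, (1,)).item()
    left = torch.randint(0, w - pw + 1, (1,)).item()
    mixed = ood_imgs.clone()
    mixed[:, :, top:top + ph, left:left + pw] = \
        tail_imgs[:, :, top:top + ph, left:left + pw]
    return mixed
```

In such a setup, the overlay would be applied to tail-class batches using auxiliary OOD images as backgrounds, and the abstention loss would then be computed on the combined in-distribution and OOD batch.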
Published: 2024-03-24
How to Cite
Wei, T., Wang, B.-L., & Zhang, M.-L. (2024). EAT: Towards Long-Tailed Out-of-Distribution Detection. Proceedings of the AAAI Conference on Artificial Intelligence, 38(14), 15787-15795. https://doi.org/10.1609/aaai.v38i14.29508
Section: AAAI Technical Track on Machine Learning V