Generative Max-Mahalanobis Classifiers for Image Classification, Generation and More
Machine Learning and Knowledge Discovery in Databases: Research Track, 2021, Springer
Abstract
Joint Energy-based Model (JEM) of [11] shows that a standard softmax classifier can be reinterpreted as an energy-based model (EBM) for the joint distribution p(x, y); the resulting model can be optimized to improve calibration, robustness and out-of-distribution detection, while generating samples rivaling the quality of recent GAN-based approaches. However, the softmax classifier that JEM exploits is inherently discriminative and its latent feature space is not well formulated as a probabilistic distribution, which may hinder its potential for image generation and incur training instability. We hypothesize that generative classifiers, such as Linear Discriminant Analysis (LDA), might be more suitable for image generation since generative classifiers model the data generation process explicitly. This paper therefore investigates an LDA classifier for image classification and generation. In particular, the Max-Mahalanobis Classifier (MMC) [30], a special case of LDA, fits our goal very well. We show that our Generative MMC (GMMC) can be trained discriminatively, generatively or jointly for image classification and generation. Extensive experiments on multiple datasets show that GMMC achieves state-of-the-art discriminative and generative performance, while outperforming JEM in calibration, adversarial robustness and out-of-distribution detection by a significant margin. Our source code is available at https://github.com/sndnyang/GMMC .
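To make the MMC idea concrete, the following is a minimal sketch, not the paper's implementation: in an MMC-style classifier the class-conditional feature distributions are Gaussians with fixed means and a shared identity covariance, so logits can be taken as negative squared Mahalanobis (here, Euclidean) distances to the class means. The means and toy features below are illustrative placeholders, not the max-Mahalanobis construction from [30].

```python
import numpy as np

def mmc_logits(features, means):
    """Logits as -0.5 * squared distance from each feature to each class mean.

    With a shared identity covariance, the Mahalanobis distance reduces
    to the Euclidean distance, so the most likely class is the nearest mean.
    features: (batch, d), means: (num_classes, d)
    """
    diffs = features[:, None, :] - means[None, :, :]  # (batch, C, d)
    sq_dist = np.sum(diffs ** 2, axis=-1)             # (batch, C)
    return -0.5 * sq_dist

def mmc_predict(features, means):
    """Predicted class index = nearest class mean."""
    return np.argmax(mmc_logits(features, means), axis=-1)

# Toy example: 3 class means placed symmetrically in a 2-D feature space.
means = np.array([[2.0, 0.0], [-1.0, 1.7], [-1.0, -1.7]])
feats = np.array([[1.9, 0.1], [-1.1, 1.5]])
print(mmc_predict(feats, means))  # nearest means: classes 0 and 1
```

In the full model, `features` would come from a deep network trained so that its outputs cluster around these fixed means, which is what gives the latent space an explicit probabilistic form.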