30 Jun 2021 · In this paper, we propose a novel approach to address this problem. Specifically, we adapt self-supervision and self-distillation to reduce the impact of biases on the model. Self-supervision and self-distillation are generally used for improving the representation learning and generalization ability of a model, and are not typically employed for bias mitigation. This work adapts them to this setting and empirically shows that this approach reduces the impact of biases on the model.
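As a concrete illustration (a minimal sketch, not the authors' exact recipe), self-supervision is commonly added as an auxiliary task such as rotation prediction, trained jointly with the main classifier. The rotation head, the `rotate_batch` helper, and the weight `lam` below are assumptions for illustration:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiasRobustNet(nn.Module):
    """Backbone with a main classification head plus a self-supervised
    rotation-prediction head (a common auxiliary task; the paper's exact
    auxiliary task is an assumption here)."""
    def __init__(self, backbone, feat_dim, num_classes):
        super().__init__()
        self.backbone = backbone                  # any feature extractor -> (B, feat_dim)
        self.cls_head = nn.Linear(feat_dim, num_classes)
        self.rot_head = nn.Linear(feat_dim, 4)    # 0/90/180/270 degrees

    def forward(self, x):
        f = self.backbone(x)
        return self.cls_head(f), self.rot_head(f)

def rotate_batch(x):
    """Create four rotated copies of an NCHW batch with rotation labels 0-3."""
    rots = [torch.rot90(x, k, dims=(2, 3)) for k in range(4)]
    labels = torch.arange(4).repeat_interleave(x.size(0))
    return torch.cat(rots, dim=0), labels

def training_loss(model, x, y, lam=1.0):
    # Supervised cross-entropy on the original images.
    logits, _ = model(x)
    loss_sup = F.cross_entropy(logits, y)
    # Self-supervised rotation-prediction loss on the rotated copies.
    xr, yr = rotate_batch(x)
    _, rot_logits = model(xr)
    loss_ssl = F.cross_entropy(rot_logits, yr.to(x.device))
    return loss_sup + lam * loss_ssl  # lam is an illustrative weight
```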
For self-distillation, we use κ = 4. For a fair comparison, we use the same training settings for all the methods as described in [5].
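Assuming κ denotes the distillation temperature, a standard softened-KL self-distillation loss (Hinton-style knowledge distillation where the teacher is a frozen copy of the same model) would look like the following sketch:

```python
import torch.nn.functional as F

def self_distillation_loss(student_logits, teacher_logits, kappa=4.0):
    """Softened-KL distillation loss with temperature kappa (assuming
    kappa in the paper is the distillation temperature). The teacher is
    a frozen copy of the same architecture (self-distillation)."""
    log_p_student = F.log_softmax(student_logits / kappa, dim=1)
    p_teacher = F.softmax(teacher_logits.detach() / kappa, dim=1)
    # kappa**2 rescales gradients to match the hard-label loss magnitude.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * kappa**2
```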
23 May 2023 · The Limited Data Bias Mitigation (LDBM) approach is proposed, which uses self-supervision and self-distillation for bias mitigation.
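Putting the pieces together, LDBM presumably optimizes a weighted sum of the supervised, self-supervised, and self-distillation terms sketched above; the weights `alpha` and `beta` are hypothetical placeholders, not values from the paper:

```python
def ldbm_objective(loss_sup, loss_ssl, loss_sd, alpha=1.0, beta=1.0):
    """Hypothetical combined objective: supervised cross-entropy plus the
    self-supervision and self-distillation terms sketched above. alpha and
    beta are illustrative weights, not reported by the paper."""
    return loss_sup + alpha * loss_ssl + beta * loss_sd
```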
In the scope of fair deep learning, one can use model distillation to produce fairer student models based on an unfair teacher. To that end, most studies add a ...
Fair Visual Recognition in Limited Data Regime using Self-Supervision and Self-Distillation (2022). Pratik Mazumder, Pravendra Singh, Vinay P. Namboodiri.