Attaining Class-Level Forgetting in Pretrained Model Using Few Samples

P Singh, P Mazumder, MA Karim - European Conference on Computer …, 2022 - Springer
Abstract
In order to address real-world problems, deep learning models are jointly trained on many classes. However, in the future, some classes may become restricted due to privacy/ethical concerns, and the restricted class knowledge has to be removed from the models that have been trained on them. The available data may also be limited due to privacy/ethical concerns, and re-training the model will not be possible. We propose a novel approach to address this problem without affecting the model's prediction power for the remaining classes. Our approach identifies the model parameters that are highly relevant to the restricted classes and removes the knowledge regarding the restricted classes from them using the limited available training data. Our approach is significantly faster and performs similarly to a model re-trained on the complete data of the remaining classes.
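The abstract describes a two-step idea: score each model parameter by its relevance to the restricted class, then erase the knowledge held in the most relevant parameters using only a few available samples. The sketch below is a minimal, hypothetical illustration of that idea, not the paper's actual method: it uses a tiny softmax classifier in place of a pretrained deep network, accumulated gradient magnitudes as the assumed relevance score, and re-initialization of the top-k scored weights as the assumed forgetting step.

```python
import numpy as np

# Hypothetical sketch of class-level forgetting via parameter relevance.
# A tiny softmax classifier stands in for a pretrained deep model; the
# relevance score and forgetting step are illustrative assumptions.
rng = np.random.default_rng(0)
n_features, n_classes = 8, 4
W = rng.normal(size=(n_features, n_classes))  # "pretrained" weights

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Only a few samples from the restricted class (here, class 0) are available.
restricted_class = 0
x_few = rng.normal(size=(5, n_features))

# Step 1: score each weight by the accumulated gradient magnitude of the
# restricted-class log-probability with respect to that weight.
relevance = np.zeros_like(W)
onehot = np.eye(n_classes)[restricted_class]
for x in x_few:
    p = softmax(x @ W)                      # predicted class probabilities
    # For softmax regression: d log p[c] / dW = outer(x, onehot(c) - p)
    grad = np.outer(x, onehot - p)
    relevance += np.abs(grad)

# Step 2: "forget" by re-initializing the k most relevant parameters,
# leaving the rest (which serve the remaining classes) untouched.
k = 8
top_idx = np.argsort(relevance.ravel())[-k:]
W_forgotten = W.copy()
W_forgotten.ravel()[top_idx] = rng.normal(size=k) * 0.01
```

In the paper's setting the scoring and removal would operate on a deep network and be followed by a repair step on the remaining classes; this sketch only shows the parameter-selection principle on a linear model.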