Multi-modal meta continual learning

S Gai, Z Chen, D Wang
2021 International Joint Conference on Neural Networks (IJCNN), 2021 - ieeexplore.ieee.org
Recently, meta learning methods that provide multiple initializations have drawn much attention due to their capability of handling multi-modal tasks. However, the modal differences in multi-modal distributions aggravate catastrophic forgetting. In this paper, we augment multi-initial meta learning for continual learning, which alleviates the forgetting issue while allowing beneficial transfer to previous tasks from multi-modal distributions. Specifically, we propose a multi-modal meta continual learning (M3CL) framework, which mitigates forgetting via gradient episodic memory (GEM). The modulation network in M3CL identifies the mode of each image from online multi-modal distributions. In addition, to take advantage of both the modulation network and GEM, we introduce an entropy-based regularization that learns an effective modulation network and penalizes overfitting to a specific task. Experiments on multi-modal datasets confirm that modal differences indeed aggravate catastrophic forgetting, and demonstrate the strong performance of M3CL in comparison with the state of the art.
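To make the two mechanisms named in the abstract concrete, below is a minimal PyTorch sketch of (a) a GEM-style gradient projection that keeps an update from increasing loss on the episodic memory, and (b) one plausible form of the entropy-based regularizer on the modulation network's mode posterior. This is an illustration under stated assumptions, not the paper's implementation: the single-constraint projection shown is the simplification used by A-GEM (full GEM solves a quadratic program with one constraint per past task), and the function names, the `coeff` weight, and the sign convention of the entropy term are hypothetical.

```python
import torch
import torch.nn.functional as F

def flat_grad(loss, params):
    """Flatten the gradient of `loss` w.r.t. `params` into a single vector."""
    grads = torch.autograd.grad(loss, params, retain_graph=True)
    return torch.cat([g.reshape(-1) for g in grads])

def gem_project(g, g_mem):
    """If the current-task gradient g conflicts with the episodic-memory
    gradient g_mem (negative inner product), project g onto the feasible
    half-space so the update does not increase loss on stored examples.
    Note: this is the single-constraint A-GEM simplification; full GEM
    enforces one such constraint per past task via a QP."""
    dot = torch.dot(g, g_mem)
    if dot < 0:  # gradients conflict with the memory gradient
        g = g - (dot / torch.dot(g_mem, g_mem)) * g_mem
    return g

def entropy_penalty(mode_logits, coeff=0.1):
    """One plausible reading of the entropy-based regularization:
    penalize over-confident (low-entropy) mode posteriors so the
    modulation network does not collapse onto a single mode/task."""
    p = F.softmax(mode_logits, dim=-1)
    ent = -(p * p.clamp_min(1e-8).log()).sum(dim=-1).mean()
    return -coeff * ent  # adding this to the loss maximizes entropy

# Example usage (names and shapes are illustrative):
#   g     = flat_grad(current_task_loss, model_params)
#   g_mem = flat_grad(memory_loss, model_params)
#   g_new = gem_project(g, g_mem)        # write g_new back into .grad
#   loss  = current_task_loss + entropy_penalty(mode_logits)
```

The design intuition is that the projection handles forgetting across tasks while the entropy term keeps mode identification soft enough to share information across modes; how the two are balanced in M3CL is specified in the paper itself.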