Less Labels, More Modalities: A Self-Training Framework to Reuse Pretrained Networks
Jean-Christophe Burnel, Sébastien Lefèvre, Luc Courtrai
ICPR Workshops (3) 2022: 287-302. Lecture Notes in Computer Science.

Remote sensing largely benefits from recent advances in deep learning. In this paper, we show how to benefit from additional modalities without requiring additional labels. We propose a self-training framework that allows us to add a modality to a pretrained model in order to improve its performance.
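The abstract names self-training as the mechanism for adding a modality without new labels. A minimal sketch of that general idea, with all names and the toy "models" being illustrative assumptions (the paper's actual architecture and training procedure are not described here): a model pretrained on one modality pseudo-labels unlabeled paired data, and a student that also sees the extra modality is fit on those pseudo-labels.

```python
# Hedged, illustrative sketch of self-training across modalities.
# The function names and toy threshold "models" are hypothetical;
# they stand in for the pretrained network and the multimodal student.

def pretrained_single_modality(x_a):
    """Stand-in for a network pretrained on modality A alone."""
    return 1 if x_a > 0.5 else 0

def pseudo_label(unlabeled_pairs):
    """Label paired (A, B) samples with the single-modality teacher,
    so no human annotation of the new modality is needed."""
    return [((x_a, x_b), pretrained_single_modality(x_a))
            for x_a, x_b in unlabeled_pairs]

def train_multimodal(dataset):
    """Toy student: fuses the two modalities by averaging and fits a
    decision threshold on the pseudo-labels (a real system would
    train a network on both inputs)."""
    pos = [(a + b) / 2 for (a, b), y in dataset if y == 1]
    neg = [(a + b) / 2 for (a, b), y in dataset if y == 0]
    thr = (min(pos) + max(neg)) / 2  # midpoint between the two classes
    return lambda a, b: 1 if (a + b) / 2 > thr else 0

# Unlabeled paired observations of modalities A and B.
pairs = [(0.9, 0.8), (0.7, 0.9), (0.2, 0.1), (0.3, 0.2)]
student = train_multimodal(pseudo_label(pairs))
print(student(0.8, 0.85))  # → 1
```

The key property illustrated is that the teacher never sees modality B; its predictions alone supervise the student that consumes both modalities.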