Utilizing Language-Image Pretraining for Efficient and Robust Bilingual Word Alignment
EMNLP 2022 (Findings) · May 23, 2022 (modified: 16 Apr 2023)
Word translation without parallel corpora has become feasible, rivaling the performance of supervised approaches. This work investigates the potential of using not only visual observations but also pretrained language-image models to enable a more efficient and robust unsupervised word translation (UWT). Specifically, the authors develop a novel UWT method dubbed Word Alignment using Language-Image Pretraining (WALIP), which leverages visual observations via the shared embedding space of images and texts provided by CLIP models. When distant languages are involved, the proposed method demonstrates robustness and outperforms existing unsupervised multilingual word embedding approaches.
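The shared-embedding mechanism can be sketched in a few lines. The snippet below is a minimal, hypothetical illustration of the general idea rather than the paper's exact algorithm: each word is scored by its similarity profile over a common pool of images using OpenAI's CLIP, the image paths are placeholders, and in practice a multilingual CLIP text encoder would be needed for the non-English side.

```python
# Hypothetical sketch of the shared-embedding idea: score a word by its
# similarity profile over a common image pool, then compare profiles
# across languages. Illustrative only; not the paper's exact algorithm.
import torch
import clip  # https://github.com/openai/CLIP
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

# Placeholder paths to a shared pool of reference images.
image_paths = ["img_000.jpg", "img_001.jpg", "img_002.jpg"]
images = torch.stack([preprocess(Image.open(p)) for p in image_paths]).to(device)

with torch.no_grad():
    image_emb = model.encode_image(images)
    image_emb = image_emb / image_emb.norm(dim=-1, keepdim=True)

def image_profile(word: str) -> torch.Tensor:
    """Softmax similarity of one word's text embedding to the image pool."""
    tokens = clip.tokenize([word]).to(device)
    with torch.no_grad():
        text_emb = model.encode_text(tokens)
    text_emb = text_emb / text_emb.norm(dim=-1, keepdim=True)
    return (text_emb @ image_emb.T).softmax(dim=-1).squeeze(0)

# Profiles that agree suggest the words name the same visual concept.
# (A multilingual CLIP text encoder would be needed for the Spanish word.)
agreement = torch.dot(image_profile("dog"), image_profile("perro"))
print(f"profile agreement: {agreement.item():.3f}")
```

Word pairs whose profiles agree could then serve as high-confidence anchors for aligning the two monolingual embedding spaces.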