Jan 9, 2022 · In this work, we study how to effectively leverage BERT in context-aware neural machine translation. We mainly study three approaches to extract ...
In this work, we conduct a study about leveraging BERT to encode the contextual information for NMT, and explore three commonly used methods to aggregate the ...
Abstract: Context-aware neural machine translation (NMT), which aims to translate sentences with the help of contextual information, has attracted much attention ...
Nov 17, 2021 · A key problem for context-aware NMT is to effectively encode and aggregate the contextual information. BERT has been proven to be an effective ...
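The snippets above describe encoding context with BERT and aggregating it into the NMT model. Below is a minimal, hedged sketch of one commonly used aggregation pattern: the current sentence's encoder states attend over context representations from a BERT-like context encoder, and a learned gate mixes the attended context back in. Module names, sizes, and the gating choice are illustrative assumptions, not the papers' exact methods.

```python
# Hedged sketch of attention-based context aggregation (assumed setup, not the
# papers' exact architecture). Source states attend over BERT-style context
# states; a sigmoid gate controls how much context each position absorbs.
import torch
import torch.nn as nn

class ContextAggregator(nn.Module):
    def __init__(self, d_model: int = 512, n_heads: int = 8):
        super().__init__()
        # Cross-attention: source states are queries, context states are keys/values.
        self.ctx_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Learned gate deciding how much context to mix into each source state.
        self.gate = nn.Linear(2 * d_model, d_model)

    def forward(self, src_states: torch.Tensor, ctx_states: torch.Tensor) -> torch.Tensor:
        # src_states: (batch, src_len, d_model) from the NMT encoder
        # ctx_states: (batch, ctx_len, d_model) from the context encoder (e.g. BERT)
        attended, _ = self.ctx_attn(src_states, ctx_states, ctx_states)
        g = torch.sigmoid(self.gate(torch.cat([src_states, attended], dim=-1)))
        return g * attended + (1 - g) * src_states

# Toy usage with random tensors standing in for real encoder outputs.
agg = ContextAggregator()
src = torch.randn(2, 10, 512)   # current-sentence encoder states
ctx = torch.randn(2, 30, 512)   # context-sentence states from a BERT-like encoder
print(agg(src, ctx).shape)      # torch.Size([2, 10, 512])
```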
Oct 21, 2022 · Abstract. This paper addresses the task of contextual translation using multi-segment models. Specifically, we show that increasing model ...
Jun 27, 2024 · This paper further investigates this observation by explicitly modelling context encoding through multi-task learning (MTL) to make the model ...
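The snippet above only says that context encoding is modelled through multi-task learning; the sketch below shows the generic pattern of weighting a main translation loss against an auxiliary context-encoding loss. The auxiliary objective and the weight are hypothetical placeholders.

```python
# Hedged sketch of a generic MTL objective: translation loss plus a weighted
# auxiliary context-encoding loss. aux_weight is an assumed hyperparameter.
import torch

def mtl_loss(translation_loss: torch.Tensor,
             context_loss: torch.Tensor,
             aux_weight: float = 0.3) -> torch.Tensor:
    return translation_loss + aux_weight * context_loss

# Toy usage with scalar stand-ins for the two task losses.
total = mtl_loss(torch.tensor(2.1), torch.tensor(0.8))
print(total)  # tensor(2.3400)
```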
In this work, BERT serves as a context encoder to obtain document-level contextual information, which is then integrated into both the encoder and decoder.
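As a companion to the encoder-side aggregation sketched earlier, here is a hedged sketch of decoder-side integration: a decoder block that attends to the source encoder memory and then to the document-level context memory. The layer structure and names are illustrative assumptions; the feed-forward sublayer and causal masking are omitted for brevity.

```python
# Hedged sketch (assumed architecture) of a decoder block with an extra
# cross-attention over the BERT context memory, in addition to the usual
# self-attention and source cross-attention.
import torch
import torch.nn as nn

class ContextAwareDecoderBlock(nn.Module):
    def __init__(self, d_model: int = 512, n_heads: int = 8):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.src_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ctx_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.norm3 = nn.LayerNorm(d_model)

    def forward(self, tgt, src_mem, ctx_mem):
        # tgt: (batch, tgt_len, d); src_mem: source encoder states;
        # ctx_mem: document-level context states from the BERT context encoder.
        x = self.norm1(tgt + self.self_attn(tgt, tgt, tgt)[0])
        x = self.norm2(x + self.src_attn(x, src_mem, src_mem)[0])
        x = self.norm3(x + self.ctx_attn(x, ctx_mem, ctx_mem)[0])
        return x

# Toy usage with random tensors standing in for real model states.
block = ContextAwareDecoderBlock()
out = block(torch.randn(2, 7, 512), torch.randn(2, 10, 512), torch.randn(2, 30, 512))
print(out.shape)  # torch.Size([2, 7, 512])
```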
Jul 15, 2024 · This study presents a context-aware MT model that explains the translation output by predicting coreference clusters on the source side. The ...
May 21, 2024 · This approach has been used especially for analysing large neural models learned by self-supervision like BERT (Devlin et al., 2019). In ...