We implement and examine a simple approach to continual learning for neural machine translation, exploring tradeoffs between consistency, the model's ability to learn from incoming data, and the time a client would need to wait to obtain a newly trained translation system.
Sep 30, 2024 · This could include new topics that are being discussed, changes to terminology, use of spelling variants, or changes in translator/translation ...
Some Tradeoffs in Continual Learning for Parliamentary Neural Machine Translation Systems. AMTA (1) 2024: 102-118.
Nov 3, 2022 · This paper considers continual learning of a large-scale pretrained neural machine translation model without accessing the previous training data or introducing ...
We present iterative back-translation, a method for generating increasingly better synthetic parallel data from monolingual data to train neural machine translation systems.
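The iterative back-translation loop described above can be sketched as follows. This is a minimal illustration only, using toy word-substitution "models" in place of real NMT systems; all function names and the toy data are assumptions for the sketch, not from the paper.

```python
# Toy sketch of iterative back-translation. Real systems would train
# neural models; here a word-level lookup table stands in for a model.

def train(pairs):
    """'Train' a toy translation model: a word-level lookup table."""
    table = {}
    for src, tgt in pairs:
        for s, t in zip(src.split(), tgt.split()):
            table[s] = t
    return table

def translate(model, sentence):
    """Translate word by word, copying unknown words through."""
    return " ".join(model.get(w, w) for w in sentence.split())

def iterative_back_translation(seed_pairs, mono_src, mono_tgt, rounds=2):
    """Each round, back-translate monolingual data with the current
    reverse-direction model to grow the synthetic parallel corpus."""
    fwd_pairs = list(seed_pairs)                 # (source, target)
    rev_pairs = [(t, s) for s, t in seed_pairs]  # (target, source)
    for _ in range(rounds):
        rev_model = train(rev_pairs)
        # Synthetic (source', target) pairs from target-side monolingual data
        fwd_pairs += [(translate(rev_model, t), t) for t in mono_tgt]
        fwd_model = train(fwd_pairs)
        # Synthetic (target', source) pairs from source-side monolingual data
        rev_pairs += [(translate(fwd_model, s), s) for s in mono_src]
    return train(fwd_pairs)

seed = [("hello world", "bonjour monde")]
model = iterative_back_translation(seed, ["hello friend"], ["bonjour monde"])
print(translate(model, "hello world"))  # → bonjour monde
```

The alternation between directions is the key design point: each model's improvement supplies better synthetic data for training the other direction on the next round.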
Although NMT has become the state-of-the-art approach to automatic translation, there are still a number of challenges that NMT systems face.