Revisiting source context in nearest neighbor machine translation

X Li, P Li, P Hu
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, 2023 (aclanthology.org)
Abstract
Nearest neighbor machine translation (kNN-MT), which interpolates target token probabilities with estimates derived from additional examples, has achieved significant improvements and attracted extensive interest in recent years. However, existing research does not explicitly consider the source context when retrieving similar examples, potentially leading to suboptimal performance. To address this, we comprehensively revisit the role of source context and propose a simple and effective method for improving neural machine translation via source context enhancement, demonstrating its crucial role in both retrieving superior examples and determining more suitable interpolation coefficients. Furthermore, we reveal that the probability estimation can be further optimized by incorporating a source-aware distance calibration module. Comprehensive experiments show that our proposed approach can be seamlessly integrated with representative kNN-MT baselines, resulting in substantial improvements over these strong baselines across a number of settings and domains. Remarkably, these improvements can reach up to 1.6 BLEU points.
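The interpolation the abstract describes is the standard kNN-MT formulation: a softmax over negative retrieval distances yields a kNN distribution, which is mixed with the base model's distribution via a coefficient λ. The sketch below illustrates that baseline mechanism only; the function name, the fixed λ, and the temperature are illustrative assumptions, not the paper's source-context-enhanced method or its calibration module.

```python
import math
from collections import defaultdict

def knn_interpolate(mt_probs, neighbors, lam=0.5, temperature=10.0):
    """Vanilla kNN-MT interpolation (illustrative sketch, not the paper's
    exact method).

    mt_probs:  dict mapping token -> probability from the base MT model
    neighbors: list of (token, distance) pairs retrieved from a datastore
    lam:       interpolation coefficient (the paper argues source context
               helps pick this; here it is a fixed assumption)
    """
    # Softmax over negative distances gives the kNN distribution:
    # closer neighbors contribute more probability mass.
    weights = [math.exp(-d / temperature) for _, d in neighbors]
    total = sum(weights)
    knn_probs = defaultdict(float)
    for (token, _), w in zip(neighbors, weights):
        knn_probs[token] += w / total

    # Mix the two distributions token by token:
    # p(y) = lam * p_kNN(y) + (1 - lam) * p_MT(y)
    vocab = set(mt_probs) | set(knn_probs)
    return {t: lam * knn_probs[t] + (1 - lam) * mt_probs.get(t, 0.0)
            for t in vocab}
```

Because both inputs are valid distributions, the output also sums to one; the paper's contribution, by this reading, is to make both the retrieved `neighbors` and the coefficient `lam` sensitive to the source sentence rather than the target-side context alone.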