1. Investigating the potential of Sparse Mixtures-of-Experts for multi-domain neural machine translation
- Authors
Chirkova, Nadezhda, Nikoulina, Vassilina, Meunier, Jean-Luc, and Bérard, Alexandre
- Subjects
Computer Science - Computation and Language; Computer Science - Artificial Intelligence
- Abstract
We focus on multi-domain Neural Machine Translation, with the goal of developing efficient models that can handle data from various domains seen during training and are robust to domains unseen during training. We hypothesize that Sparse Mixture-of-Experts (SMoE) models are a good fit for this task, as they enable efficient model scaling, which helps to accommodate a variety of multi-domain data, and allow flexible sharing of parameters between domains, potentially enabling knowledge transfer between similar domains and limiting negative transfer. We conduct a series of experiments aimed at validating the utility of SMoE for the multi-domain scenario, and find that a straightforward width scaling of the Transformer is a simpler and surprisingly more efficient approach in practice, reaching the same performance level as SMoE. We also search for a better recipe for robustness of multi-domain systems, highlighting the importance of mixing in a generic domain, i.e., Paracrawl, and introducing a simple technique, domain randomization.
- Published
2024
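
The abstract above contrasts Sparse Mixture-of-Experts layers with plain width scaling of the Transformer feed-forward block. The sketch below is not the paper's code; it is a minimal illustration, assuming top-2 token routing and arbitrary layer sizes (`d_model`, `d_hidden`, `d_expert`, `num_experts` are all illustrative choices), of how the two scaling strategies differ structurally.

```python
# Minimal sketch (assumptions, not the paper's implementation): a width-scaled
# feed-forward block vs. a top-2 Sparse Mixture-of-Experts block.
import torch
import torch.nn as nn
import torch.nn.functional as F


class WideFFN(nn.Module):
    """Plain width scaling: a single FFN with a larger hidden dimension."""

    def __init__(self, d_model: int = 512, d_hidden: int = 8192):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_model, d_hidden), nn.ReLU(), nn.Linear(d_hidden, d_model)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


class SMoEFFN(nn.Module):
    """Sparse MoE: several small expert FFNs; each token is routed to top-k of them."""

    def __init__(self, d_model: int = 512, d_expert: int = 2048,
                 num_experts: int = 4, k: int = 2):
        super().__init__()
        self.k = k
        self.router = nn.Linear(d_model, num_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_expert), nn.ReLU(),
                          nn.Linear(d_expert, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model). Route each token to its top-k experts and
        # combine their outputs, weighted by the renormalized router scores.
        gate = F.softmax(self.router(x), dim=-1)        # (tokens, num_experts)
        weights, indices = gate.topk(self.k, dim=-1)    # (tokens, k)
        weights = weights / weights.sum(dim=-1, keepdim=True)
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot:slot + 1] * expert(x[mask])
        return out


if __name__ == "__main__":
    tokens = torch.randn(16, 512)
    print(WideFFN()(tokens).shape, SMoEFFN()(tokens).shape)
```

Both blocks expose roughly comparable capacity, but the SMoE variant activates only a subset of its parameters per token via the learned router, which is the property the paper weighs against the simplicity of the wide FFN.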