MoNMT: Modularly Leveraging Monolingual and Bilingual Knowledge for Neural Machine Translation
- The code is based on the fairseq toolkit (version 1.0.0a0), forked from the Graformer codebase.
- Python 3.8
We show an example of training a MoNMT model.

See run-pretrian.sh
- After training, we obtain an encoder-to-decoder denoising model for the source and target languages.
- In this example, the source and target languages share the same encoding and decoding modules.
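The exact hyperparameters and architecture name are defined in run-pretrian.sh; as a rough sketch (paths and values below are placeholders, not the script's actual settings), the denoising pretraining step is a standard fairseq invocation along these lines:

```shell
# Hypothetical sketch only; the real flags and the MoNMT-specific
# architecture are set in run-pretrian.sh.
DATA_BIN=path/to/mono-data-bin   # placeholder: binarized monolingual data
fairseq-train "$DATA_BIN" \
    --task denoising \
    --arch transformer \
    --optimizer adam --lr 5e-4 --lr-scheduler inverse_sqrt --warmup-updates 4000 \
    --max-tokens 4096 \
    --save-dir checkpoints/denoising
```

The checkpoint saved under `--save-dir` is the encoder-to-decoder denoising model used in the next step.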
See run-train-MoNMT.sh
- After training, we obtain a source-to-target MoNMT translation model.
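As with pretraining, the authoritative settings are in run-train-MoNMT.sh; a hedged sketch of the translation training step (placeholder paths and language codes, and note that how the pretrained denoising modules are loaded is specific to that script) might look like:

```shell
# Hypothetical sketch only; the real flags, including how the pretrained
# denoising modules are plugged in, are set in run-train-MoNMT.sh.
DATA_BIN=path/to/parallel-data-bin   # placeholder: binarized source-target data
fairseq-train "$DATA_BIN" \
    --task translation --source-lang src --target-lang tgt \
    --arch transformer \
    --criterion label_smoothed_cross_entropy --label-smoothing 0.1 \
    --optimizer adam --lr 5e-4 --lr-scheduler inverse_sqrt --warmup-updates 4000 \
    --max-tokens 4096 \
    --save-dir checkpoints/monmt
```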
- The paper has been accepted at LREC-COLING 2024.