NLP2CT/MoNMT
MoNMT

MoNMT: Modularly Leveraging Monolingual and Bilingual Knowledge for Neural Machine Translation

Environment

  • The code is based on the fairseq toolkit (version 1.0.0a0), forked from the Graformer codebase.
  • Python 3.8
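Since the code is a fairseq fork, a standard fairseq-style editable install is the likely setup. The exact steps are an assumption; check the repository for any project-specific instructions:

```shell
# Assumed standard fairseq-style installation; verify against the repo itself.
git clone https://github.com/NLP2CT/MoNMT.git
cd MoNMT
pip install -e .
```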

Training

Below is one example of training a MoNMT model.

Pretrain

See run-pretrian.sh

  • After training, we obtain an Encoder-to-Decoder Denoising model for the source and target languages.
  • In this example, the source and target languages share the same encoding and decoding modules.
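As a rough sketch of what this pretraining stage could look like with fairseq: the authoritative arguments are in run-pretrian.sh, and the data paths, task name, architecture, and hyperparameters below are illustrative placeholders, not the paper's settings.

```shell
# Illustrative sketch only -- the real recipe lives in run-pretrian.sh.
# Assumes monolingual data was already binarized with fairseq-preprocess
# into data-bin/mono; all names and hyperparameters are placeholders.
fairseq-train data-bin/mono \
    --task denoising \
    --arch transformer \
    --optimizer adam --adam-betas '(0.9, 0.98)' \
    --lr 5e-4 --lr-scheduler inverse_sqrt --warmup-updates 4000 \
    --criterion cross_entropy \
    --max-tokens 4096 \
    --save-dir checkpoints/pretrain-denoising
```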

Train MoNMT

See run-train-MoNMT.sh

  • After training, we obtain a source-to-target MoNMT translation model.
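A hedged sketch of this stage, assuming it fine-tunes on parallel data starting from the pretrained denoising checkpoint: see run-train-MoNMT.sh for the actual recipe; the data paths, language codes, and hyperparameters below are placeholders.

```shell
# Illustrative sketch only -- see run-train-MoNMT.sh for the actual arguments.
# Assumes parallel data binarized into data-bin/parallel and a pretrained
# checkpoint from the previous step; names and settings are placeholders.
fairseq-train data-bin/parallel \
    --task translation --source-lang src --target-lang tgt \
    --arch transformer \
    --restore-file checkpoints/pretrain-denoising/checkpoint_best.pt \
    --reset-optimizer --reset-dataloader --reset-meters \
    --optimizer adam --lr 5e-4 --lr-scheduler inverse_sqrt --warmup-updates 4000 \
    --criterion label_smoothed_cross_entropy --label-smoothing 0.1 \
    --max-tokens 4096 \
    --save-dir checkpoints/monmt
```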

Other Information

  • The paper has been accepted by LREC-COLING 2024.
