Code for our AAAI 2020 paper,
Graph Transformer for Graph-to-Sequence Learning. [preprint]
Deng Cai and Wai Lam.
The code is tested with Python 3.6. All dependencies are listed in requirements.txt.
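For example, one way to set up an environment (a sketch; the environment name gtos-env is arbitrary):

python3.6 -m venv gtos-env # any Python 3.6 environment manager works
source gtos-env/bin/activate
pip install -r requirements.txt # installs all listed dependencies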
The instructions for Syntax-based Machine Translation are given in the translator_data folder.
The instructions for AMR-to-Text Generation are given in the generator_data folder.
Steps 3-6 below should be conducted in the generator folder for AMR-to-Text Generation and in the translator folder for Syntax-based Machine Translation, respectively. The default settings in this repo should reproduce the results in our paper.
Step 3. Preprocess the data:

cd generator # or `cd translator` for Syntax-based Machine Translation; steps 4-6 run from the same folder
sh prepare.sh # check it before use

Step 4. Train:

sh train.sh # check it before use

Step 5. Test:

sh work.sh # check it before use

Step 6. Postprocess:

sh test.sh # check it before use; make sure --output is set
To evaluate the postprocessed output:

for BLEU: use sh multi-bleu.perl (with -lc for case-insensitive BLEU)
for chrF++: use python chrF++.py (c6+w2-F2)
for METEOR: use meteor-1.5: java -Xmx2G -jar meteor-1.5.jar test reference -l en
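As a concrete sketch, assuming placeholder files ref.txt (references) and output.txt (postprocessed system output), and assuming the standard chrF++.py script (whose -R/-H flags and default settings correspond to c6+w2-F2), the evaluation commands would look like:

perl multi-bleu.perl -lc ref.txt < output.txt # case-insensitive BLEU
python chrF++.py -R ref.txt -H output.txt # chrF++; defaults assumed to give c6+w2-F2
java -Xmx2G -jar meteor-1.5.jar output.txt ref.txt -l en # METEOR for English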
If you find the code useful, please cite our paper.
@inproceedings{cai-lam-2020-graph,
    title = "Graph Transformer for Graph-to-Sequence Learning",
    author = "Cai, Deng and Lam, Wai",
    booktitle = "Proceedings of the Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI)",
    year = "2020",
}
For any questions, please drop an email to Deng Cai.
(Pretrained models and our system output are available upon request.)