Description
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
CODE
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
from inference.engine import Model
src_lang = "eng_Latn"
tgt_lang = "hin_Deva"
ckpt_dir = "models/en-indic-preprint/fairseq_model"
print("#########################################################################")
print("Load Model..........................")
print("#########################################################################")
model = Model(ckpt_dir, model_type="fairseq")
sents = ["While developing the LLM Tamil concept model."]
# for a batch of sentences
output = model.batch_translate(sents, src_lang, tgt_lang)
print("#########################################################################")
print(output)
print("#########################################################################")
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
ERROR
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Number of sentences in input: 4
Applying sentence piece
skipped 1 empty lines
filtered 0 lines
Adding language tags
3it [00:00, 42653.94it/s]
Decoding
joint_translate.sh: line 66: 23181 Killed fairseq-interactive $ckpt_dir/final_bin -s $SRC_PREFIX -t $TGT_PREFIX --distributed-world-size 1 --fp16 --path $ckpt_dir/model/checkpoint_best.pt --task translation --user-dir model_configs --skip-invalid-size-inputs-valid-test --batch-size 128 --buffer-size 2500 --beam 5 --input $outfname.bpe > $outfname.log 2>&1
Extracting translations, script conversion, and detokenization
Translation completed
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
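The `Killed` message in the log usually means the kernel's OOM killer terminated `fairseq-interactive` (the underlying command runs with `--batch-size 128 --buffer-size 2500`). As a possible workaround, a minimal sketch of translating in smaller chunks to reduce peak memory; `translate_in_chunks` and its `chunk_size` parameter are hypothetical names, not part of the IndicTrans API:

```python
# Hypothetical workaround sketch: feed the model smaller batches, assuming
# the "Killed" message indicates an out-of-memory kill of fairseq-interactive.
def translate_in_chunks(model, sents, src_lang, tgt_lang, chunk_size=8):
    """Translate `sents` in chunks of `chunk_size` and concatenate the results."""
    outputs = []
    for i in range(0, len(sents), chunk_size):
        # Each call only holds one small batch in memory at a time.
        outputs.extend(model.batch_translate(sents[i:i + chunk_size], src_lang, tgt_lang))
    return outputs
```

Whether this helps depends on where the memory is actually exhausted; lowering `--batch-size` and `--buffer-size` inside `joint_translate.sh` would be the more direct knob if the script is editable.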
System Configuration
