model: universal_transformer doesn't work with worker_gpu=2? #1006
Description
When I train with t2t-trainer using universal_transformer and worker_gpu=2, training fails immediately with the error below and exits (BTW: training with the default 1 GPU works).
INFO:tensorflow:Cannot use 'Identity_122' as input to 'Identity_33' because they are in different while loops.
Identity_122 while context: universal_transformer/parallel_1_5/universal_transformer/universal_transformer/body/encoder/universal_transformer_basic/foldl/while/while_context
Identity_33 while context: universal_transformer/parallel_0_5/universal_transformer/universal_transformer/body/encoder/universal_transformer_basic/foldl/while/while_context
Traceback for Identity_122:
.....
Environment information
OS: Linux 8d9c9f85bad0 4.4.0-131-generic
$ pip freeze | grep tensor
tensor2tensor==1.7.0
tensorboard==1.9.0
tensorflow==1.9.0
$ python -V
Python 3.6.5 :: Anaconda, Inc.
For bugs: reproduction and error logs
# Steps to reproduce:
t2t-trainer \
--data_dir=$DATA_DIR \
--problem=translate_enzh_wmt32k \
--model=universal_transformer \
--hparams_set=universal_transformer_small \
--hparams='batch_size=5120' \
--train_steps=800000 \
--random_seed=33 \
--worker_gpu=2 \
--output_dir=$TRAIN_DIR
Error logs:
(Same error as the log excerpt above.)