Description
Hello,
I was trying to train using the sejong_treebank.sample file, so I ran the following commands:
$ ./sejong/split.sh
$ ./sejong/c2d.sh
$ ./train_sejong.sh
But I got an error (the same one shown below: "Assign requires shapes of both tensors to match").
I then tried downloading a larger treebank corpus from sejong.or.kr (it seems to be the full version of the sejong_treebank.sample in your repository, though I'm not sure), but the same thing happened.
My input file (I tried both the sample and the full corpus) is just a long stream of entries like the following, in UTF-8, just like your sample Sejong file. Do I need to put it somewhere else? Or is there anything else needed besides saving it as sejong/sejong_treebank.txt.v1 and running the scripts?
; 1993/06/08 19
(NP (NP 1993/SN + //SP + 06/SN + //SP + 08/SN)
(NP 19/SN))
; 엠마누엘 웅가로 /
(NP (NP (NP 엠마누엘/NNP)
(NP 웅가로/NNP))
(X //SP))
; 의상서 실내 장식품으로…
(NP_AJT (NP_AJT 의상/NNG + 서/JKB)
(NP_AJT (NP 실내/NNG)
(NP_AJT 장식품/NNG + 으로/JKB + …/SE)))
; 디자인 세계 넓혀
(VP (NP_OBJ (NP 디자인/NNG)
(NP_OBJ 세계/NNG))
(VP 넓히/VV + 어/EC))
; 프랑스의 세계적인 의상 디자이너 엠마누엘 웅가로가 실내 장식용 직물 디자이너로 나섰다.
(S (NP_SBJ (NP (NP_MOD 프랑스/NNP + 의/JKG)
(NP (VNP_MOD 세계/NNG + 적/XSN + 이/VCP + ᆫ/ETM)
(NP (NP 의상/NNG)
(NP 디자이너/NNG))))
(NP_SBJ (NP 엠마누엘/NNP)
(NP_SBJ 웅가로/NNP + 가/JKS)))
(VP (NP_AJT (NP (NP (NP 실내/NNG)
(NP 장식/NNG + 용/XSN))
(NP 직물/NNG))
(NP_AJT 디자이너/NNG + 로/JKB))
(VP 나서/VV + 었/EP + 다/EF + ./SF)))
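As a quick sanity check on the corpus format, I counted the sentence headers (lines starting with "; "). The snippet below demonstrates the check on a small inline sample; on my real file I ran the same grep against sejong/sejong_treebank.txt.v1 (the path is my assumption about where the scripts look):

```shell
# Write a tiny Sejong-style sample, then count its sentence headers.
cat > /tmp/sejong_sample.txt <<'EOF'
; 디자인 세계 넓혀
(VP (NP_OBJ (NP 디자인/NNG)
	(NP_OBJ 세계/NNG))
	(VP 넓히/VV + 어/EC))
EOF
# Each sentence begins with "; ", so this counts sentences in the file.
grep -c '^; ' /tmp/sejong_sample.txt
```

On my actual corpus this count is large and nonzero, which is why the "number_of_sent = 0" lines in the c2d.sh log below surprised me.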
Here are the logs with all the verbose options.
andy@andy ~/Downloads/syntaxnet/models/syntaxnet/work $ ./sejong/split.sh -v -v
+ '[' 0 '!=' 0 ']'
++++ readlink -f ./sejong/split.sh
+++ dirname /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/split.sh
++ readlink -f /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong
+ CDIR=/home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong
+ [[ -f /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/env.sh ]]
+ . /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/env.sh
++ set -o errexit
++ export LC_ALL=ko_KR.UTF-8
++ LC_ALL=ko_KR.UTF-8
++ export LANG=ko_KR.UTF-8
++ LANG=ko_KR.UTF-8
+++++ readlink -f /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/env.sh
++++ dirname /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/env.sh
+++ readlink -f /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong
++ CDIR=/home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong
+++++ readlink -f /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/env.sh
++++ dirname /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/env.sh
+++ readlink -f /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/..
++ PDIR=/home/andy/Downloads/syntaxnet/models/syntaxnet/work
++ python=/usr/bin/python
+ make_calmness
+ exec
+ exec
+ child_verbose='-v -v'
+ '[' '!' -e /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/wdir ']'
+ WDIR=/home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/wdir
+ '[' '!' -e /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/log ']'
+ LDIR=/home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/log
+ /usr/bin/python /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/split.py --mode=0
+ /usr/bin/python /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/split.py --mode=1
+ /usr/bin/python /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/split.py --mode=2
+ close_fd
+ exec
andy@andy ~/Downloads/syntaxnet/models/syntaxnet/work $ ./sejong/c2d.sh -v -v
+ '[' 0 '!=' 0 ']'
++++ readlink -f ./sejong/c2d.sh
+++ dirname /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/c2d.sh
++ readlink -f /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong
+ CDIR=/home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong
+ [[ -f /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/env.sh ]]
+ . /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/env.sh
++ set -o errexit
++ export LC_ALL=ko_KR.UTF-8
++ LC_ALL=ko_KR.UTF-8
++ export LANG=ko_KR.UTF-8
++ LANG=ko_KR.UTF-8
+++++ readlink -f /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/env.sh
++++ dirname /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/env.sh
+++ readlink -f /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong
++ CDIR=/home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong
+++++ readlink -f /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/env.sh
++++ dirname /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/env.sh
+++ readlink -f /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/..
++ PDIR=/home/andy/Downloads/syntaxnet/models/syntaxnet/work
++ python=/usr/bin/python
+ make_calmness
+ exec
+ exec
+ child_verbose='-v -v'
+ '[' '!' -e /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/wdir ']'
+ WDIR=/home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/wdir
+ '[' '!' -e /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/log ']'
+ LDIR=/home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/log
+ for SET in training tuning test
+ /usr/bin/python /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/c2d.py --mode=0
+ /usr/bin/python /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/c2d.py --mode=1
+ /usr/bin/python /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/align.py
number_of_sent = 0, number_of_sent_skip = 0
+ for SET in training tuning test
+ /usr/bin/python /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/c2d.py --mode=0
+ /usr/bin/python /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/c2d.py --mode=1
+ /usr/bin/python /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/align.py
number_of_sent = 0, number_of_sent_skip = 0
+ for SET in training tuning test
+ /usr/bin/python /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/c2d.py --mode=0
+ /usr/bin/python /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/c2d.py --mode=1
+ /usr/bin/python /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/align.py
number_of_sent = 0, number_of_sent_skip = 0
+ close_fd
+ exec
andy@andy ~/Downloads/syntaxnet/models/syntaxnet/work $ ./train_sejong.sh -v -v
+ '[' 0 '!=' 0 ']'
++++ readlink -f ./train_sejong.sh
+++ dirname /home/andy/Downloads/syntaxnet/models/syntaxnet/work/train_sejong.sh
++ readlink -f /home/andy/Downloads/syntaxnet/models/syntaxnet/work
+ CDIR=/home/andy/Downloads/syntaxnet/models/syntaxnet/work
++++ readlink -f ./train_sejong.sh
+++ dirname /home/andy/Downloads/syntaxnet/models/syntaxnet/work/train_sejong.sh
++ readlink -f /home/andy/Downloads/syntaxnet/models/syntaxnet/work/..
+ PDIR=/home/andy/Downloads/syntaxnet/models/syntaxnet
+ make_calmness
+ exec
+ exec
+ cd /home/andy/Downloads/syntaxnet/models/syntaxnet
+ python=/usr/bin/python
+ SYNTAXNET_HOME=/home/andy/Downloads/syntaxnet/models/syntaxnet
+ BINDIR=/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet
+ CONTEXT=/home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/context.pbtxt_p
+ TMP_DIR=/home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/tmp_p/syntaxnet-output
+ mkdir -p /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/tmp_p/syntaxnet-output
+ cat /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/context.pbtxt_p
+ sed s=OUTPATH=/home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/tmp_p/syntaxnet-output=
+ MODEL_DIR=/home/andy/Downloads/syntaxnet/models/syntaxnet/work/models
+ HIDDEN_LAYER_SIZES=512,512
+ HIDDEN_LAYER_PARAMS=512,512
+ BATCH_SIZE=256
+ BEAM_SIZE=16
+ LP_PARAMS=512,512-0.08-4400-0.85
+ GP_PARAMS=512,512-0.02-100-0.9
+ pretrain_parser
+ /home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_trainer --arg_prefix=brain_parser --batch_size=256 --compute_lexicon --decay_steps=4400 --graph_builder=greedy --hidden_layer_sizes=512,512 --learning_rate=0.08 --momentum=0.85 --beam_size=1 --output_path=/home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/tmp_p/syntaxnet-output --task_context=/home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/tmp_p/syntaxnet-output/context --projectivize_training_set --training_corpus=tagged-training-corpus --tuning_corpus=tagged-tuning-corpus --params=512,512-0.08-4400-0.85 --num_epochs=20 --report_every=100 --checkpoint_every=1000 --logtostderr
INFO:tensorflow:Computing lexicon...
I syntaxnet/lexicon_builder.cc:124] Term maps collected over 0 tokens from 0 documents
I syntaxnet/term_frequency_map.cc:137] Saved 0 terms to /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/tmp_p/syntaxnet-output/word-map.
I syntaxnet/term_frequency_map.cc:137] Saved 0 terms to /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/tmp_p/syntaxnet-output/lcword-map.
I syntaxnet/term_frequency_map.cc:137] Saved 0 terms to /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/tmp_p/syntaxnet-output/tag-map.
I syntaxnet/term_frequency_map.cc:137] Saved 0 terms to /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/tmp_p/syntaxnet-output/category-map.
I syntaxnet/term_frequency_map.cc:137] Saved 0 terms to /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/tmp_p/syntaxnet-output/label-map.
I syntaxnet/term_frequency_map.cc:101] Loaded 0 terms from /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/tmp_p/syntaxnet-output/label-map.
I syntaxnet/embedding_feature_extractor.cc:35] Features: input.word input(1).word input(2).word input(3).word stack.word stack(1).word stack(2).word stack(3).word stack.child(1).word stack.child(1).sibling(-1).word stack.child(-1).word stack.child(-1).sibling(1).word stack(1).child(1).word stack(1).child(1).sibling(-1).word stack(1).child(-1).word stack(1).child(-1).sibling(1).word stack.child(2).word stack.child(-2).word stack(1).child(2).word stack(1).child(-2).word; input.tag input(1).tag input(2).tag input(3).tag stack.tag stack(1).tag stack(2).tag stack(3).tag stack.child(1).tag stack.child(1).sibling(-1).tag stack.child(-1).tag stack.child(-1).sibling(1).tag stack(1).child(1).tag stack(1).child(1).sibling(-1).tag stack(1).child(-1).tag stack(1).child(-1).sibling(1).tag stack.child(2).tag stack.child(-2).tag stack(1).child(2).tag stack(1).child(-2).tag; stack.child(1).label stack.child(1).sibling(-1).label stack.child(-1).label stack.child(-1).sibling(1).label stack(1).child(1).label stack(1).child(1).sibling(-1).label stack(1).child(-1).label stack(1).child(-1).sibling(1).label stack.child(2).label stack.child(-2).label stack(1).child(2).label stack(1).child(-2).label
I syntaxnet/embedding_feature_extractor.cc:36] Embedding names: words;tags;labels
I syntaxnet/embedding_feature_extractor.cc:37] Embedding dims: 64;32;32
I syntaxnet/term_frequency_map.cc:101] Loaded 0 terms from /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/tmp_p/syntaxnet-output/word-map.
I syntaxnet/term_frequency_map.cc:101] Loaded 0 terms from /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/tmp_p/syntaxnet-output/tag-map.
INFO:tensorflow:Preprocessing...
INFO:tensorflow:Training...
INFO:tensorflow:Building training network with parameters: feature_sizes: [20 20 12] domain_sizes: [3 3 3]
INFO:tensorflow:Initializing...
INFO:tensorflow:Training...
I syntaxnet/embedding_feature_extractor.cc:35] Features: input.word input(1).word input(2).word input(3).word stack.word stack(1).word stack(2).word stack(3).word stack.child(1).word stack.child(1).sibling(-1).word stack.child(-1).word stack.child(-1).sibling(1).word stack(1).child(1).word stack(1).child(1).sibling(-1).word stack(1).child(-1).word stack(1).child(-1).sibling(1).word stack.child(2).word stack.child(-2).word stack(1).child(2).word stack(1).child(-2).word; input.tag input(1).tag input(2).tag input(3).tag stack.tag stack(1).tag stack(2).tag stack(3).tag stack.child(1).tag stack.child(1).sibling(-1).tag stack.child(-1).tag stack.child(-1).sibling(1).tag stack(1).child(1).tag stack(1).child(1).sibling(-1).tag stack(1).child(-1).tag stack(1).child(-1).sibling(1).tag stack.child(2).tag stack.child(-2).tag stack(1).child(2).tag stack(1).child(-2).tag; stack.child(1).label stack.child(1).sibling(-1).label stack.child(-1).label stack.child(-1).sibling(1).label stack(1).child(1).label stack(1).child(1).sibling(-1).label stack(1).child(-1).label stack(1).child(-1).sibling(1).label stack.child(2).label stack.child(-2).label stack(1).child(2).label stack(1).child(-2).label
I syntaxnet/embedding_feature_extractor.cc:36] Embedding names: words;tags;labels
I syntaxnet/embedding_feature_extractor.cc:37] Embedding dims: 64;32;32
I syntaxnet/term_frequency_map.cc:101] Loaded 0 terms from /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/tmp_p/syntaxnet-output/word-map.
I syntaxnet/term_frequency_map.cc:101] Loaded 0 terms from /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/tmp_p/syntaxnet-output/tag-map.
I syntaxnet/term_frequency_map.cc:101] Loaded 0 terms from /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/tmp_p/syntaxnet-output/label-map.
I syntaxnet/reader_ops.cc:141] Starting epoch 1
I syntaxnet/reader_ops.cc:141] Starting epoch 2
I syntaxnet/reader_ops.cc:141] Starting epoch 3
I syntaxnet/reader_ops.cc:141] Starting epoch 4
I syntaxnet/reader_ops.cc:141] Starting epoch 5
I syntaxnet/reader_ops.cc:141] Starting epoch 6
I syntaxnet/reader_ops.cc:141] Starting epoch 7
I syntaxnet/reader_ops.cc:141] Starting epoch 8
I syntaxnet/reader_ops.cc:141] Starting epoch 9
I syntaxnet/reader_ops.cc:141] Starting epoch 10
I syntaxnet/reader_ops.cc:141] Starting epoch 11
I syntaxnet/reader_ops.cc:141] Starting epoch 12
I syntaxnet/reader_ops.cc:141] Starting epoch 13
I syntaxnet/reader_ops.cc:141] Starting epoch 14
I syntaxnet/reader_ops.cc:141] Starting epoch 15
I syntaxnet/reader_ops.cc:141] Starting epoch 16
I syntaxnet/reader_ops.cc:141] Starting epoch 17
I syntaxnet/reader_ops.cc:141] Starting epoch 18
I syntaxnet/reader_ops.cc:141] Starting epoch 19
I syntaxnet/reader_ops.cc:141] Starting epoch 20
+ evaluate_pretrained_parser
+ for SET in training tuning test
+ /home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval --task_context=/home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/tmp_p/syntaxnet-output/brain_parser/greedy/512,512-0.08-4400-0.85/context --batch_size=256 --hidden_layer_sizes=512,512 --beam_size=1 --input=tagged-training-corpus --output=parsed-training-corpus --arg_prefix=brain_parser --graph_builder=greedy --model_path=/home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/tmp_p/syntaxnet-output/brain_parser/greedy/512,512-0.08-4400-0.85/model
I syntaxnet/term_frequency_map.cc:101] Loaded 0 terms from /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/tmp_p/syntaxnet-output/label-map.
I syntaxnet/embedding_feature_extractor.cc:35] Features: input.word input(1).word input(2).word input(3).word stack.word stack(1).word stack(2).word stack(3).word stack.child(1).word stack.child(1).sibling(-1).word stack.child(-1).word stack.child(-1).sibling(1).word stack(1).child(1).word stack(1).child(1).sibling(-1).word stack(1).child(-1).word stack(1).child(-1).sibling(1).word stack.child(2).word stack.child(-2).word stack(1).child(2).word stack(1).child(-2).word; input.tag input(1).tag input(2).tag input(3).tag stack.tag stack(1).tag stack(2).tag stack(3).tag stack.child(1).tag stack.child(1).sibling(-1).tag stack.child(-1).tag stack.child(-1).sibling(1).tag stack(1).child(1).tag stack(1).child(1).sibling(-1).tag stack(1).child(-1).tag stack(1).child(-1).sibling(1).tag stack.child(2).tag stack.child(-2).tag stack(1).child(2).tag stack(1).child(-2).tag; stack.child(1).label stack.child(1).sibling(-1).label stack.child(-1).label stack.child(-1).sibling(1).label stack(1).child(1).label stack(1).child(1).sibling(-1).label stack(1).child(-1).label stack(1).child(-1).sibling(1).label stack.child(2).label stack.child(-2).label stack(1).child(2).label stack(1).child(-2).label
I syntaxnet/embedding_feature_extractor.cc:36] Embedding names: words;tags;labels
I syntaxnet/embedding_feature_extractor.cc:37] Embedding dims: 64;32;32
I syntaxnet/term_frequency_map.cc:101] Loaded 0 terms from /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/tmp_p/syntaxnet-output/word-map.
I syntaxnet/term_frequency_map.cc:101] Loaded 0 terms from /home/andy/Downloads/syntaxnet/models/syntaxnet/work/sejong/tmp_p/syntaxnet-output/tag-map.
INFO:tensorflow:Building training network with parameters: feature_sizes: [20 20 12] domain_sizes: [3 3 3]
Traceback (most recent call last):
File "/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval.runfiles/__main__/syntaxnet/parser_eval.py", line 149, in <module>
tf.app.run()
File "/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval.runfiles/tf/tensorflow/python/platform/app.py", line 30, in run
sys.exit(main(sys.argv))
File "/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval.runfiles/__main__/syntaxnet/parser_eval.py", line 145, in main
Eval(sess, num_actions, feature_sizes, domain_sizes, embedding_dims)
File "/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval.runfiles/__main__/syntaxnet/parser_eval.py", line 98, in Eval
parser.saver.restore(sess, FLAGS.model_path)
File "/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval.runfiles/tf/tensorflow/python/training/saver.py", line 1104, in restore
{self.saver_def.filename_tensor_name: save_path})
File "/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval.runfiles/tf/tensorflow/python/client/session.py", line 333, in run
run_metadata_ptr)
File "/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval.runfiles/tf/tensorflow/python/client/session.py", line 573, in _run
feed_dict_string, options, run_metadata)
File "/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval.runfiles/tf/tensorflow/python/client/session.py", line 653, in _do_run
target_list, options, run_metadata)
File "/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval.runfiles/tf/tensorflow/python/client/session.py", line 673, in _do_call
raise type(e)(node_def, op, message)
tensorflow.python.framework.errors.InvalidArgumentError: Assign requires shapes of both tensors to match. lhs shape= [3,64] rhs shape= [485,64]
[[Node: save/Assign_5 = Assign[T=DT_FLOAT, _class=["loc:@embedding_matrix_0"], use_locking=true, validate_shape=true, _device="/job:localhost/replica:0/task:0/cpu:0"](params/embedding_matrix_0/ExponentialMovingAverage, save/restore_slice_5)]]
Caused by op u'save/Assign_5', defined at:
File "/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval.runfiles/__main__/syntaxnet/parser_eval.py", line 149, in <module>
tf.app.run()
File "/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval.runfiles/tf/tensorflow/python/platform/app.py", line 30, in run
sys.exit(main(sys.argv))
File "/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval.runfiles/__main__/syntaxnet/parser_eval.py", line 145, in main
Eval(sess, num_actions, feature_sizes, domain_sizes, embedding_dims)
File "/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval.runfiles/__main__/syntaxnet/parser_eval.py", line 96, in Eval
parser.AddSaver(FLAGS.slim_model)
File "/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval.runfiles/__main__/syntaxnet/graph_builder.py", line 568, in AddSaver
self.saver = tf.train.Saver(variables_to_save)
File "/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval.runfiles/tf/tensorflow/python/training/saver.py", line 845, in __init__
restore_sequentially=restore_sequentially)
File "/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval.runfiles/tf/tensorflow/python/training/saver.py", line 515, in build
filename_tensor, vars_to_save, restore_sequentially, reshape)
File "/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval.runfiles/tf/tensorflow/python/training/saver.py", line 281, in _AddRestoreOps
validate_shape=validate_shape))
File "/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval.runfiles/tf/tensorflow/python/ops/gen_state_ops.py", line 45, in assign
use_locking=use_locking, name=name)
File "/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval.runfiles/tf/tensorflow/python/ops/op_def_library.py", line 693, in apply_op
op_def=op_def)
File "/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval.runfiles/tf/tensorflow/python/framework/ops.py", line 2186, in create_op
original_op=self._default_original_op, op_def=op_def)
File "/home/andy/Downloads/syntaxnet/models/syntaxnet/bazel-bin/syntaxnet/parser_eval.runfiles/tf/tensorflow/python/framework/ops.py", line 1170, in __init__
self._traceback = _extract_stack()
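My reading of the shape mismatch (a guess, not verified against the SyntaxNet source): the word embedding matrix seems to get one row per term in word-map plus a few special tokens, so with "Loaded 0 terms" at eval time the graph builds a [3, 64] matrix, while the checkpoint apparently holds a [485, 64] one. The numbers line up under that assumption:

```python
# Hypothetical accounting for the shapes in the error message; the
# "3 special tokens" count (e.g. unknown/root/padding) is an assumption.
num_terms = 0        # "Loaded 0 terms from ... word-map" at eval time
num_special = 3      # assumed special-token rows in the embedding matrix
embedding_dim = 64   # "Embedding dims: 64;32;32" (word embedding)

eval_shape = (num_terms + num_special, embedding_dim)
print(eval_shape)    # (3, 64) -- matches "lhs shape= [3,64]" in the error
```

If that reading is right, the real problem would be that the lexicon/corpus is being read as empty (0 tokens, 0 documents), and the shape error is just a downstream symptom.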