shape issue in the dynamic_decode with GreedyEmbeddingHelper #117
Comments
Hi, I have the same problem. Did you find a solution?
I got this exact same problem.
Can you provide a command that reproduces the issue with this codebase? The Python and TensorFlow versions would also help.
I use Python 3 + TensorFlow 1.4 and I got the same problem. Did you find a solution?
Python 3.6 + TF 1.5, same here: batch_size = 2 works fine, but batch_size = 1 raises the error.
Same issue. Also, how does one perform inference for a single sequence if GreedyEmbeddingHelper does not accept batch size 1?
This workaround fixes batch size 1 (batch size 0 is still broken):
@tavianator Is batch_size supposed to be 1 in this case? I tried that value, but it was already 1 and I got the same error: result =
I think the best workaround is to create a dummy sequence (e.g. an empty one) and pass the single sentence together with that dummy sequence to the greedy embedding helper, so the batch size becomes 2.
I'm also having exactly this problem, with Python 2.7.12 and TF 1.13.1. @roholazandie, how should I proceed with adding this empty sequence? Should it be added to the initial_state of the following BasicDecoder, which is then passed to dynamic_decode?
I found a solution to this problem, at this link. The way to go is to set the start tokens in the following manner (here self.inputs refers to the input tensor):
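The snippet from that comment is not quoted above, so the following is an assumption rather than the commenter's exact code: the usual form of this fix is to build the start tokens from the tensor's runtime batch dimension instead of a hard-coded batch size, i.e. something like `tf.fill([tf.shape(self.inputs)[0]], GO_TOKEN)` in TF 1.x. A minimal NumPy stand-in for the idea:

```python
import numpy as np

GO_TOKEN = 1  # hypothetical vocabulary id of the <GO>/start symbol

def make_start_tokens(inputs):
    """Build one start token per example, reading the batch size from the
    input array itself -- the analogue of
    tf.fill([tf.shape(inputs)[0]], GO_TOKEN) in TF 1.x."""
    batch_size = inputs.shape[0]
    return np.full((batch_size,), GO_TOKEN, dtype=np.int32)

# Works for any batch size, including 1:
single = np.zeros((1, 128), dtype=np.float32)  # one sequence, embedding size 128
batch = np.zeros((4, 128), dtype=np.float32)

print(make_start_tokens(single).shape)  # (1,)
print(make_start_tokens(batch).shape)   # (4,)
```

Because the batch size is read from the tensor at runtime rather than baked in as a Python constant, the same graph serves batch size 1 and batch size 2 alike.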
Hello,
I'm trying dynamic_decode with GreedyEmbeddingHelper, but I get a shape error:
ValueError: The shape for decoder_1/decoder/while/Merge_7:0 is not an invariant for the loop. It enters the loop with shape (1, 128), but has shape (?, 128) after one iteration. Provide shape invariants using either the `shape_invariants` argument of tf.while_loop or set_shape() on the loop variables.

PS: 128 is the embedding size.
I tried batch_size = 2 and it works fine, but not with size 1.
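For context on what the error is complaining about, here is a simplified model of the shape-invariant check (not TensorFlow's actual implementation): by default, tf.while_loop requires each loop variable's static shape after an iteration to stay at least as specific as the shape it entered with, so a known batch dimension of 1 degrading to an unknown `?` fails, unless a relaxed invariant is declared via the `shape_invariants` argument.

```python
def dim_holds(inv_dim, after_dim):
    """A dimension satisfies the invariant if the invariant leaves it
    unconstrained (None) or it stays exactly the same; a known size
    becoming unknown (None) breaks the invariant."""
    return inv_dim is None or inv_dim == after_dim

def invariant_holds(invariant, after_shape):
    """Simplified sketch of the check behind 'is not an invariant for the
    loop'. By default the invariant is the shape the variable had when it
    entered the loop; None stands for TF's unknown dimension '?'."""
    return len(invariant) == len(after_shape) and all(
        dim_holds(i, a) for i, a in zip(invariant, after_shape))

# The failing case from the error message: the loop variable enters with
# shape (1, 128) but comes back with (?, 128) after one iteration.
print(invariant_holds((1, 128), (None, 128)))   # False -> ValueError in TF

# Declaring a relaxed invariant -- tf.TensorShape([None, 128]) passed via
# shape_invariants -- lets the loop accept a dynamic batch dimension.
print(invariant_holds((None, 128), (None, 128)))  # True
```

This is why the workarounds in this thread either keep the batch dimension dynamic from the start (so it enters the loop as `?` already) or pad the batch to size 2, where the mismatch reportedly does not occur.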