Commit 42149c9

Modified the key_seq_length setting in scaled dot-product attention in the appendix self-attention network
1 parent d5a6265 commit 42149c9

File tree

1 file changed: +1 −1 lines changed

6.CHATBOT/Appendix-transformer/model.py

Lines changed: 1 addition & 1 deletion
@@ -45,7 +45,7 @@ def __init__(self, num_units, heads, masked=False):
         self.value_dense = tf.keras.layers.Dense(num_units, activation=tf.nn.relu)

     def scaled_dot_product_attention(self, query, key, value, masked=False):
-        key_seq_length = float(key.get_shape().as_list()[-2])
+        key_seq_length = float(key.get_shape().as_list()[-1])
         key = tf.transpose(key, perm=[0, 2, 1])
         outputs = tf.matmul(query, key) / tf.sqrt(key_seq_length)
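The one-line change swaps the scaling divisor from the key sequence length (axis -2 of a [batch, seq_len, depth] tensor) to the key depth (axis -1), matching the sqrt(d_k) scaling used in scaled dot-product attention. A minimal NumPy sketch of the computation the patched line feeds into (function name and shapes here are illustrative, not the repo's code):

```python
import numpy as np

def scaled_dot_product_attention(query, key, value):
    # query, key, value: [batch, seq_len, depth]
    d_k = float(key.shape[-1])  # key depth, as in the fixed line (index -1, not -2)
    # scores: [batch, seq_len_q, seq_len_k], scaled by sqrt(d_k)
    scores = query @ key.transpose(0, 2, 1) / np.sqrt(d_k)
    # softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ value  # [batch, seq_len_q, depth]

q = np.random.rand(1, 4, 8)
k = np.random.rand(1, 4, 8)
v = np.random.rand(1, 4, 8)
out = scaled_dot_product_attention(q, k, v)
```

Dividing by sqrt(d_k) keeps the dot products from growing with the embedding depth, which would otherwise push the softmax into regions with vanishing gradients; scaling by the sequence length, as the old line did, does not track that quantity.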