fix attention in biaffine
huseinzol05 committed Sep 30, 2019
1 parent 084db60 commit 83a13c3
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion dependency-parser/7.biaffine-attention.ipynb
@@ -407,7 +407,7 @@
 " def forward(self, input_word, input_char, mask):\n",
 " arcs, types, _ = self.encode(input_word, input_char)\n",
 " \n",
-" out_arc = tf.squeeze(self.attention.forward(arcs[0], arcs[0], mask_d=mask, mask_e=mask), axis = 1)\n",
+" out_arc = tf.squeeze(self.attention.forward(arcs[0], arcs[1], mask_d=mask, mask_e=mask), axis = 1)\n",
 " return out_arc, types, mask\n",
 " \n",
 " def loss(self, input_word, input_char, mask, heads, types):\n",
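For context on why this one-character change matters: in biaffine dependency parsing, the encoder typically produces two distinct arc representations, one for tokens acting as dependents and one for tokens acting as heads. Passing `arcs[0]` as both arguments scored every arc against the same projection; the fix feeds `arcs[1]` as the second argument so the two roles use their own vectors. A minimal NumPy sketch of biaffine arc scoring, with hypothetical names and shapes (not taken from the notebook):

```python
import numpy as np

def biaffine_scores(dep, head, U):
    """Score arcs with a biaffine form.

    dep:  (n, d) dependent-role vectors (analogous to arcs[0])
    head: (n, d) head-role vectors      (analogous to arcs[1])
    U:    (d + 1, d) biaffine weight; the extra row acts as a bias
          on the head side.
    Returns an (n, n) matrix where scores[i, j] is the score of
    token j being the head of token i.
    """
    n, _ = dep.shape
    # Augment dependents with a constant 1 so U's last row is a bias term.
    dep_aug = np.concatenate([dep, np.ones((n, 1))], axis=1)
    return dep_aug @ U @ head.T

rng = np.random.default_rng(0)
n, d = 3, 4
dep = rng.normal(size=(n, d))
head = rng.normal(size=(n, d))
U = rng.normal(size=(d + 1, d))

scores = biaffine_scores(dep, head, U)
assert scores.shape == (n, n)
```

Calling `biaffine_scores(dep, dep, U)`, the analogue of the pre-fix code, still runs, but it collapses the two roles into one representation, which is the bug the commit addresses.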
