This repository was archived by the owner on Jul 7, 2023. It is now read-only.

Enable Relative Dot Product Visualization #1303

Conversation

@aeloyq (Contributor) commented Dec 14, 2018

Hi, I've done some further work on dot_product_relative here.

This PR enables visualization of self-attention and encoder-decoder attention when dot_product_relative is used as the self-attention type.
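For context on what is being visualized: relative dot-product attention (in the style of Shaw et al., which tensor2tensor implements in common_attention.py) adds position-offset logits to the usual content logits, and the resulting attention weight matrix is what a visualization tool plots. The sketch below is a simplified, hedged NumPy illustration of the mechanism, not the tensor2tensor implementation; the function name and shapes are my own.

```python
import numpy as np

def relative_dot_product_attention(q, k, v, rel_emb):
    """Single-head dot-product attention with relative position embeddings.

    q, k, v: [length, depth] arrays for one head.
    rel_emb: [2*length - 1, depth] embeddings, one per relative offset
             from -(length-1) to +(length-1).
    Returns (output [length, depth], weights [length, length]); the
    weights matrix is the quantity an attention visualization displays.
    """
    length, depth = q.shape
    # Content-based logits: standard q . k^T term.
    content_logits = q @ k.T
    # Relative-position logits: query i attending to key j uses the
    # embedding for offset (j - i), shifted to a non-negative index.
    rel_logits = np.zeros((length, length))
    for i in range(length):
        for j in range(length):
            rel_logits[i, j] = q[i] @ rel_emb[j - i + length - 1]
    logits = (content_logits + rel_logits) / np.sqrt(depth)
    # Row-wise softmax over keys.
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights
```

Each row of `weights` sums to 1 and gives one query position's distribution over key positions, which is exactly the heatmap an attention visualizer renders.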

@googlebot googlebot added the cla: yes PR author has signed CLA label Dec 14, 2018
@lukaszkaiser (Contributor) left a comment


Thanks!

@lukaszkaiser lukaszkaiser merged commit 98ec1ee into tensorflow:master Jan 4, 2019
@lukaszkaiser (Contributor)

Thanks a lot for adding this!

tensorflow-copybara pushed a commit that referenced this pull request Jan 4, 2019
PiperOrigin-RevId: 227927931
@aeloyq (Contributor, Author) commented Jan 5, 2019

It’s my pleasure :)

kpe pushed a commit to kpe/tensor2tensor that referenced this pull request Mar 2, 2019
* add caching mechanism support for fast decoding with relative_dot_product in transformer model

* fix typo

* enable visualization when using dot_product_relative in self-attention

* clean code
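The first commit message above mentions a caching mechanism for fast decoding. The general idea (sketched below under my own assumptions; this is a hypothetical helper, not the tensor2tensor code) is that during incremental decoding only the newest token's key and value are computed and appended, while earlier keys/values are reused. With relative attention this is the tricky part, because the cached keys' offsets relative to the current query shift at every step.

```python
import numpy as np

class DecodeCache:
    """Minimal key/value cache for incremental (fast) decoding.

    At each decode step, only the new token's key and value are
    appended; earlier entries are reused rather than recomputed.
    """

    def __init__(self, depth):
        self.keys = np.zeros((0, depth))
        self.values = np.zeros((0, depth))

    def append(self, k_new, v_new):
        """Append one step's key/value vectors (each shape [depth])."""
        self.keys = np.vstack([self.keys, k_new[None, :]])
        self.values = np.vstack([self.values, v_new[None, :]])
        return self.keys, self.values
```

After step t, `cache.keys` holds t rows, so the current query can attend over all previous positions without recomputing their projections.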
kpe pushed a commit to kpe/tensor2tensor that referenced this pull request Mar 2, 2019
PiperOrigin-RevId: 227927931