
Output attention takes an s #6903

Merged · 3 commits merged into master on Sep 2, 2020
Conversation

@sgugger (Collaborator) commented Sep 2, 2020

Fixes #6902

Stas would have come up with a nice Perl one-liner, but I did a regex search (`output_attention[^s]`) to fix all of those misspelled arguments. In the process, I noticed a few examples were missing a line, so I added that too.
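The search-and-replace described above can be sketched in Python (a minimal sketch; `fix_misspelled_args` is a hypothetical helper, not code from this PR). Note that the character class `[^s]` used for the search also matches and consumes the character *after* the argument name, so a negative lookahead is the safer choice when actually rewriting text:

```python
import re

# Matches "output_attention" only when NOT followed by "s".
# Unlike output_attention[^s], the lookahead consumes nothing after the name.
PATTERN = re.compile(r"output_attention(?!s)")

def fix_misspelled_args(text: str) -> str:
    """Replace the misspelled singular argument with 'output_attentions'."""
    return PATTERN.sub("output_attentions", text)

sample = "model(input_ids, output_attention=True)"
print(fix_misspelled_args(sample))  # model(input_ids, output_attentions=True)
```

Already-correct occurrences of `output_attentions` are left untouched, since the lookahead fails on the trailing `s`.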

@sgugger sgugger requested a review from LysandreJik September 2, 2020 11:58
codecov bot commented Sep 2, 2020

Codecov Report

Merging #6903 into master will increase coverage by 0.29%.
The diff coverage is n/a.


@@            Coverage Diff             @@
##           master    #6903      +/-   ##
==========================================
+ Coverage   79.30%   79.59%   +0.29%     
==========================================
  Files         157      157              
  Lines       28853    28853              
==========================================
+ Hits        22882    22966      +84     
+ Misses       5971     5887      -84     
| Impacted Files | Coverage Δ |
|---|---|
| src/transformers/configuration_auto.py | 93.18% <ø> (ø) |
| src/transformers/configuration_utils.py | 96.64% <ø> (+0.67%) ⬆️ |
| src/transformers/modelcard.py | 85.18% <ø> (ø) |
| src/transformers/modeling_auto.py | 78.73% <ø> (ø) |
| src/transformers/modeling_distilbert.py | 97.84% <ø> (ø) |
| src/transformers/modeling_encoder_decoder.py | 92.00% <ø> (ø) |
| src/transformers/modeling_tf_auto.py | 66.86% <ø> (ø) |
| src/transformers/modeling_tf_distilbert.py | 64.47% <ø> (-34.36%) ⬇️ |
| src/transformers/modeling_tf_utils.py | 87.29% <ø> (+0.32%) ⬆️ |
| src/transformers/modeling_utils.py | 87.50% <ø> (ø) |

... and 23 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 485da72...e8fd79c.

@LysandreJik (Member) left a comment:

Great, thanks!

@LysandreJik LysandreJik merged commit 8f2723c into master Sep 2, 2020
@LysandreJik LysandreJik deleted the output_attention_takes_an_s branch September 2, 2020 12:11
Zigur pushed a commit to Zigur/transformers that referenced this pull request Oct 26, 2020
* Fix output_attention -> output_attentions

* Formatting

* One unsaved file
Development

Successfully merging this pull request may close these issues.

Example config code uses invalid 'output_attention' rather than 'output_attentions'
2 participants