Fix shape issue of return_all_hiddens in roberta #1438
Conversation
By default `return_all_hiddens` is False, and the shape of `features` is BxTxC. If `return_all_hiddens` is used, the shape of `features` becomes TxBxC. See https://github.com/pytorch/fairseq/blob/9398a2829596393b73f5c5f1b99edf4c2d8f9316/fairseq/modules/transformer_sentence_encoder.py#L227
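The inconsistency above can be illustrated with a minimal sketch. This is not fairseq's actual implementation; the function name and the fixed T/B/C sizes are hypothetical, and numpy stands in for PyTorch tensors. The encoder internally works in TxBxC layout, so the `return_all_hiddens=True` branch must transpose its last inner state to match the BxTxC layout of the default branch:

```python
import numpy as np

def extract_features(return_all_hiddens=False):
    # Hypothetical stand-in for RoBERTa's feature extraction.
    T, B, C = 5, 2, 8  # sequence length, batch size, channels
    # The sentence encoder produces per-layer states in T x B x C layout.
    inner_states = [np.zeros((T, B, C)) for _ in range(3)]

    if return_all_hiddens:
        # Bug being fixed: returning inner_states[-1] directly yields
        # T x B x C. Transpose so both branches agree on B x T x C.
        features = np.transpose(inner_states[-1], (1, 0, 2))
        return features, {"inner_states": inner_states}

    # The default path already yields B x T x C.
    features = np.zeros((B, T, C))
    return features, {}
```

With the transpose in place, callers get the same `features` shape regardless of the `return_all_hiddens` flag.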
Hi HMJiangGatech! Thank you for your pull request and welcome to our community. We require contributors to sign our Contributor License Agreement, and we don't seem to have you on file. In order for us to review and merge your code, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA. If you have received this in error or have any questions, please contact us at cla@fb.com. Thanks!
Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Facebook open source project. Thanks!
@myleott has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Thanks, this is a good catch! I think it makes more sense to fix the API for
@myleott is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
…1438) Summary: By default `return_all_hiddens` is False, and the shape of `features` is BxTxC. If `return_all_hiddens` is used, the shape of `features` becomes TxBxC. See https://github.com/pytorch/fairseq/blob/9398a2829596393b73f5c5f1b99edf4c2d8f9316/fairseq/modules/transformer_sentence_encoder.py#L227 Pull Request resolved: facebookresearch#1438 Differential Revision: D18809509 Pulled By: myleott fbshipit-source-id: 696b395934e2b7e5807387069fe1da49a4df98c7
Summary: Fix #2897 Pull Request resolved: fairinternal/fairseq-py#1438 Reviewed By: myleott Differential Revision: D24992106 Pulled By: alexeib fbshipit-source-id: 0cb15c2e865c3e8f7950e8f5e6c54c5000637af2