
Single location to update optional args for all attentions #8128


Merged — 1 commit merged into pytorch:main on Feb 1, 2025

Conversation

iseeyuan
Contributor

@iseeyuan iseeyuan commented Feb 1, 2025

Summary: An incremental UX improvement. When users need to add an optional argument for a new attention, there is now a single centralized location, `ForwardOptions`, to update.

Differential Revision: D68988021
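The idea can be sketched as follows. This is an illustrative example only, not the actual ExecuTorch definition: the field names (`mask`, `input_pos`, `freqs_cos`, `freqs_sin`) and the `check_options` helper are hypothetical, chosen to show how one centralized container keeps every attention variant's optional arguments in a single place.

```python
# Hypothetical sketch of a centralized options container for attention
# forward() arguments. Field names are illustrative, not the real API.
from typing import Any, Optional, TypedDict


class ForwardOptions(TypedDict, total=False):
    """All optional forward() arguments, defined in one place.

    Adding an optional argument for a new attention variant only
    requires extending this single definition.
    """

    mask: Optional[Any]       # attention mask
    input_pos: Optional[Any]  # current token position(s), e.g. for a KV cache
    freqs_cos: Optional[Any]  # rotary-embedding cosines
    freqs_sin: Optional[Any]  # rotary-embedding sines


# Because the options live in one TypedDict, validation is also centralized.
KNOWN_KEYS = set(ForwardOptions.__annotations__)


def check_options(options: ForwardOptions) -> None:
    """Reject keys that no attention variant declares."""
    unknown = set(options) - KNOWN_KEYS
    if unknown:
        raise KeyError(f"unknown forward options: {sorted(unknown)}")
```

A caller then passes one `options` dict instead of a growing list of keyword arguments, and new variants extend the TypedDict rather than every call site.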


pytorch-bot bot commented Feb 1, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/8128

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (2 Unrelated Failures)

As of commit 1273002 with merge base a972e73:

BROKEN TRUNK - The following jobs failed but were present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Feb 1, 2025
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D68988021

@iseeyuan
Contributor Author

iseeyuan commented Feb 1, 2025

@pytorchbot label "topic: not user facing"

@facebook-github-bot facebook-github-bot merged commit a5c7609 into pytorch:main Feb 1, 2025
43 of 49 checks passed
sxu added a commit to sxu/executorch that referenced this pull request Feb 4, 2025
…t state as output

Summary: Pass a `ForwardOptions` argument (introduced by pytorch#8128) from the top level transformer, consolidate some existing inputs into it, and return any optional updates from the attention implementation.

Differential Revision: D69080123
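The follow-up commit's pattern — passing a `ForwardOptions` argument down from the top-level transformer and returning optional updates from each attention implementation — might look roughly like this. All names here (`layer_attention`, `transformer_forward`, `last_pos`) are hypothetical stand-ins, not the actual ExecuTorch code:

```python
# Illustrative sketch of threading an options container through a transformer
# and collecting optional state updates from each attention layer.
from typing import Any, Dict, Optional, Tuple

ForwardOptions = Dict[str, Any]  # simplified stand-in for the real container


def layer_attention(
    x: Any, options: ForwardOptions
) -> Tuple[Any, Optional[Dict[str, Any]]]:
    """Attention reads its optional args from `options` and may return
    optional updates (e.g. updated cache positions) alongside its output."""
    updates = {"last_pos": options.get("input_pos")}
    return x, updates


def transformer_forward(
    x: Any, options: ForwardOptions, num_layers: int = 2
) -> Tuple[Any, Dict[str, Any]]:
    """Top level passes one options object down and merges any updates
    the attention layers hand back."""
    all_updates: Dict[str, Any] = {}
    for _ in range(num_layers):
        x, updates = layer_attention(x, options)
        if updates:
            all_updates.update(updates)
    return x, all_updates
```

The benefit is that consolidating existing inputs into the options object keeps the transformer's signature stable while individual attention variants evolve.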
sxu added a commit to sxu/executorch that referenced this pull request Feb 6, 2025
…t state as output (pytorch#8186)

Summary:

Pass a `ForwardOptions` argument (introduced by pytorch#8128) from the top level transformer, consolidate some existing inputs into it, and return any optional updates from the attention implementation.

Reviewed By: iseeyuan, cccclai

Differential Revision: D69080123
Labels
CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. fb-exported topic: not user facing

3 participants