
[v0.9.1][DP] Tiny fix of dp and update example #1277


Closed · wants to merge 2 commits

Conversation

MengqingCao (Collaborator)

What this PR does / why we need it?

Backport from #1273.

Add max_num_tokens_across_dp to AscendMetadata to fix DP.

This PR fixes the bug introduced by #1229, which added an arg max_num_tokens_across_dp when dp_size > 1.
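For context, a minimal sketch of the shape of such a fix, assuming a dataclass-style metadata container. Only AscendMetadata and max_num_tokens_across_dp come from this PR; the surrounding field and the build_metadata helper are illustrative assumptions, not the actual vllm-ascend code:

```python
# Illustrative sketch only, NOT the actual vllm-ascend implementation.
# Only `AscendMetadata` and `max_num_tokens_across_dp` come from the PR;
# `num_actual_tokens` and `build_metadata` are assumed names.
from dataclasses import dataclass
from typing import Optional


@dataclass
class AscendMetadata:
    # Representative existing attention-metadata field (assumed).
    num_actual_tokens: int = 0
    # New field from the PR: the maximum token count across all
    # data-parallel ranks, so each rank can pad its batch to a uniform
    # size before DP collectives.
    max_num_tokens_across_dp: Optional[int] = None


def build_metadata(num_tokens: int, dp_size: int,
                   max_num_tokens_across_dp: Optional[int] = None
                   ) -> AscendMetadata:
    # Populate the DP field only when dp_size > 1, mirroring the
    # condition described for #1229 (helper name is hypothetical).
    if dp_size > 1:
        return AscendMetadata(
            num_actual_tokens=num_tokens,
            max_num_tokens_across_dp=max_num_tokens_across_dp)
    return AscendMetadata(num_actual_tokens=num_tokens)
```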

Does this PR introduce any user-facing change?

How was this patch tested?

MengqingCao (Collaborator, Author)

Please help review this, thanks! @wangxiyuan

wangxiyuan (Collaborator) left a comment


LGTM. But I'd like to see the PR for the main branch merged first.


This pull request has conflicts, please resolve those before we can evaluate the pull request.


Signed-off-by: MengqingCao <cmq0113@163.com>
Signed-off-by: MengqingCao <cmq0113@163.com>
ganyi1996ppo (Collaborator)

@zzzzwwjj has a PR to remove max_num_tokens_across_dp from the attention metadata; maybe we can drop this change? @MengqingCao

MengqingCao (Collaborator, Author)

> @zzzzwwjj has a PR to remove max_num_tokens_across_dp from the attention metadata; maybe we can drop this change? @MengqingCao

Yes, I noticed that the main changes, including the bug fix and the example modification, are also included in #1422. Thus, closing this PR.
