[DCU] fix topp kernel #66630

Merged

merged 4 commits into PaddlePaddle:develop from fix_topp
Aug 1, 2024
Conversation

@YanhuiDua (Contributor) commented Jul 26, 2024

PR Category

Custom Device

PR Types

Bug fixes

Description

card-85848
Fix the warp size and mismatched types in the top_p_sampling kernels.
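
For context, a minimal sketch of the kind of mismatch being fixed (not the PR's actual code; kWarpSize and WarpReduceSum are hypothetical names): NVIDIA warps are 32 lanes wide, while DCU/ROCm wavefronts are 64, so a hard-coded width of 32 and CUDA-only shuffle intrinsics break warp-level reductions on DCU. The "mismatched types" half of the fix is the same idea applied to integer widths, which must agree with the chosen warp width and lane masks.

```cpp
// Minimal sketch, not the PR's diff: pick the warp/wavefront width per
// backend instead of hard-coding 32.
#ifdef PADDLE_WITH_HIP
constexpr int kWarpSize = 64;  // DCU/ROCm wavefront width
#else
constexpr int kWarpSize = 32;  // CUDA warp width
#endif

template <typename T>
__device__ T WarpReduceSum(T val) {
  // Butterfly reduction across one warp/wavefront; the shuffle intrinsic
  // differs between CUDA (a _sync variant taking a lane mask) and HIP.
  for (int offset = kWarpSize / 2; offset > 0; offset >>= 1) {
#ifdef PADDLE_WITH_HIP
    val += __shfl_xor(val, offset, kWarpSize);
#else
    val += __shfl_xor_sync(0xffffffffu, val, offset, kWarpSize);
#endif
  }
  return val;
}
```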

paddle-bot commented Jul 26, 2024

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

@ronny1996 (Contributor) previously approved these changes Jul 30, 2024

LGTM

@@ -919,8 +919,13 @@ __global__ void topp_sampling_ft(T* sorted_probs,
}
}
if (!skip) {
#ifdef PADDLE_WITH_CUDA
A contributor commented inline on this hunk:
I think the logic here should be flipped to branch on PADDLE_WITH_HIP instead, which would keep the semantics consistent with the earlier code.
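
A sketch of what that suggestion amounts to (the branch bodies are elided in the diff hunk above, so they appear only as placeholder comments here):

```cpp
// Sketch of the reviewer's suggestion: guard on PADDLE_WITH_HIP so the
// DCU-specific branch reads the same way as the earlier HIP guards in
// this file; the bodies are placeholders for the elided kernel code.
if (!skip) {
#ifdef PADDLE_WITH_HIP
  // DCU/ROCm path (64-lane wavefront assumptions).
#else
  // CUDA path (32-lane warp assumptions).
#endif
}
```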

@Deleter-D (Contributor)

LGTM

@YanhuiDua merged commit fc53458 into PaddlePaddle:develop Aug 1, 2024
31 checks passed
Lans1ot pushed a commit to Lans1ot/Paddle that referenced this pull request Aug 5, 2024
* [DCU] fix topp

* fix

* fix
@YanhuiDua deleted the fix_topp branch Sep 3, 2024