
【Hackathon 5th No.104】move self_dp_attention to phi -part #58715

Merged: 1 commit into PaddlePaddle:develop, Nov 9, 2023

Conversation

zeroRains (Contributor) commented Nov 6, 2023:

PR types

Others

PR changes

Others

Description

move self_dp_attention to phi
#57262

yuanlehome (Contributor) previously approved these changes Nov 7, 2023:

LGTM

dim_input));
DDim out_dims({dim_input[0], dim_input[1], dim_input[3], dim_input[4]});
out->set_dims(out_dims);
out->share_lod(x);
A reviewer (Contributor) commented on the snippet above:

Please also set the dtype here.

Another reviewer (Contributor) commented:

This header file does not appear to be used anywhere else; could its contents be written directly into the kernel source file?

CLAassistant commented Nov 7, 2023:

CLA assistant check: all committers have signed the CLA.

CLAassistant:

CLA assistant check: 1 out of 2 committers have signed the CLA.

✅ zeroRains
❌ zerorains

zerorains does not seem to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account. If you have signed the CLA already but the status is still pending, let us recheck it.

Commit messages:

move self_dp_attention op to phi

move .h to the source

remove the header file
yuanlehome (Contributor) commented:

Just to confirm: both this PR and the other PR #58300 pass the unit tests, right?

zeroRains (Contributor, Author) replied:

> Just to confirm: both this PR and the other PR #58300 pass the unit tests, right?

Yes.

yuanlehome (Contributor) approved:

LGTM

XiaoguangHu01 (Contributor) approved:

LGTM

sunzhongkai588 (Contributor) approved:

LGTM, no docs changes

@yuanlehome yuanlehome merged commit a0700da into PaddlePaddle:develop Nov 9, 2023
28 checks passed
@luotao1 luotao1 changed the title 【Hackathon 5th No.104】move self_dp_attention to phi 【Hackathon 5th No.104】move self_dp_attention to phi -part Nov 9, 2023
@zeroRains zeroRains deleted the 104_self_dp branch November 24, 2023 14:33
7 participants