Move digamma to pten #39240
Conversation
Thanks for your contribution!
#pragma once
#include <unsupported/Eigen/SpecialFunctions>
Can this include be removed?
It is used by the functions below.
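For context, a minimal, hypothetical sketch (not the PR's actual file contents) of the kind of element-wise functor that depends on this header: it calls Eigen::numext::digamma, which is declared in <unsupported/Eigen/SpecialFunctions>.

// Minimal sketch of a digamma functor in the style of the pten CPU kernels.
// The call to Eigen::numext::digamma is what requires
// <unsupported/Eigen/SpecialFunctions>; names here are illustrative only.
#include <cstdint>
#include <unsupported/Eigen/SpecialFunctions>

template <typename T>
struct DigammaFunctor {
  DigammaFunctor(const T* input, T* output, int64_t numel)
      : input_(input), output_(output), numel_(numel) {}

  // The real kernel would mark this operator HOSTDEVICE so the same functor
  // can run on GPU; the qualifier is omitted to keep the sketch self-contained.
  void operator()(int64_t idx) const {
    output_[idx] = Eigen::numext::digamma(input_[idx]);
  }

  const T* input_;
  T* output_;
  int64_t numel_;
};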
#pragma once
#include <unsupported/Eigen/SpecialFunctions>
Same as above.
It is used by the functions below.
#include "paddle/pten/backends/gpu/gpu_context.h"
#include "paddle/pten/common/scalar.h"
#include "paddle/pten/core/dense_tensor.h"
Perhaps these includes can be removed.
Yes, they can be removed, but it does not affect the binary size.
PR types: Breaking changes
PR changes: OPs
Describe: Move digamma to pten
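For reference, a hedged sketch of what the migrated kernel's declaration would look like under the usual pten convention (assumed, not copied from the diff; the exact header path and namespace in the PR may differ).

// Assumed declaration following the per-type, per-backend pten kernel
// convention; illustrative only.
#include "paddle/pten/core/dense_tensor.h"

namespace pten {

template <typename T, typename Context>
void DigammaKernel(const Context& dev_ctx, const DenseTensor& x, DenseTensor* out);

}  // namespace pten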