[oneDNN] disable caching for interpolate and batch Norm #35030
Conversation
Thanks for your contribution!
LGTM
auto out_dims_vec = ComputeOutputShape(ctx);
framework::DDim dim_out = framework::make_ddim(out_dims_vec);
z->mutable_data<T>(dim_out, ctx.GetPlace());
z->Resize(dim_out);
Why was this mutable_data call needed earlier, and why is it no longer needed now?
Great question. mutable_data is not needed here because it is already called internally inside AcquireDstMemory. Keeping it here (the deleted line) meant it was called twice, which was unnecessary and in some situations could cause a performance drop.
LGTM
LGTM
PR types
Others
PR changes
OPs
Describe
This PR disables caching of oneDNN objects in the PaddlePaddle cache for the interpolate and batch norm operators; the oneDNN library's own cache will be used instead.