
Conversation

@Vvsmile (Contributor) commented Jun 21, 2023

PR types

Others

PR changes

OPs

Description

card-72073

  1. Modify the bf16 accuracy-checking framework and its default tolerance (a minimal sketch of such a relaxed-tolerance check follows this list).
  2. Fix the elementwise_max and elementwise_min bf16 accuracy problems.
  3. Fix the elementwise_div bf16 accuracy problem.
  4. Fix the matmul_v2 bf16 accuracy problem.
  5. Fix the unique bf16 accuracy problem.
  6. Fix the pool_max bf16 accuracy problem.
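
As a rough illustration of item 1, the following is a minimal, self-contained sketch (not Paddle's actual op test framework) of checking a bf16 result against an fp32 reference under a relaxed relative tolerance; the helper names and the rtol/atol values are illustrative assumptions.

```python
# Minimal sketch, NOT Paddle's op test framework: compare a simulated-bf16
# result against its fp32 reference under a relaxed relative tolerance.
import numpy as np


def truncate_to_bf16(x):
    # Simulate bf16 by keeping only the top 16 bits of each fp32 value
    # (plain truncation; real bf16 rounds to nearest even).
    bits = np.asarray(x, dtype=np.float32).view(np.uint32)
    return ((bits >> 16) << 16).astype(np.uint32).view(np.float32)


def assert_bf16_close(actual, expected_fp32, rtol=1e-2, atol=1e-2):
    # bf16 keeps only ~8 bits of mantissa, so fp32-level tolerances
    # (on the order of 1e-5) are far too strict; ~1e-2 is a typical relaxation.
    np.testing.assert_allclose(actual, expected_fp32, rtol=rtol, atol=atol)


if __name__ == "__main__":
    ref = np.random.rand(8, 8).astype(np.float32)
    assert_bf16_close(truncate_to_bf16(ref), ref)
```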

@paddle-bot bot commented Jun 21, 2023

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

@paddle-ci-bot bot commented Jun 30, 2023

Sorry to inform you that the CIs for 1b5431b passed more than 7 days ago. To prevent PR conflicts, please re-run all CIs manually.

'user_defined_grad_outputs': [self.grad_out],
'check_dygraph': self.check_dygraph,
'check_prim': self.check_prim,
'max_relative_error': 0.02,
A reviewer (Contributor) commented on the test snippet quoted above:

Does fp32 also need a larger max_relative_error?
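
For context, here is a minimal, hypothetical sketch of selecting a dtype-dependent gradient tolerance like the 0.02 quoted in the diff above; apart from that value and the kwarg names shown in the snippet, everything here (the default tolerance, the flag values, the helper name) is an illustrative assumption rather than the code from this PR.

```python
# Hypothetical sketch: choose the gradient-check tolerance per dtype and build
# the kwargs quoted in the diff above. Not the actual code from this PR.
import numpy as np


def max_relative_error_for(dtype):
    # bf16 (commonly stored as uint16 in numpy arrays) needs a much looser
    # bound than fp32, which can usually keep the framework default.
    return 0.02 if dtype == np.uint16 else 0.005  # 0.005 is an assumed default


check_grad_kwargs = {
    'check_dygraph': True,   # flag values here are placeholders
    'check_prim': True,
    'max_relative_error': max_relative_error_for(np.uint16),
}
# In the real test these kwargs are forwarded to the op test's gradient check,
# roughly: self.check_grad(['X', 'Y'], 'Out', **check_grad_kwargs)
print(check_grad_kwargs)
```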

@ZzSean (Contributor) left a comment

LGTM

@ZzSean ZzSean merged commit 6f7ceca into PaddlePaddle:develop Jul 13, 2023
cqulilujia pushed a commit to cqulilujia/Paddle that referenced this pull request Jul 24, 2023
* modify the accuracy checking framework of bf16 optest, including both of forward and backward
wz1qqx pushed a commit to wz1qqx/Paddle that referenced this pull request Jul 31, 2023
* modify the accuracy checking framework of bf16 optest, including both of forward and backward