[CPU] Skip CPU support unimplemented error #3633
Conversation
Force-pushed from 5838577 to 7e534ab
I found that if no tests are collected, pytest returns exit code 5 (the correct exit code in this case should be 0). I updated the command for running the UT so that pytest exits correctly.
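A minimal sketch of that fix, assuming a hypothetical test path `tests/unit` and using pytest's documented `ExitCode.NO_TESTS_COLLECTED` (value 5); the actual command in the PR may differ:

```python
import sys

import pytest

# Run the unit tests; pytest returns ExitCode.NO_TESTS_COLLECTED (5)
# when every test in the session is skipped or deselected.
exit_code = pytest.main(["tests/unit"])

# Treat "no tests collected" as success so a fully-skipped CPU run
# does not fail the CI job.
if exit_code == pytest.ExitCode.NO_TESTS_COLLECTED:
    exit_code = pytest.ExitCode.OK
sys.exit(int(exit_code))
```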
Force-pushed from 7e534ab to 7ea1c8b
accelerator/cuda_accelerator.py (outdated)

@@ -147,6 +147,9 @@ def is_fp16_supported(self):
         else:
             return False

+    def supported_dtypes(self):
+        return [torch.float, torch.half]
CUDA supports torch.bfloat16 as well, right?
Is there any reason we cannot use the data types defined here: https://pytorch.org/docs/stable/tensors.html#data-types?
Hi tjruwase, I am sorry for the late reply. CUDA supports torch.bfloat16 as well. I will update my PR later. Thanks~
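A sketch of what that follow-up presumably looks like (hypothetical, not the exact commit; the real method lives on the CUDA accelerator class in accelerator/cuda_accelerator.py):

```python
import torch

class CUDA_Accelerator:
    # Sketch: only the method under discussion is shown.
    def supported_dtypes(self):
        # bf16 added per the review comment above.
        return [torch.float, torch.half, torch.bfloat16]
```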
@Yejing-Lai, is this PR ready for review? Thanks!
It's ready. The UT works correctly in my local env. Please review. Thanks~
@Yejing-Lai, PR looks good. Please resolve the conflict to enable merge. Thanks.
Hi @tjruwase, can I run the CI again? I think both checks failed due to a network issue.
Revert "remove skip FusedAdamBuilder; add suported_dtypes" Revert "remove unused parameters" Revert "enable zero stage 1 and 2 for synchronized accelerator (a.k.a. CPU)" Revert "use cpu adam to implement fused adam" Revert "fused adam can build"
Hi @tjruwase, I am sorry to have reverted the FusedAdam commits. Our FusedAdam needs further modification, and we will submit a separate PR for FusedAdam after this PR merges. Thanks for your review/merge~
@Yejing-Lai, no worries. Thanks for the update. Please ping me when you are ready to move forward with this PR. Thanks!
Hi @tjruwase, this PR is ready for merge. Please review. Thanks~
This PR aims to add CPU inference UT support. We skip tests that hit CPU-support unimplemented errors and update the CPU inference workflow.
Skip logic:
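The PR body trails off here; a plausible shape for such skip logic, with `skip_on_cpu_unimplemented` as a hypothetical name (the actual implementation in the PR may differ):

```python
import functools

import pytest

def skip_on_cpu_unimplemented(fn):
    # Hypothetical decorator: run the test body and convert a
    # NotImplementedError raised by an op without CPU support into
    # a pytest skip instead of a test failure.
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        try:
            return fn(*args, **kwargs)
        except NotImplementedError as err:
            pytest.skip(f"CPU support not implemented: {err}")
    return wrapper
```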