Running inference on CPU #523
Henrique-Potter started this conversation in General

How can we run superbench models' inference on CPU? It seems that it only supports GPU.

Replies: 1 comment

PyTorch model inference should work on CPU. TensorRT and ORT inference, however, only work on NVIDIA GPUs. The model inference benchmarks are mainly designed to validate GPUs rather than CPUs.
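As a rough illustration of the first point, plain PyTorch inference can be pinned to CPU without any SuperBench-specific code. This is a minimal sketch, not SuperBench's own benchmark wrapper; the model (ResNet-50 from torchvision), batch size, and iteration count are arbitrary example choices:

```python
import time

import torch
import torchvision.models as models

# Minimal sketch: plain PyTorch inference pinned to CPU. The model and
# batch size are illustrative, not SuperBench defaults.
device = torch.device("cpu")
model = models.resnet50(weights=None).to(device).eval()  # pretrained=False on older torchvision

x = torch.randn(8, 3, 224, 224, device=device)  # dummy input batch

# Warm up once, then time a few forward passes, loosely mimicking what
# the model inference benchmarks measure on GPU.
with torch.no_grad():
    model(x)
    start = time.perf_counter()
    for _ in range(10):
        model(x)
elapsed = (time.perf_counter() - start) / 10
print(f"average CPU inference latency: {elapsed * 1000:.1f} ms")
```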