Update docs for ORT support on Jetson (triton-inference-server#4157) (t…
Hemant Jain authored Apr 6, 2022
1 parent 172f5f6 commit 847d5b2
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions docs/jetson.md
@@ -43,10 +43,10 @@ Triton Inference Server support on JetPack includes:
 * [HTTP/REST and GRPC inference protocols](inference_protocols.md)
 * [C API](inference_protocols.md#c-api)
 
-Limitations on Jetson/JetPack:
+Limitations on JetPack 5.0:
 
-* Onnx Runtime backend does not support the OpenVino execution provider.
-  The TensorRT execution provider however is supported.
+* Onnx Runtime backend does not support the OpenVino and TensorRT execution providers.
+  The CUDA execution provider is in Beta.
 * The Python backend does not support GPU Tensors and Async BLS.
 * CUDA IPC (shared memory) is not supported. System shared memory however is supported.
 * GPU metrics, GCS storage, S3 storage and Azure storage are not supported.
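
For context on the first limitation: the ONNX Runtime backend selects its execution provider per model through the optimization block of that model's config.pbtxt. Below is a minimal sketch, assuming a hypothetical model named densenet_onnx (not part of this commit). Per the change above, the tensorrt accelerator entry would fail on JetPack 5.0, where the Beta CUDA execution provider is the only GPU option for ORT models.

    # models/densenet_onnx/config.pbtxt -- illustrative sketch, not from this commit
    name: "densenet_onnx"
    backend: "onnxruntime"
    max_batch_size: 8
    optimization {
      execution_accelerators {
        # Requests the TensorRT execution provider for this model.
        # Per the limitation above, this is not supported on JetPack 5.0;
        # omit the block there and ONNX Runtime uses the (Beta) CUDA provider.
        gpu_execution_accelerator : [ { name : "tensorrt" } ]
      }
    }

On JetPack releases before 5.0, where the diff's removed text notes the TensorRT provider is supported, the same stanza enables it.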
