Remove mention of custom backend (triton-inference-server#2905)
CoderHam authored May 21, 2021
1 parent c07e9c5 commit 021593f
Showing 1 changed file with 1 addition and 2 deletions.
3 changes: 1 addition & 2 deletions docs/compose.md
@@ -59,8 +59,7 @@ To create an image containing the minimal possible Triton use the
 following multi-stage Dockerfile. As mentioned above the amount of
 customization currently available is limited. As a result the minimum
 Triton still contains both HTTP/REST and GRPC endpoints; S3, GCS and
-Azure Storage filesystem support; and the TensorRT and legacy custom
-backends.
+Azure Storage filesystem support; and the TensorRT backend.
 
 ```
 FROM nvcr.io/nvidia/tritonserver:<xx.yy>-py3 as full
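
The Dockerfile body is truncated in the diff above. As a rough sketch only, assuming the standard /opt/tritonserver layout and a <xx.yy>-py3-min base image (neither taken from docs/compose.md), a multi-stage build of this kind might look like:

```
# Illustrative sketch only; not the actual Dockerfile from docs/compose.md.
# Stage 1: the full Triton image, used purely as a source of files.
FROM nvcr.io/nvidia/tritonserver:<xx.yy>-py3 as full

# Stage 2: a minimal base image (assumed tag), into which only the
# needed pieces are copied.
FROM nvcr.io/nvidia/tritonserver:<xx.yy>-py3-min

# Server binary and shared libraries from the full image (assumed paths).
COPY --from=full /opt/tritonserver/bin/tritonserver /opt/tritonserver/bin/
COPY --from=full /opt/tritonserver/lib /opt/tritonserver/lib

# Only the TensorRT backend; other backends are left out of the minimal image.
COPY --from=full /opt/tritonserver/backends/tensorrt /opt/tritonserver/backends/tensorrt
```

Copying selected pieces out of the full image into a slimmer base is what keeps the resulting image small while retaining the endpoints, filesystem support, and TensorRT backend listed above.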
