Fix local models being treated as S3 paths #1075
Merged
@vishalbollu when the model's path is not an S3 path, the model cacher and validator still treat it as an S3 path instead of as a local path. This prevents deploying APIs with locally provided models (ONNX and TensorFlow).
How to reproduce:
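The config that was attached here is not preserved in this capture; the sketch below is a hypothetical reconstruction, with only the API name and model path taken from the error message that follows (all other fields are assumed):

```yaml
# Hypothetical Cortex API spec (field layout assumed);
# only "api-tf" and the model path appear in the original report.
- name: api-tf
  predictor:
    type: tensorflow
    model: ./inferentia-models/tensorflow/resnet50  # local path, not s3://
```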
With the above config, the resulting error is:
error: /home/robert/sandbox/feature/multi-model-endpoint/cortex_tf_sample.yaml: api-tf: predictor: model: "./inferentia-models/tensorflow/resnet50" is not a valid s3 path (e.g. s3://cortex-examples/iris-classifier/tensorflow is a valid s3 path)
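The failure above suggests the validator unconditionally applies the S3-path check. The fix this PR implies is to treat only `s3://`-prefixed paths as remote and everything else as local. A minimal sketch of that dispatch, with hypothetical helper names (not Cortex's actual Go implementation):

```python
def is_s3_path(path: str) -> bool:
    """Return True only for paths using the s3:// scheme."""
    return path.startswith("s3://")


def resolve_model_path(path: str) -> str:
    """Hypothetical dispatch: remote models would be fetched from S3,
    while local model directories are used as-is."""
    if is_s3_path(path):
        # Placeholder for an S3 download step in the real cacher.
        return f"<fetched from {path}>"
    # Local path: skip S3 validation and caching entirely.
    return path
```

With this split, a path like `./inferentia-models/tensorflow/resnet50` would bypass the S3 validator instead of triggering the "is not a valid s3 path" error.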
Checklist:
- `make test`
- `make lint`