
Conversation


@swenkel commented on May 24, 2022

Set the execution providers explicitly, as required since ORT 1.9.

Without an explicit providers argument, instantiating an InferenceSession fails with:

ValueError: This ORT build has ['CUDAExecutionProvider', 'DnnlExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['CUDAExecutionProvider', 'DnnlExecutionProvider', 'CPUExecutionProvider'], ...)
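
For reference, a minimal sketch of the change, assuming a placeholder model path "model.onnx" and the provider list from the error above; providers are tried in the order given, falling back to the CPU provider:

```python
# Minimal sketch (assumed model path "model.onnx"): create an InferenceSession
# with the providers parameter passed explicitly, as required since ORT 1.9.
# Trim the list to the providers actually enabled in your ONNX Runtime build.
import onnxruntime

session = onnxruntime.InferenceSession(
    "model.onnx",
    providers=[
        "CUDAExecutionProvider",
        "DnnlExecutionProvider",
        "CPUExecutionProvider",
    ],
)

# get_providers() reports the providers actually in effect for this session.
print(session.get_providers())
```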
