diff --git a/ml.md b/ml.md
index f230010..dc9139e 100644
--- a/ml.md
+++ b/ml.md
@@ -9,14 +9,14 @@ Then, the user passes tensor inputs to the graph, computes the
-wasi:nn/tensor@0.2.0-rc-2024-06-25
-wasi:nn/errors@0.2.0-rc-2024-06-25
-wasi:nn/inference@0.2.0-rc-2024-06-25
-wasi:nn/graph@0.2.0-rc-2024-06-25
+wasi:nn/tensor@0.2.0-rc-2024-08-19
+wasi:nn/errors@0.2.0-rc-2024-08-19
+wasi:nn/inference@0.2.0-rc-2024-08-19
+wasi:nn/graph@0.2.0-rc-2024-08-19
All inputs and outputs to an ML inference are represented as `tensor`s.
[1] for the tensor dimensions.
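As an illustrative sketch (not part of this diff), a Rust guest built against wit-bindgen-generated bindings for these interfaces might construct a `tensor` roughly as follows; the `wasi_nn` module path, item names, and exact parameter types are assumptions rather than the normative bindings:

```rust
// Hypothetical wit-bindgen output for wasi:nn@0.2.0-rc-2024-08-19; the
// `wasi_nn` module path and item names are assumptions.
use wasi_nn::tensor::{Tensor, TensorType};

fn make_input() -> Tensor {
    // A 1x3x224x224 FP32 tensor (e.g., one RGB image), zero-initialized.
    let dimensions: Vec<u32> = vec![1, 3, 224, 224];
    let data = vec![0u8; 3 * 224 * 224 * 4]; // 4 bytes per f32 element
    Tensor::new(&dimensions, TensorType::Fp32, &data)
}
```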
-TODO: create function-specific errors (https://github.com/WebAssembly/wasi-nn/issues/42)
An inference "session" is encapsulated by a graph-execution-context
. This structure binds a
graph
to input tensors before compute
-ing an inference:
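For context only (again, not part of the diff), the flow described by that sentence might look like this in a Rust guest, assuming the `init-execution-context`, `set-input`, `compute`, and `get-output` functions of this snapshot and hypothetical input/output tensor names:

```rust
use wasi_nn::errors::Error;
use wasi_nn::graph::Graph;
use wasi_nn::inference::GraphExecutionContext;
use wasi_nn::tensor::Tensor;

// Bind a graph to an input tensor, compute, and read one output back.
// The tensor names "input" and "output" are model-specific assumptions,
// as are the exact generated signatures.
fn run(graph: &Graph, input: Tensor) -> Result<Tensor, Error> {
    let ctx: GraphExecutionContext = graph.init_execution_context()?;
    ctx.set_input("input", input)?;
    ctx.compute()?;
    ctx.get_output("output")
}
```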
A `graph` is a loaded instance of a specific ML model (e.g., MobileNet) for a specific ML
framework (e.g., TensorFlow):
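Similarly, loading a `graph` (the step this sentence introduces) could be sketched as below; the encoding and target values are examples, and the binding names remain assumptions:

```rust
use wasi_nn::errors::Error;
use wasi_nn::graph::{load, ExecutionTarget, Graph, GraphEncoding};

// Load an ONNX model from its raw bytes and target CPU execution.
fn load_model(model_bytes: Vec<u8>) -> Result<Graph, Error> {
    // `load` takes a list of byte buffers because some backends encode a
    // model in parts (e.g., topology and weights as separate buffers).
    load(&[model_bytes], GraphEncoding::Onnx, ExecutionTarget::Cpu)
}
```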