Description
The current API for loading models is as follows:
ITransformer mlModel = mlContext.Model.Load(path, out DataViewSchema inputSchema);
Note that you get the transformer back along with the input schema, and the transformer also gives access to the output schema.
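For reference, a minimal sketch of what that gives you (the "model.zip" path is a placeholder and the schema dump is just for illustration):
var mlContext = new MLContext();
ITransformer mlModel = mlContext.Model.Load("model.zip", out DataViewSchema inputSchema);
// The input schema is available immediately, without having to know it up front.
foreach (var column in inputSchema)
    Console.WriteLine($"{column.Name}: {column.Type}");
// The output schema can be derived from the transformer itself.
DataViewSchema outputSchema = mlModel.GetOutputSchema(inputSchema);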
For ONNX, the same functionality will look something like this:
var estimator = mlContext.Transforms.ApplyOnnxModel(path);
var dataView = mlContext.Data.LoadFromEnumerable(new ModelInput[] { });
// Fit() will check the input schema of the model against the input dataview you're passing in
var transformer = estimator.Fit(dataView);
Note that there is no way to get the input schema; you have to know it before you can do anything with the model. The Fit() call is also confusing and inconsistent with the much cleaner API above: it doesn't actually do anything other than verify the input schema (which is unavailable from the public interface) against the schema that was loaded with the model.
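To make the problem concrete: just to call Fit(), the caller already has to hand-write an input class that matches the ONNX model, even though nothing in the API exposes what that input should be. The input name and dimensions below are hypothetical:
public class ModelInput
{
    // The column name and vector dimensions must match the ONNX model's input,
    // but there is no public API to discover them -- you have to know them in advance.
    [ColumnName("data")]
    [VectorType(3, 224, 224)]
    public float[] Data { get; set; }
}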
The fix is to bring the ONNX model loading API in line with what we have for regular ML.NET models.
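One possible shape for that (the overload below is hypothetical and does not exist today) is to have the ONNX loader hand back a transformer plus the input schema read from the model itself, mirroring mlContext.Model.Load:
// Hypothetical API sketch -- not an existing ML.NET method.
ITransformer onnxModel = mlContext.Model.LoadOnnxModel(path, out DataViewSchema inputSchema);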