
API for loading ONNX model lacks access to input schema and is not consistent with regular model loading API #4335

@Dmitry-A

Description


The current API for loading regular ML.NET models is as follows:
ITransformer mlModel = mlContext.Model.Load(path, out DataViewSchema inputSchema);

Note that you get the transformer back along with the input schema, and the transformer gives access to the output schema as well.
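
As a minimal sketch (the "model.zip" path and the column printout are illustrative only): both schemas are available immediately after the load call, with no input data required.

using System;
using Microsoft.ML;

var mlContext = new MLContext();

// Load returns the transformer and the input schema in one call.
ITransformer mlModel = mlContext.Model.Load("model.zip", out DataViewSchema inputSchema);

// The transformer exposes the output schema for a given input schema.
DataViewSchema outputSchema = mlModel.GetOutputSchema(inputSchema);

foreach (var column in inputSchema)
    Console.WriteLine($"input:  {column.Name} ({column.Type})");
foreach (var column in outputSchema)
    Console.WriteLine($"output: {column.Name} ({column.Type})");
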
For ONNX, the same functionality currently looks something like this:
var estimator = mlContext.Transforms.ApplyOnnxModel(path);
var dataView = mlContext.Data.LoadFromEnumerable(new ModelInput[] { });
// Fit() will check the input schema of the model against the input dataview you're passing in
var transformer = estimator.Fit(dataView);

Note that there is no way to get the input schema; you have to already know it before you can do anything with the model. The Fit() call is also confusing and inconsistent with the much cleaner API above: it does nothing other than verify the model's input schema (which is unavailable through the public interface) against the schema of the data view you pass in. A self-contained version of this workaround is sketched below.
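
For comparison, here is a self-contained sketch of that workaround. The ModelInput class, its column name "data", and its tensor shape are hypothetical placeholders; they are exactly the details the API gives you no way to discover, so they have to be known and hand-written up front.

using Microsoft.ML;
using Microsoft.ML.Data;

var mlContext = new MLContext();

// The estimator alone exposes neither the input nor the output schema.
var estimator = mlContext.Transforms.ApplyOnnxModel("model.onnx");

// An empty data view exists only to carry the hand-written schema into Fit().
var dataView = mlContext.Data.LoadFromEnumerable(new ModelInput[] { });

// Fit() merely validates that schema against the model; nothing is trained.
var transformer = estimator.Fit(dataView);

// Hypothetical input type: the column name, element type, and shape must
// already match the ONNX model's (inaccessible) input schema.
public class ModelInput
{
    [ColumnName("data")]
    [VectorType(1, 3, 224, 224)]
    public float[] Data { get; set; }
}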

The fix is to bring the ONNX model loading API in line with what we have for regular ML.NET models.
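
One possible shape for the fix, mirroring the existing Model.Load call (the overload below does not exist today; it is purely illustrative of the request, and assumes an MLContext named mlContext):

// Hypothetical API sketch, not an existing ML.NET method: return the ONNX
// model's input schema together with the transformer, so no dummy data view
// or Fit() call is needed just to get a usable transformer.
ITransformer onnxModel = mlContext.Transforms.ApplyOnnxModel(
    "model.onnx", out DataViewSchema onnxInputSchema);

// With the schema in hand, the output schema follows as it does for regular models.
DataViewSchema onnxOutputSchema = onnxModel.GetOutputSchema(onnxInputSchema);
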

Labels: P2 (priority: needs to be fixed at some point), bug (something isn't working), onnx (exporting or loading ONNX models)
