
SaveOnnxCommand appears to ignore predictors when saving a model to ONNX format. #3974

Closed

Description

System information

  • OS version/distro: Windows 10

Steps To Recreate The Issue

  1. Create and save a PredictorModel to disk using the entry point API.

  2. Try to convert the model to ONNX format using the entry point API.

  3. Notice that SaveOnnxCommand.GetPipe only cycles through the transforms and never encounters the logistic regression node.

    This might be happening because ExecuteGraphCommand.GetOutputToPath saves a TlcModule.DataKind.PredictorModel to disk in step (1), while ExecuteGraphCommand.SetInputFromPath loads a TlcModule.DataKind.TransformModel from disk in step (2) (apparently because SaveOnnxCommand.Arguments.Model is of type TransformModel). PredictorModelImpl and TransformModelImpl do not appear to be compatible from a serialization point of view.
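For illustration, here is a minimal sketch of what the two entry-point graphs in steps (1) and (2) might look like when run through maml.exe ExecuteGraph. The entry point names, argument names, and key casing are assumptions for this sketch, not the exact graphs from the repro (see the linked test below); the relevant detail is only that the trained model leaves the first graph as a PredictorModel file and re-enters the second graph through SaveOnnxCommand.Arguments.Model, which is declared as a TransformModel.

```json
{
  "inputs": { "file": "train-data.txt" },
  "nodes": [
    {
      "name": "Data.TextLoader",
      "inputs": { "InputFile": "$file" },
      "outputs": { "Data": "$data" }
    },
    {
      "name": "Trainers.LogisticRegressionBinaryClassifier",
      "inputs": { "TrainingData": "$data" },
      "outputs": { "PredictorModel": "$model" }
    }
  ],
  "outputs": { "model": "predictor-model.zip" }
}
```

ExecuteGraphCommand.GetOutputToPath serializes $model to predictor-model.zip as a PredictorModel. A second graph then feeds that file to the ONNX converter entry point (assumed here to be Models.OnnxConverter, the entry point backed by SaveOnnxCommand):

```json
{
  "inputs": { "model": "predictor-model.zip" },
  "nodes": [
    {
      "name": "Models.OnnxConverter",
      "inputs": {
        "Model": "$model",
        "Onnx": "model.onnx",
        "Json": "model.onnx.json"
      }
    }
  ]
}
```

It is at the boundary between these two graphs that ExecuteGraphCommand.SetInputFromPath rehydrates the saved file as a TransformModel, which is where the predictor appears to get lost.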

Source code / logs

See here for an ML.NET test that demonstrates the issue.



Labels

P0 - Priority of the issue for triage purposes: IMPORTANT, needs to be fixed right away.
