
Support scenario: Input Bitmap + Training = ONNX model #7209


Description

Is your feature request related to a problem? Please describe.

Right now, ML supports several scenarios, like image classification and detection, which take a Bitmap as input and, after training, produce an ML Zip file.

ML also supports training over plain data; after training, that model can be exported to ONNX.

But there is no scenario or example that covers both, that is: training a model using bitmaps as input and being able to output an ONNX model.

Apparently, the main roadblock is that the ONNX converter toolchain is limited to a few data types, which do not include MLImage.

Describe the solution you'd like

Some solutions have already been proposed. For example, #5271 proposes excluding the input-data pre-processing stage from the training pipeline, which happens to be the part that cannot be exported to ONNX. Ideally, the export process would begin at the point in the pipeline where the input image has already been converted to a tensor.

Another solution would be to make the ONNX converter toolchain handle any incoming MLImage as a tensor.

Yet another solution would be to introduce a new "image" type, which is lower level and more "palatable" to the ONNX converter. Theoretically, this image type would represent an image in its already pre-processed state, e.g. scaled to a fixed size.
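To make the proposals above concrete, here is a minimal, framework-free sketch of the pre-processing step that would be kept outside the exported ONNX graph: the bitmap is resized to a fixed shape and converted to a normalized channel-first (CHW) float tensor before reaching the exportable part of the pipeline. The nearest-neighbor resize, CHW layout, and [0, 1] scaling are illustrative assumptions, not ML's actual defaults.

```python
def bitmap_to_tensor(pixels, target_h, target_w):
    """pixels: list of rows, each row a list of (r, g, b) uint8 tuples.

    Returns a flat CHW list of floats in [0, 1], resized to
    target_h x target_w with nearest-neighbor sampling.
    """
    src_h, src_w = len(pixels), len(pixels[0])
    tensor = []
    for c in range(3):                       # channel-first (CHW) layout
        for y in range(target_h):
            sy = y * src_h // target_h       # nearest-neighbor row
            for x in range(target_w):
                sx = x * src_w // target_w   # nearest-neighbor column
                tensor.append(pixels[sy][sx][c] / 255.0)
    return tensor

# A 2x2 "bitmap": red, green / blue, white.
bmp = [[(255, 0, 0), (0, 255, 0)],
       [(0, 0, 255), (255, 255, 255)]]
tensor = bitmap_to_tensor(bmp, 4, 4)
print(len(tensor))  # 3 * 4 * 4 = 48
```

Everything after this point in the pipeline deals only in fixed-shape float tensors, which is exactly the kind of input the ONNX converter already handles.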

Finally, if this Bitmap + Training = ONNX scenario is already supported by the current libraries, it would be desirable to have an end-to-end example showing how to properly configure the input data and the pipeline so it can be successfully exported to ONNX. (I've also looked for such an example in the examples repository, with no success.)

Describe alternatives you've considered

Not using ML at all and doing the training with other frameworks.

Additional context

This is a long-standing issue that has already been highlighted by issues like #6810, and I apologize for opening yet another one, but the problem seems to have gone unanswered for months (years?). From time to time I come back to look for news and see whether the latest version of the ML libraries finally solves it, only to discover it remains unresolved.

Additionally, we're using OnnxRuntime at a low level for inference, so we really do require ONNX export; ML.Zip is not an option for us.


Metadata


Labels

enhancement (New feature or request), onnx (Exporting ONNX models or loading ONNX models), untriaged (New issue has not been triaged)
