
Fix ModelWrapper method to preserve order of graph inputs and outputs #186


Merged: 3 commits, Jun 6, 2025

Conversation

auphelia (Contributor)

This PR fixes an issue with set_tensor_shape and the order of input and output tensors in the ONNX GraphProto. When setting the shape of one of these tensors, it used to be removed and then re-added, which could change its position. This was harmless for internal tensors (ValueInfo is unordered), but could reorder graph.input or graph.output.

With this PR, the code first fetches the current index of the tensor in graph.input or graph.output, then inserts the updated tensor with its new shape back at the same position, keeping the original order intact.
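The remove-then-reinsert logic described above can be illustrated with a minimal sketch. This is a hypothetical stand-in, not the actual qonnx code: graph.input is modeled as a plain Python list of (name, shape) pairs, and the helper names are invented for illustration.

```python
def set_tensor_shape_buggy(tensors, name, new_shape):
    """Old behavior: remove the entry, then append the update.
    If the tensor was not last, the list order changes."""
    tensors[:] = [t for t in tensors if t[0] != name]
    tensors.append((name, new_shape))

def set_tensor_shape_fixed(tensors, name, new_shape):
    """Fixed behavior: record the tensor's index first, then
    insert the updated entry back at the same position."""
    idx = next(i for i, t in enumerate(tensors) if t[0] == name)
    del tensors[idx]
    tensors.insert(idx, (name, new_shape))

# Three graph inputs, in order.
inputs = [("a", [1, 3]), ("b", [1, 4]), ("c", [1, 5])]

buggy = list(inputs)
set_tensor_shape_buggy(buggy, "a", [2, 3])
print([t[0] for t in buggy])   # ['b', 'c', 'a'] -- order broken

fixed = list(inputs)
set_tensor_shape_fixed(fixed, "a", [2, 3])
print([t[0] for t in fixed])   # ['a', 'b', 'c'] -- order preserved
```

The same pattern applies to the real GraphProto, where graph.input and graph.output are protobuf repeated fields that also support positional insertion.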

@maltanar (Collaborator)

maltanar commented Jun 6, 2025

Thanks for catching and fixing this @auphelia! I took the liberty of adding a unit test that triggers the problem addressed by this fix.

@maltanar maltanar merged commit 0630cea into fastmachinelearning:main Jun 6, 2025
2 of 3 checks passed