Conversation

@HongHongHongL (Contributor) commented Sep 9, 2024

  1. In onnx_frontend.py, parameters whose names start with "onnx::" have that prefix stripped. The stripped name (var_name) should also be the key under which the parameter is stored in self._params; a standalone sketch of the fix follows the list.
if self._keep_params_in_input:
    # PyTorch sometimes inserts a silly "onnx::" weight prefix. Remove it.
    var_name = init_tensor.name.strip("onnx::")
    init_var = self._new_var(var_name, shape=array.shape, dtype=array.dtype)
    self._nodes[init_tensor.name] = init_var
    # We need to keep track of both the real value and variable for this variable.
    # Bug: keyed by the original init_tensor.name instead of the stripped var_name.
    self._params[init_tensor.name] = (init_var, array)
  2. In ONNX models, a parameter can be consumed by many nodes, so get_constant must not pop it from the params dictionary; see the second sketch after the list.
# Params is actually both the graph nodes and param dictionary; unpack them.
graph_nodes, params = params
# Convert to a constant if possible.
if isinstance(var, relax.Var) and var.name_hint in params:
    # When converting a parameter to a constant, update references to it as well.
    # Bug: pop removes the entry, so later uses of the same parameter miss it.
    _, value = params.pop(var.name_hint)
    const_value = relax.const(value)
    graph_nodes[var.name_hint] = const_value
    return const_value
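
A minimal standalone sketch of fix 1 (plain Python, not the actual TVM frontend; register_initializer and its arguments are hypothetical names for illustration): key the params dictionary by the stripped var_name, so that later lookups by that name succeed.

# Hypothetical standalone sketch of fix 1, not the TVM code itself.
def register_initializer(params: dict, init_name: str, value) -> str:
    """Strip the "onnx::" prefix and store the value under the new name."""
    # Note: str.strip removes characters from the set "onx:", which happens
    # to remove the leading "onnx::" prefix for names like "onnx::weight".
    var_name = init_name.strip("onnx::")
    params[var_name] = value   # fix: key by var_name, not by init_name
    return var_name

params: dict = {}
name = register_initializer(params, "onnx::weight", [1.0, 2.0])
assert name == "weight" and "weight" in params   # lookup by stripped name works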
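
A sketch of fix 2 under the same caveat (plain dicts stand in for the frontend's graph_nodes and params dictionaries): look the entry up instead of popping it, so a parameter referenced by several nodes converts every time.

# Hypothetical standalone sketch of fix 2, not the TVM code itself.
def get_constant(var_name: str, graph_nodes: dict, params: dict):
    """Convert a parameter to a constant; safe to call repeatedly."""
    if var_name in params:
        _, value = params[var_name]     # fix: plain lookup instead of pop()
        graph_nodes[var_name] = value   # update references to the new constant
        return value
    return None                         # not a known parameter

graph_nodes: dict = {}
params = {"weight": (None, [1.0, 2.0])}
assert get_constant("weight", graph_nodes, params) == [1.0, 2.0]
# A second use of the same parameter still converts; with pop() the entry
# would already have been removed and this call would return None.
assert get_constant("weight", graph_nodes, params) == [1.0, 2.0]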

@Hzfengsy (Member) commented:

Please add a regression test, thanks!
