
GraphGym MLP incorrect layer config #8484

Closed
RemyLau opened this issue Nov 29, 2023 · 0 comments · Fixed by #8486
RemyLau commented Nov 29, 2023

🐛 Describe the bug

I encountered a bug in the GraphGym module of PyG 2.4.0. It is caused by incorrect handling of the inner dimension setting (dim_inner), which results in the constructed MLP having an unexpected hidden dimension. In some cases, it can also fail to construct the MLP altogether.

To demonstrate the problem, consider the following example, where I want to create a 2-layer MLP with a hidden dimension of 20.

>>> from torch_geometric.graphgym.models.layer import LayerConfig, MLP
>>> MLP(LayerConfig(dim_in=10, dim_out=5, dim_inner=20, num_layers=2))
MLP(
  (model): Sequential(
    (0): GeneralMultiLayer(
      (Layer_0): GeneralLayer(
        (layer): Linear(
          (model): Linear(10, 10, bias=True)
        )
        (post_layer): Sequential(
          (0): ReLU()
        )
      )
    )
    (1): Linear(
      (model): Linear(10, 5, bias=True)
    )
  )
)

However, as shown in the output, the hidden dimension is set to the input dimension, 10, instead of the intended 20. Falling back to the input dimension is only supposed to happen when dim_inner is not set, i.e., dim_inner=None. Conversely, when dim_inner is actually left unset, construction fails with an error because the Linear layer is created with a None dimension:

>>> MLP(LayerConfig(dim_in=10, dim_out=5, dim_inner=None, num_layers=2))
TypeError: empty() received an invalid combination of arguments - got (NoneType, int), but expected one of:
 * (tuple of ints size, *, tuple of names names, torch.memory_format memory_format, torch.dtype dtype, torch.layout layout, torch.device device, bool pin_memory, bool requires_grad)
 * (tuple of ints size, *, torch.memory_format memory_format, Tensor out, torch.dtype dtype, torch.layout layout, torch.device device, bool pin_memory, bool requires_grad)

The bug was introduced in #7885, which changed

dim_inner = layer_config.dim_in \
    if layer_config.dim_inner is None \
    else layer_config.dim_inner

into

if layer_config.dim_inner:
    dim_inner = layer_config.dim_in
else:
    dim_inner = layer_config.dim_inner

The fix is simple: negate the condition (or, equivalently, swap the two branches of the if statement) so that the fallback to dim_in only happens when dim_inner is unset.
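
For reference, a minimal sketch of the corrected logic, restoring the pre-#7885 behavior (the actual fix landed in #8486 and may differ in detail):

# Fall back to the input dimension only when dim_inner is unset (None);
# otherwise honor the user-provided inner dimension.
if layer_config.dim_inner is None:
    dim_inner = layer_config.dim_in
else:
    dim_inner = layer_config.dim_inner

With this change, the example above should build Linear(10, 20) followed by Linear(20, 5), while passing dim_inner=None should fall back to a hidden dimension of 10 instead of raising an error.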

Environment

  • PyG version: 2.4.0
  • PyTorch version:
  • OS:
  • Python version:
  • CUDA/cuDNN version:
  • How you installed PyTorch and PyG (conda, pip, source):
  • Any other relevant information (e.g., version of torch-scatter):