
Fix graphgym example for activation functions #5243

Merged (9 commits, Aug 20, 2022)
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -158,5 +158,6 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Fixed a bug in which `nn.models.GAT` did not produce `out_channels`-many output channels ([#4299](https://github.com/pyg-team/pytorch_geometric/pull/4299))
- Fixed mini-batching with empty lists as attributes ([#4293](https://github.com/pyg-team/pytorch_geometric/pull/4293))
- Fixed a bug in which `GCNConv` could not be combined with `to_hetero` on heterogeneous graphs with one node type ([#4279](https://github.com/pyg-team/pytorch_geometric/pull/4279))
- Fixed a bug in which custom activation functions in GraphGym did not work ([#5243](https://github.com/pyg-team/pytorch_geometric/pull/5243))
### Removed
- Remove internal metrics in favor of `torchmetrics` ([#4287](https://github.com/pyg-team/pytorch_geometric/pull/4287))
6 changes: 4 additions & 2 deletions graphgym/custom_graphgym/act/example.py
@@ -1,3 +1,5 @@
from functools import partial

import torch
import torch.nn as nn

@@ -18,5 +20,5 @@ def forward(self, x):
return x * torch.sigmoid(x)


register_act('swish', SWISH(inplace=cfg.mem.inplace))
register_act('lrelu_03', nn.LeakyReLU(0.3, inplace=cfg.mem.inplace))
register_act('swish', lambda: SWISH(inplace=cfg.mem.inplace))
register_act('lrelu_03', partial(nn.LeakyReLU, 0.3, inplace=cfg.mem.inplace))
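The diff changes what gets registered: a *factory* (a `lambda` or `functools.partial`) instead of a single pre-built module instance, so GraphGym can construct a fresh activation module for each layer rather than sharing one instance everywhere. A minimal, self-contained sketch of the idea follows; the `act_dict`/`register_act` pair here is a stand-in for GraphGym's real registry in `torch_geometric.graphgym.register`, and the `inplace=False` flag replaces the `cfg.mem.inplace` config value:

```python
from functools import partial

import torch
import torch.nn as nn

# Hypothetical stand-in for GraphGym's activation registry.
act_dict = {}


def register_act(name, act):
    # The registry stores a callable that CONSTRUCTS a module,
    # not an already-constructed module instance.
    act_dict[name] = act


class SWISH(nn.Module):
    def __init__(self, inplace=False):
        super().__init__()
        self.inplace = inplace

    def forward(self, x):
        return x * torch.sigmoid(x)


# Register factories: each call to the stored callable yields a
# fresh module, so layers do not share a single activation object.
register_act('swish', lambda: SWISH(inplace=False))
register_act('lrelu_03', partial(nn.LeakyReLU, 0.3, inplace=False))

# Two lookups produce two distinct module instances.
act1 = act_dict['lrelu_03']()
act2 = act_dict['lrelu_03']()
assert act1 is not act2
```

Registering `SWISH(inplace=cfg.mem.inplace)` directly, as the old code did, evaluates the constructor once at import time and hands every layer the same object; wrapping it in `lambda`/`partial` defers construction to the point where a layer actually asks for its activation.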