Conversation

MarcusLoppe (Contributor)

Seems like torch_geometric dropped or changed the args for SAGEConv, so sageconv_dropout is no longer a valid arg.

@lucidrains Please approve this pull request, since the autoencoder cannot be created without removing this arg.
I've seen no improvement in accuracy when using the dropout, so it probably isn't needed or very useful.
Thank you 😄

lucidrains (Owner) commented Mar 14, 2024

oh thanks Marcus!

edit: want to bump the patch version while you are at it?

# initial sage conv

- sageconv_kwargs = {**sageconv_kwargs, 'sageconv_dropout' : sageconv_dropout}
+ sageconv_kwargs = {**sageconv_kwargs}
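The breakage here is the classic pattern of passing a kwargs dict to a third-party constructor whose signature has since changed. As a sketch (not the repo's actual code; `sage_conv` below is a hypothetical stand-in for `torch_geometric.nn.SAGEConv`), one defensive option is to filter a kwargs dict against the parameters the callable actually accepts, using `inspect.signature`:

```python
import inspect

def filter_supported_kwargs(fn, kwargs):
    """Keep only the entries of kwargs that fn's signature accepts."""
    params = inspect.signature(fn).parameters
    # If fn takes **kwargs, everything passes through unchanged
    if any(p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()):
        return dict(kwargs)
    return {k: v for k, v in kwargs.items() if k in params}

# Hypothetical stand-in for a constructor that dropped an argument
def sage_conv(in_channels, out_channels, normalize=False):
    return (in_channels, out_channels, normalize)

kwargs = {'normalize': True, 'sageconv_dropout': 0.1}
print(filter_supported_kwargs(sage_conv, kwargs))  # {'normalize': True}
```

The simpler fix taken in this PR, of course, is to just stop constructing the stale key in the first place.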
lucidrains (Owner):

you can just remove this line altogether

MarcusLoppe (Contributor, Author):

> you can just remove this line altogether

All done :) I also bumped the version.

@lucidrains lucidrains merged commit 7e58dd3 into lucidrains:main Mar 14, 2024
@MarcusLoppe MarcusLoppe deleted the sageconv_dropout_fix branch June 11, 2024 20:45