
Adding Graphormer layer #275


Closed
wants to merge 14 commits into from

Conversation


@5hv5hvnk 5hv5hvnk commented Mar 22, 2023

Continued from #265
fixes #251
To do:

  • struct-based layers
  • docstrings
  • example pipeline
  • unit tests
  • pass build tests
  • fix merge conflicts

@codecov

codecov bot commented Mar 22, 2023

Codecov Report

Merging #275 (99813a6) into master (05fca7c) will decrease coverage by 1.62%.
The diff coverage is 0.00%.

❗ Current head 99813a6 differs from pull request most recent head 7387752. Consider uploading reports for the commit 7387752 to get more accurate results.

@@            Coverage Diff             @@
##           master     #275      +/-   ##
==========================================
- Coverage   82.36%   80.74%   -1.62%     
==========================================
  Files          17       19       +2     
  Lines        1911     1958      +47     
==========================================
+ Hits         1574     1581       +7     
- Misses        337      377      +40     
Impacted Files       Coverage Δ
src/layers/conv.jl   74.65% <0.00%> (-4.61%) ⬇️

... and 11 files with indirect coverage changes


@5hv5hvnk 5hv5hvnk mentioned this pull request Mar 22, 2023
@@ -1688,3 +1688,83 @@ function Base.show(io::IO, l::TransformerConv)
(in, ein), out = l.channels
print(io, "TransformerConv(($in, $ein) => $out, heads=$(l.heads))")
end

@doc raw"""
GraphormerLayer constructor.
Member

make the docstring style consistent with the other layers
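
For comparison, a hypothetical docstring in the style of the other conv layers might look like this (the constructor signature shown is an assumption, not the PR's final API):

@doc raw"""
    GraphormerLayer(in => out; heads = 1)

Graphormer layer from the paper [Do Transformers Really Perform Bad for
Graph Representation?](https://arxiv.org/abs/2106.05234).

# Arguments

- `in`: Dimension of input node features.
- `out`: Dimension of output node features.
- `heads`: Number of attention heads. Default `1`.
"""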

Wx = reshape(Wx, chout, heads, :)

# a hand-written message passing
m = apply_edges((xi, xj, e) -> message(l, xi, xj, e), g, Wx, Wx, e)
Member

where is the message function defined?

Author

Apologies, I missed writing it. I added a comment reminding myself to go back and add it; let me get on that.
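
For what it's worth, a minimal sketch of a message method compatible with the apply_edges call above might look like this (hypothetical; the per-head dot-product score is illustrative and does not yet include the paper's eq. 5 bias):

# Hypothetical sketch of the missing message function.
# xi, xj: projected features of destination/source nodes, (chout, heads, nedges).
function message(l::GraphormerLayer, xi, xj, e)
    α = sum(xi .* xj; dims = 1) ./ sqrt(Float32(size(xi, 1)))  # per-head score
    return α .* xj  # weight source features by the (unnormalized) score
end

Note that real attention would also need a softmax over each node's incoming edges, which is part of why the next comment suggests dropping the message-passing scheme altogether.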

@CarloLucibello
Member

CarloLucibello commented Mar 22, 2023

This is not capturing the essential features of the paper:

  • The message function is not defined. Moreover, since this layer is basically a transformer, attention runs on the whole graph instead of just the neighborhood of a node. Therefore we shouldn't use the usual message passing scheme; we can instead rely on the MultiHeadAttention layer from Flux (soon to be released).
  • The bias term of eq. 5 is missing. The ϕ function should be given at construction time.
  • As part of the layer, eq. 8 and eq. 9 prescribe layer norm, residual connections, and an MLP (see the sketch after this list for how these pieces fit together).
  • Edge features can be considered in a second PR; they should implement the mechanics of eq. 7.
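
For illustration, a rough sketch of that block structure, assuming Flux's MultiHeadAttention forwards an additive bias to the attention logits (as NNlib's dot_product_attention does); GraphormerBlock and its field names are hypothetical:

using Flux

# Hypothetical sketch of a Graphormer block (pre-LN transformer, eqs. 8-9),
# with the eq. 5 spatial bias passed in as a precomputed array.
struct GraphormerBlock
    ln1::LayerNorm
    attn::MultiHeadAttention
    ln2::LayerNorm
    ffn::Chain
end

Flux.@functor GraphormerBlock

function GraphormerBlock(dim::Int; nheads::Int = 4)  # dim must be divisible by nheads
    GraphormerBlock(LayerNorm(dim),
                    MultiHeadAttention(dim; nheads),
                    LayerNorm(dim),
                    Chain(Dense(dim => 4dim, gelu), Dense(4dim => dim)))
end

# x: node features, (dim, num_nodes, batch).
# bias: (num_nodes, num_nodes, nheads, batch), the b_{ϕ(vi,vj)} term of eq. 5
#       built from shortest-path distances; it is added to the attention logits.
function (b::GraphormerBlock)(x, bias)
    h = b.ln1(x)
    h, _ = b.attn(h, h, h, bias)   # attention over ALL node pairs
    x = x .+ h                     # residual, eq. 8
    return x .+ b.ffn(b.ln2(x))   # MLP with layer norm and residual, eq. 9
end

Since the bias is simply added to the attention logits, attention over the whole graph needs no message passing at all; the graph structure enters only through the precomputed bias.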

@CarloLucibello
Member

Why did you check the tests as done? I see no tests...

@5hv5hvnk
Author

Why did you check the tests as done? I see no tests...

I have written some tests; you can check them here.
I don't know why they aren't showing in the files changed; let me look into it.

@5hv5hvnk
Author

5hv5hvnk commented Apr 6, 2023

I have written some tests; you can check them here. I don't know why they aren't showing in the files changed; let me look into it.

@CarloLucibello Hi, can you look at the updates and review the tests?

@CarloLucibello
Member

@CarloLucibello Hi, can you look at the updates and review the tests?

Can you make the tests part of this PR? It's odd that they live somewhere else.
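
For concreteness, a minimal test of the kind being requested might look like this (hypothetical; the GraphormerLayer constructor signature is assumed):

using GraphNeuralNetworks, Test

@testset "GraphormerLayer" begin
    g = rand_graph(10, 30)            # 10 nodes, 30 edges
    x = rand(Float32, 8, 10)          # 8 features per node
    l = GraphormerLayer(8 => 8; heads = 2)
    y = l(g, x)
    @test size(y) == (8, 10)          # output keeps the requested feature size
end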

@5hv5hvnk
Author

5hv5hvnk commented Jun 25, 2023

Hello @CarloLucibello, sorry for the delayed response on this. I believe we still need this layer. I have redone each line of code after reading through the paper and more Julia blog posts (mostly related to Flux) to get a better understanding.

@5hv5hvnk 5hv5hvnk closed this Jun 25, 2023

Successfully merging this pull request may close these issues.

implement Graphormer