
Move nn.glob.attention.GlobalAttention to nn.aggr.attention.AttentionalAggregation #4986

Merged
rusty1s merged 7 commits into pyg-team:master on Jul 16, 2022

Conversation

@Padarn Padarn (Contributor) commented Jul 15, 2022

Addresses #4712
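
For reference, this moves the attention-based readout from nn.glob to the new nn.aggr package. A minimal usage sketch of the renamed module, assuming it keeps GlobalAttention's gate_nn/nn constructor arguments and the aggregation-style forward(x, index) call used in the tests below (shapes and values here are illustrative, not taken from this PR):

import torch
from torch.nn import Linear
from torch_geometric.nn.aggr import AttentionalAggregation

channels = 16
# gate_nn produces a per-node attention score; nn (optional) transforms node
# features before the attention-weighted sum over each graph/segment.
gate_nn = Linear(channels, 1)
nn = Linear(channels, channels)
aggr = AttentionalAggregation(gate_nn=gate_nn, nn=nn)

x = torch.randn(6, channels)              # 6 nodes
index = torch.tensor([0, 0, 1, 1, 1, 2])  # 3 graphs/segments
out = aggr(x, index)                      # shape [3, channels]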

@Padarn Padarn mentioned this pull request Jul 15, 2022
26 tasks
codecov bot commented Jul 15, 2022

Codecov Report

Merging #4986 (aa160fa) into master (9ed1b79) will not change coverage.
The diff coverage is 91.66%.

❗ Current head aa160fa differs from pull request most recent head 0cfb2bc. Consider uploading reports for the commit 0cfb2bc to get more accurate results.

@@           Coverage Diff           @@
##           master    #4986   +/-   ##
=======================================
  Coverage   82.79%   82.79%           
=======================================
  Files         330      330           
  Lines       17978    17978           
=======================================
  Hits        14885    14885           
  Misses       3093     3093           
Impacted Files | Coverage Δ
torch_geometric/nn/glob/__init__.py | 93.33% <80.00%> (-6.67%) ⬇️
torch_geometric/nn/aggr/__init__.py | 100.00% <100.00%> (ø)
torch_geometric/nn/aggr/attention.py | 100.00% <100.00%> (ø)

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

Review comments on torch_geometric/nn/aggr/attention.py and test/nn/aggr/test_attention.py were resolved (most marked outdated). One thread on test/nn/aggr/test_attention.py:
index = index.view(-1, 1).repeat(1, dim_size).view(-1)

assert aggr(x, index).size() == (dim_size, channels)
assert aggr(x, index,
Member:
Test

out = aggr(x, index)
assert out.size() == (dim_size, channels)
torch.allclose(aggr(x, index, dim_size=dim_size), out)

instead.

Contributor Author:

I added this, but just to confirm: you want to make sure that the output does not change if dim_size is provided? Maybe it would be better to test

out = aggr(x, index)
assert out.size() == (dim_size, channels)
torch.allclose(aggr(x, index, dim_size=dim_size + 1)[:3], out)

WDYT?
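
Putting the two variants together, the check might look like the following self-contained sketch (a hedged illustration, not the PR's actual test: the values are made up, the [:3] slice above is written as [:dim_size] here, and allclose is wrapped in assert so a mismatch actually fails the test):

import torch
from torch.nn import Linear
from torch_geometric.nn.aggr import AttentionalAggregation

channels, dim_size = 16, 3
x = torch.randn(6, channels)
index = torch.tensor([0, 0, 1, 1, 1, 2])
aggr = AttentionalAggregation(gate_nn=Linear(channels, 1), nn=Linear(channels, channels))

out = aggr(x, index)
assert out.size() == (dim_size, channels)

# Passing dim_size explicitly must not change the result:
assert torch.allclose(aggr(x, index, dim_size=dim_size), out)

# With a larger dim_size the extra segment is empty; the first dim_size rows
# should still match:
out2 = aggr(x, index, dim_size=dim_size + 1)
assert out2.size() == (dim_size + 1, channels)
assert torch.allclose(out2[:dim_size], out)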

@Padarn Padarn (Contributor Author) commented Jul 15, 2022

Thanks for the comments @rusty1s!

@Padarn Padarn self-assigned this Jul 16, 2022
@rusty1s rusty1s changed the title Move nn.glob.attention.GlobalAttention to nn.aggr.attention.AttentionAggregation Move nn.glob.attention.GlobalAttention to nn.aggr.attention.AttentionalAggregation Jul 16, 2022
Further review comments on test/nn/aggr/test_attention.py and torch_geometric/nn/aggr/attention.py were resolved.
@rusty1s rusty1s enabled auto-merge (squash) July 16, 2022 06:48
@rusty1s rusty1s merged commit a6e3496 into pyg-team:master Jul 16, 2022