Add `include_fc` and `use_combined_linear` arguments in the `SABlock` #7996
Conversation
Remaining parts need to be addressed.
/build
For reference, I had attempted to test the similarity between the generative and core `SABlock` implementations:

```python
import unittest
from unittest import skipUnless

import torch
from parameterized import parameterized

from monai.networks import eval_mode
from monai.networks.blocks import SABlock
from monai.utils import optional_import, set_determinism
from tests.utils import SkipIfBeforePyTorchVersion, assert_allclose

einops, has_einops = optional_import("einops")
generative, has_generative = optional_import("generative")
xops, has_xformers = optional_import("xformers.ops")


class TestComparison(unittest.TestCase):
    @parameterized.expand([["cuda:0", True], ["cuda:0", False], ["cpu", False]])
    @skipUnless(has_einops, "Requires einops")
    @skipUnless(has_generative, "Requires generative")
    @skipUnless(has_xformers, "Requires xformers")
    @SkipIfBeforePyTorchVersion((2, 0))
    def test_generative_vs_core(self, _device, use_flash_attention):
        device = torch.device(_device)
        input_shape = (2, 512, 360)
        input_param = {
            "hidden_size": 360,
            "num_heads": 4,
            "dropout_rate": 0,
            "use_flash_attention": use_flash_attention,
        }
        # construct both blocks with identical hyperparameters and the same seed
        set_determinism(0)
        net_gen = generative.networks.blocks.SABlock(**input_param).to(device)
        set_determinism(0)
        net_monai = SABlock(**input_param).to(device)
        set_determinism(0)
        x = torch.randn(input_shape).to(device)
        # reuse the generative block's weights so both start from identical parameters
        net_monai.load_state_dict(net_gen.state_dict())
        with eval_mode(net_gen, net_monai):
            set_determinism(0)
            r1 = net_gen(x)
            set_determinism(0)
            r2 = net_monai(x)
        assert_allclose(r1.detach().cpu().numpy(), r2.detach().cpu().numpy())
```

I'm not suggesting we add this test, but we should come back to where the differences are coming from.
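One quick way to check whether such differences come from the weights rather than the forward pass is to diff the two state dicts directly before comparing outputs. The helper below is a hypothetical sketch for that kind of debugging, not code from this PR:

```python
import torch


def report_param_diffs(net_a: torch.nn.Module, net_b: torch.nn.Module, atol: float = 1e-6) -> None:
    """Hypothetical helper: print state-dict entries that differ between two modules."""
    sd_a, sd_b = net_a.state_dict(), net_b.state_dict()
    for key in sorted(set(sd_a) | set(sd_b)):
        if key not in sd_a or key not in sd_b:
            print(f"{key}: present in only one module")
        elif sd_a[key].shape != sd_b[key].shape:
            print(f"{key}: shape mismatch {tuple(sd_a[key].shape)} vs {tuple(sd_b[key].shape)}")
        elif not torch.allclose(sd_a[key].float(), sd_b[key].float(), atol=atol):
            print(f"{key}: max abs diff {(sd_a[key] - sd_b[key]).abs().max().item():.3e}")
```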
Hi @KumoLiu, I had a few comments about the tests that need to be addressed so that they test the actual cases you want. I'm otherwise good with things.
Hi @KumoLiu, shall we also add those parameters (…)?
I actually tested this locally and confirmed that they return the same result. It seems that the issue with your script might be due to differences in the initial weights. To address this, I copied and reused the state dict. I didn't include this test in this PR because we aim to avoid introducing a dependency on `generative`. For your reference, here is the test code I used:
Result:
And for this one:
Yes, we also include these tests. The GPU pipeline has been moved to blossom, but it will still be tested.
/build
Fixes #7991
Fixes #7992

Description

Add `include_fc` and `use_combined_linear` arguments in the `SABlock`.

Types of changes

- [x] Non-breaking change (fix or new feature that would not break existing functionality).
- [ ] Breaking change (fix or new feature that would cause existing functionality to change).
- [ ] New tests added to cover the changes.
- [ ] Integration tests passed locally by running `./runtests.sh -f -u --net --coverage`.
- [ ] Quick tests passed locally by running `./runtests.sh --quick --unittests --disttests`.
- [ ] In-line docstrings updated.
- [ ] Documentation updated, tested `make html` command in the `docs/` folder.
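As a quick illustration of the two new options, here is a minimal usage sketch (argument names follow this PR; since defaults may vary between MONAI versions, both are passed explicitly):

```python
import torch
from monai.networks.blocks import SABlock

# use_combined_linear=True projects q, k, v with one combined linear layer
# (three separate projections when False); include_fc=False would skip the
# final output projection layer.
block = SABlock(
    hidden_size=360,
    num_heads=4,
    dropout_rate=0.0,
    include_fc=True,
    use_combined_linear=True,
)

x = torch.randn(2, 512, 360)  # (batch, sequence length, hidden_size)
y = block(x)  # output keeps the input shape: (2, 512, 360)
```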