PyTorch gradient tests are taking up a lot of test time #294

Closed
@matt-graham

Description

I noticed as part of working on #277 that the gradient checks on the PyTorch implementations of the precompute transforms are very slow and in fact constitute a significant proportion of the overall test suite run time. With these checks removed I can run the whole test suite locally, distributed across 4 processes with pytest-xdist, in 7 minutes, compared to 55 minutes with the checks included.

Given how long these checks take, it might make sense to factor them out into separate tests and apply a mark to them, so that they can be skipped when running the tests on pull requests and only run when merging to main and in the scheduled runs.
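As a sketch of that approach (the marker name `slow` and the test name here are assumptions for illustration, not existing project config):

```python
import pytest

# The marker would be registered in pyproject.toml / pytest.ini, e.g.:
#   [tool.pytest.ini_options]
#   markers = ["slow: expensive gradient checks, deselected on pull requests"]

# Hypothetical factored-out test holding one of the gradcheck calls below.
@pytest.mark.slow
def test_inverse_transform_gradients():
    # ... torch.autograd.gradcheck call moved out of the main transform test ...
    pass
```

Pull-request CI could then deselect these with `pytest -m "not slow"`, while the main-branch and scheduled runs invoke plain `pytest` to get full coverage.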

# Test Gradients
flm_grad_test = torch.from_numpy(flm)
flm_grad_test.requires_grad = True
assert torch.autograd.gradcheck(
    inverse,
    (
        flm_grad_test,
        L,
        spin,
        torch.from_numpy(kernel),
        sampling,
        reality,
        method,
    ),
)

# Test Gradients
flm_grad_test = torch.from_numpy(flm)
flm_grad_test.requires_grad = True
assert torch.autograd.gradcheck(
    inverse,
    (
        flm_grad_test,
        L,
        0,
        torch.from_numpy(kernel),
        sampling,
        reality,
        method,
        nside,
    ),
)

# Test Gradients
f_grad_test = torch.from_numpy(f)
f_grad_test.requires_grad = True
assert torch.autograd.gradcheck(
    forward,
    (
        f_grad_test,
        L,
        spin,
        torch.from_numpy(kernel),
        sampling,
        reality,
        method,
    ),
)

# Test Gradients
f_grad_test = torch.from_numpy(f)
f_grad_test.requires_grad = True
assert torch.autograd.gradcheck(
    forward,
    (
        f_grad_test,
        L,
        0,
        torch.from_numpy(kernel),
        sampling,
        reality,
        method,
        nside,
        iter,
    ),
)

# Test Gradients
flmn_grad_test = torch.from_numpy(flmn)
flmn_grad_test.requires_grad = True
assert torch.autograd.gradcheck(
    inverse,
    (
        flmn_grad_test,
        L,
        N,
        torch.from_numpy(kernel),
        sampling,
        reality,
        method,
    ),
)

# Test Gradients
f_grad_test = torch.from_numpy(f)
f_grad_test.requires_grad = True
assert torch.autograd.gradcheck(
    forward,
    (
        f_grad_test,
        L,
        N,
        torch.from_numpy(kernel),
        sampling,
        reality,
        method,
    ),
)

# Test Gradients
flmn_grad_test = torch.from_numpy(flmn)
flmn_grad_test.requires_grad = True
assert torch.autograd.gradcheck(
    inverse,
    (
        flmn_grad_test,
        L,
        N,
        torch.from_numpy(kernel),
        sampling,
        reality,
        method,
        nside,
    ),
)

# Test Gradients
f_grad_test = torch.from_numpy(f)
f_grad_test.requires_grad = True
assert torch.autograd.gradcheck(
    forward,
    (
        f_grad_test,
        L,
        N,
        torch.from_numpy(kernel),
        sampling,
        reality,
        method,
        nside,
    ),
)
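On the CI side, the skip-on-PR behaviour could be wired up along these lines (a hypothetical GitHub Actions fragment; the job layout, install command, and cron time are assumptions):

```yaml
# Hypothetical fragment: deselect slow-marked gradient checks on pull
# requests, but run the full suite on pushes to main and on a schedule.
on:
  pull_request:
  push:
    branches: [main]
  schedule:
    - cron: "0 3 * * *"

jobs:
  tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: pip install -e . pytest pytest-xdist
      - name: Run tests
        run: |
          if [ "${{ github.event_name }}" = "pull_request" ]; then
            pytest -n 4 -m "not slow"
          else
            pytest -n 4
          fi
```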
