
Functional Laplace Updated #192

Merged
merged 129 commits into aleximmer:main on Jul 16, 2024
Conversation

@Ludvins (Contributor) commented on Jun 4, 2024

Took @metodmove's contribution from PR 55 and adapted it to the current state of the repository.

Functional Laplace can be used with hessian='gp'; an example is provided in calibration_gp_example.py and calibration_gp_example.md.

Two unit tests are available: test_functional_laplace.py and test_functional_laplace_unit.py.
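
A minimal usage sketch of how the functional Laplace might be selected through the factory. This is not taken from calibration_gp_example.py; the keyword hessian_structure="gp", the toy model, and the data below are illustrative assumptions.

    import torch
    from torch.utils.data import DataLoader, TensorDataset
    from laplace import Laplace

    # Toy model and data, placeholders only.
    model = torch.nn.Sequential(
        torch.nn.Linear(2, 20), torch.nn.Tanh(), torch.nn.Linear(20, 2)
    )
    X, y = torch.randn(64, 2), torch.randint(0, 2, (64,))
    train_loader = DataLoader(TensorDataset(X, y), batch_size=16)

    # Select the functional (GP) Laplace via the Hessian-structure argument.
    la = Laplace(
        model, "classification", subset_of_weights="all", hessian_structure="gp"
    )
    la.fit(train_loader)

    # GLM/GP predictive on new inputs.
    probs = la(torch.randn(8, 2))

See the example files mentioned above for the actual, complete calibration workflow.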

Comment on lines 225 to 226
np.random.seed(seed)
self.indices = torch.tensor(np.random.choice(list(range(N)), M, replace=False))
A collaborator commented:
I see. The main issue for me is that it changes the global RNG state. Can you use NumPy's Generator?

https://numpy.org/doc/stable/reference/random/generated/numpy.random.Generator.choice.html#numpy.random.Generator.choice
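
For reference, a minimal sketch of the suggested fix, replacing the two quoted lines with a local numpy.random.Generator so the global RNG state stays untouched (seed, N, and M as in the quoted code):

    import numpy as np
    import torch

    # A local generator: seeding it does not modify np.random's global state.
    rng = np.random.default_rng(seed)
    self.indices = torch.as_tensor(rng.choice(N, size=M, replace=False))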

@wiseodd (Collaborator) commented on Jun 23, 2024

Thanks for the changes! The code looks good now and passes all checks. Only a few small changes are needed this time.

@wiseodd (Collaborator) commented on Jun 23, 2024

@aleximmer please review if you have a chance.

@wiseodd mentioned this pull request on Jun 23, 2024
@Ludvins (Contributor, Author) commented on Jun 24, 2024

I've created two functions, _glm_forward_call and _glm_predictive_samples, in BaseLaplace that:

  1. Avoid the double computation of the GLM predictive that was present in ParametricLaplace when using predictive samples.
  2. Are shared by FunctionalLaplace, avoiding code duplication between the parametric and functional Laplace classes.

These could be moved to utility functions, but they might be useful in BaseLaplace too. A sketch of the idea follows below.
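
A hypothetical sketch of the idea behind those two helpers; the signatures and bodies below are illustrative assumptions, not the PR's actual code:

    import torch

    def _glm_forward_call(f_mu, f_var, likelihood):
        # Closed-form predictive from a precomputed GLM mean/covariance.
        if likelihood == "regression":
            return f_mu, f_var
        # Probit approximation for classification (one common choice).
        kappa = 1.0 / torch.sqrt(
            1.0 + torch.pi / 8 * torch.diagonal(f_var, dim1=-2, dim2=-1)
        )
        return torch.softmax(kappa * f_mu, dim=-1)

    def _glm_predictive_samples(f_mu, f_var, n_samples, likelihood):
        # Monte Carlo predictive from the same precomputed mean/covariance.
        dist = torch.distributions.MultivariateNormal(f_mu, covariance_matrix=f_var)
        samples = dist.sample(torch.Size((n_samples,)))
        return samples if likelihood == "regression" else torch.softmax(samples, dim=-1)

    # Both consumers share one (f_mu, f_var), so the GLM predictive is computed
    # once instead of being recomputed when sampling.
    f_mu, f_var = torch.zeros(4, 3), torch.eye(3).repeat(4, 1, 1)
    probs = _glm_forward_call(f_mu, f_var, "classification")
    samples = _glm_predictive_samples(f_mu, f_var, 100, "classification")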

@wiseodd (Collaborator) left a comment:

LGTM. Once @aleximmer gives his review, we can merge.

@aleximmer (Owner) left a comment:

Thanks for the great effort. Please see the comments regarding the two variable names and let's discuss them. Other than that, it's ready to merge.

@wiseodd added this to the 0.3 milestone on Jul 8, 2024
@wiseodd linked an issue on Jul 8, 2024 that may be closed by this pull request
@aleximmer merged commit 9160897 into aleximmer:main on Jul 16, 2024
3 checks passed
Labels: enhancement (New feature or request)
Linked issue: Functional Laplace
5 participants