Fix trainable parameters in distributions #520


Merged
1 commit merged into dev on Jun 22, 2025

Conversation

@vpratz vpratz (Collaborator) commented Jun 21, 2025

This PR fixes two issues for the distributions when trainable parameters are used.

  • Make the normalization constant depend on the current trainable value; previously only the initial value was used.
  • Pass a copy of the values to keras.initializers.get, as the original arrays seem to be freed for some reason, leading to a confusing `RuntimeError: Array has been deleted with shape=float32[4].` when trying to access them.

I encountered this while preparing #519, where I added a test for it. To reproduce, check out that PR and run `nox -- save dev tests/test_compatibility/test_distributions`.
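The first fix follows a common pattern: derive the normalization constant from the current parameter value inside log_prob instead of caching it at construction time. A minimal, framework-free sketch in plain NumPy (the `DiagonalNormal` class here is a hypothetical stand-in for illustration, not the bayesflow implementation):

```python
import numpy as np

class DiagonalNormal:
    """Hypothetical stand-in illustrating the fix pattern:
    compute the normalization constant from the *current*
    (possibly trained) scale, not from a value cached at init."""

    def __init__(self, loc, scale):
        # Defensive copies: if the caller's buffers are later freed
        # or reused (the second bug), our references stay valid.
        self.loc = np.array(loc, dtype=np.float32, copy=True)
        self.scale = np.array(scale, dtype=np.float32, copy=True)

    def log_prob(self, x):
        # Recompute the constant on every call, so updates to
        # self.scale (e.g. by an optimizer) take effect immediately.
        log_norm = -0.5 * np.log(2.0 * np.pi) - np.log(self.scale)
        return float(np.sum(log_norm - 0.5 * ((x - self.loc) / self.scale) ** 2))

dist = DiagonalNormal(loc=[0.0], scale=[1.0])
before = dist.log_prob(np.array([0.0]))
dist.scale[:] = 2.0  # simulate training updating the parameter
after = dist.log_prob(np.array([0.0]))  # reflects the new scale
```

With a cached constant, `before` and `after` would be identical; here the density correctly drops once the scale widens.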

codecov bot commented Jun 21, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Files with missing lines Coverage Δ
bayesflow/distributions/diagonal_normal.py 95.34% <100.00%> (-0.11%) ⬇️
bayesflow/distributions/diagonal_student_t.py 95.91% <100.00%> (-0.09%) ⬇️
bayesflow/distributions/mixture.py 98.03% <ø> (ø)

... and 15 files with indirect coverage changes

@stefanradev93 stefanradev93 merged commit e329f4b into dev Jun 22, 2025
9 checks passed
@stefanradev93 stefanradev93 deleted the fix-distribution-trainable branch June 22, 2025 08:02
stefanradev93 pushed a commit that referenced this pull request Jul 2, 2025
* fix trainable parameters in distributions (#520)

* Improve numerical precision in MVNScore.log_prob

* add log_gamma diagnostic (#522)

* add log_gamma diagnostic

* add missing export for log_gamma

* add missing export for gamma_null_distribution, gamma_discrepancy

* fix broken unit tests

* rename log_gamma module to sbc

* add test_log_gamma unit test

* add return information to log_gamma doc string

* fix typo in docstring, use fixed-length np array to collect log_gammas instead of appending to an empty list

* Breaking changes: Fix bugs regarding counts in standardization layer (#525)

* standardization: add test for multi-input values (failing)

This test reveals two bugs in the standardization layer:

- count is updated multiple times
- batch_count is too small, as the sizes from reduce_axes have to be
  multiplied

* breaking: fix bugs regarding count in standardization layer

Fixes #524

This fixes the two bugs described in c4cc133:

- count was accidentally updated multiple times, leading to wrong values
- count was calculated incorrectly, as only the batch size was used; the
  correct value is the product of all reduce dimensions. This led to
  wrong standard deviations

While the batch dimension is the same for all inputs, the size of the
second dimension might vary. For this reason, we need to introduce an
input-specific `count` variable. This breaks serialization.
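The count fix can be sketched independently of the layer: for running-moment updates, the effective sample count a batch contributes is the product of the sizes along all reduced axes, not just the batch size. (The helper name below is illustrative, not the bayesflow API.)

```python
import numpy as np

def batch_count(shape, reduce_axes):
    """Effective number of samples a batch contributes to a running
    mean/std: the product of the reduced axis sizes."""
    count = 1
    for ax in reduce_axes:
        count *= shape[ax]
    return count

# A batch of 8 sequences of length 5 with 3 features, standardized
# over axes (0, 1): the buggy count would be 8 (batch size only);
# the correct count is 8 * 5 = 40.
x = np.zeros((8, 5, 3))
count = batch_count(x.shape, reduce_axes=(0, 1))
```

Because the second dimension may differ between inputs, each input needs its own running count, which is why the fix breaks serialization.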

* fix assert statement in test

* bump version to 2.0.5, adjust deprecation warnings

* rename log_gamma to calibration_log_gamma (#527)

---------

Co-authored-by: han-ol <g@hans.olischlaeger.com>
Co-authored-by: Daniel Habermann <133031176+daniel-habermann@users.noreply.github.com>