
Two moons test fit #175

Merged

Conversation


@Chase-Grajeda Chase-Grajeda commented Jun 20, 2024

Adds tests to verify convergence in test_two_moons/test_two_moons.py

test_fit

  • Tests that the KL-divergence training loss decreases after training
  • Tests that the validation loss decreases after training
  • Tests that model weights have not vanished (no NaNs)
  • Tests model performance via MMD before and after training
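To illustrate the assertion pattern behind these checks, here is a minimal self-contained sketch, using a toy NumPy linear model rather than the actual BayesFlow approximator, of comparing a loss before and after training and verifying the weights stayed finite:

```python
import numpy as np

rng = np.random.default_rng(0)

def mse_loss(w, x, y):
    return float(np.mean((x @ w - y) ** 2))

# toy linear regression "trained" with gradient descent
x = rng.normal(size=(64, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = x @ true_w
w = np.zeros(3)

pre_loss = mse_loss(w, x, y)
for _ in range(200):
    grad = 2.0 * x.T @ (x @ w - y) / len(x)
    w -= 0.1 * grad
post_loss = mse_loss(w, x, y)

# the convergence checks mirror the test_fit assertions
assert post_loss < pre_loss            # loss decreased after training
assert not np.any(np.isnan(w))         # weights have not vanished to NaN
```

The same before/after comparison applies to any metric, which is how the MMD check below is used.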

Chase-Grajeda and others added 8 commits June 18, 2024 10:48
commit 320f1ae
Author: lars <[email protected]>
Date:   Tue Jun 18 16:23:01 2024 +0200

    fix two moons simulated dtype

commit 27f99cd
Author: lars <[email protected]>
Date:   Tue Jun 18 16:09:45 2024 +0200

    fix data modification for tensorflow compiled mode

commit c8060fc
Merge: 3150d11 e2355de
Author: lars <[email protected]>
Date:   Tue Jun 18 15:35:59 2024 +0200

    Merge remote-tracking branch 'origin/streamlined-backend' into streamlined-backend

commit 3150d11
Author: lars <[email protected]>
Date:   Tue Jun 18 15:35:52 2024 +0200

    add JAX Approximator
    finalize all Approximators

commit e2355de
Author: Chase Grajeda <[email protected]>
Date:   Tue Jun 18 22:15:37 2024 +0900

    Configurator Unit Tests (stefanradev93#174)

    * First additions

    Added __init__.py for test module.
    Added test_configurators.py.
    Added basic fixtures and construction tests.

    * Remaining tests

    Added remaining unit tests

    * Added conftest

    Separated fixtures and placed them in conftest.py

    * Added requested changes

    Added batch_size, set_size, and num_features parameterizations in conftest.py.
    Combined repetitive fixtures in conftest.py.
    Combined repetitive tests in test_configurators.py.
    Parameterized Configurator initialization in conftest.py.
    Parameterized parameter selection in conftest.py.
    Removed initialization tests in test_configurators.py.
    Added summary_inputs and summary_conditions to parameters.
    Changed instances of '==None' to 'is None'.
    Removed 'config=Configurator' instances in test_configurators.py.
• Added test for post-training loss < pre-training loss to test_fit.py::test_fit
• Added test in test_fit.py::test_fit for vanishing weights
• Added test to test_fit.py for verifying the simulator produces random and consistent data
• Added MMD test to test_two_moons.py
• Added MMD method to utils/ops.py
• Added test_dataset to test_two_moons/conftest.py
• Added auto-formatting changes from ruff
@Chase-Grajeda Chase-Grajeda self-assigned this Jun 21, 2024
@Chase-Grajeda Chase-Grajeda added the unit tests A new set of tests needs to be added. label Jun 21, 2024
@Chase-Grajeda Chase-Grajeda marked this pull request as ready for review June 21, 2024 19:41

@LarsKue LarsKue left a comment


Great changes, thank you. I added some comments for future improvement, but I will merge this as-is. Also, make sure to pass the linter check in the future (you can do this automatically by installing the pre-commit hook with conda env update -f environment.yaml && pre-commit install && pre-commit run --all-files).

@@ -7,3 +7,30 @@ def isclose(x1, x2, rtol=1e-5, atol=1e-8):

def allclose(x1, x2, rtol=1e-5, atol=1e-8):
    return keras.ops.all(isclose(x1, x2, rtol, atol))


def max_mean_discrepancy(x, y):

In the future, let's put these in bayesflow/metrics directly. This time, I will move this and make it a bit more customizable.
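For reference, here is a hedged NumPy sketch of what such a metric computes: a biased estimate of the squared MMD with a Gaussian kernel. The kernel choice and the scale parameter are assumptions for illustration, not the customizable implementation that was moved into bayesflow/metrics.

```python
import numpy as np

def max_mean_discrepancy(x, y, scale=1.0):
    """Biased estimate of squared MMD between sample sets x and y,
    using a Gaussian (RBF) kernel with bandwidth `scale`."""
    def rbf(a, b):
        # pairwise squared Euclidean distances, then Gaussian kernel
        sq_dists = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
        return np.exp(-sq_dists / (2.0 * scale**2))
    return rbf(x, x).mean() + rbf(y, y).mean() - 2.0 * rbf(x, y).mean()
```

Identical sample sets give exactly zero; samples from clearly different distributions give a positive value, which is what makes the before/after training comparison in test_fit meaningful.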

# Test model weights have not vanished
for layer in approximator.layers:
    for weight in layer.weights:
        assert not keras.ops.any(keras.ops.isnan(weight)).numpy()

tensor.numpy() is not backend-agnostic and also not necessary here (when necessary, use keras.ops.convert_to_numpy instead)

pre_loss = approximator.compute_metrics(train_dataset.data)["loss"]
pre_val_loss = approximator.compute_metrics(validation_dataset.data)["loss"]
x_before = approximator.inference_network(inf_vars, conditions=inf_conds)
mmd_before = max_mean_discrepancy(x_before, y)

inconsistent naming

approximator.compile(jit_compile=jit_compile, loss=keras.losses.KLDivergence())
inf_vars = approximator.configurator.configure_inference_variables(test_dataset.data)
inf_conds = approximator.configurator.configure_inference_conditions(test_dataset.data)
y = test_dataset.data["x"]

relies on the internal structure of the dataset fixture, which is not good. Use test_batch = test_dataset[0]; observables = test_batch["x"] instead.
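The suggested pattern can be sketched with a hypothetical stand-in for the fixture (ToyDataset is invented here for illustration; the real fixture lives in test_two_moons/conftest.py):

```python
import numpy as np

class ToyDataset:
    """Minimal stand-in for the test fixture: indexing yields dict batches."""
    def __init__(self, x, batch_size=32):
        self.x = x
        self.batch_size = batch_size

    def __getitem__(self, i):
        s = slice(i * self.batch_size, (i + 1) * self.batch_size)
        return {"x": self.x[s]}

rng = np.random.default_rng(0)
ds = ToyDataset(rng.normal(size=(64, 2)))

# access a batch through the public indexing API
# rather than reaching into the private .data attribute
test_batch = ds[0]
observables = test_batch["x"]
assert observables.shape == (32, 2)
```

Going through __getitem__ keeps the test decoupled from how the fixture stores its data internally.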

@LarsKue LarsKue merged commit 8b47c03 into stefanradev93:streamlined-backend Jun 24, 2024
0 of 13 checks passed