
Commit

update doc
marvinschmitt committed Jul 5, 2023
1 parent c65944c commit ff75f2e
Showing 23 changed files with 451 additions and 372 deletions.
1 change: 1 addition & 0 deletions .github/workflows/docs.yml
@@ -28,6 +28,7 @@ jobs:
uses: actions/setup-python@v4
with:
python-version: 3.11
cache: 'pip'

- name: Install dependencies
run: |
8 changes: 5 additions & 3 deletions .gitignore
@@ -4,11 +4,10 @@ __pycache__/
*/__pycache__/
projects/
*/bayesflow.egg-info
docsrc/build/
docsrc/_build/
build
docs/

# Notebooks
docsrc/source/tutorial_notebooks/**

# mypy
.mypy_cache
@@ -31,3 +30,6 @@ docsrc/source/tutorial_notebooks/**

# tox
.tox

# MacOS
.DS_Store
25 changes: 24 additions & 1 deletion CONTRIBUTING.md
@@ -1,5 +1,5 @@
Contributing to BayesFlow
==========
=========================

Workflow
--------
@@ -65,3 +65,26 @@ You can run all the tests locally via:
Or a specific test via:

pytest -k test_[mytest]

Tutorial Notebooks
------------------

New tutorial notebooks are always welcome! You can add your tutorial notebook file to `examples/` and add a reference
to the list of notebooks in `docsrc/source/examples.rst`.
Re-build the documentation (see below) and your notebook will be included.

Documentation
-------------

The documentation uses [sphinx](https://www.sphinx-doc.org/) and relies on [numpy style docstrings](https://numpydoc.readthedocs.io/en/latest/format.html) in classes and functions.
The overall *structure* of the documentation, including the API documentation, is maintained manually. This has two implications for you:

1. If you add to existing submodules, the documentation will update automatically (given that you use proper numpy docstrings).
2. If you add a new submodule or subpackage, you need to add a file to `docsrc/source/api` and a reference to the new module to the appropriate section of `docsrc/source/api/bayesflow.rst`.
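
For orientation, a minimal numpy-style docstring might look as follows (the function itself is hypothetical, purely for illustration):

    def scale_data(x, factor=2.0):
        """Scales the input array by a constant factor.

        Parameters
        ----------
        x : np.ndarray
            The array to scale.
        factor : float, optional, default: 2.0
            The multiplicative scaling factor.

        Returns
        -------
        np.ndarray
            The scaled array, same shape as ``x``.
        """
        return x * factor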

You can re-build the documentation with

cd docsrc/
make clean && make github

The entry point of the rendered documentation will be at `docs/index.html`.
4 changes: 2 additions & 2 deletions bayesflow/amortizers.py
@@ -1103,8 +1103,8 @@ def sample(self, input_dict, n_samples, to_numpy=True, **kwargs):
**kwargs : dict, optional, default: {}
Additional keyword arguments passed to the summary network as the amortizers
Returns:
--------
Returns
-------
samples_dict : dict
A dictionary with keys `global_samples` and `local_samples`
Local samples will hold an array-like of shape (num_replicas, num_samples, num_local)
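
As a usage sketch of this method (assuming ``amortizer`` is a trained two-level amortizer and ``input_dict`` holds observed data; both names are hypothetical):

    # Draw 500 posterior samples; the dictionary keys follow the docstring above
    samples_dict = amortizer.sample(input_dict, n_samples=500, to_numpy=True)
    local_draws = samples_dict["local_samples"]    # (num_replicas, 500, num_local)
    global_draws = samples_dict["global_samples"]  # global posterior draws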
8 changes: 4 additions & 4 deletions bayesflow/computational_utilities.py
@@ -59,8 +59,8 @@ def posterior_calibration_error(
max_quantile : float in (0, 1), optional, default: 0.995
The maximum posterior quantile to consider
Returns:
--------
Returns
-------
calibration_errors : np.ndarray of shape (num_params, ) or (alpha_resolution, num_params),
if ``aggregator_fun is None``.
The aggregated calibration error per marginal posterior.
@@ -248,8 +248,8 @@ def expected_calibration_error(m_true, m_pred, num_bins=10):
Obtaining well calibrated probabilities using bayesian binning.
In Proceedings of the AAAI conference on artificial intelligence (Vol. 29, No. 1).
Important
---------
Notes
-----
Make sure that ``m_true`` are **one-hot encoded** classes!
Parameters
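
The one-hot requirement above can be made concrete with a small sketch (toy arrays, for illustration; see the function's Returns section for the exact output structure):

    import numpy as np
    from bayesflow.computational_utilities import expected_calibration_error

    # m_true must be one-hot encoded: each row selects exactly one class
    m_true = np.eye(3)[np.array([0, 2, 1, 0])]   # shape (4, 3)
    m_pred = np.array([[0.8, 0.1, 0.1],
                       [0.2, 0.2, 0.6],
                       [0.1, 0.7, 0.2],
                       [0.6, 0.3, 0.1]])         # predicted class probabilities

    result = expected_calibration_error(m_true, m_pred, num_bins=10)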
4 changes: 2 additions & 2 deletions bayesflow/coupling_networks.py
@@ -601,8 +601,8 @@ def call(self, target_or_z, condition, inverse=False, **kwargs):
target : tf.Tensor
If inverse=True: The back-transformed z, shape (batch_size, inp_dim)
Important
---------
Notes
-----
If ``inverse=False``, the return is ``(z, log_det_J)``.\n
If ``inverse=True``, the return is ``target``
"""
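
The return convention documented here, as a sketch (``network``, ``x``, and ``condition`` are hypothetical placeholders):

    # Forward pass: data -> latent, plus the log-determinant of the Jacobian
    z, log_det_J = network(x, condition, inverse=False)

    # Inverse pass: latent -> data space
    x_back = network(z, condition, inverse=True)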
36 changes: 21 additions & 15 deletions bayesflow/helper_networks.py
@@ -292,24 +292,30 @@ def call(self, inputs):
class ActNorm(tf.keras.Model):
"""Implements an Activation Normalization (ActNorm) Layer.
Activation Normalization is learned invertible normalization, using
a Scale (s) and Bias (b) vector [1].
y = s * x + b (forward)
x = (y - b)/s (inverse)
a Scale (s) and Bias (b) vector::
The scale and bias can be data dependent initalized, such that the
output has a mean of zero and standard deviation of one [1,2].
y = s * x + b (forward)
x = (y - b)/s (inverse)
Notes
-----
The scale and bias can be data dependent initialized, such that the
output has a mean of zero and standard deviation of one [1]_[2]_.
Alternatively, it is initialized with vectors of ones (scale) and
zeros (bias).
[1] - Kingma, Diederik P., and Prafulla Dhariwal.
"Glow: Generative flow with invertible 1x1 convolutions."
arXiv preprint arXiv:1807.03039 (2018).
References
----------
.. [1] Kingma, Diederik P., and Prafulla Dhariwal.
"Glow: Generative flow with invertible 1x1 convolutions."
arXiv preprint arXiv:1807.03039 (2018).
[2] - Salimans, Tim, and Durk P. Kingma.
"Weight normalization: A simple reparameterization to accelerate
training of deep neural networks."
Advances in neural information processing systems 29
(2016): 901-909.
.. [2] Salimans, Tim, and Durk P. Kingma.
"Weight normalization: A simple reparameterization to accelerate
training of deep neural networks."
Advances in neural information processing systems 29 (2016): 901-909.
"""

def __init__(self, latent_dim, act_norm_init, **kwargs):
@@ -353,8 +359,8 @@ def call(self, target, inverse=False):
target : tf.Tensor
If inverse=True: The inversely transformed targets, shape == target.shape
Important
---------
Notes
-----
If ``inverse=False``, the return is ``(z, log_det_J)``.\n
If ``inverse=True``, the return is ``target``.
"""
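
The data-dependent initialization described above amounts to standardizing the first batch; a minimal NumPy sketch of the idea (not BayesFlow's exact implementation):

    import numpy as np

    def init_actnorm(first_batch):
        """Choose scale and bias so that y = scale * x + bias has zero mean, unit std."""
        mean = first_batch.mean(axis=0)
        std = first_batch.std(axis=0)
        scale = 1.0 / std    # forward: y = scale * x + bias
        bias = -mean / std
        return scale, bias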
4 changes: 2 additions & 2 deletions bayesflow/inference_networks.py
@@ -167,8 +167,8 @@ def call(self, targets, condition, inverse=False, **kwargs):
target : tf.Tensor
If inverse=True: The transformed out, shape (batch_size, ...)
Important
---------
Notes
-----
If ``inverse=False``, the return is ``(z, log_det_J)``.\n
If ``inverse=True``, the return is ``target``.
"""
64 changes: 35 additions & 29 deletions bayesflow/simulation.py
@@ -52,10 +52,13 @@ class ContextGenerator:
While the latter can also be considered batchable in principle, batching them would require non-Tensor
(i.e., non-rectangular) data structures, which usually means inefficient computations.
Examples
--------
Example for a simulation context which will generate a random number of observations between 1 and 100 for
each training batch:
>>> gen = ContextGenerator(non_batchable_context_fun=lambda : np.random.randint(1, 101))
"""

def __init__(
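
A slightly fuller sketch combining both context types (illustrative functions; the symmetric ``batchable_context_fun`` keyword is assumed here):

    import numpy as np
    from bayesflow.simulation import ContextGenerator

    gen = ContextGenerator(
        batchable_context_fun=lambda: np.random.normal(),             # re-drawn for each dataset in a batch
        non_batchable_context_fun=lambda: np.random.randint(1, 101),  # shared across the whole batch
    )
    context_dict = gen(batch_size=32)  # keys: 'batchable_context', 'non_batchable_context'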
@@ -103,8 +106,8 @@ def __call__(self, batch_size, *args, **kwargs):
context_dict : dictionary
A dictionary with context variables with the following keys:
`batchable_context` : value
`non_batchable_context` : value
``batchable_context`` : value
``non_batchable_context`` : value
Note that the values of the context variables will be None if the
corresponding context-generating functions have not been provided when
@@ -210,7 +213,7 @@ def __init__(
self.is_batched = False

def __call__(self, batch_size, *args, **kwargs):
"""Generates `batch_size` draws from the prior given optional context generator.
"""Generates ``batch_size`` draws from the prior given optional context generator.
Parameters
----------
@@ -313,12 +316,12 @@ def __call__(self, batch_size, *args, **kwargs):

def plot_prior2d(self, **kwargs):
"""Generates a 2D plot representing bivariate prior ditributions. Uses the function
`bayesflow.diagnostics.plot_prior2d() internally for generating the plot.
``bayesflow.diagnostics.plot_prior2d()`` internally for generating the plot.
Parameters
----------
**kwargs : dict
Optional keyword arguments passed to the `plot_prior2d` function.
Optional keyword arguments passed to the ``plot_prior2d`` function.
Returns
-------
@@ -400,9 +403,10 @@ def __init__(
An optional function (ideally an instance of ``ContextGenerator``) for generating control variables
for the local_prior_fun.
Example: Varying number of local factors (e.g., groups, participants) between 1 and 100:
Examples
--------
Varying number of local factors (e.g., groups, participants) between 1 and 100::
``
def draw_hyper():
# Draw location for 2D conditional prior
return np.random.normal(size=2)
@@ -415,6 +419,7 @@ def draw_prior(means, num_groups, sigma=1.):
context = ContextGenerator(non_batchable_context_fun=lambda : np.random.randint(1, 101))
prior = TwoLevelPrior(draw_hyper, draw_prior, local_context_generator=context)
prior_dict = prior(batch_size=32)
"""

self.hyper_prior = hyper_prior_fun
@@ -512,19 +517,19 @@ class Simulator:
An optional context generator (i.e., an instance of ContextGenerator) or a user-defined callable object
implementing the following two methods can be provided:
- context_generator.batchable_context(batch_size)
- context_generator.non_batchable_context()
- ``context_generator.batchable_context(batch_size)``
- ``context_generator.non_batchable_context()``
"""

def __init__(self, batch_simulator_fun=None, simulator_fun=None, context_generator=None):
"""Instantiates a data generator which will perform randomized simulations given a set of parameters and optional context.
Either a batch_simulator_fun or simulator_fun, but not both, should be provided to instantiate a Simulator object.
Either a ``batch_simulator_fun`` or ``simulator_fun``, but not both, should be provided to instantiate a ``Simulator`` object.
If a batch_simulator_fun is provided, the interface will assume that the function operates on batches of parameter
If a ``batch_simulator_fun`` is provided, the interface will assume that the function operates on batches of parameter
vectors and context variables and will pass the latter directly to the function. Power users should attempt to provide
optimized batched simulators.
If a simulator_fun is provided, the interface will assume that the function operates on single parameter vectors and
If a ``simulator_fun`` is provided, the interface will assume that the function operates on single parameter vectors and
context variables and will wrap the simulator internally to allow batched functionality.
Parameters
@@ -535,8 +540,8 @@ def __init__(self, batch_simulator_fun=None, simulator_fun=None, context_generat
simulator_fun : callable
A function (callable object) with optional control arguments responsible for generating a simulation given
a single parameter vector and optional variables.
context generator : callable (default None, recommended instance of ContextGenerator)
An optional function (ideally an instance of ContextGenerator) for generating prior context variables.
context_generator : callable (default None, recommended instance of ContextGenerator)
An optional function (ideally an instance of ``ContextGenerator``) for generating prior context variables.
"""

if (batch_simulator_fun is None) is (simulator_fun is None):
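
As a sketch of the two instantiation modes (the simulator functions and output shapes are hypothetical, for illustration only):

    import numpy as np
    from bayesflow.simulation import Simulator

    # Per-draw mode: operates on a single parameter vector; wrapped internally
    def my_simulator(theta):
        return np.random.normal(loc=theta, size=(10, theta.shape[0]))

    sim = Simulator(simulator_fun=my_simulator)

    # Batched mode: operates on a whole batch of parameter vectors at once
    def my_batch_simulator(thetas):
        return np.random.normal(
            loc=thetas[:, None, :], size=(thetas.shape[0], 10, thetas.shape[1])
        )

    sim_batched = Simulator(batch_simulator_fun=my_batch_simulator)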
@@ -562,9 +567,9 @@ def __call__(self, params, *args, **kwargs):
out_dict : dictionary
An output dictionary with randomly simulated variables; the following keys are mandatory if default keys are not modified:
`sim_data` : value
`non_batchable_context` : value
`batchable_context` : value
``sim_data`` : value
``non_batchable_context`` : value
``batchable_context`` : value
"""

# Always assume first dimension is batch dimension
@@ -728,8 +733,8 @@ def __init__(
name : str (default - "anonoymous")
An optional name for the generative model. If kept default (None), 'anonymous' is set as name.
Important
----------
Notes
-----
If you are not using the provided ``Prior`` and ``Simulator`` wrappers for your prior and data generator,
only functions returning a ``np.ndarray`` in the correct format will be accepted, since these will be
wrapped internally. In addition, you need to indicate whether your simulator operates on batches of
@@ -761,7 +766,7 @@ def __init__(
self._test()

def __call__(self, batch_size, **kwargs):
"""Carries out forward inference 'batch_size' times."""
"""Carries out forward inference ``batch_size`` times."""

# Forward inference
prior_out = self.prior(batch_size, **kwargs.pop("prior_args", {}))
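
A usage sketch of this entry point (assuming ``generative_model`` is an instantiated ``GenerativeModel``; the name is hypothetical):

    # One forward pass through prior and simulator, repeated 32 times
    out_dict = generative_model(batch_size=32)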
@@ -780,7 +785,7 @@ def __call__(self, batch_size, **kwargs):
return out_dict

def _config_custom_simulator(self, sim_fun, is_batched):
"""Only called if user has provided a custom simulator not using the Simulator wrapper."""
"""Only called if user has provided a custom simulator not using the ``Simulator`` wrapper."""

if is_batched is None:
raise ConfigurationError(
@@ -796,8 +801,8 @@ def _config_custom_simulator(self, sim_fun, is_batched):
def plot_pushforward(
self, parameter_draws=None, funcs_list=None, funcs_labels=None, batch_size=1000, show_raw_sims=True
):
"""Creates simulations from parameter_draws (generated from self.prior if they are not passed as an argument)
and plots visualizations for them.
"""Creates simulations from ``parameter_draws`` (generated from ``self.prior`` if they are not passed as
an argument) and plots visualizations for them.
Parameters
----------
@@ -959,16 +964,16 @@ def presimulate_and_save(
disable_user_input: bool, optional, default: False
If True, user will not be asked if memory space is sufficient for presimulation.
Important
----------
Notes
-----
One of the following pairs of parameters has to be provided:
- (iterations_per_epoch, epochs),
- (total_iterations, iterations_per_epoch)
- (total_iterations, epochs)
Providing all three of the parameters in these pairs leads to a consistency check,
since incompatible combinations are possible.
since incompatible combinations are possible.
"""
# Ensure that the combination of parameters provided is sufficient to perform presimulation
# and does not contain internal contradictions
@@ -1117,6 +1122,7 @@ def presimulate_and_save(

class TwoLevelGenerativeModel:
"""Basic interface for a generative model in a simulation-based context.
Generally, a generative model consists of two mandatory components:
- MultilevelPrior : A randomized function returning random parameter draws from a two-level prior distribution;
- Simulator : A function which transforms the parameters into observables in a non-deterministic manner.
@@ -1149,8 +1155,8 @@ def __init__(
name : str (default - "anonymous")
An optional name for the generative model.
Important
----------
Notes
-----
If you are not using the provided ``TwoLevelPrior`` and ``Simulator`` wrappers for your prior and data
generator, only functions returning a ``np.ndarray`` in the correct format will be accepted, since these will be
wrapped internally. In addition, you need to indicate whether your simulator operates on batches of
@@ -1205,7 +1211,7 @@ def __call__(self, batch_size, **kwargs):
return out_dict

def _config_custom_simulator(self, sim_fun, is_batched):
"""Only called if user has provided a custom simulator not using the Simulator wrapper."""
"""Only called if user has provided a custom simulator not using the ``Simulator`` wrapper."""

if is_batched is None:
raise ConfigurationError(