bayesflow.mcmc module#

class bayesflow.mcmc.MCMCSurrogateLikelihood(amortized_likelihood, configurator=None, likelihood_postprocessor=None, grad_postprocessor=None)[source]#

Bases: object

An interface providing likelihood evaluation and gradient estimation for a pre-trained AmortizedLikelihood instance, which can be used in tandem with (HMC-)MCMC samplers, as implemented, for instance, in PyMC3.

__init__(amortized_likelihood, configurator=None, likelihood_postprocessor=None, grad_postprocessor=None)[source]#

Creates an instance of the surrogate likelihood using a pre-trained AmortizedLikelihood instance.

Parameters:
amortized_likelihood : bayesflow.amortized_inference.AmortizedLikelihood

A pre-trained (invertible) inference network which processes the outputs of the generative model.

configurator : callable, optional, default: None

A function that takes the input to the log_likelihood and log_likelihood_grad calls and converts it to a dictionary containing the following mandatory keys, if DEFAULT_KEYS unchanged:

observables - the variables over which a conditional density is learned (i.e., the observables)
conditions - the conditioning variables that are directly passed to the inference network

Default behavior: return the first parameter, which has to be a dictionary with the characteristics above.

likelihood_postprocessor : callable, optional, default: None

A function that takes the log-likelihood of each observable as input and can be used for aggregation. Default behavior: sum all log-likelihood values and return a single value.

grad_postprocessor : callable, optional, default: None

A function that takes the gradient for each value in conditions, as returned by the configurator. Default behavior: leave the values unchanged.
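To illustrate the contract a custom configurator must satisfy, here is a minimal numpy-only sketch. The function name, shapes, and the float32 cast are illustrative assumptions, not part of the bayesflow API; the only hard requirement is the returned dictionary with the two mandatory keys (assuming DEFAULT_KEYS unchanged):

```python
import numpy as np

# Hypothetical shapes: N observations of dimension x_dim, conditioned on
# a parameter vector of dimension cond_dim.
N, x_dim, cond_dim = 100, 2, 3

def my_configurator(input_dict):
    """Returns the two mandatory keys as float32 arrays, mimicking
    what a default-compatible configurator is expected to produce."""
    return {
        "observables": np.asarray(input_dict["observables"], dtype=np.float32),
        "conditions": np.asarray(input_dict["conditions"], dtype=np.float32),
    }

out = my_configurator({
    "observables": np.zeros((N, x_dim)),
    "conditions": np.zeros((N, cond_dim)),
})
```

Such a function would then be passed as the configurator argument at construction time.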

log_likelihood(*args, **kwargs)[source]#

Calculates the approximate log-likelihood of targets given conditional variables.

Parameters:
The parameters as expected by ``configurator``. For the default configurator,
the first parameter has to be a dictionary containing the following mandatory keys,
if DEFAULT_KEYS unchanged:

observables - the variables over which a conditional density is learned (i.e., the observables)
conditions - the conditioning variables that are directly passed to the inference network

Returns:
out : np.ndarray

The output as returned by likelihood_postprocessor. For the default postprocessor, this is the total log-likelihood given by the sum of all log-likelihood values.
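The default aggregation can be sketched in plain numpy; the per-observable values below are made up for illustration, whereas in practice they come from the amortized likelihood network:

```python
import numpy as np

# Hypothetical per-observable log-likelihood values.
per_observable_loglik = np.array([-1.2, -0.7, -2.1])

def default_postprocessor(loglik_values):
    # Default behavior: aggregate all values into one total log-likelihood.
    return np.sum(loglik_values)

total_loglik = default_postprocessor(per_observable_loglik)
```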

log_likelihood_grad(*args, **kwargs)[source]#

Calculates the gradient of the surrogate likelihood with respect to every parameter in conditions.

Parameters:
The parameters as expected by ``configurator``. For the default configurator,
the first parameter has to be a dictionary containing the following mandatory keys,
if ``DEFAULT_KEYS`` unchanged:

observables - the variables over which a conditional density is learned (i.e., the observables)
conditions - the conditioning variables that are directly passed to the inference network

Returns:
out : np.ndarray

The output as returned by grad_postprocessor. For the default postprocessor, this is an array containing the derivative with respect to each value in conditions as returned by configurator.

class bayesflow.mcmc.PyMCSurrogateLikelihood(amortized_likelihood, observables, configurator=None, likelihood_postprocessor=None, grad_postprocessor=None, default_pymc_type=<class 'numpy.float64'>, default_tf_type=<class 'numpy.float32'>)[source]#

Bases: Op, MCMCSurrogateLikelihood

itypes: Sequence[Type] | None = [TensorType(float64, (?,))]#
otypes: Sequence[Type] | None = [TensorType(float64, ())]#
__init__(amortized_likelihood, observables, configurator=None, likelihood_postprocessor=None, grad_postprocessor=None, default_pymc_type=<class 'numpy.float64'>, default_tf_type=<class 'numpy.float32'>)[source]#

A custom surrogate likelihood function for integration with PyMC3, to be used with pymc.Potential.

Parameters:
amortized_likelihood : bayesflow.amortized_inference.AmortizedLikelihood

A pre-trained (invertible) inference network which processes the outputs of the generative model.

observables

The “observed” data that will be passed to the configurator. For the default configurator, an np.array of shape (N, x_dim).

configurator : callable, optional, default: None

A function that takes the input to the log_likelihood and log_likelihood_grad calls and converts it to a dictionary containing the following mandatory keys, if DEFAULT_KEYS unchanged:

observables - the variables over which a conditional density is learned (i.e., the observables)
conditions - the conditioning variables that are directly passed to the inference network

Default behavior: convert observables to shape (1, N, x_dim) and expand parameters of shape (cond_dim,) to shape (1, N, cond_dim).

likelihood_postprocessor : callable, optional, default: None

A function that takes the log-likelihood of each observable as input and can be used for aggregation. Default behavior: sum all log-likelihood values and return a single value.

grad_postprocessor : callable, optional, default: None

A function that takes the gradient for each value in conditions, as returned by the configurator. Default behavior: reduce the shape from (1, N, cond_dim) to (cond_dim,) by summing the corresponding values.

default_pymc_type : np.dtype, optional, default: np.float64

The default float type to use for numpy arrays as required by PyMC.

default_tf_type : np.dtype, optional, default: np.float32

The default float type to use for tensorflow tensors.
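The default shape and dtype handling described above can be illustrated with a numpy-only sketch. The variable names and concrete sizes are assumptions for illustration; the actual conversions happen inside the class:

```python
import numpy as np

N, x_dim, cond_dim = 50, 2, 3
observables = np.random.rand(N, x_dim)   # "observed" data on the PyMC side (float64)
params = np.random.rand(cond_dim)        # current parameter vector from the sampler

# Default configurator behavior: add a batch axis to the observables and
# tile the parameter vector across all N observations, casting to the
# tensorflow float type.
obs_batched = observables[np.newaxis].astype(np.float32)   # (1, N, x_dim)
conds = np.tile(params, (1, N, 1)).astype(np.float32)      # (1, N, cond_dim)

# Default grad postprocessor behavior: collapse the per-observation
# gradients back to one gradient per parameter by summing, casting back
# to the PyMC float type.
grads = np.random.rand(1, N, cond_dim).astype(np.float32)
grad_out = grads.sum(axis=(0, 1)).astype(np.float64)       # (cond_dim,)
```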

perform(node, inputs, outputs)[source]#

Computes the log-likelihood of inputs (typically the parameter vector of a model).

Parameters:
node : The symbolic aesara.graph.basic.Apply node that represents this computation.

inputs : Immutable sequence of non-symbolic/numeric inputs. These are the values of each Variable in node.inputs.

outputs : List of mutable single-element lists (do not change the length of these lists). Each sub-list corresponds to the value of each Variable in node.outputs. The primary purpose of this method is to set the values of these sub-lists.

grad(inputs, output_grads)[source]#

Aggregates the gradients with respect to inputs (typically the parameter vector).

Parameters:
inputs : The input variables.
output_grads : The gradients of the output variables.
Returns:
grads : The gradients with respect to each Variable in inputs.