tfp.experimental.substrates.jax.bijectors.Chain

View source on GitHub

Bijector which applies a sequence of bijectors.

Inherits From: Bijector

```python
tfp.experimental.substrates.jax.bijectors.Chain(
    bijectors=None, validate_args=False, name=None
)
```

#### Example Use:

```python
chain = Chain([Exp(), Softplus()], name="one_plus_exp")
```

Results in:

* Forward:

 ```python
 exp = Exp()
 softplus = Softplus()
 Chain([exp, softplus]).forward(x)
 = exp.forward(softplus.forward(x))
 = tf.exp(tf.log(1. + tf.exp(x)))
 = 1. + tf.exp(x)
 ```

* Inverse:

 ```python
 exp = Exp()
 softplus = Softplus()
 Chain([exp, softplus]).inverse(y)
 = softplus.inverse(exp.inverse(y))
 = tf.log(tf.exp(tf.log(y)) - 1.)
 = tf.log(y - 1.)
 ```
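
A runnable version of the example above, written against the JAX substrate documented on this page (a minimal sketch; it assumes `jax` is installed and that the substrate is importable under the path in the page title):

```python
import jax.numpy as jnp
import tensorflow_probability as tfp

tfb = tfp.experimental.substrates.jax.bijectors

# Chain applies its bijectors right-to-left: Softplus first, then Exp.
chain = tfb.Chain([tfb.Exp(), tfb.Softplus()], name="one_plus_exp")

x = jnp.array([-1., 0., 1.])
y = chain.forward(x)       # == 1. + exp(x), elementwise
print(y)
print(chain.inverse(y))    # recovers x, up to floating-point error
```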

#### Args:


* <b>`bijectors`</b>: Python `list` of bijector instances. An empty list makes this
  bijector equivalent to the `Identity` bijector.
* <b>`validate_args`</b>: Python `bool` indicating whether arguments should be
  checked for correctness.
* <b>`name`</b>: Python `str`, name given to ops managed by this object. Default:
  derived from the component bijectors' names, e.g.
  `Chain([Exp(), Softplus()]).name == "chain_of_exp_of_softplus"`.


#### Attributes:

* <b>`bijectors`</b>
* <b>`dtype`</b>:   dtype of `Tensor`s transformable by this bijector.
* <b>`forward_min_event_ndims`</b>:   Returns the minimal number of dimensions bijector.forward operates on.
* <b>`graph_parents`</b>:   Returns this `Bijector`'s graph_parents as a Python list.
* <b>`inverse_min_event_ndims`</b>:   Returns the minimal number of dimensions bijector.inverse operates on.
* <b>`is_constant_jacobian`</b>:   Returns true iff the Jacobian matrix is not a function of x.

  Note: Jacobian matrix is either constant for both forward and inverse or
  neither.

* <b>`name`</b>:   Returns the string name of this `Bijector`.
* <b>`trainable_variables`</b>
* <b>`validate_args`</b>:   Returns True if Tensor arguments will be validated.
* <b>`variables`</b>
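
These attributes can be inspected directly on the `one_plus_exp` chain; a minimal sketch (same import assumptions as the example near the top of this page), with expected values in the comments based on the descriptions above:

```python
import tensorflow_probability as tfp

tfb = tfp.experimental.substrates.jax.bijectors

chain = tfb.Chain([tfb.Exp(), tfb.Softplus()])

print(chain.bijectors)                # [Exp, Softplus], outermost first
print(chain.forward_min_event_ndims)  # 0: both components act elementwise
print(chain.is_constant_jacobian)     # False: the Jacobian depends on x
print(chain.name)                     # default: "chain_of_exp_of_softplus"
```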


#### Raises:


* <b>`ValueError`</b>: if bijectors have different dtypes.

## Methods

<h3 id="__call__"><code>__call__</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.9.0/tensorflow_probability/python/bijectors/_jax/bijector.py#L781-L866">View source</a>

```python
__call__(
    value, name=None, **kwargs
)
```

Applies or composes the Bijector, depending on input type.

This is a convenience function which applies the Bijector instance in three different ways, depending on the input:

  1. If the input is a tfd.Distribution instance, return tfd.TransformedDistribution(distribution=input, bijector=self).
  2. If the input is a tfb.Bijector instance, return tfb.Chain([self, input]).
  3. Otherwise, return self.forward(input)

Args:

  • value: A tfd.Distribution, tfb.Bijector, or a Tensor.
  • name: Python str name given to ops created by this function.
  • **kwargs: Additional keyword arguments passed into the created tfd.TransformedDistribution, tfb.Bijector, or self.forward.

Returns:

  • composition: A tfd.TransformedDistribution if the input was a tfd.Distribution, a tfb.Chain if the input was a tfb.Bijector, or a Tensor computed by self.forward.

Examples

```python
sigmoid = tfb.Reciprocal()(
    tfb.AffineScalar(shift=1.)(
      tfb.Exp()(
        tfb.AffineScalar(scale=-1.))))
# ==> `tfb.Chain([
#         tfb.Reciprocal(),
#         tfb.AffineScalar(shift=1.),
#         tfb.Exp(),
#         tfb.AffineScalar(scale=-1.),
#      ])`  # ie, `tfb.Sigmoid()`

log_normal = tfb.Exp()(tfd.Normal(0, 1))
# ==> `tfd.TransformedDistribution(tfd.Normal(0, 1), tfb.Exp())`

tfb.Exp()([-1., 0., 1.])
# ==> tf.exp([-1., 0., 1.])
```
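
A hedged, runnable variant of these examples for the JAX substrate (a sketch; same import assumptions as earlier on this page, sticking to the bijectors already used above):

```python
import jax.numpy as jnp
import tensorflow_probability as tfp

substrate = tfp.experimental.substrates.jax
tfb, tfd = substrate.bijectors, substrate.distributions

# 1. Bijector(Distribution) -> TransformedDistribution.
log_normal = tfb.Exp()(tfd.Normal(0., 1.))
print(log_normal.log_prob(jnp.array([0.5, 1., 2.])))

# 2. Bijector(Bijector) -> Chain([outer, inner]); this is the same
#    `one_plus_exp` chain described at the top of this page.
one_plus_exp = tfb.Exp()(tfb.Softplus())
print(one_plus_exp.forward(jnp.array([0., 1.])))  # ~ 1. + exp(x)

# 3. Bijector(Tensor) -> self.forward(Tensor).
print(tfb.Exp()(jnp.array([-1., 0., 1.])))        # == exp of each entry
```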

<h3 id="forward"><code>forward</code></h3>

View source

```python
forward(
    x, name='forward', **kwargs
)
```

Returns the forward Bijector evaluation, i.e., Y = g(X).

Args:

  • x: Tensor. The input to the 'forward' evaluation.
  • name: The name to give this op.
  • **kwargs: Named arguments forwarded to subclass implementation.

Returns:

Tensor.

Raises:

  • TypeError: if self.dtype is specified and x.dtype is not self.dtype.
  • NotImplementedError: if _forward is not implemented.
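
A small usage sketch (same assumptions as above); for a chain of elementwise bijectors, `forward` preserves the input shape:

```python
import jax.numpy as jnp
import tensorflow_probability as tfp

tfb = tfp.experimental.substrates.jax.bijectors

chain = tfb.Chain([tfb.Exp(), tfb.Softplus()])

x = jnp.array([[-1., 0.], [1., 2.]])
y = chain.forward(x)   # applies Softplus, then Exp, elementwise
print(y.shape)         # (2, 2)
```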

<h3 id="forward_event_shape"><code>forward_event_shape</code></h3>

View source

```python
forward_event_shape(
    input_shape
)
```

Shape of a single sample from a single batch as a TensorShape.

Same meaning as forward_event_shape_tensor. May be only partially defined.

Args:

  • input_shape: TensorShape indicating event-portion shape passed into forward function.

Returns:

  • forward_event_shape_tensor: TensorShape indicating event-portion shape after applying forward. Possibly unknown.

<h3 id="forward_event_shape_tensor"><code>forward_event_shape_tensor</code></h3>

View source

```python
forward_event_shape_tensor(
    input_shape, name='forward_event_shape_tensor'
)
```

Shape of a single sample from a single batch as an int32 1D Tensor.

Args:

  • input_shape: Tensor, int32 vector indicating event-portion shape passed into forward function.
  • name: The name to give this op.

Returns:

  • forward_event_shape_tensor: Tensor, int32 vector indicating event-portion shape after applying forward.
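
Both shape methods are illustrated below for the elementwise chain used throughout this page, where the event-portion shape passes through unchanged (a sketch; same assumptions as above):

```python
import tensorflow_probability as tfp

tfb = tfp.experimental.substrates.jax.bijectors

chain = tfb.Chain([tfb.Exp(), tfb.Softplus()])

# Static variant: accepts and returns a (possibly partial) TensorShape.
print(chain.forward_event_shape([3, 2]))         # [3, 2]
# Tensor variant: returns an int32 vector.
print(chain.forward_event_shape_tensor([3, 2]))  # [3 2]
```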

<h3 id="forward_log_det_jacobian"><code>forward_log_det_jacobian</code></h3>

View source

```python
forward_log_det_jacobian(
    x, event_ndims, name='forward_log_det_jacobian', **kwargs
)
```

Returns the forward log det Jacobian, i.e., log(det(dY/dX))(X). (Recall that: Y = g(X).)

Args:

  • x: Tensor. The input to the 'forward' Jacobian determinant evaluation.
  • event_ndims: Number of dimensions in the probabilistic events being transformed. Must be greater than or equal to self.forward_min_event_ndims. The result is summed over the final dimensions to produce a scalar Jacobian determinant for each event, i.e. it has shape rank(x) - event_ndims dimensions.
  • name: The name to give this op.
  • **kwargs: Named arguments forwarded to subclass implementation.

Returns:

Tensor, if this bijector is injective. If not injective this is not implemented.

Raises:

  • TypeError: if self.dtype is specified and y.dtype is not self.dtype.
  • NotImplementedError: if neither _forward_log_det_jacobian nor {_inverse, _inverse_log_det_jacobian} are implemented, or this is a non-injective bijector.
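
For a `Chain`, the forward log det Jacobian accumulates the components' contributions along the forward pass (the chain rule for log-determinants). The sketch below checks that equivalence and shows `event_ndims` reducing over the last dimension (same assumptions as above):

```python
import jax.numpy as jnp
import tensorflow_probability as tfp

tfb = tfp.experimental.substrates.jax.bijectors

exp, softplus = tfb.Exp(), tfb.Softplus()
chain = tfb.Chain([exp, softplus])

x = jnp.array([0.5, 1.5, 2.5])

# Per-element log det Jacobian; event_ndims=0 matches forward_min_event_ndims.
fldj = chain.forward_log_det_jacobian(x, event_ndims=0)

# Chain rule: sum each component's contribution at the running intermediate.
manual = (softplus.forward_log_det_jacobian(x, event_ndims=0)
          + exp.forward_log_det_jacobian(softplus.forward(x), event_ndims=0))
print(fldj)
print(manual)  # should agree up to floating-point error

# event_ndims=1 treats the whole vector as a single event: scalar result.
print(chain.forward_log_det_jacobian(x, event_ndims=1))  # == fldj.sum()
```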

<h3 id="inverse"><code>inverse</code></h3>

View source

```python
inverse(
    y, name='inverse', **kwargs
)
```

Returns the inverse Bijector evaluation, i.e., X = g^{-1}(Y).

Args:

  • y: Tensor. The input to the 'inverse' evaluation.
  • name: The name to give this op.
  • **kwargs: Named arguments forwarded to subclass implementation.

Returns:

Tensor, if this bijector is injective. If not injective, returns the k-tuple containing the unique k points (x1, ..., xk) such that g(xi) = y.

Raises:

  • TypeError: if self.dtype is specified and y.dtype is not self.dtype.
  • NotImplementedError: if _inverse is not implemented.
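
A small sketch (same assumptions as above); for the `one_plus_exp` chain, `inverse(y)` is `log(y - 1.)`, so inputs must exceed 1:

```python
import jax.numpy as jnp
import tensorflow_probability as tfp

tfb = tfp.experimental.substrates.jax.bijectors

chain = tfb.Chain([tfb.Exp(), tfb.Softplus()])

y = jnp.array([1.5, 2., 3.])
print(chain.inverse(y))   # == log(y - 1.)
print(jnp.log(y - 1.))    # same values, computed directly
```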

<h3 id="inverse_event_shape"><code>inverse_event_shape</code></h3>

View source

```python
inverse_event_shape(
    output_shape
)
```

Shape of a single sample from a single batch as a TensorShape.

Same meaning as inverse_event_shape_tensor. May be only partially defined.

Args:

  • output_shape: TensorShape indicating event-portion shape passed into inverse function.

Returns:

  • inverse_event_shape_tensor: TensorShape indicating event-portion shape after applying inverse. Possibly unknown.

<h3 id="inverse_event_shape_tensor"><code>inverse_event_shape_tensor</code></h3>

View source

```python
inverse_event_shape_tensor(
    output_shape, name='inverse_event_shape_tensor'
)
```

Shape of a single sample from a single batch as an int32 1D Tensor.

Args:

  • output_shape: Tensor, int32 vector indicating event-portion shape passed into inverse function.
  • name: The name to give this op.

Returns:

  • inverse_event_shape_tensor: Tensor, int32 vector indicating event-portion shape after applying inverse.
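
As with the forward variants, the event-portion shape is unchanged for this elementwise chain (a sketch; same assumptions as above):

```python
import tensorflow_probability as tfp

tfb = tfp.experimental.substrates.jax.bijectors

chain = tfb.Chain([tfb.Exp(), tfb.Softplus()])

print(chain.inverse_event_shape([4]))         # [4]
print(chain.inverse_event_shape_tensor([4]))  # [4]
```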

<h3 id="inverse_log_det_jacobian"><code>inverse_log_det_jacobian</code></h3>

View source

```python
inverse_log_det_jacobian(
    y, event_ndims, name='inverse_log_det_jacobian', **kwargs
)
```

Returns the (log o det o Jacobian o inverse)(y).

Mathematically, returns: log(det(dX/dY))(Y). (Recall that: X=g^{-1}(Y).)

Note that forward_log_det_jacobian is the negative of this function, evaluated at g^{-1}(y).

Args:

  • y: Tensor. The input to the 'inverse' Jacobian determinant evaluation.
  • event_ndims: Number of dimensions in the probabilistic events being transformed. Must be greater than or equal to self.inverse_min_event_ndims. The result is summed over the final dimensions to produce a scalar Jacobian determinant for each event, i.e. it has shape rank(y) - event_ndims dimensions.
  • name: The name to give this op.
  • **kwargs: Named arguments forwarded to subclass implementation.

Returns:

  • ildj: Tensor, if this bijector is injective. If not injective, returns the tuple of local log det Jacobians, log(det(Dg_i^{-1}(y))), where g_i is the restriction of g to the ith partition Di.

Raises:

  • TypeError: if self.dtype is specified and y.dtype is not self.dtype.
  • NotImplementedError: if _inverse_log_det_jacobian is not implemented.
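
The identity noted above, that the inverse log det Jacobian is the negative of the forward one evaluated at g^{-1}(y), can be checked directly (a sketch; same assumptions as above):

```python
import jax.numpy as jnp
import tensorflow_probability as tfp

tfb = tfp.experimental.substrates.jax.bijectors

chain = tfb.Chain([tfb.Exp(), tfb.Softplus()])

y = jnp.array([1.5, 2., 3.])
ildj = chain.inverse_log_det_jacobian(y, event_ndims=0)
fldj = chain.forward_log_det_jacobian(chain.inverse(y), event_ndims=0)
print(ildj)
print(-fldj)  # should agree up to floating-point error
```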