# tfp.bijectors.Chain

View source on GitHub

Bijector which applies a sequence of bijectors.

Inherits From: Bijector

```python
tfp.bijectors.Chain(
    bijectors=None, validate_args=False, name=None
)
```

Example Use:

```python
chain = Chain([Exp(), Softplus()], name="one_plus_exp")
```

Results in:

* Forward:

 ```python
 exp = Exp()
 softplus = Softplus()
 Chain([exp, softplus]).forward(x)
 = exp.forward(softplus.forward(x))
 = tf.exp(tf.log(1. + tf.exp(x)))
 = 1. + tf.exp(x)
 ```

* Inverse:

 ```python
 exp = Exp()
 softplus = Softplus()
 Chain([exp, softplus]).inverse(y)
 = softplus.inverse(exp.inverse(y))
 = tf.log(tf.exp(tf.log(y)) - 1.)
 = tf.log(y - 1.)
 ```
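
The algebra above can be checked numerically. Below is a minimal, runnable sketch (assuming `tensorflow` and `tensorflow_probability` are importable):

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

chain = tfb.Chain([tfb.Exp(), tfb.Softplus()], name="one_plus_exp")
x = tf.constant([0., 1., 2.])

y = chain.forward(x)   # ==> 1. + tf.exp(x), i.e. [2., 3.7183, 8.3891]
chain.inverse(y)       # ==> recovers [0., 1., 2.] (up to float error)
```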

#### Args:


* <b>`bijectors`</b>: Python `list` of bijector instances. An empty list makes this
  bijector equivalent to the `Identity` bijector.
* <b>`validate_args`</b>: Python `bool` indicating whether arguments should be
  checked for correctness.
* <b>`name`</b>: Python `str`, name given to ops managed by this object. Default:
  derived from the component bijector names, e.g.,
  `Chain([Exp(), Softplus()]).name == "chain_of_exp_of_softplus"`.


#### Attributes:

* <b>`bijectors`</b>:   The `list` of bijectors this `Chain` composes.
* <b>`dtype`</b>:   dtype of `Tensor`s transformable by this bijector.
* <b>`forward_min_event_ndims`</b>:   Returns the minimal number of dimensions bijector.forward operates on.
* <b>`graph_parents`</b>:   Returns this `Bijector`'s graph_parents as a Python list.
* <b>`inverse_min_event_ndims`</b>:   Returns the minimal number of dimensions bijector.inverse operates on.
* <b>`is_constant_jacobian`</b>:   Returns true iff the Jacobian matrix is not a function of x.

  Note: Jacobian matrix is either constant for both forward and inverse or
  neither.

* <b>`name`</b>:   Returns the string name of this `Bijector`.
* <b>`name_scope`</b>:   Returns a <a href="/api_docs/python/tf/name_scope"><code>tf.name_scope</code></a> instance for this class.
* <b>`submodules`</b>:   Sequence of all sub-modules.

  Submodules are modules which are properties of this module, or found as
  properties of modules which are properties of this module (and so on).

  ```python
  a = tf.Module()
  b = tf.Module()
  c = tf.Module()
  a.b = b
  b.c = c
  assert list(a.submodules) == [b, c]
  assert list(b.submodules) == [c]
  assert list(c.submodules) == []
  ```

* <b>`trainable_variables`</b>:   Sequence of trainable variables owned by this module and its submodules.

  Note: this method uses reflection to find variables on the current instance
  and submodules. For performance reasons you may wish to cache the result
  of calling this method if you don't expect the return value to change.

* <b>`validate_args`</b>:   Returns True if Tensor arguments will be validated.
* <b>`variables`</b>:   Sequence of variables owned by this module and its submodules.

  Note: this method uses reflection to find variables on the current instance
  and submodules. For performance reasons you may wish to cache the result
  of calling this method if you don't expect the return value to change.


#### Raises:


* <b>`ValueError`</b>: if bijectors have different dtypes.
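
As an illustrative sketch of this check (the construction below and the exact error text are assumptions, not taken from this page), chaining bijectors whose parameters pin down different dtypes fails at construction time:

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

# NOTE: AffineScalar is used here only as a convenient parameterized bijector
# whose dtype is fixed by its `shift` parameter.
b32 = tfb.AffineScalar(shift=tf.constant(1., dtype=tf.float32))
b64 = tfb.AffineScalar(shift=tf.constant(1., dtype=tf.float64))

try:
  tfb.Chain([b32, b64])  # mixed float32/float64 sub-bijectors
except ValueError as e:
  print(e)  # dtype mismatch reported at construction
```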

## Methods

<h3 id="__call__"><code>__call__</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.9.0/tensorflow_probability/python/bijectors/bijector.py#L780-L865">View source</a>

```python
__call__(
    value, name=None, **kwargs
)
```

Applies or composes the `Bijector`, depending on input type.

This is a convenience function which applies the `Bijector` instance in three different ways, depending on the input:

  1. If the input is a `tfd.Distribution` instance, return `tfd.TransformedDistribution(distribution=input, bijector=self)`.
  2. If the input is a `tfb.Bijector` instance, return `tfb.Chain([self, input])`.
  3. Otherwise, return `self.forward(input)`.

#### Args:

* <b>`value`</b>: A `tfd.Distribution`, `tfb.Bijector`, or a `Tensor`.
* <b>`name`</b>: Python `str` name given to ops created by this function.
* <b>`**kwargs`</b>: Additional keyword arguments passed into the created `tfd.TransformedDistribution`, `tfb.Bijector`, or `self.forward`.

#### Returns:

* <b>`composition`</b>: A `tfd.TransformedDistribution` if the input was a `tfd.Distribution`, a `tfb.Chain` if the input was a `tfb.Bijector`, or a `Tensor` computed by `self.forward`.

#### Examples

```python
sigmoid = tfb.Reciprocal()(
    tfb.AffineScalar(shift=1.)(
      tfb.Exp()(
        tfb.AffineScalar(scale=-1.))))
# ==> `tfb.Chain([
#         tfb.Reciprocal(),
#         tfb.AffineScalar(shift=1.),
#         tfb.Exp(),
#         tfb.AffineScalar(scale=-1.),
#      ])`  # i.e., `tfb.Sigmoid()`

log_normal = tfb.Exp()(tfd.Normal(0, 1))
# ==> `tfd.TransformedDistribution(tfd.Normal(0, 1), tfb.Exp())`

tfb.Exp()([-1., 0., 1.])
# ==> tf.exp([-1., 0., 1.])
```

<h3 id="forward"><code>forward</code></h3>

View source

```python
forward(
    x, name='forward', **kwargs
)
```

Returns the forward `Bijector` evaluation, i.e., Y = g(X).

#### Args:

* <b>`x`</b>: `Tensor`. The input to the 'forward' evaluation.
* <b>`name`</b>: The name to give this op.
* <b>`**kwargs`</b>: Named arguments forwarded to subclass implementation.

#### Returns:

`Tensor`.

#### Raises:

* <b>`TypeError`</b>: if `self.dtype` is specified and `x.dtype` is not `self.dtype`.
* <b>`NotImplementedError`</b>: if `_forward` is not implemented.

<h3 id="forward_event_shape"><code>forward_event_shape</code></h3>

View source

```python
forward_event_shape(
    input_shape
)
```

Shape of a single sample from a single batch as a TensorShape.

Same meaning as `forward_event_shape_tensor`. May be only partially defined.

#### Args:

* <b>`input_shape`</b>: `TensorShape` indicating event-portion shape passed into `forward` function.

#### Returns:

* <b>`forward_event_shape`</b>: `TensorShape` indicating event-portion shape after applying `forward`. Possibly unknown.

<h3 id="forward_event_shape_tensor"><code>forward_event_shape_tensor</code></h3>

View source

```python
forward_event_shape_tensor(
    input_shape, name='forward_event_shape_tensor'
)
```

Shape of a single sample from a single batch as an int32 1D Tensor.

#### Args:

* <b>`input_shape`</b>: `Tensor`, `int32` vector indicating event-portion shape passed into `forward` function.
* <b>`name`</b>: Name to give to the op.

#### Returns:

* <b>`forward_event_shape_tensor`</b>: `Tensor`, `int32` vector indicating event-portion shape after applying `forward`.
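
Since `Chain` threads shapes through each component in sequence, a shape-changing bijector in the chain changes the reported event shape. A minimal sketch (assuming `tfb.Reshape` is available, as in recent TFP releases):

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

# Reshape runs first (rightmost element), then Exp acts elementwise.
chain = tfb.Chain([tfb.Exp(),
                   tfb.Reshape(event_shape_out=[6], event_shape_in=[2, 3])])

chain.forward_event_shape([2, 3])         # ==> TensorShape([6])
chain.forward_event_shape_tensor([2, 3])  # ==> int32 Tensor: [6]
```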

<h3 id="forward_log_det_jacobian"><code>forward_log_det_jacobian</code></h3>

View source

```python
forward_log_det_jacobian(
    x, event_ndims, name='forward_log_det_jacobian', **kwargs
)
```

Returns the forward log det Jacobian, i.e., log(det(dY/dX))(X).

#### Args:

* <b>`x`</b>: `Tensor`. The input to the 'forward' Jacobian determinant evaluation.
* <b>`event_ndims`</b>: Number of dimensions in the probabilistic events being transformed. Must be greater than or equal to `self.forward_min_event_ndims`. The result is summed over the final dimensions to produce a scalar Jacobian determinant for each event, i.e. it has `rank(x) - event_ndims` dimensions.
* <b>`name`</b>: The name to give this op.
* <b>`**kwargs`</b>: Named arguments forwarded to subclass implementation.

#### Returns:

`Tensor`, if this bijector is injective. If not injective, this is not implemented.

#### Raises:

* <b>`TypeError`</b>: if `self.dtype` is specified and `x.dtype` is not `self.dtype`.
* <b>`NotImplementedError`</b>: if neither `_forward_log_det_jacobian` nor {`_inverse`, `_inverse_log_det_jacobian`} are implemented, or this is a non-injective bijector.
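
To make the `event_ndims` reduction concrete, here is a small sketch using the chain from the example at the top of this page. Since `forward` computes `1. + tf.exp(x)`, the elementwise log det Jacobian is `x` itself, and a higher `event_ndims` sums it over trailing dimensions:

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

chain = tfb.Chain([tfb.Exp(), tfb.Softplus()])  # y = 1. + exp(x)
x = tf.constant([1., 2., 3.])

# dy/dx = exp(x), so log|dy/dx| = x, computed per element.
chain.forward_log_det_jacobian(x, event_ndims=0)  # ==> [1., 2., 3.]

# With event_ndims=1, the per-element terms are summed over the last axis.
chain.forward_log_det_jacobian(x, event_ndims=1)  # ==> 6.
```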

<h3 id="inverse"><code>inverse</code></h3>

View source

```python
inverse(
    y, name='inverse', **kwargs
)
```

Returns the inverse `Bijector` evaluation, i.e., X = g^{-1}(Y).

#### Args:

* <b>`y`</b>: `Tensor`. The input to the 'inverse' evaluation.
* <b>`name`</b>: The name to give this op.
* <b>`**kwargs`</b>: Named arguments forwarded to subclass implementation.

#### Returns:

`Tensor`, if this bijector is injective. If not injective, returns the k-tuple containing the unique `k` points `(x1, ..., xk)` such that `g(x_i) = y`.

#### Raises:

* <b>`TypeError`</b>: if `self.dtype` is specified and `y.dtype` is not `self.dtype`.
* <b>`NotImplementedError`</b>: if `_inverse` is not implemented.

<h3 id="inverse_event_shape"><code>inverse_event_shape</code></h3>

View source

```python
inverse_event_shape(
    output_shape
)
```

Shape of a single sample from a single batch as a TensorShape.

Same meaning as `inverse_event_shape_tensor`. May be only partially defined.

#### Args:

* <b>`output_shape`</b>: `TensorShape` indicating event-portion shape passed into `inverse` function.

#### Returns:

* <b>`inverse_event_shape`</b>: `TensorShape` indicating event-portion shape after applying `inverse`. Possibly unknown.

<h3 id="inverse_event_shape_tensor"><code>inverse_event_shape_tensor</code></h3>

View source

```python
inverse_event_shape_tensor(
    output_shape, name='inverse_event_shape_tensor'
)
```

Shape of a single sample from a single batch as an int32 1D Tensor.

#### Args:

* <b>`output_shape`</b>: `Tensor`, `int32` vector indicating event-portion shape passed into `inverse` function.
* <b>`name`</b>: Name to give to the op.

#### Returns:

* <b>`inverse_event_shape_tensor`</b>: `Tensor`, `int32` vector indicating event-portion shape after applying `inverse`.

<h3 id="inverse_log_det_jacobian"><code>inverse_log_det_jacobian</code></h3>

View source

```python
inverse_log_det_jacobian(
    y, event_ndims, name='inverse_log_det_jacobian', **kwargs
)
```

Returns the (log o det o Jacobian o inverse)(y).

Mathematically, returns: log(det(dX/dY))(Y). (Recall that: X=g^{-1}(Y).)

Note that `forward_log_det_jacobian` is the negative of this function, evaluated at `g^{-1}(y)`.

#### Args:

* <b>`y`</b>: `Tensor`. The input to the 'inverse' Jacobian determinant evaluation.
* <b>`event_ndims`</b>: Number of dimensions in the probabilistic events being transformed. Must be greater than or equal to `self.inverse_min_event_ndims`. The result is summed over the final dimensions to produce a scalar Jacobian determinant for each event, i.e. it has `rank(y) - event_ndims` dimensions.
* <b>`name`</b>: The name to give this op.
* <b>`**kwargs`</b>: Named arguments forwarded to subclass implementation.

#### Returns:

* <b>`ildj`</b>: `Tensor`, if this bijector is injective. If not injective, returns the tuple of local log det Jacobians, `log(det(Dg_i^{-1}(y)))`, where `g_i` is the restriction of `g` to the `i`th partition `D_i`.

#### Raises:

* <b>`TypeError`</b>: if `self.dtype` is specified and `y.dtype` is not `self.dtype`.
* <b>`NotImplementedError`</b>: if `_inverse_log_det_jacobian` is not implemented.
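
The negation identity noted above can be checked numerically; below is a minimal sketch using the chain from the top of this page, where `forward` computes `1. + exp(x)` so the inverse log det Jacobian at `y` is `-log(y - 1.)`:

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

chain = tfb.Chain([tfb.Exp(), tfb.Softplus()])  # y = 1. + exp(x)
x = tf.constant(2.)
y = chain.forward(x)  # ==> 1. + exp(2.)

# ildj(y) == -fldj(x) whenever y == forward(x).
chain.inverse_log_det_jacobian(y, event_ndims=0)  # ==> -2.
chain.forward_log_det_jacobian(x, event_ndims=0)  # ==>  2.
```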

<h3 id="with_name_scope"><code>with_name_scope</code></h3>

```python
@classmethod
with_name_scope(
    cls, method
)
```

Decorator to automatically enter the module name scope.

```python
class MyModule(tf.Module):
  @tf.Module.with_name_scope
  def __call__(self, x):
    if not hasattr(self, 'w'):
      self.w = tf.Variable(tf.random.normal([x.shape[1], 64]))
    return tf.matmul(x, self.w)
```

Using the above module would produce `tf.Variable`s and `tf.Tensor`s whose names include the module name:

```python
mod = MyModule()
mod(tf.ones([8, 32]))
# ==> <tf.Tensor: ...>
mod.w
# ==> <tf.Variable ...'my_module/w:0'>
```

#### Args:

* <b>`method`</b>: The method to wrap.

#### Returns:

The original method wrapped such that it enters the module's name scope.