
tfp.experimental.substrates.jax.bijectors.Reshape

View source on GitHub

Class Reshape

Reshapes the event_shape of a Tensor.

Inherits From: Bijector

The semantics generally follow that of tf.reshape(), with a few differences:

  • The user must provide both the input and output shape, so that the transformation can be inverted. If an input shape is not specified, the default assumes a vector-shaped input, i.e., event_shape_in = (-1,).
  • The Reshape bijector automatically broadcasts over the leftmost dimensions of its input (sample_shape and batch_shape); only the rightmost event_ndims_in dimensions are reshaped. The number of dimensions to reshape is inferred from the provided event_shape_in (event_ndims_in = len(event_shape_in)).

Example usage:

r = tfp.bijectors.Reshape(event_shape_out=[1, -1])

r.forward([3., 4.])    # shape [2]
# ==> [[3., 4.]]       # shape [1, 2]

r.forward([[1., 2.], [3., 4.]])  # shape [2, 2]
# ==> [[[1., 2.]],
#      [[3., 4.]]]   # shape [2, 1, 2]

r.inverse([[3., 4.]])  # shape [1,2]
# ==> [3., 4.]         # shape [2]

r.forward_log_det_jacobian(any_value, event_ndims=1)
# ==> 0.

r.inverse_log_det_jacobian(any_value, event_ndims=2)
# ==> 0.

[1] When event_shape_in/event_shape_out are themselves only known at graph-execution time (e.g., fed via placeholders), forward_event_shape can return only a partially defined static shape. The policy used in that case is exemplified in the following snippet:

bijector = tfp.bijectors.Reshape(
  event_shape_out=tf.placeholder(dtype=tf.int32, shape=[1]),
  event_shape_in=tf.placeholder(dtype=tf.int32, shape=[3]),
  validate_args=True)

bijector.forward_event_shape(tf.TensorShape([5, 2, 3, 7]))
# Chosen policy    ==> (5, None)
# Alternate policy ==> (5, 42)

In the chosen policy, since we don't know what event_shape_in/out are at the time of the call to forward_event_shape, we simply fill in everything we do know, which is that the last three dims will be replaced with "something".

In the alternate policy, we would assume that the intention must be to reshape [5, 2, 3, 7] such that the last three dims collapse to one, which is only possible if the resulting shape is [5, 42].

Note that the above is the only case in which we could do such inference; if the output shape has more than 1 dim, we can't infer anything. E.g., we would have

bijector = tfp.bijectors.Reshape(
  event_shape_out=tf.placeholder(dtype=tf.int32, shape=[2]),
  event_shape_in=tf.placeholder(dtype=tf.int32, shape=[3]),
  validate_args=True)

bijector.forward_event_shape(tf.TensorShape([5, 2, 3, 7]))
# Either policy ==> (5, None, None)

__init__

View source

__init__(
    event_shape_out,
    event_shape_in=(-1,),
    validate_args=False,
    name=None
)

Creates a Reshape bijector.

Args:

  • event_shape_out: An int-like vector-shaped Tensor representing the event shape of the transformed output.
  • event_shape_in: An optional int-like vector-shaped Tensor representing the event shape of the input. This is required in order to define inverse operations; the default of (-1,) assumes a vector-shaped input.
  • validate_args: Python bool indicating whether arguments should be checked for correctness.
  • name: Python str, name given to ops managed by this object.

Raises:

  • TypeError: if either event_shape_in or event_shape_out has non-integer dtype.
  • ValueError: if either of event_shape_in or event_shape_out has non-vector shape (rank > 1), or if their sizes do not match.
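As a minimal sketch of constructing and using the bijector (this assumes the experimental JAX substrate can be imported under the module path in this page's title; the [6] and [2, 3] event shapes are purely illustrative):

import jax.numpy as jnp
import tensorflow_probability as tfp

tfb = tfp.experimental.substrates.jax.bijectors  # module path taken from this page's title

# Map a length-6 event vector to a [2, 3] event matrix (and back).
r = tfb.Reshape(event_shape_out=[2, 3], event_shape_in=[6], validate_args=True)

y = r.forward(jnp.arange(6.))   # shape [2, 3]
x = r.inverse(y)                # shape [6]; recovers the original vector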

Properties

dtype

dtype of Tensors transformable by this bijector.

forward_min_event_ndims

Returns the minimal number of dimensions bijector.forward operates on.

graph_parents

Returns this Bijector's graph_parents as a Python list.

inverse_min_event_ndims

Returns the minimal number of dimensions bijector.inverse operates on.

is_constant_jacobian

Returns true iff the Jacobian matrix is not a function of x.

Returns:

  • is_constant_jacobian: Python bool.

name

Returns the string name of this Bijector.

trainable_variables

validate_args

Returns True if Tensor arguments will be validated.

variables

Methods

__call__

View source

__call__(
    value,
    name=None,
    **kwargs
)

Applies or composes the Bijector, depending on input type.

This is a convenience function which applies the Bijector instance in three different ways, depending on the input:

  1. If the input is a tfd.Distribution instance, return tfd.TransformedDistribution(distribution=input, bijector=self).
  2. If the input is a tfb.Bijector instance, return tfb.Chain([self, input]).
  3. Otherwise, return self.forward(input).

Args:

  • value: A tfd.Distribution, tfb.Bijector, or a Tensor.
  • name: Python str name given to ops created by this function.
  • **kwargs: Additional keyword arguments passed into the created tfd.TransformedDistribution, tfb.Bijector, or self.forward.

Returns:

  • composition: A tfd.TransformedDistribution if the input was a tfd.Distribution, a tfb.Chain if the input was a tfb.Bijector, or a Tensor computed by self.forward.

Examples

sigmoid = tfb.Reciprocal()(
    tfb.AffineScalar(shift=1.)(
      tfb.Exp()(
        tfb.AffineScalar(scale=-1.))))
# ==> `tfb.Chain([
#         tfb.Reciprocal(),
#         tfb.AffineScalar(shift=1.),
#         tfb.Exp(),
#         tfb.AffineScalar(scale=-1.),
#      ])`  # ie, `tfb.Sigmoid()`

log_normal = tfb.Exp()(tfd.Normal(0, 1))
# ==> `tfd.TransformedDistribution(tfd.Normal(0, 1), tfb.Exp())`

tfb.Exp()([-1., 0., 1.])
# ==> tf.exp([-1., 0., 1.])
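A Reshape-specific sketch of the same dispatch rules (assuming the tfb/tfd aliases used above, i.e. TFP's bijectors and distributions modules; the shapes and the MultivariateNormalDiag base distribution are only illustrative):

reshape = tfb.Reshape(event_shape_out=[2, 2], event_shape_in=[4])

reshape(tfd.MultivariateNormalDiag(loc=[0., 0., 0., 0.]))
# ==> a tfd.TransformedDistribution with event shape [2, 2]

reshape(tfb.Exp())
# ==> tfb.Chain([reshape, tfb.Exp()])

reshape([1., 2., 3., 4.])
# ==> [[1., 2.], [3., 4.]]  (i.e., reshape.forward(...))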

forward

View source

forward(
    x,
    name='forward',
    **kwargs
)

Returns the forward Bijector evaluation, i.e., Y = g(X).

Args:

  • x: Tensor. The input to the 'forward' evaluation.
  • name: The name to give this op.
  • **kwargs: Named arguments forwarded to subclass implementation.

Returns:

Tensor.

Raises:

  • TypeError: if self.dtype is specified and x.dtype is not self.dtype.
  • NotImplementedError: if _forward is not implemented.

forward_event_shape

View source

forward_event_shape(input_shape)

Shape of a single sample from a single batch as a TensorShape.

Same meaning as forward_event_shape_tensor. May be only partially defined.

Args:

  • input_shape: TensorShape indicating event-portion shape passed into forward function.

Returns:

  • forward_event_shape: TensorShape indicating event-portion shape after applying forward. Possibly unknown.
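For example (a sketch; r is a hypothetical bijector with fully static event shapes, constructed as in the class-level example above):

r = tfp.bijectors.Reshape(event_shape_out=[2, 3], event_shape_in=[6])

r.forward_event_shape(tf.TensorShape([6]))
# ==> TensorShape([2, 3])

r.forward_event_shape(tf.TensorShape([5, 6]))
# ==> TensorShape([5, 2, 3])  (leading sample/batch dims pass through)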

forward_event_shape_tensor

View source

forward_event_shape_tensor(
    input_shape,
    name='forward_event_shape_tensor'
)

Shape of a single sample from a single batch as an int32 1D Tensor.

Args:

  • input_shape: Tensor, int32 vector indicating event-portion shape passed into forward function.
  • name: Name to give to the op.

Returns:

  • forward_event_shape_tensor: Tensor, int32 vector indicating event-portion shape after applying forward.
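A sketch of the dynamic-shape counterpart, with the same hypothetical bijector:

r = tfp.bijectors.Reshape(event_shape_out=[2, 3], event_shape_in=[6])

r.forward_event_shape_tensor([5, 6])
# ==> int32 Tensor: [5, 2, 3]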

forward_log_det_jacobian

View source

forward_log_det_jacobian(
    x,
    event_ndims,
    name='forward_log_det_jacobian',
    **kwargs
)

Returns the forward_log_det_jacobian, i.e., log(det(dY/dX))(X). (Recall that: Y = g(X).)

Args:

  • x: Tensor. The input to the 'forward' Jacobian determinant evaluation.
  • event_ndims: Number of dimensions in the probabilistic events being transformed. Must be greater than or equal to self.forward_min_event_ndims. The result is summed over the final dimensions to produce a scalar Jacobian determinant for each event, i.e. it has shape rank(x) - event_ndims dimensions.
  • name: The name to give this op.
  • **kwargs: Named arguments forwarded to subclass implementation.

Returns:

Tensor, if this bijector is injective. If not injective, this is not implemented.

Raises:

  • TypeError: if self.dtype is specified and x.dtype is not self.dtype.
  • NotImplementedError: if neither _forward_log_det_jacobian nor {_inverse, _inverse_log_det_jacobian} are implemented, or this is a non-injective bijector.
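Because reshaping is volume preserving, Reshape's forward log det Jacobian is identically zero. A sketch (same hypothetical bijector; its event_shape_in has one dimension, so event_ndims=1 is the minimum allowed):

r = tfp.bijectors.Reshape(event_shape_out=[2, 3], event_shape_in=[6])

r.forward_log_det_jacobian([1., 2., 3., 4., 5., 6.], event_ndims=1)
# ==> 0.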

inverse

View source

inverse(
    y,
    name='inverse',
    **kwargs
)

Returns the inverse Bijector evaluation, i.e., X = g^{-1}(Y).

Args:

  • y: Tensor. The input to the 'inverse' evaluation.
  • name: The name to give this op.
  • **kwargs: Named arguments forwarded to subclass implementation.

Returns:

Tensor, if this bijector is injective. If not injective, returns the k-tuple containing the unique k points (x1, ..., xk) such that g(xi) = y.

Raises:

  • TypeError: if self.dtype is specified and y.dtype is not self.dtype.
  • NotImplementedError: if _inverse is not implemented.

inverse_event_shape

View source

inverse_event_shape(output_shape)

Shape of a single sample from a single batch as a TensorShape.

Same meaning as inverse_event_shape_tensor. May be only partially defined.

Args:

  • output_shape: TensorShape indicating event-portion shape passed into inverse function.

Returns:

  • inverse_event_shape: TensorShape indicating event-portion shape after applying inverse. Possibly unknown.

inverse_event_shape_tensor

View source

inverse_event_shape_tensor(
    output_shape,
    name='inverse_event_shape_tensor'
)

Shape of a single sample from a single batch as an int32 1D Tensor.

Args:

  • output_shape: Tensor, int32 vector indicating event-portion shape passed into inverse function.
  • name: Name to give to the op.

Returns:

  • inverse_event_shape_tensor: Tensor, int32 vector indicating event-portion shape after applying inverse.

inverse_log_det_jacobian

View source

inverse_log_det_jacobian(
    y,
    event_ndims,
    name='inverse_log_det_jacobian',
    **kwargs
)

Returns the (log o det o Jacobian o inverse)(y).

Mathematically, returns: log(det(dX/dY))(Y). (Recall that: X=g^{-1}(Y).)

Note that forward_log_det_jacobian is the negative of this function, evaluated at g^{-1}(y).

Args:

  • y: Tensor. The input to the 'inverse' Jacobian determinant evaluation.
  • event_ndims: Number of dimensions in the probabilistic events being transformed. Must be greater than or equal to self.inverse_min_event_ndims. The result is summed over the final dimensions to produce a scalar Jacobian determinant for each event, i.e. it has shape rank(y) - event_ndims dimensions.
  • name: The name to give this op.
  • **kwargs: Named arguments forwarded to subclass implementation.

Returns:

  • ildj: Tensor, if this bijector is injective. If not injective, returns the tuple of local log det Jacobians, log(det(Dg_i^{-1}(y))), where g_i is the restriction of g to the ith partition Di.

Raises:

  • TypeError: if self.dtype is specified and y.dtype is not self.dtype.
  • NotImplementedError: if _inverse_log_det_jacobian is not implemented.
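For Reshape this is likewise identically zero. A sketch (same hypothetical bijector; its event_shape_out has two dimensions, so event_ndims=2 is the minimum allowed):

r = tfp.bijectors.Reshape(event_shape_out=[2, 3], event_shape_in=[6])

r.inverse_log_det_jacobian([[1., 2., 3.], [4., 5., 6.]], event_ndims=2)
# ==> 0.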