tfp.bijectors.Softfloor

Class Softfloor

Compute a differentiable approximation to tf.math.floor.

Inherits From: Bijector

Given x, compute a differentiable approximation to tf.math.floor(x). It is parameterized by a temperature parameter t to control the closeness of the approximation at the cost of numerical stability of the inverse.

This Bijector has the following properties:

  • This Bijector is a map from R to R.
  • For t close to 0, this bijector converges pointwise to tf.math.floor (except at integer points).
  • For t approaching infinity, this bijector tends pointwise to the identity function, shifted down by 0.5.

Note that for lower temperatures t, this bijector becomes more numerically unstable. In particular, the inverse for this bijector is not numerically stable at lower temperatures, because flooring is not a bijective function (and hence any pointwise limit towards the floor function will start to have a numerically unstable inverse).
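
To see this concretely, here is a minimal sketch (assuming the standard tfb = tfp.bijectors alias) that round-trips a few points through forward and inverse at two temperatures:

import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

x = tf.constant([2.1, 3.2, 5.5])

# Moderate temperature: the round trip recovers x accurately.
soft = tfb.Softfloor(temperature=1.)
print(soft.inverse(soft.forward(x)))  # ~[2.1, 3.2, 5.5]

# Low temperature: forward(x) is nearly integer-valued, so the inverse
# must undo an almost-flat sigmoid and can lose precision or overflow.
cold = tfb.Softfloor(temperature=0.01)
print(cold.inverse(cold.forward(x)))  # may deviate noticeably from x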

Mathematical details

Let x be in [0.5, 1.5]. We would like to simulate the floor function on this interval. We will do this via a shifted and rescaled sigmoid.

floor(x) = 0 for x < 1 and floor(x) = 1 for x >= 1. Take f(x) = sigmoid((x - 1.) / t) with t > 0. As t goes to zero, f(x) tends to 1 for x > 1 and to 0 for x < 1, giving a function that looks like the floor function on this interval. Shifting f(x) by -sigmoid(-0.5 / t) and rescaling by 1 / (sigmoid(0.5 / t) - sigmoid(-0.5 / t)) preserves this pointwise limit while fixing f(0.5) = 0 and f(1.5) = 1.

Thus we can define

  softfloor(x, t) = a * sigmoid((x - 1.) / t) + b

where

  • a = 1 / (sigmoid(0.5 / t) - sigmoid(-0.5 / t))
  • b = -sigmoid(-0.5 / t) / (sigmoid(0.5 / t) - sigmoid(-0.5 / t))

The implementation of the Softfloor bijector follows this construction, extending the function to the whole real line by applying an appropriately shifted copy of it on each unit interval between integers.
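
To make this concrete, here is a minimal NumPy sketch of the formulas above on the base interval [0.5, 1.5] (the actual bijector additionally applies the per-integer shifts):

import numpy as np

def sigmoid(z):
  return 1. / (1. + np.exp(-z))

def softfloor(x, t):
  # Shifted and rescaled sigmoid, per the formulas above.
  scale = sigmoid(0.5 / t) - sigmoid(-0.5 / t)
  a = 1. / scale
  b = -sigmoid(-0.5 / t) / scale
  return a * sigmoid((x - 1.) / t) + b

print(softfloor(0.5, 1.))    # 0.0: pinned left endpoint
print(softfloor(1.5, 1.))    # 1.0: pinned right endpoint
print(softfloor(0.9, 0.01))  # ~0.0: low temperature approaches floor
print(softfloor(1.1, 0.01))  # ~1.0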

Examples

Example use:

import tensorflow_probability as tfp
tfb = tfp.bijectors

# High temperature.
soft_floor = tfb.Softfloor(temperature=100.)
x = [2.1, 3.2, 5.5]
soft_floor.forward(x)

# Low temperature. This acts like a floor.
soft_floor = tfb.Softfloor(temperature=0.01)
soft_floor.forward(x)  # Should be close to [2., 3., 5.]

# Ceiling is just a shifted floor at non-integer points.
soft_ceiling = tfb.Chain(
  [tfb.AffineScalar(1.),
   tfb.Softfloor(temperature=1.)])
soft_ceiling.forward(x)  # Should be close to [3., 4., 6.]

__init__

__init__(
    temperature,
    validate_args=False,
    name='softfloor'
)

Constructs Bijector.

A Bijector transforms random variables into new random variables.

Examples:

# Create the Y = g(X) = X transform.
identity = Identity()

# Create the Y = g(X) = exp(X) transform.
exp = Exp()

See Bijector subclass docstring for more details and specific examples.
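
For instance, the Softfloor bijector itself can be used to transform a base distribution (a minimal sketch, assuming the usual tfd = tfp.distributions and tfb = tfp.bijectors aliases):

import tensorflow_probability as tfp

tfd = tfp.distributions
tfb = tfp.bijectors

# Y = softfloor(X, t), where X ~ Normal(0, 1).
soft_floored_normal = tfd.TransformedDistribution(
    distribution=tfd.Normal(loc=0., scale=1.),
    bijector=tfb.Softfloor(temperature=0.5))
samples = soft_floored_normal.sample(4)  # Smoothly "floored" samples.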

Args:

  • temperature: Tensor-like float. The temperature t controlling the closeness of the approximation to tf.math.floor; lower values give a closer (but less numerically stable) approximation.
  • graph_parents: Python list of graph prerequisites of this Bijector.
  • is_constant_jacobian: Python bool indicating that the Jacobian matrix is not a function of the input.
  • validate_args: Python bool, default False. Whether to validate input with asserts. If validate_args is False, and the inputs are invalid, correct behavior is not guaranteed.
  • dtype: tf.dtype supported by this Bijector. None means dtype is not enforced.
  • forward_min_event_ndims: Python integer indicating the minimum number of dimensions forward operates on.
  • inverse_min_event_ndims: Python integer indicating the minimum number of dimensions inverse operates on. Will be set to forward_min_event_ndims by default, if no value is provided.
  • name: The name to give Ops created by the initializer.

Raises:

  • ValueError: If neither forward_min_event_ndims nor inverse_min_event_ndims is specified, or if either of them is negative.
  • ValueError: If a member of graph_parents is not a Tensor.

Properties

dtype

dtype of Tensors transformable by this bijector.

forward_min_event_ndims

Returns the minimal number of dimensions bijector.forward operates on.

graph_parents

Returns this Bijector's graph_parents as a Python list.

inverse_min_event_ndims

Returns the minimal number of dimensions bijector.inverse operates on.

is_constant_jacobian

Returns true iff the Jacobian matrix is not a function of x.

Returns:

  • is_constant_jacobian: Python bool.

name

Returns the string name of this Bijector.

name_scope

Returns a tf.name_scope instance for this class.

submodules

Sequence of all sub-modules.

Submodules are modules which are properties of this module, or found as properties of modules which are properties of this module (and so on).

a = tf.Module()
b = tf.Module()
c = tf.Module()
a.b = b
b.c = c
assert list(a.submodules) == [b, c]
assert list(b.submodules) == [c]
assert list(c.submodules) == []

Returns:

A sequence of all submodules.

temperature

The temperature t controlling the closeness of the floor approximation.

trainable_variables

Sequence of variables owned by this module and its submodules.

Returns:

A sequence of variables for the current module (sorted by attribute name) followed by variables from all submodules recursively (breadth first).
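
For example, a Softfloor whose temperature is a tf.Variable may report that variable here (a sketch; whether the variable is tracked, rather than eagerly converted to a Tensor, depends on the TFP version):

import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

t = tf.Variable(1., name='temperature')
soft = tfb.Softfloor(temperature=t)
# On versions that track constructor arguments this prints a tuple
# containing the 'temperature' variable; otherwise it is empty.
print(soft.trainable_variables)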

validate_args

Returns True if Tensor arguments will be validated.

variables

Sequence of variables owned by this module and its submodules.

Returns:

A sequence of variables for the current module (sorted by attribute name) followed by variables from all submodules recursively (breadth first).

Methods

__call__

__call__(
    value,
    name=None,
    **kwargs
)

Applies or composes the Bijector, depending on input type.

This is a convenience function which applies the Bijector instance in three different ways, depending on the input:

  1. If the input is a tfd.Distribution instance, return tfd.TransformedDistribution(distribution=input, bijector=self).
  2. If the input is a tfb.Bijector instance, return tfb.Chain([self, input]).
  3. Otherwise, return self.forward(input).

Args:

  • value: A tfd.Distribution, tfb.Bijector, or a Tensor.
  • name: Python str name given to ops created by this function.
  • **kwargs: Additional keyword arguments passed into the created tfd.TransformedDistribution, tfb.Bijector, or self.forward.

Returns:

  • composition: A tfd.TransformedDistribution if the input was a tfd.Distribution, a tfb.Chain if the input was a tfb.Bijector, or a Tensor computed by self.forward.

Examples

sigmoid = tfb.Reciprocal()(
    tfb.AffineScalar(shift=1.)(
      tfb.Exp()(
        tfb.AffineScalar(scale=-1.))))
# ==> `tfb.Chain([
#         tfb.Reciprocal(),
#         tfb.AffineScalar(shift=1.),
#         tfb.Exp(),
#         tfb.AffineScalar(scale=-1.),
#      ])`  # ie, `tfb.Sigmoid()`

log_normal = tfb.Exp()(tfd.Normal(0, 1))
# ==> `tfd.TransformedDistribution(tfd.Normal(0, 1), tfb.Exp())`

tfb.Exp()([-1., 0., 1.])
# ==> tf.exp([-1., 0., 1.])

forward

forward(
    x,
    name='forward',
    **kwargs
)

Returns the forward Bijector evaluation, i.e., Y = g(X).

Args:

  • x: Tensor. The input to the 'forward' evaluation.
  • name: The name to give this op.
  • **kwargs: Named arguments forwarded to subclass implementation.

Returns:

Tensor.

Raises:

  • TypeError: if self.dtype is specified and x.dtype is not self.dtype.
  • NotImplementedError: if _forward is not implemented.

forward_event_shape

forward_event_shape(input_shape)

Shape of a single sample from a single batch as a TensorShape.

Same meaning as forward_event_shape_tensor. May be only partially defined.

Args:

  • input_shape: TensorShape indicating event-portion shape passed into forward function.

Returns:

  • forward_event_shape_tensor: TensorShape indicating event-portion shape after applying forward. Possibly unknown.
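
Because Softfloor acts elementwise on scalars (forward_min_event_ndims is 0), the event shape passes through unchanged. A small sketch:

import tensorflow_probability as tfp

tfb = tfp.bijectors

soft = tfb.Softfloor(temperature=1.)
print(soft.forward_event_shape([3]))  # TensorShape([3]): shape is unchanged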

forward_event_shape_tensor

forward_event_shape_tensor(
    input_shape,
    name='forward_event_shape_tensor'
)

Shape of a single sample from a single batch as an int32 1D Tensor.

Args:

  • input_shape: Tensor, int32 vector indicating event-portion shape passed into forward function.
  • name: The name to give this op.

Returns:

  • forward_event_shape_tensor: Tensor, int32 vector indicating event-portion shape after applying forward.

forward_log_det_jacobian

forward_log_det_jacobian(
    x,
    event_ndims,
    name='forward_log_det_jacobian',
    **kwargs
)

Returns the forward log det Jacobian, i.e., log(det(dY/dX))(X). (Recall that Y = g(X).)

Args:

  • x: Tensor. The input to the 'forward' Jacobian determinant evaluation.
  • event_ndims: Number of dimensions in the probabilistic events being transformed. Must be greater than or equal to self.forward_min_event_ndims. The result is summed over the final dimensions to produce a scalar Jacobian determinant for each event, i.e. the result has rank(x) - event_ndims dimensions (see the sketch below).
  • name: The name to give this op.
  • **kwargs: Named arguments forwarded to subclass implementation.

Returns:

Tensor, if this bijector is injective. If not injective, this is not implemented.

Raises:

  • TypeError: if self.dtype is specified and x.dtype is not self.dtype.
  • NotImplementedError: if neither _forward_log_det_jacobian nor {_inverse, _inverse_log_det_jacobian} are implemented, or this is a non-injective bijector.
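
The event_ndims reduction can be seen directly with this scalar bijector (a small sketch):

import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

soft = tfb.Softfloor(temperature=1.)
x = tf.constant([[2.1, 3.2, 5.5]])  # shape [1, 3]

# event_ndims=0: one log det Jacobian term per element; result shape [1, 3].
print(soft.forward_log_det_jacobian(x, event_ndims=0))

# event_ndims=1: terms summed over the last dimension; result shape [1].
print(soft.forward_log_det_jacobian(x, event_ndims=1))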

inverse

inverse(
    y,
    name='inverse',
    **kwargs
)

Returns the inverse Bijector evaluation, i.e., X = g^{-1}(Y).

Args:

  • y: Tensor. The input to the 'inverse' evaluation.
  • name: The name to give this op.
  • **kwargs: Named arguments forwarded to subclass implementation.

Returns:

Tensor, if this bijector is injective. If not injective, returns the k-tuple containing the unique k points (x1, ..., xk) such that g(xi) = y.

Raises:

  • TypeError: if self.dtype is specified and y.dtype is not self.dtype.
  • NotImplementedError: if _inverse is not implemented.

inverse_event_shape

inverse_event_shape(output_shape)

Shape of a single sample from a single batch as a TensorShape.

Same meaning as inverse_event_shape_tensor. May be only partially defined.

Args:

  • output_shape: TensorShape indicating event-portion shape passed into inverse function.

Returns:

  • inverse_event_shape_tensor: TensorShape indicating event-portion shape after applying inverse. Possibly unknown.

inverse_event_shape_tensor

inverse_event_shape_tensor(
    output_shape,
    name='inverse_event_shape_tensor'
)

Shape of a single sample from a single batch as an int32 1D Tensor.

Args:

  • output_shape: Tensor, int32 vector indicating event-portion shape passed into inverse function.
  • name: The name to give this op.

Returns:

  • inverse_event_shape_tensor: Tensor, int32 vector indicating event-portion shape after applying inverse.

inverse_log_det_jacobian

inverse_log_det_jacobian(
    y,
    event_ndims,
    name='inverse_log_det_jacobian',
    **kwargs
)

Returns the (log o det o Jacobian o inverse)(y).

Mathematically, returns: log(det(dX/dY))(Y). (Recall that: X=g^{-1}(Y).)

Note that forward_log_det_jacobian is the negative of this function, evaluated at g^{-1}(y).

Args:

  • y: Tensor. The input to the 'inverse' Jacobian determinant evaluation.
  • event_ndims: Number of dimensions in the probabilistic events being transformed. Must be greater than or equal to self.inverse_min_event_ndims. The result is summed over the final dimensions to produce a scalar Jacobian determinant for each event, i.e. the result has rank(y) - event_ndims dimensions (see the sketch below).
  • name: The name to give this op.
  • **kwargs: Named arguments forwarded to subclass implementation.

Returns:

  • ildj: Tensor, if this bijector is injective. If not injective, returns the tuple of local log det Jacobians, log(det(Dg_i^{-1}(y))), where g_i is the restriction of g to the ith partition Di.

Raises:

  • TypeError: if self.dtype is specified and y.dtype is not self.dtype.
  • NotImplementedError: if _inverse_log_det_jacobian is not implemented.
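
The identity noted above, that forward_log_det_jacobian is the negative of this function at g^{-1}(y), can be checked numerically (a small sketch):

import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

soft = tfb.Softfloor(temperature=1.)
y = tf.constant([2.3, 4.7])
x = soft.inverse(y)

ildj = soft.inverse_log_det_jacobian(y, event_ndims=0)
fldj = soft.forward_log_det_jacobian(x, event_ndims=0)
print(ildj + fldj)  # ~[0., 0.]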

with_name_scope

with_name_scope(
    cls,
    method
)

Decorator to automatically enter the module name scope.

class MyModule(tf.Module):
  @tf.Module.with_name_scope
  def __call__(self, x):
    if not hasattr(self, 'w'):
      self.w = tf.Variable(tf.random.normal([x.shape[1], 64]))
    return tf.matmul(x, self.w)

Using the above module would produce tf.Variables and tf.Tensors whose names include the module name:

mod = MyModule()
mod(tf.ones([8, 32]))
# ==> <tf.Tensor: ...>
mod.w
# ==> <tf.Variable ...'my_module/w:0'>

Args:

  • method: The method to wrap.

Returns:

The original method wrapped such that it enters the module's name scope.