Class Softfloor

Compute a differentiable approximation to `tf.math.floor`.

Inherits From: `Bijector`

Given `x`, compute a differentiable approximation to `tf.math.floor(x)`. It is parameterized by a temperature parameter `t` to control the closeness of the approximation at the cost of numerical stability of the inverse.
This `Bijector` has the following properties:

* This `Bijector` is a map from `R` to `R`.
* For `t` approaching `0`, this bijector converges pointwise to `tf.math.floor` (except at integer points).
* For `t` approaching infinity, this bijector converges pointwise to the identity map shifted down by `0.5` (i.e. `x - 0.5`), so the approximation to the floor becomes loose.
Note that for lower temperatures `t`, this bijector becomes more numerically unstable. In particular, the inverse for this bijector is not numerically stable at lower temperatures, because flooring is not a bijective function (and hence any pointwise limit towards the floor function will start to have a non-numerically stable inverse).
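
For example, the trade-off can be seen by comparing the forward output to `tf.math.floor` at several temperatures. This is a minimal sketch, assuming TensorFlow Probability is installed and eager execution (TF 2.x) is enabled:

import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

x = tf.constant([2.1, 3.2, 5.5])
for t in [10., 1., 0.1, 0.01]:
  soft_floor = tfb.Softfloor(temperature=t)
  # As the temperature shrinks, forward(x) moves closer to tf.math.floor(x),
  # while inverting the result requires evaluating a logit near 0 or 1.
  print(t, tf.reduce_max(tf.abs(soft_floor.forward(x) - tf.math.floor(x))).numpy())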
Mathematical details

Let `x` be in `[0.5, 1.5]`. We would like to simulate the floor function on this interval. We will do this via a shifted and rescaled `sigmoid`:

`floor(x) = 0` for `x < 1` and `floor(x) = 1` for `x >= 1`.
If we take `f(x) = sigmoid((x - 1.) / t)`, where `t > 0`, then as `t` goes to zero, `f(x)` tends towards `1` when `x > 1` and towards `0` when `x < 1`, giving us a function that looks like the floor function. If we shift `f(x)` by `-sigmoid(-0.5 / t)` and rescale by `1 / (sigmoid(0.5 / t) - sigmoid(-0.5 / t))`, we preserve the pointwise limit but also fix `f(0.5) = 0.` and `f(1.5) = 1.`.
Thus we can define `softfloor(x, t) = a * sigmoid((x - 1.) / t) + b`, where:

* `a = 1 / (sigmoid(0.5 / t) - sigmoid(-0.5 / t))`
* `b = -sigmoid(-0.5 / t) / (sigmoid(0.5 / t) - sigmoid(-0.5 / t))`

The implementation of the `Softfloor` bijector follows this, with the caveat that we extend the function to all of the real line by appropriately shifting this function for each integer.
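
A direct transcription of these formulas in plain NumPy (a sketch for illustration, not the library implementation) makes the endpoint conditions easy to check:

import numpy as np

def sigmoid(z):
  return 1. / (1. + np.exp(-z))

def softfloor_unit(x, t):
  # softfloor on the reference interval [0.5, 1.5], as defined above.
  a = 1. / (sigmoid(0.5 / t) - sigmoid(-0.5 / t))
  b = -sigmoid(-0.5 / t) * a
  return a * sigmoid((x - 1.) / t) + b

t = 0.05
print(softfloor_unit(0.5, t))  # 0.0: the shift pins the left endpoint.
print(softfloor_unit(1.5, t))  # 1.0: the rescaling pins the right endpoint.
print(softfloor_unit(np.array([0.8, 0.9, 1.1, 1.2]), t))
# roughly [0.02, 0.12, 0.88, 0.98]: a smoothed step around x = 1.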
Examples

Example use:

# High temperature.
soft_floor = Softfloor(temperature=100.)
x = [2.1, 3.2, 5.5]
soft_floor.forward(x)

# Low temperature. This acts like a floor.
soft_floor = Softfloor(temperature=0.01)
soft_floor.forward(x)  # Should be close to [2., 3., 5.]

# Ceiling is just a shifted floor at non-integer points.
soft_ceiling = tfb.Chain(
    [tfb.AffineScalar(1.),
     tfb.Softfloor(temperature=1.)])
soft_ceiling.forward(x)  # Approaches ceil(x) = [3., 4., 6.] as the temperature decreases.
__init__

__init__(
    temperature,
    validate_args=False,
    name='softfloor'
)

Constructs Bijector.

A `Bijector` transforms random variables into new random variables.

Examples:

# Create the Y = g(X) = X transform.
identity = Identity()

# Create the Y = g(X) = exp(X) transform.
exp = Exp()

See `Bijector` subclass docstring for more details and specific examples.
Args:

* `graph_parents`: Python list of graph prerequisites of this `Bijector`.
* `is_constant_jacobian`: Python `bool` indicating that the Jacobian matrix is not a function of the input.
* `validate_args`: Python `bool`, default `False`. Whether to validate input with asserts. If `validate_args` is `False`, and the inputs are invalid, correct behavior is not guaranteed.
* `dtype`: `tf.dtype` supported by this `Bijector`. `None` means dtype is not enforced.
* `forward_min_event_ndims`: Python `integer` indicating the minimum number of dimensions `forward` operates on.
* `inverse_min_event_ndims`: Python `integer` indicating the minimum number of dimensions `inverse` operates on. Will be set to `forward_min_event_ndims` by default, if no value is provided.
* `name`: The name to give Ops created by the initializer.

Raises:

* `ValueError`: If neither `forward_min_event_ndims` nor `inverse_min_event_ndims` is specified, or if either of them is negative.
* `ValueError`: If a member of `graph_parents` is not a `Tensor`.
Properties

dtype

dtype of `Tensor`s transformable by this distribution.

forward_min_event_ndims

Returns the minimal number of dimensions bijector.forward operates on.

graph_parents

Returns this `Bijector`'s graph_parents as a Python list.

inverse_min_event_ndims

Returns the minimal number of dimensions bijector.inverse operates on.

is_constant_jacobian

Returns true iff the Jacobian matrix is not a function of x.

Returns:

* `is_constant_jacobian`: Python `bool`.

name

Returns the string name of this `Bijector`.

temperature

trainable_variables

validate_args

Returns True if Tensor arguments will be validated.

variables
Methods

__call__

__call__(
    value,
    name=None,
    **kwargs
)

Applies or composes the `Bijector`, depending on input type.

This is a convenience function which applies the `Bijector` instance in three different ways, depending on the input:

- If the input is a `tfd.Distribution` instance, return `tfd.TransformedDistribution(distribution=input, bijector=self)`.
- If the input is a `tfb.Bijector` instance, return `tfb.Chain([self, input])`.
- Otherwise, return `self.forward(input)`.

Args:

* `value`: A `tfd.Distribution`, `tfb.Bijector`, or a `Tensor`.
* `name`: Python `str` name given to ops created by this function.
* `**kwargs`: Additional keyword arguments passed into the created `tfd.TransformedDistribution`, `tfb.Bijector`, or `self.forward`.

Returns:

* `composition`: A `tfd.TransformedDistribution` if the input was a `tfd.Distribution`, a `tfb.Chain` if the input was a `tfb.Bijector`, or a `Tensor` computed by `self.forward`.
Examples

sigmoid = tfb.Reciprocal()(
    tfb.AffineScalar(shift=1.)(
      tfb.Exp()(
        tfb.AffineScalar(scale=-1.))))
# ==> `tfb.Chain([
#        tfb.Reciprocal(),
#        tfb.AffineScalar(shift=1.),
#        tfb.Exp(),
#        tfb.AffineScalar(scale=-1.),
#      ])`  # ie, `tfb.Sigmoid()`

log_normal = tfb.Exp()(tfd.Normal(0, 1))
# ==> `tfd.TransformedDistribution(tfd.Normal(0, 1), tfb.Exp())`

tfb.Exp()([-1., 0., 1.])
# ==> tf.exp([-1., 0., 1.])
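
The same dispatch applies to `Softfloor` itself. A minimal sketch (assuming eager TF 2.x and the usual `tfb`/`tfd` aliases; variable names are illustrative only):

import tensorflow_probability as tfp

tfb = tfp.bijectors
tfd = tfp.distributions

soft_floor = tfb.Softfloor(temperature=0.5)

# Distribution in -> TransformedDistribution out.
smoothed_normal = soft_floor(tfd.Normal(loc=0., scale=1.))

# Bijector in -> Chain([soft_floor, tfb.Exp()]) out.
soft_floor_of_exp = soft_floor(tfb.Exp())

# Tensor-like in -> soft_floor.forward(...) out.
y = soft_floor([1.2, 2.7, 3.5])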
forward

forward(
    x,
    name='forward',
    **kwargs
)

Returns the forward `Bijector` evaluation, i.e., Y = g(X).

Args:

* `x`: `Tensor`. The input to the 'forward' evaluation.
* `name`: The name to give this op.
* `**kwargs`: Named arguments forwarded to subclass implementation.

Returns:

`Tensor`.

Raises:

* `TypeError`: if `self.dtype` is specified and `x.dtype` is not `self.dtype`.
* `NotImplementedError`: if `_forward` is not implemented.
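
Since `Softfloor` is a scalar map from `R` to `R`, `forward` applies elementwise, so batched inputs work without reshaping. A sketch (assuming eager TF 2.x; the expected values are approximate):

import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

soft_floor = tfb.Softfloor(temperature=0.01)
x = tf.constant([[0.3, 1.7], [2.2, 4.9]])
y = soft_floor.forward(x)  # Same shape as x, each entry close to its floor.
print(y.numpy())           # roughly [[0., 1.], [2., 4.]]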
forward_event_shape

forward_event_shape(input_shape)

Shape of a single sample from a single batch as a `TensorShape`.

Same meaning as `forward_event_shape_tensor`. May be only partially defined.

Args:

* `input_shape`: `TensorShape` indicating event-portion shape passed into `forward` function.

Returns:

* `forward_event_shape_tensor`: `TensorShape` indicating event-portion shape after applying `forward`. Possibly unknown.
forward_event_shape_tensor

forward_event_shape_tensor(
    input_shape,
    name='forward_event_shape_tensor'
)

Shape of a single sample from a single batch as an `int32` 1D `Tensor`.

Args:

* `input_shape`: `Tensor`, `int32` vector indicating event-portion shape passed into `forward` function.
* `name`: name to give to the op.

Returns:

* `forward_event_shape_tensor`: `Tensor`, `int32` vector indicating event-portion shape after applying `forward`.
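
Because this bijector acts elementwise, both shape methods return the input shape unchanged. A quick sketch (assuming TF 2.x):

import tensorflow_probability as tfp

tfb = tfp.bijectors

soft_floor = tfb.Softfloor(temperature=1.)
print(soft_floor.forward_event_shape([5]))         # TensorShape([5])
print(soft_floor.forward_event_shape_tensor([5]))  # int32 Tensor [5]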
forward_log_det_jacobian

forward_log_det_jacobian(
    x,
    event_ndims,
    name='forward_log_det_jacobian',
    **kwargs
)

Returns the forward_log_det_jacobian.

Args:

* `x`: `Tensor`. The input to the 'forward' Jacobian determinant evaluation.
* `event_ndims`: Number of dimensions in the probabilistic events being transformed. Must be greater than or equal to `self.forward_min_event_ndims`. The result is summed over the final dimensions to produce a scalar Jacobian determinant for each event, i.e. it has shape `rank(x) - event_ndims` dimensions.
* `name`: The name to give this op.
* `**kwargs`: Named arguments forwarded to subclass implementation.

Returns:

`Tensor`, if this bijector is injective. If not injective this is not implemented.

Raises:

* `TypeError`: if `self.dtype` is specified and `x.dtype` is not `self.dtype`.
* `NotImplementedError`: if neither `_forward_log_det_jacobian` nor {`_inverse`, `_inverse_log_det_jacobian`} are implemented, or this is a non-injective bijector.
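
With `event_ndims=0` the result is the per-element log derivative, which can be cross-checked against autodiff. A sketch (assuming eager TF 2.x):

import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

soft_floor = tfb.Softfloor(temperature=0.5)
x = tf.constant([0.7, 1.3, 2.6])

with tf.GradientTape() as tape:
  tape.watch(x)
  y = soft_floor.forward(x)
dy_dx = tape.gradient(y, x)  # Elementwise derivative of the forward map.

fldj = soft_floor.forward_log_det_jacobian(x, event_ndims=0)
print(tf.reduce_max(tf.abs(fldj - tf.math.log(dy_dx))).numpy())  # ~0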
inverse

inverse(
    y,
    name='inverse',
    **kwargs
)

Returns the inverse `Bijector` evaluation, i.e., X = g^{-1}(Y).

Args:

* `y`: `Tensor`. The input to the 'inverse' evaluation.
* `name`: The name to give this op.
* `**kwargs`: Named arguments forwarded to subclass implementation.

Returns:

`Tensor`, if this bijector is injective. If not injective, returns the k-tuple containing the unique `k` points `(x1, ..., xk)` such that `g(xi) = y`.

Raises:

* `TypeError`: if `self.dtype` is specified and `y.dtype` is not `self.dtype`.
* `NotImplementedError`: if `_inverse` is not implemented.
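
For instance, at a moderate temperature the inverse recovers the original input to within numerical precision; at very low temperatures the same round trip degrades, as noted above. A sketch (assuming eager TF 2.x):

import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

soft_floor = tfb.Softfloor(temperature=1.)
x = tf.constant([0.7, 1.3, 2.6])
x_back = soft_floor.inverse(soft_floor.forward(x))
print(tf.reduce_max(tf.abs(x_back - x)).numpy())  # Small round-trip error.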
inverse_event_shape

inverse_event_shape(output_shape)

Shape of a single sample from a single batch as a `TensorShape`.

Same meaning as `inverse_event_shape_tensor`. May be only partially defined.

Args:

* `output_shape`: `TensorShape` indicating event-portion shape passed into `inverse` function.

Returns:

* `inverse_event_shape_tensor`: `TensorShape` indicating event-portion shape after applying `inverse`. Possibly unknown.
inverse_event_shape_tensor

inverse_event_shape_tensor(
    output_shape,
    name='inverse_event_shape_tensor'
)

Shape of a single sample from a single batch as an `int32` 1D `Tensor`.

Args:

* `output_shape`: `Tensor`, `int32` vector indicating event-portion shape passed into `inverse` function.
* `name`: name to give to the op.

Returns:

* `inverse_event_shape_tensor`: `Tensor`, `int32` vector indicating event-portion shape after applying `inverse`.
inverse_log_det_jacobian

inverse_log_det_jacobian(
    y,
    event_ndims,
    name='inverse_log_det_jacobian',
    **kwargs
)

Returns the (log o det o Jacobian o inverse)(y).

Mathematically, returns: `log(det(dX/dY))(Y)`. (Recall that: `X = g^{-1}(Y)`.)

Note that `forward_log_det_jacobian` is the negative of this function, evaluated at `g^{-1}(y)`.

Args:

* `y`: `Tensor`. The input to the 'inverse' Jacobian determinant evaluation.
* `event_ndims`: Number of dimensions in the probabilistic events being transformed. Must be greater than or equal to `self.inverse_min_event_ndims`. The result is summed over the final dimensions to produce a scalar Jacobian determinant for each event, i.e. it has shape `rank(y) - event_ndims` dimensions.
* `name`: The name to give this op.
* `**kwargs`: Named arguments forwarded to subclass implementation.

Returns:

* `ildj`: `Tensor`, if this bijector is injective. If not injective, returns the tuple of local log det Jacobians, `log(det(Dg_i^{-1}(y)))`, where `g_i` is the restriction of `g` to the `ith` partition `Di`.

Raises:

* `TypeError`: if `self.dtype` is specified and `y.dtype` is not `self.dtype`.
* `NotImplementedError`: if `_inverse_log_det_jacobian` is not implemented.
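
The stated relationship to `forward_log_det_jacobian` can be checked directly. A sketch (assuming eager TF 2.x):

import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

soft_floor = tfb.Softfloor(temperature=0.7)
y = tf.constant([0.4, 1.6, 3.1])

ildj = soft_floor.inverse_log_det_jacobian(y, event_ndims=0)
fldj = soft_floor.forward_log_det_jacobian(soft_floor.inverse(y), event_ndims=0)
print(tf.reduce_max(tf.abs(ildj + fldj)).numpy())  # ~0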