Class Reshape

Reshapes the event_shape of a Tensor.

Inherits From: Bijector
The semantics generally follow those of tf.reshape(), with a few differences:
- The user must provide both the input and output shape, so that the transformation can be inverted. If an input shape is not specified, the default assumes a vector-shaped input, i.e., event_shape_in = (-1,).
- The Reshape bijector automatically broadcasts over the leftmost dimensions of its input (sample_shape and batch_shape); only the rightmost event_ndims_in dimensions are reshaped. The number of dimensions to reshape is inferred from the provided event_shape_in (event_ndims_in = len(event_shape_in)).
Example usage:
r = tfp.bijectors.Reshape(event_shape_out=[1, -1])
r.forward([3., 4.]) # shape [2]
# ==> [[3., 4.]] # shape [1, 2]
r.forward([[1., 2.], [3., 4.]]) # shape [2, 2]
# ==> [[[1., 2.]],
# [[3., 4.]]] # shape [2, 1, 2]
r.inverse([[3., 4.]]) # shape [1,2]
# ==> [3., 4.] # shape [2]
r.forward_log_det_jacobian(any_value)
# ==> 0.
r.inverse_log_det_jacobian(any_value)
# ==> 0.
[1] The case in question (static shape inference when the event shapes are only known at graph-run time) is exemplified in the following snippet:
bijector = tfp.bijectors.Reshape(
    event_shape_out=tf.placeholder(dtype=tf.int32, shape=[1]),
    event_shape_in=tf.placeholder(dtype=tf.int32, shape=[3]),
    validate_args=True)
bijector.forward_event_shape(tf.TensorShape([5, 2, 3, 7]))
# Chosen policy ==> (5, None)
# Alternate policy ==> (5, 42)
In the chosen policy, since we don't know what event_shape_in/out are at the time of the call to forward_event_shape, we simply fill in everything we do know, which is that the last three dims will be replaced with "something".
In the alternate policy, we would assume that the intention must be to reshape [5, 2, 3, 7] such that the last three dims collapse to one, which is only possible if the resulting shape is [5, 42].
Note that the above is the only case in which we could do such inference; if the output shape has more than 1 dim, we can't infer anything. E.g., we would have
bijector = tfp.bijectors.Reshape(
    event_shape_out=tf.placeholder(dtype=tf.int32, shape=[2]),
    event_shape_in=tf.placeholder(dtype=tf.int32, shape=[3]),
    validate_args=True)
bijector.forward_event_shape(tf.TensorShape([5, 2, 3, 7]))
# Either policy ==> (5, None, None)
__init__
__init__(
    event_shape_out,
    event_shape_in=(-1,),
    validate_args=False,
    name=None
)
Creates a Reshape bijector.
Args:

- event_shape_out: An int-like vector-shaped Tensor representing the event shape of the transformed output.
- event_shape_in: An optional int-like vector-shaped Tensor representing the event shape of the input. This is required in order to define inverse operations; the default of (-1,) assumes a vector-shaped input.
- validate_args: Python bool indicating whether arguments should be checked for correctness.
- name: Python str, name given to ops managed by this object.
Raises:

- TypeError: if either event_shape_in or event_shape_out has non-integer dtype.
- ValueError: if either of event_shape_in or event_shape_out has non-vector shape (rank > 1), or if their sizes do not match.
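For example, a minimal construction sketch (the [2, 3] -> [6] event shapes here are illustrative assumptions, not part of the API):

import tensorflow as tf
import tensorflow_probability as tfp

# Reshape [2, 3]-shaped events into length-6 vectors (and back).
r = tfp.bijectors.Reshape(event_shape_out=[6], event_shape_in=[2, 3])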
Properties
dtype
dtype of Tensors transformable by this bijector.
forward_min_event_ndims
Returns the minimal number of dimensions bijector.forward operates on.
graph_parents
Returns this Bijector's graph_parents as a Python list.
inverse_min_event_ndims
Returns the minimal number of dimensions bijector.inverse operates on.
is_constant_jacobian
Returns true iff the Jacobian matrix is not a function of x.
Returns:

- is_constant_jacobian: Python bool.
name
Returns the string name of this Bijector.
trainable_variables
validate_args
Returns True if Tensor arguments will be validated.
variables
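To illustrate several of these properties at once (a sketch; the event shapes below are assumptions chosen for the example), the minimal event ranks follow directly from the event shapes the bijector was constructed with:

import tensorflow_probability as tfp

r = tfp.bijectors.Reshape(event_shape_out=[6], event_shape_in=[2, 3])
r.forward_min_event_ndims   # ==> 2 (len(event_shape_in))
r.inverse_min_event_ndims   # ==> 1 (len(event_shape_out))
r.is_constant_jacobian      # ==> True; reshaping never changes volume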
Methods
__call__
__call__(
    value,
    name=None,
    **kwargs
)
Applies or composes the Bijector, depending on input type.

This is a convenience function which applies the Bijector instance in three different ways, depending on the input:
- If the input is a tfd.Distribution instance, return tfd.TransformedDistribution(distribution=input, bijector=self).
- If the input is a tfb.Bijector instance, return tfb.Chain([self, input]).
- Otherwise, return self.forward(input).
Args:

- value: A tfd.Distribution, tfb.Bijector, or a Tensor.
- name: Python str name given to ops created by this function.
- **kwargs: Additional keyword arguments passed into the created tfd.TransformedDistribution, tfb.Bijector, or self.forward.
Returns:

- composition: A tfd.TransformedDistribution if the input was a tfd.Distribution, a tfb.Chain if the input was a tfb.Bijector, or a Tensor computed by self.forward.
Examples
sigmoid = tfb.Reciprocal()(
    tfb.AffineScalar(shift=1.)(
        tfb.Exp()(
            tfb.AffineScalar(scale=-1.))))
# ==> `tfb.Chain([
# tfb.Reciprocal(),
# tfb.AffineScalar(shift=1.),
# tfb.Exp(),
# tfb.AffineScalar(scale=-1.),
# ])` # ie, `tfb.Sigmoid()`
log_normal = tfb.Exp()(tfd.Normal(0, 1))
# ==> `tfd.TransformedDistribution(tfd.Normal(0, 1), tfb.Exp())`
tfb.Exp()([-1., 0., 1.])
# ==> tf.exp([-1., 0., 1.])
forward
forward(
    x,
    name='forward',
    **kwargs
)
Returns the forward Bijector evaluation, i.e., Y = g(X).
Args:

- x: Tensor. The input to the 'forward' evaluation.
- name: The name to give this op.
- **kwargs: Named arguments forwarded to subclass implementation.
Returns:

Tensor.
Raises:

- TypeError: if self.dtype is specified and x.dtype is not self.dtype.
- NotImplementedError: if _forward is not implemented.
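A brief sketch (the shapes are illustrative assumptions): with event_shape_in=[2, 3] and event_shape_out=[6], forward reshapes only the rightmost event dimensions and broadcasts over the rest.

import tensorflow as tf
import tensorflow_probability as tfp

r = tfp.bijectors.Reshape(event_shape_out=[6], event_shape_in=[2, 3])
x = tf.ones([4, 2, 3])   # a batch of 4 events, each shaped [2, 3]
r.forward(x)             # ==> Tensor of shape [4, 6]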
forward_event_shape
forward_event_shape(input_shape)
Shape of a single sample from a single batch as a TensorShape.

Same meaning as forward_event_shape_tensor. May be only partially defined.
Args:

- input_shape: TensorShape indicating event-portion shape passed into forward function.
Returns:

- forward_event_shape_tensor: TensorShape indicating event-portion shape after applying forward. Possibly unknown.
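For instance (a sketch with fully static event shapes assumed for illustration):

import tensorflow as tf
import tensorflow_probability as tfp

r = tfp.bijectors.Reshape(event_shape_out=[6], event_shape_in=[2, 3])
r.forward_event_shape(tf.TensorShape([2, 3]))  # ==> TensorShape([6])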
forward_event_shape_tensor
forward_event_shape_tensor(
    input_shape,
    name='forward_event_shape_tensor'
)
Shape of a single sample from a single batch as an int32 1D Tensor.
Args:

- input_shape: Tensor, int32 vector indicating event-portion shape passed into forward function.
- name: Name to give to the op.
Returns:

- forward_event_shape_tensor: Tensor, int32 vector indicating event-portion shape after applying forward.
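A minimal sketch of the dynamic-shape variant (the shapes are again illustrative assumptions):

import tensorflow as tf
import tensorflow_probability as tfp

r = tfp.bijectors.Reshape(event_shape_out=[6], event_shape_in=[2, 3])
r.forward_event_shape_tensor([2, 3])  # ==> int32 Tensor [6]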
forward_log_det_jacobian
forward_log_det_jacobian(
    x,
    event_ndims,
    name='forward_log_det_jacobian',
    **kwargs
)
Returns the forward log det Jacobian, i.e., log(det(dY/dX))(X).
Args:

- x: Tensor. The input to the 'forward' Jacobian determinant evaluation.
- event_ndims: Number of dimensions in the probabilistic events being transformed. Must be greater than or equal to self.forward_min_event_ndims. The result is summed over the final dimensions to produce a scalar Jacobian determinant for each event, i.e., it has rank(x) - event_ndims dimensions.
- name: The name to give this op.
- **kwargs: Named arguments forwarded to subclass implementation.
Returns:

Tensor, if this bijector is injective. If not injective, this is not implemented.
Raises:

- TypeError: if self.dtype is specified and x.dtype is not self.dtype.
- NotImplementedError: if neither _forward_log_det_jacobian nor {_inverse, _inverse_log_det_jacobian} are implemented, or this is a non-injective bijector.
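Because reshaping preserves volume, the log det Jacobian is identically zero. A sketch (shapes are illustrative assumptions):

import tensorflow as tf
import tensorflow_probability as tfp

r = tfp.bijectors.Reshape(event_shape_out=[6], event_shape_in=[2, 3])
x = tf.ones([4, 2, 3])
# event_ndims=2 covers the full [2, 3] event; one scalar per event.
r.forward_log_det_jacobian(x, event_ndims=2)  # ==> zeros of shape [4]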
inverse
inverse(
    y,
    name='inverse',
    **kwargs
)
Returns the inverse Bijector evaluation, i.e., X = g^{-1}(Y).
Args:

- y: Tensor. The input to the 'inverse' evaluation.
- name: The name to give this op.
- **kwargs: Named arguments forwarded to subclass implementation.
Returns:

Tensor, if this bijector is injective. If not injective, returns the k-tuple containing the unique k points (x1, ..., xk) such that g(xi) = y.
Raises:

- TypeError: if self.dtype is specified and y.dtype is not self.dtype.
- NotImplementedError: if _inverse is not implemented.
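For example (continuing the illustrative [2, 3] <-> [6] event shapes assumed above):

import tensorflow as tf
import tensorflow_probability as tfp

r = tfp.bijectors.Reshape(event_shape_out=[6], event_shape_in=[2, 3])
y = tf.ones([4, 6])      # a batch of 4 length-6 events
r.inverse(y)             # ==> Tensor of shape [4, 2, 3]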
inverse_event_shape
inverse_event_shape(output_shape)
Shape of a single sample from a single batch as a TensorShape.

Same meaning as inverse_event_shape_tensor. May be only partially defined.
Args:

- output_shape: TensorShape indicating event-portion shape passed into inverse function.
Returns:

- inverse_event_shape_tensor: TensorShape indicating event-portion shape after applying inverse. Possibly unknown.
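A short sketch (static event shapes assumed for illustration):

import tensorflow as tf
import tensorflow_probability as tfp

r = tfp.bijectors.Reshape(event_shape_out=[6], event_shape_in=[2, 3])
r.inverse_event_shape(tf.TensorShape([6]))  # ==> TensorShape([2, 3])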
inverse_event_shape_tensor
inverse_event_shape_tensor(
    output_shape,
    name='inverse_event_shape_tensor'
)
Shape of a single sample from a single batch as an int32 1D Tensor.
Args:

- output_shape: Tensor, int32 vector indicating event-portion shape passed into inverse function.
- name: Name to give to the op.
Returns:

- inverse_event_shape_tensor: Tensor, int32 vector indicating event-portion shape after applying inverse.
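A minimal sketch of the dynamic-shape variant (shapes assumed for illustration):

import tensorflow as tf
import tensorflow_probability as tfp

r = tfp.bijectors.Reshape(event_shape_out=[6], event_shape_in=[2, 3])
r.inverse_event_shape_tensor([6])  # ==> int32 Tensor [2, 3]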
inverse_log_det_jacobian
inverse_log_det_jacobian(
    y,
    event_ndims,
    name='inverse_log_det_jacobian',
    **kwargs
)
Returns the (log o det o Jacobian o inverse)(y).
Mathematically, returns: log(det(dX/dY))(Y). (Recall that: X = g^{-1}(Y).)

Note that forward_log_det_jacobian is the negative of this function, evaluated at g^{-1}(y).
Args:

- y: Tensor. The input to the 'inverse' Jacobian determinant evaluation.
- event_ndims: Number of dimensions in the probabilistic events being transformed. Must be greater than or equal to self.inverse_min_event_ndims. The result is summed over the final dimensions to produce a scalar Jacobian determinant for each event, i.e., it has rank(y) - event_ndims dimensions.
- name: The name to give this op.
- **kwargs: Named arguments forwarded to subclass implementation.
Returns:

- ildj: Tensor, if this bijector is injective. If not injective, returns the tuple of local log det Jacobians, log(det(Dg_i^{-1}(y))), where g_i is the restriction of g to the ith partition Di.
Raises:

- TypeError: if self.dtype is specified and y.dtype is not self.dtype.
- NotImplementedError: if _inverse_log_det_jacobian is not implemented.
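As with forward_log_det_jacobian, reshaping preserves volume, so the result is identically zero. A sketch (shapes are illustrative assumptions):

import tensorflow as tf
import tensorflow_probability as tfp

r = tfp.bijectors.Reshape(event_shape_out=[6], event_shape_in=[2, 3])
y = tf.ones([4, 6])
# event_ndims=1 covers the full length-6 event; one scalar per event.
r.inverse_log_det_jacobian(y, event_ndims=1)  # ==> zeros of shape [4]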