tfp.bijectors.JointMap

Bijector which applies a structure of bijectors in parallel.

Inherits From: Composition, Bijector

This is the "structured" counterpart to `Chain`. Whereas `Chain` applies an ordered sequence of bijectors to a single input, `JointMap` applies a structure of bijectors to a matching structure of inputs.

Example Use:

```python
exp = Exp()
scale = Scale(2.)
parallel = JointMap({'a': exp, 'b': scale})
x = {'a': 1., 'b': 2.}

parallel.forward(x)
# = {'a': exp.forward(x['a']), 'b': scale.forward(x['b'])}
# = {'a': tf.exp(1.), 'b': 2. * 2.}

parallel.inverse(x)
# = {'a': exp.inverse(x['a']), 'b': scale.inverse(x['b'])}
# = {'a': tf.math.log(1.), 'b': 2. / 2.}
```

The `bijectors` argument need not be a dictionary; it could be a list, tuple, list of dictionaries, or any other structure supported by `tf.nest.map_structure`.
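The structure-parallel semantics above can be sketched in plain Python (a hypothetical stand-in for illustration, not the TFP implementation; the `(forward, inverse)` pairs mimic the `Exp` and `Scale(2.)` bijectors):

```python
import math

# Hypothetical stand-ins for Exp and Scale(2.): each bijector is a
# (forward, inverse) pair of plain Python callables.
bijectors = {
    'a': (math.exp, math.log),
    'b': (lambda v: 2. * v, lambda v: v / 2.),
}

def joint_forward(bij, x):
    # Apply each bijector to the structure element with the same key,
    # mirroring what tf.nest.map_structure does inside JointMap.
    return {k: fwd(x[k]) for k, (fwd, _) in bij.items()}

def joint_inverse(bij, y):
    return {k: inv(y[k]) for k, (_, inv) in bij.items()}

x = {'a': 1., 'b': 2.}
y = joint_forward(bijectors, x)   # {'a': e**1, 'b': 4.0}
x_roundtrip = joint_inverse(bijectors, y)  # recovers x
```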

<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2"><h2 class="add-link">Args</h2></th></tr>

<tr>
<td>
`bijectors`
</td>
<td>
Structure of bijector instances to apply in parallel.
</td>
</tr><tr>
<td>
`validate_args`
</td>
<td>
Python `bool` indicating whether arguments should be checked for
correctness.
</td>
</tr><tr>
<td>
`parameters`
</td>
<td>
Locals dict captured by subclass constructor, to be used for copy/slice
re-instantiation operators.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
Python `str`, name given to ops managed by this object. Default: derived
from the component bijector names, e.g.
`JointMap([Exp(), Softplus()]).name == "jointmap_of_exp_and_softplus"`.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2"><h2 class="add-link">Raises</h2></th></tr>

<tr>
<td>
`ValueError`
</td>
<td>
if bijectors have different dtypes.
</td>
</tr>
</table>





<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2"><h2 class="add-link">Attributes</h2></th></tr>

<tr>
<td>
`bijectors`
</td>
<td>

</td>
</tr><tr>
<td>
`dtype`
</td>
<td>

</td>
</tr><tr>
<td>
`forward_min_event_ndims`
</td>
<td>
Returns the minimal number of dimensions bijector.forward operates on.

Multipart bijectors return structured `ndims`, which indicates the
expected structure of their inputs. Some multipart bijectors, notably
Composites, may return structures of `None`.
</td>
</tr><tr>
<td>
`graph_parents`
</td>
<td>
Returns this `Bijector`'s graph_parents as a Python list.
</td>
</tr><tr>
<td>
`has_static_min_event_ndims`
</td>
<td>
Returns True if the bijector has statically-known `min_event_ndims`.
</td>
</tr><tr>
<td>
`inverse_min_event_ndims`
</td>
<td>
Returns the minimal number of dimensions bijector.inverse operates on.

Multipart bijectors return structured `event_ndims`, which indicates the
expected structure of their outputs. Some multipart bijectors, notably
Composites, may return structures of `None`.
</td>
</tr><tr>
<td>
`is_constant_jacobian`
</td>
<td>
Returns true iff the Jacobian matrix is not a function of x.

Note: Jacobian matrix is either constant for both forward and inverse or
neither.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
Returns the string name of this `Bijector`.
</td>
</tr><tr>
<td>
`name_scope`
</td>
<td>
Returns a <a href="https://www.tensorflow.org/api_docs/python/tf/name_scope"><code>tf.name_scope</code></a> instance for this class.
</td>
</tr><tr>
<td>
`non_trainable_variables`
</td>
<td>
Sequence of non-trainable variables owned by this module and its submodules.

Note: this method uses reflection to find variables on the current instance
and submodules. For performance reasons you may wish to cache the result
of calling this method if you don't expect the return value to change.
</td>
</tr><tr>
<td>
`parameters`
</td>
<td>
Dictionary of parameters used to instantiate this `Bijector`.
</td>
</tr><tr>
<td>
`submodules`
</td>
<td>
Sequence of all sub-modules.

Submodules are modules which are properties of this module, or found as
properties of modules which are properties of this module (and so on).

<pre class="devsite-click-to-copy prettyprint lang-py">
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">a = tf.Module()</code>
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">b = tf.Module()</code>
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">c = tf.Module()</code>
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">a.b = b</code>
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">b.c = c</code>
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">list(a.submodules) == [b, c]</code>
<code class="no-select nocode">True</code>
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">list(b.submodules) == [c]</code>
<code class="no-select nocode">True</code>
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">list(c.submodules) == []</code>
<code class="no-select nocode">True</code>
</pre>

</td>
</tr><tr>
<td>
`trainable_variables`
</td>
<td>
Sequence of trainable variables owned by this module and its submodules.

Note: this method uses reflection to find variables on the current instance
and submodules. For performance reasons you may wish to cache the result
of calling this method if you don't expect the return value to change.
</td>
</tr><tr>
<td>
`validate_args`
</td>
<td>
Returns True if Tensor arguments will be validated.
</td>
</tr><tr>
<td>
`validate_event_size`
</td>
<td>

</td>
</tr><tr>
<td>
`variables`
</td>
<td>
Sequence of variables owned by this module and its submodules.

Note: this method uses reflection to find variables on the current instance
and submodules. For performance reasons you may wish to cache the result
of calling this method if you don't expect the return value to change.
</td>
</tr>
</table>



## Methods

<h3 id="forward"><code>forward</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/master/tensorflow_probability/python/bijectors/bijector.py">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>forward(
    x, name='forward', **kwargs
)
</code></pre>

Returns the forward `Bijector` evaluation, i.e., Y = g(X).


<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`x`
</td>
<td>
`Tensor` (structure). The input to the 'forward' evaluation.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
The name to give this op.
</td>
</tr><tr>
<td>
`**kwargs`
</td>
<td>
Named arguments forwarded to subclass implementation.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Returns</th></tr>
<tr class="alt">
<td colspan="2">
`Tensor` (structure).
</td>
</tr>

</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Raises</th></tr>

<tr>
<td>
`TypeError`
</td>
<td>
if `self.dtype` is specified and `x.dtype` is not
`self.dtype`.
</td>
</tr><tr>
<td>
`NotImplementedError`
</td>
<td>
if `_forward` is not implemented.
</td>
</tr>
</table>



<h3 id="forward_dtype"><code>forward_dtype</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/master/tensorflow_probability/python/bijectors/bijector.py">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>forward_dtype(
    dtype=UNSPECIFIED, name='forward_dtype', **kwargs
)
</code></pre>

Returns the dtype returned by `forward` for the provided input.


<h3 id="forward_event_ndims"><code>forward_event_ndims</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/master/tensorflow_probability/python/bijectors/composition.py">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>forward_event_ndims(
    event_ndims, **kwargs
)
</code></pre>

Returns the number of event dimensions produced by `forward`.


<h3 id="forward_event_shape"><code>forward_event_shape</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/master/tensorflow_probability/python/bijectors/bijector.py">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>forward_event_shape(
    input_shape
)
</code></pre>

Shape of a single sample from a single batch as a `TensorShape`.

Same meaning as `forward_event_shape_tensor`. May be only partially defined.

<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`input_shape`
</td>
<td>
`TensorShape` (structure) indicating event-portion shape
passed into `forward` function.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Returns</th></tr>

<tr>
<td>
`forward_event_shape`
</td>
<td>
`TensorShape` (structure) indicating
event-portion shape after applying `forward`. Possibly unknown.
</td>
</tr>
</table>



<h3 id="forward_event_shape_tensor"><code>forward_event_shape_tensor</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/master/tensorflow_probability/python/bijectors/bijector.py">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>forward_event_shape_tensor(
    input_shape, name='forward_event_shape_tensor'
)
</code></pre>

Shape of a single sample from a single batch as an `int32` 1D `Tensor`.


<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`input_shape`
</td>
<td>
`Tensor`, `int32` vector (structure) indicating event-portion
shape passed into `forward` function.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
The name to give this op.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Returns</th></tr>

<tr>
<td>
`forward_event_shape_tensor`
</td>
<td>
`Tensor`, `int32` vector (structure)
indicating event-portion shape after applying `forward`.
</td>
</tr>
</table>



<h3 id="forward_log_det_jacobian"><code>forward_log_det_jacobian</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/master/tensorflow_probability/python/bijectors/bijector.py">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>forward_log_det_jacobian(
    x, event_ndims, name='forward_log_det_jacobian', **kwargs
)
</code></pre>

Returns the forward `log_det_jacobian`, i.e., `log(det(dY/dX))`.


<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`x`
</td>
<td>
`Tensor` (structure). The input to the 'forward' Jacobian determinant
evaluation.
</td>
</tr><tr>
<td>
`event_ndims`
</td>
<td>
Number of dimensions in the probabilistic events being
transformed. Must be greater than or equal to
`self.forward_min_event_ndims`. The result is summed over the final
dimensions to produce a scalar Jacobian determinant for each event, i.e.
it has shape `rank(x) - event_ndims` dimensions.
Multipart bijectors require *structured* event_ndims, such that
`rank(x[i]) - event_ndims[i]` is the same for all elements `i` of
the structured input. Furthermore, the first `event_ndims[i]` dimensions
of each `x[i].shape` must be the same for all `i` (broadcasting is not
allowed).
</td>
</tr><tr>
<td>
`name`
</td>
<td>
The name to give this op.
</td>
</tr><tr>
<td>
`**kwargs`
</td>
<td>
Named arguments forwarded to subclass implementation.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Returns</th></tr>
<tr class="alt">
<td colspan="2">
`Tensor` (structure), if this bijector is injective.
If not injective this is not implemented.
</td>
</tr>

</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Raises</th></tr>

<tr>
<td>
`TypeError`
</td>
<td>
if `x`'s dtype is incompatible with the expected input dtype.
</td>
</tr><tr>
<td>
`NotImplementedError`
</td>
<td>
if neither `_forward_log_det_jacobian`
nor {`_inverse`, `_inverse_log_det_jacobian`} are implemented, or
this is a non-injective bijector.
</td>
</tr>
</table>
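The `event_ndims` reduction described above can be sketched in plain Python (a hypothetical stand-in for a `Scale(2.)` bijector, whose elementwise `log|dy/dx|` is `log(2)` at every position):

```python
import math

def scale2_fldj(x, event_ndims):
    # Elementwise log-det-Jacobian of y = 2*x is log(2) everywhere.
    per_element = [[math.log(2.)] * len(row) for row in x]
    if event_ndims == 0:
        # rank(x) - 0 = 2 dims: one value per element.
        return per_element
    # event_ndims == 1: sum over the trailing (event) axis,
    # leaving rank(x) - 1 = 1 dim: one scalar per event.
    return [sum(row) for row in per_element]

x = [[1., 2., 3.], [4., 5., 6.]]   # shape [2, 3]
fldj = scale2_fldj(x, event_ndims=1)  # one scalar per batch member
```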



<h3 id="inverse"><code>inverse</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/master/tensorflow_probability/python/bijectors/bijector.py">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>inverse(
    y, name='inverse', **kwargs
)
</code></pre>

Returns the inverse `Bijector` evaluation, i.e., X = g^{-1}(Y).


<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`y`
</td>
<td>
`Tensor` (structure). The input to the 'inverse' evaluation.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
The name to give this op.
</td>
</tr><tr>
<td>
`**kwargs`
</td>
<td>
Named arguments forwarded to subclass implementation.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Returns</th></tr>
<tr class="alt">
<td colspan="2">
`Tensor` (structure), if this bijector is injective.
If not injective, returns the k-tuple containing the unique
`k` points `(x1, ..., xk)` such that `g(xi) = y`.
</td>
</tr>

</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Raises</th></tr>

<tr>
<td>
`TypeError`
</td>
<td>
if `y`'s structured dtype is incompatible with the expected
output dtype.
</td>
</tr><tr>
<td>
`NotImplementedError`
</td>
<td>
if `_inverse` is not implemented.
</td>
</tr>
</table>



<h3 id="inverse_dtype"><code>inverse_dtype</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/master/tensorflow_probability/python/bijectors/bijector.py">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>inverse_dtype(
    dtype=UNSPECIFIED, name='inverse_dtype', **kwargs
)
</code></pre>

Returns the dtype returned by `inverse` for the provided input.


<h3 id="inverse_event_ndims"><code>inverse_event_ndims</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/master/tensorflow_probability/python/bijectors/composition.py">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>inverse_event_ndims(
    event_ndims, **kwargs
)
</code></pre>

Returns the number of event dimensions produced by `inverse`.


<h3 id="inverse_event_shape"><code>inverse_event_shape</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/master/tensorflow_probability/python/bijectors/bijector.py">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>inverse_event_shape(
    output_shape
)
</code></pre>

Shape of a single sample from a single batch as a `TensorShape`.

Same meaning as `inverse_event_shape_tensor`. May be only partially defined.

<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`output_shape`
</td>
<td>
`TensorShape` (structure) indicating event-portion shape
passed into `inverse` function.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Returns</th></tr>

<tr>
<td>
`inverse_event_shape`
</td>
<td>
`TensorShape` (structure) indicating
event-portion shape after applying `inverse`. Possibly unknown.
</td>
</tr>
</table>



<h3 id="inverse_event_shape_tensor"><code>inverse_event_shape_tensor</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/master/tensorflow_probability/python/bijectors/bijector.py">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>inverse_event_shape_tensor(
    output_shape, name='inverse_event_shape_tensor'
)
</code></pre>

Shape of a single sample from a single batch as an `int32` 1D `Tensor`.


<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`output_shape`
</td>
<td>
`Tensor`, `int32` vector (structure) indicating
event-portion shape passed into `inverse` function.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
The name to give this op.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Returns</th></tr>

<tr>
<td>
`inverse_event_shape_tensor`
</td>
<td>
`Tensor`, `int32` vector (structure)
indicating event-portion shape after applying `inverse`.
</td>
</tr>
</table>



<h3 id="inverse_log_det_jacobian"><code>inverse_log_det_jacobian</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/master/tensorflow_probability/python/bijectors/bijector.py">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>inverse_log_det_jacobian(
    y, event_ndims, name='inverse_log_det_jacobian', **kwargs
)
</code></pre>

Returns the (log o det o Jacobian o inverse)(y).

Mathematically, returns: `log(det(dX/dY))(Y)`. (Recall that: `X=g^{-1}(Y)`.)

Note that `forward_log_det_jacobian` is the negative of this function,
evaluated at `g^{-1}(y)`.

<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`y`
</td>
<td>
`Tensor` (structure). The input to the 'inverse' Jacobian determinant
evaluation.
</td>
</tr><tr>
<td>
`event_ndims`
</td>
<td>
Number of dimensions in the probabilistic events being
transformed. Must be greater than or equal to
`self.inverse_min_event_ndims`. The result is summed over the final
dimensions to produce a scalar Jacobian determinant for each event, i.e.
it has shape `rank(y) - event_ndims` dimensions.
Multipart bijectors require *structured* event_ndims, such that
`rank(y[i]) - event_ndims[i]` is the same for all elements `i` of
the structured input. Furthermore, the first `event_ndims[i]` dimensions
of each `y[i].shape` must be the same for all `i` (broadcasting is not
allowed).
</td>
</tr><tr>
<td>
`name`
</td>
<td>
The name to give this op.
</td>
</tr><tr>
<td>
`**kwargs`
</td>
<td>
Named arguments forwarded to subclass implementation.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Returns</th></tr>

<tr>
<td>
`ildj`
</td>
<td>
`Tensor`, if this bijector is injective.
If not injective, returns the tuple of local log det
Jacobians, `log(det(Dg_i^{-1}(y)))`, where `g_i` is the restriction
of `g` to the `ith` partition `Di`.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Raises</th></tr>

<tr>
<td>
`TypeError`
</td>
<td>
if `y`'s dtype is incompatible with the expected inverse-dtype.
</td>
</tr><tr>
<td>
`NotImplementedError`
</td>
<td>
if `_inverse_log_det_jacobian` is not implemented.
</td>
</tr>
</table>
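The identity noted above, that `forward_log_det_jacobian` is the negative of this function evaluated at `g^{-1}(y)`, can be checked numerically with a hypothetical stand-in for the `Exp` bijector (`y = exp(x)`):

```python
import math

def exp_fldj(x):
    # log|d exp(x)/dx| = log(exp(x)) = x
    return x

def exp_ildj(y):
    # log|d log(y)/dy| = log(1/y) = -log(y)
    return -math.log(y)

y = 3.5
lhs = exp_ildj(y)                 # inverse_log_det_jacobian(y)
rhs = -exp_fldj(math.log(y))      # -forward_log_det_jacobian(g^{-1}(y))
```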



<h3 id="with_name_scope"><code>with_name_scope</code></h3>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>@classmethod</code>
<code>with_name_scope(
    method
)
</code></pre>

Decorator to automatically enter the module name scope.

<pre class="devsite-click-to-copy prettyprint lang-py">
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">class MyModule(tf.Module):</code>
<code class="devsite-terminal" data-terminal-prefix="...">  @tf.Module.with_name_scope</code>
<code class="devsite-terminal" data-terminal-prefix="...">  def __call__(self, x):</code>
<code class="devsite-terminal" data-terminal-prefix="...">    if not hasattr(self, &#x27;w&#x27;):</code>
<code class="devsite-terminal" data-terminal-prefix="...">      self.w = tf.Variable(tf.random.normal([x.shape[1], 3]))</code>
<code class="devsite-terminal" data-terminal-prefix="...">    return tf.matmul(x, self.w)</code>
</pre>


Using the above module would produce <a href="https://www.tensorflow.org/api_docs/python/tf/Variable"><code>tf.Variable</code></a>s and <a href="https://www.tensorflow.org/api_docs/python/tf/Tensor"><code>tf.Tensor</code></a>s whose
names include the module name:

<pre class="devsite-click-to-copy prettyprint lang-py">
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">mod = MyModule()</code>
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">mod(tf.ones([1, 2]))</code>
<code class="no-select nocode">&lt;tf.Tensor: shape=(1, 3), dtype=float32, numpy=..., dtype=float32)&gt;</code>
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">mod.w</code>
<code class="no-select nocode">&lt;tf.Variable &#x27;my_module/Variable:0&#x27; shape=(2, 3) dtype=float32,</code>
<code class="no-select nocode">numpy=..., dtype=float32)&gt;</code>
</pre>


<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`method`
</td>
<td>
The method to wrap.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Returns</th></tr>
<tr class="alt">
<td colspan="2">
The original method wrapped such that it enters the module's name scope.
</td>
</tr>

</table>



<h3 id="__call__"><code>__call__</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/master/tensorflow_probability/python/bijectors/bijector.py">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>__call__(
    value, name=None, **kwargs
)
</code></pre>

Applies or composes the `Bijector`, depending on input type.

This is a convenience function which applies the `Bijector` instance in
three different ways, depending on the input:

1. If the input is a `tfd.Distribution` instance, return
   `tfd.TransformedDistribution(distribution=input, bijector=self)`.
2. If the input is a `tfb.Bijector` instance, return
   `tfb.Chain([self, input])`.
3. Otherwise, return `self.forward(input)`.

<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`value`
</td>
<td>
A `tfd.Distribution`, `tfb.Bijector`, or a (structure of) `Tensor`.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
Python `str` name given to ops created by this function.
</td>
</tr><tr>
<td>
`**kwargs`
</td>
<td>
Additional keyword arguments passed into the created
`tfd.TransformedDistribution`, `tfb.Bijector`, or `self.forward`.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="responsive fixed orange">
<colgroup><col width="214px"><col></colgroup>
<tr><th colspan="2">Returns</th></tr>

<tr>
<td>
`composition`
</td>
<td>
A `tfd.TransformedDistribution` if the input was a
`tfd.Distribution`, a `tfb.Chain` if the input was a `tfb.Bijector`, or
a (structure of) `Tensor` computed by `self.forward`.
</td>
</tr>
</table>


#### Examples

```python
sigmoid = tfb.Reciprocal()(
    tfb.AffineScalar(shift=1.)(
      tfb.Exp()(
        tfb.AffineScalar(scale=-1.))))
# ==> `tfb.Chain([
#         tfb.Reciprocal(),
#         tfb.AffineScalar(shift=1.),
#         tfb.Exp(),
#         tfb.AffineScalar(scale=-1.),
#      ])`  # ie, `tfb.Sigmoid()`

log_normal = tfb.Exp()(tfd.Normal(0, 1))
# ==> `tfd.TransformedDistribution(tfd.Normal(0, 1), tfb.Exp())`

tfb.Exp()([-1., 0., 1.])
# ==> tf.exp([-1., 0., 1.])
```

<h3 id="__eq__"><code>__eq__</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/master/tensorflow_probability/python/bijectors/bijector.py">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>__eq__(
    other
)
</code></pre>

Return self==value.