# tfp.bijectors.Chain


Bijector which applies a sequence of bijectors.

Inherits From: Bijector

Example Use:

```python
chain = Chain([Exp(), Softplus()], name="one_plus_exp")
```

Results in:

* Forward:

  ```python
  exp = Exp()
  softplus = Softplus()
  Chain([exp, softplus]).forward(x)
  = exp.forward(softplus.forward(x))
  = tf.exp(tf.log(1. + tf.exp(x)))
  = 1. + tf.exp(x)
  ```

* Inverse:

  ```python
  exp = Exp()
  softplus = Softplus()
  Chain([exp, softplus]).inverse(y)
  = softplus.inverse(exp.inverse(y))
  = tf.log(tf.exp(tf.log(y)) - 1.)
  = tf.log(y - 1.)
  ```
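
The algebra above can be checked end to end. A minimal runnable sketch, assuming `tensorflow` and `tensorflow_probability` are installed:

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

# Chain applies its bijectors right-to-left: Softplus first, then Exp.
chain = tfb.Chain([tfb.Exp(), tfb.Softplus()], name="one_plus_exp")

x = tf.constant([-1., 0., 1.])
y = chain.forward(x)       # == 1. + exp(x)
x_back = chain.inverse(y)  # == log(y - 1.) == x, up to floating-point error
```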

<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2"><h2 class="add-link">Args</h2></th></tr>

<tr>
<td>
`bijectors`
</td>
<td>
Python `list` of bijector instances. An empty list makes this
bijector equivalent to the `Identity` bijector.
</td>
</tr><tr>
<td>
`validate_args`
</td>
<td>
Python `bool` indicating whether arguments should be
checked for correctness.
</td>
</tr><tr>
<td>
`parameters`
</td>
<td>
Locals dict captured by subclass constructor, to be used for
copy/slice re-instantiation operators.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
Python `str`, name given to ops managed by this object. Default: constructed
from the component bijectors' names, e.g.,
`Chain([Exp(), Softplus()]).name == "chain_of_exp_of_softplus"`.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2"><h2 class="add-link">Raises</h2></th></tr>

<tr>
<td>
`ValueError`
</td>
<td>
if bijectors have different dtypes.
</td>
</tr>
</table>





<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2"><h2 class="add-link">Attributes</h2></th></tr>

<tr>
<td>
`bijectors`
</td>
<td>
The ordered list of bijectors composed by this `Chain`, as passed to the
constructor.
</td>
</tr><tr>
<td>
`dtype`
</td>
<td>
dtype of `Tensor`s transformable by this bijector.
</td>
</tr><tr>
<td>
`forward_min_event_ndims`
</td>
<td>
Returns the minimal number of dimensions bijector.forward operates on.
</td>
</tr><tr>
<td>
`graph_parents`
</td>
<td>
Returns this `Bijector`'s graph_parents as a Python list.
</td>
</tr><tr>
<td>
`inverse_min_event_ndims`
</td>
<td>
Returns the minimal number of dimensions bijector.inverse operates on.
</td>
</tr><tr>
<td>
`is_constant_jacobian`
</td>
<td>
Returns true iff the Jacobian matrix is not a function of x.

Note: the Jacobian matrix is either constant for both forward and inverse, or
for neither.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
Returns the string name of this `Bijector`.
</td>
</tr><tr>
<td>
`name_scope`
</td>
<td>
Returns a <a href="/api_docs/python/tf/name_scope"><code>tf.name_scope</code></a> instance for this class.
</td>
</tr><tr>
<td>
`parameters`
</td>
<td>
Dictionary of parameters used to instantiate this `Bijector`.
</td>
</tr><tr>
<td>
`submodules`
</td>
<td>
Sequence of all sub-modules.

Submodules are modules which are properties of this module, or found as
properties of modules which are properties of this module (and so on).

<pre class="devsite-click-to-copy prettyprint lang-py">
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">a = tf.Module()</code>
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">b = tf.Module()</code>
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">c = tf.Module()</code>
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">a.b = b</code>
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">b.c = c</code>
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">list(a.submodules) == [b, c]</code>
<code class="no-select nocode">True</code>
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">list(b.submodules) == [c]</code>
<code class="no-select nocode">True</code>
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">list(c.submodules) == []</code>
<code class="no-select nocode">True</code>
</pre>

</td>
</tr><tr>
<td>
`trainable_variables`
</td>
<td>
Sequence of trainable variables owned by this module and its submodules.

Note: this method uses reflection to find variables on the current instance
and submodules. For performance reasons you may wish to cache the result
of calling this method if you don't expect the return value to change.
</td>
</tr><tr>
<td>
`validate_args`
</td>
<td>
Returns True if Tensor arguments will be validated.
</td>
</tr><tr>
<td>
`variables`
</td>
<td>
Sequence of variables owned by this module and its submodules.

Note: this method uses reflection to find variables on the current instance
and submodules. For performance reasons you may wish to cache the result
of calling this method if you don't expect the return value to change.
</td>
</tr>
</table>



## Methods

<h3 id="forward"><code>forward</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.10.0/tensorflow_probability/python/bijectors/bijector.py#L1000-L1016">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>forward(
    x, name='forward', **kwargs
)
</code></pre>

Returns the forward `Bijector` evaluation, i.e., Y = g(X).


<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`x`
</td>
<td>
`Tensor`. The input to the 'forward' evaluation.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
The name to give this op.
</td>
</tr><tr>
<td>
`**kwargs`
</td>
<td>
Named arguments forwarded to subclass implementation.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Returns</th></tr>
<tr class="alt">
<td colspan="3">
`Tensor`.
</td>
</tr>

</table>



<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Raises</th></tr>

<tr>
<td>
`TypeError`
</td>
<td>
if `self.dtype` is specified and `x.dtype` is not
`self.dtype`.
</td>
</tr><tr>
<td>
`NotImplementedError`
</td>
<td>
if `_forward` is not implemented.
</td>
</tr>
</table>
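
As an illustration of `forward` on the `Chain` from the example above (an illustrative sketch, assuming `tensorflow` and `tensorflow_probability` are available):

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

chain = tfb.Chain([tfb.Exp(), tfb.Softplus()])
# Softplus is applied first, then Exp, so forward maps x -> 1. + exp(x).
chain.forward(tf.constant([0., 1.]))
# ==> approximately [2.0, 3.7183]
```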



<h3 id="forward_dtype"><code>forward_dtype</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.10.0/tensorflow_probability/python/bijectors/bijector.py#L1378-L1394">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>forward_dtype(
    dtype, name='forward_dtype', **kwargs
)
</code></pre>

Returns the dtype of the output of the forward transformation.


<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`dtype`
</td>
<td>
`tf.dtype`, or nested structure of `tf.dtype`s, of the input to
`forward`.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
The name to give this op.
</td>
</tr><tr>
<td>
`**kwargs`
</td>
<td>
Named arguments forwarded to subclass implementation.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Returns</th></tr>
<tr class="alt">
<td colspan="3">
`tf.dtype` or nested structure of `tf.dtype`s of the output of `forward`.
</td>
</tr>

</table>
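
Both `Exp` and `Softplus` preserve the dtype of their input, so for the chain above `forward_dtype` should simply echo the input dtype. An illustrative sketch (not verified against every TFP release):

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

chain = tfb.Chain([tfb.Exp(), tfb.Softplus()])
# Dtype-preserving components imply a dtype-preserving chain.
chain.forward_dtype(tf.float32)
# ==> tf.float32
```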



<h3 id="forward_event_shape"><code>forward_event_shape</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.10.0/tensorflow_probability/python/bijectors/bijector.py#L912-L926">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>forward_event_shape(
    input_shape
)
</code></pre>

Shape of a single sample from a single batch as a `TensorShape`.

Same meaning as `forward_event_shape_tensor`. May be only partially defined.

<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`input_shape`
</td>
<td>
`TensorShape` indicating event-portion shape passed into
`forward` function.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Returns</th></tr>

<tr>
<td>
`forward_event_shape`
</td>
<td>
`TensorShape` indicating event-portion shape
after applying `forward`. Possibly unknown.
</td>
</tr>
</table>
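
`Exp` and `Softplus` act elementwise, so the event shape passes through the chain unchanged. A hedged sketch:

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

chain = tfb.Chain([tfb.Exp(), tfb.Softplus()])
# Elementwise bijectors leave the event-portion shape unchanged.
chain.forward_event_shape(tf.TensorShape([3]))
# ==> TensorShape([3])
```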



<h3 id="forward_event_shape_tensor"><code>forward_event_shape_tensor</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.10.0/tensorflow_probability/python/bijectors/bijector.py#L885-L905">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>forward_event_shape_tensor(
    input_shape, name='forward_event_shape_tensor'
)
</code></pre>

Shape of a single sample from a single batch as an `int32` 1D `Tensor`.


<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`input_shape`
</td>
<td>
`Tensor`, `int32` vector indicating event-portion shape
passed into `forward` function.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
name to give to the op
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Returns</th></tr>

<tr>
<td>
`forward_event_shape_tensor`
</td>
<td>
`Tensor`, `int32` vector indicating
event-portion shape after applying `forward`.
</td>
</tr>
</table>



<h3 id="forward_log_det_jacobian"><code>forward_log_det_jacobian</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.10.0/tensorflow_probability/python/bijectors/bijector.py#L1338-L1366">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>forward_log_det_jacobian(
    x, event_ndims, name='forward_log_det_jacobian', **kwargs
)
</code></pre>

Returns the forward log det Jacobian, i.e., `log(det(dY/dX))(X)`. (Recall that `Y = g(X)`.)


<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`x`
</td>
<td>
`Tensor`. The input to the 'forward' Jacobian determinant evaluation.
</td>
</tr><tr>
<td>
`event_ndims`
</td>
<td>
Number of dimensions in the probabilistic events being
transformed. Must be greater than or equal to
`self.forward_min_event_ndims`. The result is summed over the final
dimensions to produce a scalar Jacobian determinant for each event, i.e.,
it has `rank(x) - event_ndims` dimensions.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
The name to give this op.
</td>
</tr><tr>
<td>
`**kwargs`
</td>
<td>
Named arguments forwarded to subclass implementation.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Returns</th></tr>
<tr class="alt">
<td colspan="3">
`Tensor`, if this bijector is injective.
If not injective this is not implemented.
</td>
</tr>

</table>



<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Raises</th></tr>

<tr>
<td>
`TypeError`
</td>
<td>
if `self.dtype` is specified and `y.dtype` is not
`self.dtype`.
</td>
</tr><tr>
<td>
`NotImplementedError`
</td>
<td>
if neither `_forward_log_det_jacobian`
nor {`_inverse`, `_inverse_log_det_jacobian`} are implemented, or
this is a non-injective bijector.
</td>
</tr>
</table>
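
For the `Chain([Exp(), Softplus()])` example, `y = 1 + exp(x)`, so `dy/dx = exp(x)` and the forward log det Jacobian is simply `x`. A hedged sanity-check sketch, including the effect of `event_ndims`:

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

chain = tfb.Chain([tfb.Exp(), tfb.Softplus()])
x = tf.constant([-1., 0., 2.])

# y = 1 + exp(x)  =>  dy/dx = exp(x)  =>  log|dy/dx| = x.
chain.forward_log_det_jacobian(x, event_ndims=0)
# ==> approximately [-1., 0., 2.]

# With event_ndims=1 the elementwise terms are summed over the last axis.
chain.forward_log_det_jacobian(x, event_ndims=1)
# ==> approximately 1.  (= -1. + 0. + 2.)
```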



<h3 id="inverse"><code>inverse</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.10.0/tensorflow_probability/python/bijectors/bijector.py#L1068-L1086">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>inverse(
    y, name='inverse', **kwargs
)
</code></pre>

Returns the inverse `Bijector` evaluation, i.e., X = g^{-1}(Y).


<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`y`
</td>
<td>
`Tensor`. The input to the 'inverse' evaluation.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
The name to give this op.
</td>
</tr><tr>
<td>
`**kwargs`
</td>
<td>
Named arguments forwarded to subclass implementation.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Returns</th></tr>
<tr class="alt">
<td colspan="3">
`Tensor`, if this bijector is injective.
If not injective, returns the k-tuple containing the unique
`k` points `(x1, ..., xk)` such that `g(xi) = y`.
</td>
</tr>

</table>



<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Raises</th></tr>

<tr>
<td>
`TypeError`
</td>
<td>
if `self.dtype` is specified and `y.dtype` is not
`self.dtype`.
</td>
</tr><tr>
<td>
`NotImplementedError`
</td>
<td>
if `_inverse` is not implemented.
</td>
</tr>
</table>
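
Continuing the `Chain([Exp(), Softplus()])` example, the inverse undoes `Exp` first and `Softplus` second, giving `log(y - 1)`. An illustrative sketch:

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

chain = tfb.Chain([tfb.Exp(), tfb.Softplus()])
# inverse maps y -> log(y - 1.).
chain.inverse(tf.constant([2., 3.]))
# ==> approximately [0., 0.6931]
```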



<h3 id="inverse_dtype"><code>inverse_dtype</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.10.0/tensorflow_probability/python/bijectors/bijector.py#L1396-L1412">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>inverse_dtype(
    dtype, name='inverse_dtype', **kwargs
)
</code></pre>

Returns the dtype of the output of the inverse transformation.


<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`dtype`
</td>
<td>
`tf.dtype`, or nested structure of `tf.dtype`s, of the input to
`inverse`.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
The name to give this op.
</td>
</tr><tr>
<td>
`**kwargs`
</td>
<td>
Named arguments forwarded to subclass implementation.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Returns</th></tr>
<tr class="alt">
<td colspan="3">
`tf.dtype` or nested structure of `tf.dtype`s of the output of `inverse`.
</td>
</tr>

</table>



<h3 id="inverse_event_shape"><code>inverse_event_shape</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.10.0/tensorflow_probability/python/bijectors/bijector.py#L960-L974">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>inverse_event_shape(
    output_shape
)
</code></pre>

Shape of a single sample from a single batch as a `TensorShape`.

Same meaning as `inverse_event_shape_tensor`. May be only partially defined.

<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`output_shape`
</td>
<td>
`TensorShape` indicating event-portion shape passed into
`inverse` function.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Returns</th></tr>

<tr>
<td>
`inverse_event_shape`
</td>
<td>
`TensorShape` indicating event-portion shape
after applying `inverse`. Possibly unknown.
</td>
</tr>
</table>



<h3 id="inverse_event_shape_tensor"><code>inverse_event_shape_tensor</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.10.0/tensorflow_probability/python/bijectors/bijector.py#L933-L953">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>inverse_event_shape_tensor(
    output_shape, name='inverse_event_shape_tensor'
)
</code></pre>

Shape of a single sample from a single batch as an `int32` 1D `Tensor`.


<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`output_shape`
</td>
<td>
`Tensor`, `int32` vector indicating event-portion shape
passed into `inverse` function.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
name to give to the op
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Returns</th></tr>

<tr>
<td>
`inverse_event_shape_tensor`
</td>
<td>
`Tensor`, `int32` vector indicating
event-portion shape after applying `inverse`.
</td>
</tr>
</table>



<h3 id="inverse_log_det_jacobian"><code>inverse_log_det_jacobian</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.10.0/tensorflow_probability/python/bijectors/bijector.py#L1263-L1296">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>inverse_log_det_jacobian(
    y, event_ndims, name='inverse_log_det_jacobian', **kwargs
)
</code></pre>

Returns the (log o det o Jacobian o inverse)(y).

Mathematically, returns: `log(det(dX/dY))(Y)`. (Recall that: `X=g^{-1}(Y)`.)

Note that `forward_log_det_jacobian` is the negative of this function,
evaluated at `g^{-1}(y)`.

<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`y`
</td>
<td>
`Tensor`. The input to the 'inverse' Jacobian determinant evaluation.
</td>
</tr><tr>
<td>
`event_ndims`
</td>
<td>
Number of dimensions in the probabilistic events being
transformed. Must be greater than or equal to
`self.inverse_min_event_ndims`. The result is summed over the final
dimensions to produce a scalar Jacobian determinant for each event, i.e.,
it has `rank(y) - event_ndims` dimensions.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
The name to give this op.
</td>
</tr><tr>
<td>
`**kwargs`
</td>
<td>
Named arguments forwarded to subclass implementation.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Returns</th></tr>

<tr>
<td>
`ildj`
</td>
<td>
`Tensor`, if this bijector is injective.
If not injective, returns the tuple of local log det
Jacobians, `log(det(Dg_i^{-1}(y)))`, where `g_i` is the restriction
of `g` to the `ith` partition `Di`.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Raises</th></tr>

<tr>
<td>
`TypeError`
</td>
<td>
if `self.dtype` is specified and `y.dtype` is not
`self.dtype`.
</td>
</tr><tr>
<td>
`NotImplementedError`
</td>
<td>
if `_inverse_log_det_jacobian` is not implemented.
</td>
</tr>
</table>
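
For `Chain([Exp(), Softplus()])`, `x = log(y - 1)` gives `dx/dy = 1 / (y - 1)`, so the inverse log det Jacobian is `-log(y - 1)`, the negative of `forward_log_det_jacobian` evaluated at `x = log(y - 1)`. A hedged sketch:

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfb = tfp.bijectors

chain = tfb.Chain([tfb.Exp(), tfb.Softplus()])
y = tf.constant([2., 3.])

# x = log(y - 1)  =>  dx/dy = 1 / (y - 1)  =>  log|dx/dy| = -log(y - 1).
chain.inverse_log_det_jacobian(y, event_ndims=0)
# ==> approximately [0., -0.6931]
```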



<h3 id="with_name_scope"><code>with_name_scope</code></h3>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>@classmethod</code>
<code>with_name_scope(
    method
)
</code></pre>

Decorator to automatically enter the module name scope.

<pre class="devsite-click-to-copy prettyprint lang-py">
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">class MyModule(tf.Module):</code>
<code class="devsite-terminal" data-terminal-prefix="...">  @tf.Module.with_name_scope</code>
<code class="devsite-terminal" data-terminal-prefix="...">  def __call__(self, x):</code>
<code class="devsite-terminal" data-terminal-prefix="...">    if not hasattr(self, &#x27;w&#x27;):</code>
<code class="devsite-terminal" data-terminal-prefix="...">      self.w = tf.Variable(tf.random.normal([x.shape[1], 3]))</code>
<code class="devsite-terminal" data-terminal-prefix="...">    return tf.matmul(x, self.w)</code>
</pre>


Using the above module would produce <a href="/api_docs/python/tf/Variable"><code>tf.Variable</code></a>s and <a href="/api_docs/python/tf/Tensor"><code>tf.Tensor</code></a>s whose
names include the module name:

<pre class="devsite-click-to-copy prettyprint lang-py">
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">mod = MyModule()</code>
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">mod(tf.ones([1, 2]))</code>
<code class="no-select nocode">&lt;tf.Tensor: shape=(1, 3), dtype=float32, numpy=..., dtype=float32)&gt;</code>
<code class="devsite-terminal" data-terminal-prefix="&gt;&gt;&gt;">mod.w</code>
<code class="no-select nocode">&lt;tf.Variable &#x27;my_module/Variable:0&#x27; shape=(2, 3) dtype=float32,</code>
<code class="no-select nocode">numpy=..., dtype=float32)&gt;</code>
</pre>


<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`method`
</td>
<td>
The method to wrap.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Returns</th></tr>
<tr class="alt">
<td colspan="3">
The original method wrapped such that it enters the module's name scope.
</td>
</tr>

</table>



<h3 id="__call__"><code>__call__</code></h3>

<a target="_blank" href="https://github.com/tensorflow/probability/blob/v0.10.0/tensorflow_probability/python/bijectors/bijector.py#L793-L878">View source</a>

<pre class="devsite-click-to-copy prettyprint lang-py tfo-signature-link">
<code>__call__(
    value, name=None, **kwargs
)
</code></pre>

Applies or composes the `Bijector`, depending on input type.

This is a convenience function which applies the `Bijector` instance in
three different ways, depending on the input:

1. If the input is a `tfd.Distribution` instance, return
   `tfd.TransformedDistribution(distribution=input, bijector=self)`.
2. If the input is a `tfb.Bijector` instance, return
   `tfb.Chain([self, input])`.
3. Otherwise, return `self.forward(input)`

<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Args</th></tr>

<tr>
<td>
`value`
</td>
<td>
A `tfd.Distribution`, `tfb.Bijector`, or a `Tensor`.
</td>
</tr><tr>
<td>
`name`
</td>
<td>
Python `str` name given to ops created by this function.
</td>
</tr><tr>
<td>
`**kwargs`
</td>
<td>
Additional keyword arguments passed into the created
`tfd.TransformedDistribution`, `tfb.Bijector`, or `self.forward`.
</td>
</tr>
</table>



<!-- Tabular view -->
 <table class="properties responsive orange">
<tr><th colspan="2">Returns</th></tr>

<tr>
<td>
`composition`
</td>
<td>
A `tfd.TransformedDistribution` if the input was a
`tfd.Distribution`, a `tfb.Chain` if the input was a `tfb.Bijector`, or
a `Tensor` computed by `self.forward`.
</td>
</tr>
</table>


#### Examples

```python
sigmoid = tfb.Reciprocal()(
    tfb.AffineScalar(shift=1.)(
      tfb.Exp()(
        tfb.AffineScalar(scale=-1.))))
# ==> `tfb.Chain([
#         tfb.Reciprocal(),
#         tfb.AffineScalar(shift=1.),
#         tfb.Exp(),
#         tfb.AffineScalar(scale=-1.),
#      ])`  # i.e., `tfb.Sigmoid()`

log_normal = tfb.Exp()(tfd.Normal(0, 1))
# ==> `tfd.TransformedDistribution(tfd.Normal(0, 1), tfb.Exp())`

tfb.Exp()([-1., 0., 1.])
# ==> tf.exp([-1., 0., 1.])
```